Crysis... hmm... if it were CPU bottlenecked, then why is everyone getting higher framerates with lower settings? MSFS 2020, on the other hand, is harshly CPU bottlenecked.
OK, I'll put it this way: when you have a CPU bottleneck and increase the graphics settings, the framerate stays the same, because the CPU is what's taxed. But having a CPU utilization of 25 percent does not equate to a bottleneck. I'm starting to think most people don't even know what a bottleneck is; they seem to think it's when framerates are low and the CPU isn't being fully used.
And tbh, by you guys' standards there's a CPU bottleneck present in almost every game. Like, if you ran Just Cause 2 at 1080p all low and got 300 fps, then lowered the resolution to 800x600 and still got 300 fps, then you're CPU bottlenecked... get my drift?
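To spell that heuristic out, here's a minimal sketch in Python (the frame rates are made-up illustrative numbers, not benchmarks):

# Minimal sketch of the bottleneck test described above, with made-up numbers.
# Same settings, two resolutions: if dropping resolution barely changes FPS,
# the GPU wasn't the limiter, so the CPU (or the engine) is.
def likely_bottleneck(fps_high_res, fps_low_res, tolerance=0.05):
    if fps_low_res <= fps_high_res * (1 + tolerance):
        return "CPU/engine bound: lower resolution didn't raise FPS"
    return "GPU bound: lower resolution raised FPS"

print(likely_bottleneck(300, 300))  # the Just Cause 2 case above -> CPU/engine bound
print(likely_bottleneck(60, 95))    # hypothetical case -> GPU bound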
-
Biker Gremling and JRE84 like this.
-
JRE84 likes this.
-
yrekabakery Notebook Virtuoso
-
Even the Xbox One and PS4 were 8C/8T (2 x 4-core).
-
Sure, PC-exclusive simulators etc. are an obvious exception.
hfm likes this. -
Kunal Shrivastava Notebook Consultant
Aren't the DX12/Vulkan APIs supposed to directly address 6+ core, lower-GHz CPUs?
-
The PC ports are just going to expose more graphical options and the ability to turn things up or down manually, instead of settings locked for a 100% known platform capability. I'm still gaming just fine on a 4C/8T 15W CPU. Most things are still GPU bound (at my 1600p res...). I was just running RDR2 benchmarks to check out the new performance monitor in GFE, and the GPU was 99% utilized quite a few times.
seanwee likes this. -
I have a personal opinion about CPU multithreading in video games. The CPU and GPU have very different tasks: while it's natural for a GPU to parallelize the rendering of frames, the CPU's work is much more sequential, more like human thinking. We'd need a different way of programming to change that. When you see consoles and PCs moving to more cores per CPU, that helps mostly because the platform can run more tasks in the background: broadcasting, online trophies, multiplayer communication, etc. But more cores don't do much to increase fps. As long as game programming keeps working the way it does, video games will keep using 2-3 cores at most and will benefit more from higher CPU frequencies and better per-cycle efficiency.
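One way to put rough numbers on that diminishing return is Amdahl's law. This is just an illustrative sketch: the 60% parallel fraction below is an assumption, not a measured figure for any real engine.

# Amdahl's law applied to a game's per-frame CPU work (illustrative only).
# If a fraction p of the frame time parallelizes and the rest is sequential,
# the best-case speedup on n cores is 1 / ((1 - p) + p / n).
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.6, cores):.2f}x")
# 2 -> 1.43x, 4 -> 1.82x, 8 -> 2.11x, 16 -> 2.29x: past ~4 cores the gains
# flatten out, which is the behavior described above.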
-
etern4l likes this.
-
The way to use up all those cores would be to apply them to non-graphics tasks that wouldn't actually be better done by the GPU. Simulations, perhaps. Of course, this part of the code would have nothing to do with adjustable graphics; it would have to run on all platforms or the game would be undeliverable to slower hardware, hence the need to scale any such design ambitions to console reality (or just do a PC exclusive, usually when some additional rationale exists, e.g. a mouse and keyboard are required). Additionally, simulations might require a lot of RAM, which again is typically in short supply on consoles, and the same argument applies to lower-end PCs.
In summary, a modern gaming laptop or desktop is very unlikely to be bottlenecked by the CPU in the vast majority of games. Hope that makes sense.
Last edited: Sep 20, 2020 -
Funnily enough, moving forward shunt modding will be more useful on laptops than on desktops.
2% performance improvement with all 5 shunt resistors modded. The performance cap is Vrel and VOp, so basically the cards are bumping up against Nvidia's set voltage limit.
hfm likes this. -
My CPU is starting to show its weaknesses in a select group of games though. As long as I can maintain something close to 60 FPS I'm OK, as I don't play any competitive multiplayer. I'm pretty much 100% single-player games, mostly slower paced. If I cared about competitive MP games I wouldn't be using this setup; I'd have a loud and heavy notebook with a high-refresh panel, or find a way to talk my wife into letting me jam a desktop into the office desk area (probably with no success, lol; maybe someday we'll buy a bigger place and I'll have my own office room). All I need is a new iteration of something like the Gram 17 with a 20% CPU boost for me to be OK with it. It's really the GPU where I need the muscle, so I can crank up some more details and maintain 1600p 60 fps.
I'm even OK with lower than 60 in some games. RDR2 is hovering around 40ish with some decent detail cranked, and it's good enough not to notice 95% of the time. But the CPU is hovering around 60-85%, sometimes a little higher. It's teetering on the edge.
Last edited: Sep 20, 2020
Prototime likes this. -
Here's an unpopular opinion: I actually welcome the high TDP of the Ampere GPUs with open arms.
Why? Because it forces laptop makers to innovate and create new, better cooling systems to handle the higher-TDP parts. That is, provided Nvidia doesn't stick a pacifier in their mouths, severely gimp performance, and call it a day.
Either way, they can't sell products that are only marginally better than last gen, so I expect TDPs to go up across the board. I'd love to see 300W thin-and-lights and 450W medium-sized laptops; perhaps this is the generation where gallium nitride AC adapters hit the mainstream.
And with the beefed-up laptop designs, we might finally see true desktop parity in laptops with 5nm RTX 4000 GPUs, maybe even an xx80 Ti/xx90 GPU in laptops. Laptop manufacturers aren't going to go backwards and discard all their prior innovations; improved designs will be carried forward and even improved upon. -
thegreatsquare Notebook Deity
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-8700+@+3.20GHz&id=3099
https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+5+4600H&id=3708
https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+9+4900HS&id=3694
...The whole reason I went with the G14 was to circumvent CPU bottlenecks in next-gen ports. Outrunning the next-gen consoles is just like outrunning a bear: you don't have to outrun the bear, you just have to outrun the person behind you. ...I just needed to be ahead of Ryzen 2 8C/16T @ 3.5GHz.
...but, seriously, I'd say that all mobile CPUs paired with Ampere should match or beat the i7-8700. The 10850H matches it now, so an 11750H should get there or better.
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-10850H+@+2.70GHz&id=3734
JRE84 likes this. -
One of many.
Userbenchmark should no longer be used after they lowered ...
Last edited: Sep 21, 2020 -
thegreatsquare Notebook Deity
I knew they nerfed multicore and reweighted the tests, but knowing that, and understanding that AMD is better than its scoring suggests... when its gimped score is still trouncing the other CPU, I can still make some use of that. -
Sorry, but if you nerf multithreaded performance, you throw off the scales and the test becomes pointless as a way to relay needed information (which also calls their other tests into question).
Last edited: Sep 22, 2020 -
etern4l likes this.
-
Today's question will be... how far behind the desktop cards will the 3080 notebook cards come? Forget seeing 20GB of GDDR6X VRAM in notebooks...
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-250#post-11048082
Report: Why The GeForce RTX 3080's GDDR6X Memory Is Clocked at 19 Gbps tomshardware.com
Blame the heat...
German publication Igor's Lab has launched an investigation into why Nvidia chose 19 Gbps GDDR6X memory for the GeForce RTX 3080 and not the faster 21 Gbps variant. There are various claims, but it's not entirely clear how exactly some of the testing was conducted, or where the peak temperature came from.
According to the results, the hottest GDDR6X memory chip had a Tjunction temperature of 104C, resulting in a delta of around 20C between the chip and the bottom of the board. Another interesting discovery is that the addition of a thermal pad between the backplate and the board helps drop the board temperature by up to 4C and the Tjunction temperature by around 1-2C.
Normimb, Biker Gremling, etern4l and 2 others like this. -
The picture is from this article: https://wccftech.com/undervolting-ampere-geforce-rtx-3080-hidden-efficiency-potential/
Yes, I know this is just one article, but it suggests that Ampere can be pretty aggressively undervolted while retaining most of the performance. Based on the picture I linked to, you can get around 97% of the original performance at around 74% of the original power draw. Granted, this test was only performed in one game (Forza Horizon 4), but again, it suggests the performance gap between laptops and desktops may not be as wide as previously thought. Needing an extra 80 watts to get an extra 4 FPS is not worth it in my opinion, and is a negligible difference when you're already getting very high framerates.
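To make that trade-off concrete, here's the same arithmetic spelled out; the 320W figure is the desktop 3080's rated board power, and the stock frame rate is a made-up round number for illustration, not taken from the article:

# Back-of-the-envelope on the undervolting numbers above.
stock_power_w = 320          # desktop RTX 3080 rated board power
stock_fps = 130              # hypothetical stock frame rate, for illustration only

uv_power_w = stock_power_w * 0.74   # ~74% of the original power draw
uv_fps = stock_fps * 0.97           # ~97% of the original performance

print(f"Undervolted: ~{uv_power_w:.0f} W for ~{uv_fps:.0f} FPS "
      f"(saves ~{stock_power_w - uv_power_w:.0f} W, costs ~{stock_fps - uv_fps:.0f} FPS)")
print(f"Efficiency: {stock_fps / stock_power_w:.2f} vs {uv_fps / uv_power_w:.2f} FPS per watt")
# -> roughly 237 W for ~126 FPS: the last ~83 W of board power buys only ~4 FPS.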
So perhaps over 90% of the performance of the desktop cards will be achievable in laptops. This means the gap may get narrower again, like we saw with Pascal. I'd say a sub-10% difference is really good, especially considering the much smaller power requirement to achieve that performance level. -
-
OneSickOmen17t Notebook Consultant
I mean, my 150W 2080 Super barely gets anything over 60-65C. I'm sure my Omen could cool at least 200W easily.
-
yrekabakery Notebook Virtuoso
- That still leaves an extra 40W to shed before it gets under the ~200W GPU ceiling of the heftiest desktop replacement notebooks. And what about the more normal-sized models with GPUs around 150W?
- An undervolt that aggressive and still 100% stable cannot be standardized, as it requires both a silicon lottery winner and manual end-user tuning.
- As a corollary to the above, desktop GPUs running at much lower temperatures can undervolt better. Higher temperatures increase both the voltage needed for stability and the power consumption.
- Mobile Pascal didn't just close the gap, it effectively eliminated it at stock without any end-user tuning, at least in the case of the GTX 1080N.
NuclearLizard likes this. -
Those are some perfectly valid points.
For the Max-P GPUs with higher power budgets, I'd expect those to be pretty close to desktop level performance, but don't quote me on that. I will happily eat my words if I'm wrong. It does seem that the desktop cards are using a lot more power than necessary though, which is what led me to the conclusion that the performance gap between desktop and laptop cards could get narrower vs Turing.
Remember that you can save around 80 watts of power with barely any drop in performance. I do expect the performance loss to become greater as the power drops further, but I think it's still possible to get awesome performance on mobile Ampere. We'll just have to wait for the mobile Ampere GPUs to get concrete statistics on their behavior in laptops. -
http://forum.notebookreview.com/threads/hp-omen-17-2019-rtx-2080-vbios-modding-190w.830760/ -
Beefier and more effective cooling systems will enable laptop manufacturers to cram ever more powerful parts into laptops.
There's a LOT of room for laptop cooling to improve still.
Clamibot and NuclearLizard like this. -
I said this before when Ampere was announced: you should be happy if you get desktop 2080 Ti performance from Ampere-equipped laptops. The power draw is insane, and even with the 65% power draw = 98% performance thing, you'd be at ~200W (65% of the 3080's ~320W is about 208W), which is only attainable by the beefiest machines, and those are generally not the most commonly sold ones.
-
-
Kunal Shrivastava Notebook Consultant
Allegedly there's a 10% drop in performance, so there goes the near-parity between desktop and laptop that was claimed with Pascal.
The Ampere cards still run at 300+ watts after the driver update.
Secondly, the Samsung 8nm node has poor yields out of the box: 75-80%.
Shockingly, they made a statement recently that your chances of getting a 'good' or 'great' GPU are better than getting an 'okay' GPU. Reading between the lines of the PR marketing, Nvidia is not confident in the node.
The fact that even the RTX 3090, which is a Titan-level card, is stuck on 8nm says heaps about Nvidia trying to cut corners here. There are confirmed reports that Nvidia was in talks with TSMC to try and get 7nm onto their higher-end GPUs somehow, but instead went the Samsung route because TSMC was charging per die and Samsung per wafer, provided they had a working chip at the end.
These cards are also priced pretty aggressively, so distributors are trying to increase margins by opting for second-grade non-ceramic capacitors, another cheap move by Nvidia, because AMD clearly has a much superior node with TSMC 7nm+.
Nvidia made in-house capacitors until Turing, so why did they abandon that with Ampere? Because they want to retain maximum profit margin per board.
Everything they've pulled, including buying ARM, is a knee-jerk reaction to AMD, who genuinely stand a chance of winning this gen if they price the 6800/6900 XT sensibly (look at the consoles!).
Lisa Su single-handedly dethroned Intel with Ryzen, and now they're gunning for Nvidia. Ampere and ARM are Nvidia desperately trying to cling to the performance crown, although if Nvidia plays their cards right they might have a winner with ARM and their AI tech.
As for the GPUs, it's time to see what Big Navi can do. It's got a better chance even on laptops. -
But since the first launch config (which apparently didn't allow AIBs to do any testing at all) was too aggressive and has been scaled back a tad, people feel something was "taken away" from them. In reality, if there had been proper time for testing, they probably would have launched with this configuration to begin with. Ampere seems to be "living on the edge" more so than Turing was.
Nvidia seems to have bungled this by not giving AIBs time to do things properly.
Prototime likes this. -
-
You'll get plenty of time to search for leaks. We won't see 30-series mobile graphics until Q2 2021.
I can't see a single reason the ODMs/OEMs would release Ampere before Tiger Lake-H BGA mobile is out for the Jokebooks.
https://www.computerbase.de/2020-10/intel-tiger-lake-h-acht-kern-cpu-q2-2021/
JRE84 likes this. -
Kunal Shrivastava Notebook Consultant
If a 100 MHz overclock is causing crashes, clearly Ampere was rushed? We already have the evidence of laptop performance with Turing; it'll be very surprising if Ampere gives even 2x that, tier for tier (unless of course they find a way to up the wattage and lower the temperatures on laptops).
hfm likes this. -
-
Kunal Shrivastava Notebook Consultant
-
Seems legit. -
Kunal Shrivastava likes this. -
RTX 3080 performance when undervolted
Seems like 250W will be the sweet spot.
Normimb likes this. -
Normimb likes this.
-
-
Need to add this in:
NVIDIA GeForce RTX 3080, 3070 and 3060 Mobile to be paired with Intel 10th Gen Core CPUs? videocardz.com
3080 Max-Q = 3070 on GA104 / desktop 3080 on GA102 (a neat, disgusting change from Pascal and Turing).
Yeah, Nvidia enjoys screwing the notebook customers. Why change the mobile suffix when "M" was good enough? Max-Q, as I have said the whole time, was an awful idea from Nvidia.
@Meaker@Sager, you said that because Max-Q uses the same silicon (specifications) as the desktop cards, Nvidia could charge a damn high price for Max-Q even though it has crippled performance. Will we now get cheaper-priced Max-Q Jokebooks due to the inferior silicon and lower performance vs. the same so-called top-tier desktop cards? Or is Nvidia back to their disgusting practices? Yeah, I know these are rumors, but still... @Mr. Fox
1080 Max-Q = 1080 on GP104
2080 Max-Q = 2080 on TU104
3080 Max-Q = 3070 on GA104 / Desktop 3080 on GA102
And there's also a big possibility that NVIDIA utilizes slower GDDR6 dies rather than GDDR6X to power the RTX 3080 Max-Q graphics card. Damn nice!
The GA102 GPU is not expected to come to the mobile form factor, although NVIDIA has made a commitment to offer desktop and mobile SKUs with similar specifications. In reality, though, those SKUs are never the same spec- and performance-wise, since they have different TGP targets.
With this move, Nvidia can come back later with a Super refresh on GA102 silicon and a change to 10GB of GDDR6X VRAM (but at slower speeds than the desktop RAM), as on today's desktop cards, while Nvidia already prepares more and faster VRAM for the desktop refresh. Yeah, Nvidia continues to treat notebooks like unwanted stepchildren! And on top of that behaves like liars ("NVIDIA has made a commitment to offer desktop and mobile SKUs with similar/same specifications"). What a letdown, and what a betrayal, if this is correct.
Yeah, I expect my predictions on this Mess will come true, LOOL
Last edited: Oct 25, 2020
Normimb, TBoneSan, Mr. Fox and 1 other person like this. -
Question is, how will Nvidia nerf the 3070 mobile now that it shares the same GA104 die as the 3080 Max-Q? Maybe GDDR6X GA104 for the 3080 Max-Q and GDDR6 GA104 for the 3070 mobile.
The simplest thing to do would be to completely boycott the 3080 if that comes to fruition. And for that we need the help of reviewers, if possible.
Papusan likes this. -
Or maybe a swap to GA102 for the refresh is enough?
Though they do have SK Hynix's GDDR6 memory rated at 1.2V with speeds of 12 Gbps up their sleeve, aka lower-powered/slower dies than the desktop variant.
I can't see how a low-powered GA104 with lower clocks than the desktop 3070 is a proper upgrade over the 2080 Mobile, if that is what the 3080 Max-Q SKU will be.
Last edited: Oct 25, 2020 -
I'm not sure it really matters; as long as it's some iterative improvement over the 20-series notebook silicon, people will buy it. I mean, there really won't be a choice for people buying a NEW notebook, it'll all just be whatever TDP they can fit from the 30-series parts that are available. If you're looking to upgrade from your current Pascal or Turing notebook, I guess the question is whether the improvement across the board (CPU, features, panel, GPU, SSD) as an entire package is worth it.
Papusan likes this. -
The Super refresh is a brilliant idea to sell more cards later in the life cycle by holding back the performance of the launch cards, and I hate how brilliant it is.
That said, I'll be watching from the sidelines this gen; it's going to be a bumpy ride. -
hertzian56, Normimb, seanwee and 1 other person like this.
-
What's better than throwing out a cut-down GA102 as the top tier when Nvidia finds the right time to help the notebook manufacturers boost sales next time around? There are several ways to screw the notebook users. They will buy whatever they're offered as long as it performs slightly better than its predecessor.
The SKU branded Max-Q was a huge scam! And the sad part... some still don't see it. Nvidia just can't differentiate the Max-Q and Max-P silicon; both are crippled vs. the desktop cards. The only difference will be a higher TGP and slightly higher clocks for the Max-P variant. See... back to the old days, but now with two mobile cards instead of one, and with only minor differences between them.
I wonder why so many people out there still can't feel or see that they're being scammed by greed.
Last edited: Oct 25, 2020 -
Max-P versions (3070/3080) will probably only be in the big three (Area-51m R3, Clevo X170SM-G, MSI GT76).
I could see a 3070 Max-P in thinner 17" models too. Depends on how well they can bin them.