The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How will Ampere scale on laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
Crysis... hmm... if it was CPU bottlenecked, then why is everyone getting higher framerates with lower settings? MSFS 2020, on the other hand, is harshly bottlenecked.


    OK, I'll put it this way: when you have a CPU bottleneck and increase the settings, the framerate stays the same, because the CPU is already taxed. But a CPU utilization of 25 percent does not equate to a bottleneck. I'm starting to think most people don't even know what a bottleneck is; they seem to think it's when framerates are low and the CPU isn't being used fully.

    And to be honest, by your standards there's a CPU bottleneck present in almost every game. Like, if you ran Just Cause 2 at 1080p all low and got 300 fps, then lowered the resolution to 800x600 and still got 300 fps, you're CPU bottlenecked... get my drift?
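    As a minimal sketch of that resolution-drop test (purely illustrative; the function name and the 5% tolerance are my own, not any standard tool):

```python
# Hypothetical helper: compare average FPS at two resolutions, same settings.
# If dropping the resolution barely changes the framerate, the GPU wasn't the
# limiter, so the CPU/engine probably is.

def looks_cpu_bound(fps_high_res: float, fps_low_res: float, tolerance: float = 0.05) -> bool:
    return abs(fps_low_res - fps_high_res) / fps_high_res <= tolerance

# The Just Cause 2 example above: ~300 fps at 1080p low and ~300 fps at 800x600.
print(looks_cpu_bound(300, 300))  # True  -> CPU/engine limited
print(looks_cpu_bound(60, 110))   # False -> GPU limited at the higher resolution
```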
     
    Last edited: Sep 19, 2020
  2. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    If the game isn't able to fully utilize a modern CPU due to how the engine is architected I would think that is also a CPU bottleneck, but one that's the fault of the engine relying on pure brute force frequency, not the fault of the CPU being inadequate. It's probably going to be a while though before a lot of the engines are able to saturate 16 threads.
     
    Biker Gremling and JRE84 like this.
  3. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
It's going to be a while for a number of reasons, one of them being that games are usually designed with consoles in mind, and the PC ports mostly add pure GPU load (larger textures, better shaders, etc.). Also, the computations involved are often done more efficiently on the GPU anyway. The point stands: rarely will modern/cutting-edge games be truly bottlenecked on a reasonable CPU, even at 1080p. It's predominantly a GPU game (!).
     
    JRE84 likes this.
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    FS2020, Crysis Remastered, and FC5 aren't particularly well threaded, but the other games, and the aforementioned Battlefield V, scale really well on 6+ cores.
     
  5. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
Good thing the upcoming consoles are 8C/16T AMD CPUs. :) Even the Xbox One and PS4 were 8C/8T (2 x 4 cores).

    It looks like they are going to patch FS2020 within the next 12 months. Supposedly DX11 is gimping them due to its limited threading capability; in fact, I think they are doing some form of optimization for that in the next patch.
     
  6. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
The upcoming consoles are merely catching up to last year's mobile CPUs. The PS4 CPU is clocked at a laughable 1.6GHz. Although I do take the point that the multicore CPU architecture forces a multithreaded engine design, the overall console system performance is necessarily well below enthusiast PC levels, even at launch, so most game developers will be targeting the latest console specs. The main added value of playing such titles on the PC is in being able to crank up the image quality thanks to a more powerful GPU.

    Sure, PC-exclusive simulators etc are an obvious exception.
     
    hfm likes this.
  7. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
Aren't the DX12/Vulkan APIs supposed to directly address 6+ core, low-GHz CPUs?
     
    seanwee and hfm like this.
  8. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
That's all we need: engines and rendering APIs taking advantage of more cores and threads. The consoles having 8C/16T is going to be nothing but a good thing for the PC community, where that is common at the enthusiast desktop level.

    The PC ports are just going to expose more graphical options and the ability to turn things up or down manually, instead of locked settings for a 100% known platform capability. I'm still gaming just fine on a 4C/8T 15W CPU... most things are still GPU bound (at my 1600p res). I was just running RDR2 benchmarks to check out the new performance monitor in GFE, and the GPU was 99% utilized quite a few times.
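    If you want the same kind of quick check outside of GFE, here's a rough sketch (assumptions: an Nvidia GPU with nvidia-smi on the PATH, the third-party psutil package installed, and thresholds that are only illustrative):

```python
# Rough GPU-vs-CPU-bound check using nvidia-smi and psutil.
import subprocess

import psutil

def gpu_utilization_percent() -> float:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])

def busiest_core_percent(interval: float = 1.0) -> float:
    # Per-core load matters more than the average: a game hammering one or two
    # threads can be CPU-bound even when overall utilization reads 25%.
    return max(psutil.cpu_percent(interval=interval, percpu=True))

if __name__ == "__main__":
    gpu, cpu_core = gpu_utilization_percent(), busiest_core_percent()
    print(f"GPU {gpu:.0f}%, busiest CPU core {cpu_core:.0f}%")
    if gpu > 95:
        print("Likely GPU-bound")
    elif cpu_core > 90:
        print("Likely limited by one or two CPU threads")
```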
     
    seanwee likes this.
  9. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
I have a personal opinion about multi-threaded CPU use in video games. The CPU and GPU have very different tasks: while it's pretty natural for a GPU to parallelize the rendering of frames, the CPU's work is much more sequential, more like human thinking. We definitely need another way of programming to change that. When consoles and PCs move to more CPU cores, it helps only because the platform can run more tasks in the background: broadcasting, online trophies, multiplayer communication, etc. But more cores don't do much to increase fps. Until programming changes the way it works, video games will keep using 2-3 cores max and will benefit more from higher CPU frequencies and better per-cycle efficiency.
     
  10. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
And all of these games are poorly optimised on the CPU front. We need more games using lower-level APIs, like R6 Siege and Doom, that are pretty much futureproof and will scale really well with more powerful CPUs and GPUs.
     
    etern4l likes this.
  11. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
The thing is that all those higher image-quality options mostly result in increased GPU load, at the same time applying downward pressure on the FPS. Any reduction in FPS would further reduce the load on the CPU.

    The way to use up all those cores would be to apply them to non-graphics tasks that wouldn't actually be better done on the GPU. Simulations, perhaps. Of course, this part of the code would have nothing to do with adjustable graphics. It would have to run on all platforms, or the game would be undeliverable to slower hardware, hence the need to scale any such design ambitions to console reality (or just do a PC exclusive, usually when some additional rationale exists, e.g. a mouse and keyboard are required). Additionally, simulations might require a lot of RAM, which again is typically in short supply on consoles, and on lower-end PCs to which this argument applies similarly.

    In summary, a modern gaming laptop or desktop is very unlikely to be bottlenecked by the CPU in the vast majority of games. Hope that makes sense.
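    As a toy sketch of that "spare cores for simulation" idea (everything here is hypothetical, nothing from a real engine): the per-entity update below is pure CPU work, split across cores with a process pool.

```python
# Toy "simulation frame": expensive per-entity updates (AI, physics, crowd sim)
# spread across all available cores.
from concurrent.futures import ProcessPoolExecutor
import math
import os

def simulate_chunk(entity_ids: list) -> int:
    # Stand-in for an expensive per-entity update.
    work = 0
    for e in entity_ids:
        work += sum(int(math.sqrt(e * i + 1)) for i in range(2000))
    return work

def simulate_frame(num_entities: int = 10_000, workers: int = os.cpu_count() or 4) -> int:
    ids = list(range(num_entities))
    chunk = max(1, len(ids) // workers)
    chunks = [ids[i:i + chunk] for i in range(0, len(ids), chunk)]
    # Processes rather than threads, so the work actually spreads across cores
    # (CPython threads share one interpreter lock for pure-Python code).
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    print(simulate_frame())
```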
     
    Last edited: Sep 20, 2020
    hfm and JRE84 like this.
  12. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Funnily enough, moving forward shunt modding will be more useful on laptops than on desktops.



A 2% performance improvement with all 5 shunt resistors modded. The performance cap is VRel and VOp, so basically the cards are bumping up against Nvidia's set voltage limit.
     
    hfm likes this.
  13. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
Yep, that was exactly my point when I said a 4C/8T 15W CPU is still serving me just fine for most gaming purposes. But that doesn't mean I don't want to see multi-core usage proliferate, as it's the only way to move forward. We're not going to be hitting 6-7GHz sustained any time soon for the normal gamer, and people overclocking to the edge of stability are an extreme edge case.

    My CPU is starting to show its weaknesses in a select group of games, though. As long as I can maintain something close to 60 FPS I'm OK, as I don't play any competitive multiplayer. I'm pretty much 100% SP games where it's mostly slower paced. If I cared about competitive MP games I wouldn't be using this setup; I'd have a loud and heavy notebook with a high-refresh panel, or find a way to talk my wife into letting me jam a desktop into the office desk area (with probably no success lol, maybe someday we'll buy a bigger place and I'll have my own office room). All I need is a new iteration of something like the Gram 17 with a 20% boost in CPU for me to be OK with it. It's really the GPU where I need the muscle, so I can crank up some more details and maintain 1600p 60fps.

    I'm even OK with lower than 60 in some games. RDR2 is hovering around 40ish with some decent detail cranked, and it's good enough that I don't notice it 95% of the time. But the CPU is hovering around 60-85%, sometimes a little higher. It's teetering on the edge. :)
     
    Last edited: Sep 20, 2020
    Prototime likes this.
  14. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
Here's an unpopular opinion: I actually welcome the high TDP of the Ampere GPUs with open arms.

    Why? Because it forces laptop makers to innovate and create new and better cooling systems to handle the higher-TDP parts. That is, provided Nvidia doesn't stick a pacifier in their mouths, severely gimp performance and call it a day.

    Either way, they can't sell products that are only marginally better than last gen, so I expect TDPs to go up across the board. I'd love to see 300W thin-and-lights and 450W medium-sized laptops; perhaps this is the generation where gallium nitride AC adapters hit the mainstream.

    And with the beefed-up laptop designs, we might finally see true desktop parity in laptops with 5nm RTX 4000 GPUs, maybe even an xx80 Ti/xx90 GPU in laptops. Laptop manufacturers aren't going to go backwards and discard all their prior innovations; improved designs will be carried forward and even improved upon.
     
    unlogic and etern4l like this.
  15. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
    ...for Intel, sure.

    https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-8700+@+3.20GHz&id=3099

    https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+5+4600H&id=3708

    https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+9+4900HS&id=3694

...The whole reason I went with the G14 was to circumvent CPU bottlenecks in next-gen ports. Outrunning the next-gen consoles is just like outrunning a bear: you don't have to outrun the bear, you just have to outrun the person behind you. ...I just needed to be ahead of Zen 2, 8c/16t @ 3.5GHz.

    ...But seriously, I'd say that all mobile CPUs paired with Ampere should match or beat the i7-8700. The 10850H matches it now, so an 11750H should get there or better.

    https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-10850H+@+2.70GHz&id=3734
     
    JRE84 likes this.
  16. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
Passmark Performance Test isn't reliable or trustworthy. I wouldn't buy hardware on the basis of their results. And there is a reason this benchmark won't get the green flag at HWBot.

    One of many.
    Userbenchmark should no longer be used after they lowered ...
     
    Last edited: Sep 21, 2020
    etern4l and seanwee like this.
  17. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
It is and it isn't.

    I knew they nerfed multicore and reweighted the tests, but knowing that, and understanding that AMD is better than its score suggests... when its gimped score is still trouncing the other CPU, I can still make some use of it.
     
  18. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
You and other 'informed' people can, but the majority of people are NOT that informed, and they think it's a viable metric.
    Sorry, but if you nerf multithreaded performance, you throw off the scales and the test becomes pointless as a way to relay the needed information (which also puts their other tests into question).
     
    Last edited: Sep 22, 2020
    hfm and etern4l like this.
  19. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
Yeah, those tests are not great. You shouldn't have to be armed with a bunch of caveat info to decode them.
     
    etern4l likes this.
  20. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
Today's question will be... how far behind the desktop cards will the 3080 notebook cards be? Forget seeing 20GB of GDDR6X VRAM in notebooks...
    http://forum.notebookreview.com/threads/nvidia-thread.806608/page-250#post-11048082

    Report: Why The GeForce RTX 3080's GDDR6X Memory Is Clocked at 19 Gbps tomshardware.com

    Blame the heat...

    German publication Igor's Lab has launched an investigation into why Nvidia chose 19 Gbps GDDR6X memory for the GeForce RTX 3080 and not the faster 21 Gbps variant. There are various claims, but it's not entirely clear how exactly some of the testing was conducted, or where the peak temperature came from.

According to the results, the hottest GDDR6X memory chip had a Tjunction temperature of 104C, resulting in a delta of around 20C between the chip and the bottom of the board. Another interesting discovery is that the addition of a thermal pad between the backplate and the board helps drop the board temperature by up to 4C and the Tjunction temperature by around 1-2C.
     
  21. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Regarding one of my earlier posts on Ampere's performance scaling with power, 98% of the performance at 65% of the power draw was overly optimistic. However, new information has come to light: https://wccftech.com/wp-content/uploads/2020/09/vs-2080-Ti-Overall.png

    The picture is from this article: https://wccftech.com/undervolting-ampere-geforce-rtx-3080-hidden-efficiency-potential/

Yes, I know this is just one article, but it suggests that Ampere can be pretty aggressively undervolted while retaining most of the performance. Based on the picture I linked to, you can get around 97% of the original performance at around 74% of the original power draw. Granted, this test was only performed in one game (Forza Horizon 4), but again, it suggests the performance gap between laptops and desktops may not be as wide as previously thought. Needing an extra 80 watts to get an extra 4 FPS is not worth it in my opinion, and is a negligible difference when you're already getting very high framerates.

    So perhaps over 90% of the performance of the desktop cards will be achievable in laptops. This means the gap may get narrower again like we saw with Pascal. I'd say a less than 10% difference is really good, especially considering the much smaller power requirement to achieve that performance level.
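    A quick back-of-the-envelope check of those numbers (the 320W figure is the desktop 3080's rated board power; the percentages are the article's):

```python
# Perf-per-watt sanity check, stock vs. the undervolt figures cited above.
stock_power_w, stock_perf = 320.0, 1.00    # desktop RTX 3080 at stock
uv_power_w, uv_perf = 0.74 * 320.0, 0.97   # ~74% of the power, ~97% of the perf

improvement = (uv_perf / uv_power_w) / (stock_perf / stock_power_w)
print(f"Undervolted: {uv_power_w:.0f} W, perf/W improves {improvement:.2f}x")
# -> roughly 237 W and about 1.31x better performance per watt, which is why a
#    power-limited mobile part might land closer to desktop than Turing did.
```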
     
  22. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
I'm telling you man, 250W should be more than enough to get over 95% of the performance of the 320W 3080... by the way, I have zero proof of what I'm saying, just my gut, lol
     
  23. OneSickOmen17t

    OneSickOmen17t Notebook Consultant

    Reputations:
    14
    Messages:
    166
    Likes Received:
    14
    Trophy Points:
    31
I mean, my 150W 2080 Super barely gets over 60-65°C. I'm sure my Omen could cool at least 200W easily.
     
  24. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
I'm not sure I agree with this conclusion. You're taking an aggressively undervolted desktop Ampere GPU that gets within 10% of its stock performance while shaving off about a quarter of its power as representative of the performance we might see on mobile Ampere, without considering that:
    1. That still leaves an extra ~40W to shed before it gets down to the ~200W GPU ceiling of the heftiest desktop replacement notebooks. And what about the more normal-sized models with GPUs around 150W?
    2. Such an aggressive undervolt that performs with 100% stability cannot be standardized, as it requires both a silicon-lottery winner and manual end-user tuning.
    3. As a corollary to the above, desktop GPUs running at much lower temperatures can undervolt better. Higher temperatures increase both the voltage needed for stability and the power consumption.
    4. Mobile Pascal didn't just close the gap, it effectively eliminated it at stock without any end-user tuning, at least in the case of the GTX 1080N.
     
    NuclearLizard likes this.
  25. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Those are some perfectly valid points.

    For the Max-P GPUs with higher power budgets, I'd expect those to be pretty close to desktop level performance, but don't quote me on that. I will happily eat my words if I'm wrong. It does seem that the desktop cards are using a lot more power than necessary though, which is what led me to the conclusion that the performance gap between desktop and laptop cards could get narrower vs Turing.

    Remember that you can save around 80 watts of power with barely any drop in performance. I do expect the performance loss to become greater as the power drops further, but I think it's still possible to get awesome performance on mobile Ampere. We'll just have to wait for the mobile Ampere GPUs to get concrete statistics on their behavior in laptops.
     
  26. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
  27. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181

    Beefier and more effective cooling systems will enable laptop manufacturers to cram ever more powerful parts into laptops.

    There's a LOT of room for laptop cooling to improve still.
     
    Clamibot and NuclearLizard like this.
  28. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    I said this before when Ampere was announced: you should be happy if you get desktop 2080Ti performance with ampere equipped laptops. The power draw is insane and even with the 65% power draw = 98% performance thing, you’d be ~200W which is only attainable by the beefiest of machines, which are generally not the most commonly sold computers.
     
  29. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    If Ampere is truly more power efficient they should just be able to shoot for the same power targets Turing was using and we will just see a commensurate performance gain at the same power level.
     
    etern4l and Aroc like this.
  30. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
Wishful thinking, I guess. There are reports of RTX 3080 GPUs crashing and even outright failing because the capacitors can't handle the current from the VRMs. Nvidia say they have fixed that with a driver update. Translation: they capped the boost frequency for the GPU and cut the time for sustained boost.
    Allegedly there's a 10% drop in performance, so there goes the performance gap we saw claimed between Pascal desktop and laptop.
    The Ampere card still runs at 300+ watts after the driver update.
    Secondly, the Samsung 8nm node has poor yields out of the box: 75-80%.
    Shockingly, they made a statement recently that your chances of getting a 'good' or 'great' GPU are better than getting an 'okay' GPU. Reading between the lines of the PR marketing, Nvidia is not confident in the node.
    The fact that even the RTX 3090, which is a Titan-level card, is stuck on 8nm should say heaps about Nvidia trying to cut corners here. There are reports that Nvidia were in talks with TSMC to try and get 7nm somehow for their higher-end GPUs, but instead went the Samsung route because TSMC was charging per die and Samsung per wafer, working chips or not.
    These cards are also priced pretty aggressively, so distributors are trying to increase margins by opting for second-grade non-ceramic capacitors - another cheap move by Nvidia, because AMD clearly has a much superior node with TSMC 7nm+.
    Nvidia made the capacitors in-house until Turing, so why did they abandon that with Ampere? Because they want to retain maximum profit margins per board.
    Everything they've pulled, including buying ARM, is a knee-jerk reaction to AMD, who genuinely stand a chance of winning this gen if they price the 6800/6900 XT sensibly (look at the consoles!).
    Lisa Su single-handedly dethroned Intel with Ryzen, and now they are gunning for Nvidia. Ampere and ARM are Nvidia desperately trying to cling to the performance crown, although if Nvidia plays their cards right they might have a winner with ARM and their AI tech.
    As for the GPUs, it's time to see what Big Navi can do. It's got a better chance even on laptops.
     
  31. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
The boost clocks being cited were over the maximum spec clock of 1710MHz. This is "GPU Boost", which is pretty much just gravy if there is power and temperature headroom on the table to use. If they have to limit that to lower boost bins around 1900MHz or so instead of over 2000MHz, there is nothing "wrong" with that.

    But since the first launch config (which apparently didn't allow AIBs to do any testing at all) was too aggressive and has been scaled back a tad, people feel something was "taken away" from them. In reality, if there had been proper time for testing, they probably would have launched with this configuration to begin with. It seems Ampere is "living on the edge" more so than Turing was.

    Nvidia seems to have bungled this by not giving the AIBs time to do things properly.
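    As a very rough illustration of that opportunistic boost behaviour (this is not Nvidia's actual algorithm; the bin spacing and per-bin power/heat costs are made-up numbers for the sketch):

```python
# Made-up numbers, just to show the mechanism: the card steps up through clock
# bins only while power and thermal headroom remain.

BOOST_BINS_MHZ = [1710, 1770, 1830, 1890, 1950, 2010]  # 1710 = rated boost clock

def pick_boost_bin(power_headroom_w: float, temp_headroom_c: float,
                   cost_per_bin_w: float = 8.0, heat_per_bin_c: float = 2.0) -> int:
    clock = BOOST_BINS_MHZ[0]
    for candidate in BOOST_BINS_MHZ[1:]:
        power_headroom_w -= cost_per_bin_w
        temp_headroom_c -= heat_per_bin_c
        if power_headroom_w < 0 or temp_headroom_c < 0:
            break
        clock = candidate
    return clock

# Plenty of headroom -> boosts above 2000 MHz; little headroom -> mid-1800s.
print(pick_boost_bin(power_headroom_w=60, temp_headroom_c=20))  # 2010
print(pick_boost_bin(power_headroom_w=20, temp_headroom_c=20))  # 1830
```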
     
    Prototime likes this.
  32. Ifrin

    Ifrin Notebook Evangelist

    Reputations:
    308
    Messages:
    377
    Likes Received:
    397
    Trophy Points:
    76
  33. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    JRE84 likes this.
  34. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
Yes, that was exactly my point! You see the issue here? If 6/10 GPUs can't even handle the opportunistic boost, it means they're already pushing the silicon to its limits. The gravy boost you mention is essentially overclocking, if I understood correctly, and there have been pre-overclocked third-party boards for previous-gen cards. In fact, Jensen Huang said that Turing cards were "made for overclocking".
    If a 100 MHz overclock is causing crashes, clearly Ampere was rushed? We have the evidence of laptop performance with Turing already; it'll be very surprising if Ampere gives even 2x that, tier for tier (unless of course they find a way to up the wattage and lower the temperatures on laptops).
     
    hfm likes this.
  35. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    If it was made for overclocking, what's to say nvidia didn't mean more overclocking headroom for themselves?
     
  36. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    He mentioned it in the Turing reveal. Make of it what you will. It was mentioned right around the time he revealed the 2080/2080ti FE.
     
  37. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
  38. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    This guy here made several tests and thinks the problem is related more to the power limit:

     
    Kunal Shrivastava likes this.
  39. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    RTX 3080 performance when undervolted

[Attached screenshots: undervolted RTX 3080 clock/voltage/power benchmark results]

    Seems like 250w will be the sweetspot
     
    Normimb likes this.
  40. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
Good example of people seeing what they want to see. 1950 @ 931mV and 291W is clearly the sweetspot in this one particular benchmark.
     
    Normimb likes this.
  41. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
Keep in mind the GDDR6X will also be undervolted in the mobile versions, along with other efficiency improvements.
     
  42. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Need to add in this :D

    NVIDIA GeForce RTX 3080, 3070 and 3060 Mobile to be paired with Intel 10th Gen Core CPUs? videocardz.com

    3080 Max-Q = 3070 on GA104 / Desktop 3080 on GA102 (A neat disgusting change from Pascal and Turing).

Yeah, Nvidia enjoys screwing the notebook customers. Why change the mobile surname when "M" was good enough? Max-Q, as I have said the whole time, was an awful idea by Nvidia.

@Meaker@Sager you said that because Max-Q used the same silicon (specifications) as the desktop cards, Nvidia could charge a damn high price for Max-Q even though it has crippled performance. Will we now get cheaper-priced Max-Q Jokebooks due to the inferior silicon and lower performance vs. the same-named desktop cards, or is Nvidia back to their disgusting practice? Yeah, I know these are rumors, but still.... @Mr. Fox

    1080 Max-Q = 1080 on GP104
    2080 Max-Q = 2080 on TU104
    3080 Max-Q = 3070 on GA104 / Desktop 3080 on GA102

    And there's also a big possibility that NVIDIA utilizes slower GDDR6 dies rather than GDDR6X to power the RTX 3080 Max-Q graphics card. Damn nice!

    The GA102 GPU is not expected to come to mobile form factor, although NVIDIA has made a commitment to offer desktop and mobile SKUs with similar specifications. In reality, though, those SKUs are never the same, spec and performance-wise since they have different TGP targets.

With this move, Nvidia can come up later with a Super refresh on the GA102 silicon, and change to 10GB of GDDR6X VRAM (but at slower speeds than the desktop RAM) as on today's desktop cards, while Nvidia already prepares more and faster VRAM for the desktop refresh. Yeah, Nvidia continues to treat notebooks like unwanted stepchildren! On top of behaving like liars (NVIDIA has made a commitment to offer desktop and mobile SKUs with similar/same specifications). What a letdown and what a betrayal if this is correct.

    Yeah, I expect my predictions on this Mess will come true, LOOL
     
    Last edited: Oct 25, 2020
  43. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
Yeah, I do think it's very possible that GA102 will stay a desktop-only chip, as much as I'd like to see the biggest die come to laptops.

    The question is, how will Nvidia nerf the 3070 mobile now that it shares the same GA104 die as the 3080 Max-Q? Maybe GDDR6X GA104 for the 3080 Max-Q and GDDR6 GA104 for the 3070 mobile.

    The simplest thing to do would be to completely boycott the 3080 if that comes to fruition. And for that we need the help of reviewers if possible.
     
    Papusan likes this.
  44. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
You mean they can charge an arm and a leg more if they use GDDR6X for the 3080 Max-Q? Could be. But I think Nvidia would rather use those VRAM dies as a nice way to lure people into the refresh :) Or maybe a swap to GA102 for the refresh is enough? :rolleyes:

    Though they do have SK Hynix's GDDR6 memory rated at 1.2V with speeds of 12 Gbps up their sleeve. Aka lower-powered/slower dies than the desktop variant.

    I can't see how a low-powered GA104 with lower clocks than the desktop 3070 is a proper upgrade over the 2080 Mobile, if that is what the 3080 Max-Q SKU will be.
     
    Last edited: Oct 25, 2020
  45. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
I'm not sure it really matters; as long as it's some iterative improvement over the 20x0-series notebook silicon, people will buy it. I mean, there really won't be a choice for people buying a NEW notebook, it'll all just be whatever TDP they can fit from the 30x0 parts that are available. If you're looking to upgrade from your current Pascal or Turing notebook, I guess the question is whether the improvement across the board (CPU, features, panel, GPU, SSD), as an entire package, is worth it.
     
    Papusan likes this.
  46. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
Well, Nvidia is rumoured to switch back to TSMC 7nm for their Super refresh, so it'll be interesting to see what they actually do.

    The Super refresh is a brilliant idea to sell more cards later in the life cycle by holding back the performance of the launch cards, and I hate how brilliant it is.

    That said, I'll be watching from the sidelines this gen; it's going to be a bumpy ride.
     
    Mr. Fox and Papusan like this.
  47. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
Criminals have been known to be some of the smartest and most creative people alive. They simply abuse their God-given abilities by using them for nefarious endeavors. Technology leaders and decision-makers seem to include some of the best examples of truly brilliant criminal minds. Their actions suggest that they believe most consumers are stupid, and the actions of most consumers suggest they are correct in that belief.
     
  48. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
Maybe we will see a second round of crippled 3080s (Max-Q) as the refresh :D What is better than throwing out a cut-down GA102 as the top tier when Nvidia finds the right time to help the notebook manufacturers boost sales next time? There are several ways to screw the notebook users. They will buy whatever they are offered as long as it performs slightly better than its predecessor.

    The SKU branded Max-Q was a huge scam! And the sad part.... some still don't see it. Nvidia just can't differentiate the Max-Q and Max-P silicon. Both are crippled vs. the desktop cards. The only difference will be higher TGP and slightly higher clocks for the Max-P variant. See... back to the old days, but now with two mobile cards instead of one, and with minor differences between them.

    I wonder why so many people out there still don't feel, or can't see, that they are being scammed by greed :confused:
     
    Last edited: Oct 25, 2020
  49. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
  50. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Max-P versions (3070/3080) will probably be only in the big three (Area-51m R3, Clevo X170SM-G, MSI GT76)

    I could see 3070 Max-P in thinner 17" models too. Depends how good they can bin them.
     