The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    How will Ampere scale on laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    There is nothing they can do in the short term; Ampere was designed with the current power limits, and they can neither put that power in a laptop nor get the performance without it. While I detest them applying desktop nomenclature to a mobile counterpart that does not share the same performance, I don't have any other specific gripe other than I'm not getting what I want. Or at least no other gripes if we leave out their business practices, the ongoing duopoly, all that fun stuff. As hfm implies, the performance we can have in a laptop is not so bad; it was only a short time ago that playing a game in HD with acceptable performance on a lappy was an impossible dream. Now I'm playing the latest greatest game on a two-year-old laptop and griping about it being the first time I had to step down from the monitor's native 2K. It's not the worst place mobile gaming has been.

    I guess where I'm going here is that at some point you have to kick the dog and move on. Ampere is what it is and we're not changing it. Better if we could define it and get some useful information out of that; upgrade paths and that sort of thing.
     
    dmanti likes this.
  2. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    All too many don't know what they pay for. Hence the Max-Q still has a life.
    Nvidia wasn't forced to go back to old sins and use the middle-tier silicon, and now also ship with lower-spec VRAM for their mobile flagship.

    But I expect they get the same profit using the cheaper cut-down silicon for 3080 Mobile cards. Samsung's meager 8 nm yield is said to be responsible for the Nvidia GeForce RTX 3000 shortage

    https://twitter.com/PremaMod/status/1081899577882566657

    Nvidia just doesn't want gamingbooks to have a longer lifespan than they absolutely need to. That's bad for their business and bad for their partners (notebook manufacturers).

    If the notebook manufacturers continue making thinner and slimmer gaming models, this will of course increase profit for them both. Nvidia will cheer this attitude on for all it's worth.
     
    Last edited: Dec 13, 2020
    Normimb, jclausius and seanwee like this.
  3. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Yeah, couldn't agree more. Profit is what these companies are interested in, and they're expected not just to make money and be profitable; according to the scam market, they must do better and better every year. It's like they're on a delirious drug bender all the time. So if you play that game, you're going to resort to lots of scams and deceptions to get it done; it's really the only way it's possible. You see it everywhere: cellphones, laptops, cars, etc.
     
    seanwee and Papusan like this.
  4. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Don't disagree with a word of that, but nevertheless we are here making do with what our corporate overlords have delivered. I'm all for tilting at windmills, and everybody loves a good bichfest, but we've still got to buy hardware in the real world. I'd like to do it with eyes open while giving the boy wonders on the corporate jet as little champagne as possible. That's why I'm here on a thread about how well Ampere will scale on laptops.

    I'm definitely up for joining in on the "Hey, let's beat the crap out of those Nvidia corporate scumbags" thread if someone wants to start one.
     
  5. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    No one is boycotting it. No one will care except people who are upgrading; people just buying new out of the gate won't give any mind to its performance vs. the 20 series or anything else.
     
    joluke likes this.
  6. Ifrin

    Ifrin Notebook Evangelist

    Reputations:
    308
    Messages:
    377
    Likes Received:
    397
    Trophy Points:
    76
  7. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    The thing is, the big majority of laptop users want to play games while on battery, so it's not exactly Nvidia's fault. Nvidia just releases the product that the majority of people want to buy. People like us who want true big powerful laptops and use the AC adapter most of the time are a minority that the whole laptop maker industry is willing to ditch. Sure, they could make large and thin laptops, but the pool of potential buyers for large laptops is so small that it's probably not profitable enough. When you ask in focus groups and polls, battery life for laptops always comes first, which sucks because battery life and gaming performance always pull in opposite directions. In the end we get what we ask for, no more, no less.

    Interesting, though the rumor mostly talks about the Ryzen CPU.
     
  8. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    When reviews show that 3080 and 3070 laptops perform similarly but the 3070 one is a lot cheaper, the results will speak for themselves.
     
  9. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    You mean the mobile 3080 and the mobile 3070 will perform similarly? Why do you think that?
     
  10. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    I find it very hard to believe that most gaming laptop users "want only to play on battery"; that's ridiculous, tbh. If they do, it's really simple games, not heavy-duty 3D games; so solitaire, Candy Crush, etc. "Nvidia only gives what people want"? Sheesh, that's not believable either. It's a one-way street for the most part. I mean, I guess if you turn off ray tracing they probably CAN get desktop performance out of a laptop GPU. Thin and light is very subjective; if you travel a lot with it, sure, make it light/small, but for normal use who cares, as long as it's not a desktop.
     
    jclausius likes this.
  11. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Because of diminishing returns.
    In their current form, laptops are severely limited by thermal constraints. While it's accurate that OEMs CAN design a decently cooled unit to allow the hardware to reach its maximum without hitting ridiculous temperatures or being too loud, they usually opt for cheap cooling that isn't adapted to the specific hardware.

    Also, an article was published earlier this year on how the performance/price ratio increases or decreases with given GPUs.
    It was determined that the mobile 2070 was basically the highest you could go without hitting too much of a price bottleneck... past that point, the performance gains with, say, the mobile 2080 and 2080 Ti were in the range of maybe just above 10%, but the cost was much higher.

    It's a similar thing when you overclock a CPU too far, only to discover that you end up with barely any performance improvement, or worse, less performance (for a much higher power expenditure and, of course, a higher price).
     
    hertzian56 likes this.
  12. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Both will be using the GA104 die from the desktop RTX 3070.
     
  13. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Yeah, but Nvidia is not stupid; they will just shut a few SMs down on the GA104 and power-limit the mobile 3070 to make it 30% slower than the mobile 3080.
     
  14. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Not likely; there are already leaks showing the mobile 3070 is in the ballpark of 3060 Ti and 2080 Ti performance.
     
  15. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Might be the sweet spot. For someone rocking an old 1080 or older, it's still a big upgrade, even if it isn't what I at least hoped for during the early days.
     
    joluke likes this.
  16. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Yeah, it's a great upgrade from a GTX 1080. Getting the performance of a 2080 Ti in a laptop would be great.
     
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Is this meant as a joke? Or maybe I'm blind and can't read the specs properly? Will we see further castrating from Nvidia... aka the mobile 3080 will get the lesser and cheapo GA104, no GDDR6X, less VRAM, crippled TGP, and crippled clocks. If this is correct, we will see capped VRAM vs. the desktop cards.

    It looks like Nvidia has gone bananas with the razor blade on mobile Ampere. Cut and cut everywhere. A nice way to push you onto the next cards before their time (capping the performance all over the place so you really don't see another option than buying their next scam).

    8GB VRAM for the next Nvidia mobile top dog :rolleyes: Just awesome!
    Acer Nitro Notebook Packing AMD Ryzen 7 5800H CPU, NVIDIA RTX 3080 GPU Listed for €1950 techpowerup.com

    Ryzen 7 5800H Joins GeForce RTX 3080 In Acer Nitro 5 Laptop tomshardware.com

    Although the Ryzen 7 5800H does come with a Vega iGPU, Nvidia's GeForce RTX 3080 (Ampere) will do all the heavy lifting when it comes to graphical workloads. It's probably the mobile version, which will likely feature cut-down specifications. Thus far, the German merchant listed the graphics card with 8GB of memory, which we presume to be of the GDDR6 type.
     
    Last edited: Dec 22, 2020
  18. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    The problem with Ampere is that it's even more power hungry than NVidia's 2xxx series, so I guess they were gonna have to make even bigger sacrifices when it came to laptops. How many sacrifices is the debate, though.
     
  19. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    According to notebookcheck, it's 8 or 16GB (or did they mean up to 16GB) of GDDR6, with a number of TDP variants - likely we'll see different configurations supplied by different vendors. No GDDR6X though.

    https://www.notebookcheck.net/NVIDIA-GeForce-RTX-3080-Mobile-GPU-Benchmarks-and-Specs.497450.0.html

    Notebook manufacturers may have shot themselves in the foot. The relentless pursuit of thin and light led to fairly severe crippling of notebook-level hardware. On top of that, corona means fewer people need to be on the go, and if they do, likely to a lesser extent. The desktop route will be looking very attractive to many people: max available power, plus no more extortionate pricing, fake upgradeability, laptop service nightmares, etc.
     
    Mr. Fox and Papusan like this.
  20. GrandesBollas

    GrandesBollas Notebook Evangelist

    Reputations:
    370
    Messages:
    417
    Likes Received:
    563
    Trophy Points:
    106
    This is where innovation becomes important. I have been channeling my disgust with BGA turdbooks into playing with an M1 Mac Mini I bought for $700. What if SoCs could bring performance without the need for a bloated discrete GPU? Just fantasy right now, but that is what is needed in the mobile laptop sector.


    Sent from my iPhone using Tapatalk Pro
     
  21. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Just look at the 1660 Ti: according to notebookcheck it's only 3% lower than the desktop version, but the 2060 is like 15-25% lower than its desktop version. That's just dishonest marketing by Nvidia; it needs to be 2060m, not 2060, for mobiles, just like Maxwell was differentiated with the "m". So if they wanted, they could have put in a full desktop version tuned for wattage, but chose not to; marketing dishonesty again. I guess they couldn't have done it because an RTX-branded card must have ray tracing and DLSS cores, which draw more wattage, idk.
     
    Mr. Fox and etern4l like this.
  22. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Oh, the regular 1660 Ti mobile is 15-25% slower than the desktop variant too. Only the 115W version comes close.
     
    etern4l and Mr. Fox like this.
  23. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    I thought it was already well established that the mobile "3080" would just be a 3070 this time around, so no GA102 die, no 10GB VRAM, and no GDDR6X. No surprises here.

    It's not a cut-down GPU; it's just a lower-tier GPU being sold as a higher-tier GPU, plain and simple.
     
    etern4l likes this.
  24. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    I find it humorous that the GTX 980M came with 8GB of VRAM when the desktop version came with 4GB, but now the desktop 3080 comes with 10GB of VRAM while the 3080M gets 8GB? Come on, nVidia... we've been at 8GB for GeForce cards since 2014; that's really crazy. This has been the slowest VRAM climb in the history of computers. From 2010-2014 we went from 1GB to 8GB, but 2014-2021, STILL 8GB!?
     
    Mr. Fox and Papusan like this.
  25. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Mr. Fox likes this.
  26. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    It will be fun to see if Nvidia has the guts to differentiate the various 3080 Mobile cards (by amount of VRAM).
    Cut down vs. the 980-1080-2080 Mobile. All over.

    If what we now see is final, then it looks dark for mobile gaming. A load of different SKUs with the same naming, one more crippled than the other. It can't be worse than this Ampere mobile lineup. And I'm still not sure we have seen the worst from Nvidia.
     
    Last edited: Dec 22, 2020
    etern4l, Mr. Fox and seanwee like this.
  27. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    I'm not talking about the Max-Q variant; I'm talking about the 80W regular variant.

    Only the boosted 115W mobile version is close to the desktop version. The regular 80W mobile version is further away from it.

    Edit: Ah, I see what's happening here. You were looking at the 1660 Ti mobile vs. the desktop 1660, not the 1660 Ti. Check again.
     
    Papusan, Mr. Fox and joluke like this.
  28. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Hopefully it's only the 3080 that's being falsely advertised. And you bet I'll be doing my best to spread knowledge about this BS.

    If the mobile 3070 is a desktop 3060 Ti and the mobile 3060 Ti is a desktop 3060, it will be bloody scandalous.
     
  29. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    NVidia not giving a single f*ck about gaming enthusiasts and clients. Nothing new

    Sent from my ASUS_I001DC using Tapatalk
     
    etern4l, Papusan and Mr. Fox like this.
  30. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    The laptop space just has a lot more normies compared to the DIY market.

    That's why they think they can get away with this sh*t. The general populace just isn't informed, and the most-watched laptop reviewers are normie as f*ck too, like Dave2D and Matthew Moniz, who review laptops like phones; discussion of performance, thermals, and all the important stuff takes a back seat, if it's even mentioned at all.
     
    Papusan, Mr. Fox and joluke like this.
  31. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    The performance difference between the mobile 3080 and the desktop 3080 will be huge; 50+% is my guess. Nvidia shouldn't call this graphics card RTX 3080. It should be called RTX 3080M or something, to clearly state in the name that the buyer should not expect the same (or even remotely similar) performance as the desktop 3080. It's interesting to see how the paths of laptops and desktops diverge. While desktop hardware is more power hungry than ever, laptop hardware is getting more power limited. The thing is, desktop hardware is not going to consume less power from now on; it will stay at the same level or go higher. So sadly for laptop users, the performance gap between desktops and laptops is going to get wider over the next few years. The upcoming 11900K already draws 250W, which is difficult to cool even on desktops, let alone in laptops.

    I'd rather have 8 GB of GDDR6X than 16 GB of GDDR6.
     
    Mr. Fox likes this.
  32. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    The new Intel 11th gen is a joke compared to AMD's new CPUs.

    Intel limits the new CPUs to 8 cores, and then we have AMD with the 5950X with 16 cores, the 5960X with 24 cores, the 5970X with 32 cores, the 5980X with 64 cores, and the beast 5990X with an insane 64 cores (128 threads).
    Personally, I hate that Nvidia has no competition and holds a monopoly, with no one to "fight" in their own territory. That's why they do what they want.

    Edit: Added info
     
    Last edited: Dec 22, 2020
    Mr. Fox and seanwee like this.
  33. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    Congrats, you're getting neither!
     
  34. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    I really think the 10900K/10850K is more than enough for gaming. The 5950X is too much for gaming, and too expensive too. Also, Zen doesn't scale well with RAM speed because of the fabric speed limit.
     
    joluke, dmanti and Mr. Fox like this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    I saw the direction things were headed several years ago and made the difficult but appropriate decision to go cheap on laptops and divert my computing passion back toward the desktop PC space. But, @seanwee is totally on-point about the ignorance of consumers playing a major role in the dumbing down of notebooks. The complicity of the minority that actually know better, embracing rubbish and accepting massively compromised trashbooks for the sake of having something cute, drives a second nail into the coffin. The normalcy of dishonesty in the ODM and OEM space, and their shameless deceptiveness and predictability in making excuses for the trash they offer, is our assurance that things will never improve.

    As I have been saying for the past 8 or so years (some of you will remember hearing it over and over, and probably remember being sick of reading it in these forums), we will all have whatever they are willing to put up with. This is a self-fulfilling prophecy that required nothing more than common sense and rational analysis to identify the writing on the wall. Welcome to the future. It sucks as bad as I expected it to. No worse and no better.

    We have only our collective selves and our friends to blame for not saying no with our wallets at the first sign of treachery. Had we allowed the compromised trash to rot on warehouse shelves and end up being sold at a massive loss, or shredded and recycled at a complete loss in order to make space for respectable products to occupy those warehouse shelves, this would not be happening.

    If we don't start doing it now, then "we ain't seen nothin' yet" and we have no idea how low they can go. The pit is as bottomless as the empty heads that made it so. Making excuses for mediocrity because it's "just a laptop" guarantees those products will remain inexcusably poor. Should you dare to speak out and call the balls and strikes, be prepared to be labeled a hater, troll, elitist, or even a racist or bigot, for having the pelotas to do so. And make no apology for it... embrace it as a compliment and take smug pride in your superiority.

    Yes, but not a funny one. The clowns that go along with it and think it is OK are the kind we should all be afraid of.

    To be forgiving of their sins and crimes is Pennywise... and pound foolish.

    Don't take it... you'll regret it. Run away while you still can.
     
    Last edited: Dec 23, 2020
    jclausius, Papusan and seanwee like this.
  36. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    I remember you saying a couple of years ago that it would be like this, and you got every single word right. It sucks, but it's true.
     
    jclausius, Papusan and Mr. Fox like this.
  37. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    Yes, "sucks" is putting it kindly. All I can say is, it's good not to be one of the lowest common denominators that contributes to the problem that curses us all.
     
    joluke likes this.
  38. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Actually, the 980M came with either 4 or 8GB, which I guess sheds some light on why, according to NBC, the 3080 mobile supports 8 or 16GB.
     
  39. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Why? What is the improvement in performance, say in some gaming benchmarks, due just to the different RAM type? As measured in practice, as opposed to advertised by Nvidia. Oh, we probably don't know.
     
  40. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Just a matter of having faster RAM vs. more RAM.

    Faster RAM will always be more useful until you don't have enough; then more RAM will be better.
     
    jclausius likes this.
  41. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Well, GDDR6X has 19% more bandwidth than GDDR6. I don't know how much of a performance increase that gives, but I do know that 8 GB of GDDR6X has 19% more bandwidth than 16 GB of GDDR6 in all games, 100% of the time. On the other hand, you only get better performance out of 16 GB of GDDR6 when a game uses more than 8 GB, which is not 100% of games and not 100% of the time.
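    For a rough sense of where a 19% figure like that comes from, here is a back-of-the-envelope sketch in Python. The per-pin data rates (16 Gbps for GDDR6, 19 Gbps for GDDR6X) and the 256-bit bus width are illustrative assumptions for the sake of the arithmetic, not the specs of any particular mobile card:

    ```python
    def bandwidth_gbps(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
        """Peak memory bandwidth in GB/s: per-pin rate x bus width / 8 bits per byte."""
        return data_rate_gbps_per_pin * bus_width_bits / 8

    gddr6 = bandwidth_gbps(16, 256)   # 512.0 GB/s
    gddr6x = bandwidth_gbps(19, 256)  # 608.0 GB/s
    print(f"GDDR6:  {gddr6:.0f} GB/s")
    print(f"GDDR6X: {gddr6x:.0f} GB/s (+{(gddr6x / gddr6 - 1) * 100:.0f}%)")
    ```

    With those assumed rates, the ratio works out to 608/512, i.e. about 19% more peak bandwidth; at other per-pin rates the percentage changes accordingly.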
     
  42. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    For me it's just a tradeoff between performance in applications that require less VRAM and being able to leverage more VRAM now or in the future. It would be good to have some performance data in hand to make an informed call.

    When is 8GB not enough? Isn't that already the case in many applications today, even in gaming? When will it stop being enough, given that the desktop 3080 ships with 10GB?

    I'd much rather have 16GB of VRAM, unless there is a large performance gap, and we don't know that. Recall that VRAM overclocking tends to have a relatively minor impact on performance.
     
    Last edited: Dec 23, 2020
  43. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Does GDDR6X overclock at all? Could this just be factory overclocking, where the gap can be closed simply by overclocking GDDR6?

    You don't need a 3080 in 100% of games. In fact, you probably only need it in 0.01% of games overall; however, you may need it to run the latest AAA titles on Ultra, in which case it would be very annoying to see the dreaded "Your system does not have enough video RAM to enable Ultra quality textures" message, especially after dropping $5k on a gaming laptop.
     
    Last edited: Dec 23, 2020
  44. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Nope. If you read it, it says the clock speeds are only 3% lower than the desktop 1660 Ti; from the specs, everything else is the same. But looking at the fps etc., it's about a 10% difference, though in some of the benchmarks the mobile beats the desktop.

    https://www.notebookcheck.net/GeFor...ce-GTX-1660-Ti-Mobile_9836_9641.247598.0.html
     
    Last edited: Dec 23, 2020
  45. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Yeah, both GDDR6 and GDDR6X overclock, though there's more to GDDR6X than just higher clock speed; see here: https://www.tomshardware.com/news/m...ls-the-future-of-memory-or-a-proprietary-dram

    Clock speed that is not backed by performance is meaningless.
     
    etern4l likes this.
  46. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    That has to be fake, because the Nitro is Acer's entry-level laptop line. The Triton or Helios would be the ones to have those parts.
     
    jclausius and etern4l like this.
  47. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Thanks, quite informative. Check this out - looks like burst mode speeds are actually the same:

    "There is a slight caveat, though. GDDR6 has a burst length of 16 bytes (BL16), meaning that each of its two 16-bit channels can deliver 32 bytes per operation. GDDR6X has a burst length of 8 bytes (BL8), but because of PAM4 signaling, each of its 16-bit channels will also deliver 32 bytes per operation. To that end, GDDR6X is not faster than GDDR6 at the same clock."
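    The burst-length arithmetic in that quote can be sketched in a few lines of Python (my own illustration, not from the article): bytes per operation = burst length × channel width × bits encoded per symbol, and the two designs come out identical.

    ```python
    def bytes_per_burst(burst_length: int, channel_bits: int, bits_per_symbol: int) -> int:
        """Bytes delivered by one burst on a single channel."""
        return burst_length * channel_bits * bits_per_symbol // 8

    # GDDR6: BL16 on a 16-bit channel, NRZ signaling (1 bit per symbol)
    gddr6 = bytes_per_burst(16, 16, 1)
    # GDDR6X: BL8 on a 16-bit channel, PAM4 signaling (2 bits per symbol)
    gddr6x = bytes_per_burst(8, 16, 2)
    print(gddr6, gddr6x)  # both 32 bytes: same transfer per operation at equal clock
    ```

    Which is exactly the quote's point: GDDR6X's shorter burst is compensated by PAM4 packing two bits per symbol, so at the same clock neither is faster per operation.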

    So I would really wait for some substantial performance analysis of Micron GDDR6X (as it should be called, given the proprietary nature of the technology) in the desktop RTX 3000 series before mourning its loss in the mobile variants. Such analysis wouldn't be trivial to do, since I guess it won't be possible to directly compare, say, a desktop 3080 with and without GDDR6X, but it would be doable, I think; failing that, we'll see just how big the performance gap between "equivalent" mobile and desktop RTX 3000 series GPUs is in practice.

    At any rate, the potential availability of a 16GB mobile 3080 variant is a welcome development in my view, particularly given that we can't expect to see a mobile 3090; this will reduce the gap in the memory capacity department at least (at a likely small performance penalty).
     
  48. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Yep, I would expect an avalanche of fake news, borderline vaporware, and furious roboscalping before ordinary consumers get to lay their hands on this stuff at MSRP.
     
    saturnotaku likes this.
  49. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Plus, I think people are going to be disappointed in the gains RTX 3000 mobile will have over the 2000 series, compared to what we've seen from their desktop counterparts.
     
    Papusan, seanwee and etern4l like this.
  50. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    I'd refrain from that speculation. Head over to the Alienware forums and see for yourself: people are happily dropping $5k for a +5% "performance upgrade" from a 2080 to a 2080S. The 2080S-to-3080 upgrade will almost surely be much more substantial, and, if it materializes, the 16GB VRAM option will appeal to many.
     
    Papusan and joluke like this.