The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    nVidia 2015 mobile speculation thread

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.

  1. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
This is all I'm seeing too. Again, I may be wrong, and I apologize for my ignorance in that regard, but I still see some significant headroom where that i7 can be further taxed. However, that's a completely different argument in itself, so I'll leave that one alone.

    EDIT:

    Though may I ask if the current gen i7s can keep up with a single 980M? o_O If not, then that's some BS. :S
     
  2. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
That would be even worse, since you're losing half of the threads on a system that's (potentially) limited already. It would be difficult to interpret the result.

To really see the difference, CPU performance needs to be changed physically, like an OC/UC, or by running the same test on a Clevo P570 at the same GPU clocks.
     
    Last edited: May 26, 2015
  3. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Depends on the game and the bottleneck.

For example, GTA V is an exceedingly heavy game on the CPU. If, however, you turn on MSAA and max everything out, you'll end up forcing a GPU bottleneck for most of the game. Since I don't use MSAA in GTA V (to keep higher framerates) and I keep grass on "high" instead of "higher" or "ultra", I get a whole lot of excess GPU power. My CPU's bottleneck then steps in during some parts of the game, where the game simply will not use more CPU power (the max is 80% via Task Manager, or 66% via my overlay/ThrottleStop's readouts... GTA V doesn't really benefit from Hyperthreading). I have discovered by talking with some friends, however, that a ~4.4GHz Intel Haswell quad-core is enough to almost never go below 60fps in GTA V if you remove the GPU bottleneck. But can the 3.4GHz 4720HQ that most of these laptops come with bottleneck a 980M? Yes, yes it can. It'll be in a rather small number of games, but it can indeed bottleneck a 980M.

If you'd asked me exactly five months ago whether a 4720HQ could bottleneck a 980M for 1080p 60fps gaming, I'd have said "no" and that'd have been the truth. But remember how last year was the "WE WANT VRAM! NOM NOM NOM" year of video games (Titanfall, Watch Dogs, Shadow of Mordor, CoD: AW, etc.)? Well, this year is the "WE WANT CPU POWER! NOM NOM NOM" year of video games (Dying Light, GTA V, more to come).
     
    jaybee83 and Mr.Koala like this.
  4. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
I'll have to give ThrottleStop another look and see if it's worthwhile, since it looks like UncleWebb is making updates for current-gen stuff.

Of course, this is why I'm strongly considering the P750ZM refresh: it has a desktop CPU which can easily achieve over 4GHz, and possibly SLI with Maxwell refresh #2.
     
  5. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
It is. And if the P770ZM SLI, unlocked Skylake refresh is a thing, then we might have a truly amazing machine. I still think a single 330W brick is not enough, though... the desktop CPUs use higher voltages by default. Bigspin has noticed his 4790S getting TDP-limited with a 65W allowance out of the box. If an unlocked Skylake CPU with a 95W rated TDP were shoved next to two 100W 980Ms (far less 980M refresh cards using the full GM204, maybe near 115W or 120W TDP each), there's no way a single 330W brick can provide enough power. And while people like myself or Mr. Fox might be fine with having two bricks, it still defeats the purpose of a "laptop" to me if I *NEED* to use two of them to properly use my notebook. Overclocking is one thing, but if I'm sitting bone stock and getting power brick shutdowns because I wanna play Crysis 3 above 60fps? Then that's a design flaw, you get me?

    Buuut I'm very interested to see where they're going. If they make another 120Hz panel-using notebook (far less a 3D model) with heaven forbid a new 120Hz 17" panel? I'd probably oogle it and die XD.
     
    TBoneSan likes this.
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
Ha. A 500W laptop power brick, lol. But it may be necessary. Although the more I think about it, I may go the desktop route myself and keep my P650SE for an indeterminate amount of time. I don't need to be as mobile as I used to be. A desktop in my main home office would be sufficient, and something like my P650SE gives me the flexibility to game on the go. Still up in the air though. As 120Hz and 4K gaming gain popularity, desktop seems to me the best route for the time being, until mobile can catch up in that race. Otherwise, sticking with 1080p/60 seems the best option for laptop gaming for the near future.
     
  7. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
Given enough buffering it should just throttle, not shut down. Shutting down due to an overloaded PSU is a design flaw even if the PSU itself is the part at fault.

    As for carrying power bricks, Clevo's habit of using relatively cheap large bricks really doesn't help here. Maybe reuse slim Dell bricks again?
     
  8. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
Cheap or not, they have high-quality 330W bricks that can deliver well over 330W of power for a decent amount of time. I'm not sure how the throttling would work there. If throttling is designed to be off during plugged-in high-performance plans/modes, I think it should exhibit no throttling and crash, rather than throttling randomly.

    I was thinking a 400W brick, actually. Should be sufficient (100W TDP CPU, 250W TDP SLI GPUs, 50W for rest of system). Of course, overclocking might be difficult at that limit, but if they could invent a 500W brick that's not too much larger than what we've got now, things should be good.

120Hz is gaining no popularity. By and large the majority still only cares about 60Hz. People were using 120Hz and 144Hz back in the day with CRT monitors and such. It fell off for a while when LCDs became the new thing, but it has always existed in one form or another, and its popularity I'm sure has remained fairly constant. I don't understand WHY it isn't becoming more popular, but I suppose it's so easy to just say "yeah, I can play this with an i5 at 60fps" etc.

Single-threaded bottlenecks become VERY apparent above 60fps (looking at you, Mass Effect 3 and Payday 2), and some games simply... hate... being above 60fps. Mass Effect is one series that has collision detection issues (namely Shepard and teammates walking near objects such as counters or tables = them suddenly standing on top of said object), or like the often-mentioned BF4/BF:H, where a constant 120fps usually requires a pretty fast CPU. Then you have games like anything-made-by-Bohemia-Interactive where you'll never see 120fps, far less 60fps sometimes. Then some games have no problems hitting 60fps but seem incapable of maintaining above 60fps, like the last two Call of Duty games. If I locked them to 60? I'd never go below 60. Leave them unlocked to their 91fps default lock? Enjoy your fluctuating framerate no matter what PC you've got! Ironically, AW's single player basically sat over 100fps 24/7 for me, which meant the MP needed some serious optimization work =D. Or Skyrim, which requires being locked to 60fps lest the physics engine break and the day/night cycle go out of sync, with NPCs thinking daytime is nighttime, causing quests to be impossible to start and/or complete because NPCs are not there at the prerequisite time; but speak to them there at other times and they'll say "come back at <right time>" etc.

So yeah. While 120Hz is a fantastic thing without question, it definitely isn't for the basic user. It also needs a good bit more CPU power. I'd consider it as much a thing for advanced users as getting the most out of an SLI machine. Basically, it's a "headache" most gaming machine owners would rather not bother with. It has benefits, but sometimes you gotta put in some elbow grease. It's not like tossing in a 980 and an i5 and 8GB of RAM and playing on a single monitor with vSync on 24/7 using GeForce Experience 1-click optimization, like pretty much every YouTube celeb with a half-decent PC has done XD.
     
  9. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    About CPU load reading:


A game could be single-thread bottlenecked. If only one thread is busy we could see a ~13% (4 cores, HT on) or ~25% (4 cores, HT off) AVERAGE load while the game is already bottlenecked. So, HT on or off, the total CPU load value would not give much useful information.

To give a silly extreme example, running an old game on a four-socket workstation might show a near-0% average CPU load. But the main thread is still capped (if it's heavy enough).


    A modern multi-threaded game would not be that extreme, but there might be some threads heavier than the others.

Thanks to the consoles' slow single-thread, high-core-count CPUs, this should be alleviated as devs get more used to them.
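The averaging effect described above can be sketched numerically (a minimal pure-Python illustration; the helper name is made up for this example, not from any monitoring tool):

```python
def average_load(per_lp_loads):
    """Overall CPU load as Task Manager reports it: the mean load
    across all logical processors (LPs)."""
    return sum(per_lp_loads) / len(per_lp_loads)

# One thread saturating a single logical processor:
ht_on = average_load([100] + [0] * 7)   # 4 cores, HT on  -> 8 LPs
ht_off = average_load([100] + [0] * 3)  # 4 cores, HT off -> 4 LPs

print(ht_on, ht_off)  # 12.5 25.0 -- low averages despite a hard single-thread cap
```

Either way the reported average looks comfortably low, which is exactly why a total-load figure can't rule a bottleneck in or out.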
     
  10. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    A tool like Open Hardware Monitor will show per-core load. If a single thread is the choke point then the core it is on will show a load of 50% or possibly a little more depending on what else the OS is doing. You can use Task Manager to set the processor affinity of processes you are examining so that they don't bounce between cores.
     
  11. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    In some cases, yes.

But a logical thread can jump from one physical core to another frequently, depending on CPU affinity settings (which we have no control over at the per-thread level). The game code can even create and destroy heavy threads on the fly.

100% load on a thread doesn't mean a CPU bottleneck either. Some (badly programmed) old games busy-loop forever with zero idle time. You could OC an i7 to 10GHz and it would still show 100% load. But most of those loop iterations are simply wasted. There's no actual bottleneck.


Without an understanding of the game's source code, there's no reliable way to analyze this from a single CPU load data dump. We need to change CPU performance physically, or somehow limit the load level evenly on every core to simulate slower hardware, and observe how the game responds to the change.
     
    Last edited: May 26, 2015
  12. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    Yes, we do. Each processor thread (logical processor) looks like a discrete processor to the OS, and we can pin processes to specific logical processors.

    Spawned threads inherit the parent's processor pinning status.

    @D2 Ultima 's specific example is Battlefield 4, a game that is known to be well-optimized for 4 core/8 thread processors.
     
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Look at the load readings in my picture here on my GPUs and my CPU. I can more than easily tell a CPU bottleneck, like here where one almost shows up via Final Fantasy XIII. The FF XIII bottleneck is not present there as you can see I'm at the 60fps lock, however it's close, and I've been both CPU and GPU bottlenecked in that game (it's not optimized at all).
     
  14. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
Which is done at the per-process level, not the per-thread level. Once pinned, all the threads from the game would fight for one core. We cannot pin the primary offender thread onto a specific core while all the others get out of the way.
     
  15. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    Yes, we can. Not with Task Manager which is pretty limited as these things go, but Process Hacker can be used to change the priority and affinity of individual threads. You could, for example, pin all processes to all logical processors except 0 and 1 (or is it 4? I don't remember how Windows maps logical processors to physical cores), start the game, and pin the thread you want to analyze to LP 0. Core 0 would then be exclusively used by that one thread and any additional threads it may spawn.
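For what it's worth, the per-thread pinning described above boils down to an affinity bitmask where bit n corresponds to logical processor n. A minimal sketch, assuming the Win32 SetThreadAffinityMask API (the helper names here are hypothetical; Process Hacker does the equivalent through its UI):

```python
import ctypes
import sys

def lp_mask(*logical_processors):
    """Affinity bitmask: bit n set = logical processor n allowed."""
    mask = 0
    for lp in logical_processors:
        mask |= 1 << lp
    return mask

def pin_current_thread(*logical_processors):
    """Pin the calling thread to the given LPs (Windows only; mirrors
    what Process Hacker does per-thread). Returns the mask applied."""
    mask = lp_mask(*logical_processors)
    if sys.platform == "win32":
        kernel32 = ctypes.windll.kernel32
        kernel32.SetThreadAffinityMask(kernel32.GetCurrentThread(), mask)
    return mask

# e.g. confine the thread under analysis to LP 0 only:
pin_current_thread(0)  # mask = 0b0001
```

A mask of 1 (bit 0) confines the calling thread to LP 0; OR bits together to allow several LPs. On non-Windows systems this sketch only computes the mask.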
     
  16. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
Was not aware of that. Thanks.

This should make it much easier to analyze.
     
  17. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    is it too late to sign up? I'm also up for that :D


let's be real here, 120Hz notebook panels have been dead for quite a long time, and I don't believe they have even the slightest chance of coming back until DX12 becomes more widespread
     
  18. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,655
    Trophy Points:
    931
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    As the owner of one of the last 120Hz notebooks to ever come out (the other being the AW17 R1 3D model), I am fully aware of this. However, again, DX12 will do nothing to bring it back. Even if it pulls load off the CPU, most users don't care and it's more trouble than a 60Hz panel as I said in the post you quoted.

    DX12 is not an end-all solution which will bring improvements to every problem that exists currently. DX12 will do ONE thing for sure: reduce CPU load and allow for increased draw calls. It MAY do another thing: allow SLI/CFX to add vRAM instead of mirror the contents. It MAY do a third thing: allow AMD + nVidia (or AMD/nVidia + Intel) cards in one system to work together rendering games for extra performance.

The "MAY" in each situation relies on many variables, such as drivers that allow multiple GPU vendors' cards to be in a system and active, whether the limitations of the SLI bridges or the PCIe speeds cause problems with adding the vRAM together, and whether one GPU won't hold back the other (such as the iGPU in an Intel CPU holding back a powerful dGPU due to its tiny and slow memory, etc). There are no real guarantees even if the framework is there, and even reducing CPU load doesn't mean the public is going to jump on things like 120Hz if they would not have before. AND devs don't even need to code in DX12. They can very well code in DX11 or even DX9 (as many games still do) and completely ignore DX12 as a whole if they don't feel it's worth learning how to code for the API.
     
  20. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Or they feel that many customers would still be on DX11 hardware.
     
  21. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
those are desktop monitors; there are no more 17.3" or smaller 120Hz LCDs in production.

ok, lemme change my wording a little bit here. maybe, maybe until DX12 becomes widespread. on the technical side, there's no mobile CPU capable of driving 120fps consistently in most AAA games. DX12 at least provides a possible solution for that. if you can't even cross that line, then it's guaranteed that there won't be any more 120Hz panels. I'm not saying DX12 can drive 120Hz panels to the market, I'm saying you need DX12 at least to make 120Hz even worth using, maybe :p

    if anything high refresh panels will probably start to come back with VR getting pushed into the market
     
  22. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
Nope, nearly everything that supports DX11 nowadays will support the D3D 12 API, which means they can still get the CPU boost.

if you think about it, if NVIDIA can say with confidence that even Fermi is getting support, will the majority be using anything more than 4 years old by the time Windows 10 comes out?
     
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Radeon HD 5000 and 6000 Series are DX11 but won't be getting DX12 support
     
  24. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    120hz only ever came to mobile for 3D. Since that was an abysmal failure, it makes sense that they are gone. I think Clevo originally had plans for a 3D SM-A and scrapped them which is why I got the screen in my machine. 60FPS has pretty much always been the target for games... At least until the consoles made us okay with 30......
     
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The 4810MQ, 4910MQ and 49x0MX as well as the 39x0XM would like a word with you XD. If the 4910MQ or 49x0MX or 39x0XM couldn't push 120fps, then no desktop CPU excepting the hexacores could, and since not all games use the hexacores, that's a pointless statement. CURRENTLY there may not be mobile CPUs that can do it since everything is all TDP-locked HQ chips. But that's a different story. They exist and the laptops they were sold with could have run them. I should know because I'm using one, even though I'd love a 4930MX or 4940MX for this instead. If it's a good chip I might be able to keep it near 4GHz without much hassle/voltage.

    As for this, the whole reason for 120Hz laptops before was 3D vision's craze. Since VR has its own panels and simply requires GPUs to be strong enough, high refresh panels do nothing for it.

    All that being said, DX12 would indeed *HELP* 120fps... but it's still a niche many devs refuse to bother coding for, and it does not make a single difference with already-existing titles. It only helps future titles for devs who want to bother with DX12, because as we all know, PC gaming gets BEST* optimization**.
True, but there have always been Clevo models which supported it without being a 3D edition. The P170HM could take the panel without being the P170HM3, and the P370EM could also take it without being the P370EM3. Just like the P37xSM could take the 120Hz panels without being a P370SM3, the P37xSM-A models could all take it, since it's basically the same board with a small revision for mSATA etc. When revising the SM-A models though, with all the 120Hz panels out of production and the dwindling interest in 3D models (note how I've never found another P370SM3 user on these forums? Well, I used to find quite a few P170HM3 owners back in 2011-2012 when I was originally trying to get a new laptop), it's no surprise the option was dropped.

    * = no
    ** = devs do not know the meaning of this word
     
  26. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
That's my point though. Their intended use, 3D, never took off. Why push 120Hz panels that no stock laptop can use to their fullest potential? It's an added cost for the manufacturer and the consumer. I find it worth it, but I'd be lying if I said I get 120 FPS out of it. Most modern 2014/2015 releases average 80, so I could have just gotten a stock panel with better color (this panel really messes up yellows) and overclocked it, saving 150 (I think?) bucks in the process.
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
Yeah, with newer games you need to legit brute-force the 120fps. It's terrible in CPU-limited GTA V for one, but even in other games it seems like 120fps requires well over two GTX 980s sometimes. Though most of the ultra-hyped games come from Ubisoft. The stuff EA churns out can mostly get near 120fps (except Dragon Age), and with the exception of CoD Ghosts I get mostly 120fps or higher in CoD single players every single time. Most other games too. Evolve was a punishing one on the GPU, but I sacrificed some FPS for higher graphics, and you wouldn't need to do the same kind of thing.

    That being said, you have a 72% NTSC gamut screen, so if yours messes up yellows that's probably just a single bad panel. I'll PM you an ICC profile to try out for it. It makes it FAR less bright and tones down the blues a lot, but I've grown accustomed to it and I like it now more than the default.

    Also, $150 was the cost of the regular screen upgrades from 60% to 72%, so you wouldn't have saved any money. That bit was a moot point.
     
  28. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    When I bought my machine the regular screen was 72% as well. They had 72% and 72% 120hz because in order to get the 120hz you had to buy 880Ms. Wish I had a copy of the old configurations page because I think that my S model actually had the 90% option too.

Do you have the same panel I do? Prema has a profile posted on TI, I just haven't tried it. Yellows only look right if viewed perfectly head-on; otherwise they fade into white if they're on a white background.
     
  29. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    All the 17" 120Hz panels should be the same... unless you don't have the -TPB1 variant. The only 120Hz ones are LP173WF2, but there's four variants. One is glossy and three are matte, and I have the TPB1. You'd have to hunt for your own specific model XD.

    The 90% model was an AUO panel and it was glossy. If the stock for the -S model had 72% and then it was +$150 for the 120Hz, then that was a bit of a rip off. These machines' standard panels were 60% gamut. I know that for sure. If you had a 72% panel as default they should not have charged extra for 120Hz. I have been looking at regular and 3D laptops and the 120Hz panels since the P170HM3 came out because I wanted 3D so bad for years. I knew all the prices down in my head.
     
  30. seamon

    seamon Notebook Consultant

    Reputations:
    5
    Messages:
    226
    Likes Received:
    36
    Trophy Points:
    41
    Any chance the GTX 990m will come with a 15" laptop?
    17 inchers are way too impractical to carry to class.
     
  31. Kommando

    Kommando Notebook Evangelist

    Reputations:
    46
    Messages:
    376
    Likes Received:
    271
    Trophy Points:
    76
    Yes, there still is a chance. We'll know when it's out.
     
  32. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    I'd say it probably will. We've had flagship gpus in 15-inchers for quite some time.
     
  33. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I'm expecting the 1080M to fit snugly inside all existing models that currently house 980Ms. I just think that the models currently with mere 180W PSUs, such as my P651SG, will need to be sold with 240W PSUs this time around.

    Sent from my Nexus 5 using Tapatalk
     
  34. seamon

    seamon Notebook Consultant

    Reputations:
    5
    Messages:
    226
    Likes Received:
    36
    Trophy Points:
    41
    The next lineup should come with the Skylake CPUs right?
     
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Maybe Clevo will start supplying a new 240W brick then, but somehow I doubt it looking back on history
     
  36. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I'm keeping my fingers crossed for that!

    I can't see how they'd manage to get a 125W(ish) 1080M to live within a 180W limit. And P65xSx series is dead to me if it doesn't come with the next x80M GPU.

    Sent from my Nexus 5 using Tapatalk
     
  37. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,655
    Trophy Points:
    931
Everything can be fixed with a "fantastic" hybrid BIOS :p. Just look at what Dell did with the new crippled AW models and what MSI did previously.
     
    TomJGX and Ashtrix like this.
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    230W brick wave of future #Clevo #PleaseNoBeDumbYouAreOnARoll
     
    TomJGX likes this.
  39. mason2smart

    mason2smart Notebook Virtuoso

    Reputations:
    232
    Messages:
    2,440
    Likes Received:
    1,353
    Trophy Points:
    181
  40. MichaelKnight4Christ

    MichaelKnight4Christ Notebook Evangelist

    Reputations:
    30
    Messages:
    431
    Likes Received:
    71
    Trophy Points:
    41
I have a feeling they will increase the vRAM on the soldered chips. At least I think they should for the next high-end refresh.
     
  41. houstoned

    houstoned Yoga Pants Connoisseur.

    Reputations:
    2,852
    Messages:
    2,224
    Likes Received:
    388
    Trophy Points:
    101
i'm in the market for a new gaming laptop and was wondering when the new generation of Nvidia GPUs is releasing. is the top GPU gonna be called the 1080M? what's a 990M?

i don't want to make the same mistake twice, because the 580M came out about a month or 2 after i bought my 480M-equipped laptop. :smh:
     
  42. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
Wait until November. If there's going to be a refresh, it will be around September, but by November we'll know what shenanigans were pulled...
     
    Mr Najsman likes this.
  43. houstoned

    houstoned Yoga Pants Connoisseur.

    Reputations:
    2,852
    Messages:
    2,224
    Likes Received:
    388
    Trophy Points:
    101
    dang. . .

    that's a long wait. :(
     
  44. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    It is. There is not a single game that wrecks the 980M... Other than TW3 and Batman AK... But turn a few knobs and you're running 60 without an overclock or any real noticeable impact. I can tell you that the rumor is the new card is almost twice as fast. But that's rumor, not fact.
     
  45. houstoned

    houstoned Yoga Pants Connoisseur.

    Reputations:
    2,852
    Messages:
    2,224
    Likes Received:
    388
    Trophy Points:
    101
    i love the 980M's performance. i just don't want to spend $3,000 -/+ on a new laptop and then a newer version, or better hardware comes out a month or 2 later.

    i already went through that with my NP8850 lol
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    With what you've got now, a new laptop with a 980M should be just fine regardless of what comes out, but I don't blame you for wanting to wait. Remember: if you can't play games and you want to play them or if your old machine is on its last legs, don't feel bad in getting something new.

Besides, it'll take at least a month or two after the new cards come out for Prema and Johnksss and svl7 and the vBIOS modders to rip into them and determine whether they're worth it or not. The 880M, so fresh in memory, is one great example of a card that, while on paper a fantastic out-of-the-box upgrade, turned out to be a terrible disaster, with only the lucky or the technically adept making it work well for day-to-day usage.
     
    Kade Storm likes this.
  47. Any_Key

    Any_Key Notebook Evangelist

    Reputations:
    514
    Messages:
    684
    Likes Received:
    316
    Trophy Points:
    76
Not going to see Pascal launch anytime soon (as in, the next few months). A "possible" refresh of the Maxwell chips to go along with the Skylake launch... if that is the case, I imagine we should be seeing leaks and mentions right around now. About the only thing you can really count on is Skylake being announced in August and launching in September.
     
  48. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    There is a Maxwell refresh coming. We don't know the details but nVidia never misses a chance to refresh when Intel launches a new CPU. My guess is GM204 in all its glory... Especially with Pascal so close.
     
  49. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    *drools*

    I needs it, I does.
     
  50. houstoned

    houstoned Yoga Pants Connoisseur.

    Reputations:
    2,852
    Messages:
    2,224
    Likes Received:
    388
    Trophy Points:
    101
actually, the only reason why i haven't upgraded yet is because my laptop is still going strong. i take immaculate care of my machines, and my laptop is actually faster than when i got it because of the SSD upgrade. it still piles through any game i've thrown at it -- albeit i haven't bought any new games for PC in a while.

i guess i'm just gonna wait til the new generation of Nvidia GPUs comes out then, because i'm not in any kind of rush. i'm definitely not making the same mistake again of upgrading too early.
     