The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    New Nvidia 900m series flagship

    Discussion in 'Gaming (Software and Graphics Cards)' started by DataShell, Aug 7, 2015.

  1. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
  2. Porter

    Porter Notebook Virtuoso

    Reputations:
    786
    Messages:
    2,219
    Likes Received:
    1,044
    Trophy Points:
    181
    But the article points back to this thread as its source o_O
     
    Seanwhat likes this.
  3. Perseus17

    Perseus17 Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    1
    Trophy Points:
    6
    Gotta keep the faith.
     
    hmscott likes this.
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Repost
     
  5. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Hope springs eternal.
     
    hmscott likes this.
  6. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
    Is MXM dead?
     
  7. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    They don't just list the speculation thread as a source, they put "Own" right above it. That TDP range sounds ridiculous though... 200W TDP?!?!? Guess we won't need to use our heaters in the winter anymore :p
     
  8. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    All speculation. More speculation. I don't doubt there will be a new "flagship" Maxwell GPU, but it won't be a mobile 200W card, and it will be MXM. Making a flagship card soldered-only is dumb. Besides, Clevo and Alienware build their own MXM cards, so what's to keep them from making an MXM one?
     
    Mr Najsman likes this.
  9. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It'll probably be GM204 with more cores (up to 2048), giving us about 30% more performance than the 980M.

    It would be interesting to see a hybrid card with HBM.
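
    [A back-of-the-envelope check of J.Dre's ~30% estimate - an illustrative sketch, assuming performance scales linearly with CUDA core count at equal clocks, and using the 980M's public 1536-core spec:]

        # Rough scaling check: rumored full GM204 vs. shipping GTX 980M.
        gtx_980m_cores = 1536  # GTX 980M (cut-down GM204)
        rumored_cores = 2048   # full GM204, as speculated above

        # Linear-with-cores scaling is optimistic, but fine for a ballpark.
        speedup = rumored_cores / gtx_980m_cores - 1
        print(f"~{speedup:.0%} more shader throughput")  # ~33%, i.e. "about 30%"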
     
  10. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Alienware is back in the MXM game? Am I missing something? Also, they don't build them; rather, they contract someone to make them. MSi builds its own modules, and most vendors contract them as well. As far as I know Clevo doesn't build its modules on its own but relies on someone else (ASUS/Pegatron); not very certain though.
     
  11. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Maybe the R2's will be. Perhaps they learned from their mistakes. Doubt it, though. :p
     
  12. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    more cores for video editing and 3d work. yummy
     
  13. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    ++msi
     
  14. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    What nonsense. Remember how the Maxwell 860M was forced solder-only...? You think they won't do it again? The ODMs are at the mercy of nVidia, especially with AMD being out of the game so long... They will do what nVidia says. Of course, if you need proof of this, ask @Prema about the system BIOS overclock block he discovered.
     
  15. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I'm in touch with Prema quite a bit. I know a lot of that crap that goes on, between him and a couple other contacts. But to have 970m, 980m MXM and then 990m soldered only? Whaaa!? Then again, I wouldn't put it past Nvidia these days. That would just be icing on the cake. You want 990m? You gotta buy a new laptop. No MXM for you!
     
    Last edited: Aug 9, 2015
    Ethrem likes this.
  16. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Those idiots selling 990M pre-orders for 1500 EUR are going to be in one hell of a mix up if it's soldered, lol.
     
    moviemarketing likes this.
  17. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    Alienware is no longer manufacturing MXM cards.
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    All current high-end laptops would need an all-new motherboard design. I don't see that going over well with the OEMs that offer those laptops.
     
  19. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Clevo is launching their new products at the end of this month with the 970M & 980M, so we know that it must be MXM. Unless they plan to launch an additional lineup, never seen before, it would be suicide for the company to invest in obsolete hardware/processes.

    Board meeting: "Hey guys, I have a great idea. Let's bankrupt ourselves. Sounds cool, right?"

    I imagine NVIDIA will slowly transition into another type after 2016 (e.g. MXM 3.0c / 3.1 or something).
     
    Last edited: Aug 8, 2015
  20. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yeah, I just don't see it happening. The 860M thing was so they could exhaust their Kepler chips.
     
  21. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    The days of unsoldered chips didn't end with nVidia and Intel at the same time for no reason.
     
  22. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Alienware builds nothing.
    Compal and Wistron build the Alienware models.
     
  23. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    Ah, I thought that the other thread about the new chip was dead. @Ethrem might as well lock this one since it's basically a repost.
     
  24. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    thegreatsquare, there isn't only one path for GPU upgrades, and there are a lot of MXM slots to fill with upgrades.

    I doubt Nvidia would leave money on the table; they will provide SLI/single MXM upgrades.

    Nvidia can come up with MXM 985m's that SLI faster than a single 990m :)
     
  25. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    985m in sli with a 17 inch 4k screen and skylake cpu with some pcie 3.0 ssd drives? sold!
     
    hmscott likes this.
  26. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Phase, first, 4K wouldn't be fun to feed in intense games - low FPS - at least until 2016+. Use DSR with a 1080p internal screen for now for the best gaming experience.

    18.4", it's bigger :)

    PCIe lanes in Skylake are divided strangely: the CPU's 16 lanes go straight to the graphics card, while everything else hangs off the chipset, whose lanes all share one DMI 3.0 link (equivalent to PCIe 3.0 x4) into the CPU. Maximum shared throughput over that link is about 3.5 GB/s - down from ~4 GB/s theoretical - and that's with only one device; more devices feeding it will incur more overhead loss.

    First gen Skylake will be fun, even better in some respects, but not fully realized.

    Unless it is, but we won't know until it ships ;)
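
    [Those ~3.5 GB/s and ~4 GB/s figures line up with DMI 3.0 being electrically a PCIe 3.0 x4 link. A quick check - illustrative sketch; the ~11% protocol-overhead factor is an assumption, not an Intel spec:]

        # DMI 3.0 throughput, back of the envelope.
        lanes = 4             # DMI 3.0 is electrically PCIe 3.0 x4
        gt_per_s = 8.0        # PCIe 3.0 signaling rate per lane (GT/s)
        encoding = 128 / 130  # 128b/130b line encoding

        theoretical = lanes * gt_per_s * encoding / 8  # gigabytes/s
        print(f"theoretical: {theoretical:.2f} GB/s")  # ~3.94 GB/s
        # An assumed ~11% packet/protocol overhead gives the oft-quoted ~3.5 GB/s:
        print(f"usable:      {theoretical * 0.89:.2f} GB/s")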
     
  27. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    The new chip is meant to fill the gap for the most powerful cross-platform engine... UE4... Its rendering is optimized around data from the last frame... SLI won't ever be officially supported, and everything else will be an extremely buggy hack...
     
  28. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Don't forget the 62 FPS cap built into UE4.

    UE4: It seemed like a good idea at the time...

    Genius. ;)

    Does the UE4 engine support SLI?
    https://answers.unrealengine.com/questions/21746/does-the-ue4-engine-support-sli.html

    No SLI support with Unreal Engine 4
    https://forums.geforce.com/default/topic/812392/no-sli-support-with-unreal-engine-4/
     
    Last edited: Aug 11, 2015
    TBoneSan and Ethrem like this.
  29. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    This quote doesn't make any sense:

    Don't a lot of games rely on data from previous frames, and why can't that info be shared between cards?
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    No, actually. It's how AFR works. Tech that uses previous-frame frame buffer data needs to know what every single frame did in order to work. The frame buffer of GPU 1 is not the same as GPU 2's because they render different frames, so tech like SMAA's temporal filter and MFAA cannot be activated, as they'd be "guessing" where to work from on every frame. nVidia made TXAA work in SLI though, so I don't know why MFAA won't. Silly nVidia.

    Also, the frame buffer data is too much to transfer over the SLI bridge. Octiceps can explain more about it. Know how AMD's XDMA tech on the R9 290/290X/390/390X/Fury/FuryX can work in CrossfireX without a crossfire connector? That's what needs to happen for frame buffer data to be transferable quickly enough; otherwise PCIe communication is also too slow. nVidia cards aren't designed for it. AMD cards theoretically can do it. Also, doing this method of rendering (which would be SFR) would kill the vRAM access bandwidth improvements of multi-GPU configurations. Maybe that's why HBM is becoming a huge thing right now, coinciding with DX12/Vulkan.
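
    [The AFR problem in a nutshell - a toy model, not real driver code: frames round-robin across GPUs, so the previous frame's buffer always lives on the other card.]

        # Toy illustration of Alternate Frame Rendering (AFR).
        NUM_GPUS = 2

        def gpu_for_frame(frame):
            """AFR hands out frames round-robin across the GPUs."""
            return frame % NUM_GPUS

        for frame in range(1, 5):
            gpu = gpu_for_frame(frame)
            prev = gpu_for_frame(frame - 1)
            # Temporal techniques (MFAA, SMAA's temporal filter) need
            # frame-1's buffer, which with 2 GPUs is always on the other card:
            print(f"frame {frame}: GPU{gpu} renders, previous frame sits on GPU{prev}")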
     
  31. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    When did UE4 get a 62fps cap? I am dead certain UT4's alpha ran at ~120fps for me when I tried it last year
     
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It's probably a simple .ini file change. Many UE3 games on PC also defaulted to a 62 FPS cap. Hardcoded numbers are usually 30 and 60.
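
    [For reference, the setting octiceps is describing: in a stock 2015-era UE4 install, the frame-rate smoothing defaults in BaseEngine.ini look roughly like this - exact keys vary by engine version, so treat this as a sketch:]

        [/Script/Engine.Engine]
        bSmoothFrameRate=true
        MinSmoothedFrameRate=22
        MaxSmoothedFrameRate=62

    [Setting bSmoothFrameRate=false, or raising MaxSmoothedFrameRate, in a game's Engine.ini lifts the 62 FPS cap, which is why so many UE3/UE4 games could be uncapped with a one-line edit.]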
     
  34. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I hope you're right. I don't know for certain. I want the new Doom at more than 60 FPS, otherwise it's going to spoil our reunion :p
     
    TomJGX, moviemarketing and D2 Ultima like this.
  35. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    Yeah, SLI was a neat trick, but I would take a single powerful GPU over it 999 out of 1000 purchases

    Sent from my Nexus 6 using Tapatalk
     
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That's why you buy that powerful GPU...
    THEN SLI
     
    TomJGX, Mr Najsman and hmscott like this.
  37. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    The benefits of SLI are still there; you just have to look harder to find them lately

    Sent from my Nexus 6 using Tapatalk
     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Oh believe me, I know that better than most. I've long since stopped telling people to SLI anything under a 980Ti. If you can buy a second 970, sell your current one and buy a 980Ti. If you're aiming to buy two 980s, get one 980Ti and save up for a second one. I'm seeing more and more games barely taking advantage of or giving low returns for SLI, and all the UE4 games that'll be coming out soon will never support SLI either.
     
    TBoneSan likes this.
  39. M0rdresh

    M0rdresh Notebook Consultant

    Reputations:
    0
    Messages:
    115
    Likes Received:
    34
    Trophy Points:
    41
    I'm having a hard time seeing how, in this day and age, a very popular engine fails to support multi-GPU rendering. Ever since the moment I slotted in two 3DFX Voodoo 2 12MBs, I've been a strong believer in multi-GPU rendering. I hope Nvidia gets creative and finds something other than AFR; that seems to be the direction suggested by the DirectX 12 multi-GPU rendering features (such as stacked VRAM and both GPUs acting more as one), if I'm not mistaken.
     
    hmscott likes this.
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Exactly, AFR is the problem. It's a hacky, inefficient way of doing multi-GPU from the '90s. SLI/CrossFire worked fine for some years after release, when all games were relatively simple DX9 affairs, but it just doesn't mesh with modern engines anymore.
     
  41. M0rdresh

    M0rdresh Notebook Consultant

    Reputations:
    0
    Messages:
    115
    Likes Received:
    34
    Trophy Points:
    41
    I wonder what Nvidia's stance is on that. Having the developers of one of the most popular engines openly state on their website that one has to forgo SLI to enjoy the Unreal 4 engine has to create some buzz in the boardrooms at Nvidia HQ, no?
     
    TBoneSan and hmscott like this.
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah, that is curious, isn't it? Esp. considering Epic is one of Nvidia's biggest partners
     
  43. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Why are there two threads speculating about NVIDIA GPUs this year?
     
  44. Gunn

    Gunn Notebook Enthusiast

    Reputations:
    0
    Messages:
    28
    Likes Received:
    14
    Trophy Points:
    6
    Unlike Epic, Notebook Review supports SLI technology for its threads, perhaps?
     
  45. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I thought about merging them, but the OP here contains the information on the rumored GPU, while you have to go to page 70 or so of the other thread to get it.

    nVidia is likely not bothered at all by the loss of SLI support in UE4, by the way. It forces people to buy one more expensive video card rather than two cheaper cards. It's a win for nVidia.
     
    DataShell likes this.
  46. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    To me it feels like another nail in the coffin for SLI as a legitimate setup or upgrade path. It's a loss for Nvidia as far as I'm concerned, since it's another reason for me not to SLI my 980 Ti. I know I'm not the only ex-SLI fan who feels this way. SLI has felt more illegitimate than ever since Maxwell released. Frame pacing, scaling issues, and a lack of support from both Nvidia and devs are at the heart of the problem.
     
  47. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    SLI and CF are buggy disasters these days. I miss the Voodoo days. But still, it is more likely that a gamer will spend the extra money for a more powerful video card, especially considering how much horsepower that engine actually uses when it is pushed, or they'll have to put up with poor performance. Time will tell. They managed to get Ether One working at mostly 30FPS on the PS4 using UE4 so it may all be about the optimization.
     
  48. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah, I agree. I was playing MGSV Ground Zeroes the other day. It worked fine, except for some inexplicable drops to about 45-50 fps that were fairly common. My utilization wasn't going up, I wasn't maxing the CPU or GPU... I was sitting at around 70% GPU util, maybe 80% sometimes.

    Solution? Overclock the GPUs. Boom, instant 60 fps flat. Apparently that game won't use over 70-80% of each GPU, so OCing them granted extra power to the game. Silly. UE4, Unity 5, id Tech 5... none of them like SLI. Who knows how Source 2 is going to like it. Maxwell has features that are disabled in SLI, and it works worse in SLI without custom vBIOSes.

    I've been telling everyone: if you plan to SLI or Crossfire any card, buy a 980Ti first. The only person I let off the hook is someone who owns a 980 and wants to buy another 980, because selling a 980 to buy a 980Ti is kind of... I can understand not wanting to do that.
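
    [The overclocking arithmetic there checks out as a toy model - illustrative sketch; the clock speeds are hypothetical, and only the ~50 fps and ~70-80% utilization figures come from the post above:]

        # If a game self-limits GPU utilization, frame rate roughly tracks core clock.
        stock_clock = 1038  # MHz, hypothetical stock boost
        oc_clock = 1250     # MHz, hypothetical overclock
        stock_fps = 50      # observed dips at ~70-80% utilization

        oc_fps = stock_fps * oc_clock / stock_clock
        print(f"~{oc_fps:.0f} fps after overclock")  # ~60 fps, matching the anecdote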
     
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Why? I totally understand it. @TBoneSan did this; he got rid of 980 SLI for a 980 Ti. GM200 is exactly 50% bigger than GM204 any way you slice it, so at identical clocks it is 50% faster. A 50% tangible improvement is more than can be said for SLI nowadays, what with the heat, power, microstutter, and awful driver/game support. Fewer headaches, not to mention cheaper.
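
    [The "exactly 50% bigger" claim checks out against the public desktop Maxwell specs - a sketch; note the 980 Ti itself is a slightly cut-down GM200 with 2816 cores, so the clean 1.50x holds at the full-chip level:]

        # Full GM200 (GTX Titan X) vs. full GM204 (GTX 980).
        gm204 = {"cuda_cores": 2048, "rops": 64, "bus_width_bits": 256}
        gm200 = {"cuda_cores": 3072, "rops": 96, "bus_width_bits": 384}

        for key in gm204:
            print(f"{key}: {gm200[key] / gm204[key]:.2f}x")  # 1.50x across the board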
     
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, you understand it and I do too, but other people aren't so... understanding. They're more readily willing to dump a 970 for a 980Ti.

    Either way, I make sure to make it clear that everyone should aim for a 980Ti rather than a 980 (overpriced) or a 970 (broken by design). I can recommend an R9 390 over the 970 easily, but it's difficult when someone wants to buy a ~$500 GPU. There's not enough extra power to justify a 980 or R9 390X for the cost, people don't want to buy an R9 290X anymore, and the 980Ti may be slightly out of reach for them. But then I'll go "if you're buying this now planning to get another in the near future, hold off and save up" and that's a hard sell too.

    Blah, I dislike the GPU market right now. I've been seeing some people have random issues with Fiji cards too; random low utilization.

    And now a potentially soldered-only 990M that probably won't even have a notebook to go into.
     