The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    nVidia 2015 mobile speculation thread

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,655
    Trophy Points:
    931
    Probably because few laptop buyers choose this junk graphics variant, under whatever name it's sold. It's easier to cheat those customers because they may not have read or known as much about the 180W PSU Gate. Few sold = few complaints :rolleyes:. Everyone with an AW + GTX 980M knows that the GTX 980M in the newer AW15/17 is a failure without a 240W PSU.
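    For anyone wondering why 180W falls short, a rough back-of-napkin budget looks something like the sketch below. Every wattage in it is an illustrative assumption, not a measured or official figure.

        # Illustrative power budget for a GTX 980M machine under combined
        # CPU + GPU load. All wattages are assumptions for the example only.
        components_w = {
            "CPU (quad-core i7, sustained load)": 60,    # assumed
            "GTX 980M (stock, gaming load)": 110,        # assumed
            "Board / RAM / SSD / fans / display": 35,    # assumed
        }

        draw = sum(components_w.values())
        usable_180 = 180 * 0.90   # assume ~90% of the brick's rating is usable sustained
        usable_240 = 240 * 0.90

        print(f"Estimated combined draw: {draw} W")
        print(f"180W brick headroom: {usable_180 - draw:+.0f} W")   # negative = brick overdrawn
        print(f"240W brick headroom: {usable_240 - draw:+.0f} W")

    With assumptions like these, the 180W brick ends up tens of watts in the red under a combined load while the 240W brick still has margin, which is the gist of the complaint.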
     
    TomJGX likes this.
  2. woodzstack

    woodzstack Alezka Computers, Official Clevo reseller.

    Reputations:
    1,201
    Messages:
    3,495
    Likes Received:
    2,593
    Trophy Points:
    231
    Here's the thing: the gaming/laptop enthusiast community has at large wanted bigger and better graphics in laptops. So they had to ditch the 100W limitation and are thus going to change MXM 3.0b into whatever the next version will be called. With more headroom we could still technically get 75-125W video cards in a new form factor, but there will at least be more room for them to operate without throttling so badly if they can draw more power, hopefully.

    This is really a change caused in part by this community. Anyone not satisfied with this really makes you wonder whether DELL and other corporations are wrong in thinking "they do not know what they want."

    This is essentially the OPPOSITE of soldered-on garbage. They tried doing it that way to see if there was a performance gain (efficiency being the better term) - however, it resulted in an immediate loss in performance over the previous generation - so it caused quite a stir and sales were crap! Like any revolution, it takes time to settle in and absorb those changes. Now, it didn't sit well, so here comes the next revolution, until something ties it down.

    What we need is a backwards-compatible MXM 3.0b connection capable of 125-175W, and mobile chipsets that can afford that sort of power! Then you must share CPU and GPU heatsinks together in one larger heatsink, where less ground is wasted in between; this will save room for extra cooling if it is a single larger heatsink that shares resource space AND is roughly a bit larger than the combined two would have been. It will be simpler, more robust, and can share cooling. Clevo has sort of started to do it, and it works amazingly, and they have in fact been able to support desktop CPUs while doing this too!
     
    Last edited: Oct 25, 2015
    MickyD1234 and i_pk_pjers_i like this.
  3. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    If you read recent posts, Dell has said they chose BGA because MXM is not mature enough and hasn't been developed enough to continue to support it. Too expensive blah blah blah BS.

    BGA certainly has its place; it allows for thinner/lighter laptops. TBH I don't see a compelling reason for BGA to even exist though. If I need a thin, light laptop, I'm not buying a BGA ultrabook, I'm buying a Google Pixel C or Microsoft Surface Book - machines that have enough power to run any basic office applications, light gaming, and video/web surfing while functioning as a convenient tablet. I personally think Intel ought to scrap BGA altogether and scrap mobile CPUs. They should focus on desktop LGA, server, and the mobile (phone/tablet) market. The thin, light laptop market is DEAD, nada. I don't see a good reason to buy a MacBook Pro anymore - buy a freaking iPad Pro or MacBook Air.

    But I see no advantages or reasons to use BGA in desktop replacement/gaming laptops unless you are truly vain and need an Aorus or Razer notebook, because having a thin laptop while sitting stationary for hours gaming really matters. Especially when your gaming setup already takes up a lot of real estate with audio components, peripherals, massive mouse pads, etc. You can tell I think the Aorus and Razer notebooks are idiotic - all that money for something so vain that shouldn't even be a consideration when buying these types of laptops.
     
  4. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Unfortunately, it isn't dead. The FIRST things people recommend are Razer Blades, GS60s/GS70s, Gigabytes, Aorus... as long as someone isn't under the $1000 mark, in which case they get recommended a Lenovo.

    Go on basically ANY other forum - Linus' forums, OCN, /r/pcmr, etc. All the general public cares about is whether it's thin. Nobody cares whether it works properly. As long as it doesn't stutter in games, people really don't care. All the Lenovo people who tell me the laptop is great... I tell them to play a game that stresses both CPU and GPU and see how the CPU loses turbo boost. They don't care, because it plays things.

    Either people want things too cheap ($1000 or less for essentially an i5 and gtx 970 performance) or they want things extremely thin because they can barely lift 2 pounds.

    I'm not saying there are no legitimate reasons to want a machine that thin, but the fact remains and will always remain: Buy to suit your function. If your main function is "I need it to be lightweight and thin" then you either buy a weaksauce laptop that satisfies that requirement, or you buy something like the GS30 Shadow which lacks a GPU and you game at home. If your main function is gaming, then buy something to suit the hardware in the machine.
     
    TomJGX, MickyD1234, jaybee83 and 2 others like this.
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I do like thin, if only for the fact that it forces companies to optimize the form factor a bit. However, they take it to an extreme, so now they do it just to have the absolute thinnest laptop, like the Razer. "I'm thin as a dime" ... "Well I'm 0.1mm thinner than a dime"... yadda.

    Just optimize the form factor. If it has to be 21mm thick instead of 19mm but adds adequate cooling, then so be it. Just do it. But don't go 15mm thick just because you can, only to end up with a hot mess.
     
    TomJGX, jaybee83, Papusan and 2 others like this.
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,655
    Trophy Points:
    931
    The new desktop GTX 980 cards (180W+) require thicker laptops. If laptop brands still want to pair modern thin laptops with high-TDP cards, they will only make a new throttling mess. I think we will see a new type of laptop now. Remember, the AW15 and 17 normally use the same top-of-the-line graphics card from Nvidia. You can't do this anymore with those two models: high-TDP cards for thicker models with better cooling, and low-TDP cards for thinner models. Thank goodness for that.
     
    Last edited: Oct 27, 2015
  7. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I want to know more about Asus 3D Vapour cooling system. Is it hype or is it real? If it's real, then maybe we can see future laptops like Clevo ZM, DM series run cooler and even thinner.

    From what I can gather, it would be a heatsink with fluid, but the heat pipes would be fused/part of the heatsink itself. Better distribution and dissipation of heat then.

    It is a wonder that it has taken this long for a laptop OEM to implement this.

    [image: G752 vapor chamber heatsink]
     
    Last edited: Oct 28, 2015
    Prema likes this.
  8. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Currently used heat pipes already contain small amounts of liquid: the liquid evaporates at the die end, the vapour travels up the heat pipe to the fin end, where it condenses and wicks back to the die end. Would the Asus 3D Vapour Cooling System be any different from this?
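    To put that evaporation/condensation cycle in rough numbers, here is a quick sketch. The 100W load is an arbitrary example and the latent heat is the standard table value for water; nothing here comes from Asus.

        # Heat moved by a heat pipe: Q = m_dot * h_fg,
        # where h_fg is the latent heat of vaporization of the working fluid.
        h_fg = 2.26e6        # J/kg, latent heat of vaporization of water (~100 C)
        load_watts = 100.0   # arbitrary example load, e.g. a GPU die

        m_dot = load_watts / h_fg                            # kg/s that must evaporate
        print(f"{m_dot * 1e6:.0f} mg of water per second")   # ~44 mg/s

    So only tens of milligrams of fluid have to cycle per second; the bottleneck is how fast the wick can return condensate to the hot end, which is presumably what a bigger vapour chamber and thicker wick are meant to help with.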
     
  9. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I would like to know also. It seems the heatsink itself has a vapour chamber, but now the heat pipes are part of it - one unit.
     
    TomJGX and Robbo99999 like this.
  10. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    That would mean a larger amount of liquid and a larger dissipation area for it to condense again. Interesting...

    Sent from my Nexus 5 using Tapatalk
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Basically a mini All in One water cooler. Really just a hybrid of existing laptop cooling and AIO in desktops.
     
    ajc9988 likes this.
  12. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    I don't think vapor chambers and AIO water cooling can be compared here. The G752 (pictured) looks like its heatsinks are a greater percentage vapor chamber than we are used to seeing, which should equate to some sort of improvement. The cooling on the G751 is damn good as is, so I am looking forward to seeing how the G752 shapes up.

    Are you thinking of the GX700 with its dock, maybe?
     
    ajc9988 likes this.
  13. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    There is a cooling design where you have a big vapor chamber, like the one in the picture, plus heat pipes which transfer the heat to the radiators and a return pipe for the fluid. This one lacks the latter, so in my head there's only one solution - a thicker wick, since the heat pipes don't need space for the liquid (which usually sits in them). This way the wick transfers more fluid, and there's plenty of fluid because it is in the vapor chamber. That's my take on it. Pretty great designs, both the G752 and GX700; sadly I think they'll be paired with non-upgradeable components, which is a REAL shame.
     
    i_pk_pjers_i and ajc9988 like this.
  14. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
  15. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    I HIGHLY doubt that. The architectural differences between nGreedia and AMD are just way too far apart for a driver to fix. nVidia will continue to stomp on AMD until AMD gets its act together (if it ever will)
     
  16. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    You are seriously exaggerating. If anything GCN is a more well rounded and versatile architecture than Maxwell.
     
  17. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    I wasn't implying it'll be a dramatic change, just that something will change. Could be that the top AMD mobile GPU takes the lead over the GTX 970M but still pales in comparison to the GTX 980M. Could be that it's a small change that changes very little.
     
  18. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    Granted, but at what cost? Apart from async compute, the R9 M290X and the R9 M295X/M395X don't hold a candle to the GTX 970M considering how power hungry they are for the performance they give. DX12 performance has yet to be compared in the mobile space, and if we ignore professional compute power (since most of the people paying attention to this subject would be mobile gamers), Maxwell spanks GCN in DX11 through and through. No driver will ever fix that.
     
    i_pk_pjers_i and Robbo99999 like this.
  19. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Their mobile implementations are lazy and they don't care as much as they should. And I agree on DX11 drivers, honestly if I ever was rich enough to buy AMD I would sack the entire driver team and start fresh. They've been incompetent for years and they let down the hardware division.

    But there is not some great divide between NV and AMD like there is with Intel and AMD on the CPU side. All it takes is one slip up from NV and it's 3-4 years ago again with AMD as top dog in the graphics space.
     
    TomJGX, TBoneSan and i_pk_pjers_i like this.
  20. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Not a chance. nVidia would just release cards that push every component on them to the limit so that they fail in a year or two (or even 6 months) - they've done it before.....
     
    triturbo and TomJGX like this.
  21. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I was under the impression that AMD GPUs tend to be far more powerful/capable than Nvidia's in professional software.
    Now, if all you are comparing is gaming performance, and efficiency based on that alone, then yes, AMD is definitely lacking, and they haven't released anything new (except rebrands) in mobile (or even desktop, sans their Fury line, which actually puts them relatively on par with Maxwell, more or less, gaming-wise).

    However, I was under the impression that a GPU should be evaluated on ALL of its capabilities,
    not just gaming performance.
    After all, there are plenty of consumers who use pro-grade software.

    I just find it a bit unfair to compare AMD and Nvidia on the gaming side alone and say which company is better based on that.
    GCN as an architecture also seems to age better than Nvidia's do (at least on the desktop end), so AMD seems more focused on the longevity of their products - though if I'm not mistaken, the 7970M experienced a lot of problems and failures, did it not?
     
  22. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    After Fermi, nVidia cards took a performance hit in pro apps and in compute in general. AMD still has the upper hand there, but in pure gaming nVidia is winning. However, their Fiji-based cards are doing well against the big Maxwell GM200 chips. We just need something in the mobile sector, which should happen in 2016.
     
    TomJGX likes this.
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, to make it clear, AMD's cards were always stronger. They just can't driver. Without drivers pulling out their full potential, GPUs might as well be scrap metal. AMD's drivers get slowly better over time, so you see their cards getting "stronger". It isn't that the cards are getting stronger... they just couldn't use their power before. It's why they fly in DX12, even under the 31+1 compute that nVidia is limited to. They can simply render to their full strength and their drivers aren't such a limiting factor anymore. Teraflops and such are a stupid way to compare GPUs, sure, but make no bones about it: they are a legitimate compute measurement. If they can render at, say, 300 gigatexels and 1000 gigapixels per second, and their drivers only let them use 200 gigatexels and 500 gigapixels, and nVidia can max at 250GT and 600GP, then what happens if their drivers don't block things off? Good things happen. That's why AMD is being so sought after by idiots who think DX12 is literally right around the corner, when in reality, by the time it's in full swing, we might be ready to launch the GPU that's two generations above what we have now.
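    Taking those made-up fill-rate numbers literally, the comparison looks like the sketch below. Every figure is hypothetical, pulled straight from the example above, not from any real GPU spec sheet.

        # Hypothetical fill-rate comparison: hardware peak vs. what the driver
        # actually lets through. All numbers are made up for illustration.
        amd_peak    = {"gigatexels/s": 300, "gigapixels/s": 1000}
        amd_driver  = {"gigatexels/s": 200, "gigapixels/s": 500}
        nvidia_real = {"gigatexels/s": 250, "gigapixels/s": 600}

        for metric in amd_peak:
            utilization = amd_driver[metric] / amd_peak[metric] * 100
            print(f"{metric}: AMD delivers {amd_driver[metric]} of a possible "
                  f"{amd_peak[metric]} ({utilization:.0f}% of peak); "
                  f"nVidia delivers {nvidia_real[metric]}")
        # Remove the driver bottleneck and the 300/1000 peak beats 250/600 outright.

    In other words, the driver-limited card loses on paper even though its silicon has more raw throughput, which is the whole argument.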

    That being said, if AMD's Arctic Islands cards can crush what we currently have and match Pascal next year while still "underperforming" on drivers, then when DX12 actually does come out and gets used a bit, nVidrosoft might find that their "build to suffice" attitude is biting them in the butt =D.
     
    TomJGX and jaybee83 like this.
  24. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    AMD did mention that Arctic Islands will have twice the performance per watt compared to their Fury line (which is almost neck and neck with Maxwell - a 10W difference in some games - TDP-wise).
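    If that 2x perf-per-watt claim holds, the arithmetic works out roughly as below. The baseline frame rate and board power are placeholders, not benchmark results.

        # What "2x performance per watt" would mean, with placeholder numbers.
        fury_fps   = 60.0    # placeholder frame rate for some game/settings
        fury_watts = 175.0   # placeholder board power for a Fury-class card

        fury_ppw   = fury_fps / fury_watts
        arctic_ppw = 2 * fury_ppw

        # Either the same frame rate at roughly half the power...
        print(f"{fury_fps:.0f} fps at ~{fury_fps / arctic_ppw:.0f} W")
        # ...or the same power budget delivering twice the frame rate.
        print(f"{arctic_ppw * fury_watts:.0f} fps at {fury_watts:.0f} W")

    Halving the power for the same performance is presumably what would make a Nano-class chip plausible in a laptop power envelope, which is where the rest of this post goes.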

    I would imagine that a lot of AMD's power draw might come from their ability to surpass Nvidia in professional software.
    Hence the high shader counts, which do not readily translate to gaming performance (but evidently, it's a completely different ballpark in pro software).

    As for their drivers... yes, it was pointed out that at DX11 and below their drivers lack the same type of optimizations that Nvidia puts into their own, but at the same time Nvidia also plays dirty by using proprietary GameWorks libraries, while AMD freely released TressFX (which is open source and incidentally BETTER in visual quality and less resource-intensive than Nvidia's equivalent), which Nvidia was then able to optimize for.

    Granted, Nvidia has more cash behind it, so they can afford better driver optimizations... though after all this time AMD really has little to no excuse on that end - the upcoming driver release might change things on this front.

    Regarding DX12... it may not be around the corner; however, given the choice between Nvidia and AMD, I'd pick the AMD Fury Nano.
    Preferably in laptop form, exactly because it's more DX12-ready, already has HBM, and I usually keep my laptops for extended periods of use (my current Acer is 7 years old, after all).

    It's actually sad that AMD isn't pushing the Nano for mobile. It probably can be done, and it would likely give them a needed edge to boost their sales and keep them visible in the laptop arena while also matching, if not surpassing, the desktop 980 (and it would still offer more than good enough DX11 performance).
     
    TomJGX and triturbo like this.
  25. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    AMD has promised so many things so many times... And they haven't delivered since the Athlon 64 on the CPU front and when it comes to video cards, sure they've traded blows with nVidia... But as @D2 Ultima said...... They can't make a driver to save their lives... and they REALLY need to.
     