The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    nVidia 2015 mobile speculation thread

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.

  1. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    sure, u just gotta wait long enough for your next purchase *trololol*

    Sent from my Nexus 5 using Tapatalk
     
    ajc9988 likes this.
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yep, Volta and whatever's after Cannonlake sounds about right
     
  3. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    You will wait until 2018 to buy a new laptop? =P
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No not me, but @D2 Ultima would if he wants the same magnitude of improvement as his previous upgrades
     
    D2 Ultima likes this.
  5. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    I see, yes - if you can get your money's worth out of that machine, waiting at least 4-5 years or so makes it seem not so bad when spending ~$3k.

    Or buy a new one every year and sell the old one for nearly the same price. It works with Macs, but not sure if the Clevo resale value would be as high.
     
  6. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Clevo resale value is high, however selling my machine is automatically a downgrade due to loss of 120Hz panel.
     
  7. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I said it before - there's enough room on the board (as well as all the necessary connections and power), but it needs retooling; it can't fit between the current mounting holes, and that's the problem. Those holes are set by the standard, so it also means a change in the spec before the retooling. If placed right, there would be ~2x the room for power delivery components.

    Because nGreedia fan boys are always like - "Let's wait and see what our worshiped company would bring to save us all, it would totally rock!" But if it's the other way around, oh God, run FAR away, WWIII is coming - "Haha, where's your company, it totally sucks... TROLOLOL"

    My personal "the worst" chart is:

    1. gnusmas

    A LOT OF SPACE

    2. nGreedia/grIntel
     
    Last edited: Sep 15, 2015
  8. SuperContra

    SuperContra Notebook Guru

    Reputations:
    2
    Messages:
    54
    Likes Received:
    26
    Trophy Points:
    26
  9. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    I'm actually quite underwhelmed by the "GTX 990" or whatever it eventually ends up being called. I'd be OK with 'regular' 980Ms in SLI if the price is going to skyrocket or if I'd need a watercooled machine to get the 990. An extra ~500 shaders on what is otherwise basically the same hardware isn't going to make me lose sleep.
     
  10. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    It won't need liquid cooling in Clevo machines.
     
    TomJGX and moviemarketing like this.
  11. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    From the article SuperContra posted:
    "The CEO further revealed that the TDP of the chip is going to be upwards of 100 Watts (and could be as high as 180 Watts)."

    If it goes beyond 100W, then it will require much better cooling.
    What I'm actually wondering about is the industry's quite possible double standard.
    When AMD released their current mobile GPU, which is in the area of 120W, the cooling in laptops wasn't modified AT ALL... what's more, those laptops usually came with underpowered power bricks (so you ended up with a lack of power and throttling due to heat)... and yet it seems the industry might actually consider making accommodations for Nvidia's power-hungry GPUs.
     
    ajc9988 likes this.
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yes. And 980Ms have hit 150W+ already in existing clevos in benchmarks with heavy overclocks.

    Maybe because AMD doesn't have enough market share? I remember AMD GPUs working in most of the single-card and SLI machines from Alienware and Clevo when the 7970M came out.
     
    jaybee83 and ajc9988 like this.
  13. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Starting to wonder why this thread isn't called "Nvidia & AMD 2015 mobile speculation thread".
     
    TomJGX and jaybee83 like this.
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That would require AMD to release something worthwhile, which isn't happening this year
     
    TomJGX, Ethrem, hmscott and 3 others like this.
  15. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    i can confirm this from firsthand experience: overclocked to the bleeding edge, i can make my machine draw wattage in excess of 300W. 80-85W are accounted for by the cpu, lets say the rest of the system draws a maximum of 40W, that still leaves 175-180W unaccounted for...guess which other component could possibly be responsible for that? correct! my 980M :p
    and guess something else? in that scenario i can barely make it break 70C...75C on a superhot summer day with ambient temps of 35C. thats it!
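
    (Quick back-of-the-envelope check of that math, just as a sketch; every number below is the estimate quoted above, not a measurement of any specific unit.)

    ```typescript
    // Back-of-the-envelope check of the numbers quoted above.
    // All values are the poster's rough estimates, not measurements.
    const totalSystemDrawW = 300;      // wall draw under a heavy overclock
    const cpuDrawRangeW = [80, 85];    // estimated 4790K share
    const restOfSystemW = 40;          // assumed ceiling for everything else

    const gpuDrawRangeW = cpuDrawRangeW.map(cpu => totalSystemDrawW - cpu - restOfSystemW);
    console.log(`Implied 980M draw: ${Math.min(...gpuDrawRangeW)}-${Math.max(...gpuDrawRangeW)} W`);
    // -> Implied 980M draw: 175-180 W
    ```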

    if u ask me: gimme a 990M and ill show u how it runs stock AND overclocked in my dark knight without it even breaking a sweat :cool:
     
    TomJGX, Mr Najsman, Prema and 3 others like this.
  16. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Wow, very impressive temps on the 980M in that case! Didn't realise the Batman had any better GPU cooling than Alienware (e.g. M17xR3/R4 and the M18xR1/R2) - sounds like it could well be better unless you're just lucky with your unit or you've done some crazy mods!
     
  17. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    gpu cooling was never an issue in the batman, its insanely powerful. coupled with the supercool maxwell cards, u get those kinda temps even without crazy modding :) in my case, i just repasted with gelid gc extreme :p

    the cpu cooling though, even though quite powerful on its own, needs a lil luv in order to cope with a 4790K, especially oced ;)
     
    Robbo99999 likes this.
  18. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Speaking of CPU, why does Intel use such terrible internal TIM even on K processors?
     
    i_pk_pjers_i likes this.
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The TIM isn't that bad, esp. on Devil's Canyon
     
  20. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Some reviews say 10~15C drop after delidding and switching to liquid metal. That's a lot of valuable headroom for poor Batman running close to 100C.
     
  21. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Liquid metal dropped my temps by 10c+ in many scenarios and I was using things like IC Diamond before... it's not that the internal TIM was all that bad, it's just that liquid metal is THAT GOOD.

    Also, I don't know of anyone who hit 100c+ using a P7xxZM with max fans and NOT having a warped heatsink at stock.
     
    Kade Storm, TomJGX, jaybee83 and 4 others like this.
  22. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Yep, warped heatsink on my first one and a deeply concave IHS! Both fixed now! :) But solder from the factory (as seen on E5 Xeons and above) would remove the need to chase the extra cooling. Then it would only be a matter of lapping flat....
     
    TomJGX likes this.
  23. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Cost cutting measure.
     
    TomJGX likes this.
  24. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    The difference in cost to them is negligible! Spend the little bit more and charge $10-20 extra to cover the quarter's material costs.
     
  25. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    They can easily do much better and they know it. All Xeon E5/E7 chips use solder, and delidding is basically useless there. A high thermal density "e-n-t-h-u-s-i-a-s-t" product requires good cooling more than those things, and they just don't care.

    With relatively high room temp or AVX FPU load, close to 100C should be possible on 4790K with stock multipliers. Normal stress tests won't push that high.

    Does that amount of solder even cost a dollar? And how much does a solder-applying worker cost in those Chinese or Vietnamese factories, assuming they are not replaced by even cheaper robots yet?

    Even AMD's dirt cheap low end products use solder.
     
    TomJGX, i_pk_pjers_i and ajc9988 like this.
  26. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well negligible != 0. :p

    Come on guys, my point was that if Intel can find a way to charge more for less, they'll do it. I definitely wouldn't mind paying $10-20 extra for a soldered chip. But from Intel's perspective, why spend that tiny amount more when you can charge the same and people will still buy it in droves?
     
    TomJGX and D2 Ultima like this.
  27. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Solder is tricky to use and will impact yields. The paste is good, but the gap between the core and the heat spreader is the main culprit; delidding removes the glue, which lets the heat spreader sit flatter.
     
    ajc9988 likes this.
  28. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well, the fluxless solder was introduced with P4 Prescott and lasted all the way to Sandy Bridge on the mainstream quads. One of the leading arguments (besides "Intel is cheap") for why they ditched fluxless solder is that the solder apparently cracks and fails in an unacceptable fashion on small dies.

    Original research paper: http://iweb.tms.org/PbF/JOM-0606-67.pdf

    Intel defines die sizes thusly in table 1:

    Small die: ~130 mm²
    Medium die: ~270 mm²
    Large die: ~529 mm²

    Now the problem I have with this argument is Prescott had a die size of 112 mm², some 30% smaller compared to Ivy Bridge's 160 mm². Before someone gets smart, yes the actual die space occupied by CPU cores is much smaller thanks to node shrinks, but the final die size is still bigger on Ivy Bridge due to the IGP, and that's all that matters.
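
    Quick sanity check of that size comparison against Table 1's buckets (a sketch; the die areas are simply the figures quoted in this post):

    ```typescript
    // Sanity check of the die-area comparison above.
    // Areas (mm^2) are the figures quoted in the post, not new data.
    const prescottAreaMm2 = 112;    // Prescott (P4)
    const ivyBridgeAreaMm2 = 160;   // Ivy Bridge quad-core die incl. IGP

    const shrinkPercent = (1 - prescottAreaMm2 / ivyBridgeAreaMm2) * 100;
    console.log(`Prescott is ~${shrinkPercent.toFixed(0)}% smaller than Ivy Bridge`);
    // -> Prescott is ~30% smaller than Ivy Bridge
    // Both sit far closer to the ~130 mm^2 "small die" example than to the
    // ~270 mm^2 "medium" one in Table 1.
    ```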

    So clearly Intel had no problem using fluxless solder on what was clearly a small die (Prescott), yet they stopped after Sandy Bridge, despite Ivy Bridge having a larger die size.

    You draw your own conclusions, but mine is Intel is pinching pennies.
     
    TomJGX, Prema, ajc9988 and 2 others like this.
  29. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    with Haswell's on-die FIVR, you always get much more heat at higher clocks anyway. The TIM was only a minor issue, but I do agree that Intel's gone cheaper and cheaper. Now they don't even include stock coolers for the Skylake K series
     
    TomJGX and ajc9988 like this.
  30. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    ...and they produce thinner cpu PCBs, making the vice method for delidding basically obsolete :p
     
    TomJGX and ajc9988 like this.
  31. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I don't think skimping out on a cooler is a bad thing especially for K series since probably a majority of users will use their own cooler. But yeah they need to improve that interface between the die and the IHS.
     
  32. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,755
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Get a razor blade like a man and learn how not to knick while using it!!! :)
     
  33. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    thx but no thx, sticking to vice method! *continues hammering onto his wooden block while screaming YEEEEEEEHAW!*
     
    i_pk_pjers_i likes this.
  34. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    It's not, but the savings aren't passed down to us, and that's where I have a problem with it. I mean if they included a stock cooler I could at least try to pawn it off for $10 on CL or something.
     
    heibk201 and ajc9988 like this.
  35. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    the point is, if you have a broken cooler or something happens, at least you have a backup to fall back on.

    ^This is the biggest reason. The 6700K is still even more expensive at $350; I don't see any cost savings, just a jacked-up price
     
    ajc9988 likes this.
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    This is the biggest thing. There's almost no reason for anybody to buy the 6700K right now. You can find a 5820K for almost the same price, and an X99 board for almost the same price as some of the Z170 boards. X99, 2 extra cores/4 extra threads, a much more robust enthusiast chipset, and quad-channel support can be yours for only $50 more. I hope the 6700K flops and Intel has to drop its price down to about $280, where it should be.
     
    TomJGX, ajc9988, hmscott and 3 others like this.
  37. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    B-b-but 14 nm dammit!!

    Nah it won't flop, so long as "enthusiasts" keep upgrading every CPU cycle, Intel has no reason (or motivation) to stop what they're doing.
     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Any performance enthusiast who buys a 6700K in a desktop format instead of a 5820K is an idiot and cannot prove me wrong.
     
  39. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    If that's the case, there will be plenty of i***ts.
     
  40. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    One word: 14 nm
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Maybe they want to be able to watch YT videos in Chrome without high CPU usage/lag or being forced to install the h264ify extension? Intel added partial VP9 support to Skylake iGPU, and ofc Haswell-E has no iGPU at all.
     
    Last edited: Sep 18, 2015
    hfm likes this.
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    pshh, those two extra cores will do eet.
     
    jaybee83 likes this.
  43. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    And a 4K60 YT video will max it out, so what happens when a multi-tasker such as yourself wants to watch it while a game running in the background is also consuming significant CPU time? Brute-forcing the lack of hardware acceleration with more CPU power isn't the solution.
     
  44. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    it isnt??? :eek: o_O :oops: damn, u destroyed my illusions...
     
    moviemarketing and ajc9988 like this.
  45. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah, because last I checked, no current or upcoming notebooks (P570WM is EOL) have hexacore or octocore CPUs, and a 4K60 YT video will break your poor Batman's back more than Bane ever did
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I'll be honest, I don't think even with good hardware acceleration that watching a 4K 60fps youtube video is a good idea while playing a game :D

    But I do agree the iGPUs have uses. That said, the tradeoff (Optimus) for it on notebooks is not really worth it. I can watch most 720/60 or even 1080/60 YouTube while gaming without too much issue as long as it's in HTML5; Flash murders the CPU in Chrome. Hell, Chrome itself is a CPU-devouring hog.
     
    jaybee83 likes this.
  47. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    i was being sarcastic :rolleyes:

    and yes, it actually does! not able to watch 4K60 clips properly unless with specialized software...
     
  48. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Are you using the h264ify extension? It makes YT in Chrome/FF as efficient as in IE by forcing YT to use the H.264 codec (hardware accelerated) instead of VP8/VP9 (not hardware accelerated). Only downside is many videos are no longer playable at 1440p/2160p and/or 60 FPS since they are VP9-only at that resolution/frame rate to save bandwidth. I believe YouTube defaulted to the HTML5 player a while back, not that I've noticed any difference in CPU usage between the HTML5 and Flash players. The big difference was when YouTube on Chrome started serving its videos in VP9 instead of H.264 sometime in the last year or two and killed hardware acceleration for everyone.
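
    (For anyone wondering how an extension can "force" a codec like that: the usual trick, roughly sketched below, is to make the page believe VP8/VP9 aren't supported, so the player serves its H.264 streams instead. This is illustrative only, not h264ify's actual source.)

    ```typescript
    // Rough sketch of the codec-spoofing approach extensions like h264ify
    // are generally described as using; illustrative only, not the real code.
    // Run in the page context: report VP8/VP9 as unsupported so the YouTube
    // player falls back to its H.264 streams, which the GPU decodes in hardware.
    const realIsTypeSupported = MediaSource.isTypeSupported.bind(MediaSource);

    MediaSource.isTypeSupported = (mimeType: string): boolean => {
      if (/vp0?[89]/i.test(mimeType)) {
        return false;          // pretend the browser can't play VP8/VP9
      }
      return realIsTypeSupported(mimeType);
    };
    ```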
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I'm not using it. I have been using the HTML5 player for as long as it's been possible to do so with chrome. The difference is not so much "CPU" (which usually has been low enough for me 1080p and below) but more how the rest of the system acts. Chrome has some kind of stupid thing where games just "bug out" framerate-/frametime-wise if anything is playing on it, and it gets worse the higher the resolution that "something" is... but youtube's HTML5 is much easier than flash elsewhere. I also compared it to experience with Palemoon (firefox derivative) and the HTML5 was also a benefit there (slightly with CPU as well), even though FF/IE have infinitely better overall system impact. But I can't deal with the "let's freeze everything "flash" on the browser if you right click flash content!" that flash itself has, and that's the literal only reason I'm still on chrome.
     
  50. JinKizuite

    JinKizuite Notebook Consultant

    Reputations:
    5
    Messages:
    123
    Likes Received:
    58
    Trophy Points:
    41
    So I've been reading about Pascal and how it is able to do mixed precision of FP16, FP32 and FP64, whereas Maxwell can only do FP32. Does only doing FP32 have any downsides in gaming? Pros? Cons? D:
     