sure, u just gotta wait long enough for your next purchase *trololol*
-
Yep, Volta and whatever's after Cannonlake sounds about right
-
moviemarketing Milk Drinker
You will wait until 2018 to buy a new laptop? =P -
No, not me, but @D2 Ultima would if he wants the same magnitude of improvement as his previous upgrades.
-
moviemarketing Milk Drinker
I see, yes - if you can get your money's worth out of that machine, waiting at least 4-5 years or so makes it seem not so bad when spending ~$3k.
Or buy a new one every year and sell the old one for nearly the same price. It works with Macs, but not sure if the Clevo resale value would be as high. -
Clevo resale value is high, however selling my machine is automatically a downgrade due to loss of 120Hz panel.
-
I said it before - there's enough room on the board (as well as all the necessary connections and power), but it needs retooling; it can't fit between the current mounting holes, and that's the problem. Those holes are standardized, so it also means a change in the spec before the retooling. If placed right, there'd be ~2x the room for power delivery components.
Because nGreedia fanboys are always like - "Let's wait and see what our worshipped company will bring to save us all, it'll totally rock!" But if it's the other way around, oh God, run FAR away, WWIII is coming - "Haha, where's your company now? It totally sucks... TROLOLOL"
My personal "the worst" chart is:
1. gnusmas
A LOT OF SPACE
2. nGreedia/grIntel -
New article. New name
http://wccftech.com/nvidia-flagship-mobile-gtx-980-gm204/ -
I'm actually quite underwhelmed by the "GTX 990" or whatever it's eventually going to be called. I'd be OK with 'regular' 980Ms in SLI if the price is going to skyrocket or if I'd need a watercooled machine to get a 990. ~500 more shaders on otherwise basically the same hardware isn't going to make me lose sleep.
-
It won't need liquid cooling in Clevo machines.
-
From the article SuperContra posted:
" The CEO further revealed that the TDP of the chip is going to be upwards of 100 Watts (and could be as high as 180 Watts"
If it goes beyond 100W, then it will require much better cooling.
What I'm actually wondering about is the quite possible duplicity of the industry.
When AMD released their current mobile GPU, which is in the 120W range, the cooling in laptops wasn't modified AT ALL... and on top of that, those laptops usually came with underpowered power bricks (so they ended up short on power and throttling due to heat)... and yet it seems the industry might actually consider making accommodations for Nvidia's power-hungry GPUs. -
Yes. And 980Ms have hit 150W+ already in existing clevos in benchmarks with heavy overclocks.
Maybe because AMD doesn't have enough market share? I remember AMD GPUs working in most of the single-card and SLI machines from Alienware and Clevo when the 7970M came out. -
Starting to wonder why this thread isn't called "Nvidia & AMD 2015 mobile speculation thread".
-
i can confirm this from firsthand experience: overclocked to the bleeding edge, i can make my machine draw in excess of 300W. 80-85W is accounted for by the CPU; let's say the rest of the system draws a maximum of 40W, that still leaves 175-180W unaccounted for... guess which other component could possibly be responsible for that? correct! my 980M
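Roughly, the arithmetic is just subtraction (a quick sketch in TypeScript; all wattage figures are the estimates from the post above, not measured per-rail values):

```typescript
// Back-of-the-envelope power accounting for the overclocked scenario above.
// Assumption: figures are rough whole-system estimates, not per-rail measurements.
const totalDrawW = 300;    // total system draw under full load
const cpuDrawW = 85;       // upper estimate for the overclocked CPU
const restOfSystemW = 40;  // assumed: screen, drives, fans, RAM, conversion losses

const gpuDrawW = totalDrawW - cpuDrawW - restOfSystemW;
console.log(`Draw left over for the GPU: ~${gpuDrawW} W`);
// => ~175 W attributed to the 980M, well above its nominal 100 W rating
```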
and guess something else? in that scenario i can barely make it break 70C... 75C on a super hot summer day with ambient temps of 35C. that's it!
if u ask me: gimme a 990M and I'll show u how it runs stock AND overclocked in my dark knight without even breaking a sweat
-
Robbo99999 Notebook Prophet
Wow, very impressive temps on the 980M in that case! Didn't realise the Batman had any better GPU cooling than Alienware (e.g. M17xR3/R4 and the M18xR1/R2) - sounds like it could well be better unless you're just lucky with your unit or you've done some crazy mods! -
gpu cooling was never an issue in the batman, it's insanely powerful. coupled with the super-cool maxwell cards, u get those kinda temps even without crazy modding
in my case, i just repasted with gelid gc extreme
the cpu cooling though, while very powerful on its own, needs a lil luv in order to cope with a 4790K, especially OCed
-
Speaking of CPU, why does Intel use such terrible internal TIM even on K processors?
-
The TIM isn't that bad, esp. on Devil's Canyon
-
Some reviews report a 10-15C drop after delidding and switching to liquid metal. That's a lot of valuable headroom for a poor Batman running close to 100C.
-
Liquid metal dropped my temps by 10C+ in many scenarios, and I was using things like IC Diamond before... it's not that the internal TIM was all that bad, it's just that liquid metal is THAT GOOD.
Also, I don't know of anyone who hit 100C+ at stock using a P7xxZM with max fans and NOT having a warped heatsink. -
Yep, warped heatsink on my first one and a deeply concave IHS! Both fixed now!
But solder from the factory (as seen on Xeon E5 and above) would save the need to strive for the extra cooling. Then it would only be a matter of lapping it flat...
-
The difference in cost to them is negligible! Spend the little bit more and charge $10-20 extra for what amounts to maybe a quarter in material costs.
-
They can easily do much better and they know it. All Xeon E5/E7 chips use solder, and delidding is basically useless there. A high thermal density "e-n-t-h-u-s-i-a-s-t" product requires good cooling more than those things, and they just don't care.
With a relatively high room temp or an AVX FPU load, close to 100C should be possible on a 4790K at stock multipliers. Normal stress tests won't push it that high.
Does that amount of solder even cost a dollar? And how much does a solder-applying worker cost in those Chinese or Vietnamese factories, assuming they are not replaced by even cheaper robots yet?
Even AMD's dirt cheap low end products use solder. -
Well, negligible != 0.
Come on guys, my point was that if Intel finds a way to charge more for less, they'll do it. I definitely wouldn't mind paying $10-20 extra for a soldered chip. But from Intel's perspective, why spend that tiny amount more when you can charge the same and people will still buy it in droves? -
Meaker@Sager Company Representative
Solder is tricky to use and will impact yields. The paste is good, but the gap between the core and the heat spreader is the main culprit; delidding removes the glue, which lets the IHS sit flatter.
-
Well, fluxless solder was introduced with the P4 Prescott and lasted all the way to Sandy Bridge on the mainstream quads. One of the leading arguments (besides "Intel is cheap") for why they ditched fluxless solder is that the solder apparently cracks and fails in an unacceptable fashion on small dies.
Original research paper: http://iweb.tms.org/PbF/JOM-0606-67.pdf
Intel defines die sizes thusly in table 1:
Small die: ~130 mm²
Medium die: ~270 mm²
Large die: ~529 mm²
Now the problem I have with this argument is Prescott had a die size of 112 mm², some 30% smaller compared to Ivy Bridge's 160 mm². Before someone gets smart, yes the actual die space occupied by CPU cores is much smaller thanks to node shrinks, but the final die size is still bigger on Ivy Bridge due to the IGP, and that's all that matters.
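For what it's worth, the comparison boils down to this (a quick sketch in TypeScript; the die sizes are the public figures quoted above and the class thresholds are from table 1 of the paper):

```typescript
// Die sizes in mm² (public figures quoted above).
const prescott = 112;   // 90 nm Pentium 4 Prescott, which shipped with fluxless solder
const ivyBridge = 160;  // 22 nm Ivy Bridge quad + IGP, which shipped with paste

// Intel's rough die-size classes from table 1 of the JOM paper (mm²).
const dieClasses = { small: 130, medium: 270, large: 529 };

const shrink = 1 - prescott / ivyBridge;
console.log(`Prescott is ~${Math.round(shrink * 100)}% smaller than Ivy Bridge`);
// => ~30%: the smaller of the two dies (under the "small die" mark) is the one that got solder
console.log(`"Small die" threshold: ${dieClasses.small} mm²`);
```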
So clearly Intel had no problem using fluxless solder on what was clearly a small die (Prescott), yet they stopped after Sandy Bridge, despite Ivy Bridge having a larger die size.
You draw your own conclusions, but mine is Intel is pinching pennies. -
With Haswell's on-die FIVR you always get much more heat at higher clocks anyway. The TIM was only a minor issue, but I do agree that Intel's gone cheaper and cheaper. Now they don't even include stock coolers for the Skylake K series
-
...and they produce thinner CPU PCBs, making the vice method for delidding basically obsolete
-
I don't think skimping out on a cooler is a bad thing, especially for the K series, since a majority of users will probably use their own cooler. But yeah, they need to improve that interface between the die and the IHS.
-
Get a razor blade like a man and learn how not to nick anything while using it!!!
-
thx but no thx, sticking to vice method! *continues hammering onto his wooden block while screaming YEEEEEEEHAW!*
-
It's not, but the savings aren't passed down to us, and that's where I have a problem with it. I mean if they included a stock cooler I could at least try to pawn it off for $10 on CL or something.
-
The point is, if your cooler breaks or something else happens, at least you have something to use as a backup.
^this is the biggest reason. The 6700K is still even more expensive at $350; I don't see any cost savings, just a jacked up price -
This is the biggest thing. There's almost no reason for anybody to buy the 6700K right now. You can find a 5820K for almost the same price, and an X99 board for almost the same price as some of the Z170 boards. X99, 2 extra cores/4 extra threads, a much more robust enthusiast chipset and quad-channel support can be yours for only $50 more. I hope the 6700K flops and Intel has to drop its price down to about $280, where it should be. -
-
B-b-but 14 nm dammit!!
Nah, it won't flop; as long as "enthusiasts" keep upgrading every CPU cycle, Intel has no reason (or motivation) to stop what they're doing. -
Any performance enthusiast who buys a 6700K in a desktop format instead of a 5820K is an idiot and cannot prove me wrong.
-
If that's the case, there will be plenty of i***ts.
-
One word: 14 nm
-
Maybe they want to be able to watch YT videos in Chrome without high CPU usage/lag or being forced to install the h264ify extension? Intel added partial VP9 support to the Skylake iGPU, and ofc Haswell-E has no iGPU at all.
-
And a 4K60 YT video will max it out, so what happens when a multi-tasker such as yourself wants to watch it while a game running in the background is also consuming significant CPU time? Brute-forcing the lack of hardware acceleration with more CPU power isn't the solution.
-
it isnt???
damn, u destroyed my illusions...
-
Yeah, because last I checked, no current or upcoming notebooks (the P570WM is EOL) have hexa-core or octa-core CPUs, and a 4K60 YT video will break your poor Batman's back more than Bane ever did
-
I'll be honest, I don't think watching a 4K 60fps YouTube video while playing a game is a good idea even with proper hardware acceleration.
But I do agree the iGPUs have uses. That said, the tradeoff for it on notebooks (Optimus) isn't really worth it. I can watch most 720/60 or even 1080/60 YouTube while gaming without too much issue as long as it's in HTML5; Flash murders the CPU in Chrome. Hell, Chrome itself is a CPU-devouring hog. -
i was being sarcastic
and yes, it actually does! i can't watch 4K60 clips properly except with specialized software... -
Are you using the h264ify extension? It makes YT in Chrome/FF as efficient as in IE by forcing YT to use the H.264 codec (hardware accelerated) instead of VP8/VP9 (not hardware accelerated). Only downside is many videos are no longer playable at 1440p/2160p and/or 60 FPS since they are VP9-only at that resolution/frame rate to save bandwidth. I believe YouTube defaulted to the HTML5 player a while back, not that I've noticed any difference in CPU usage between the HTML5 and Flash players. The big difference was when YouTube on Chrome started serving its videos in VP9 instead of H.264 sometime in the last year or two and killed hardware acceleration for everyone.
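For anyone curious, the core trick behind an extension like h264ify is roughly this, as I understand it (a sketch in TypeScript run in the page context; the real extension's internals may differ): if the page reports VP8/VP9 as unsupported, YouTube's player falls back to serving H.264, which the GPU can decode in hardware.

```typescript
// Sketch of the h264ify idea (assumption: the actual extension may do more than this):
// make YouTube's player believe VP8/VP9 are unsupported so it serves H.264 instead.
const nativeIsTypeSupported = MediaSource.isTypeSupported.bind(MediaSource);

MediaSource.isTypeSupported = (mimeType: string): boolean => {
  // Report WebM/VP8/VP9 as unsupported; pass everything else through unchanged.
  if (/webm|vp8|vp9/i.test(mimeType)) {
    return false;
  }
  return nativeIsTypeSupported(mimeType);
};
```

The trade-off is the one mentioned above: streams that YouTube only encodes in VP9 (1440p/2160p and some 60 FPS variants) stop being offered.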
-
I'm not using it. I've been using the HTML5 player for as long as it's been possible to do so with Chrome. The difference is not so much "CPU" (which has usually been low enough for me at 1080p and below) but more how the rest of the system acts. Chrome has some kind of stupid thing where games just "bug out" framerate-/frametime-wise if anything is playing in it, and it gets worse the higher the resolution of that "something" is... but YouTube's HTML5 player is much easier on the system than Flash elsewhere. I also compared it to my experience with Palemoon (a Firefox derivative) and the HTML5 player was a benefit there too (slightly with CPU as well), even though FF/IE have an infinitely better overall system impact. But I can't deal with Flash's "let's freeze everything Flash in the browser if you right click Flash content!" behaviour, and that's literally the only reason I'm still on Chrome.
-
So I've been reading about Pascal and how it's able to do mixed precision of FP16, FP32 and FP64, whereas Maxwell can only do FP32. Does only doing FP32 have any downsides in gaming? Pros? Cons? D: