I agree. The NotebookCheck review of the P650SE with the 970M had it getting up to 72C, and the NotebookCheck review of the P651RP6 had the 1060 capping at 72C too. No doubt the Pascal cards should run a bit hotter (the performance gains are insane), but I don't think they are running hotter than the sun lol. Even overclocked, the 1060 is only at like 75C. If the temps are within a few C but performance is 60% better, I'm not really sure what the big deal is.
-
Ionising_Radiation Δv = ve*ln(m0/m1)
F**k nVidia.
Anyone who wants to buy this, please look at machines with the 970M/980M instead. They're bound to be waaay cheaper, and can be overclocked the hell out of, to well exceed the performance of this ****.
I'll say it again: f**k nVidia, screw Pascal. They could've given us the 3 GB version of the 1060, but noooo, they had to make this gimped crap. Hell, at this rate even re-branded Maxwell GM204/206 looks like a better deal. See, gentlemen? The 1060 is supposed to perform like the GTX 980. The 1050, one performance level below, will perform like a GTX 960.
This is marketing at its finest. -
Relying on DX12 and Vulkan is a very, very bad mindset to have. There are many current drawbacks to those APIs, and developers are not bothering to actually optimize their titles, relying on the driver instead. And believe me, if developers get slack with it, even DX12/Vulkan can suck tons of CPU power from a system.
It's actually *EXTREMELY* weak. It runs at 3.1GHz under a four-core load. All mobile Skylake chips are very weak out of the box. The reason the 6700HQ is used in "all those laptops" is because it's simply the basic mobile i7. It's the same reason the 4700MQ was used, and then the 4710MQ, and you almost certainly had to visit a boutique website to find a higher-end chip.

If I had my way, the minimum clock speed of any i7 on the market intended for gaming and similar workloads (i.e. not counting things like the i7-6700T) would be a flat 3.5GHz, and I would want the higher-end ones to run at 4GHz as well. Desktops get it and laptops should have it, and the number of times I've found myself CPU-limited in games that I play is rather astounding... albeit I'm above 60Hz, but I've already been limited to 80fps and thereabouts, and with 75Hz and 120Hz laptops I can guarantee you that newer machines WILL be limited very badly. Especially Deus Ex: Mankind Divided. That game DEVOURS processors like it's a joke. Battlefield 1 too is rather draining, especially with some of the graphics options turned up. I've no idea how they managed to make the CPU load increase when raising texture filtering and mesh quality, but they have. And my RAM is extremely good too; the crappy RAM found in most of these machines is likely to limit things as my old bad RAM once did.
The 3GB version wouldn't have helped much of anything at all. The heat on Pascal is all on the tiny core. The memory, at best, would help with some power draw or memory overclocking, depending on whether the chips sat on the back of the card and weren't cooled properly, but since most of these cards are BGA and power draw is less than the Maxwell counterparts, those are both moot points. The problem, in both desktops and laptops, is that Pascal is designed badly. That's basically it. OEMs/ODMs simply haven't cooled them properly here. I don't agree that such heat should be a thing, this is true, and I dislike all the locked-down crap that Pascal is, but in THIS case, where laptops are coming out broken at the gate? It's time to blame the ODMs for not putting in the money and effort. They skimped on cooling before, and now that they're trying to push their old cooling on us or "work around" their barely-sufficient form factors, they're getting problems. It's their fault. -
-
This. Is. Exactly. The. Point. There is something wrong with the bloody games. Developers do not care. That's what's wrong. Look at this shot of Deus Ex: Mankind Divided obliterating a 4.5GHz 2500K at 720p:
I need to see people holding 140fps in the BF1 beta in 64-man Conquest Large. I cannot do it. On minimum graphics I hit a CPU bottleneck around 105fps, often with the game using around 80% of my CPU as of last night, with only mesh quality and texture filtering turned up. On Rush I can hold 120, but my load fluctuates like crazy and often hits above 80% on all threads. And 6700HQ chips are slower than mine. I just flat out don't believe people when it comes to these claims. I got to where my knowledge is because I'm very skeptical of people claiming great things.
Do you play AAA titles? Do you unlock your framerate at all? Those are very big points. 60Hz is not the "standard" for many of these machines anymore; 75Hz and 120Hz are very common. Most G-Sync models from ASUS and MSI are 75Hz, and there are 120Hz panels on various models too. It's a very slow CPU and shouldn't even be the baseline mobile chip. -
Gabrielgvs Notebook Consultant
-
In a game that does not make use of the hyperthreaded cores, i5s and i7s are the same for single-monitor fullscreen gaming. A 4.5GHz 2500K is better than a 3.1GHz Skylake "i5" (which is what the 6700HQ amounts to in that case). If a game uses the hyperthreaded cores, then the 6700HQ would be slightly better (even though this is negatively offset by the generally awful DDR4 RAM that laptops are selling with). But that would be exactly why it loses in "multicore benchmarks". However, multicore benchmarks aren't games, and while I find i7s always trump i5s, I cannot sit here and say the 6700HQ is all that.
There is. No way. In the UNIVERSE. You play Witcher 3, Rainbow Six Siege, GTA V or Battlefield 1 and see ONLY 30% usage on your 4690K at 60fps. ESPECIALLY in Novigrad in TW3 and most of GTA V. @tgipier has a 4.3GHz 5960X and a Pascal Titan X with 3200MHz 16-18-18-40 quad-channel DDR4 RAM and can't sustain 60fps in GTA V because of CPU limits. There's no way you only see 30%.
You did say it is fine, and that it won't bottleneck a 1060. That's false. I can go prove that it's false right now. The 6820HK is better if it's overclocked, certainly. A 6820HK at around 3.8GHz is the point where I'd say a 1060 isn't bottlenecked; 4GHz for the 1070, coupled with good RAM. Especially beyond 60Hz. The "unless a game is seriously messed up" is the norm. You can't escape it unless you avoid those games, and if you avoid those games you could probably get by on a single 780M. You'd have no need of an i7 or Pascal.
Also, I never said you need a 10-pound DTR that costs 2x as much (they don't). The 6820HK in something with decent cooling is fine. The P650RG can do it. The GT72VR might be able to do it, if MSI doesn't BIOS-lock the 6820HK to 45W like they've done on some models in the past. -
GTX 1080 SLI benchmark results (same from my review, but in a condensed YouTube video)...
When you get a desktop 1080/1070, please swap its results in for the 980 Ti used in the charts - that's what people want to see: how a laptop 1080 compares to a desktop 1080/1070.
The games don't really support SLI for the most part. Look at that Hardware Unboxed 1080 SLI gaming review for suggestions.
Nvidia's HB SLI Bridge - Surprising gains! GTX 1080 SLI Testing Inside! - Video
Nvidia’s HB SLI Bridge – Surprising gains! GTX 1080 SLI Testing Inside! - Written Article
IDK why many reviewers fixate on what doesn't scale with SLI in these reviews; there are 20x as many games that do support SLI, so why not show those?
That's what someone interested in an SLI laptop is looking to see - how their SLI-enabled games will do on the SLI laptop. We already know what doesn't scale well (new games), and which ones won't scale at all - bad games we don't buy or play.
It's interesting to see the power draw could be handled - for the most part - by 2 x 240W power supplies, and we don't really need 2 x 330W - that's a big weight saving.
Maybe when you OC heavily using the @Prema BIOS those power draw numbers will go over 500W and require 2 x 330W power supplies.
Probably most of that power increase is to feed those hungry 12V fans.
Thanks for the awesome reviews @HTWingNut!! -
-
That soundtrack is from someone I met online from the UK. He's "Spoony-Metal" on YouTube (it's actually an edit I made, but mostly him). Good stuff. I'm looking for more unique tracks to add to my collection, so if you know anyone who wants to share their talents, let me know! -
-
Edit: just tested with all normal settings maxed, AA on FXAA and a couple of advanced settings on. No drops at all anywhere, period - in the air, in gunfights, from explosions, anywhere. All in-game cars are replaced with real-life equivalents too. -
-
Edit: I literally leave my CPU at 4.1GHz instead of 4.5GHz because it hasn't bottlenecked on anything yet.
LMAO, this site actually maxes out one of my cores when loading pages. Maybe that's why it always feels so slow on my weaker laptops. -
Also, LOL. Yeah, loading websites on Chrome lately is brutal on CPUs. -
-
-
-
Either way, your desktop is about 60% faster.
But still, I think it might have something to do with GTA. I know for certain that sometimes when playing GTA I get a tank in FPS, and it's like some kind of Windows process just gobbles 20% of my CPU, and it only happens when GTA V is running. Probably their DRM being stupid or some crap. I eventually uninstalled GTA V because every update changed the performance a ton and because of that rogue process spike, but you know. Blah. Either way, I've been eyeing CPU usage in AAA titles since I got this machine, and there's a HUGE, marked increase since 2015. It's enough that people with hexacores are fine, but quadcores generally are not. Yet I keep seeing people jumping in saying 6700HQs are fine, etc. It's happened since Skylake came out, and people don't seem to grasp that there are instances where a 6700HQ can bottleneck a 980M, let alone a 1060 or 1070. -
GTA V is really badly optimized. But heck, 99% of PC games are.
ThePerfectStorm Notebook Deity
Then the new consoles hit, with tons of memory to fiddle with. PC titles took a SHARP decline after that. I look at and own a lot of titles, and when they come out I constantly find out how they run for certain people. In 2014, games started devouring vRAM like we had some infinite wellspring of it. We jumped CLEAN from 2GB of vRAM running 5760 x 1080 with maxed settings to games not even being able to turn on max settings without 3GB or more at 1920 x 1080, in the space of five months. By 2015, we had games where 4GB isn't even enough to max texture settings, like Black Ops 3 (and before the people who cry that you "don't need so much vRAM" come and try to dispute this, you *LITERALLY* cannot even SEE the "extra" textures, shadows, etc. in the options menu without 6GB of vRAM or above).
In 2015, I started seeing titles chewing out CPUs like some kind of joke. If it wasn't multi-core, it needed high single-core speeds. I couldn't keep Killing Floor 2 from dropping below 60fps no matter what I did until an optimization patch within the last two months; it just WOULD happen when stuff was on-screen. The game would chew up my single-thread usage, despite "recommending" a quadcore (there's no way the CPUs they recommended would have held 60fps before the last two months either, even on minimum graphics). GTA V is a CPU hog. You've seen Deus Ex. Black Ops 3 above exactly 78fps just exponentially starts gulping CPU usage. In fact, lemme do a video on that. Here, I did a video on that while taking a break from typing this. Hell, Final Fantasy X HD takes 40% of my CPU to render a 30fps title that launched in 2002 on the PlayStation 2; that's ridiculous. BF1 as well - i5s are shredded. Overwatch too uses *FAR* too much CPU for its graphical fidelity. Please understand, though, that I can get a flat 120fps in Overwatch 99% of the time... but the ratio of visual fidelity to CPU (and GPU) utilization is not high. The game looks clean, but is not graphically intense. DOOM not using Vulkan sucks CPU quite a bit too. Witcher 3 is another monster that loves eating CPUs for breakfast (lunch, dinner and midnight snack too).
2016... I think this is the year that games start eating RAM. BF1 uses about 7GB by itself. Yeah, you heard me: 7GB, on its own. That means 8GB is no longer "enough". 12GB could work if you close everything before booting it. 16GB is kind of the comfort spot. I have heard Deus Ex suck over 5GB of RAM on its own as well, easily.
Now, the indie titles? You know, the ones that usually are the most fun? Those are pretty much fine. How to Survive is great, and How to Survive 2 is different but still nice. Don't Starve Together, Binding of Isaac Rebirth/Afterbirth, Elite: Dangerous, etc. all have much nicer optimization ratios. Granted, not all of them look all that great, but their graphics quality, whatever it is, is nicely proportionate to the load on a system. And that's what's important.
So... are MOST games coming out for PC unoptimized messes? Nah. Not even close.
But are the games people most often buy and want to play on PC unoptimized messes? Damn straight they are. They're the ones most in the public eye, too.
The reason you have this book is because I started typing it after 2am. Blame my timezone. I need to publish a minimum of 1 book on at least 1 forum after 2am and before I sleep every day. -
I think the possible CPU bottleneck is being exaggerated. Fallout 4 uses only 30% of the 6700HQ coupled with a 970M. So logically this CPU can still feed GPUs three times the power of a 970M.
-
HaloGod2012 Notebook Virtuoso
Ionising_Radiation Δv = ve*ln(m0/m1)
@D2 Ultima - X-Plane 9 and now 10 have killed CPUs, GPUs, RAM and VRAM for nearly a decade now. And there was FSX before X-Plane became increasingly popular.
Flight simulators have tended to become 'Earth simulators', with demand for realistic scenery, detailed, accurate weather depiction and even more accurate flight physics.
Now the devs of X-Plane want to experiment with Vulkan, physically-based rendering, real reflections (not specular maps, which is what FSX uses), ambient occlusion, etc.
Literally the only thing we're missing is actual soft-body physics deformation of the planes' fuselages and flying surfaces, à la BeamNG.Drive. -
killkenny1 Too weird to live, too rare to die.
-
Ionising_Radiation Δv = ve*ln(m0/m1)
ThePerfectStorm Notebook Deity
http://www.in.techradar.com/news/mo...r-21-X-curved-laptop/articleshow/54009519.cms
In that article, Acer says that the CPU and GPU of the 21 X are not replaceable. They also say that overclocking is "limited". Guess that means that people like @Papusan and @Mr. Fox aren't going to be buying it.
killkenny1 Too weird to live, too rare to die.
So, how 'bout that 1000 series in laptops.
But wait, GTA V is a bad port now? And here I was thinking it was nice... -
-
NotebookCheck review of MSI GS43VR with GTX 1060:
http://www.notebookcheck.net/MSI-GS43VR-6RE-Phantom-Pro-Notebook-Review.172721.0.html
It doesn't seem to be doing too badly - at least it's apparently not worse than its Maxwell predecessor MSI GS40 with GTX 970M.
http://www.notebookcheck.net/MSI-GS40-6QE-Phantom-Notebook-Review.158147.0.html
Still about as hot and loud - what else can you expect from a 23mm-thin 14-incher - but now it's at least noticeably faster.
Unigine Valley stress test: CPU max 74 C, GPU max 81 C.
Prime95 + Furmark stress test: CPU max 89 C, GPU max 88 C.
GS40 got:
Prime95 + Furmark stress test: CPU max 89 C, GPU max 87 C.
Chassis max temp is now 62 C vs 69 C for GS40.
-------
Edit: oops, wrong link, thanks to @PMF for the correction. -
-
Benching this BGA machine is the same as watching paint dry!!
No thanks. Why pay more for less?
But yes, I understand. -
The way I can tell a CPU bottleneck is quite simple: I just look at GPU usage. Any time the GPU usage isn't 97/98/99/100%, I assume it's CPU bottlenecking, since I have V-Sync off. I am not being bottlenecked at 60fps for the most part, but I think it happens at higher fps.
From what I've seen of GTA V, it takes 6 threads. For quad-cores, HT is useful. Hexa/octo, less so.
To me, a 6700HQ is only acceptable in laptops with light workloads. -
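For what it's worth, here's a minimal sketch of that "watch the GPU usage" check - nothing from this thread, just an illustration. It assumes an NVIDIA card with nvidia-smi on the PATH, and the 97% threshold, poll interval and sample count are arbitrary choices of mine:

```python
# Rough sketch of the "watch GPU usage" bottleneck check described above.
# Assumes an NVIDIA GPU and nvidia-smi on PATH; threshold/interval/sample
# count are arbitrary assumptions, not values from the post.
import subprocess
import time

THRESHOLD = 97   # % GPU utilization below which we suspect a CPU limit
INTERVAL = 1.0   # seconds between samples
SAMPLES = 60     # how long to watch

def gpu_utilization() -> int:
    """Read the current GPU utilization (%) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())

low = 0
for _ in range(SAMPLES):
    if gpu_utilization() < THRESHOLD:
        low += 1
    time.sleep(INTERVAL)

print(f"{low}/{SAMPLES} samples below {THRESHOLD}% GPU usage - "
      "with V-Sync off, that points at a CPU (or engine) limit.")
```

Run it while the game is going; a pile of sub-97% samples with V-Sync off is the same signal the post describes.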
-
-
Edit: the ThinkPad actually has a 6200U, my mistake, but it's still only slightly better.
Also, GTA V saw a max of 28-ish% CPU usage when I forced 4.4GHz on all cores and forced it to use all non-HT cores.
Which pretty much means GTA V is taking four 4.4GHz non-hyperthreaded Haswell-E cores completely. -
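A quick back-of-the-envelope check of that 28% figure - the interpretation is mine, assuming the usage number is averaged over all 16 logical processors of a 5960X:

```python
# Sanity check of the "28% on a 5960X = ~4 fully loaded cores" claim above.
# Assumes the reported usage is averaged across all 16 logical processors
# (8 Haswell-E cores with Hyper-Threading); the 28% comes from the post,
# the arithmetic is my own.
logical_processors = 16
reported_usage = 0.28

busy_thread_equivalents = logical_processors * reported_usage
print(f"~{busy_thread_equivalents:.1f} logical processors fully busy")
# ~4.5 - consistent with GTA V saturating roughly four 4.4GHz physical
# cores when pinned to the non-HT cores.
```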
-
You would think a business-class notebook would have pretty powerful processing under the hood for workloads, but I guess that's wrong. My point was: you're saying a 6700HQ is ideal for light workloads? I don't think so. Look at a MacBook Pro 15-inch Retina. It only has a 4980HQ.