The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *Official* nVidia GTX 10xx Series notebook discussion thread

    Discussion in 'Gaming (Software and Graphics Cards)' started by Orgrimm, Aug 15, 2016.

  1. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    I agree. The notebookcheck review of the P650SE with the 970M had it getting up to 72°C, and their review of the P651RP6 had the 1060 capping at 72°C too. No doubt the Pascal cards should run a bit hotter (the performance gains are insane), but I don't think they're running hotter than the sun lol. Even overclocked, the 1060 is only at like 75°C. If the temps are within a few degrees but performance is 60% better, I'm not really sure what the big deal is.
     
    birdyhands and hmscott like this.
  2. Ionising_Radiation

    Ionising_Radiation Δv = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    It's good news for neither camp. One small step down in numbering from the GTX 1060, and look how much the card is cut down (nearly half the core count of the card supposedly just above it, two-thirds the memory bus width, etc.).

    F**k nVidia.

    Anyone who wants to buy this, please look at machines with the 970M/980M instead. They're bound to be waaay cheaper, and can be overclocked the hell out of, to well exceed the performance of this ****.

    I'll say it again: f**k nVidia, screw Pascal. They could've given us the 3 GB version of the 1060, but noooo, they had to make this gimped crap. Hell, at this rate even re-branded Maxwell GM204/206 looks like a better deal. See, gentlemen? The 1060 is supposed to perform like the GTX 980. The 1050, one performance level below, will perform like a GTX 960.

    This is marketing at its finest.
     
    Last edited: Sep 4, 2016
    TomJGX, ThePerfectStorm and hmscott like this.
  3. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yes, heavily, in most AAA titles from 2015 and beyond. Even a 1060 would get bottlenecked. I'm not certain how often the bottleneck will hit at 60Hz, but 75Hz is a definite target and 120Hz is absolutely a target for CPU bottlenecking. It's not the fault of the CPU specifically (though it IS very slow); rather, it's the fault of developers who for some reason have decided that optimization doesn't need to exist anymore, and that anybody willing to play a game will have a 5GHz i7 at the bare minimum and will also only game at 60Hz.

    You haven't tried most AAA titles from 2015 onward.

    Relying on DX12 and Vulkan is a very, very bad mindset to have. There are many current drawbacks to those APIs, and developers are not bothering to actually run much optimization on their titles, instead relying on the driver. And believe me, if developers get slack with it, even DX12/Vulkan can suck tons of CPU power from systems.

    It's actually *EXTREMELY* weak. It is a 3.1GHz 4-core-load processor. All mobile Skylake chips out of the box are very weak. The reason the 6700HQ is used in "all those laptops" is that it's simply the basic mobile i7. It's the same reason the 4700MQ was used, and then the 4710MQ, and you almost certainly had to visit a boutique website to find a higher-end chip. If I had my way, the minimum clockspeed of any i7 on the market intended for gaming and similar workloads (i.e. not counting things like the i7-6700T) would be 3.5GHz flat, and I would want the higher-end ones to run at 4GHz as well. Desktops get it and laptops should have it, and the number of times I've found myself CPU-limited in games that I play is rather astounding... albeit I'm above 60Hz, but I've been limited to 80fps and thereabouts already, and with 75Hz and 120Hz laptops I can guarantee you that newer machines WILL be limited very badly. Especially Deus Ex: Mankind Divided. That game DEVOURS processors like it's a joke. Battlefield 1, too, is rather draining, especially with some of the graphics options turned up. I've no idea how they managed to make the CPU load increase when raising texture filtering and mesh quality, but they have. And my RAM is extremely good too; the crappy RAM found in most of these machines is likely to limit things as my old bad RAM once did.

    Not really. They hit silicon limits around 1250MHz give or take for the most part. Only the earlier cards are generally very good overclockers from what I remember. Most of the newer ones simply are such desktop reject trash that they don't overclock very well. I hear about this from our resident overclockers, so I'll definitely trust THAT information.

    The 3GB version wouldn't have helped much of anything at all. The heat on Pascal is all on the tiny core. The memory, at best, would help with some power draw or memory overclocking, depending on whether the chips sat on the back of the cards and weren't cooled properly, but since most of these cards are BGA and power draw is less than Maxwell counterparts, those are both moot points. The problem, in both desktops and laptops, is that Pascal is designed badly. That's basically it. OEMs/ODMs simply haven't cooled them properly here. I don't agree that such heat should be a thing, this is true, and I dislike all the locked-down crap that Pascal is, but in THIS case, where laptops are coming out broken at the gate? It's time to blame the ODMs for not putting in the money and effort. They skimped on cooling before, and now that they're trying to push their old cooling on us or "work around" their barely-sufficient form factors, they're getting problems. It's their fault.
     
    Mr Najsman, ajc9988 and birdyhands like this.
  4. jddunlap

    jddunlap Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    2
    Trophy Points:
    6
    Yeah, Arma is mostly limited by max single-core performance, since a lot of the critical stuff happens on a single thread, and the GPU is made to wait for it.
     
  5. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    I never said to rely on them. I said most likely in the future, requirements would go down, not up. I can't say I've been even close to CPU-constrained in any modern game I've played, aside from the Arma series as I pointed out. And I've only got an i5-4670K, not overclocked. Most games seem to hover around 20-30% usage. Sure, there are going to be some games that use more CPU than others, but it's unlikely to be your bottleneck. If the CPU is the bottleneck, that likely means something's wrong with the game. People with 1060s and a mild OC on the P650RP6 are pushing around 140fps in BF1, so the CPU isn't a problem there even in beta. The i7-6700HQ isn't incredibly weak. It may be weaker than a desktop CPU, but it's by no means a slouch. I wouldn't get it for doing anything that's going to be hammering the CPU, but the reality is most games don't. And yes, I play many games from after 2015. Overclocking my CPU to 4.3GHz has never improved fps in any game I've tested. Reducing CPU overhead from background tasks and OS issues probably helps more than overclocking anyway.

    Sent from my SM-G935T using Tapatalk
     
    hfm, birdyhands and PMF like this.
  6. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Don't bet on this. Requirements never go down. Look at all the DX12 Microsoft titles. They're *ALL* hard on the CPU and the GPU. The optimization is nonexistent. DX12 doesn't "save" an unoptimized title; all it does is reduce CPU load relative to existing titles. If you make a DX12-only title you can hit unholy levels of CPU usage perfectly fine... it simply means the game would not function in DX11 or OGL due to overbearing CPU load.

    This. Is. Exactly. The. Point. There is something wrong with the bloody games. Developers do not care. That's what's wrong. Look at this shot of Deus Ex: Mankind Divided obliterating a 4.5GHz 2500K at 720p:
    [image: Deus Ex: Mankind Divided CPU usage screenshot]

    I need to see people holding 140fps in the BF1 beta in 64-man Conquest Large. I cannot do it. On minimum graphics I hit a CPU bottleneck around 105fps, often with the game using around 80% of my CPU as of last night, with only mesh quality and texture filtering turned up. In Rush I can hold 120, but my load fluctuates like crazy and often hits above 80% on all threads. And 6700HQ chips are slower than mine. I just flat out don't believe people when it comes to these claims. I got to where my knowledge point is because I'm very skeptical of people claiming great things.

    Do you play AAA titles? Do you unlock your framerate at all? Those are very big points. 60Hz is not the "standard" for many of these machines anymore. 75Hz and 120Hz are very common. Most G-Sync models from ASUS and MSI are 75Hz, and there are 120Hz models too. It's a very slow CPU and shouldn't even be the baseline mobile chip.
     
    Mr Najsman, ajc9988, TomJGX and 3 others like this.
  7. Gabrielgvs

    Gabrielgvs Notebook Consultant

    Reputations:
    1,165
    Messages:
    119
    Likes Received:
    2,149
    Trophy Points:
    156
    Nvidia is offering a marked increase in performance at a price point that reflects very, very good value. That said, this really isn't THE generation of cards to buy, IMO. If I didn't need to upgrade a 4-yr-old unit sooner rather than later, I'd wait, because I think Volta is going to hit the sweet spot. That leaves me buying a 1060, which tbh should be more than adequate for 18-24 mos. I don't see value in the 1070, and it has nothing to do with the performance increase for the premium. Practically, it seems redundant to me because I'm not sure it buys you anything between here and Volta. As for the thin-chassis notebooks, my honest opinion is that I see a whole lot of confirmation bias when I look around these forums and watch people convince themselves that it makes any sense at all.
     
    Last edited: Sep 4, 2016
  8. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    There was a screenshot of someone playing BF1 at 136fps, I believe in the owners' thread. Not sure what game mode. The i5-2500K is not necessarily faster even with a 4.5GHz OC. It gets obliterated in multicore benchmarks, which, from looking at your Deus Ex screenshot, would help in that scenario. The 6700HQ even wins most single-threaded benchmarks, but again, your overclock helps reduce the gap. Yes, I do play AAA games such as TW3, R6 Siege, GTA, Paragon, and BF1, and yes, I frequently unlock the fps to see what I can hit, though I play with vsync on most of the time because I can't handle tearing. I'm not saying the i7-6700HQ is a beast, I'm just saying it's unlikely to be the bottleneck unless a game is seriously messed up. And to me it's not worth upgrading to a 2+ inch thick, 10lb desktop replacement with a desktop CPU that costs 2x as much, just in case a game comes around that I can't run at 120fps... and the 6820HK is an okay compromise, but the added heat of an OC will likely end up just throttling you down in the end.
     
    hfm and birdyhands like this.
  9. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Nope, doesn't count. I could get a 150fps screenshot without issue. It's gotta be consistent gameplay, and performance metrics need to be a thing too.

    In a game that does not make use of the hyperthreaded cores, i5s and i7s are the same for single-monitor fullscreen gaming. A 4.5GHz 2500K is better than a 3.1GHz Skylake "i5" (which is what the 6700HQ amounts to in general). If a game uses the hyperthreaded cores, then the 6700HQ would be slightly better (even though this is negatively offset by the generally awful DDR4 RAM that laptops are selling with). But that is exactly why it loses in "multicore benchmarks". However, multicore benchmarks aren't games, and while I find i7s always trump i5s, I cannot sit here and say the 6700HQ is all that.

    There is. No way. In the UNIVERSE. That you play Witcher 3, Rainbow Six Siege, GTA V or Battlefield 1 and see ONLY 30% usage on your 4690K at 60fps. ESPECIALLY in Novigrad in TW3 and most of GTA V. @tgipier has a 4.3GHz 5960X and a Pascal Titan X with 3200MHz 16-18-18-40 quad-channel DDR4 RAM and can't sustain 60fps in GTA V because of CPU limits. There's no way you only see 30%.

    You did say it is fine, and that it won't bottleneck a 1060. That's false. I can go prove that it's false right now. The 6820HK is better if it's overclocked, certainly. A 3.8GHz or so 6820HK is the point where I'd say a 1060 isn't bottlenecked. 4GHz for the 1070, coupled with good RAM. Especially beyond 60Hz. The "unless a game is seriously messed up" is the "norm". You can't escape it unless you avoid those games, and if you avoid those games you could probably get by on a single 780M. You have no need of an i7 or Pascal.

    Also, I never said you need a 10 pound DTR that costs 2x as much (they don't). The 6820HK in something with decent cooling is fine. The P650RG can do it. The GT72VR might be able to do it, if MSI doesn't BIOS-lock the 6820HK to 45W like they've done on some models in the past.
     
    birdyhands likes this.
  10. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    GTX 1080 SLI benchmark results (same from my review, but in a condensed YouTube video)...

     
  11. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You woke me up with that soundtrack, awesome :)

    When you get a desktop 1080/1070, please swap its results in for the 980 Ti used in the charts; that's what people want to see - how a laptop 1080 compares to a desktop 1080/1070.

    The games tested aren't really SLI-supporting for the most part. Look at that Hardware Unboxed 1080 SLI gaming review for suggestions.

    Nvidia's HB SLI Bridge - Surprising gains! GTX 1080 SLI Testing Inside! - Video

    Nvidia’s HB SLI Bridge – Surprising gains! GTX 1080 SLI Testing Inside! - Written Article

    IDK why many reviewers fixate on what doesn't scale with SLI in these reviews; there are 20x as many games that do support SLI, so why not show those?

    That's what someone interested in an SLI laptop is looking to see - how their SLI-enabled games will do on the SLI laptop. We already know what doesn't scale well yet (new games), and which ones won't scale at all - bad games we don't buy or play :)

    It's interesting to see that the power draw could be handled - for the most part - by 2 x 240W power supplies, and we don't really need 2 x 330W - that's a big weight savings.

    Maybe when you OC heavily using the @Prema BIOS those power draw numbers will be over 500w, and require 2 x 330w power supplies.

    Probably most of that power increase is to feed those hungry 12v fans :confused: :D o_O

    Thanks for the awesome reviews @HTWingNut !!
     
    Last edited: Sep 5, 2016
    Cass-Olé, HTWingNut and birdyhands like this.
  12. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    The same person who posted the screenshot also said they were getting consistently above 100fps the whole time. Obviously it's going to fluctuate a bit depending on what's being rendered at the exact second, but 100+ fps with a mild OC on a 1060 seems great to me. I'll have to download a real-time hardware monitor overlay or something and do some testing when I get home to my desktop. Problem is, I'm more likely to be GPU-constrained with my 970 in most of those games before I hit a CPU limit.

    Novigrad is likely bad, that place is crazy. With my settings nearly maxed out with HairWorks I'm in the low 50s to high 40s, but I've just figured that's my GPU. I've never monitored CPU usage in GTA. If that beast of a rig can't run GTA V at 60fps, then it's an irrelevant example of why the i7-6700HQ will bottleneck the 1060, because apparently no CPU is powerful enough to maintain 60fps.

    With a 1070 you are more likely to hit a CPU bottleneck because it's so powerful relative to everything else, but even then the odds of it happening at 75fps are pretty low. Even at 120Hz the odds of being CPU-constrained are relatively low, at least with most games out today. Sure, you can pick and choose examples where a game is CPU-bound, but it's always going to be that way. If a game is stressing the CPU so hard as to cripple a beastly overclocked desktop CPU, of course it's going to cripple a laptop CPU, but for the majority of games (which don't) the CPU usage typically isn't an issue. It's like if I pointed out a game that was GPU-bottlenecked by the 1070 and called that evidence that the 1070 was a definite potential bottleneck, because a couple of games barely touched the CPU but pegged out the GPU.

    Basically, my general point is that sure, there will be games (or parts of games) that hit the CPU hard, but most games out today don't run into those kinds of issues, especially if you're playing at 60-75fps. Like I said, the Arma series is notorious for being terribly optimized and has complex AI. That's a game where I noticed almost no fps improvement when moving from a GTX 760 to a GTX 970 while keeping my CPU the same. Also, future games coming out on Unreal Engine 4 should run very nicely; they've been optimizing the engine's performance to maintain a stable 60fps, mainly for their game Paragon, which suffers if framerate dips. In closing, I'm not saying there is no way the GTX 1060 will be CPU-constrained by a 6700HQ, but currently it's the exception, not the rule. Also, I said desktop replacements cost 2x because you can pick up a Clevo with a 1060 for about $1300, while Clevos with desktop CPUs start right around $2000.
     
    birdyhands likes this.
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yeah SLI is a sticky tricky beast. Sometimes it works great, but a lot of times it needs a lot of dedication from the game dev, Nvidia, and the user.

    That soundtrack is from someone I met online from the UK. He's "Spoony-Metal" on YouTube (it's actually an edit I made, but mostly him). Good stuff. I'm looking for more unique tracks to add to my collection, so if you know anyone that wants to share their talents, let me know! :)
     
    Last edited: Sep 5, 2016
    birdyhands, i_pk_pjers_i and hmscott like this.
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I'm not going to discuss this anymore. I'm just tired of making the same statements for ages whenever someone shows up saying the same thing you are saying right now.

    As for your statement about price, it's wrong:
    [image: price comparison screenshot]
     
    Ashtrix, ajc9988 and TBoneSan like this.
  15. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    I gotta say, I play modded GTA V all the time, at 4K even, and I have no idea what you're talking about CPU-wise. My old laptop with a 4700MQ did get heavy usage, but was still around 50-70 fps with a 980M. My 5820K at 4.1GHz doesn't struggle with 60fps at all. Like, not even close, it's one of the most consistently running games I play... I average about 90fps with drops into the low 80s, and I could push it higher before hitting a CPU bottleneck. There's also no way it only uses 30% of a 4690K, so I agree with you there, but that rig you described with a Titan XP and a 4.3GHz octacore should not be dropping below 60fps; something else is wrong.

    Edit: just tested with all normal settings maxed, AA on FXAA, and a couple of advanced settings on. No drops at all anywhere, period: in the air, in gunfights, from explosions, anywhere. All in-game cars are replaced with real-life equivalents too.
     
    Last edited: Sep 5, 2016
    birdyhands likes this.
  16. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    My 30% number was not in reference to the specific games I listed, which is why I said I would check once I got home. I rarely actively monitor CPU usage in games, but when I've played some games in windowed mode, I've noticed it's typically around 30%. I've just never noticed my CPU pegged out when I'm getting fps lag. Again, I'm just going off memory with the 30% number, but it definitely isn't typically very high.

    Sent from my SM-G935T using Tapatalk
     
    birdyhands and hmscott like this.
  17. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    To be honest, I find that drop weird too, but I mean, I know the system (and obviously I'm not playing it directly, just hearing reports). However, I know GTA V drops below 60fps at some point for pretty much everyone.
     
  18. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Here's some CPU usage in-game. You can see GTA5.exe running. The drop in CPU usage at the beginning was because I was paused, double-checking my settings. Not even close to bottlenecking; you can see core usage and clockspeed, so I have fewer cores and am clocked lower than that 8-core. I seriously don't drop below 60 in that game. Maybe if I destroyed like 50 cars at once or something. [attachment: Untitled.png]

    Edit: I literally leave my CPU at 4.1 instead of 4.5 because it hasn't bottlenecked on anything yet.

    LMAO this site actually maxes out one of my cores when loading pages. Maybe that's why it always feels so slow on my worse laptops.
     
    Last edited: Sep 5, 2016
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Are you in SP or MP? Curious.

    Also, LOL. Yeah loading websites on chrome lately is brutal on CPUs.
     
  20. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Hm, that test was singleplayer, but I've never noticed a difference in multi; in fact, the lack of mods should make it lighter, even though other players' data is needed. I can test it, I guess, but not right now, it's pretty late lol. The point I was trying to make is that I've never had any issues in singleplayer or multiplayer since I got this desktop, and a 4.3GHz 5960X should be outperforming me.
     
  21. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah. GTA V is odd, though. It seems to run a bit differently on all sorts of hardware. But in general, it's a CPU devourer in a lot of scenarios. Unless it's somehow much better on W10 than it is on 8.1 (what I tried) and 7 (what tgipier uses).
     
    i_pk_pjers_i and Galm like this.
  22. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Yeah, idk. My laptop was 8.1 and had some issues. My desktop has always been 10 and shreds it, but it has like a 2x, if not more, powerful CPU.
     
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    3720QM? That's a 3.4GHz 4-core-load Ivy Bridge, right? Or was it the 4710MQ?

    Either way, your desktop is about 60% faster.

    But still, I think it might have something to do with GTA. I know for certain that sometimes when playing GTA I get a tank in FPS, and it's like some kind of Windows process just gobbles 20% of my CPU, and it only happens when GTA V is running. Probably their DRM being stupid or some crap. I eventually uninstalled GTA V because every update changed the performance a ton and because of that rogue process spike, but you know. Blah. Either way, I've been eyeing CPU usage in AAA titles since I got this machine, and there's a HUGE marked increase since 2015. It's enough that people with hexacores are fine, but quadcores generally are not. Yet I keep seeing people jumping in saying 6700HQs are fine, etc. It's happened since Skylake came out, and people don't seem to grasp that there are instances where a 6700HQ can bottleneck a 980M, let alone a 1060 or 1070.
     
    ajc9988 likes this.
  24. ekkolp

    ekkolp Notebook Evangelist

    Reputations:
    106
    Messages:
    539
    Likes Received:
    379
    Trophy Points:
    76
    GTA V is really badly optimized. But heck, 99% of PC games are.
     
  25. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    Yeah, just look at prime examples like Arkham Knight and AC: Unity.

    Sent from my SM-G935F using Tapatalk
     
    i_pk_pjers_i and hmscott like this.
  26. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Wouldn't say 99% of PC games are... just the AAA ones. The new consoles broke the mould. The old ones were the common denominator for so long that PCs could just take it easy, and there came a point (around 2010 or so) when developers started to realize that PCs could simply do more. This is when console titles started getting minimum graphics on everything, etc.

    Then the new consoles arrived, with tons of memory to fiddle with. PC titles took a SHARP decline after that. I look at and own a lot of titles, and when they come out I constantly find out how they run for certain people. In 2014, games started devouring vRAM like we had some infinite wellspring of it. We jumped CLEAN from 2GB of vRAM running 5760 x 1080 with maxed settings to games not even being able to turn on max settings without 3GB or more at 1920 x 1080, in the space of five months. By 2015, we had games where 4GB isn't even enough to max texture settings, like Black Ops 3 (and before the people who cry that you "don't need so much vRAM" come and try to dispute this: you *LITERALLY* cannot even SEE the "extra" textures, shadows, etc. in the options menu without 6GB of vRAM or more).

    In 2015, I started seeing titles chewing out CPUs like some kind of joke. If it wasn't multi-core, it needed high single-core speeds. I couldn't prevent Killing Floor 2 from going below 60fps no matter what I did until an optimization patch that happened within the last two months. It just WOULD happen when stuff was on-screen. The game would chew up my single-thread usage, despite "recommending" a quadcore (there's no way the CPUs they recommended would have held 60fps before the last two months either, even on min graphics). GTA V is a CPU hog. You've seen Deus Ex. Black Ops 3 above exactly 78fps just exponentially starts gulping CPU usage. In fact, lemme do a video on that. Here, I did a video on that while taking a break from typing this. Hell, Final Fantasy X HD takes 40% of my CPU to render a 30fps title that launched in 2002 on the PlayStation 2; that's ridiculous. BF1 as well, i5s are shredded. Overwatch, too, uses *FAR* too much CPU for its graphical fidelity. Please understand, though, that I can get 120fps flat in Overwatch 99% of the time... but the ratio of visual fidelity to CPU (and GPU) utilization is not high. The game looks clean, but is not graphically intense. DOOM not using Vulkan sucks CPU quite a bit too. Witcher 3 is another monster that loves eating CPUs for breakfast (lunch, dinner and midnight snack too).

    2016... I think this is the year that games start eating RAM. BF1 uses about 7GB by itself. Yeah, you heard me, 7GB on its own. That means 8GB is no longer "enough". 12GB could work if you close everything before booting it. 16GB is kind of a comfort spot. I have heard of Deus Ex easily sucking over 5GB of RAM on its own as well.

    Now, the indie titles? You know, the ones that usually are the most fun? Those are pretty much fine. How to Survive is great, and How to Survive 2 is different but still nice. Don't Starve Together, Binding of Isaac Rebirth/Afterbirth, Elite: Dangerous, etc all have much nicer optimization ratios. Granted not all of them look all that great, but their graphics qualities, whatever they are, are nicely proportionate to the load on a system. And that's what's important.

    So... are MOST games coming out for PC unoptimized messes? Nah. Not even close.
    But are the games people most often buy and want to play on PC unoptimized messes? Damn straight they are. They're the ones most in the public eye, too.

    The reason you have this book is because I started typing it after 2am. Blame my timezone. I need to publish minimum 1 book on at least 1 forum after 2am and before I sleep every day.
     
  27. vesayreve

    vesayreve Notebook Evangelist

    Reputations:
    22
    Messages:
    360
    Likes Received:
    50
    Trophy Points:
    41
    I think the possible CPU bottleneck is being exaggerated. Fallout 4 uses only 30% of the 6700HQ coupled with a 970M. So logically this CPU can still feed GPUs with three times the power of a 970M.
     
    hmscott, Paull, birdyhands and 2 others like this.
  28. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It's quite common to see such exaggeration.
     
    Paull, Prototime, birdyhands and 4 others like this.
  29. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    You don't have to be at 100% total utilization to bottleneck. One thread or core pegged at 100% handling a certain task will bottleneck, yet total CPU utilization will look low, since that figure is the overall usage averaged across all 4 or 8 threads.
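    The arithmetic behind this point can be sketched in a few lines. This is an illustration with made-up per-thread numbers (not measurements from any real game): one thread pegged at 100% on an 8-thread CPU still yields a "low-looking" aggregate figure, because task managers report the mean across threads.

    ```python
    # Hypothetical per-thread utilization samples (percent) for an 8-thread
    # CPU such as a 4-core/8-thread i7-6700HQ. One thread is saturated by a
    # game's main thread; the others are mostly idle. Numbers are illustrative.
    per_thread = [100, 20, 15, 10, 5, 5, 5, 0]

    # Aggregate utilization, as a task manager would report it: the mean.
    total = sum(per_thread) / len(per_thread)

    print(f"Total CPU utilization: {total:.1f}%")   # prints 20.0% -- looks "low"
    print(f"Busiest thread: {max(per_thread)}%")    # prints 100% -- the real bottleneck
    ```

    So a reading of "only 30%" overall says nothing about whether one critical thread is already maxed out; you have to look at per-core/per-thread usage to spot a single-thread bottleneck.
    
    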
     
  30. Ionising_Radiation

    Ionising_Radiation Δv = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    @D2 Ultima - X-Plane 9 and now 10 have been killing CPUs, GPUs, RAM and VRAM for nearly a decade now. And there was FSX before X-Plane became increasingly popular.

    Flight simulators have tended to become 'Earth simulators', with demand for realistic scenery, detailed, accurate weather depiction and even more accurate flight physics.

    Now the devs of X-Plane want to experiment with Vulkan, physically-based rendering, real reflections (not specular maps, which is what FSX uses), ambient occlusion, etc.

    Literally the only thing we're missing is actual soft-body physics deformation of the planes' fuselages and flying surfaces, à la BeamNG.Drive.
     
    ajc9988 and birdyhands like this.
  31. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    Since when did X-Plane become increasingly popular?
    :D :D :D
     
  32. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Take this to PM, mate. I said a lot of things, and you fixated on the least important of them... I'm not particularly interested in debating flight sims now. I was merely using them as an example of nearly the pinnacle of real programs (not synthetic benches): running for extended periods of time and exerting very, very intense loads on a computer.
     
    birdyhands likes this.
  33. ThePerfectStorm

    ThePerfectStorm Notebook Deity

    Reputations:
    683
    Messages:
    1,452
    Likes Received:
    1,118
    Trophy Points:
    181
    ajc9988, Papusan, birdyhands and 2 others like this.
  34. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    Chill, just making fun, no need to make a storm in a teacup ;)

    So, how 'bout them 1000s series in laptops.

    But wait, GTA V is a bad port now? And here I was thinking it was nice...
     
    Paull and birdyhands like this.
  35. ZeneticX

    ZeneticX Notebook Evangelist

    Reputations:
    16
    Messages:
    316
    Likes Received:
    238
    Trophy Points:
    56
    Shame, wasted potential for a great chassis. But then, I'll have my doubts about the cooling as well until reviews are out. See what happened to the Predator 17...
     
    birdyhands likes this.
  36. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,657
    Trophy Points:
    931
    That guess would be 100% accurate. No point in spending money on disposable trash. They had a chance to finally do something great after a lifelong history of producing low-end consumer turdbooks and chose to stay on the low road. I guess "once a loser, always a loser" is the best way to characterize a lame outfit like Acer. I suppose they just needed something extra expensive to go along with their cheap stuff. They had our attention for a few minutes though.
     
    Last edited: Sep 5, 2016
    TomJGX, ajc9988, steberg and 4 others like this.
  37. aqnb

    aqnb Notebook Evangelist

    Reputations:
    433
    Messages:
    578
    Likes Received:
    648
    Trophy Points:
    106
    NotebookCheck review of MSI GS43VR with GTX 1060:

    http://www.notebookcheck.net/MSI-GS43VR-6RE-Phantom-Pro-Notebook-Review.172721.0.html

    It doesn't seem to be doing too badly - at least it's apparently no worse than its Maxwell predecessor, the MSI GS40 with GTX 970M.

    http://www.notebookcheck.net/MSI-GS40-6QE-Phantom-Notebook-Review.158147.0.html

    Still about as hot and loud, but what else can you expect from a 23 mm-thin 14" machine :). At least it's now noticeably faster.

    Unigine Valley stress test: CPU max 74 C, GPU max 81 C.
    Prime95 + Furmark stress test: CPU max 89 C, GPU max 88 C.

    GS40 got:

    Prime95 + Furmark stress test: CPU max 89 C, GPU max 87 C.

    Chassis max temp is now 62 C vs 69 C for GS40.

    -------

    Edit: oops, wrong link, thanks to @PMF for the correction ;)
     
    Last edited: Sep 5, 2016
    Robbo99999, Prototime and PMF like this.
  38. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    The 4700MQ was holding around 3.1 GHz on 4 cores, not 3.4; that's the 1-core speed. My 5820K can hold 4.5 GHz with ease and has 2 extra cores, so that's how I got the "100% better" math. Not exactly correct, but I wasn't trying to be exact. GTA V doesn't utilize the last 2 cores very much though, compared to, say, Battlefield.
     
    birdyhands likes this.
  39. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,652
    Trophy Points:
    931
    ajc9988 likes this.
  40. PMF

    PMF Notebook Consultant

    Reputations:
    19
    Messages:
    284
    Likes Received:
    251
    Trophy Points:
    76
  41. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    4710MQ is what I saw in your sig and that's supposed to be 3.3GHz, so I was comparing it to the 4.1GHz you had.

    But yes, I understand.
     
    Galm likes this.
  42. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    I can't max the game at 3440x1440 / 60 fps. Simply not possible with an OCed Titan XP. You'd need two.

    The way I tell a CPU bottleneck is quite simple: I just look at GPU usage. Any time GPU usage isn't 97/98/99/100%, I assume it's a CPU bottleneck, since I have vsync off. I'm not being bottlenecked at 60 fps for the most part, but I think it happens at higher fps.

    From what I've seen of GTA V, it takes 6 threads. For quad cores, HT is useful; for hexa/octo, less so.

    To me, a 6700HQ is only acceptable in laptops with a light workload.
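That heuristic, reading GPU usage to infer a CPU limit, can be sketched as a tiny classifier over logged utilization samples. This is a Python sketch with hypothetical data and an assumed 97% threshold; a real log would come from a tool such as MSI Afterburner or nvidia-smi:

```python
def cpu_bound_fraction(gpu_usage_samples, threshold=97):
    """Fraction of samples where the GPU sat below the saturation
    threshold, i.e. was likely waiting on the CPU (vsync assumed off)."""
    below = [s for s in gpu_usage_samples if s < threshold]
    return len(below) / len(gpu_usage_samples)

# Hypothetical log: mostly saturated, with a few dips in a CPU-heavy scene.
samples = [99, 100, 98, 72, 65, 99, 100, 97, 58, 99]
print(f"CPU-bound roughly {cpu_bound_fraction(samples):.0%} of the time")
```

Note the heuristic only holds with vsync and frame caps off; otherwise the GPU idles by design and low usage means nothing.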
     
    Papusan likes this.
  43. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    I meant that, from what I just showed, you should not be dropping below 60 from a CPU bottleneck.
     
  44. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    Yeah, you're right. Turns out GTA V is probably spreading the load across all 6 threads enough that my CPU even refused to clock to the full 4.4 GHz. I need to fix this.
     
    birdyhands likes this.
  45. birdyhands

    birdyhands Notebook Consultant

    Reputations:
    64
    Messages:
    235
    Likes Received:
    214
    Trophy Points:
    56
    Most of these posts haven't deterred me from buying the laptop with the 6700HQ, but I am interested in this "light workload" you speak of. The latest Lenovo ThinkPad (a business notebook) has a 5200U, which is just a slightly better CPU than the 6700HQ.

    Edit: the ThinkPad actually has a 6200U, my mistake, but it's still only slightly better.

    Sent from my iPhone using Tapatalk
     
    Last edited: Sep 5, 2016
  46. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    How is a 5200U better than a 6700HQ again?


    Also, GTA V saw a max of ~28% CPU usage when I forced 4.4 GHz on all cores and forced it onto the non-HT cores only.

    Which pretty much means GTA V is completely occupying four 4.4 GHz non-hyperthreaded Haswell-E cores.
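A quick worked version of that arithmetic (Python sketch, using the figures from this post and assuming the aggregate percentage is averaged over all 12 of a 5820K's hardware threads, as Task Manager does):

```python
# A 5820K exposes 12 hardware threads (6 cores with Hyper-Threading on),
# and the OS's aggregate CPU % averages over all of them.
threads = 12
aggregate_pct = 28  # observed total CPU usage in GTA V

# Equivalent number of fully loaded threads behind that aggregate figure.
busy_threads = aggregate_pct / 100 * threads
print(f"~{busy_threads:.1f} threads' worth of work")
```

That comes out to roughly 3.4 fully loaded threads, which is consistent with the post's reading of "about four cores completely occupied".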
     
    Last edited: Sep 5, 2016
  47. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    The 6200U is inferior to the 6700HQ. It's part of Intel's ultra-low-voltage processor series made for ultrabooks, and it's only a dual core, too.

    Sent from my SM-G935T using Tapatalk
     
  48. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    The 4700MQ is 3.4 GHz quad-core, 3.6 GHz single-core, via the turbo bins with Intel XTU. At stock it runs at 3.2 GHz quad-core, 3.4 GHz single-core.
     
  49. birdyhands

    birdyhands Notebook Consultant

    Reputations:
    64
    Messages:
    235
    Likes Received:
    214
    Trophy Points:
    56
    You would think a business-class notebook would have pretty powerful processing under the hood for workloads, but I guess that's wrong. My point was: you're saying a 6700HQ is only suited to light workloads? I don't think so. Look at the 15-inch MacBook Pro with Retina; it only has a 4980HQ.


    Sent from my iPhone using Tapatalk
     
  50. Xileforce

    Xileforce Notebook Evangelist

    Reputations:
    15
    Messages:
    521
    Likes Received:
    222
    Trophy Points:
    56
    The 4980HQ is pretty fast. Faster than the 6700HQ, more than likely. The main advantage of Skylake is its efficiency; there wasn't much gain in performance.

    Sent from my SM-G935T using Tapatalk
     
    birdyhands likes this.