The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    GTX 980M / 970M Maxwell (Un)official news

    Discussion in 'Gaming (Software and Graphics Cards)' started by HSN21, Sep 18, 2014.

  1. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    They COULD use 6GB of vRAM, but that leaves nothing for the game itself. I was simply saying that the entire 5.5GB (X1) or 6GB (PS4) memory pool must cover both what we call system RAM usage (i.e. ~2GB on average for most games) and vRAM usage. In other words, if a game on PC uses 2GB of RAM, then it should use something similar on a console, and thus never use more than 4GB of vRAM on PC (as that would surpass the limit the consoles have). If a game uses 6GB+ of vRAM on PC while using 1.5-2GB of RAM, that's way beyond console limits and FURTHER points to serious unoptimization.
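    To put rough numbers on that, here's a quick Python sketch (the pool sizes are the usable figures above; the game numbers are made up purely for illustration):

    Code:
    # Usable unified memory pools from the post, in GB
    POOLS = {"XB1": 5.5, "PS4": 6.0}

    def over_console_budget(ram_gb, vram_gb, console="PS4"):
        # A PC game whose combined RAM + vRAM use exceeds the console's
        # ENTIRE unified pool cannot be mirroring the console build.
        return ram_gb + vram_gb > POOLS[console]

    print(over_console_budget(2.0, 4.0))  # False: 6GB total, right at the PS4 limit
    print(over_console_budget(1.5, 6.0))  # True: 7.5GB total, past any console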

    And yes, sales work for essentially dead games in that way. I don't blame you for it. But the main point is that we need them to know they're doing things wrong. If they want to get pissy about it because their sales aren't high when their game makes BF4 look as demanding as CoD 4 and their game LOOKS like CoD 4 itself, then that's their own fault. But it's what is happening now, and we need to get rid of that problem lickety split.
     
  2. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    I'm aware; no one is saying they will use 5.9GB of VRAM and leave 100MB for actual RAM. If a developer uses 3GB+ of VRAM on a 720p, no-AA direct port to PC without optimization, then 6GB for 1080p/AA/ultra settings seems "normal". We will deal with plenty of those soon; I have already posted 3 recent examples.
     
  3. naldor

    naldor Notebook Consultant

    Reputations:
    162
    Messages:
    117
    Likes Received:
    27
    Trophy Points:
    41
  4. Tonrac

    Tonrac Notebook Evangelist

    Reputations:
    301
    Messages:
    336
    Likes Received:
    82
    Trophy Points:
    41
    If you look at the preliminary bench results of the GTX 980M, the Notebookcheck specifications seem to be more accurate.
    The bench results show that the 980M seems to be an underclocked desktop GTX 970.
     
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    That's actually
    4710HQ: 3.5GHz with HD 4600 iGPU
    4810HQ: 3.6GHz with Iris Pro 5200 iGPU
    4980HQ: 4.0GHz with Iris Pro 5200 iGPU

    The distinction is really with the dedicated GPU

    And HQ = Soldered, MQ = Socketed

    I would guess the 4980HQ will definitely throttle except with the most efficient of coolers, and soldered CPUs are usually geared more towards thin-and-lights. The 4810MQ will likely throttle as well. But it may be worth it for the Iris Pro, at least for gaming on battery.

    That's just total crap. Even the highest-end desktop GPUs only have 3GB or 4GB of vRAM: GTX 780, 780 Ti, 970, 980. Only the $1000+ NVIDIA GPUs start to have more than 4GB. Ridiculous.

     
    octiceps likes this.
  6. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    LOL. It has been like this since around 2004, when consoles started to gain more traction. I noticed how little they care about us when I played GTA IV. One of my favorite games of all time, and nothing on the planet can run it without problems. Then Rockstar never released Red Dead Redemption for PC. Too much effort, because "there is no PC market." There were 7 million people watching a single (!) DOTA 2 tournament and 67 million playing LoL. Definitely no PC market... There are at least a million people on Steam a day. Valve is showing all these morons. And then GTA V was praised for its business practices because it made a billion dollars in a week. I think it cost them over $200 million to make the game. No one mentions how CoD made a billion when the latest game came out, and CoD doesn't take 5 years to make; there is one every year. These guys are morons. They say there are not enough PC copies sold, but they only count physical copies. Valve took a survey and most users do not have more than 4GB of RAM. You don't need it. I am surprised you guys didn't mention that the new consoles use an 8-core processor; I have a feeling games are going to be optimized around that too.
     
    octiceps likes this.
  7. Derek712

    Derek712 Notebook Virtuoso

    Reputations:
    462
    Messages:
    2,574
    Likes Received:
    999
    Trophy Points:
    131
    I tested a few older titles on the Aorus X3+ to see if Iris Pro was any more efficient than the 870M it packed. On average, I was consuming 10 percent less battery with the Iris Pro than with the 870M. Not worth it if you ask me, considering the CPU costs so much more and you have to manually force the system to use the iGPU when you want the savings.
     
    heibk201 and Ningyo like this.
  8. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Plus, the 9xx series is supposed to be improved when running on battery. Iris is nearly useless when you have a real GPU, and the 970M is supposed to use less power than the 870M too.
     
    Ningyo likes this.
  9. Curunen

    Curunen Notebook Consultant

    Reputations:
    10
    Messages:
    127
    Likes Received:
    16
    Trophy Points:
    31
    OK, I haven't been following the news too thoroughly so I'm a little out of the loop - no new laptops announced yet?

    Please don't say they're holding out for broadwell next year...
     
  10. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    So many preorders were up, and they still keep getting taken down (NVIDIA ninjas). You will be able to order one Oct 7-11, so basically 2 weeks max, 100%.
     
    Ningyo likes this.
  11. Curunen

    Curunen Notebook Consultant

    Reputations:
    10
    Messages:
    127
    Likes Received:
    16
    Trophy Points:
    31
    Good to know, thanks. :)
     
  12. tlprtr19

    tlprtr19 Notebook Evangelist

    Reputations:
    393
    Messages:
    390
    Likes Received:
    96
    Trophy Points:
    41
    As with 20nm: TSMC said their fabs would have mass-production capability back in Q4 2012. Later they delayed it to Q2 2013. In Q2-Q3 2013, reports emerged that something had gone wrong (density and leakage issues on 20nm). In Q4 2013 they came out and said they were delaying it to Q2 2014 until they sorted out density defects and reproducibility (I am assuming TSMC had already told Nvidia under the table that high-power SoCs were not possible - Nvidia scrambled to form a new architecture on 28nm by 2013 and released Kepler refreshes by Q1-Q2 2014). Contrary to all the rumors, reports began to emerge that high-power chips were not possible and that low-power SoCs were being manufactured for Apple at 20nm. The whole scenario played out within the '12-'14 timeframe.

    Coming back to 16nm: 16nm FinFET has already been demonstrated by various R&D labs to show better gate density, lower leakage and better reproducibility. AMD's de facto foundry GlobalFoundries focused heavily on 14nm rather than 16nm because of Intel's competition (14nm Broadwell). They have also started research on 12nm, FYI.
    http://globalfoundries.com/docs/def...dries-14nm-collaboration---final.pdf?sfvrsn=2
    TSMC, on the other hand, announced 16nm FinFET adoption, which pleased major tech companies. Now that Nvidia's roadmap shows Pascal in 2016 (remember, roadmaps can change when issues come up), it's assumed they will probably go for 16nm. Many companies have shown their dislike of 20nm and are moving away from it.
    http://www.extremetech.com/computin...ds-if-you-ask-the-engineers-or-the-economists
    http://www.extremetech.com/computin...m-finfet-tapeout-of-big-little-cortex-a57-soc

    As I stated above, there is a roughly 1-year gap between mass-production capability in a TSMC fab and the tapeout and production of actual orders. Which means that if TSMC announces 16nm mass-production capability in Q4 2015, then assuming no unforeseen hurdles (unlikely - the smaller the node, the more problems, aka nanotechnology behavior limits), 2016 is the likely year. GM204 and GM200 are plenty capable. Also, Nvidia has invested a lot in changing the architecture (Kepler to Maxwell), therefore IMO it makes no sense to change node within a year (2015) when 16nm is going to be ready in 2016. Nvidia would have already started research on Pascal and initiated trials of its architecture and node shrink. They also need to finalize the chip design before finally submitting orders for mass production (basically, too busy to be planning two node shrinks, i.e. 20nm and 16nm, within a 1-2 year timeframe). I think they will just go ahead and milk Maxwell 3.0 on 28nm next year until 16nm Pascal matures. No bets, just a gut feeling.
     
    LostCoast707 and Ningyo like this.
  13. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    The next cards will remain 28nm; they still have a 980 Ti, a new Titan and a 990, a new Titan Black, a new 990 built from the Blacks, and don't forget rebrands with higher clock speeds etc.
    I expect ~20nm Maxwell August-September 2015 and Pascal Oct-Nov 2016
     
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You're also misunderstanding exactly how much of an impact AA and resolution bumps have on vRAM usage. You should check the findings in my vRAM guide. Bumping from 720p to 1080p and bumping AA from off to 4x MSAA might be... 200MB extra vRAM? At most? Anything more and it's clear unoptimization by devs. End of story. 4k res at MOST throws on like 400MB of vRAM extra over 1080p, if so much... AA included.
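    If anyone wants to sanity-check that, the back-of-envelope math is simple. A sketch in Python that counts only color + depth render targets at 4 bytes per pixel (texture and geometry memory doesn't grow with a resolution bump):

    Code:
    def render_target_mb(width, height, msaa=1, buffers=2, bytes_pp=4):
        # buffers=2 ~ one color + one depth target; MSAA multiplies samples
        return width * height * msaa * buffers * bytes_pp / 1024**2

    print(round(render_target_mb(1280, 720)))           # ~7MB: 720p, no AA
    print(round(render_target_mb(1920, 1080, msaa=4)))  # ~63MB: 1080p, 4x MSAA
    print(round(render_target_mb(3840, 2160)))          # ~63MB: 4K, no AA

    Real engines keep more targets than that (G-buffers, post-processing), but even several times these numbers is a couple hundred MB, not gigabytes.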
     
    octiceps likes this.
  15. tlprtr19

    tlprtr19 Notebook Evangelist

    Reputations:
    393
    Messages:
    390
    Likes Received:
    96
    Trophy Points:
    41
    Nvidia may skip 20nm process technology, jump straight to 16nm | KitGuru
    What I intended to say in my previous post is that there is no logical sense in doing 20nm and then 16nm all in a 1-2 year timeframe. Not a good idea from a semiconductor company's perspective (technology- and business-wise).
     
  16. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Sweclockers made a boo boo and changed the specs for GTX 980M down from 2048 cores to 1664.
    Shocker :rolleyes:
     
    Tonrac and Ningyo like this.
  17. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    That is normally correct, though it depends on the game. For instance, many DotA/RTS-style games increase the viewing area with increased resolution while leaving each object the same size in pixels. This means they may need to load far more textures and such into VRAM. Admittedly this almost exclusively affects game types that normally don't need much VRAM to begin with, but if you were, for instance, using a 4K-resolution laptop with a lower-end GPU that only has 2GB of vRAM (I think the 840M has that?), then you might run into issues with some of these games.

    Most FPS/RPG games, on the other hand, scale up the size of objects so the same number fit on the screen; they are just in higher detail. In those cases you are absolutely correct: the impact on vRAM usage is extremely minimal. And since those are the types of games that normally use by far the most vRAM, this is normally not an issue.
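    A toy illustration of the first case in Python (the 64-pixel tile size is an arbitrary assumption):

    Code:
    def visible_tiles(width, height, tile_px=64):
        # RTS-style scaling: each object stays the same size in pixels,
        # so a higher resolution simply shows more of the map at once.
        return (width // tile_px) * (height // tile_px)

    print(visible_tiles(1920, 1080))  # 480 tiles visible at 1080p
    print(visible_tiles(3840, 2160))  # 1980 tiles at 4K -- ~4x the texture streaming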
     
    D2 Ultima likes this.
  18. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    lol have you guys seen the VRAM requirements for "Shadow of Mordor"?
    6GB+ for Ultra, 3GB+ for High.

    :p
    [Image: recommended specs]
    I'm glad Maxwell is almost here. Will get 6GB GTX 970M SLI, or 8GB GTX 980M if it's not too expensive over the 3GB/4GB versions.
     
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The Evil Within:

    http://www.eurogamer.net/articles/2014-09-26-bethesda-warns-you-should-have-4gb-of-vram-to-play-the-evil-within-pc


    I'd say one should prepare for more games to follow this in the future, now that the PS4/Xbox One consoles use the same architecture and devs will do these ports over to PC with high requirements for VRAM and specs.
     
    Robbo99999 likes this.
  20. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Exactly. All these games are throwing on FAR more vRAM than they need without any kind of increase in quality at all, and nobody is saying anything about it. It is NOT necessary. I have very rarely found such an increase in vRAM necessary at all.
     
  21. LemonLarryLP

    LemonLarryLP Notebook Consultant

    Reputations:
    0
    Messages:
    146
    Likes Received:
    9
    Trophy Points:
    31
    Hi,
    I want to buy a new notebook by the 20th of October, and I wanted to know whether the MSI GS70 gets the NVIDIA GTX 970M or the 980M.
    Are there any other light notebooks (15" and 17") which will get the 900M graphics cards, apart from the MSI GS60?
    I know these ones already: MSI GS70, MSI GS60, Asus G750 series and the Gigabyte P25x v2.
    Which one do you recommend most?
    So far I prefer the MSI GS70, since it's the lightest and, for me, the cheapest for the hardware it has.
    Any other recommendations?
     
  22. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    We can all blame Watch Dogs for this. Thankfully I have 0 interest in either of those games.
     
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It actually started with CoD: Ghosts and was followed up by Titanfall and Wolfenstein. I think Watch Dogs released later than those 3 titles, or at LEAST on the same day as Wolfenstein.
     
    TBoneSan likes this.
  24. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The GS70 gets the GTX 970M. The GS70 is only called 2QE, and a 2QD version doesn't exist. 2QE = GTX 970M.

    I have no idea what to pick out of those notebooks. Never used any of them before.
    The Aorus will also come with GTX 970M SLI, btw.

    MSI will also offer the GT60 with the GTX 970M:
    http://www.acadia-info.com/materiel-pc-grossiste/11260-MSI-GT60-2QDDominator-1077XFR

    There is also a GT60 version with the GTX 980M coming, but it's not listed yet.
     
  25. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    If it's an insane texture resolution, it will make sense.

    The same people complaining about current-gen VRAM requirements are the same ones who berated me for saying that system requirements would rise significantly with the X1/PS4 release.

    Learn to forecast. 4GB will be the minimum from here on out.
     
    Tonrac likes this.
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Exactly. Worst-case scenario, I've seen 4K use 2x as much VRAM as 1080p at the same settings.
     
  27. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431

    Yeah. People are quick to bash the consoles but forget that the previous consoles' requirements were so low that they were holding back requirements overall. Thanks to the new consoles we will see a rise in overall quality, and everything will be pushed to the limit. The problem is that the requirements will carry a hefty tag from now on.

    We already see HDDs requiring 50GB for games, RAM and vRAM getting a huge bump in requirements, etc., and now multicore CPUs are even minimum requirements.
     
  28. ryzeki

    ryzeki Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    It's very different to use an asset and run it at a higher resolution than to use very-high-resolution assets to begin with and then run them at high resolution.

    Using the same assets and running at a higher resolution has a low impact on vRAM, because it only adds extra information for the image rather than loading huge files into VRAM. It was due to this that Microsoft created tiled resources, so that you can keep huge assets in RAM while keeping vRAM usage low, since it is kind of a waste of memory :p
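    A toy model of that idea in plain Python (not the actual D3D API, just the shape of it): the full asset stays in system RAM and only the tiles actually sampled get mapped into a small VRAM pool.

    Code:
    class TiledTexture:
        def __init__(self, tiles):
            self.tiles = tiles    # full asset, kept in system RAM
            self.resident = {}    # small subset mapped into VRAM

        def sample(self, tile_id):
            if tile_id not in self.resident:  # "page in" on first touch
                self.resident[tile_id] = self.tiles[tile_id]
            return self.resident[tile_id]

    tex = TiledTexture({i: bytes(64 * 1024) for i in range(1024)})  # ~64MB asset
    tex.sample(3); tex.sample(7)
    print(len(tex.resident), "of", len(tex.tiles), "tiles in VRAM")  # 2 of 1024

    A real implementation also evicts cold tiles; that's left out here for brevity.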
     
    Ningyo likes this.
  29. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    The problem is that game developers are developing for the unified memory of the consoles... If these games were optimized for the PC, the requirements would go down. They don't want to spend any resources doing that though, regardless of the pain it dumps on their paying customers.

    LAZY developers are why the requirements are going up, and we have the option to choose not to buy their games.
     
    octiceps likes this.
  30. marcos669

    marcos669 Notebook Evangelist

    Reputations:
    359
    Messages:
    300
    Likes Received:
    71
    Trophy Points:
    41
    Yes, yes, the Xbox One uses 3GB of RAM just for the OS, and the PS4 quite a lot too.
     
  31. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I know I expected them to increase, and as such was elated that I was getting 4GB vRAM cards back in 2013 before all of this talk of super high memory requirements and new consoles releasing... but that doesn't excuse the unnecessarily high requirements that devs are making... because the games are not comparably improving visually. What is the point of high resolution textures that aren't even drawn better than some game from 2012? It's just inflated, and it has no purpose doing so. The whole "well PC can toss a stronger graphics card at it" mantra has to evaporate. IF that game makes Cryengine 3 games look like Far Cry 1? I'll shut up. But it really doesn't, as far as I know.
     
  32. LemonLarryLP

    LemonLarryLP Notebook Consultant

    Reputations:
    0
    Messages:
    146
    Likes Received:
    9
    Trophy Points:
    31
    The Aorus is too expensive for me.
    I would prefer a notebook for at most 1600€, which is about $2030.
    I found the Aorus X7 with the NVIDIA GTX 765M 2GB SLI for 1800€, which is way too expensive for me.
    Any idea what the prices are going to be for the NVIDIA GTX 970M/GTX 980M notebooks?
    I need the product shipped to Austria, Europe.
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Even with unified memory, I believe 5.5-6GB is their limit of usable memory. 4GB of vRAM and 2GB of system RAM should be it. For a PC version to use 2GB+ of system RAM and 6GB+ of vRAM means it has surpassed the console architecture limitations and is just really going crazy.
     
  34. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    6GB of vRAM on Ultra at FHD means nothing except that the game is HEAVILY UNOPTIMIZED. The game should have a changeable parameter for LOD distance or draw distance, etc. That should be fairly easy to change in the engine settings.
    Something tells me that in one month this game will become less performance hungry.
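    Something like this is all a draw-distance/LOD slider has to feed the engine (a sketch with hypothetical cutoffs, purely to show the shape of the knob):

    Code:
    def lod_level(distance_m, cutoffs=(25, 75, 200)):
        # 0 = full-detail asset, higher = cheaper. Tightening the cutoffs
        # is exactly the knob that trades high-res assets (and vRAM) for speed.
        for level, cutoff in enumerate(cutoffs):
            if distance_m < cutoff:
                return level
        return len(cutoffs)

    print([lod_level(d) for d in (10, 50, 150, 500)])  # [0, 1, 2, 3]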
     
  35. mgi

    mgi Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    @LemonLarryLP: You might fit an Asus 751 with a 970M and no SSD into that kind of money... or something with an 880M at a healthy discount.
     
  36. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    yeah watchdogs is a really good game, but the game runs like turd with my 860m
     
  37. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    yeah watchdogs is a really good game, but the game runs like turd with <S>my 860m</S> everyone's video cards world over especially AMD users who got screwed so badly it could go on pornhub

    *fixed
     
    octiceps, FuDoW, TomJGX and 2 others like this.
  38. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    Iris Pro is nowhere near as battery efficient as you think. At max performance, Iris Pro consumes 35W and performs like a 740M, so in the end the performance/watt is about the same as, if not less than, a Kepler GPU.
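    The arithmetic behind that (the 35W figure is from this post; the frame rates and Kepler wattage are made-up but plausible same-game numbers, just to show the comparison):

    Code:
    def perf_per_watt(fps, watts):
        return fps / watts

    print(round(perf_per_watt(30, 35), 2))  # Iris Pro 5200 @ 35W: ~0.86 fps/W
    print(round(perf_per_watt(30, 33), 2))  # 740M-class Kepler @ ~33W: ~0.91 fps/W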
     
  39. heibk201

    heibk201 Notebook Deity

    Reputations:
    505
    Messages:
    1,307
    Likes Received:
    341
    Trophy Points:
    101
    LOL
    You should see Wolfenstein though; that game runs like complete sh!t if you don't have 3GB+ of VRAM.

    What I don't understand is how the hell they are taking up that much VRAM at 1080p. If it's 3K or 4K, at least it's understandable, but at 1080p it's just ridiculous. And what's most interesting is that the amount of VRAM a game requires is usually inversely related to the actual quality of its textures; the world is a funny place.

    Wolfenstein has sh!tty graphics regardless, Titanfall is much less amazing than BF4 with the same amount of VRAM, and Watch Dogs runs BETTER with LESS VRAM after the mod.

    I think we can draw some conclusions here
     
  40. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    Since people seem to keep getting their numbers wrong, this list should in general be in order from fastest to slowest, though the 4910MQ has an 8MB L3 cache while all the rest only have a 6MB one, which may push it above the 4980HQ in some tasks.

    i7-4980HQ 2800-4000 MHz (Iris Pro)
    i7-4910MQ 2900-3900 MHz (8MB L3 cache)
    i7-4810MQ 2800-3800 MHz
    i7-4860HQ 2400-3600 MHz (Iris Pro)
    i7-4710HQ 2500-3500 MHz
     
  41. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Don't avoid the language filters, bro ^_^.

    Anyway, yes, I know. The mod, as I mentioned earlier in one of these threads today, repacks the game with the same "ultra" textures, but the stutter is gone, because "ultra" is coded to cause stuttering. It's just bad.

    Also, 3GB of vRAM at 4k when it takes 2GB at 1080p is something I could get behind. But it doesn't mean 4K needs 4GB+ vRAM. That's STILL bad optimization. If you're interested, read my vRAM information compilation in my sig. You'll probably have MORE hatred for it after you internalize all that information haha.
     
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You are listing the maximum turbo clocks there. You need to find the 4-core turbo clocks. The HQ chips MIGHT have a different turbo-clock trend from the MQ chips, which are generally +1000MHz on 1 core, +900MHz on 2 cores, and +800MHz on 3-4 cores above base. For example, that 4980HQ has a +1200MHz maximum turbo boost. It is quite possible it runs its 3-4 core clocks at +800MHz while 2-core is +1000 and 1-core is +1200... in that case it WILL be weaker than the 4910MQ, which has a 3700MHz 4-core boost clock, versus the 4980HQ's possible 3600MHz 4-core turbo.

    Also, you should find the official OC capabilities. If the HQ chips have zero OC potential, then the 4800MQ and above are superior to them all in terms of possible performance, as they can hit 3.9GHz (or more, with BCLK OCing), which might very well surpass every single HQ chip out there.
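    A quick sketch of that bin math (the +1000/+900/+800/+800 offsets are the MQ trend described above and might not hold for HQ chips):

    Code:
    def turbo_mhz(base_mhz, active_cores, bins=(1000, 900, 800, 800)):
        # bins[n-1] = turbo headroom with n cores active
        return base_mhz + bins[active_cores - 1]

    # i7-4910MQ, 2900MHz base: the 1-core result matches its rated 3900MHz max
    for n in range(1, 5):
        print(n, "core(s):", turbo_mhz(2900, n))  # 3900 / 3800 / 3700 / 3700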
     
    Ningyo likes this.
  43. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    True, I was responding to all the posts with the incorrect max numbers; there is more to it, like you say.

    They actually do rank in that order in most benchmarks, though. Overclocking is another story; I have no clue how they handle that.

    Oh, and the first of those numbers was the base clock, so I did mention those too.
     
  44. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    Me, I would get the cheapest one that offers the 8MB cache. That's going to make a bigger difference than ±400MHz. Of course there would be slightly better performance with the higher clocks, but it sounds like overclocking past 4.2GHz or thereabouts is difficult, especially with heat dissipation.
     
  45. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't think 2MB of L3 cache is better than +400MHz for most applications, but it's true extra cache does help some things. The 49x0MQ is the lowest one with 8MB L3 cache.
     
  46. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    Give or take. That extra 400MHz on four cores might be better; you're probably right. But I think you could hit 4.0GHz stable on the 4900MQ, and I have read about bad experiences with overclocking beyond that. I think it was on the NP9377 thread with the 4940MQ; it was really hard to keep a stable overclock without throttling. Then again, I think that was around 4.5GHz, which is pretty good, but it wasn't stable. It would be a lot better to just make 6-core mobile CPUs. Qualcomm is releasing octa-core mobile processors next year. Maybe it will be possible with Skylake.
     
  47. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The 4980HQ also has 128MB of L4 cache, of course ;)
     
    octiceps likes this.
  48. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    The PS4 CPU is actually weaker than the PS3 CPU.
    The CPU won't bottleneck us anytime soon unless you are going to SLI or use an external monitor at 144Hz; no single mobile GPU is fast enough yet to be bottlenecked by a CPU.
     
    Hellmanjk likes this.
  49. Schmimar

    Schmimar Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    1
    Trophy Points:
    6
    I have a solution for all that: just buy a Titan Z
    XD
     
  50. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    What kind of CPU should one get with SLI then? I wondered about this. And that's good, but I just thought more games would be optimized around 8-core processors from now on.
     
← Previous page | Next page →