The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    GTX 980M / 970M Maxwell (Un)official news

    Discussion in 'Gaming (Software and Graphics Cards)' started by HSN21, Sep 18, 2014.

  1. mgi

    mgi Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    There has also been a listing of an Asus ROG G751JY-T7054H model in France; the original link is this.

    It has since been pulled, but you can see the configuration in the description. The price was around 2,500 EUR.

    Edit: The page is still available in Google's cache, sans the price though. See here.
     
  2. Kaozm

    Kaozm Notebook Evangelist

    Reputations:
    328
    Messages:
    368
    Likes Received:
    284
    Trophy Points:
    76
    What is the difference in temperature between those CPUs (same rig)? Can't find any...

    Edit:
    4710HQ
    4860HQ
    4980HQ
     
  3. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    He said 4810 though, not 4860.

    i7-4810 = 2.8-3.8 GHz
    i7-4860 = 2.4-3.6 GHz

    Going from the i7-4710 to the i7-4810 is probably a slightly larger gap than from the i7-4810 to the i7-4980.

    Be careful though: the numbering systems are very misleading. Higher is not always better, and sometimes you pay a ton more for one chip than another just because it has a better integrated GPU, which is near useless if you have a dedicated GPU.

    EDIT: those are all 47W TDP; some may run hotter or cooler, but they are all very close. Also, even if a 4810 were to run hotter at full load than a 4710, it would likely run cooler than a 4710 at an equal workload, since it finishes the same work at lower utilization (see the sketch below). So it will be very dependent on the exact usage.
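
    A toy model of that point (Python; the power curve and clock figures are assumptions for illustration, not measurements):

        # Why a faster chip can run cooler at the SAME workload (toy model, assumed numbers):
        # a higher-clocked part finishes the same per-frame work at lower utilization.
        def package_power_w(load_fraction, idle_w=10.0, max_w=47.0):
            """Very rough linear power model between idle and full TDP."""
            return idle_w + load_fraction * (max_w - idle_w)

        work_ghz = 3.0  # GHz-equivalent of work the game actually needs
        print(package_power_w(work_ghz / 3.5))  # slower chip at ~86% load -> ~41.7W
        print(package_power_w(work_ghz / 3.8))  # faster chip at ~79% load -> ~39.2W

    Lower package power at the same workload generally means lower temperature, all else equal.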

    If you need a cooler CPU, the 4702 and 4712 are both 37W TDP; they do run at lower clock speeds though. That should not be a problem in gaming unless maybe you have an SLI system, since very few games bottleneck on the CPU.

    On the other end there is the 4930MX, which is 57W TDP and should be even a bit faster than the 4980HQ, but also run hotter.
     
    Cloudfire likes this.
  4. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    You guys sure make a CPU more complicated than it really is :p

    @Ethrem: How do you know the 4980HQ will throttle? The CPU is part of the Haswell Refresh; those parts get a small speed bump but keep the same TDP. Isn't that part of the silicon maturing at Intel?

    @Ningyo: I think he meant 4710HQ because we talked about it yesterday.
    I agree with what you say though :)
     
    Kaozm likes this.
  5. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Because the 4940MX throttles and can't maintain 4GHz. The silicon isn't going to be superior enough to overcome that.

    Sent from my HTC One_M8 using Tapatalk
     
  6. Babtoumi

    Babtoumi Notebook Enthusiast

    Reputations:
    56
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    And what do you think about the MQ version? Like the 4710MQ?
     
  7. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    When you say throttle, are you taking into consideration that it can do 3.8GHz max on 4 cores? And that it throttles below that in gaming, not just in Prime95 and benchmarks like that?
     
  8. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    I think throttling is more a function of how good the cooling is than of the CPU model, especially since you can use XTU to set power limits.
     
  9. -=$tR|k3r=-

    -=$tR|k3r=- Notebook Virtuoso

    Reputations:
    4,340
    Messages:
    3,583
    Likes Received:
    698
    Trophy Points:
    181
    GEE WIZ! For the 'GTX 980M / 970M Maxwell Officially announced' (still waiting for that one) thread, there is an awful lot of CPU 'Dam di dudu du du'-ing going on here. :D
     
  10. NorrinRadd

    NorrinRadd Notebook Consultant

    Reputations:
    86
    Messages:
    117
    Likes Received:
    26
    Trophy Points:
    41
    I'm sorry guys, my posts must be a pain in the a** for you since I'm really unfamiliar with these things, but I just got confused again. Are the following statements true (if, each time, you're planning to use the 970M/980M and not the integrated GPU)?

    There is only a very small gap in performance between the 4710HQ and the 4810HQ.
    There is only a very small gap in performance between the 4710HQ and the 4860HQ.
    There is only a very small gap in performance between the 4810HQ and the 4860HQ.
    There is a significant gap in performance between the 4710HQ and the 4980HQ.
     
  11. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    Let me help you buddy, because I really screwed myself with this stuff before. Once upon a time, 5 years ago... I paid $1000 for the super fast, amazing 6-core, 15MB cache (? cannot remember) 3.3GHz Intel (!) i7 980X EXTRA Extreme Edition. That was for my desktop. And I had the Fermi GTX 480. Games ran great. Loading took forever and boot sucked because I had disk drives. That CPU was the most I spent on anything in my rig. And you know how much better performance and great times I got out of it??!! It's a waste of money. It will maybe add 10% performance to gaming, which is like 6 FPS. 10% of 60 is 6. If you are getting 45 FPS you will maybe get 49 FPS. The CPU does not process graphics. That belongs to the, you got it!, graphics processing unit. And the CPU doesn't improve load times; that belongs to the HDD/SSD (read/write speeds). The CPU handles AI and a lot of simple things I do not know all the details of. It is not going to make anything you will notice faster. What it will do is require more power and produce more heat, even though you don't use it like you do your GPU.

    What you need to know about the CPU is to make sure it doesn't cause a bottleneck, which means it is too slow to keep up with the GPU, leaving the GPU waiting on frame data it should have already received. That is all you need to worry about. Same with RAM. You just want a balance that doesn't cause a bottleneck. People who need CPU and RAM power use it for specific programs that demand it. I really do not know what kind of people those are, but if you do not know you need it, then you do not need it.

    I knew another person who has been in the computer business for 30 years. He said he bought a Pentium for $1000 when it came out. It was supposed to be a revolutionary thing. After 6 months he sold it for $200. But what happens if you buy a CPU for $200 that gives you the same gaming performance as the $1000 one? You do not lose a lot of money. I know from experience. Form fits function. Get what you need. You are a gamer, I presume. The CPU will not make that experience better.

    The CPU can help if the game is CPU intensive and uses multiple cores, like Crysis and BF4, but you will be able to get over 60 FPS easy with the 900 series, plus they don't have 6-core+ mobile CPUs. Those games use hyperthreading and other CPU functions. Not a big deal though.
     
  12. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    It will be a miracle if Intel has managed to get a chip that will do 3.8GHz without needing Liquid Ultra to stay out of the 90s (°C).

    My 4940MX will do 3.9GHz @ 89°C, but it will only sustain that speed if I jack up the TDP, something you cannot do with non-MX chips. It takes 70W for my chip to sustain 3.9GHz, but it will do 3.6GHz on all four cores, 3.7GHz on 3, and 3.8GHz on 2 @ 47W, as long as I have a -85mV undervolt as well.

    Sent from my HTC One_M8 using Tapatalk
     
  13. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    They have, just not with Haswell. :D
     
  14. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    This is a Haswell chip though, I figured it was a given that I was speaking of mobile Haswell. :p

    Sent from my HTC One_M8 using Tapatalk
     
  15. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    There's thermal throttling, then there's TDP throttling. If that 4980HQ is still pegged at 47W (57W turbo), it may throttle due to the TDP cap depending on load.
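
    To make the distinction concrete, here's a minimal toy sketch (Python; the trigger thresholds and back-off behavior are assumptions for illustration, not Intel's actual control loop):

        # Two different reasons clocks drop: temperature ceiling vs. power cap.
        def effective_clock_ghz(target_ghz, temp_c, package_power_w,
                                t_junction_max=100.0, tdp_w=47.0):
            if temp_c >= t_junction_max:
                # Thermal throttle: back off until temperature recovers
                return target_ghz - 0.2
            if package_power_w > tdp_w:
                # TDP throttle: scale clocks down to fit under the power cap
                return target_ghz * tdp_w / package_power_w
            return target_ghz

        # Chip asked for 4.0GHz while drawing 57W against a 47W cap (cool, but power-limited):
        print(round(effective_clock_ghz(4.0, 85.0, 57.0), 2))  # 3.3 -> TDP-limited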
     
  16. Kaozm

    Kaozm Notebook Evangelist

    Reputations:
    328
    Messages:
    368
    Likes Received:
    284
    Trophy Points:
    76
    Does anyone think that Broadwell-H at 47W will run cooler than Haswell? It would be great to get overall temps down...
     
  17. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    And does anyone know if Broadwell is using a smaller socket? I know Skylake's will be smaller. Broadwell's is probably smaller too; I am just too lazy to look it up.
     
  18. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I suspect we will see both.

    Sent from my HTC One_M8 using Tapatalk
     
  19. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    OK, let me clarify how CPU bottlenecking in games works.

    First the CPU calculates what will be in a frame; this can include things like NPC movement, movement of the camera (player) through the world, whether events are occurring, etc. (sometimes the parts of this that require a physics engine are partially handled by the GPU, depending on the game).

    Then it sends this sort of sketch of the frame to the GPU, which renders all the objects and figures out the exact final picture to display on the screen.

    The GPU then sends this picture to the screen, which goes ahead and displays it.
    _____________________________________________________________________________________________

    Now let's pretend we have a CPU that is creating 70 frames every second and sending them to the GPU,

    and we have a GPU that is rendering 50 FPS to the screen on ultra settings,

    and a screen that has a max refresh of 60Hz.

    In this case we get essentially the lowest of these speeds, which is the 50 FPS rendered by the GPU.

    Now if we drop the settings to, say, high and the GPU renders 65 FPS to the screen, we get a bottleneck at the screen: since it only updates at 60Hz, we still only get 60 FPS even though the GPU and CPU can handle more.

    Now let's say we switch the graphics to medium so the GPU renders 90 FPS, and give it a 120Hz monitor. Now the 70 FPS the CPU is sending to the GPU is the bottleneck and we end up with only 70 FPS (see the sketch below).
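
    One way to see the whole chain: the frame rate you actually get is the minimum of the three stages. A minimal sketch (Python, using the numbers from the scenarios above):

        # Effective FPS is capped by the slowest stage: CPU -> GPU -> display.
        def effective_fps(cpu_fps, gpu_fps, display_hz):
            return min(cpu_fps, gpu_fps, display_hz)

        print(effective_fps(70, 50, 60))   # ultra settings: GPU-bound -> 50
        print(effective_fps(70, 65, 60))   # high settings: display-bound -> 60
        print(effective_fps(70, 90, 120))  # medium + 120Hz monitor: CPU-bound -> 70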
    ___________________________________________________________________________________

    As of right now I am pretty certain NO i7-4xxx CPU will bottleneck (throttle) any modern game to under 45 FPS; for most games it will be more like 90-300 FPS. As you go up into the more powerful ones like the i7-4810, you likely will not find a single game that bottlenecks on the CPU at under 60 FPS.

    This means you really do not need to worry about CPU bottlenecking UNLESS you have a 120Hz+ monitor and at least 970M SLI, OR you have a 120Hz+ VR headset that you may drop graphics settings for to maintain 120+ FPS.
    _______________________________________________________________________________

    To put things into perspective, an i7-3610QM (less powerful than ANY i7-4xxx CPU) probably causes throttling in the following games (out of about 30 titles):
    Thief 2014 (probably throttles to about 45-50 FPS)
    Assassin's Creed IV: Black Flag (about 42-45 FPS)
    Company of Heroes 2 (about 51-53 FPS)

    So if you paired it up with 880M SLI you might lose a few FPS on ultra settings.

    TL;DR: unless you KNOW you need a better CPU, just don't worry about it.
     
    Hellmanjk likes this.
  20. tlprtr19

    tlprtr19 Notebook Evangelist

    Reputations:
    393
    Messages:
    390
    Likes Received:
    96
    Trophy Points:
    41
    TSMC builds world’s first 32-core networking chip using 16nm FinFET process technology

    Looks like TSMC is having an early Thanksgiving :)

    TSMC did state 16nm FinFET mass production by Q4 2015. Seems like their R&D process is now turning over to the manufacturing stage. This is very important news as it shows Nvidia would probably switch over to 16nm rather than porting to 20nm (rumor by WCCFTech concerning the GM200 leak).

    This sets a precedent for AMD to follow suit as well. We also know Samsung is heavily investing in 14nm/16nm. Looks like Q2-Q4 2016 could be an interesting window for a node change.

    - from the article.

    Certainly sounds nice on paper. Really excited for a high power SoC implementation soon.
     
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I don't see Nvidia riding out the entirety of 2015 on 28nm. They're a business and they need constant product to make money. 20nm cards are their only choice.
     
  22. tlprtr19

    tlprtr19 Notebook Evangelist

    Reputations:
    393
    Messages:
    390
    Likes Received:
    96
    Trophy Points:
    41

    Translating Maxwell to a new node would be a very expensive affair when 16nm is around the corner, IMO.

    Business sense would dictate milking GM204 for 1-2 years until Pascal releases on 16nm, just like Kepler was milked for 3 years - oops! GM204 is now far superior to any Kepler or Kepler refresh card. Further tailoring and optimization can expand Maxwell 2.0 performance.
     
  23. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    CPU bound games benefit from having a more powerful CPU. For example SC2, Skyrim, and GRID 2 all show a 10%+ increase in frames when going from a 4700MQ to a 4900MQ (link).

    And then there are games like Crysis 3 and BF4 that also love more powerful CPUs.
     
    Ningyo likes this.
  24. LostCoast707

    LostCoast707 Notebook Consultant

    Reputations:
    51
    Messages:
    167
    Likes Received:
    36
    Trophy Points:
    41
    Do you mean Q4 2014, and that by next year, 2015, they will be in full swing of making 16nm FinFET?
    And that Q2-Q4 2015 could be the interesting window for a node change? Not 2016.
    Cause waiting till the second half of 2016 or later seems like a long wait.
     
  25. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    That article said an average of 7%. And a 10% increase is nothing: 10% of 60 is 6. At anything lower than 60 FPS you won't notice the increase. 10% of 80 = 8. 10% of 120 is 12. The game with the highest difference was GRID 2, with an 8 FPS difference, and StarCraft with 8 also. I would rather buy a second GPU. I think the GPU costs less than the CPU; I know the 4940 cost $1000. I think I would prefer a regular 4810. And expensive CPUs like the 4900+ run a lot hotter. And according to the leaked Fire Strike scores, a second 980M increased the score by 56%. Pay an extra $500-600 for a second GPU for a theoretical 56% increase, or $500-1000 for 7-10%? My thoughts.
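
    A rough way to frame that comparison (Python; the prices and percentage gains are this post's rough figures, not benchmarks):

        # FPS gained per dollar, using the rough numbers from this post.
        def fps_per_dollar(base_fps, gain_pct, cost_usd):
            return base_fps * gain_pct / 100 / cost_usd

        base = 60.0
        print(round(fps_per_dollar(base, 10, 750), 3))  # CPU upgrade (~$500-1000): ~0.008 FPS/$
        print(round(fps_per_dollar(base, 56, 550), 3))  # second 980M (~$500-600):  ~0.061 FPS/$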
     
  26. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    4900MQ only costs $200 more than the 4700MQ, and the 4930/4940MX is unlocked so there's a lot more performance to be gained than at stock if your laptop's cooling is up to snuff and you don't get a dud. 10% could also be the difference between having noticeable stutter or no stutter, especially if you're under 60 FPS.

    Did I recommend spending $1000 on an MX CPU? No, so please don't put words in my mouth. I was simply pointing out facts, and the facts are that a more powerful CPU does give more frames, especially in CPU bound games. Whether it's worth the extra expense is up to the individual.
     
    Mr Najsman likes this.
  27. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    Huh? I didn't say you did. But you did claim a typical 10% increase, contrary to your own source. Only $200 more? BRB, I wanna see the price.
     
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  29. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Please look at the "Enthusiast" graphs for Skyrim, GRID 2, and SC2. Skyrim shows a 15% increase, GRID 2 shows a 13% increase, and SC2 shows an 11% increase. Even at "Mainstream" settings Skyrim gets a 7% boost, GRID 2 15% and SC2 14%, and that's consistent with what I said.
     
    Hellmanjk likes this.
  30. Hellmanjk

    Hellmanjk Notebook Consultant

    Reputations:
    34
    Messages:
    210
    Likes Received:
    47
    Trophy Points:
    41
    Yeah, I just saw them cheaper on eBay. Prices have dropped like $100. I think I'll just get whatever comes with the laptop and upgrade when it's really cheap.
    Alright, I see that. That increase with Skyrim is noticeable; I was looking at the one below. Well, that is good to know. Thanks. I'll check the Crysis 3 benchmarks with that tomorrow; it's getting late.
     
  31. Firebat246

    Firebat246 Notebook Deity

    Reputations:
    50
    Messages:
    764
    Likes Received:
    510
    Trophy Points:
    106
  32. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,983
    Trophy Points:
    431
    980M spec info is up on WCCFTech, but that 2000+ CUDA core count is prob fake. 1660 sounds more realistic. Time to wait for 20nm.
     
  33. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    It's a step below the 4940MX at almost half the cost, with an optimized TDP that tends to lead to better temps when overclocking. I definitely regret paying the money for my MX; I run it at 47W TDP to keep temps and noise under control anyway, which is what the MQ does.

    Sent from my HTC One_M8 using Tapatalk
     
    Ningyo and Firebat246 like this.
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I will trade my 4800MQ for your MX Ethrem :laugh:
     
  35. Ningyo

    Ningyo Notebook Evangelist

    Reputations:
    644
    Messages:
    489
    Likes Received:
    271
    Trophy Points:
    76
    You are correct, but I stand by my post:
    "As of right now I am pretty certain NO i7-4xxx CPU will bottleneck (throttle) any modern game to under 45 FPS; for most games it will be more like 90-300 FPS. As you go up into the more powerful ones like the i7-4810, you likely will not find a single game that bottlenecks on the CPU at under 60 FPS."

    On the second weakest i7-4xxx CPU it causes:
    Skyrim to bottleneck at 58 FPS (or 63 on the MSI GT70 they list)
    Starcraft II: HotS to bottleneck at 55 FPS
    GRID 2 to bottleneck at 62 FPS

    Note all 3 of those are well over 45 FPS, and even a 4710 would likely push them over 60 FPS.

    Also, since the FPS went up for all 3 at the lower graphics settings, you could likely change a few settings to regain some of the FPS; the games are probably not optimized to use GPU physics or such. In all likelihood you can manually add a line to the config files to fix that; I know you can on Skyrim at the least. Either way, any i7-4xxx CPU should be able to handle a 60Hz screen just fine.

    Edit: Really, that counted as foul language? You really need to (unusable synonym for making small adjustments) your filters.
     
  36. felix3650

    felix3650 Notebook Evangelist

    Reputations:
    832
    Messages:
    631
    Likes Received:
    224
    Trophy Points:
    56
    I too want Nvidia's newest Maxwell 4910MQ with a 57W TDP, 8MB of L3 VRAM, PCI-E 3.0 already integrated into the GPU, and support for up to 32GB of normal RAM. Oh, forgot: it must be a 4-SMM core model with an Iris Pro IGP for maximum performance!
    I'm sure Age of Empires II will run 23% better on it :p

    EDIT: This thread is becoming like a drunken driver on a highway :cool: drifting from one lane to the other.
     
    FuDoW likes this.
  37. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Oh yeah, what you said is correct; I was merely pointing out that upgrading the CPU can bring benefits in certain games, especially CPU bound ones. And if you play those kinds of games a lot, it might be worth the $200 to invest in a stronger CPU.
     
  38. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Ok, this is just in. Behold:

    [image: game VRAM requirements chart]


    As we move from the PS3/Xbox 360 (512MB RAM) to the Xbox One/PS4 (8GB RAM, only ~5.5GB currently usable for games), the jump in VRAM requirements is getting higher and higher.
    I hope the 980M has 8GB VRAM; getting a 4GB version (if there was one) WILL hold you back. According to the developers the console version is equal to High, but the PC version has Ultra, so there is that.
    And lol at the people who bought the recent desktop 970/980. I always said 4GB is not enough, especially once all games move away from the PS3/Xbox 360.

    6GB for 1080p Ultra will probably be 7GB at 2560x1080. For me, my next GPU must have 8GB of VRAM, no questions asked.
     
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I think that screenshot is a serious problem. vRAM usage is going through the roof and the looks are not compensating. I don't see why a game has to use more than 4GB of vRAM for any reason at anything less than 4K, and even at 4K it shouldn't NEED a whole lot more. This looks more like devs who can't code and are complacent because there is a truckton of free memory (and has been for years now) and they're not constrained to 512MB on the X360 anymore.
     
  40. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    VRAM is useless if there is insufficient rendering power to utilize it. Seems like that option simply increases the amount of VRAM that will be used to cache textures. This is poor and lazy programming if you ask me.
     
  41. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41

    Regardless, we are going to face the issue: The Evil Within "requires" 4GB VRAM too. It will only go higher.
     
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah, I'd be curious how low you can go before performance actually starts to suffer. I'm guessing much, much lower. I'm fine with general guidelines or suggestions as long as the game doesn't lock you out of certain graphical settings based on amount of available VRAM. They typically tend to overestimate anyway.
     
  43. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    My GTX 780 couldn't run Watch Dogs at max textures/max settings. I was MAXING OUT my 3GB of VRAM and there was a ton of stutter/skipping until I reduced the VRAM usage.
    The Ultra texture is supposed to be superior to the consoles', and the consoles now already have 6GB of VRAM (with 2GB going to the OS); with the whole close-to-the-metal, optimized thing, consoles can actually make better use of their power/VRAM.

    I expect 8GB to become standard in a year for NEXT-gen games only (the upcoming Assassin's Creed Unity, Batman: Arkham Knight, etc.)
    Watch Dogs said it wanted 3GB, but in reality you need 4GB. 3GB was the minimum, and without even turning on AA, at max settings + high textures the game was a mess (glitches, stutters, etc.) unless you had 4GB of VRAM or reduced textures.

    [image: Ubisoft tweets]

    This was Ubisoft's response on Twitter:

    Watch Dogs can use 3+ GB of RAM on NG consoles for graphics, your PC GPU needs enough VRAM for ultra options due to the lack of unified mem

    If you experience lag/stutter on a fast PC, try to lower one of those settings to reduce the GPU VRAM usage: texture quality, AA, resolution

    Making an open world run on NG & CG consoles + supporting PC is an incredibly complex task, the team did a fantastic job. Congrats guys! : D
     
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Except Watch Dogs on PC was downgraded and optimized like sh*t; everyone knows this. Sure, get as much VRAM as possible if all you're gonna play are crap console ports. Check the Steam Hardware and Software Survey: 1GB of VRAM is by far the most prevalent. Smart PC devs develop and optimize for as large a target audience as possible; they're not gonna cannibalize sales and reputation by catering to some niche crowd with absurd amounts of VRAM. It's only the inept and frankly disinterested companies like Ubisoft that don't care, since the majority of their sales come from consoles anyway.
     
  45. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Yeah, but we can do nothing about it; dealing with games that are not optimized and a mess is part of PC gaming :(
    Three companies posted just here require 4GB or higher, and we are at the early cycle of the new next-gen console ports; it will only go higher from now on. Thing is, Watch Dogs' textures are mostly crap too lol.
     
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yes, there is something you can do about it. It's called speaking out and voting with your wallet. Don't buy games if they're crap, simple as that. The PC platform is too much of an embarrassment of riches to get tripped up over the inevitable lemons. You're missing the big picture here.
     
  47. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Funny thing is, I finished Watch Dogs before they even released the first patch LOL. The game itself was good, but yeah, the optimization was a sack of excrement.

    Also, 6GB of vRAM at 1080p is just plain wrong.
     
    heibk201 likes this.
  48. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Incorrect. The consoles have 6GB of RAM in *TOTAL* for their games, i.e. what we use as "system RAM" and what we use as "vRAM" combined. In other words, if a game uses 1.5-2GB of system RAM, an Xbox One only has 2.5-3GB of RAM left for textures and such. They do not have oodles and oodles of vRAM floating around as they'd have you believe.
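
    In budget terms (Python sketch; the total and game-logic figures are from this post, and the overhead term is an assumption to cover misc reservations):

        # Unified-memory budget: one pool serves both "system RAM" and "vRAM" roles.
        total_game_ram_gb = 6.0  # RAM the consoles give to games, per this post
        game_logic_gb     = 2.0  # used as what a PC would call system RAM
        overhead_gb       = 1.0  # assumed misc reservations (buffers, etc.)

        vram_like_gb = total_game_ram_gb - game_logic_gb - overhead_gb
        print(f"Left for textures/render targets: {vram_like_gb} GB")  # 3.0 GB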

    Also, Watch Dogs was a joke, because there's a guy who re-packed the Ultra textures in a mod that let you turn them on with the "high" setting... guess what? The stutter went out the window, even though 3GB+ of vRAM was still needed. Watch Dogs was CODED to perform badly on PC. Period. To give console users the false idea that their systems can compete at a $400 price point with people using 2 Titan Blacks and the like. The Division is going to do the same thing. The Assassin's Creed games are likely going to do the same thing. Far Cry 4 is almost definitely going to do the same thing. They want console parity, but since they have no excuse as to why they cannot produce better graphics on the FAR stronger PC platforms, they just make it run awfully and claim we need stronger hardware that doesn't exist yet.

    As for the vRAM usage debacle, it's a joke. They are just being sloppy with their optimization, and they aren't even using very good textures either. High resolution poop is still high resolution poop. It's why BF4 is far crisper than Titanfall despite using 1GB+ less vRAM. It's why Sleeping Dogs' textures for most of the game, using the high-res texture pack, match or surpass a LOT of Watch Dogs' textures (without using E3 2012's settings), while doing so at 1.5GB+ less vRAM. It's called good quality, rather than high resolution bad quality. There's no excuse. I don't care what they say: if your game NEEDS more vRAM than Crysis 3, and it is not for a massive cache for super fast motion through a level and/or extremely powerful sniper rifles with great draw distances (like Arma 2/3 uses, etc.), then your game had better look better. But they don't. And HDD space for simply installing them is going through the roof, and they're taking more and more from our systems to look as good as games that have been out, or could have been out, YEARS ago, while running worse.

    We should never excuse bad coding. We should never give them a free pass and let them say "oh well it's just advancement". I mean if I'm telling someone to buy a new card I'll tell them to splurge on vRAM, but that doesn't mean I really think it's good that we SHOULD. I mean I have 4GB vRAM and all, but that doesn't mean I think every new game under the sun is supposed to use up all of it. And they are. Pointlessly.
     
    Ningyo likes this.
  49. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Not incorrect. I'm aware of the consoles' specs and usage. There are games that use 1GB of RAM and 5GB of VRAM; they are free to use as much VRAM as they want from that 6GB pool, being unified memory and all. And remember, Watch Dogs is like 720p on Xbox and 900p on PS4, not even using proper AA, yet the developer confirmed 3GB+ VRAM usage, so you can imagine how it is at native 1080p + AA. Are they not optimized? Obviously yes.
    Watch Dogs' textures look like crap compared to The Witcher 2 at Ultra on only 1.5GB of VRAM. I usually buy badly optimized games during sales only, at $5-10 max.
     
  50. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I agree with you: the CPU is really not something worth upgrading in a gaming machine if you've got any i7 from Sandy Bridge upwards. I didn't get any CPU bottlenecking in any games I have played apart from Skyrim, and even there I saw more than the 58fps you quoted in some parts. Also, there's talk in your post & others that GRID 2 is CPU bottlenecked. I just tested GRID 2, and it's only CPU bottlenecked on my system with a lowly i7-2630QM to an average of 137fps and a minimum of 107fps (run on Medium Quality). At that point CPU usage was not above 60% across all 8 threads & GPU usage was 60%, so I'm thinking the bottlenecking is happening either in the game or is platform limited in some way, but for sure it's not bottlenecked to the 62fps that was bandied around in the last posts.

    I agree with you though when you say "any i7-4xxx CPU should be able to handle a 60Hz screen just fine"; in fact I'd go as far as to say that any i7 CPU from Sandy Bridge up is perfectly fine for 60fps gaming. If you want to go 120Hz gaming, then get the best CPU you can afford, which will help out in a few games to get closer to 120fps, but of course not at the expense of a lesser GPU: buy the best GPU you can afford, and only then the best CPU you can after that (for 120Hz gaming).
     
    Ningyo likes this.