The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    DX 12 benefits for older laptops with older CPU's?

    Discussion in 'Gaming (Software and Graphics Cards)' started by King of Interns, May 3, 2015.

  1. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I read an article today that suggested DX12 will boost frame rates by 60% and cut power by 50% by greatly reducing CPU overhead!

    This seems to follow in Mantle's footsteps which is a good thing.

    Is it likely, therefore, that older machines like my old M15x equipped with the 920XM, for example, will receive a new boost of longevity?

    The reason I ask (and I think there must be others wondering too) is that soon I must decide either to overhaul the laptop (pop in a new GPU, OS and SSD, perhaps up the RAM to 16GB, etc.) or to part it out and buy a new machine. Older gaming laptops excel in that they remain upgradeable and reliable/well built.

    If a year from now the new APIs are what they're made out to be, would overhauling be a viable alternative to buying a new machine? I know it could potentially be cheaper, and certainly better for the environment, but only if the CPU can keep up.

    Let's discuss!
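    As a back-of-envelope way to think about the headline claim: per-frame time is roughly limited by whichever of the CPU or GPU finishes last, so cutting driver overhead only lifts frame rates when the CPU is the bottleneck. A minimal sketch of that model, with purely illustrative timings (not benchmarks of any real game or CPU):

```python
def fps(cpu_ms, gpu_ms):
    """Approximate frame rate when CPU and GPU work overlap:
    the slower side of the pipeline sets the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical CPU-bound case: an older quad-core spends 20 ms/frame
# on game logic plus draw-call submission, while the GPU needs 12 ms.
before = fps(cpu_ms=20.0, gpu_ms=12.0)   # 50.0 FPS, CPU-bound

# Suppose a lower-overhead API cuts 5 ms of submission cost per frame.
after = fps(cpu_ms=15.0, gpu_ms=12.0)    # ~66.7 FPS

gain = (after / before - 1) * 100
print(f"{before:.1f} -> {after:.1f} FPS ({gain:.0f}% faster)")
```

    Note the flip side: in a GPU-bound case (say `cpu_ms=15.0, gpu_ms=25.0`), the same CPU saving changes nothing, because the GPU still sets the 40 FPS ceiling.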
     
    Last edited: May 3, 2015
    fatboyslimerr likes this.
  2. Tinderbox (UK)

    Tinderbox (UK) BAKED BEAN KING

    Reputations:
    4,745
    Messages:
    8,513
    Likes Received:
    3,823
    Trophy Points:
    431
    Sounds like snake oil. Manufacturers would hate it, as they want you to buy a new notebook for every irrelevant update.

    Also, normally your graphics card has to support the DX version you want to use, so how does a DX9 card run DX12?

    John.
     
  3. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    The M15x's graphics card is easily changeable. Cards up to the 970M/980M are already supported, and those will supposedly support DX12 themselves. Install Win 10 and the API ought to work right.

    By next year I hope next-gen GPUs will be out that are still manufactured to the MXM 3.0 specification and support the full DX12 feature set. Not so impossible, I hope.

    You could be right, but then again laptops already feature soldered CPUs, even gaming ones, so better APIs are one way to make laptops last a little longer.
     
  4. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    With current soldered POS CPUs, and considering that most of the market share is held by underpowered CPUs (most notable example: the GT80, a 47somethingHQ with 980m SLi, LOL), it only makes sense to offload the work to where it belongs: the GPU(s). As you already said, it's more than welcome for those of us who enjoy older machines. If I were you, I'd keep the M15x :) Of all the 16:9 machines, only the M15x-R1 makes it onto my "would like to have" list. It kinda reminds me of my 5920G; it would take whatever GPU you throw at it, and has room for improvement. The missing eDP port leaves a lot to the imagination (*cough*DreamColor (or 4K, if you fancy useless resolutions)*cough*) :D
     
    Last edited: May 3, 2015
  5. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    I'm sure it will help, but I don't think it will be 60%. They always make big news, and in the end it won't be so drastic, but it will help, especially maybe with CPU temps, and a boost of about 20%. Will there be any DX12 games this year besides Fable? I think only Fable has been announced as DX12.
     
  6. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    You may see some improvements. Won't know 'till you try.
     
  7. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Keep it. Your M15x is a prime example of why BGA sucks. :D
     
  8. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    I wouldn't get my hopes up, again. Mantle was a major disappointment; it does not work at all with my 7970M CrossFire combo.
    It's also very likely that older video cards won't support DirectX 12. I hope I am mistaken, though. But manufacturers are lazy; they usually don't update older models and just focus their support on their newest line-ups.
     
  9. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    King of Interns, your M15x is very remarkable and certainly deserves an upgrade. On the other hand, it may be a good time to sell it and move on to something else. Your CPU will certainly keep up, unless it overheats and throttles that is... but what are your priorities, anyway?
     
  10. fatboyslimerr

    fatboyslimerr Alienware M15x Fanatic

    Reputations:
    241
    Messages:
    1,319
    Likes Received:
    123
    Trophy Points:
    81
    Have to say I've been impressed with Mantle in Dragon Age: Inquisition. 100% GPU usage with a mix of high and medium settings, and nicely balanced CPU usage across 8 threads at between 30 and 70% utilisation. I think this is how Mantle is meant to work, helping me get better fps on my older CPU, so it does bode well for DirectX 12. It depends how well devs manage to incorporate the low-level nature of DirectX 12 into their games. That 60% is probably a best-case scenario, with real-world gains being something like 20-30%?
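    The gap between the headline 60% and a real-world 20-30% falls out of simple Amdahl-style arithmetic: only the driver/submission share of CPU frame time shrinks under a low-level API, so the overall gain depends on how much of the CPU's work is overhead. A sketch with assumed, illustrative percentages:

```python
def speedup(overhead_share, overhead_reduction):
    """Overall CPU-side speedup when only the overhead share of
    per-frame CPU time shrinks (Amdahl's law applied to driver cost)."""
    remaining = (1 - overhead_share) + overhead_share * (1 - overhead_reduction)
    return 1 / remaining

# Best case: half of CPU frame time is submission overhead, 80% of it removed.
best = speedup(0.5, 0.8)       # ~1.67x, i.e. the headline "60%+" figure

# More typical case: a quarter is overhead, 80% of it removed.
typical = speedup(0.25, 0.8)   # 1.25x, i.e. the 20-30% ballpark

print(f"best case: +{(best - 1) * 100:.0f}%, typical: +{(typical - 1) * 100:.0f}%")
```

    Both input percentages are assumptions for illustration; the shape of the result (big gains only for heavily draw-call-bound games) is the point.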
     
    Last edited: May 4, 2015
    Kade Storm, TomJGX and triturbo like this.
  11. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Gotta think though... there aren't many machines out there that could be considered an upgrade from your current machine. Not when we're talking about future proofing.
     
    fatboyslimerr likes this.
  12. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    This underpowered CPU actually works well enough in the GT80, to my surprise. I can maintain turbo under most scenarios, and when I can't, it only drops 100MHz or so. Considering what I have experienced and seen with other HQ CPUs, I am impressed by this.

    But for the other HQ processors that drop below 3GHz, it's a shame. It doesn't matter if they are soldered; you would only need TDP control.
     
  13. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    Isn't it holding back the SLi? @HTWingNut showed us that a 4700MQ can hold back a single 980m, let alone two of them. Anyway, DX12 should be a good thing. Not the miracle 60% (just like CrossFire and SLi don't scale 100%), but a nice improvement nonetheless. Thank you, Mantle, for laying the road to GPU independence.
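    The SLi-doesn't-scale-100% point can be put in numbers. A tiny sketch with an assumed per-game scaling efficiency (the 80% figure is a hypothetical, not a measurement of any particular title):

```python
def sli_fps(single_gpu_fps, scaling=0.8):
    """Rough two-GPU frame rate: the second GPU contributes only a
    fraction (the scaling efficiency) of its theoretical performance."""
    return single_gpu_fps * (1 + scaling)

print(sli_fps(60))                 # 108.0 at 80% scaling, not the ideal 120
print(sli_fps(60, scaling=0.5))    # 90.0 in a poorly scaling title
```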
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well, he has only a 60Hz screen, so I'm sure he can increase the graphics settings, esp. antialiasing, to be GPU-bound most of the time. 60 FPS isn't asking a whole lot from a CPU. HTWingNut showed the 980M being slightly CPU-limited at around 100 FPS.
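    The frame-budget arithmetic behind the 60Hz argument: a CPU that can prepare roughly 100 FPS worth of frames spends about 10 ms per frame, comfortably inside the ~16.7 ms budget a 60Hz refresh demands. A quick check with those (illustrative) numbers:

```python
HZ = 60
budget_ms = 1000 / HZ        # ~16.7 ms available per refresh at 60Hz
cpu_ms = 1000 / 100          # CPU capable of ~100 FPS -> 10 ms/frame

# The CPU only becomes the limiter at 60Hz if its per-frame cost
# exceeds the refresh budget.
cpu_limited_at_60 = cpu_ms > budget_ms
print(round(budget_ms, 1), cpu_limited_at_60)   # 16.7 False
```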
     
  15. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I do understand. I'll try to explain better: it is probably fine and dandy for most games now, but as time goes by, it won't be (or at least it wouldn't be if it weren't for DX12, but let's see how that turns out). Take for example the M17x-R2 (one of the best machines ever made, in my opinion). It can get pretty nice GPU upgrades, but they only make sense if the CPU is an XM and clocked pretty high on top of it. A not-so-bright CPU is kinda OK for a single GPU, but CF and SLi require more horsepower to get the work done, and time doesn't spare even the most powerful CPUs. See where I'm going? (I hope so :D )
     
  16. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well, you're kinda forgetting the massive gulf between Clarksfield and Haswell. Even a BGA Haswell i7 is as fast or faster than a 920/940XM overclocked to the limit.
     
  17. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    No I don't, just give it some time. C2Ds were awesome too, no? It would be awesome if we could compare the same game with and without the DX12 treatment, say, three years down the road. And a Clarksfield at ~4.5GHz would be comparable to a stock 4700MQ... good luck getting it there :D It seems that DX12 is right on time for all the BGA things (not using stronger words), especially seeing the pace of CPU development of late (Haswell, Broadwell, not to mention AMD; though let's not forget that the whole Mantle idea was theirs).
     
  18. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Wow this was incredibly confusing. I have no idea what you just said.
     
  19. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    I know, right? I have a gift :D I'll try to make it better (and probably fail). The question was: isn't a kinda underpowered CPU holding back the 980m SLi (not to mention the promised 2 GPU upgrades down the road)? This is where DX12 comes in, to move the bias to the GPU, so the CPU is only needed to keep the OS running (exaggeration). If it weren't for it, the GT80 would struggle to get the most out of the GPUs a couple of years down the road, not to mention if one were to get one of those GPU upgrades that MSi is promising. All the CPUs were great, or OK, at the time of their release, but as time goes by, all of them become less and less relevant. So, DX12 is here to save us all (hopefully). Clear? No? Well, I can't word it better, sorry.
     
  20. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    triturbo, most i7 quads starting with the 2nd gen, and highly overclocked 1st-gen ones, are still relevant despite the years passed. It's the i5 & i7 duals that get dated (while i3s and below are pff right from the start). It may change with Skylake... or may not.
     
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You're making the assumption that an entry-level Haswell i7 will hold back 980M SLI and future GPUs inside the GT80 (probably GM204 with more functional units enabled, or rebadges). I already told you it doesn't, particularly when gaming at 60Hz, and that's regardless of whether or not DX12 significantly reduces CPU usage. An entry-level Haswell i7 will rarely if ever bottleneck a pair of mobile GPUs in SLI badly enough to drop below 60 FPS, which is all you "need" on a 60Hz screen. If performance is fine but GPU usage is on the low side, that's easily remedied by increasing the resolution or DSR factor (when Nvidia releases the damn feature for mobile). Your GPU usage will shoot up real quick.
     
  22. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    We were gaming at 60Hz up until now, yet the C2D is nowhere to be found in current games' recommended specs. That's what I meant. You don't know what future games will require as minimum or recommended specs. Although I see your point: just as the C2D is not exactly a gaming/computing monster, we are at a point of development where it's perfectly fine for all the rest of the everyday tasks. Same goes for the 4700HQ; even though it's not the best one around, it might as well stay relevant for some years to come (for gaming, that is). Time will tell, but DX12 should help a good deal as well. And that's what I'm counting on with my first-gen i7 :)
     
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Why should C2D be in the min spec of any modern game? It's extremely underpowered and outdated. Even your Clarksfield i7 rips it to shreds. Comparing it to Haswell would be an utter joke. For a long time the general trend has been toward CPUs with more cores and threads. Only in the last few years have games started to really take advantage of them on a wide scale. You can thank current-gen consoles and low-level APIs (DX12 and Mantle/Vulkan) for that.
     
  24. TR2N

    TR2N Notebook Deity

    Reputations:
    301
    Messages:
    1,347
    Likes Received:
    255
    Trophy Points:
    101
    Very good question OP.
    I look forward to DX12 and Windows 10. It could add another year or so of longevity to my two notebooks.
     
  25. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    This is good to read! I agree with everyone here :) Of course we won't know the improvements until it gets tested, but by next year the proof will be in the pudding, as it were!

    If all goes well I'll keep this thing alive. Win 10, a next-gen Nvidia (or, God forbid, AMD) GPU, 16GB RAM and a new SSD will be bought! If not, then I guess it will be time to move on.
     
    fatboyslimerr and Starlight5 like this.
  26. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    You'll move on to what? It's kinda rhetorical since, at least for me (and I believe for you too), only Clevo's ZMs are worth looking at. Or desktops, but I have a couple more projects in mind before I settle for one.

    Where did I say they should be min spec? I said they were recommended spec at some point, but now are nowhere to be found. So my comparison of C2D to Haswell is not a joke, but very valid. Why? Your very next sentence. In a couple or three years we could have 16-core CPUs, or at least 8 cores, which would rip Haswell apart and would be the new recommended spec for new games. Tell me again how it is not a valid comparison?
     
  27. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    A Clevo desktop replacement, I guess. I'm hoping I can hold onto the M15x. In 2015 an M15x with a 970M is already potent, and an OC'd 920XM doesn't bottleneck it, so things look good so far. I just can't upgrade for another year or so.
     
    TomJGX likes this.
  28. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Yes and no. It holds max performance back in some titles, and slightly in general, but not how you think it does. I still get a lot more performance than a single 980m. My monitor is OC'd to 96Hz, and I can play a lot of games at 96Hz with vsync; others are a bit more mixed.

    In some benchmarks, for example the new FF14 Heavensward, I get an average higher than 120fps maxed out. Same for games like BioShock Infinite.

    If I had a CPU that could run at higher frequencies indefinitely, sure, I would get more performance, but that doesn't mean I'm stuck with low performance, nor really held back like an AMD CPU with a high-end GPU. Then again, I consistently get better performance than a 4700, and basically slightly higher than a stock 4800MQ.