I read an article today that suggested DX12 will boost frame rates by 60% and cut power by 50% due to greatly reducing CPU overhead!
This seems to follow in Mantle's footsteps which is a good thing.
Is it likely, therefore, that older machines, like my old M15x equipped with the 920XM for example, will receive a new boost of longevity?
The reason I ask (and I think there must be others in the same position) is that I must soon decide whether to overhaul the laptop: pop in a new GPU, OS and SSD, and perhaps up the RAM to 16GB. Older gaming laptops excel in that they remain upgradeable and reliable/well built. The alternative is to part it out and buy a new machine.
If, a year from now, the new APIs are what they are made out to be, would the overhauling option be a viable alternative to buying a new machine? I know it could potentially be cheaper, and certainly better for the environment, but only if the CPU can keep up.
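For anyone wondering why lower CPU overhead should translate into frame rate at all, here is a minimal toy model. This is not real DirectX code; the per-draw costs and GPU frame time are numbers I invented purely for illustration:

```cpp
// Toy model, not real DirectX code: why lower per-draw CPU overhead raises fps.
// All numbers (per-draw costs, GPU frame time) are invented for illustration only.
#include <algorithm>
#include <cstdio>

int main() {
    const int draws_per_frame = 5000;

    // Hypothetical per-draw CPU cost in microseconds:
    // "old-API style" = driver validates state on every call,
    // "new-API style" = work pre-recorded into command lists, little per-call work.
    const double us_per_draw_old_api = 10.0;  // assumption
    const double us_per_draw_new_api = 2.0;   // assumption

    const double gpu_ms = 12.0;  // assume the GPU itself needs 12 ms to render the frame

    const double cpu_ms_old = draws_per_frame * us_per_draw_old_api / 1000.0;  // 50 ms
    const double cpu_ms_new = draws_per_frame * us_per_draw_new_api / 1000.0;  // 10 ms

    // Frame time is limited by whichever side is slower: CPU submission or GPU rendering.
    const double frame_old = std::max(cpu_ms_old, gpu_ms);
    const double frame_new = std::max(cpu_ms_new, gpu_ms);

    std::printf("old API: %.0f ms CPU submit -> %.0f fps (CPU-bound)\n", cpu_ms_old, 1000.0 / frame_old);
    std::printf("new API: %.0f ms CPU submit -> %.0f fps (GPU-bound)\n", cpu_ms_new, 1000.0 / frame_new);
    return 0;
}
```

The exact numbers don't matter; the point is that on an older CPU like a 920XM the submission side is the likelier bottleneck, so anything that shrinks it helps disproportionately.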
Let's discuss!
-
King of Interns Simply a laptop enthusiast
-
Tinderbox (UK) BAKED BEAN KING
Sounds like snake oil. Manufacturers would hate it, as they want you to buy a new notebook for every irrelevant update.
Also, normally your graphics card has to support the DX version you want to use, so how does a DX9 card run DX12?
John. -
King of Interns Simply a laptop enthusiast
The M15x's graphics card is easily changeable. Cards up to the 970M/980M are already supported, and those supposedly support DX12 themselves. Install Win 10 and the API ought to work right.
By next year, next-gen GPUs should be out that, I hope, are still built to the MXM 3.0 specification and support the full DX12 feature set. Not so impossible, I hope.
You could be right, but then again laptops already feature soldered CPUs, even gaming ones, so better APIs are one way to make laptops last a little longer. -
With current soldered POS CPUs, and considering that most of the market share is held by underpowered CPUs (the most notable example being the GT80: a 47-something-HQ with 980M SLI, LOL), it only makes more sense to offload the work to where it should be: the GPU(s). As you already said, it's more than welcome for those of us who enjoy older machines. If I were you, I'd keep the 15.
Of all the 16:9 machines, only the M15x-R1 makes it onto my "would like to have it" list. Kinda reminds me of my 5920G: it would take whatever GPU you throw at it, and has room for improvement. The missing eDP port leaves a lot to the imagination (*cough* DreamColor (or 4K, if you fancy useless resolutions) *cough*).
Last edited: May 3, 2015 -
dumitrumitu24 Notebook Evangelist
I'm sure it will help, but I don't think it will be 60%. They always make big news and in the end it won't be that drastic, but it will help, especially maybe with CPU temps, and give a boost of about 20%. Will there be any DX12 games this year besides Fable? I think only Fable has been announced as DX12.
-
You may see some improvements. Won't know 'till you try.
-
Keep it. Your M15x is a prime example of why BGA sucks.
King of Interns, TomJGX, TBoneSan and 2 others like this. -
I wouldn't get my hopes up, again. Mantle was a major disappointment; it does not work at all with my 7970M CrossFire combo.
It's also very likely that older video cards won't support DirectX 12. I hope I am mistaken, though. But manufacturers are lazy; they usually don't update older models and just focus their support on their newest line-ups. -
Starlight5 Yes, I'm a cat. What else is there to say, really?
King of Interns, your M15x is very remarkable and certainly deserves an upgrade. On the other hand, it may be a good time to sell it and move on to something else. Your CPU will certainly keep up, unless it overheats and throttles that is... but what are your priorities, anyway?
-
fatboyslimerr Alienware M15x Fanatic
Have to say I've been impressed with Mantle in Dragon Age: Inquisition. 100% GPU usage with a mix of high and medium settings, and nicely balanced CPU usage across 8 threads at between 30 and 70% utilisation. I think this is how Mantle is meant to work, helping me get better fps on my older CPU, so it bodes well for DirectX 12. It depends how well devs manage to incorporate the low-level nature of DirectX 12 into their games. That 60% is probably a best-case scenario, with the real world being something like 20-30%.
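To illustrate the "nicely balanced across 8 threads" part: here is a rough sketch in plain C++ of how an engine can record draw work on several worker threads and then hand the finished lists to the GPU queue in one go, which is why the load shows up spread over all cores instead of one hot render thread. These are not actual Mantle or DX12 API calls; CommandList and record_chunk are stand-ins I made up:

```cpp
// Rough sketch, not actual Mantle/DX12 API calls: recording draw work on
// several worker threads so the CPU cost is spread across all cores.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct CommandList { int recorded = 0; };  // made-up stand-in for an API command list

// Hypothetical per-thread job: each worker fills its own command list.
void record_chunk(CommandList& cl, int first_draw, int last_draw) {
    for (int d = first_draw; d < last_draw; ++d) {
        // a real engine would issue the actual draw call into the list here
        ++cl.recorded;
    }
}

int main() {
    const int total_draws = 8000;
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;  // fall back if the core count is unknown

    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;
    const int chunk = total_draws / static_cast<int>(workers);

    // Each worker records its own slice of the frame's draws in parallel.
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back(record_chunk, std::ref(lists[i]),
                          static_cast<int>(i) * chunk, static_cast<int>(i + 1) * chunk);
    for (auto& t : pool) t.join();

    // In a real renderer, a single queue submit would then execute all lists in order.
    for (unsigned i = 0; i < workers; ++i)
        std::printf("worker %u recorded %d draws\n", i, lists[i].recorded);
    return 0;
}
```

With older immediate-mode APIs, most of that recording had to happen on one render thread, which is exactly why a strong single-core overclock (like a pushed 920XM) has mattered so much up to now.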
Last edited: May 4, 2015
Kade Storm, TomJGX and triturbo like this. -
Gotta think though... There aren't many machines out there that could be considered an upgrade from your current machine... Not when we're talking about future-proofing.
fatboyslimerr likes this. -
But as for the other HQ processors that drop below 3GHz, it's a shame. It doesn't matter if they are soldered; you would only need TDP control. -
Isn't it holding back the SLI? @HTWingNut showed us that a 4700MQ can hold back a single 980M, let alone two of them. Anyway, DX12 should be a good thing. Not the miracle 60% (just like CrossFire and SLI don't scale 100%), but a nice improvement nonetheless. Thank you, Mantle, for paving the road to GPU independence.
-
I do understand. I'll try to explain better: it is probably fine and dandy for most games now, but as time goes on, it won't be (or at least it wouldn't be if it weren't for DX12, but let's see how that turns out). Take for example the M17x-R2 (one of the best machines ever made, in my opinion). It can get pretty nice GPU upgrades, but they only make sense if the CPU is an XM and clocked pretty high on top of it. A not-so-bright CPU is kinda OK for a single GPU, but CF and SLI require more horsepower to get the work done, and time doesn't spare even the most powerful CPUs. See where I'm going? (I hope yes.)
-
No I don't, just give it some time. C2Ds were awesome too, no? It would be great if we could compare the same game with and without the DX12 treatment, say, three years down the road. And a Clarksfield at ~4.5GHz would be comparable to a stock 4700MQ... good luck getting it there.
It seems that DX12 is right on time for all the BGA things (not using stronger words), especially seeing the pace of development of processors of late (Haswell, Broadwell, not to mention AMD, but let's not forget that it's their idea (the Mantle thing)).
-
I know right, I have a gift
I'll try to make it better (and probably fail). The question was: isn't a kinda underpowered CPU holding back the 980M SLI (not to mention the promised two GPU upgrades down the road)? This is where DX12 comes in, shifting the load to the GPU so the CPU is only needed to get the OS running (exaggeration). If it weren't for that, the GT80 would struggle to get the most out of its GPUs a couple of years down the road, not to mention if one were to get one of those GPU upgrades that MSI is promising. All the CPUs were great, or at least OK, at the time of their release, but as time goes on, all of them get less and less relevant. So, DX12 is here to save us all (hopefully). Clear? No? Well, I can't word it better, sorry.
-
Starlight5 Yes, I'm a cat. What else is there to say, really?
triturbo, most i7 quads starting with 2nd gen, and highly overclocked 1st gen, are still relevant despite the years passed. It's the i5 & i7 duals that get dated (while i3s and below are pff right from the start). It may change with Skylake... or may not.
-
We were gaming at 60Hz up until now, yet the C2D is nowhere to be found in current games' recommended specs. That's what I meant. You don't know what future games will require as minimum or recommended specs. Although I see your point: just as the C2D is not exactly a gaming/computing monster, we are at a point of development where it's perfectly fine for all the rest of the everyday tasks. Same goes for the 4700HQ; even though it's not the best one around, it might well stay relevant for some years to come (for gaming, that is). Time will tell, but DX12 should help a good deal as well. And that's what I'm counting on with my first-gen i7.
-
Why should C2D be in the min spec of any modern game? It's extremely underpowered and outdated. Even your Clarksfield i7 rips it to shreds. Comparing it to Haswell would be an utter joke. For a long time the general trend has been toward CPUs with more cores and threads. Only in the last few years have games started to really take advantage of them on a wide scale. You can thank current-gen consoles and low-level APIs (DX12 and Mantle/Vulkan) for that.
-
Very good question OP.
I look forward to DX12 and Windows 10. It could add another year or so of longevity to my two notebooks. -
King of Interns Simply a laptop enthusiast
This is good to read! I agree with everyone here
Of course we won't know the improvements until it gets tested, but by next year the proof will be in the pudding, as it were!
If all goes well I'll keep this thing alive. Win 10, a next-gen NVIDIA or (god forbid) AMD GPU, 16GB RAM and a new SSD will be bought! If not, then I guess it will be time to move on.
fatboyslimerr and Starlight5 like this. -
You'll move on to what? It's kinda rhetorical, since at least for me, and I believe for you too, only Clevo's ZMs are worth looking at. Or desktops, but I have a couple more projects in mind before settling for one.
-
King of Interns Simply a laptop enthusiast
A Clevo desktop replacement, I guess. I'm hoping I can hold onto the M15x. In 2015 an M15x with a 970M is already potent, and an OCed 920XM doesn't bottleneck it, so things look good so far. I just can't upgrade for another year or so.
TomJGX likes this. -
In some benchmarks, for example the new FF14 Heavensward one, I get an average higher than 120fps maxed out. Same for some games like BioShock Infinite.
If I had a CPU that could run at higher frequencies indefinitely, sure, I would get more performance, but that doesn't mean I'm stuck with low performance, nor really held back like an AMD CPU with a high-end GPU. Then again, I get consistently better performance than a 4700, and basically slightly higher than a stock 4800MQ.
DX 12 benefits for older laptops with older CPU's?
Discussion in 'Gaming (Software and Graphics Cards)' started by King of Interns, May 3, 2015.