If you flashed a ROM for an Nvidia GeForce GTX 470M onto a 485M, would it just disable the extra cores and lower the memory bandwidth? Because then it would also lower the power consumption on battery and increase the battery life. Obviously you would have to flash it back if you wanted the full performance, but if you need the battery life it's not a bad option.
Edit: or perhaps using NiBiTor you could disable even more cores and get even better battery life.
-
I don't think you can turn cores on or off through NiBiTor, and AFAIK the 470M and 485M have totally different core counts, etc. Flashing this could brick your card.
-
I don't think that works either.
I think the GTX 470M is a GF104 and the GTX 485M is a GF106 core. -
They are both GF104; the 485M is a fully enabled GF104, whereas the 470M has 3/4 of the memory bandwidth and fewer cores. I read a thread on the Nvidia forums about how some drivers actually disabled cores.
-
Being able to disable cores would be an absolute godsend. No more fuss with Optimus and things like that when you can effectively turn your high-power GPU into the equivalent of an Intel IGP. As soon as the load ramps up it would re-enable the cores and you'd have all the beastly power back again.
-
Nvidia GPUs have very good automatic clocking, ramping the card down when it's not in use. I'm not sure whether it shuts off cores when doing so.
But I highly doubt you would get battery life anywhere near an IGP by shutting down cores, even if you theoretically could.
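If anyone wants to see what the driver is actually doing at idle, here's a rough sketch that polls the clocks (and power draw, where supported) through NVML using the pynvml Python bindings. It assumes pynvml is installed and that NVML recognises your card; power readings in particular are often unsupported on mobile GPUs, hence the fallback.

```python
# Rough sketch: poll the GPU's clocks (and power, where the driver supports it)
# through NVML via the pynvml bindings, to watch the card ramp down at idle.
# Assumptions: pynvml is installed and NVML recognises the GPU; power queries
# are frequently unsupported on mobile parts, hence the try/except.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(10):
        sm = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        try:
            watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
            print(f"SM {sm} MHz, memory {mem} MHz, {watts:.1f} W")
        except pynvml.NVMLError:
            print(f"SM {sm} MHz, memory {mem} MHz (power query not supported)")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```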
-
I agree, I don't think you could get much better battery life. Trying to eke every last minute of battery life out of my NP8130, I downclocked the GPU to some pathetically low minimum clock speed and it made nary a difference in battery life. I know it's not the same as turning off cores, but still, a 50 MHz clock when it's normally around 200 MHz at the desktop should show a significant improvement, but it doesn't.
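That result makes sense if you do the math: at idle the GPU is only a small slice of what the whole laptop draws, so even cutting its idle power substantially doesn't move the needle much. Here's a quick back-of-envelope in Python; every number in it is an assumption picked for illustration, not a measurement from an NP8130.

```python
# Back-of-envelope for why a big GPU downclock barely moves battery life:
# at idle the GPU is only a small slice of total platform draw.
# Every number here is an assumed figure for illustration, not a measurement.
battery_wh = 60.0         # assumed battery capacity in watt-hours
platform_idle_w = 18.0    # assumed whole-laptop idle draw (screen, CPU, chipset, ...)
gpu_idle_w = 4.0          # assumed GPU share of that at stock idle clocks
gpu_downclocked_w = 3.0   # assumed GPU share after an aggressive downclock

before = battery_wh / platform_idle_w
after = battery_wh / (platform_idle_w - gpu_idle_w + gpu_downclocked_w)
print(f"{before:.2f} h -> {after:.2f} h (+{(after - before) * 60:.0f} min)")
```

On numbers like those you gain maybe ten or fifteen minutes, which lines up with "nary a difference".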
-
I think things were simpler with the G92 architecture because there were so many G92-based cards; those were the only ones you could flash like this. Though even then the shaders would still run, so you wouldn't actually gain anything by flashing it.
Does anyone know of any other cards that have been successfully flashed with a vBIOS for a different number of shaders? Or a different memory bus, for that matter. I don't think it's been done...
An FX 2700M would not flash to a 9800M GS.
As for a more realistic power-saving option, one could attempt to switch off the MXM slot with the BIOS or hardmods (either way the BIOS would need modding to boot) and then try to use some kind of SDVO mPCIe card from an embedded system. -
The way I see it is this: we all know that Optimus gives very real increases in battery life, quite often amounting to over an hour. The problem with Optimus is knowing when to switch on the dedicated GPU; that's when problems start to happen. Current dedicated GPUs have different profiles, with ones for 3D, 2D, etc. If the 2D profile also shut off the majority of the cores, it would run on a fraction of the power. As soon as you push the card, the 3D profile would activate and you'd have all your cores to work with again. Since there's only one GPU, you wouldn't have any problems deciding when to enable the dedicated card (see the rough sketch below).
Optimus is marketed as being the best of both worlds. If you implemented what I'm suggesting, I would call it "the best of the best of both worlds".
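In pseudo-code, the policy being described might look something like this. To be clear, the core-gating call is purely hypothetical; nothing in the public driver or NVML lets you switch shader cores on and off, so only the load polling part is real.

```python
# Purely hypothetical sketch of the policy described above: watch GPU load and
# switch between a stripped-down "2D" profile and the full "3D" profile.
# set_active_cores() stands in for driver functionality that does not exist;
# only the utilization polling (via pynvml) is real.
import time
import pynvml

def set_active_cores(profile):
    # Hypothetical hook - no public driver or NVML call can gate shader cores.
    print(f"[pretend] switching GPU to the {profile} profile")

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
current = None

try:
    while True:
        load = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # percent
        wanted = "3D (all cores)" if load > 30 else "2D (minimal cores)"
        if wanted != current:
            set_active_cores(wanted)
            current = wanted
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```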
Disable Cuda Cores?
Discussion in 'Gaming (Software and Graphics Cards)' started by aduy, Nov 17, 2011.