The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Disable Cuda Cores?

    Discussion in 'Gaming (Software and Graphics Cards)' started by aduy, Nov 17, 2011.

  1. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    If you flashed a ROM for an NVIDIA GeForce 470M onto a 485M, would it just disable the cores and lower the memory bandwidth? That would also lower the power consumption while on battery and increase the battery life. Obviously you would have to flash it back if you wanted the full performance, but if you need the battery life it's not a bad option.

    edit: or perhaps using NiBiTor you could disable even more cores and then get even better battery life.
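
    A rough sketch of the battery math, in Python; every wattage and capacity below is an assumed placeholder, not a measurement from a real 485M or 470M:

    ```python
    # Back-of-the-envelope battery-runtime estimate. All constants are
    # assumed placeholders, not measured values for a real 485M/470M.

    BATTERY_WH = 77.0        # assumed battery capacity, watt-hours
    BASE_PLATFORM_W = 18.0   # assumed screen/CPU/rest-of-system draw
    GPU_FULL_IDLE_W = 10.0   # assumed 485M idle draw, all cores enabled
    GPU_CUT_IDLE_W = 7.5     # assumed idle draw with a 470M-like cut

    def runtime_hours(gpu_watts: float) -> float:
        """Battery runtime if the GPU idles at gpu_watts."""
        return BATTERY_WH / (BASE_PLATFORM_W + gpu_watts)

    full = runtime_hours(GPU_FULL_IDLE_W)
    cut = runtime_hours(GPU_CUT_IDLE_W)
    print(f"all cores  : {full:.2f} h")
    print(f"cores cut  : {cut:.2f} h")
    print(f"extra time : {(cut - full) * 60:.0f} min")
    ```

    Under those made-up numbers the cut buys roughly 16 extra minutes; whether a vBIOS flash would actually reduce idle draw that much is exactly the open question.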
     
  2. trvelbug

    trvelbug Notebook Prophet

    Reputations:
    929
    Messages:
    4,007
    Likes Received:
    40
    Trophy Points:
    116
    I don't think you can turn cores on or off through NiBiTor, and AFAIK the 470M and 485M have totally different core counts, etc. Flashing this could brick your card.
     
  3. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I don't think that works either.

    The GTX 470M, I think, is a GF104, and I think the GTX 485M is a GF106 core.
     
  4. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    They are both GF104. The 485M is a fully enabled GF104, whereas the 470M has 3/4 the memory bandwidth and fewer cores. I read a thread on the NVIDIA forums about how some drivers actually disabled cores.
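
    For reference, the commonly listed configurations (from public spec sheets, so treat them as assumptions worth double-checking) line up with the 3/4 figure:

    ```python
    # Commonly quoted GF104 mobile configurations (public spec
    # listings; verify against your own card before relying on them).
    gtx_485m = {"cuda_cores": 384, "bus_width_bits": 256}
    gtx_470m = {"cuda_cores": 288, "bus_width_bits": 192}

    core_ratio = gtx_470m["cuda_cores"] / gtx_485m["cuda_cores"]
    bus_ratio = gtx_470m["bus_width_bits"] / gtx_485m["bus_width_bits"]

    print(f"core ratio: {core_ratio:.2f}")  # 0.75 -> 3/4 of the cores
    print(f"bus ratio : {bus_ratio:.2f}")   # 0.75 -> 3/4 the bandwidth at equal memory clocks
    ```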
     
  5. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    Being able to disable cores would be an absolute godsend. No more fuss with Optimus and things like that when you can effectively turn your high-power GPU into the equivalent of an Intel IGP. As soon as the load ramps up it would enable the cores again, and then you have all the beastly power back.
     
  6. trvelbug

    trvelbug Notebook Prophet

    Reputations:
    929
    Messages:
    4,007
    Likes Received:
    40
    Trophy Points:
    116
    AFAIK, unless they are strictly identical in structure like the 280 and 285, it is unwise to flash a card with a different BIOS.

    NVIDIA GPUs have very good automatic clocking, ramping the card down when not in use. I'm not sure if it shuts off cores when doing so.

    But I highly doubt you would get much improved battery life (similar to an IGP) by shutting down cores, even if you theoretically could.
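
    If you want to see what the card is actually doing, here is a minimal sketch that polls the driver's reported power state and clocks through nvidia-smi (assuming an NVIDIA driver that ships the tool; older mobile GPUs may return "N/A" for some fields):

    ```python
    # Poll the GPU's power state, clocks, and draw via nvidia-smi.
    # Requires an NVIDIA driver that ships nvidia-smi; some fields
    # (notably power.draw) report "N/A" on older mobile GPUs.
    import subprocess

    def gpu_state() -> str:
        result = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=pstate,clocks.gr,clocks.mem,power.draw",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True)
        return result.stdout.strip()

    if __name__ == "__main__":
        # Illustrative output only: "P8, 50 MHz, 135 MHz, N/A" at idle,
        # "P0, 575 MHz, 1500 MHz, 55.00 W" under load.
        print(gpu_state())
    ```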

    Sent from my samsung galaxy s2 using tapatalk
     
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I agree, I don't think you could get much better battery life. In trying to eke every last minute of battery life out of my NP8130, I downclocked the GPU to some pathetic minimum clock speed and it made nary a difference in battery life. I know it's not the same as turning off cores, but still, a 50MHz clock, when it's normally around 200MHz on the desktop, should show a significant improvement, but it doesn't.
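
    A toy model of why that result is expected (the wattages are invented assumptions; only the scaling argument matters): dynamic power scales roughly with C·V²·f, but at idle the voltage, the static leakage, and the rest of the platform don't move with the clock.

    ```python
    # Toy model: why a clock-only drop barely shows up in battery life.
    # Dynamic power scales ~C * V^2 * f; with voltage fixed, cutting f
    # only shrinks the (small) dynamic slice. All constants assumed.

    BASE_PLATFORM_W = 18.0   # assumed screen/CPU/chipset draw
    GPU_STATIC_W = 5.0       # assumed leakage + memory/IO, clock-independent
    DYN_W_AT_200 = 3.0       # assumed dynamic draw at the normal 200 MHz

    def total_watts(freq_mhz: float) -> float:
        dynamic = DYN_W_AT_200 * (freq_mhz / 200.0)  # linear in f at fixed V
        return BASE_PLATFORM_W + GPU_STATIC_W + dynamic

    for f in (200, 50):
        print(f"{f:>3} MHz -> {total_watts(f):.1f} W total platform draw")
    # 200 MHz -> 26.0 W; 50 MHz -> 23.8 W: under a 10% cut in total
    # draw, so at best ~10% more runtime -- hard to notice in practice.
    ```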
     
  8. niffcreature

    niffcreature ex computer dyke

    Reputations:
    1,748
    Messages:
    4,094
    Likes Received:
    28
    Trophy Points:
    116
    I think the g92 architecture was simplified because there were so many g92 based cards, these were the only you could flash like this. Though even then the shaders would still run and you wouldn't be doing anything by flashing it.

    Anyone know any other cards that have been successfully flashed a vbios with a different number of shaders? Or memory bus for that matter. I don't think its been done...
    fx 2700m would not flash to 9800m GS.

    As for a more realistic power saving option, one could attempt to switch off the MXM slot with BIOS or hardmods (either way BIOS would need modding to boot) and then try to use some kind of SDVO mpcie card from an embedded system.
     
  9. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    If you're just downclocking then the voltage won't change. You'll still be running at something like 0.9V on the whole chip. If you disable the cores then, although you'll be running at the same voltage, you'll have maybe only a tenth of them actually drawing power.
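
    As a sketch of that argument (assumed numbers again; only the proportions matter): at a fixed voltage, the dynamic part of the power scales with how many cores are active, while leakage across the whole chip stays put.

    ```python
    # Core-gating sketch: at fixed voltage, dynamic power scales with
    # the number of active cores; static leakage does not. Constants
    # below are assumptions, not measurements.

    TOTAL_CORES = 384   # full GF104 shader count
    DYN_W_ALL = 3.0     # assumed idle dynamic draw, all cores active
    STATIC_W = 5.0      # assumed whole-chip leakage, unaffected by gating

    def gpu_watts(active_cores: int) -> float:
        dynamic = DYN_W_ALL * (active_cores / TOTAL_CORES)
        return STATIC_W + dynamic

    print(f"all 384 cores: {gpu_watts(384):.2f} W")
    print(f"38 cores     : {gpu_watts(38):.2f} W")  # ~90% of dynamic draw gone
    ```

    The dynamic slice drops by about 90%, but the leakage floor remains, which is why core gating alone still wouldn't reach true IGP-level draw.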

    How I see it is this. We all know that Optimus gives very real increases in battery life, quite often amounting to over an hour. The problem with Optimus is knowing when to switch on the dedicated GPU; this is where problems start to happen. Current dedicated GPUs have different profiles, with ones for 3D, 2D, etc. If the 2D profile also shut off the majority of the cores then it would run on a fraction of the power. As soon as you push the card, the 3D profile would activate and you'd have all your cores to work with again. Since there's only one GPU, you won't have any problems with when to enable the dedicated card.

    Optimus is marketed as being the best of both worlds. If you implemented what I am suggesting then I would call it "the best of the best of both worlds".