Sorry for going a little off topic, but I need to ask you something, if you can help me please.
I have been wondering in the past about voltage table modifications. Supposedly our voltage ICs have no registers, so we need to flash the values into our VBIOSes.
They are limited in how much voltage the circuit can supply and are hard ROM programmed, which means that we are limited to the VID values.
My 8700M GT has 7 VID entries. 4 of them are in use: VID 2 = 1V, VID 7 = 1.15V, VID 5 = 1.25V, VID 3 = 1.32V.
VIDs 0, 1, 4 and 6 are there but unused. How can I know which voltage the IC is programmed to use with any of those VIDs? Since there is no simple hierarchy/number order, how can I know if any of those VIDs hides a higher voltage than 1.32V?
As a small assumption: if the 7, 5, 3 order increases voltage, shouldn't VID 1, the last odd number, hide some more voltage?
I made this table to simplify what I think it may be:
Maybe 1.38V there, no? 800 core / 2000 shader / 1000 vram, that would be something...
Thanks again, and apologies for the off topic.
-
King, you should rerun the Lost Planet 2 demo. Curious what your results would be now.
-
moral hazard Notebook Nobel Laureate
By the way, I forgot to mention that I changed the critical threshold to 95C.
Can you upload a copy of your vBIOS? -
Thanks for the help!
-
-
moral hazard Notebook Nobel Laureate
Well I'm sorry but I can't help you Audigy.
You have helped me though.
I think the way we were trying to change the voltage in this thread was not correct. I believe we were just playing with the label and not actually changing the real voltage.
Here is my proof:
Open nibitor, load the vBIOS.
Go to the voltages tab.
Open hexview.
Click on the "extra" voltage.
Note the value in hexview.
Now go tools > voltage table editor.
Change the label for the "extra" VID.
Press OK.
Click on VID mode and back to Exact mode (to refresh).
Click on your "extra" voltage again and see that the value changed in hex view.
I believe this explains why even 1.16V didn't help Kingpinzero overclock.
The question is, why did my vbios help him overclock a bit more. Not sure.
I think there is a way to actually change the voltage in the vBIOS.
I have seen a guide about it before somewhere. -
moral hazard Notebook Nobel Laureate
By the way Audigy, if you can find the real place in hexview to change the voltage, that would be one easy way to find the real voltage of your unknown VIDs.
Choose VID 1 or something > check in hexview what voltage that new VID has.
I also have an idea why Kingpinzero and I got slightly higher clocks with my vbios, but I won't say why till I test it. -
Kingpinzero ROUND ONE,FIGHT! You Win!
@Rosh: I'll do some benches with LP2 when I get home. I can say that I played a bit of Medal of Honor 2010 @ WXGA+, maxed out, no AA and no vsync, and I benched min fps 32, max fps 54. Not bad at all.
@MH: as always I'm available to test it. As you said, the reason imho is in the VID table. It needs to show up correctly in the bios imho. -
The voltage IC is programmed to give a determined voltage range, which is controlled by the VID expressed as a decimal value inside the VBIOS. The label is just a label; the voltage IC doesn't use it. Modifying it will not change the voltage because the VID is the same.
Using my case as an example, VID 3 will always give 1.32V, VID 2 -> 1V, VID 5 -> 1.25V, VID 7 -> 1.15V and so on.
These VID values are limited: some ICs support only 2 alternating values, some 5, and in our cases 8. What matters here is the unused VIDs, 0, 1, 4, 6, and whether any of them identifies a higher voltage.
Unfortunately the VID order does not always scale with voltage, and in some cases the unused VID values have unknown voltages, with no sustainable relation between them. That's the problem in my case. 7, 5 and 3 represent growing order in voltage. 2 should then be a higher value, but it's not; it represents 1V. So I started thinking that maybe there is a relation between even VIDs and odd ones. Like the graph I made before: if this line of thinking is correct, VID 1 should represent a higher voltage than 1.32V.
Again, this is speculation, just my theory. If that VID represents a very low voltage, I may need to recover my card if the 3D profile is used/checked during POST. I may lower the clocks to be on the safe side, but nothing is certain here; it's pretty much an unknown situation. I will need to dig more into this before making any mistake. I'm thinking of buying a 3700M in the next months, but I cannot risk my card now cause I need the computer.
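To put the situation in a minimal sketch (Python, purely illustrative; the unknown VIDs' voltages are programmed into the IC's ROM and can't be read out of the VBIOS):

```python
# Known VID -> voltage pairs from my 8700M GT's VBIOS voltage table.
known = {2: 1.00, 7: 1.15, 5: 1.25, 3: 1.32}

# The other VIDs exist in the IC, but their programmed voltage is unknown.
unknown_vids = [vid for vid in range(8) if vid not in known]
print(unknown_vids)  # -> [0, 1, 4, 6]
```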
Focusing on your case, the 3700M has a voltage IC with 8 voltage identifier values.
Converting each label from hex to dec:
[VID] [Voltage]
[0] 0.70V
[1] 0.75V
[2] 0.80V
[3] 0.85V
[4] 0.90V
[5] 0.95V
[6] 1.00V
[7] 1.03V
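That hex-to-dec step can be sketched like this (the raw byte values here are my assumption, inferred back from the listed voltages: one byte per label, whose decimal value is the voltage in hundredths of a volt, e.g. 0x67 = 103 -> 1.03V):

```python
# Assumed raw label bytes from the voltage table (0x67 = 103 -> 1.03V).
raw_labels = [0x46, 0x4B, 0x50, 0x55, 0x5A, 0x5F, 0x64, 0x67]

for vid, raw in enumerate(raw_labels):
    print(f"[{vid}] {raw / 100:.2f}V")  # [0] 0.70V ... [7] 1.03V
```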
Unfortunately, all the VIDs have voltages <= the default one, so no luck with your 3700M.
The unused VIDs exist (VID bitmask = 7) but the VBIOS voltage table doesn't include them, so it's kind of a lucky shot in there... -
moral hazard Notebook Nobel Laureate
When I flash a 9800m GT vBIOS, it has 1.05V, does this really give my card 1.05V?
If so, why can't I mod the 3700m vBIOS to change the VID?
One more thing, has anyone tried a 9800m GS vBIOS on a 3700m? -
The label is just a representative value; what matters is the VID, which is what the voltage IC reads. If you flash another VBIOS referring to 1.05V (or any other value) but with VID 7 (the top VID), the voltage stays the same, 1.03V.
The only way is hardmodding it/pencil mod, but considering the miniaturization of these cards, it may be quite difficult to achieve. Not only that, it's virtually impossible to check the voltages with a multimeter while it's working, due to its location and dissipation system.
No luck here my friend... if in the meantime I discover something I will contact you.
Thanks for all the help! -
moral hazard Notebook Nobel Laureate
Thank you for all the help, you've explained a lot.
A bit sad that it would take a pencilmod to get the voltage up.
I'll try to find the voltage IC and see how hard the mod is (have seen it done on desktops, never mobile cards). -
Kingpinzero ROUND ONE,FIGHT! You Win!
So no luck with bios modding?
That explanation was awesome!
That finally explains why IBM cards are performing better than mine; maybe the IC chip is different, including the way it works.
I'm curious: wouldn't desoldering it and soldering on a newer one, like the IBM cards have, fix these unlucky cards? -
Great work guys. I guess this proves that pushing it beyond 1.03 isn't really reasonable. Too bad. I was hoping to be able to push the card further in the future when the performance starts to become limiting.
-
Kingpinzero ROUND ONE,FIGHT! You Win!
Well anyway, the latest MH bios did improve my performance a lot.
I forgot to add that I wasn't running the clocks exactly as I mentioned; it's 575/1455/900.
Just a 5+ gain on shaders, but it's stable. Did some full runs in Furmark, in 3DMark06 and with gaming.
As for the "adaptive" setting MH: go into nvidia control panel, Manage 3D settings, and there should be Power management or something like that in the list.
Enable Adaptive. The vga will downclock instantly if there's no 3d load; even with standard 3d clocks it stays around 300mhz on core, so it idles around 41-45c, which is awesome. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Sorry to bother you guys, but i did a few more benches and tests.
Clocks: 571/1450/950
Cpu: 3,45 Ghz
Furmark: fullrun, 640x480, 8xMsaa, 8 minutes.
Score:
3dMark06: fullrun, 1280x800, default settings.
Score:
Obviously I've used the MH bios posted above. Any thoughts? -
Kingpinzero ROUND ONE,FIGHT! You Win!
For rosh:
Stable Clocks with LP2 567_1450_940
CPU 3,45ghz
no AA, no Vsync, 1600x900
-
Something isn't adding up. If Audigy's theory on voltage IC tables is correct, then MH's mod for 1.06v shouldn't have made any difference. The fact that it appears to be making a difference seems to disprove that the table entries are a limiting factor in selecting voltage.
-
Kingpinzero ROUND ONE,FIGHT! You Win!
Well, I'm living proof that MH's bios is doing magic on my fx.
Give it a try, it's really stable.
That's why I've been thinking that the correct vid reported in the bios does affect driver and stability. Something is leading me that way. I'll wait for MH and rosh, I want to know what they think about my recent benchmarks.
Hell, I broke the 13k barrier with it! And I dunno what score I could get if I had a quad like niff has.. -
Was your previous VBIOS 62.92.6E.00.10 based, from the same vendor? In other words, was it this one but with the original values?
If not, you are just experiencing a newer, better VBIOS. If you flash this one with the 1.03V label, it will provide the same performance and overclocking ability. The same if you flash it with a 1.80V label; it doesn't matter, the VID stays the same.
-
niffcreature ex computer dyke
Well....
I do think there is something to vbios modding with these cards.... but it's mere speculation at this point.. -
moral hazard Notebook Nobel Laureate
I'm still having problems with the .10 vBIOS and 3dmark06.
Running my CPU @3ghz and GPU @ 600/950/1500 I got just over 8k in 3dmark06.
Don't know what's going on there.
In win xp it's fine and with the old vBIOS it's also fine.
The combination of the .10 vBIOS and win7 isn't working for me.
EDIT: turned Powermizer off and it's all fine now, back to testing.. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Also, I've always run the .10 bios since the very first day the card arrived, so I'm pretty sure (and I repeat) that the last bios posted with correct vid tables is helping me out a lot.
Switching to vid 1.03 doesn't let me oc like that, just tried it. -
niffcreature ex computer dyke
Maybe 1.06v works for the 3700m because .03 x 2 is .06.
Not that that's a relevant theory based on fact or anything.
I sort of skimmed over Audigy's posts. Ultimately it does seem like one guess is as good as the next with the voltage IC, though... -
moral hazard Notebook Nobel Laureate
One thing that might help a lot is increasing the vram timings; we could probably get 1000mhz, since the datasheet for our vram says it should get there with a higher CAS.
Audigy, have you tried changing the timings on your GPU (or any GPU)? -
Hmm, I cannot find any technical explanation. Unless... considering that these 3700Ms are based on the 8800M's schematic, there is one intriguing factor that has always been going around in my mind.
The 8800Ms use the same G92 core as the 3700Ms but with 32 SPs laser-cut disabled. So why can all 8800M VBIOSes use 1.05V while the 3700M ones can only use 1.03V? Considering the higher clock and the extra SPs of the 3700M, this doesn't make any sense...
Until now I didn't have any reason to doubt those values; the maker/vendor surely knows this better than me.
So, let's start from the beginning. The GPU has dedicated pins to communicate with the voltage IC (which ultimately regulates the voltage). There are usually 1, 2 or 3 pins. In the case of the 3700M, the VID bitmask is 7 decimal, which means 111 in binary, one bit for each pin (3).
VID bitmask:
Code:
00000111b = 2^2 + 2^1 + 2^0 = 7d

000b = 0d
001b = 1d
010b = 2d
011b = 3d
100b = 4d
101b = 5d
110b = 6d
111b = 7d
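The same enumeration, sketched in Python (illustrative only):

```python
# A VID bitmask of 7 (0b111) implies three VID pins,
# hence 2**3 = 8 possible VID codes the GPU can signal to the voltage IC.
bitmask = 0b00000111
# Position of the highest set bit; for a contiguous mask of ones like
# 0b111 this equals the number of VID pins.
pins = bitmask.bit_length()

assert bitmask == 2**2 + 2**1 + 2**0 == 7

for vid in range(bitmask + 1):
    print(f"{vid:0{pins}b}b = {vid}d")  # 000b = 0d ... 111b = 7d
```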
Let's move now to the voltage IC. Until recently, these ICs were hard programmed (ROM). The regulation circuits were limited too, so they couldn't exceed the programmed amounts. This is the case for the 3700M.
However, there is a missing pawn on this chess board: the driver. Putting it all together, we have the driver that analyses the load and communicates with the GPU, which then checks the performance tables in the VBIOS and sends the correct VID output via the pinouts (3 bits) to the voltage IC. Being programmed, the IC knows what voltage the GPU wants.
But there is something I missed here. How (by means of which data) does the driver communicate with the GPU? The GPU doesn't understand loads; it needs a value. So the driver asks the GPU which voltages are available. The GPU checks the performance tables in the VBIOS and sends back the labels. Why the labels? Simple: because the driver doesn't understand VIDs. It needs workable data, so it can check which are the higher and lower ones and correlate them with the different stages of load.
When the driver finishes its work, it sends the labels back to the GPU. Then the GPU analyses which VID is the closest to that label value. If the label is 1.03V, the GPU will call the nearest VID, which from below is 1V. The algorithm will never exceed the label's voltage. So, what if I choose 1.06V for a label? If the voltage IC on the 3700M is the same as the 8800M one, the GPU will get the 1.05V VID. That may explain the improvements on Kingpinzero's 3700M.
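A minimal sketch of that selection rule (Python, purely illustrative; the voltage table assumes the 3700M's IC is programmed like the 8800M's, which is exactly the hypothesis here):

```python
# Hypothetical 8800M-style VID table (an assumption, per the reasoning above).
VID_TABLE = {0: 0.70, 1: 0.75, 2: 0.80, 3: 0.85,
             4: 0.90, 5: 0.95, 6: 1.00, 7: 1.05}

def pick_vid(label_volts):
    """Pick the VID with the highest programmed voltage that does
    not exceed the requested label voltage."""
    eligible = [(volts, vid) for vid, volts in VID_TABLE.items()
                if volts <= label_volts]
    return max(eligible)[1]

print(pick_vid(1.03))  # a 1.03V label lands on 1.00V -> VID 6
print(pick_vid(1.06))  # a 1.06V label lands on 1.05V -> VID 7
```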
-
Kingpinzero ROUND ONE,FIGHT! You Win!
Wonderful explanation Audigy.
I was not thinking about it in such "deep" and technical terms because I lack your knowledge, but I know the driver side of a gpu, how it is programmed and how it reacts.
That's why I've been saying all the time in my posts that MH did a thing that we didn't in the past: besides editing the value via HEX to reach a specific vid, he also added the correct VID entry in the VID table.
Now I am pretty sure that the vid table and its entries are theoretically labelled in order to get the driver to understand and recognize them. Based on my assumptions and on your huge amount of technical data, we can suppose that having a full vid table that reflects the vid entries on performance levels, instead of a blank one (that's where we failed in the past), helps the gpu to better understand how to handle the overclock based on what the driver asks of it.
I guess I can test a vid higher than 1.060v; prolly it would lead me to better OC stability, maybe breaking the 585 barrier on core.
BTW WHQL 260.89 nvidia has been released, Verde version. Ill install them and do some runs to see whats changed. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Ok i did some runs with the newer 260.89 with the last stable clocks i found.
Old benchmark (a few posts above):
Clocks: 571/1450/950
Cpu: 3,45 Ghz
Bios: .10 v1.06 Moral_Hazard
Resolution: 1280x800
Gpu Temperature: peaked 83°c
Driver: 260.63 beta
Score: 13,046
New benchmark:
Clocks: 566.7/1450/936
Cpu: 3,60 Ghz
Bios: .10 v1.06 Moral_Hazard
Resolution: 1280x800
Gpu Temperature: peaked 76°c
Driver: 260.89 WHQL
Score: 12,996
So with the newer drivers the score dropped a bit, but although the cpu was running at 3,60 ghz instead of 3,45 like in the old benchmark, the GPU reached 99% usage and peaked at 76°c compared to 83°c in the older benchmark.
It's clearly an improvement.
I doubt that lowering the CPU by 100mhz can decrease the score by a large amount, so these new drivers do offer better handling of temperatures on mobile quadros.
Also, based on what I've seen with this newer benchmark, max fps dropped about 1-2 frames while avg and min fps clearly improved to a solid rate; most of the time with Fraps benching in the background I got an increase of about 2-3fps without insane drops. It's just more stable.
That should smooth and improve gameplay. Now im off to work, but ill test how gpu usage changed in Dead Rising 2 and F1 2010, known to have a low gpu usage all around. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Ok, just a bit of testing before I leave for work: gpu usage at 99% on both dr2 and f1 2010.
While on f1 it gained a few fps in lower zones (as I said above, I was right, it's smoother, no more drops), it doesn't change the bad performance of the game, which is known to be bugged as hell.
On dead rising 2 max fps raised; min fps raised as well. With 720p, 4xAA, blur on, min was 25 (outside) and 30 (inside) while max topped 81fps. Adding or removing AA keeps the same performance.
Thats all for now
Ps: both tests conducted with no triple buffering and vsync. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Ok, I did some new benchmarks, this time at the same clockspeed as my 13k record and the same cpu speed.
Here are the details:
3dmark06 raised! So having 10mhz less on the gpu core does affect the gpu a lot; I lost like 5 fps, which is a lot! -
Ok, awesome results!
The website where Paralel got his FX doesn't have any more stock...
I REALLY REALLY want that graphics card for my m570ru, especially for a price similar to that. Does anyone know a secure place to get such a graphics card compatible with my pc, or does anyone have one and want to sell it for a similar price? I live in Portugal and don't have any experience buying things on the web, and that is what kept me from buying that card in the first place, but it is the best card that my laptop supports...
Thanks in advance -
So, doesn't anyone have any offer/suggestion? :\
-
moral hazard Notebook Nobel Laureate
You will find them on ebay, sometimes even under $200.
-
-
moral hazard Notebook Nobel Laureate
The one I bought came with 30days warranty and was only $150.
I've seen a few on ebay like it.
The HP cards should be good. I would not get a Lenovo card though; it might not give you an image on the internal screen (it would probably work with an external monitor). -
PS: Sorry if this was a stupid request but I'm totally a noob at this... :\ -
Would this one work?
Notebookparts.com: HP 8730w FX3700M 1GB NB9E-GLM3 MXM Video Card - 488125-001
the one that paralel bought was this one:
Notebookparts.com: HP Compaq 8730w FX3700M 1GB NB9E-GLM3 Video Memory Card - 460734-001
What are the differences between them?
Thanks for all your patience -
niffcreature ex computer dyke
I thought they were out of stock anyway?
Why don't you PM me and I can sell you one. There aren't many places that ship to portugal. -
.
If I don't get that one I would buy one from you. Really, thanks for your offer.
So, in theory, this card is compatible with my n750ru, right? -
I'd be careful about assuming you got a card from them. Someone else here ordered one and they came back some time afterward and said "Whoops, our mistake, we actually don't have any".
-
Offtopic: how is your RU doing? Is the "warranty" kicking in? I'm sorry for your loss and I understand it, because my RU had that problem before, but I thought it was because it had overheated, since I was on vacation in a hot place. Hope it all resolves soon.
Thanks for your answers -
Turns out it didn't have any warranty, even though they said it did. So they agreed to refund my money if I sent the machine back. It was basically a big cluster****.
-
-
Hi everyone,
I'm sorry for my bad English, I'm French...
I'm looking for this card too and I've seen that:
Nvidia Quadro FX 3700 FX 3700M MXM III for IBM W700 - eBay (item 320606083010 end time Nov-19-10 08:32:03 PST)
This card works in an IBM W700 but I've got a Clevo M570TU...
Some of you have HP cards; I've never seen an IBM card. Do they not work?
What are the differences between these 2 cards?
Do the cards fit the heatsink without any change to it?
Thanks -
Bargh!!! I've received the mail from notebook parts saying that they don't have any stock...
I guess I'm going to ask for my money back... What are the odds of them restocking fast? -
I would say slim to none. It's hard to imagine how they came by so many new pulled cards in the first place.
-
So I have an opportunity to buy an FX3700m for $240 for the NP9262. Would it be that much of an improvement over a single 8800m GTX? Or would I be better off opting for another 8800m GTX for an SLI configuration?
Without any modifications and just using the nvidia drivers, would I have issues with games?
Also, are there any differences between the HP and IBM cards, or will they both do the same thing? -
moral hazard Notebook Nobel Laureate
8800m GTX SLI > FX3700m.
You don't need any driver mods, every game I've tested worked perfectly.
IBM cards have less support than HP cards, some people reported they couldn't get their internal screen to show an image when using an IBM card. -
I know the post is older, but I just upgraded my Clevo M570RU-U's 8800m GTX to a Quadro FX 3700m and wanted to share that the card fits the heatsink perfectly, my 3dMark06 and Vantage scores exceed those with the 8800, and real-world gaming frame rates have been fine, at least for the games I play on my laptop.
I mention this as I tried to find a replacement 8800m GTX and discovered that used I would be lucky to get one for under $300, more likely over $400. Bottom line, I picked up my FX 3700M on eBay for $149, and the card is ranked above the 8800m GTX even on Notebookreview. So if you're looking for an upgrade at a cheap price you may give it a try (I checked today and there is more than one 3700m to be had for a lot less than an 8800m GTX or 9800m GT).
As for drivers, I tried the Clevo driver (from one of the 900 models) just for the FX 3700M and found that the card ran a little hot. I am having great results with the latest Nvidia drivers and the card idles under 40c.
Per Vantage, 3dMark and onscreen FPS displays while gaming, I can say the FX 3700m has been a good upgrade for my Clevo M570RU-U. The card is a perfect fit for the graphics heatsink enclosure, with the main memory banks in the right places (I destroyed an 8800 GTS trying to put one into the enclosure, as some cards have taller capacitors than others).
-
@Speedmonger
Did you need to remove the metal plate on the rear part of the card? If yes, how did you do it?
@Moral Hazard: which are your current "final & stable" clocks/driver?
@KingPzero: why is your sig reporting a humble 12,189pts?
Thanks to all.
(I am waiting for my FX3700M to arrive) -
Kingpinzero ROUND ONE,FIGHT! You Win!
You can easily break the 13.5k barrier in 3dmark06 using a quad core like niffcreature did, even at stock clocks. The cpu really matters in benchmarks; it doubles the score.
Speaking of the gpu score itself, it's a matter of luck to get an FX3700m to GTX280m clocks. Mine isn't so lucky and I can only slightly oc it, about a 50mhz bump on core and shaders, a bit more on memory.
That's the best score I achieved at the time, and prolly my X9100 was running at 3,6ghz. Higher than that the heat is too much, although I'm sure that cpu can hit 4ghz quite easily.
Best Modified Forceware Driver For Gaming Performance With A FX3700M Installed As A 280M?
Discussion in 'Sager and Clevo' started by Paralel, Oct 3, 2010.