The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Help with Quadro FX 3700M card in NP9262

    Discussion in 'Sager and Clevo' started by dpmitchell, Oct 15, 2009.

  1. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    I bought a Quadro FX 3700M from eBay. I just received it.

    Okay - here's the problem. I removed the fan, and then the heatsink from the 9800M GTX (4 screws on the heatsink).

    I took out the 9800M GTX (2 screws), and slid the 3700M into the slot, then bolted that down (2 screws).

    Problem - when I tried to screw the heatsink back onto the 3700M, the 4 screw holes were too large for the screws that are used in the heatsink. Just for diagnostic purposes, I "placed" the heatsink in the proper position, screwed the fan back into place, replaced the bottom cover, and booted the system.

    It started up with the following message (something along these lines) - nVIDIA, Bios 62.92.3b.00.00 - Engineering Sample Not For Production.

    I hit F2 for system setup, and it identified that the Graphics module was nVIDIA Quadro FX 3700M.

    I exited, and allowed startup to continue - the windows logo came up (very large, screen resolution had not been set), and then the computer turned off.


    What is the cause? I have had the computer turn off before when my heatsink wasn't properly bolted down to the MXM card. Is this the cause? Or is it something having to do with the card being defective? If it is because of the inability to bolt the heatsink to the screw holes on the 3700M, what are my options (the holes on the heatsink don't look like they will take much bigger screws). Is there a different heatsink I would need to buy, and if so what?

    Thoughts? Thanks in advance for all of your suggestions!
     
  2. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Ok, nevermind, I guess I can just take the threaded spacer from my 9800M and put it on the 3700M....duh!

    Still curious about this whole "Engineering Sample" business....is this thing going to work like a 3700M, or is it going to have all kinds of problems?
     
  3. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    [Still curious about this whole "Engineering Sample" business]

    Bought it off eBay? You got an ES version, not a production version.

    As far as the turning off, could be a few things, but to rule out any driver issues, just boot into safe mode and see if you make it to the desktop.
     
  4. electrosoft

    electrosoft Perpetualist Matrixist

    Reputations:
    2,766
    Messages:
    4,116
    Likes Received:
    3,967
    Trophy Points:
    331
    Some ES GPU units have faulty temp sensors, in that they report incorrectly to the system, so it can't manage proper cooling, even with an OEM/production-level BIOS flash.

    Basically, hit or miss. Install it, put it under load and see if your system responds properly.
     
  5. kaltmond

    kaltmond Clepple

    Reputations:
    699
    Messages:
    1,454
    Likes Received:
    1
    Trophy Points:
    56
    If so, is there any way to solve this problem, like disabling the sensors, etc.?
     
  6. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Thanks guys. It's in, and it stopped turning the computer off. I guess it was just an issue of the heatsink not being bolted down.

    I replaced the threaded spacer with the one from the 9800M GTX, then applied the thermal foil from the 9800M GTX to the 3700M (the stuff Clevo puts on the GPU). I put the card in the slot and bolted it down. I then applied Arctic Silver 5 and screwed the heatsink down.

    The computer loaded the OS, installed the GPU, and updated the drivers. Everything seems to be working ok.

    Here is the GPU-Z read-out. I don't know a whole lot about how it's supposed to look. Can anyone tell me if everything looks kosher?

    [IMG]

    [IMG]

    Thanks guys/gals!
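
    A small cross-check, not from the thread: you can also ask Windows itself what adapter and driver it thinks is installed and compare that against the GPU-Z readout. A minimal Python sketch, assuming the wmic tool that ships with Vista and a Python install to drive it (the script itself is illustrative, not a known utility):

    import subprocess

    # Ask WMI for the display adapter as Windows sees it:
    # name, driver version, and the VRAM it reports.
    result = subprocess.run(
        ["wmic", "path", "win32_VideoController",
         "get", "Name,DriverVersion,AdapterRAM"],
        capture_output=True, text=True, check=True)
    print(result.stdout)

    If the name comes back as the Quadro FX 3700M with the driver you just installed, the swap is being picked up the same way GPU-Z shows it.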
     
  7. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    Looks good to me man.. 49 degrees is decently cool, too, though I suppose it was almost idle at that point.
     
  8. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Not really - I had it running for a couple of hours prior to screen-cap'ing the GPU-Z, and was currently rendering a scene with 3ds max, with Photoshop CS4 open in the background, and several Google Chrome browser tabs open. Not pushing it as hard as I have in the past, but that's a good example of the typical load I'm running at any given time.

    BTW, an hour or so later now, and still using 3ds max, and temp is still 49 C

    Thanks for the input
     
  9. kaltmond

    kaltmond Clepple

    Reputations:
    699
    Messages:
    1,454
    Likes Received:
    1
    Trophy Points:
    56
    Try to get the GPU temp up to 75~80°C and see if the system will shut down.
     
  10. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    How would I go about making the GPU heat up to that temp? Even now, it's been on all day, and I'm running 3ds max, and it's at 44 C.

    80 is another 36 degrees C...what do I have to do?
     
  11. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Ok, one thing I am noticing is when I take the notebook home, I usually enter sleep mode (keeping my programs running in sleep mode), so that when I open the notebook again the programs are still up and running.

    Well now, when it attempts to resume from sleep mode, it restarts.

    Any thoughts?
     
  12. kaltmond

    kaltmond Clepple

    Reputations:
    699
    Messages:
    1,454
    Likes Received:
    1
    Trophy Points:
    56
    Try running some GPU benchmarks like 3DMark, FurMark, etc.

    Can you save your V-BIOS and send it to me? :)
     
  13. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    Furmark will heat that sucker up real nice :)
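
    If you want a log of the temperature curve while FurMark runs, so the shutdown point gets captured, here is a minimal sketch (not from the thread) using the pynvml bindings from the nvidia-ml-py package. It assumes a driver new enough to expose NVML, which the period drivers discussed here may not provide:

    # temp_log.py -- poll the first GPU's core temperature every few seconds.
    import time
    from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                        nvmlDeviceGetTemperature, NVML_TEMPERATURE_GPU)

    SUSPECT_TEMP = 75  # the thread suggests the card dies somewhere around 75-80 C

    nvmlInit()
    handle = nvmlDeviceGetHandleByIndex(0)  # first (and only) GPU in the NP9262
    try:
        while True:
            temp = nvmlDeviceGetTemperature(handle, NVML_TEMPERATURE_GPU)
            note = "  <-- in the suspect range" if temp >= SUSPECT_TEMP else ""
            print(f"{time.strftime('%H:%M:%S')}  GPU core: {temp} C{note}")
            time.sleep(5)
    except KeyboardInterrupt:
        pass  # Ctrl+C stops the logger
    finally:
        nvmlShutdown()

    Run it in a second console window while the stress test is going; the last line logged before a crash shows roughly where the card gave up.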
     
  14. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    Prolly drivers.. Which are you running? This is from (HP) 182.67 Rev A (and I know you don't have a 5800, but seeing this makes me think it's driver related)

    "- Resolves an issue that occurs on systems with an NVIDIA Quadro FX 5800 Graphics Card, where entering Sleep mode causes the system to stop responding (hang) or to restart"
     
  15. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    The most intensive graphics test I have seen so far is the one from OCCT.
    It will push a 280M up to 100°C on stock clocks and also check for errors. It helps a lot in finding your best stable overclock.
     
  16. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Thanks guys, I am downloading furmark and 3d mark right now, and I will run them shortly and post results.

    For the record though, why does one want to overclock a GPU? What benefit does one get from knowing the max stable temp of the GPU?

    As I said before, I bought the 3700M primarily for use with 3ds Max. When my scenes would get complex (massive numbers of polygons), the viewports would hiccup and hang in shaded mode (even crash the system) as I tried to rotate or zoom, and even in wireframe mode once my poly count got high enough.

    I noticed that nVidia published a 3ds Max performance driver, which only works with Quadro cards, so that was the reason for the switch. I put the Quadro in yesterday and installed the 3ds Max performance driver. Running in OpenGL, I noticed a slight improvement, but was somewhat disappointed with the "improvement" advertised by nVidia - it definitely wasn't the 100% improvement in viewport performance that nVidia claimed. I assumed that was because the 9800M GTX is a pretty good card as well.

    HOWEVER - I was poking around in the viewport settings on max, and I noticed that under driver options in the viewport tab, a new option was available. "3ds Max Performance Driver D3D - nVidia corporation - Quadro FX 3700M" - I selected it, restarted max and WOW!!!!!!!!!!!!!!!!!!!!!!!!!

    I have never had one single change make such a dramatic difference in the performance of a piece of software. The jump in performance was more significant than what upgrading from a dual core to a quad core did for rendering, or what going from x86 to x64 did for scene size. I can scroll the viewports with ease in shaded mode with multi-million-poly scenes - no problem at all. Not one crash since switching drivers. Holy ******! Awesome!
     
  17. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    Fantastic! Glad you got your money's worth then!

    As far as overclocking, it just increases performance.. A simple way of thinking about it is just increasing the clock rate, thus over-"clocking". If you had an old Pentium at 333 MHz and you clocked it to 433 MHz, it would run faster.. Same idea.

    It takes testing though: overclocking and running for long periods of time increases temperatures and can exceed the manufacturer's temp ratings, causing "very bad things." You might also notice artifacts and all-around weird sh*! showing up on screen. Thus the trial and error with the overclocking settings, temps, etc., to find a stable combination.
     
  18. kaltmond

    kaltmond Clepple

    Reputations:
    699
    Messages:
    1,454
    Likes Received:
    1
    Trophy Points:
    56
    That is what the Quadro is made for. The high price is reasonable.
     
  19. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Ok, here are the results. Ran Furmark for Benchmarking, and for a Stability Test.

    Stability Test ran for approx. 300 seconds, stabilized at about 75 degrees C for several seconds, and then the system crashed.

    Below are the results of the Benchmarking test:

    [IMG]
    [IMG]
    [IMG]
    [IMG]

    The GPU returned to an idle temp of 38 degrees C within about 2 minutes. The idle temp is lower than yesterday I think because I'm using my cooling pad.

    Anyway, I think this confirms what kaltmond was saying, that this card can't handle temps near 80. Is this a problem for my use with 3ds Max? My temps stay below 50 from what I can tell... what can I do, given the info above?

    Also, why isn't the full 6 GB of RAM showing? Am I missing something?

    Thanks guys!
     
  20. Opteron

    Opteron Notebook Evangelist

    Reputations:
    91
    Messages:
    418
    Likes Received:
    5
    Trophy Points:
    31
    On Vista 32-bit Service Pack 1, if you have 6 GB installed it will show only 4 GB installed. I believe if you have Service Pack 2 installed it will show 6 GB installed, but it will not use the extra 2 GB unless you upgrade to 64-bit. Hope that made sense.
     
  21. BrandonSi

    BrandonSi Notebook Savant

    Reputations:
    571
    Messages:
    1,444
    Likes Received:
    0
    Trophy Points:
    55
    I would be pretty surprised if 3ds max got the card near the furmark temps. I think you're good, but now you know the upper limits, and can act accordingly.

    If you're not running x64 you're not going to be using all of your memory anyway, so I think you should be more concerned about that, than the memory count not being accurate. :D
     
  22. dpmitchell

    dpmitchell Notebook Enthusiast

    Reputations:
    10
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    I am running 64-bit Vista. Knowing that, any thoughts on why it's not reporting 6 GB?

    Could it be that FurMark is a 32-bit program? Even then, why is GPU Caps only showing 4?

    Frustrating. I'm wondering whether I'm getting the use out of the full 6 GB. I recently upgraded from 4 GB DDR2-800 MHz (2x2) to 6 GB (4+2) DDR2-800 MHz, and my system performance doesn't seem any different at all (even the RAM widget on the desktop is telling me that my RAM usage is about the same, percentage-wise, as before I added the extra RAM).
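
    As a sanity check, not from the thread: you can ask Windows directly how much physical RAM it sees and how much is in use, independent of what the widget or the GPU tools report. A minimal Python sketch using the standard Win32 GlobalMemoryStatusEx call via ctypes (the script itself is illustrative):

    import ctypes

    # Mirror of the Win32 MEMORYSTATUSEX structure.
    class MEMORYSTATUSEX(ctypes.Structure):
        _fields_ = [
            ("dwLength", ctypes.c_ulong),
            ("dwMemoryLoad", ctypes.c_ulong),
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),
            ("ullTotalPageFile", ctypes.c_ulonglong),
            ("ullAvailPageFile", ctypes.c_ulonglong),
            ("ullTotalVirtual", ctypes.c_ulonglong),
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]

    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

    print("Physical RAM seen by Windows: %.1f GB" % (status.ullTotalPhys / 2**30))
    print("Currently in use: %d%%" % status.dwMemoryLoad)

    If this reports roughly 6 GB, the OS sees both sticks and the benchmark tools are just reporting something else; if it reports roughly 4 GB, re-seating or testing the sticks (memtest, as suggested below) is the next step.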
     
  23. slawomir

    slawomir Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Hi. I have the same card and system as you, and I have a problem with drivers. Can you help me somehow? I mean I can't download the proper one (the 3ds Max performance driver). Regards
     
  24. slawomir

    slawomir Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    OK, I found it. Thanks.
     
  25. Opteron

    Opteron Notebook Evangelist

    Reputations:
    91
    Messages:
    418
    Likes Received:
    5
    Trophy Points:
    31
    dpmitchell, the only thing I can think of is that both sticks have to be the same. You can also remove the 4 GB stick and see if the 2 GB stick is being seen.

    Also run memtest.... link to memtest

    Put it on a CD as an ISO.
     
  26. ryanshengqi

    ryanshengqi Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Er, I also have an FX 3700M, and it also shuts down when the GPU goes over 75.

    That means the GPU is powerful and can play games with high performance.

    But only for 10 minutes. What a pity!!!

    Is there any way to solve this problem?
     
  27. ingegneredentro

    ingegneredentro Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Hi kaltmond, have you resolved the problem of the FX 3700M where, once it reaches 76°, the computer shuts down? I have flashed my FX 3700M's 62.92.3b.00.00 BIOS with version 62.92.51.00.05, but the new BIOS is not compatible with my VGA and I see some artifacts... What do I do? I bought the VGA on eBay. Help me please!!! Send me a private message at [email protected]. I await your answer! Many thanks!!!
     
  28. ryanshengqi

    ryanshengqi Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    I am in the same situation as you. The FX 3700M cannot go over 75.
    By the way, when you flashed the V-BIOS, was the temperature problem solved?