The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia 8700M-GT discussion

    Discussion in 'Gaming (Software and Graphics Cards)' started by Gophn, Jun 30, 2007.

  1. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    The bandwidth increase does indeed make a difference; however, you're correct that the bandwidth increase is directly due to the fact that the memory is clocked 200MHz higher at stock. (Bandwidth = Bus Width * Memory Clock Speed)
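    (For anyone following along, a minimal sketch of that formula in Python; the clock figures are illustrative assumptions for GDDR3 effective data rates, not numbers taken from this thread.)

        # Minimal sketch of the bandwidth formula above. Clock values are
        # illustrative assumptions, not figures quoted in this thread.
        def memory_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
            """Bandwidth (GB/s) = bus width in bytes * effective memory clock in MHz / 1000."""
            return (bus_width_bits / 8) * effective_clock_mhz / 1000

        # Same 128-bit bus, memory clocked ~200MHz higher (effective):
        print(memory_bandwidth_gb_s(128, 1400))  # ~22.4 GB/s
        print(memory_bandwidth_gb_s(128, 1600))  # ~25.6 GB/s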
     
  2. maksin01

    maksin01 Notebook Deity

    Reputations:
    446
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    FRAPS!!! lol ;)
     
  3. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    I think most people still have the misconception that memory bus width/bandwidth
    is directly proportional to card performance in new-generation cards.
    With the new nvidia 8x00 unified architecture, this no longer holds true,
    so stop saying this is the #1 factor limiting the performance of the 8700m.

    As you can see, the 8800 Ultra desktop cards have a 384-bit memory bus,
    but in this AnandTech test they follow the same downward pattern in both DirectX 9 and 10 games as the 8700m, which has only a 128-bit bus:
    http://www.anandtech.com/video/showdoc.aspx?i=3029&p=5
    In this case, having a 384-bit memory bus does not prevent the 8800 Ultra from dropping drastically from 100+ fps to 30+ fps when moving to higher resolutions.

    What I am saying is that the memory bus is no longer the #1 factor affecting performance in the unified architecture.
    It's something else that needs to be optimized by Nvidia and ATI when using the unified architecture
    - which is where we'd like nVidia to fulfill their promise with better firmware/drivers!
    (It's not a simple matter of memory bus width!)
     
  4. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    It still is, for now at least; all current games are heavily reliant on video RAM bandwidth.

    Of course they all show the same downward pattern; they show that pattern because high resolutions in modern games are taxing even on a 384-bit bus. The pattern doesn't change based on the bus width, but the performance is clearly higher on the 8800, in large part due to its larger bus width.

    It's still the number one factor because of how games are coded. The architecture of the GPU doesn't have anything to do with it. There is no magical firmware/BIOS/driver fix; the performance limitation is in the hardware.
     
  5. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Study this more carefully before drawing that conclusion:
    http://www.anandtech.com/video/showdoc.aspx?i=3029&p=5

    100+ fps (800x600) to 30+ fps (1920x1200)

    That's a 70% drop even for the 384-bit bus of the 8800 Ultra, in both DirectX 9 and 10,
    and whether there is 4xAA or 0xAA.

    What I am saying is that the memory bus limitation is not the #1 reason for this 70% drop!
    It's the new-generation "unified architecture" that is not optimized for current-generation games!
    nVidia and ATI have to work with Microsoft and the game developers to optimize the game code and drivers!

    (If it were a simple matter of memory limitation, I would expect the 8800 Ultra's 384-bit bus to at least stay almost flat from 800x600 to 1280x1024 and only start the drastic downward trend at much higher resolutions,
    but it drops drastically even from the low resolutions of 800x600 to 1024x768, which a 7x00-series card can handle flat out!)


     
  6. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    It's a 70% drop in performance, but keep in mind that it's also a 380% increase in pixel count. Would you seriously expect anything different? Also, remember that Lost Planet is one of the most poorly optimized games ever. These performance drops are due to the increased pixel count, plain and simple.
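    (A quick sanity check of that arithmetic, as a sketch; the 100 fps / 30 fps figures are just the round numbers quoted above.)

        low_res = 800 * 600        # 480,000 pixels
        high_res = 1920 * 1200     # 2,304,000 pixels
        print((high_res - low_res) / low_res)   # 3.8 -> a 380% increase in pixel count
        # If frame rate scaled purely inversely with pixel count, 100 fps at 800x600
        # would become ~21 fps at 1920x1200; the quoted 30+ fps is actually a bit better.
        print(100 * low_res / high_res)         # ~20.8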
     
  7. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Yes, fully agreed with you about the huge pixel counts,

    But if it were a simple memory-bus limitation, for the 384-bit 8800 Ultra at least,
    we would expect the curve to stay somewhat gradual/flat at least in the low-resolution part,
    and only begin to drop drastically at higher resolutions, once its memory bandwidth limit is hit.

    That's simply not the case - it begins to drop drastically right from 800x600 (both 4xAA and 0xAA).

    In fact, if you look at the graphs carefully, ironically,
    the sharpest drop is in the lower-resolution part, with a more gradual drop towards the highest resolutions.

    Something else is inducing the performance drop even at the lowest resolution of just 800x600.

    I can only infer that it's due to non-optimization for the "unified execution architecture" of the new-generation cards in current-generation games,
    and not primarily due to memory bus limitations.


    (btw, you can look at the other games in the same link as well, via the menus at the bottom)

     
  8. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    There is a very simple explanation for it, and it's that even at very low resolutions the performance was bottlenecked by the bandwidth. There's nothing wrong with the card; it's simply that those games are incredibly poorly optimized. Check out the Bioshock results. For the 8800GTX, the scaling is as follows:

    (Resolution: High Settings | Medium Settings | Low Settings)

    1920x1200: ~45fps | ~53fps | ~81fps
    1600x1200: ~54fps | ~64fps | ~98fps
    1280x1024: ~76fps | ~91fps | ~138fps

    From this we see that (using 1280x1024 at high settings as the base) a 46% pixel increase results in roughly a 29% drop in frame rate, and a 76% pixel increase results in roughly a 41% drop - very close to the ~32% and ~43% that pure inverse scaling with pixel count would predict, i.e. nearly perfect scaling. Similar scaling is shown at lower settings (though to a lesser extent the further the settings are lowered).

    Edit: Sorry for not mentioning it, this is based on Legion Hardware's reviews using stock drivers (NOT the Bioshock BETA ones). Link is here:
    http://www.legionhardware.com/document.php?id=681&p=0
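    (For what it's worth, a small sketch of that scaling check, using the approximate high-settings figures quoted above; the "~" values make the percentages only approximate.)

        # Approximate high-settings Bioshock results quoted above (8800 GTX, stock drivers).
        results = {(1280, 1024): 76, (1600, 1200): 54, (1920, 1200): 45}  # resolution -> fps
        base_pixels, base_fps = 1280 * 1024, results[(1280, 1024)]
        for (w, h), fps in results.items():
            pixels = w * h
            print(f"{w}x{h}: +{pixels / base_pixels - 1:.0%} pixels, "
                  f"-{1 - fps / base_fps:.0%} fps "
                  f"(pure inverse scaling would predict -{1 - base_pixels / pixels:.0%})")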
     
  9. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Well... if most people fail to recognize the bigger problem beyond the memory bus limitation,
    nVidia and ATI will have an easy job...
    It's very obvious that at 800x600 with 0xAA, a 384-bit 8800 Ultra is not bottlenecked by bandwidth.

     
  10. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    There is no bigger problem (beyond the fact that those games are poorly optimized). And looking further into those benchmarks (and other benchmarks on Lost Planet), it appears that you're correct. It appears to be shader-bottlenecked because of the poorly optimized coding; that's why it levels out at higher resolutions (because the memory can keep up, and at higher resolutions the raw pixel count becomes more of a tax than the pixel processing).
     
  11. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Well, if you see my other post,
    the architecture has changed drastically between pre-8x00 series cards and post-8x00 cards....
    It's not simply a game of memory bandwidth anymore, as most people still primarily assume.
    A lot more areas can be, and have yet to be, optimized or are limiting factors: yes, the game code,
    but also drivers/firmware from ATI and nVidia, and Microsoft's code!
     
  12. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Did you see my post on Bioshock? The 8800 Ultra appears to scale nearly perfectly (at high settings, no less) with pixel count, indicating that the bottleneck is indeed memory bandwidth. Yes, the shading calls can certainly be optimized further in the API and in how the game engines use it, but memory bandwidth is still the number one bottleneck to playing at high resolutions.
     
  13. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Most people play at 1024x768, 1280x800, 1440x900 or 1680x1050, where other factors are more significant than the memory bus.
    If the memory bus were that important, nVidia and ATI could have more easily spent their main R&D on improving memory bus limitations, rather than pouring huge resources into completely overhauling the architecture around unified execution.
    Or ATI could have easily implemented eDRAM (like they did years ago in the Xbox 360, giving 256 GB/s of memory bandwidth) in their next-gen cards instead of unified shaders and beaten the hell out of nVidia!



     
  14. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    You're absolutely right; however, most people have mid-range GPUs as well, so memory bandwidth is still important, even at lower resolutions. The memory bandwidth limitation is not any fault of the GPU makers', and FWIW they are indeed trying to overcome it, precisely with the new shader architecture. The general idea is to make "prettier" gameplay using less memory, thus lessening the dependence on memory bandwidth. However (my guess is that) it will be a while before we see the fruits of that in mainstream PC gaming.
     
  15. dcp

    dcp Notebook Consultant

    Reputations:
    40
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    The Radeon 2900's memory interface is doubled from the previous generation (512-bit, plus all those tweaks they have made to the internal bus; http://www.anandtech.com/video/showdoc.aspx?i=2988&p=9), and nVIDIA has also beefed theirs up to 384-bit. So I guess memory bandwidth does mean something, or else why bother, right?
     
  16. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    What puzzles me is that ATI already had the technology years ago to solve the memory bandwidth issue with its Xbox 360 chip, whose eDRAM effectively gives 256 GB/s of bandwidth - way more than the Radeon 2900's 512-bit bus can offer....
    There must be other factors more significant than memory...

     
  17. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
  18. Mimino

    Mimino Notebook Communist

    Reputations:
    1,181
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    so what? doesn't mean it's even near to what 7950 represents...
     
  19. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Whatever it means to an 8600m GT owner...
    vs
    Whatever it means to a 7950 owner...

    Clearly, I am amazed that an OC'ed 8600m GT can achieve a benchmark on par with a 7950.
    Well done.

     
  20. Mimino

    Mimino Notebook Communist

    Reputations:
    1,181
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    i'm a 8600 owner and it doesn't mean jack **** to me ;)
     
  21. planet

    planet Notebook Evangelist

    Reputations:
    6
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    30
    Seems to me that if you could OC like he/she did, your performance would surely increase.... and he maintained it at around 80C, which is quite stable...
     
  22. Mimino

    Mimino Notebook Communist

    Reputations:
    1,181
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    yeah, well, i don't think an average increase of 5 fps is worth it, but again, that's just me.
     
  23. Kwakkel

    Kwakkel Weirdo

    Reputations:
    222
    Messages:
    791
    Likes Received:
    0
    Trophy Points:
    30
    depends on how much FPS you're getting :)
    from 10 to 15 FPS is a big jump
    from 50 to 55 is a lot less significant :)
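    (A one-liner to make the arithmetic explicit - the same +5 fps is a 50% gain at 10 fps but only a 10% gain at 50 fps:)

        for base in (10, 50):
            print(f"{base} -> {base + 5} fps: +{5 / base:.0%}")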
     
  24. Mimino

    Mimino Notebook Communist

    Reputations:
    1,181
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    well, agreed, but you probably won't see such a big jump from 10 to 15 fps; if it's that bad, it's bad... unless you just overclock it to the max
     
  25. chrous25

    chrous25 Notebook Consultant

    Reputations:
    19
    Messages:
    231
    Likes Received:
    4
    Trophy Points:
    31
    Well, I just got my M570RU with the 512 MB 8700M GT and got 5384 in 3DMark06, all stock on the video card, on a fresh XP Pro install.

    I like it.
     
  26. chrous25

    chrous25 Notebook Consultant

    Reputations:
    19
    Messages:
    231
    Likes Received:
    4
    Trophy Points:
    31
    I will do one at 1280x1024; the default was 1280x768.
     
  27. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    What resolution screen do you have?
     
  28. chrous25

    chrous25 Notebook Consultant

    Reputations:
    19
    Messages:
    231
    Likes Received:
    4
    Trophy Points:
    31
    1440x900. Did one last night and got 47xx at 1280x1024; I'm in the norm with my setup :)
     
  29. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    If you have a 1440x900 screen, it's physically impossible to run the test at the default 1280x1024, as your screen doesn't have enough vertical pixels. If you have the free version, you will only be able to run at 1280x800 (or 768); however, if you have the professional version, try running it at 1440x900, as that produces a very similar score to 1280x1024.
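    (The scores end up similar because the total pixel counts are nearly identical - a minimal check:)

        print(1280 * 1024)  # 1,310,720 pixels
        print(1440 * 900)   # 1,296,000 pixels, only ~1% fewer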
     
  30. BLuDKLoT

    BLuDKLoT Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    Hi guys,

    I have been reading these posts trying to figure out what to do about my current situation, but I am lost and hoping you might help me decide. I just bought a new m9750 and I spared no expense; it is maxed out. I chose the dual 512MB 8700M GTs and I think I may have made a mistake. Five minutes after I took the thing out of the box, and before I installed any games or made any mods to the factory config, I received my first "Video Hardware Failure" message. I thought it may have been a fluke, so I went ahead and began the install of some of my games, including Crysis, COD4:MW and R6V, to try this baby out and get some glorious gaming...... Well, after the Crysis install I let Crysis choose the optimal settings, put the resolution at 1600x1200 (other settings on high) and began to load the game. As soon as the game loaded and I was in freefall heading towards the water, I could tell that this notebook wasn't going to be able to handle that game. The gameplay itself was totally choppy and absolutely horrid. If I made too quick a mouse movement, the game would freeze. I sometimes had crashes to the desktop, or the screen would just go black and whatever audio got captured at the time would repeat like a broken record. I tried to fix these issues by adjusting the res and setting the settings to med/low, but to no avail. The game remained unplayable due to chop, stuttering graphics and just overall complete failure. It was a real bummer. I tried my other games and did not have any better results. COD4 played as Crysis did initially, but with lowered settings it was playable; the graphics were so horrible, though, that I didn't want to play because it looked like crap.
    My other games like BF2 seemed to play fine with all the settings maxed out, but that was the only game I had any luck with. I bought the 9750 because it was advertised on the Alienware site and on the Crysis site as the notebook designed to handle the new DX10 technology and the Crysis engine, but that couldn't be farther from the truth. After troubleshooting with Alienware for a week trying to get this fixed, they finally had me send it to them for repair. Before sending it back to them I tried different drivers, I tried installing XP Pro instead of Vista, and every other typical troubleshooting method, but nothing worked. Well, Alienware in their infinite wisdom received my notebook on the 12th, reinstalled the OS, and sent my notebook back to me. It's due to arrive tomorrow, but I know for a fact it's going to have the same issues it had prior to me sending it to them, and now I am not sure what to do. It began to fail right out of the box! They've set it up just as it was when I got it the first time, and I'm really pissed. If I still experience these gameplay issues I could return the notebook for a refund and move on. Or I could send it to them and try to have the 7950 video cards installed. Or I could maybe ask to replace the RAM, mobo, etc... but I really don't know what the issue is, and it appears they don't either. I really want to keep my m9750, but I want it to perform as advertised, and as all the reviews raved it would.

    What are your suggestions? What do you think the problem could be? What can I do to get this notebook to be the gamer it's supposed to be? Please help me if you can. As of tomorrow I have 4 days to decide on how to handle this. I really want to keep it, but it just can't handle the newer games that are being released, at least the one I had couldn't. Do you think I may have just had some bad luck and got a lemon? Should I have them send me another identical system (if they will) or should I cut my losses and wait for the technology to get unf**ked? --- Here is my system:

    Manufacturer: Alienware m9750
    Processor: Intel(R) Core(TM)2 CPU T7600 @ 2.33GHz (2 CPUs), ~2.3GHz
    Memory: 4GB RAM
    Hard Drive: 160 GB Extreme Performance RAID (80GB x 2)
    Video Card: Dual 512MB NVIDIA GeForce 8700M GT
    Monitor: 17" WideUXGA 1920 x 1200 LCD with Clearview Technology
    Sound Card: SB Xtreme Audio Notebook X-Fi (Vista)
    Speakers/Headphones: SENNHEISER PC166USB Gamer Headphones
    Keyboard: Alienware Standard
    Mouse: Logitech G-5 / Saitek X-52
    Mouse Surface: Acrylic ICE precision surface, optical/laser
    Operating System: Windows Vista™

    Any help would be really appreciated. I'm stuck! (Maybe I should send it back and wait for the M17x?)

    Thank you.
     
  31. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    What graphics driver version were you using?
     
  32. BLuDKLoT

    BLuDKLoT Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    I tried the drivers that came preinstalled (7.15.11.5672) and the Xtreme G drivers from tweakforce.com; neither helped, I had the same issues. I just got it back tonight from AW tech repair and it still isn't fixed. All they did was reinstall the OS, lol. I had a page of error logs and it wasn't the OS. I was getting hardware failure errors, amongst the blue screen display driver crashes, etc... I dunno, it's boxed and ready for return. I emailed AW about the m17x coming out and they sent me this link: http://www.alienware.com/intro_pages/m17x_m15x.aspx
    and said that it would be out in Jan-Feb, so I will wait and buy the m17x. Thanks though...
     
  33. Soulburner

    Soulburner Notebook Evangelist

    Reputations:
    51
    Messages:
    399
    Likes Received:
    0
    Trophy Points:
    30
    I wish they offered this on the 1520/1720...

    Any way to find one and install it in place of the 8600m? Same size, etc.
     
  34. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Not at the moment, no.
     
  35. johnIdol

    johnIdol Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Hi,

    I am going to overclock the standard dual 8700m for the first time.
    Does anyone have an idea of a stable OC speed? I've been reading about people OCing at 720 core / 1.2 GHz memory. Any help, hint, rumour or link would be much appreciated.

    I am running the 169.04 drivers (best ones so far) on a 2.4GHz CPU with 2 GB of RAM.

    Cheers,

    JI
     
  36. Samot

    Samot Notebook Evangelist

    Reputations:
    224
    Messages:
    610
    Likes Received:
    334
    Trophy Points:
    76
    You will have to test it yourself... each card has different overclocking capabilities.... While my 8700 is stable at 750/1850/1000, it doesn't mean yours will be... it could withstand higher clocks or lower ones.
     
  37. johnIdol

    johnIdol Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Thanks for helping. I just needed some values to get an idea of a possible range for my tests.

    I am using RivaTuner 2.06, and before overclocking I want to make sure I am able to track GPU temperature. RivaTuner detects two GPU cores but the temperature shown is always the same for both (it changes, but they are always equal), so I am worried it's picking up one temperature and displaying it for both cores...

    Also, I am not sure what is going to happen if I OC with RivaTuner: does anyone know if both cards are going to be overclocked or just one?

    Cheers,

    JI
     
  38. johnIdol

    johnIdol Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I disabled SLI and the temperature of both still comes up the same. This definitely means RivaTuner is displaying the same value for both GPU cores.
     
  39. Samot

    Samot Notebook Evangelist

    Reputations:
    224
    Messages:
    610
    Likes Received:
    334
    Trophy Points:
    76
    Sorry, I can't help you with SLI...

    From my experience, you should begin by overclocking the memory or the shaders.... Run a few loops of 3DMark06; if it doesn't freeze or start displaying artifacts, raise the clocks and run more 3DMark loops... once things start going wrong, ease your clocks back a little bit... then move to the shaders (if you started with the memory - and make sure you leave the memory clocks overclocked) and repeat the whole thing... then, at last, overclock the core clock. Usually you would start with the core, but if you're planning on playing Crysis this is the best way, because Crysis is a shader-intensive game. It's better to have the shader and memory clocks higher even if you have to lower the core clock.
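    (A rough sketch of that stepping procedure in Python-style pseudocode; set_clock and run_stability_test are hypothetical placeholders for the manual RivaTuner adjustment and the 3DMark06 loops - the tools themselves have no scripting API like this.)

        STEP_MHZ = 25  # raise each clock domain in small increments

        def find_stable_clock(domain, stock_mhz, set_clock, run_stability_test):
            """Raise one clock domain until artifacts/freezes appear, keep the last stable value."""
            clock = stock_mhz
            while True:
                candidate = clock + STEP_MHZ
                set_clock(domain, candidate)      # e.g. move the slider in RivaTuner
                if not run_stability_test():      # e.g. a few loops of 3DMark06
                    set_clock(domain, clock)      # ease back to the last stable clock
                    return clock
                clock = candidate

        # Order suggested above for a shader-heavy game like Crysis:
        # memory first, then shaders (keeping the memory OC applied), core last.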
     
  40. johnIdol

    johnIdol Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Thanks for sharing your experience.
    I've been posting on the RivaTuner forums asking for SLI help, but all they can say is that RivaTuner doesn't support laptops ( http://forums.guru3d.com/showthread.php?t=248722 ).
     
  41. Gophn

    Gophn NBR Resident Assistant

    Reputations:
    4,843
    Messages:
    15,707
    Likes Received:
    3
    Trophy Points:
    456
    It does support it.... but the programmers of RivaTuner just don't want to officially support it themselves. That's all.

    Many of the Sager 9260/9261 owners with SLI 8700M GTs have OC'ed them fine and use RivaTuner to monitor the cards' temps.
     
  42. johnIdol

    johnIdol Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Thanks for the info.

    It sounded strange indeed: RivaTuner comes with an additional tool for the XPS M1730's Logitech LCD; as far as I know, the XPS M1730 is the only machine with an embedded Logitech LCD, and it comes ONLY with dual 8700m cards (256MB each).

    So, if they bothered implementing a tool for the Logitech LCD specifically for the 1730, why the hell is RivaTuner not reporting the right temperature value for the two GPUs on dual 8700m machines?

    Can anyone tell whether the two GPU temperatures track each other exactly (always the same) on the Sager, or whether they can differ from each other?
    I am mainly trying to understand, before OCing, whether it is normal to always see the same temp for both.

    Thanks!

    JI
     
  43. Gophn

    Gophn NBR Resident Assistant

    Reputations:
    4,843
    Messages:
    15,707
    Likes Received:
    3
    Trophy Points:
    456
  44. Audigy

    Audigy Notebook Evangelist

    Reputations:
    734
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
    Tried 169.28 drivers yesterday on my 8700M GT, fantastic drivers so far...

    Noticed some performance improvements in NFS: ProStreet, UT3, and Crysis.

    Some OC results, 760/1800/1000 at 1280x1024:

    [benchmark screenshot not archived]

    It means nothing but it's nice to see...

    ;)
     
  45. johnIdol

    johnIdol Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Thanks for the info.
    I already used your tutorial to set up temp monitoring: NICE :)
     
  46. kneehighspy

    kneehighspy Notebook Enthusiast

    Reputations:
    0
    Messages:
    27
    Likes Received:
    0
    Trophy Points:
    5
    I've installed the latest drivers (169.25) and my 3DMarks are about the same (well, about 1000 points higher), but for some reason 3DMark06 shows:
    Primary Device: NVIDIA GeForce 8700M GT
    Linked Display Adapters: false

    I do have SLI enabled via the nVidia control panel, and the system config shows both cards enabled, but 3DMark06 doesn't see the cards as linked in the system details.



     
  47. kneehighspy

    kneehighspy Notebook Enthusiast

    Reputations:
    0
    Messages:
    27
    Likes Received:
    0
    Trophy Points:
    5
    Does anyone else's 3DMark06 show that their adapters are linked?
     
  48. Arthur77

    Arthur77 Notebook Enthusiast

    Reputations:
    8
    Messages:
    18
    Likes Received:
    0
    Trophy Points:
    5
    Audigy, please could you bench at default frequencies? (Because I can't find any benchmarks on the net with 169.28 :D)

    thx ;)

    Bump? :D

    It's very important for me.
     
  49. bioshock

    bioshock Notebook Enthusiast

    Reputations:
    0
    Messages:
    18
    Likes Received:
    0
    Trophy Points:
    5
    I play Crysis at 1920x1200 resolution on my m1730 with 8700m cards (medium-high, DX9), and it's sick!! I think the 8700m GT is a great card, and in SLI mode it's just as good as a single 7950 GTX Go, or even better..
     
  50. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Yeah I agree they are great cards and especially in SLI. With the new 171.16 drivers I got even more boost for my SLI setup :) Now overclocked to 729/1458/900.
     