The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    GeForce GTX 280M and 260M to launch at CeBIT

    Discussion in 'Gaming (Software and Graphics Cards)' started by ichime, Feb 24, 2009.

  1. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Regarding the Vantage scores, what are you guys getting at stock GPU clocks?

    Also, regarding the SM3.0 scores in 3DMark06: they are influenced by the clock speed of the CPU cores, not the number of them. This is why an overclocked X9100 system has higher SM2.0 and SM3.0 scores than the QX9300 system (the X9100 has a higher clock speed, the QX has more cores).
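
    To illustrate the scaling (a toy model of my own, not FutureMark's actual scoring formula, and the base numbers are made up):

[CODE]
# Toy model (my illustration, not FutureMark's formula): if the SM2.0/SM3.0
# graphics tests are bottlenecked by single-threaded per-frame work, the
# score tracks CPU clock speed and extra cores go unused.
def sm_score(base_score, base_clock_ghz, clock_ghz, cores):
    """Scale a known score by the clock ratio; core count doesn't enter."""
    return base_score * (clock_ghz / base_clock_ghz)

# Hypothetical baseline of 2800 at 3.06 GHz (numbers invented for the demo):
print(sm_score(2800, 3.06, 3.50, cores=2))  # X9100 OC'd: ~3203, clock wins
print(sm_score(2800, 3.06, 2.53, cores=4))  # QX9300: ~2315, cores don't help
[/CODE]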

    Personally, I'd say there isn't any one game that would constitute the best benchmark, because within the same performance bracket (or price bracket) one card does well in one game while the other does better in another. That said, an average across popular games would be a better assessment.
     
  2. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    GTA IV is a good one, but like WiC it's CPU-limited, probably even more so than WiC.

    Big Bang 3 maybe?
     
  3. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    which crysis?
    and if it's warhead.... which benching tool are you using?

    yes it is. i love that little program. simple, and gives very accurate results without having to wait 10 minutes each run...lol
     
  4. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I have regular Crysis, and I've been using Guru3D's benchmark tool.
     
  5. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    i have the WiC demo
    gta4 is looking quite a bit better after the second real patch.. :)

    Jlbrightbill
    scores sent.

    crysis: and have you tried the benchmark_gpu.bat file as well?
     
  6. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Setting everything in Crysis to High @ 1680x1050, no AA, and using benchmark_gpu.bat, I got 41.5 FPS average over the 3 runs. Windows 7 takes a massive deuce on my FPS in DX10 Crysis, unfortunately; I get 46 FPS on the same test in XP Pro.
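
    Quick arithmetic on those two numbers (a sketch of my own, nothing official):

[CODE]
# The Windows 7 DX10 penalty vs. XP Pro on the same benchmark_gpu.bat run,
# using the FPS figures quoted above.
win7_fps, xp_fps = 41.5, 46.0
penalty = (1 - win7_fps / xp_fps) * 100
print(f"Windows 7 is ~{penalty:.0f}% slower here")  # ~10% slower
[/CODE]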
     
  7. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    i already know what you're getting at here...lol
    seems our teammate has the best stock score of about 5k single-card 4850, with an overclock score of about 5600/5700

    im running one now...ill let cha know in a bit...

    hummmm, big bang 3 huh.....could be.

    side note:
    im still trying to see how it can be pulled off. i know i can pull about 18-19k with 112 SPs on an overclocked system (mobile), and with these only running 128... it would have been better to run 240 to 256 SPs, keep the clocks normal, and let the user have at it with overclocking in the end (speculation on my part for sure). that 128 against 1600, then running AA... could be a serious problem for nvidia, but if these cuda shaders overcome this... then it could be the other way around....
     
  8. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    The FurMark benchmark looks like a donut that rolled away and was forgotten for months, but it's still pretty cool.
     
  9. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    yeah, this is true...but that program is like 99 percent accurate and hard to fake. :)
     
  10. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    I hope, I really, really hope, that the 2XXM cards by Nvidia are not just re-brandings of the current 9 series with a bit more sparkles and stamps on them... I'd really love to see something more "in tune" with the GT200 series of chips from Nvidia. Now that would be amazingly cool to have, no matter how high the power consumption was, as long as they performed very, very well.

    If the new 2XXM series by Nvidia is compatible (I really doubt it) with my oldy Sager NP9262 and impressive performance-wise, I might consider saving some money and upgrading my 8800M GTX's; if not, well, I would not.

    Off Topic:

    It's been so long since I last visited NBR. I might share some new-and-reloaded OC results from my 8800M GTX's (which perform better now with new drivers and some Windows Registry housekeeping) for comparison against the new 2XX series, though it would be nothing more than speculation (since there is really little information about the cards right now) and might serve an informative purpose rather than a comparative one.
     
  11. cathy

    cathy Notebook Evangelist

    Reputations:
    47
    Messages:
    551
    Likes Received:
    0
    Trophy Points:
    30
    Sorry, but you're out of luck.

    Seems like the ATI 4800 series will be the king for now. :p
     
  12. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Yeah, kinda out of luck.

    Some competence from ATI would not do any harm, and would force Nvidia to decrease their overpriced products' prices.

    Other than that, it might really be that the new G200M chips are based on the GT215 core on a 40 nm process.
     
  13. cathy

    cathy Notebook Evangelist

    Reputations:
    47
    Messages:
    551
    Likes Received:
    0
    Trophy Points:
    30
    Competence or competition? Those are two totally different words with two totally different meanings. :)

    And yeaaaa, you would need lots of luck for that to come true. Sorry, but I'm a bit of an ATI fangirl. :p
     
  14. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    More FUD from Fudzilla (haha, see what I did there?):

    possible GTX 260M specs

    possible GTX 160M specs

    And this little rumor is, I think, why nVidia can win with older technology.

    So, if these specs are accurate, the GTX 280M looks like an OC'd Quadro FX 3700M, the GTX 260M looks to be a mildly OC'd 9800M GTX, and the GTX 160M a 9800M GTS, all with shrunken dies.
     
  15. Aeilerto

    Aeilerto Notebook Guru

    Reputations:
    0
    Messages:
    73
    Likes Received:
    0
    Trophy Points:
    15
    Correct me if I'm wrong, but hasn't this (drivers) been the reason for many of the times we've noticed Nvidia cards outperforming comparable ATI cards?
     
  16. cathy

    cathy Notebook Evangelist

    Reputations:
    47
    Messages:
    551
    Likes Received:
    0
    Trophy Points:
    30
    Sadly, yea. As much as I like rooting for ATI, if I were to pick between an equally powerful Nvidia or ATI card, I'd pick Nvidia, because they will come up with better drivers to make it more powerful.
     
  17. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Disappointing, to say the least...
     
  18. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    johnksss and I ran some Vantage benchmarks on his SLI 9800M GTX's and my desktop HD 4870 1GB. His cards get a higher GPU score (10-15%) than mine until we go to 1680x1050 and use Extreme textures and 8xAA; then my card pulls ahead by around 10%.

    How much this can be compared to the mobile market, I'm not sure. I think a pattern we're seeing is that NVidia's cards, due to some limitation, start to fade fast at high resolutions and with lots of AA. I need to get with some Asus W90 users and run Vantage to see if this holds true on the mobile side also.
     
  19. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Yeah, I meant competition; I was writing the post at 4 AM, sorry. :(

    Might be. But I doubt that ATI will stay behind for too long.

    Yes. Hehe.

    That is true, my 8800m GTX SLi is still becoming stronger with every new driver released.

    Agreed... I expected something else from Nvidia, oh well.

    Might be due to the GDDR5 memory on the desktop side (which makes up for the wider bus in Nvidia cards), while the 9800M GTX's have GDDR3 and a 256-bit bus, which prevents the cards from outperforming others in hard or extreme situations. SLI scalability might also play a part, since higher resolutions and AA require more power. Might be.
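
    A back-of-envelope way to put numbers on that (my own arithmetic, not vendor spec sheets; the mobile clocks are the ones quoted in this thread):

[CODE]
# Peak bandwidth = memory clock x data-rate multiplier x bus width.
# GDDR3 moves 2 bits per pin per clock ("pumps"), GDDR5 moves 4.
def bandwidth_gbs(mem_clock_mhz, bus_bits, pumps):
    """Peak memory bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * pumps * (bus_bits / 8) / 1e9

print(bandwidth_gbs(799, 256, pumps=2))  # 9800M GTX, GDDR3: ~51 GB/s
print(bandwidth_gbs(900, 256, pumps=4))  # desktop HD 4870, GDDR5: ~115 GB/s
[/CODE]

    So the desktop 4870 has more than twice the peak bandwidth of a 9800M GTX on the same bus width, which fits the pattern above.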
     
  20. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    vantage gpu scores:
    8700 @ 500/799/1250
    9600 @ 550/850/1375

    single gpu: 4430
    overclocked: 5815

    well, welcome back ms missing in action! :)

    they would have had to do a bit more than just overclock a qfx3700m; running as-is it's pretty weak (gaming only!). but im not saying that's not what is going on...just that they would have to do a bit more than overclock it. the 9800m can still run cuda

    that, and the fast-becoming well-known fact that ati cards seem to handle AA better than nvidia.

    that we did... not sure it's so much the resolutions, but AA is a definite fps dropper for real. when i use SLI AA it's not a 1/3 drop but more like a 1/5 drop in fps.

    it seems to hold true with dmc4 though. without AA the frames are in the 200+ range. with it, they drop to the lower 100's
     
  21. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Actually, Quadro cards perform similarly to their GeForce counterparts in games, though the GeForce equivalents usually perform a tad better due to drivers. When I built my M1710, I used a Quadro FX 2500M (a.k.a. 7900 GTX) and got numbers similar to owners of the 7900 GTX. Btw, I picked the FX 2500M because I got it on eBay for $120 at the time, because people thought that Quadro cards sucked, lol!

    wow! that's one heck of a jump from a 50 MHz overclock! Better loan this guy your drivers
     
  22. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    not saying the quadro cards suck...far from it!.. :D they are excellent cards for autocad. but when tested for benchmarking...it didn't seem to perform as well, but!..yes, there is always a but...lol

    im going to ask our member whether he has had a chance to test his further, and see if he got any better numbers since he first tested it.
    this includes any games he may have run as well.
     
  23. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I thought this too initially, but looking at benchmarks on the desktop side, this is across the board: the 4850, 4830, and 4670 all distance themselves from their NVidia counterparts when resolution and AA are introduced. It can't be all GDDR5, since those cards are running GDDR3. Everybody thought the 4670 would be memory-bandwidth starved, but it still manages to pull ahead of the 9600 GSO slightly at 1680x1050 and a bit more at 1920x1200. Whether it's underlying architecture, drivers, or something else, I don't know, but I want to see if it applies to the Mobility 4xxx series too.

    Resolution + AA + high-end textures are all things that play into memory bandwidth. The GTX 280 and GTX 260 don't suffer from the same symptoms the G92/G92b series cards do; their performance scales well to high resolutions. That's why the 4870 and GTX 260 are a toss-up performance-wise, yet the 4850 / 4830 / 4670 are all spanking their G92/G94 counterparts.

    GT200 and RV770 at the high end are great. Unfortunately for NVidia, their middle and low end are all still G92 or G94 variants, and those cards don't scale well at all at high resolutions, with AA, or with high-quality textures, whereas ATI's offerings do.
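
    Rough numbers on why resolution and AA hammer bandwidth (an illustrative estimate, assumptions mine: 4 bytes of color plus 4 bytes of depth per sample, no compression):

[CODE]
# The framebuffer a card must read and write grows with resolution and
# MSAA sample count, which is why res + AA + big textures lean so hard
# on memory bandwidth.
def framebuffer_mb(width, height, msaa_samples):
    bytes_per_sample = 8  # RGBA8 color + depth/stencil, roughly
    return width * height * msaa_samples * bytes_per_sample / 2**20

for (w, h), aa in [((1280, 1024), 1), ((1680, 1050), 4), ((1920, 1200), 8)]:
    print(f"{w}x{h} {aa}xAA: ~{framebuffer_mb(w, h, aa):.0f} MB per frame")
[/CODE]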
     
  24. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Hehe, you are right. This now makes me even more curious about what is really going on with Nvidia's 8 and 9 series. In Left 4 Dead, for example, at 1920x1200 I can manage up to 4x AF and 4x AA, but the FPS starts dropping steadily if I increase either past that (going from 60 FPS (with triple-buffered V-Sync, which locks it at 60 FPS, though it can really peak at 120+ FPS) down to 50-35 FPS), and considering my cards are G92M (8800M GTX's), that only makes it seem like it is tied to the G92 core.

    Seems like Nvidia is having some real issues with AA and high resolutions. Yeah, I really want to see what happens with the Mobility 48XX cards too.

    Nvidia's medium-high, medium, and low sectors are kinda lacking... hopefully this will be fixed over time rather than just tossed aside by Nvidia.

    Anyway, the benchmarks I promised a few posts earlier; it is funny how my 3DMark06 score is so low, yet my 3DMark Vantage score is so high:

    [benchmark screenshots: 3DMark06 and 3DMark Vantage results]
     
  25. tizzao

    tizzao Notebook Evangelist

    Reputations:
    117
    Messages:
    341
    Likes Received:
    0
    Trophy Points:
    30
    Well, the 48xx series of cards has an easy time with AA because it doesn't depend on memory bandwidth to pull it off nearly as much as other cards used to.

    Specifically, CFAA with edge-detect actually uses the stream processors to do it instead of consuming memory bandwidth and storage.

    I'm sure with other types of AA you'll still need memory and bandwidth. It's just that the cards have a sufficient supply for most people.
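
    For the curious, here's the idea in toy form (a conceptual sketch of my own, not AMD's actual CFAA resolve): find high-contrast pixels with the shader ALUs, then blend just those pixels, so the cost lands on the stream processors instead of on extra framebuffer storage and bandwidth.

[CODE]
import numpy as np

def edge_detect_aa(img):
    """img: 2D float array (grayscale frame). Blend only where edges are."""
    gx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))  # horizontal gradient
    gy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))  # vertical gradient
    edges = (gx + gy) > 0.25  # crude edge mask
    blurred = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) + img) / 5
    return np.where(edges, blurred, img)  # smooth edge pixels, leave the rest

smoothed = edge_detect_aa(np.random.rand(64, 64))  # demo on random noise
[/CODE]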
     
  26. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    nah, it does it at low resolutions as well. i never really paid it that close attention till the 4800m cards came out. the 3800m cards didn't show this much of an edge; they had problems keeping up. if i scale up to 1920x1200 no aa, my fps are about 20 to 30 plus better than the 4870m's in dmc4. still waiting on someone to bench far cry 2. (this is just mobile cards)

    it's because of your driver. when benching 3dmark06 you need your factory driver installed. when benching vantage, 185.20 seems to be the best driver to manipulate.

    yes, the other types of aa take a lesser toll, but that regular aa is a for-sure 1/3 drop in frame rate.
     
  27. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Nice to see the scores, Aeris. I have 9251 on the GPU score in Vantage, though. Could of course be my overclocking too. I will run a test with stock clocks on my 8800M GTX SLI and see where I land.
     
  28. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    she scored a 10.5k in gpu before, magnus72...if im not mistaken...
     
  29. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Thanks. :) Nice GPU score. Alright, let's see how we compare on stock clocks! :)

    Yeah, my poor cards were almost screaming (room temp was below 60 F, and even then they reached 76 C max). The overclock was:

    Core Clock: 500 MHz -> 650 MHz.

    Memory Clock: 799 MHz -> 1050 MHz.

    Shader Clock: 1250 MHz -> 1650 MHz.

    Results:

    [screenshot: 3DMark Vantage results]

    Memory Clock actually makes a difference.
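
    How much of a difference, roughly (my own arithmetic): peak bandwidth scales linearly with memory clock, so:

[CODE]
# Peak bandwidth gained by the memory overclock listed above.
stock_mhz, oc_mhz = 799, 1050
print(f"~{(oc_mhz / stock_mhz - 1) * 100:.0f}% more peak memory bandwidth")  # ~31%
[/CODE]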
     
  30. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    yes it does....
     
  31. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Hmmm, that further confirms johnksss's and my results. G92/G80 (is the 8800M GTX G80?) cards are starved for memory bandwidth.
     
  32. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    yep!

    also, im going to do all them other tests we talked about today when i get back.

    i got it to run under windows 7, but i could not go higher than 1280x1024
     
  33. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Did you uncheck 'fullscreen'? I couldn't get it to run properly in fullscreen under Windows 7 either.
     
  34. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    it ran in both.
    try setting your desktop res to 1280x1024 first, then try it
     
  35. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    The 8800M GTX is G92M, yeah. If I increase my 8800M GTX's memory clocks, the scores go up A LOT.

    I might try to find a set of clocks that are balanced with each other to see if I can get higher scores without having to use extreme frequencies.
     
  36. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    ATI's 4xxx series cards are the exact opposite: core clock is what boosts performance the most, not memory clock.
     
  37. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Hehe, it is curious how ATI and Nvidia are so different.

    Next overclock I will try:

    575 MHz Core / 1600 MHz Memory / 1325 MHz Shader.

    I wonder what the outcome of this will be.

    Edit: Updated the frequencies.
     
  38. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    Yep, the 4800 series has oodles of memory bandwidth, especially the 4870 with its insane GDDR5 memory, so the core is what ends up holding back the card's potential.
     
  39. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    1600 memory was a fail; I got a red screen of death when trying to boot into Windows.

    Trying 1350 Memory now.
     
  40. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Hence why the 4xxx cards "underperform" in 3DMark06 and Vantage, where the resolutions are so low. Nobody games at 1280x1024 (gaming laptops are 1680x1050 or 1920x1200), so unless these benchmarks are run at full native resolution, they give a bad picture of the actual gaming performance of these ATI notebook GPUs.
     
  41. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    Eh... I definitely wouldn't say that nobody games at 1280x1024. I'd say that at least a third of laptop gamers play most of their games at 1280x800 or 1440x900 because not everyone has a high-end GPU. Gaming laptops don't have to house 8800/9800/3800/4800 graphics cards to be considered "gaming" worthy.
     
  42. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Aeris, strange that you get artifacts. I can overclock my 8800M GTX to 950 mem, and I could even push the mem further if I wanted to. But I am CPU-limited, so further overclocking doesn't yield any performance boost.

    I ran a stock benchmark with Vantage and scored below Aeris's 8800M GTX SLI score: around 7400 points stock on the GPUs. Probably has to do with my CPU limitation after all.
     
  43. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Drivers, methinks. She also scored around that for GPU score a while back, unless she changed her CPU?
     
  44. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, I can say I currently run the 182.06 drivers for Vista 64. But yes, it could be because she has a quad core compared to my measly Core2Duo. Though it's strange that my 8800M GTX SLI overclocks much better than hers.

    Now, I know the 8800M GTX SLI in my XPS is like a 9800 GX2, i.e. both GPUs on the same PCB. Don't know if that makes any difference. I know if I would just overvolt my GPUs by 0.5 volt I would push 700 on the core easily without artifacts.
     
  45. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Anybody with a gaming laptop has a 9800M GTS at minimum; otherwise, it's just a laptop with a mid-range dedicated GPU. I'm comparing 8800/9800 to 3800/4800.
     
  46. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Hehe, yeah, the cards can tolerate up to 1020 MHz memory without artifacts. If I go higher I can get higher scores, but I start getting a lot of deltas in ATITool's Check for Artifacts test whenever the cards get hot. I think it is because I can only push 1.05 V and have already overclocked both the core and the shaders a lot; I will try later to see whether leaving the core and shader clocks lower yields any more overclocking headroom on the memory.

    I used to score 7800 at stock with the 180.44 Dox drivers, so I only got a 200-point increase with the 185.20 Dox (Quality, not Performance, actually) drivers. I agree it might be a CPU limitation, but I do not really think so.

    Yeah; I haven't changed my CPU, still my oldy Q9450 (upgrading once I start building my new desktop).

    Your Core 2 Duo is not measly; a quad core would not really make that much of a difference.

    Your 8800M GTX's overclock much better because of the dual-PCB design (I think), although the core does not really make that much of a difference, since it can handle all the extra workload from overclocked memory and shaders. I might give overclocking the core to 700 a try while leaving the other frequencies at stock, and I might also try the same with memory and shader, to see how far each goes and which one makes a bigger difference in the score (not right now; the heat would not allow it without reaching too-high temps, sadly, 80+ F room temp).

    Also, I want to let my cards settle down a little after the red-screen-of-death video card failure I had a while back when trying to boot into Windows.
     
  47. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Yes, you could be right, Aeris, that my 8800M GTX overclocks better due to both GPUs being on the same PCB. Would be really nice to see some benchmarks of the new 9800M GTX SLI in the XPS M1730, since those are the same: two GPUs on the same PCB.

    Actually, if you want to know, my 8800M GTX SLI hangs in Crysis at an overclock of 655/1600/950. I can run the FurMark test just fine with those clocks, but it hangs in Crysis :)
     
  48. Aeris

    Aeris Otherworldly

    Reputations:
    474
    Messages:
    805
    Likes Received:
    20
    Trophy Points:
    31
    Hehe, yeah, it allows for a better and faster connection, better heat dissipation (not sure...), and better power flow.

    Hehe, mine hangs after a while in Left 4 Dead / Half-Life 2: Deathmatch at 650/1020/1650. I haven't tested any non-Source-engine games, though, but it does not hang in benchmark programs if the room temp is appropriate for such an overclock.
     
  49. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Magnus, with your highest overclock, at 1680x1050, what's your Crysis benchmark score on Ultra High?
     
  50. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Hey Jlbrightbill, do you mean Very High DX10 or DX9? Ultra High can only be achieved through a modified autoexec.cfg.

    If you want, I can bench against your 4870. I know I will lose really badly, but it would be fun anyway :) Let me know if you want to bench at DX10 or DX9, and at High or Very High settings. Stock, or overclocked like I have now?

    EDIT! Duh, sorry, I am stupid; I saw you wrote highest overclock.
     