Regarding the Vantage scores, what are you guys getting at stock GPU clocks?
Also, regarding the SM3.0 scores in 3DMark06: they are influenced by the clock speed of the cores, not the number of them. This is why an overclocked X9100 system has higher SM2.0 and 3.0 scores than the QX9300 system (the X9100 has a higher clock speed, the QX has more cores).
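A quick toy model of that clock-vs-cores point. The linear scaling is my own simplifying assumption for illustration, not measured 3DMark06 behavior; the clock speeds are the CPUs' real stock values:

```python
# Toy model: if a test is bound by per-core clock rather than core count,
# relative scores track relative clocks. The linear-scaling assumption is
# for illustration only; the clocks below are the stock values.
def clock_bound_ratio(clock_a_ghz, clock_b_ghz):
    """Expected score ratio for a purely clock-bound workload."""
    return clock_a_ghz / clock_b_ghz

# X9100 (dual core, 3.06 GHz stock) vs QX9300 (quad core, 2.53 GHz stock)
ratio = clock_bound_ratio(3.06, 2.53)
print(f"{ratio:.2f}")  # ~1.21: the X9100 comes out ahead despite fewer cores
```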
Personally, I'd say there isn't any one game that would constitute the best benchmark, because within the same performance bracket (or price bracket), one card does well at one game while the other does better at another. That said, an average across popular games would be a better assessment.
-
-
and if it's warhead.... which benching tool are you using?
-
I have regular Crysis, and I've been using Guru3D's benchmark tool.
-
i have WIC demo
gta4 is looking quite a bit better after the second real patch..
Jlbrightbill
scores sent.
crysis: and you have tried the benchmark_gpu.bat file as well? -
Setting Crysis in-game everything to High @ 1680x1050, no AA, and using benchmark_gpu.bat, I got 41.5 FPS average over the 3 runs. Windows 7 takes a massive deuce on my FPS in DX10 Crysis unfortunately, I get 46 FPS on the same test in XP Pro.
-
seems our teammate has the best stock score, about 5k on a single 4850, with an overclock getting about 5600/5700
im running one now...ill let cha know in a bit...
hummmm big ban 3 huh.....could be.
side note:
im still trying to see how it can be pulled off. i know i can pull about 18-19k with 112 SPs on an overclocked system (mobile), and with them only running 128... it would have been better to run it at 240 to 256 SPs, keep the clocks normal, and let the user have at it with overclocking in the end (speculation on my part, for sure). that 128 against 1600, then running AA... could be a serious problem for nvidia, but if these cuda shaders overcome this... then it could be the other way around.... -
The FurMark benchmark looks like a donut that rolled away and was forgotten for months, but it's still pretty cool.
-
yeah, this is true...but that program is like 99 percent accurate and hard to fake.
-
I hope, I really, really hope that the 2XXM Cards by Nvidia are not just re-brandings of the current 9 series with a bit more sparkles and stamps on them... I'd really love to see something more "in tune" with the GT200 series of chips from Nvidia, now that would be amazingly cool to have, no matter how high the power consumption was, as long as they performed very, very well.
If the new 2XXM series by Nvidia are compatible (I really doubt it) with my oldy Sager NP9262, and performance-impressive, I might consider saving some money and upgrading my 8800m GTX's, if not, well, I would not.
Off Topic:
It's been so long since I last visited NBR. I might share some new-and-reloaded OC results from my 8800M GTX's (which perform better now with new drivers and some Windows Registry housekeeping) for comparison against the new 2XX series, but it would be nothing more than speculation (since there is really little information about the cards right now), and might be more for informative purposes than for comparison. -
-
Some competition from ATI would not do any harm, and would force Nvidia to lower the prices of their overpriced products.
Other than that, it might really be that the new G200m chips are based on the GT215 core on a 40nm architecture. -
And yeaaaa, you would need lots of luck for that to come true. Sorry, but I'm a bit of an ATI fangirl. -
possible GTX 260M specs
possible GTX 160M specs
And this little rumor is, I think, why nVidia can win with older technology.
So, if these specs are accurate, the GTX 280M looks like an OC'd Quadro FX 3700M, the GTX 260M looks to be a mildly OC'd 9800M GTX, and the GTX 160M is a 9800M GTS, all with shrunken dies. -
-
-
Disappointing, to say the least...
-
johnksss and I ran some Vantage benchmarks on his SLI 9800M GTX's and my desktop HD 4870 1GB. His cards get a higher GPU score (10-15%) than mine until we go to 1680x1050 and use Extreme textures and 8xAA, then my card pulls ahead by around 10%.
How much this can be compared to the mobile market, I'm not sure. I think a pattern we're seeing is that NVidia's cards, due to some limitation, start to fade fast at high resolutions with lots of AA. I need to get with some Asus W90 users and Vantage to see if this holds true on the mobile side too. -
-
8700@500/799/1250
9600@550/850/1375
single gpu 4430
over clocked 5815
it seems to hold true with DMC4 though. without AA the frames are in the 200+ range; with it, they drop to the lower 100s -
Actually, Quadro cards perform similarly to their GeForce counterparts in games, though usually the GeForce equivalents perform a tad bit better due to drivers. When I built my M1710, I used a Quadro FX 2500M (a.k.a. 7900 GTX) and got similar numbers to owners with the 7900 GTX. Btw, I picked the FX 2500M because I got it on eBay for $120 at the time, because people thought that Quadro cards sucked, lol!
-
not saying the quadro cards suck...far from that!..
they are excellent cards for AutoCAD, but when tested for benchmarking... it didn't seem to perform as well. but!..yes, there is always a but...lol
im going to ask our member to see if he's had a chance to test his further, see if he got any better numbers since when he first tested it.
this includes any games he may have run as well. -
GT200 and RV700 at the high end are great. Unfortunately for NVidia, their middle and low end are all still G92 or G94 variants and those cards don't scale well at all at high resolutions, with AA, or high quality textures, whereas ATI's offerings do. -
Seems like Nvidia is having some real issues with AA and high resolutions. Yeah, I really want to see what happens with the Mobility 48XX cards too.
Nvidia's Medium-High, Medium and Low sectors are kinda lacking... hopefully this will be fixed over time, rather than just tossed aside by Nvidia.
Anyway, here are the benchmarks I promised a few posts earlier. It is funny how my 3DMark 2006 score is so low, yet my 3DMark Vantage score is so high:
-
Well, the 48xx series of cards has an easy time with AA because it doesn't depend on memory bandwidth to pull it off nearly as much as other cards used to.
Specifically, CFAA with edge detect actually uses the stream processors to do it instead of consuming memory bandwidth and storage.
I'm sure with other types of AA you'll still need memory and bandwidth. It's just that the cards have a sufficient supply for most people. -
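For a rough feel of the bandwidth gap being discussed, peak memory bandwidth is just data rate times bus width. A minimal sketch; the clocks, DDR multipliers, and 256-bit bus widths are commonly cited spec-sheet figures, not anything measured in this thread:

```python
# Peak theoretical memory bandwidth = data rate (MT/s) x bus width (bits) / 8.
def bandwidth_gbps(mem_clock_mhz, ddr_multiplier, bus_bits):
    """Peak bandwidth in GB/s from real memory clock, DDR multiplier, bus width."""
    data_rate = mem_clock_mhz * ddr_multiplier  # mega-transfers per second
    return data_rate * bus_bits / 8 / 1000      # MB/s -> GB/s

# 8800M GTX: ~799 MHz GDDR3 (2 transfers per clock), 256-bit bus
print(bandwidth_gbps(799, 2, 256))  # ~51.1 GB/s
# Desktop HD 4870: ~900 MHz GDDR5 (4 transfers per clock), 256-bit bus
print(bandwidth_gbps(900, 4, 256))  # ~115.2 GB/s
```

That roughly 2x gap in raw bandwidth lines up with the older GDDR3 cards fading once AA and big textures start hammering memory.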
-
Nice to see the scores, Aeris. I have 9251 on the GPU score in Vantage though. Could of course be my overclocking too. I will run a test with stock clocks on my 8800M GTX SLI and see where I land.
-
she scored a 10.5 in gpu before magnus72...if im not mistaken...
-
Nice GPU score. Alright, let's see how we compare on stock clocks!
Core Clock: 500 MHz -> 650 MHz.
Memory Clock: 799 MHz -> 1050 MHz.
Shader Clock: 1250 MHz -> 1650 MHz.
Results:
Memory Clock actually makes a difference. -
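Those three bumps work out to roughly equal percentages; a quick sanity check on the numbers quoted above:

```python
# Percentage overclock per clock domain, using the values from the post above.
stock = {"core": 500, "memory": 799, "shader": 1250}
oc = {"core": 650, "memory": 1050, "shader": 1650}

gains = {domain: (oc[domain] / stock[domain] - 1) * 100 for domain in stock}
for domain, pct in gains.items():
    print(f"{domain}: +{pct:.1f}%")
# core: +30.0%, memory: +31.4%, shader: +32.0%
```

So if all three domains were pushed by about the same percentage, any extra score mostly reflects which domain the card was actually starved for.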
yes it does....
-
Hmmm, that further confirms johnksss's and my results. G92/G80 (is the 8800M GTX G80?) cards are starved for memory bandwidth.
-
yep!
also, im going to do all those other tests we talked about today when i get back.
i got it to run under Windows 7, but i could not go higher than 1280x1024 -
Did you uncheck 'fullscreen'? I couldn't get it to run properly in fullscreen under Windows 7 either.
-
it ran in both.
try setting your desktop res to 1280x1024 first, then try it -
I might try to find a set of clocks that are balanced with each other to see if I can get higher scores without having to use extreme frequencies. -
ATI's 4xxx series cards are the exact opposite, core clock is what boosts performance the most, not memory clock.
-
Next overclock I will try:
575 MHz Core / 1600 MHz Memory / 1325 MHz Shader.
I wonder what will be the outcome of this.
Edit: Updated the frequencies. -
-
1600 Memory was a fail, I got a red screen of death when trying to boot up to Windows.
Trying 1350 Memory now. -
-
Eh... I definitely wouldn't say that nobody games at 1280x1024. I'd say that at least a third of laptop gamers play most of their games at 1280x800 or 1440x900 because not everyone has a high-end GPU. Gaming laptops don't have to house 8800/9800/3800/4800 graphics cards to be considered "gaming" worthy.
-
Aeris, strange that you get artifacts. I can overclock my 8800M GTX to 950 mem. I can even overclock the mem further if I want to. But to me, I am CPU limited, so further overclocking doesn't yield any performance boost.
But I ran a stock benchmark with Vantage and scored below Aeris' 8800M GTX SLI score. I scored around 7400 points stock on the GPUs. Probably has to do with my CPU limitation after all. -
-
Yeah, I can say I currently run the 182.06 drivers for Vista 64. But yes, it could be due to her having a quad core compared to my measly Core2Duo. Though it's strange my 8800M GTX SLI overclocks much better than hers.
Now, I know my 8800M GTX SLI in my XPS is like a 9800 GX2, i.e. both GPUs on the same PCB. Don't know if that makes any difference. I know if I would just overvolt my GPUs by 0.5 volts, I would push 700 on the core easily without artifacts. -
-
I used to score 7800 at stock with the 180.44 Dox drivers, so I only got a 200-point increase with the 185.20 Dox (Quality, not Performance, actually) drivers. I agree it might be a CPU limitation, but I do not really think so.
Your 8800M GTX's overclock much better because of the dual-PCB design (I think), although the core does not really make that much of a difference, since it is able to handle all the extra workload from overclocked memory and shaders. I might try overclocking the core to 700 while leaving the other frequencies at stock, and I might also try the same with memory and shader, to see how far they go and which one makes a bigger difference in the score (not right now though; heat would not allow me to do it without reaching too-high temps, sadly, with an 80+ F room temp).
Also, I want to let my cards settle down a little after the Red Screen of Death from video card failure that I had a while back when I tried to boot into Windows. -
Yes, you could be right, Aeris, that my 8800M GTX overclocks better due to both GPUs being on the same PCB. Would be really nice to see some benchmarks of the new 9800M GTX SLI on the XPS M1730, since those are the same two GPUs on the same PCB.
Actually, if you want to know, my 8800M GTX SLI hangs at an overclock of 655/1600/950 in Crysis. I can run the FurMark test just fine with those clocks, but it hangs in Crysis -
Hehe, mine hangs after a while in Left 4 Dead / Half-Life 2: Deathmatch when at 650 / 1020 / 1650. I haven't tested non-Source engine games, though, but it does not hang in benchmark programs if the room temp is appropriate for such an overclock. -
Magnus, with your highest overclock, at 1680x1050, what's your Crysis benchmark score on Ultra High?
-
hey Jlbrightbill, do you mean Very High DX10 or DX9? Ultra High can only be achieved through a modified autoexec.cfg.
If you want, I can bench against your 4870. I know I will lose really badly, but it would be fun anyway. Let me know if you want to bench at DX10 or DX9, and at High or Very High settings? Stock, or overclocked like I have now?
EDIT! Duh, sorry, I am stupid, I saw you wrote highest overclock.
Geforce GTX280M and 260M to launch at CeBit
Discussion in 'Gaming (Software and Graphics Cards)' started by ichime, Feb 24, 2009.