This 3rd-party program is not reliable. Some tests will be accurate because the settings are correct, but a lot of the time the program does not keep the correct settings within the game itself. For example, there is no way your single card can achieve 35 avg. FPS at 1680x1050 with everything on High; it's not possible unless you did some crazy GPU overclock. My SLI setup will hit that number, but not a single GPU.
Run the same setup again using the Benchmark_GPU demo and make sure your in-game settings are correct before the run; you will see what we are talking about. Some settings, like "Object Detail", you need to change manually.
-
-
Trying it now...
-
Well, I tried it the way it has been suggested, but the Crysis GPU benchmark keeps running at 1024x768 with 2xAA, and yes, I changed the settings in the game; I even played the game for a few minutes before exiting. I can tell you this: at the highest resolution, 1920x1200 set to HIGH, using the console to display FPS, the lowest was 12 and the highest was 26 over the 10 minutes I played.
But if the GTX and a GT are tested using the same 3rd-party app at the exact same settings, perhaps they are not exact in true FPS, but it should still be a means to gauge the two FPS scores against one another---true or false? -
That's strange; I've never heard of it not keeping its settings before. You may want to uninstall the 3rd-party program, reinstall the game, and make sure to patch it to 1.2.
-
Some discussion on a message board talked about deleting a .txt file to clear the cache or something---at the time I was looking for the console command, so I didn't pay attention to that part of the thread...
I don't plan to go through the trouble of uninstalling/reinstalling the game
just to get the benchmark tool working...I'd rather just grab a screenshot while it is running...surely there is a console command for that -
Would someone take a screenshot of GPU-Z running on the 9800m GTX?
I'm quite curious about the backend tech information. -
-
It will show all the information regarding your video card. It works just like CPU-Z, just it's for the GPU.
http://www.techpowerup.com/gpuz/ -
Out of curiosity, if we are currently seeing a 5-10% performance increase from GT to GTX, would this translate into a similar or higher increase for the SLI configurations of GT and GTX?
-
-
Well, here is my GPU-Z screen. Wow, I feel dumb because most of this is meaningless to me...lol
Attached Files:
-
-
And, the primary reason to buy a high-end machine: compiling and building several hundred thousand lines of software many times per day, and performing database operations on tables with several million records, benefit greatly from a more powerful machine. And if the machine happens to play Crysis well, that's great for break time -
-
-
-
No offense, Supernova1, but there are more important things in this thread, like giving information to potential customers about this card, than judging a man who owns it. Let's stay on the subject.
-
So I am looking at GT SLI; in FPS it wins by about half again more power than a single GTX. Let's say we had GTX SLI ready... not much of a difference AT ALL, right???
-
As it seems, there is no difference big enough to justify all that extra $ you are going to pay, but we don't know if there will be better performance with new drivers etc...
I doubt it; like we discussed in this forum before, the difference was 16 more shaders and more memory, which pointed to no more than a 10% increase, and as it seems, at the moment it is even less. -
DevlErD, so a driver update will help the performance of a GPU?
-
Don't count on drivers to give this card a boost; the same thing was said about the 8800M GTX when it came out.
-
-
-
sliqsystems,
I'm curious, did you apply the Crysis 1.2 patch? I think that gave some people a few more FPS. -
And even 30 FPS with some tweaks, or with a custom config it'll look like this:
edited to 640/400 from 1920x1200...
Starting from my original Dell drivers up to 177.92... I gained as much as 5 FPS in some games. -
So... what is the % difference between the 9800M GTX vs the 8800M GTX, and vs 8800M GT SLI?
-
-
-
Hmm... doesn't sound so good vs non-SLI, but
-
At eleron911: We know you can get a boost with a driver update; what ARGH and I said was not to buy a card and then wait, expecting a driver update to bring better performance...
-
Looks like GPU-Z does not display the 9800M GTX properly; much of the data it shows is not correct. All I'm looking for is its fab process, whether it's 65nm or 55nm. I've heard claims of either, but I want solid proof. And why do I care? If the card is 55nm, that translates to higher overclocking potential, less heat, and less power usage, and would place it as a G92b core. Has anybody tried overclocking yet?
-
The GT shows correctly in GPU-Z
-
Using the data posted by different users, and taking the 8800M GTX as the baseline card, what we roughly know at 1920x1200 at HIGH is this:
8800M GTX: about 16 FPS in Crysis using the GPU benchmark
9800M GT: ~5% performance increase, give or take (about 17 FPS in Crysis)
9800M GTX: ~10% performance increase (wild guess; still no data at 1920x1200 at HIGH)
8800M GTX SLI: ~100% performance increase (about 30 FPS in Crysis)
9800M GT SLI: ~200% performance increase (48 FPS in Crysis)
According to the values we got recently, the 9800M GT in SLI manages to output 48 FPS at 1920x1200 at HIGH. It seems that Crysis with these new drivers is really taking advantage of SLI.
Basically, we get far more from SLI with this generation than we did previously in terms of FPS in Crysis. From a single card to SLI we get roughly a 180% increase (48 vs ~17 FPS).
Here are some threads to follow up:
9800M GT vs 8800M GT
http://forum.notebookreview.com/showpost.php?p=3627932&postcount=171
9800M GT in SLI vs 8800M GTX in SLi
http://forum.notebookreview.com/showthread.php?t=291995&page=11
Have fun,
Trance
PS: Notice that this was done using data posted by multiple users, some of them not using the same CPU at all. I am assuming that the CPU is not really contributing much to the FPS increase. -
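The percentage arithmetic in the summary above can be checked with a few lines of Python. This is just a sketch using the thread's rough community-reported FPS numbers (16, 17, 30, 48), with the 8800M GTX as the baseline; note that the exact figures come out slightly under the rounded percentages quoted in the post (e.g. 30 vs 16 FPS is an 87.5% gain, rounded up to "~100%"):

```python
# Percentage-increase arithmetic from the summary above.
# FPS values are the rough numbers reported in this thread,
# with the 8800M GTX (~16 FPS at 1920x1200 High) as baseline.
baseline = 16.0

reported = {
    "9800M GT": 17.0,
    "8800M GTX SLI": 30.0,
    "9800M GT SLI": 48.0,
}

for card, fps in reported.items():
    gain = (fps - baseline) / baseline * 100.0
    print(f"{card}: {fps:.0f} FPS, {gain:+.1f}% vs 8800M GTX")
```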
-
I've already fixed the percentages. Too little sleep; that is what it does. I also posted some links to threads to follow. My guess is that the increased memory is helping a lot with textures.
-
-
Corrected. You may not think so, but that is what the poster shows. Maybe when more people post some results we will have more certainty. But people have been shy about posting them.
Trance -
I say, from slowest to fastest: 8800M GTS, 9800M GTS, 9800M GT, 8800M GTX, and the 9800M GTX.
-
-
Looking over GPU-Z, the only difference between the two cards is PCI-E 16@16 vs PCI-E 2.0 16@16.
Just joking! The 9800M GT is around 8 percent faster than the 8800M GTX. Better?
-
The 9800M GTX is only a 15% gain... I don't see much of a difference vs 9800M GT SLI, like most people said.
Although in the long run 5% is a bit of a difference, and we are paying $365 for the 9800M GTX ATM. What if later, when SLI is ready, the price is still around $365... is it worth getting? -
The 9800M GT IS the REBRANDED 8800M GTX.. so NO DIFFERENCE!!
-
1) There are minor differences: a slightly modified core with a higher power rating.
2) A single 9800M GTX (not GT) will not perform better than SLI 9800M GTs (or 8800M GTXs). -
It seems that NVIDIA focused on tuning these new cards for SLI rather than improving single-card performance, but I need more data.
I'm a wee bit disappointed at the moment over the improvement of the 9800M GTX over the 9800M GT, especially given the price difference. But we still have to check scores at 1920x1200, both with a single card and with dual cards (SLI).
Trance -
However here are my results of Crysis 32-bit DX9 on Vista 64-bit with 177.92
01/09/2008 11.16.01 - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX9 1680x1050, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: High
==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
Play Time: 91.52s, Average FPS: 21.85
Min FPS: 17.26 at frame 1957, Max FPS: 26.40 at frame 997
Average Tri/Sec: -21248384, Tri/Frame: -972344
Recorded/Played Tris ratio: -0.94
!TimeDemo Run 1 Finished.
Play Time: 87.29s, Average FPS: 22.91
Min FPS: 17.26 at frame 1957, Max FPS: 26.86 at frame 1007
Average Tri/Sec: -21991768, Tri/Frame: -959871
Recorded/Played Tris ratio: -0.95
!TimeDemo Run 2 Finished.
Play Time: 87.90s, Average FPS: 22.75
Min FPS: 11.68 at frame 1602, Max FPS: 26.86 at frame 1007
Average Tri/Sec: -21851538, Tri/Frame: -960333
Recorded/Played Tris ratio: -0.95
TimeDemo Play Ended, (3 Runs Performed)
==============================================================
Completed All Tests
<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>
01/09/2008 11.16.01 - Vista 64
Run #1- DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 22.83 -
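For anyone comparing runs, here is a minimal sketch that parses the per-run FPS out of a TimeDemo log like the one above. It assumes (this is not stated in the log itself) that the summary averages only the runs after the first, treating run 0 as a warm-up, which is what makes the summary come out at 22.83 rather than the mean of all three runs:

```python
import re

# Pull the Average FPS of each TimeDemo run from the log text, then
# average runs 1+ (run 0 assumed to be a warm-up pass).
log = """\
!TimeDemo Run 0 Finished.
Play Time: 91.52s, Average FPS: 21.85
!TimeDemo Run 1 Finished.
Play Time: 87.29s, Average FPS: 22.91
!TimeDemo Run 2 Finished.
Play Time: 87.90s, Average FPS: 22.75
"""

runs = [float(x) for x in re.findall(r"Average FPS: ([\d.]+)", log)]
overall = sum(runs[1:]) / len(runs[1:])  # drop run 0 as warm-up
print(f"runs: {runs}")
print(f"overall average (runs 1+): {overall:.2f}")
```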
Hi, in order to get a meaningful comparison one needs to focus on the highest resolution possible; in our case that is 1920x1200. The reason why is simple:
The relationship between resolution and FPS can be mapped to a curve. Slower cards "reach" 0 faster than faster cards as resolution increases. Within the set of high-end cards, the difference between one and the other is usually the optimum resolution, taking quality into account. For instance, the optimum for an 8800M GTX is usually 1680x1050 (works for almost any game), for the 8700M GTX it is 1280x1024, etc.
So comparing cards at lower resolutions in theory gives an unfair advantage to slower cards. This is to say that if you want to compare the 8800M GTX, the 9800M GT and the 9800M GTX, one needs to use 1920x1200.
Granted, from a practical point of view it may be that the cards are equivalent, since at such a high resolution the game may be unplayable on any card in the set, and at the next resolution down the results can be very similar.
The test by paladin44 (I posted the link) is the most rigorous one. Not only is he using the same CPU, but he is also making the test at the highest resolution possible. In this test he shows a difference of 5% between the 9800M GT and the 8800M GTX, which is nevertheless negligible in a single-card config.
We need a test under the same circumstances for a single 9800M GTX. What that test might reveal is that the 9800M GTX is overkill for 1680x1050, or that it is a big flop in single-card mode.
Trance -
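The "slower cards reach 0 faster" argument can be illustrated with a toy frame-time model: per-frame cost is a fixed CPU portion plus a GPU portion proportional to pixel count, so at low resolutions the CPU cost masks the GPU difference and the cards look similar, while at high resolutions the gap widens. All constants here are hypothetical, chosen only to show the shape of the curve, not to model any real card:

```python
# Toy model: frame time = fixed CPU cost + pixel count / GPU throughput.
# Constants are made up for illustration; they do not describe real cards.
def fps(gpu_pixels_per_sec: float, width: int, height: int,
        cpu_ms: float = 15.0) -> float:
    """Estimated FPS for a card with the given pixel throughput."""
    gpu_ms = width * height / gpu_pixels_per_sec * 1000.0
    return 1000.0 / (cpu_ms + gpu_ms)

fast, slow = 80e6, 60e6  # hypothetical pixels/second throughputs

for w, h in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    f, s = fps(fast, w, h), fps(slow, w, h)
    gap = (f - s) / s * 100.0
    print(f"{w}x{h}: {f:.1f} vs {s:.1f} FPS ({gap:.0f}% gap)")
```

Running this shows the relative gap between the two hypothetical cards growing as the resolution rises, which is exactly why comparisons in this thread should be made at 1920x1200.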
That test showed he could have used three different 8800M GTXs and come out with one being 5 percent higher or lower. For all I know, the 8800M GTX could have won a few of those benches, but it wouldn't have been to xxx advantage to post that. There is no difference between those two cards; the 5 percent is give or take between any individual cards of the same model. Of course, comparing the 9800M GT and the 8800M GTX, we are speaking about the same model. We could take those same cards, put them in different computers, and come up with marginally more or less performance. If there were a big improvement in the 9800M GTX, we would have seen numerous posts already of what it can do. True, it might give better performance at 1920x1200. But at its price point, it had better be way over a 10 percent performance increase compared to the 9800M GT (8800M GTX). -
I don't think that any drivers will boost it.
-
Can you summarize everything in one post (the very 1st post of the thread) and bold the important information (max, min, average FPS) so we can look it up more easily? Thanks.
-
Trance
Finally. 9262 and 9800 GTX Crysis Benchys
Discussion in 'Sager and Clevo' started by sliqsystems, Aug 30, 2008.