If anyone has a 9800M GS I would really like to know what settings you play GTAIV on or what 3DMark06/Vantage Score you get.
I want to know this because I want to see exactly where the ATI 3850 falls on the Nvidia side. I think it's pretty powerful and a lot of the time the CPU holds it back; I know I'm probably even holding the CPU back quite a bit. Anyway, my scores are below and I'd be interested to hear your thoughts.
*Tests were done with the stock video drivers the laptop came with. And of course the laptop was plugged into the wall.
3DMark06: Ranges from 7k+ (Low-Mid 7000s).
Vantage: P2800~ (I had it on Balanced power mode by accident and can only run Vantage once, but I don't think it makes too much of a difference)
So far I'm guessing the 3850 would be like a 9700M GT or something, or in between the 9600M GT and 9700M GT? But that's a pretty big gap. Why, Nvidia, why? lol.
-
I believe the stock 9800M GS will pull about 8,500 in 3dMark06 at 1280x1024.
The closest ATi equivalent is probably the 3870. -
The GeForce 9800M GS scores around 8100-8500 points in 3DMark06 at 1280x1024 resolution. The closest ATi card to it would be the HD 4850, which overpowers it. Last time I checked GTAIV on my machine I could run mostly everything on high, except I believe texture or possibly render quality at medium @ native resolution.
-
The 4850 should be quite a step up from the 9800M GS. I believe I read it was something like 15-25% more powerful than the 9800M GTS, which in turn is a bit more powerful than the 9800M GS.
I would say you're probably about right about the 3850 ~ 9700M GT -
3850/70 are pretty close to the 9800m gs and gts
-
Yup, 3DMark06 gets me like 9500-9700 tweaked on a single card at 1280x720 --- I can run it on a single card at 1280x1024 later when I get a screen, PM me if you want
-
The 9800M GS is basically a 9800M GTS quality-tested for a smaller chassis with less cooling, and downclocked to accommodate its lesser cooling.
The performance difference between 9800m GS and GTS is about 10% in practice, assuming stock speeds.
I think the closest AMD version is the 4850, which as mentioned is a reasonable step up even from the GTS.
The 3870 is a step down IMHO... 9700M GT equivalent. -
The 9700M GT only has a 128-bit memory bus, whereas the HD 3870 has 256-bit memory. The HD 3870 is probably about 30% better than the 9700M GT. -
3850 is also 256-Bit.
Hrmm, I think it's about the 9700M GT, give or take a few fps in games; however, I think the 3850/ATI cards handle high resolutions better, as well as AA. -
9800M GS and ATI HD 3870 are roughly the same in score/speed.
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html -
mobius1aic Notebook Deity NBR Reviewer
I can hit ~9500 points on 3DMark06 with my 9800GS when it's overclocked with the latest Nvidia drivers. I mean to do a DOX 184.79 driver 3DMark06 test soon.
-
The 9800M GS is quite a bit stronger than the 3850, but the 3870 should be near it or perhaps above it.
Using GTS speeds, I basically get 9k at 1280x1024.
It goes like this:
3850 --- 9700M GTS
3870 --- 9800M GS
HD 4850 --- 9800M GTS to GT
HD 4870 --- 9800M GTX and GTX 280M -
3870 is between the GTS and GT.
4850 is greater than the GT, equal to or slightly greater than the 9800M GTX.
The 4870 will be better than the 9800M GTX once it isn't fubar.
GTX 280M will battle with the 4870 for the top spot. -
If you want a deeper comparison, I've been doing benchmarks with my Asus G50VT-X5, which has a 9800M GS, with the rest of the specs being fairly baseline. The benchmarks I'm getting should be easily attainable by anyone with a 9800M GS.
http://forum.notebookreview.com/showthread.php?t=367670 -
mobius1aic Notebook Deity NBR Reviewer
How the hell are you achieving 782 MHz core?!?!?!
-
MrButterBiscuits ~Veritas Y Aequitas~
The 9800M GS is roughly a 3850, not a 3870, which is equivalent to a 9800M GTX lol; notebookcheck has it listed out of order for scores.
Dual mobile 3870 = 13270 in 3DMark06
Dual 9800M GTX = 12719 in 3DMark06 -
3dmark06 scores are worthless without normalizing for CPU and resolution.
Giving a score without the rest of the information is not helpful.
For instance, if the dual 3870 got that score with a 3GHz processor OC'd to 3.3GHz while the dual 9800M GTX used a 2.5GHz one, then the dual 9800M GTX is actually considerably stronger.
If one score is at 1280x768 and the other at 1280x1024, then the scores are even MORE skewed.
This is why simply checking scores at notebookcheck without checking the context is not a good comparison.
The 3870 gives low 7k scores at 1280x1024 with a 2.5Ghz processor while the 9800m GS gets at LEAST 1000 points higher at stock speeds with a 2.26 GHz processor.... and that's all at stock.
The 3870 is at best a 9700M GT at lower resolutions, or maybe a nerfed 9700M GTS at high res... never mind what stats it has, as quite frankly it's a different GPU. -
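The normalization argument above can be sketched roughly in code. This is an illustrative sketch only: the linear CPU-clock scaling and pixel-count scaling are assumptions for demonstration, not measured 3DMark06 behavior, and the scores are the dual-card numbers quoted earlier in the thread.

```python
# Rough sketch of why raw 3DMark06 scores mislead: the same GPU scores
# differently depending on CPU clock and resolution. The scaling factors
# below are illustrative assumptions, not measured values.

def normalize_score(score, cpu_ghz, pixels, ref_cpu_ghz=2.5, ref_pixels=1280 * 1024):
    """Crudely rescale a 3DMark06 score to a reference CPU clock and resolution."""
    cpu_factor = ref_cpu_ghz / cpu_ghz   # assume score scales ~linearly with CPU clock
    res_factor = pixels / ref_pixels     # assume score scales inversely with pixel count
    return score * cpu_factor * res_factor

# Dual 3870 on a 3.3 GHz OC vs dual 9800M GTX on a 2.5 GHz CPU, same resolution:
a = normalize_score(13270, cpu_ghz=3.3, pixels=1280 * 1024)
b = normalize_score(12719, cpu_ghz=2.5, pixels=1280 * 1024)
print(a < b)  # the "lower" raw score comes out ahead once CPU clock is accounted for
```

Under these (hypothetical) scaling assumptions, the dual 9800M GTX score normalizes well above the dual 3870 one, which is the point being made.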
MrButterBiscuits ~Veritas Y Aequitas~
I thought that was a GPU benchmark... I've heard you can do a full-system, GPU, or CPU benchmark... am I wrong?
-
dondadah88 Notebook Nobel Laureate
Benchmarks in sig. It depends on what CPU you are comparing it to, and in these benchmarks I had issues running the game, and an old driver (it was lower than 8.10 and we are now on 9.3).
The 9800M GTS is equal to the 3870;
the 9800M GS is lower.
When I get a chance (very, very busy) I will update those benchmarks in dual-core mode.
Does anyone have a 9800M GS we can compare this to? If so, I can run tests very soon, unless no one is interested. -
3870 = 9800M GTS. Period. -
dondadah88 Notebook Nobel Laureate
Can I see that link? I was looking for it on the OCZ forums but couldn't find it.
And yes, the 3870 is equal to the 9800M GTS. -
Well, the 9800M GS and 9800M GTS are the exact same save for the clocks, but I use my card at GTS speeds without issues.
At stock clocks it is still basically the equivalent of an ATi HD 3870, perhaps just a tad lower (the only benchmark of the 3870 I have seen scores 8100ish in 3DMark06 at 1280x1024, and the 9800M GS at stock gets 8500ish, both with a 2.2GHz CPU etc). That's why I assumed both cards would be on the same level. Remember that from GS to GTS is not a big difference in performance.
Why is the 3850 so underpowered compared to the 3870? 7k 3DMarks is rather low... but well, it's on the same level as a 9700M GTS. -
dondadah88 Notebook Nobel Laureate
What do you mean? My single stock score was 84?? and if I installed everything like .NET Framework I am sure I could break 87xx-89xx. Look at my sig.
-
Well, your scores are not exactly that much higher than a 9800M GS at all, and considering the 9800M GS and GTS are the exact same card save 70MHz less core clock and 175MHz less shader clock, I doubt the 3870 would be that much more powerful.
Although I gotta say those scores with Crysis all high at 1920x1200 are rather impressive; an average of 18fps on a single card?? Wow, I gotta try that! I usually get like 25fps on all high at 1366x768, so performance seems similar in that regard, but I wonder how I'd do at 1920x1200.
No matter how I look at it, the 3870 just doesn't seem to be above a 9800M GS at all, at least not by a noticeable margin. That's why I consider them on the same level. -
dondadah88 Notebook Nobel Laureate
Well, the funny thing is that I got lower scores when Crossfire was on.
-
Not only that, the HD 4850 is better than the 9800M GTS and GT. -
BTW, an update with some more benchmarks.
CPU: 2.13GHz overclocked to 2.488GHz with SetFSB
GPU: 530/799/1325MHz overclocked to 836/1019/2019MHz with nTune
SM2: 2762
HDR/SM3: 2671
CPU Score: 2199
Total: 6559 -
Wait, are those 3dMark06 scores? What resolution?
EDIT: My post from the Asus forum G50VT-X5 thread...you broke something.
-
-
Set everything back to stock and run ATItool, just to check if it's toast or not. I know you said you were only getting around 60C at load, which is what first led me to think something was fishy. So the GPU may have been downclocking to save itself.
-
6k for a 9800? Lol, something's wrong there.
Anyway, what's you guys' take on the difference between Nvidia and ATI? I don't know if this is just because my last card was much more underpowered (8600M GT), but I found that it played games OK, except when shaders kicked in. With the old 8600M GT, RA3 ran fine, but if I put shaders on high the game would die! Complete lag fest, jitters everywhere.
With the 3850 it's smooth; there's almost no difference between medium and high. I noticed that the ATI cards tend to have many more stream processors.
The ATI 3850 has something like 320 stream processors, I think.
And the GeForce 9700M GTS has 48. Now that's a huge difference; however, the GeForce is clocked higher.
I wonder, does one of these offer an advantage over the other for shaders? Since the 3850 is 256-bit, perhaps it will handle higher resolutions better.
We also must remember I'm scoring 7k+ in 3DMark06 with only a 2.4GHz Turion processor, which might not be as strong in synthetics as a C2D.
Are there any synthetic benches of a 3850 with a C2D? Cuz I sure haven't seen one; it's a rare breed, but I'm sure it's out there. -
Nvidia's shaders are a lot more powerful than ATI's, which is why fewer are needed.
Generally, I prefer Nvidia over ATi at the moment. The drivers are better, and some applications I use (such as a PS2 emulator) run better with Nvidia cards. I'm flexible though. I'd have a 9800 over a 3850, but I'd have a 3850 over a 9700.
I started off in the ATi camp when it came to PC gaming though. My first real GPU (after having a Voodoo 2) was a Radeon 9600 Pro, and that thing was fantastic. -
dondadah88 Notebook Nobel Laureate
Oh, the shaders for ATI are different from Nvidia's; you have to divide them by 5.
So the 3870 has 320 stream processors. Divide that and you get 64 (9800M GTS status).
It works as one boss shader and 4 slaves (to sum it up). -
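For anyone counting along, the divide-by-5 rule above can be sketched as a couple of lines of Python. The stream-processor counts are the ones quoted in this thread; this is a rough comparability heuristic only, since clocks and architecture differences still matter.

```python
# ATI's R600-era shaders are VLIW5: each advertised "stream processor" count
# is made of 5-wide clusters, so divide by 5 to get units roughly comparable
# to Nvidia's scalar shaders. Counts below are as quoted in the thread.
ati_stream_processors = {"HD 3850": 320, "HD 3870": 320, "HD 4850": 640}

def effective_units(stream_processors, width=5):
    """One 'boss' plus 4 'slaves' per cluster -> divide the count by 5."""
    return stream_processors // width

for card, sp in ati_stream_processors.items():
    print(card, effective_units(sp))  # 3850 -> 64, 3870 -> 64, 4850 -> 128
```

So by this heuristic the 3870's 320 stream processors line up with a 64-shader Nvidia part, which is the 9800M GTS comparison being made above.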
Yeah, Nvidia shaders are stronger; perhaps that's due to the shader clock. I haven't seen any info on shader clocks for ATI cards. -
I get 8534 at 1280x1024 with the laptop in the sig and the processor OC'd to 2.5GHz using the ASUS OC tool (the GPU at stock clocks, using 182.46 DOX drivers).
Judging from that, the 3870 is slightly weaker than, or equivalent to, a 9800M GS.
A 9800M GTS will be in the 9k+ zone (about 10% faster than a 9800M GS).
Note, this would make it more powerful than I had originally taken it for, but not as powerful as you seem to think. I reserve judgment until at least one more benchmark, which I can run later. -
dondadah88 Notebook Nobel Laureate
Sorry monkeymhz, I am looking for it.
Kernal, can you run a game that I have in the sig? I really don't want to pull off my CF cable right now, unless you think you need up-to-date results to solve this issue. -
-
dondadah88 Notebook Nobel Laureate
OK, I guess I have to pull off the CF cord. I am not overclocked right now, so you will only see stock scores from the GPU, and I will post CPU overclocks as well. Give me a little while, maybe about an hour or two.
-
mobius1aic Notebook Deity NBR Reviewer
But quick question: what program did you use to overclock your CPU? I've never really delved into CPU overclocking, but I'm very interested in it. I want to hit 10,000 on 3DMark06! So far I've gotten to 9200 Marks with Nvidia drivers, and I actually lost points using the newest DOX driver when I OCed. -
If possible, can you do a run of 3DMark06 without your CPU OC'd so we can see if the CPU has any drastic impact on the score? Thanks.
I would also be interested in this CPU OC'ing; my CPU runs insanely cool right now: 36-38C idle, 68C under GTAIV load.
Also, does anyone have any idea how much slower the Turion X2 Ultra dual-core ZM series is than the C2Ds?
I hear it's slower, but I ran wPrime 32 and it beat my old T7500 C2D. But maybe that's just for that application. I noticed zip extracting is slower, but I get a smoother Windows experience, and it runs much cooler than my C2D. Really, I don't know which side to take.
I'm just heading to AMD because Bulldozer will eventually be coming out.
Plus it's more affordable; I'd rather have a great CPU and an epic GPU than an epic CPU and a great GPU. -
Using my current overclocks in my sig, I hit 10,019 @ 1366x768 with DOX 182.46 drivers. -
mobius1aic Notebook Deity NBR Reviewer
ATi shaders are in clusters of 5; in proper nomenclature they can be summed up as 5-dimension units. A 3850 has 64 such 5-dimension units. I'm not completely up on my knowledge of ATi cards, or Nvidia cards for that matter, but I think even the ATi Xenos GPU (which has "48 shaders") in the Xbox 360 is based on the predecessor architecture to the X2000 series, as I've seen Xenos documentation listing it as having 48 5-dimension units, which would possibly explain why 48 shaders vs 32 shaders in an 8600GT would be so even in multi-platform games. It would also bring to light some ideas in comparing the Xenos, ATi X2000/X3000, and Nvidia 8000 architectures. We're possibly looking at a better "ideal" comparison of 48 5-dimension shaders vs. 64 5-dimension shaders vs. 96 shaders (Nvidia 8800GTS 320). Multiplatform performance always seemed to place the 8800GTS as twice as powerful as Xenos, and the ATi X2900/3850 as around 1.5x or more powerful than Xenos in multiplatform games, which also possibly explains the lack of equality vs. Nvidia GPUs and their shader architecture.
Based on my observations, these 5-dimension units, although much simpler apiece (per "dimension") than Nvidia's single shader units, seem to be more powerful as 5-unit clusters. But of course then it comes down to total shader counts and clock speeds. Still, it gives us an interesting rough comparison of ATi and Nvidia GPUs, so let's look at it like this (this compares laptop parts):
ATi 2400/3450: 40 shaders/5 = 8 --------> Nvidia Geforce 8400 GS: 8 shaders
ATi 4350: 80 shaders/5 = 16 ------------> Nvidia Geforce 8400 GT/8600GS/9400M: 16 shaders
ATi 2600/3650: 120 shaders/5 = 24 -----> Nvidia Geforce 8600GT/9500GS/9600GT: 32 shaders
ATi 2900/3650/4650: 320 shaders/5 = 64-> Nvidia Geforce 8800GTS/9800GS/9800GT: 64 shaders
ATi 4850: 640 shaders/5 = 128 ----------> Nvidia Geforce GTX 280: 128 shaders
What's keeping Nvidia on top, I suppose, is just the much higher core and shader clock speeds, but it also explains why Nvidia GPUs draw more power compared to their "equivalent" (I use that word very lightly) ATi GPUs. I hope some people get some understanding out of this, but don't misconstrue it as being completely factual in the real world: clock speeds are very important here, then you have memory interfaces and bandwidth to worry about [there is a pretty big difference between the ATi 3850 (256-bit memory interface) and the ATi 4650 (128-bit memory interface)], and then you have the drivers and the games themselves.
ATi cards through the X2000/3000 series were very favored by 3DMark06, probably due to the even higher parallelization of ATi cards' shaders, since 3DMark benchmarks have always been very shader heavy. However, in games Nvidia seems to be favored, simply because games are possibly developed with those cards in mind more than ATi cards; the ATi-favoring games, as few as there are, really do tend to equalize the playing field between the two graphics companies. But like I said, this is more food for thought, not absolute truth in the graphics world. Take and make of it what you will, and ALWAYS ask for advice, especially about particular games, before deciding what kind of graphics setup to look for in a laptop or desktop, unless you're pretty knowledgeable about the stuff. *SIGH* I need some air.
-
mobius1aic Notebook Deity NBR Reviewer
I had an HP with a Turion X2 TL-56 @ 1.8 GHz and it was a good laptop. I really liked it, but it died early on due to my OCing antics, and probably due to the lack of quality HP puts into their machine design, since my wireless card died a year and a quarter into its life and the computer itself at about a year and a half (I'm not trying to dis HP owners, this is just my experience! Though I wouldn't mind having one again either!).
Yes, C2Ds beat the Turion, but at most it seems 15 to 20 percent faster at the same clock speeds; in games this won't matter too much, it's like your GPU and CPU comparison. I did have experience with an HP dv2 series that I got but returned to get my Asus (the HP was a floor model, nice and cheap but it had sound problems, plus I told them that if the Asus went on sale within the return period I'd get the Asus), and I must say that the Turion Ultra ZM-82 performed fantastically. That computer actually had faster start-up than this Asus (but this Asus also has a splash screen) and came out of sleep faster as well. Except for the sound problem I really liked that HP; it was small and compact but still pretty fast. A version of the Asus N81 with a Turion Ultra and ATi 4670 would be right up my alley at a cheaper price point compared to the current Intel C2D/ATi 4650 combination. Everyone rats on AMD Turions, and yes they have worse battery life, but they still perform quite competently, especially being cheaper. However, I must say the Turion Ultras are the ones to get; the RM-70s and below are really weaker than current C2Ds by a pretty large margin, but they are still not bad, just aim for a Turion Ultra. -
Thanks mobius1aic, that clears up a lot.
I have a Turion X2 Ultra dual-core and it's pretty good; it's my very first AMD. I have always been Intel. I think in the next year or two we're gonna see AMD take the lead again, but it will probably go back to Intel over time. In my opinion they are both good.
But yeah, the Turions are interesting. I've noticed a huge change going from an Intel/Nvidia laptop to an AMD/ATI laptop. Some things better, some worse.
AMD/ATI
+ Smoother Desktop Experience
+ Resolution and AA handled faster
+ Worked flawlessly with OpenGL Hardware Acceleration in 3D Modeling Apps
+ HD Videos used less CPU
+ Multitasking was better
+ Ran Cooler
Intel/Nvidia
+ Faster Unzipping/Raw Number Crunching
+ Better Drivers For GPU
+ More Battery Time
Now, from my experience the Intel/Nvidia setup had more battery time, but that also could be because my new laptop has a 3850, which is a power hog. I did undervolt the Turion, which gave me an extra +30 mins, giving me a sweet 1:45-2 hours of battery time in performance/gaming mode.
My old laptop gave me about 2:30-3 hours, but it had an 8600.
Now back on topic. I noticed my 3850 can have up to 2GB of TurboCache. And what I found out yesterday is interesting: I normally run GTAIV with medium textures, and yesterday I decided to put it on high. Surprisingly, the FPS did not change. However, I did notice when driving fast I would reach a point every couple of minutes where the game would pause for a quick 2-3 seconds and then continue smoothly. I'm guessing that's the VRAM and TC swapping, or some overflow going on. But it ran well. I'm wondering, is TC faster than the traditional pagefile? If not, what's the point of advertising it as "TurboCache up to Xgb" or whatever.
Do higher-end cards, e.g. 9800GT/260GTX/280GTX, have TC? I heard TurboCache is for lower-end cards; the 3850 isn't top of the line and I'm sure it could have done fine without TC, but I'm kinda glad TC is there.
For Crysis and FC2 it's the same story (textures cranked to ultra): I can crank textures with no performance impact and no pauses in those games; only GTAIV pauses.
And from anyone's experience, was ATI or GeForce better for OpenGL? That's very important for me. Does one or the other perform better in it? -
mobius1aic Notebook Deity NBR Reviewer
Actually, ATi's version of TurboCache is called HyperMemory.
And as far as the Turions go, the only big problem I had was that, coupled with the GeForce Go 7200 in my HP, it crashed in BF2. I think it was simply the lack of cache in the older Turion design. BF2 loved to screw up and crash on non-infantry-only servers on my old HP, but when I played BF2 on that newer HP (the one I returned for my Asus) with the Turion Ultra I didn't have a hitch; the only problem I had was with graphics, because the Radeon 3200 can't run the game maxed out with AA at a good framerate. I remember in January 2006 my old Gateway laptop with an Athlon 4000, ATi Mob. Radeon X600 with 64 MB dedicated/64 MB shared, and 1 GB of RAM had no crashes related to straight-up game orchestration (although at the time the game still crashed due to glitches no matter how good your computer was). I got my old HP in August of '06; it had a Turion X2 TL-56 @ 1.8 GHz, a GeForce Go 7200 w/ 64 MB dedicated/192 MB shared, and 2 GB DDR2-533, and in servers with vehicles BF2 would crash, I think due to the inability of the Turion to handle the game on one core, as BF2 only runs on a single core. But I think it really came down to the small L2 cache, because I noticed it spiking when I took a look at the task manager.
The Athlon 4000 in my Gateway from 2006 had a 1 MB L2 cache, my HP's Turion X2 TL-56 had 512 KB of L2 cache per core, the desktop I built in 2007 had an Athlon X2 5600 with 1 MB L2 cache per core and it ran BF2 with absolutely no problem, and the HP I had for a week before I got my current Asus had a Turion Ultra with 1 MB L2 cache per core. So thinking about it, it must've been the lack of a larger L2 cache that held back my HP, and/or the weird graphics setup. The GeForce Go 7200, coupled with more video RAM, was actually more powerful than the Mob. Radeon X600 in the Gateway; games like CS:S and CoD2 played better and at higher resolution, but BF2 just loved to crash at high settings or in moments of intense CPU orchestration that the Gateway handled with ease. I don't know why I had to say it, but I always feel like I do, because playing BF2 on infantry-only servers (of which there are very few) gets kinda boring after a while. It would always crash during an artillery strike or air strike.
So that's my only really bad experience with any AMD product... and it wasn't that bad, I just couldn't play BF2 like I hoped to. Source games and Call of Duty 1/2 both played pretty well (CoD2 in DX7 mode of course), and I even got CoD4 playing reasonably too! But I was kinda dumb for getting the HP after only having my Gateway for half a year. I sold it to a friend for the bargain of $300 (it had cost me $1600!) only to get a computer that, gaming experience wise, wasn't that much different, and it cost me $1300. But the idea of a dual-core AMD, what I thought were much better graphics (I didn't really understand ATi's and Nvidia's numbering systems that well at the time), and 2 GB of DDR2-533 vs 1 GB of DDR-333, as well as the very pretty design of the HP, enticed me, so I bought it and sold the Gateway within a couple of days. I did learn to overclock graphics on my HP though, and I have lots of good memories of it, like all of my computers.
Now I'm not so dumb. -
dondadah88 Notebook Nobel Laureate
Sorry for taking so long with the benchmarks.
But first I would like to say that I ran it on a single card and with two cores only. I can only run DMC4. I can't beat or catch up to the guy with the Asus flashed to a 9800M GT because I can't overclock my GPU, but for DMC4 I got:
61.80
47.43
77.30
50.23
If anyone wants screenshots for proof, I can put them up. -
-
dondadah88 Notebook Nobel Laureate
Yeah, your card is pretty strong. I ran it at the same resolution as yours.
I can't really overclock my card right now, but I will try to provide as many benchmarks as I can. -
I wish I had the guts that Tavara does. I'm scared to progress past 617/1550/828 on 1.15V.
Furmark eventually hits 96C at those speeds in Xtreme Burning mode; wondering if I should do some case drilling >=)
EDIT: Best 3DMark06 so far with my OC'd 9800M GS: 10,145 @ 1366x768
-
-
In the C2D series, add 200-250 points for every 200 MHz jump.
(So figure 8200-8300 with the processor at 2.26GHz.)
I ran the test OC'd to match his processor to give a fair comparison.
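That rule of thumb can be sketched in a couple of lines. The 225-points-per-200MHz figure below is just the midpoint of the quoted 200-250 range, an assumption for illustration rather than a measured value.

```python
# Rule of thumb from above: on C2Ds, the 3DMark06 total moves ~200-250
# points per 200 MHz of CPU clock. Use the midpoint (225 per 200 MHz) to
# back out an estimate at a different clock. An approximation only.

def estimate_score(known_score, known_mhz, target_mhz, pts_per_200mhz=225):
    """Linearly extrapolate a 3DMark06 total to a different CPU clock."""
    return known_score + (target_mhz - known_mhz) / 200 * pts_per_200mhz

# 8534 measured at 2.5 GHz -> roughly what at stock 2.26 GHz?
print(round(estimate_score(8534, 2500, 2260)))  # ~8264, in the quoted 8200-8300 range
```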
How strong is the 9800M GS, ATI equivalent?
Discussion in 'Gaming (Software and Graphics Cards)' started by MonkeyMhz, Apr 3, 2009.