Welcome
A couple of months ago I posted a comparison between the GF7600 and X1600. Now it's time to check out what their younger brothers, the X1700 and GF7700, have to offer. In this test we again have two Asus laptops:
Asus G1 with Core 2 Duo 2,16GHz + 2GB DDR2-667 + GF7700
Asus A8JP with Core 2 Duo 2GHz + 2GB DDR2-667 + X1700
The G1 has a slightly better CPU, but the difference is small, so I think we can treat both machines as equal.
Stock clocks are 450/400 (core/mem) for the GF7700 and 475/400 for the X1700. For nVidia I used the ExtremeG MobileForce 93.71 G3 drivers, and for ATI the newest (20.12.2006) drivers from the Asus website.
Like last time, we will test both laptops only with games, no 3DMark bull...it. So we have:
F.E.A.R
Far Cry
Prey
Company Of Heroes
Now, take a look at the scores.
With the G1 I could set a 1280x1024 resolution; with the A8JP the max was 1440x900, but it's effectively the same pixel count (1280x1024 = 1.31 Mpix, while 1440x900 = 1.30 Mpix).
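A quick sanity check of that pixel-count math (a minimal Python sketch, nothing more than the arithmetic from the line above):

```python
# Pixel counts of the two test resolutions, in megapixels.
g1 = 1280 * 1024 / 1e6    # G1 at 1280x1024
a8jp = 1440 * 900 / 1e6   # A8JP at 1440x900
print(f"G1: {g1:.2f} Mpix, A8JP: {a8jp:.2f} Mpix, ratio: {g1 / a8jp:.3f}")
```

The two panels differ by barely 1% in total pixels, so the fps numbers below are directly comparable.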
F.E.A.R (1280x1024, all set to max, soft shadows off, AA x2, AF x8)
GF7700 : 24 fps
X1700 : 21 fps
Winner : GF, about 14% faster.
FAR CRY (1280x1024, ultra details, AA x4, AF x8)
GF7700 : 26.5 fps
X1700 : 26 fps
Winner : Tie, no more no less.
PREY (1280x1024, highest details, AA x4, AF x8)
GF7700 : 23 fps
X1700 : 20 fps
Winner : GF, about 15% faster than ATI. No big deal, but still.
COMPANY OF HEROES (1280x768, all set to High, shadows + trees at medium, model details at 2/3)
GF7700 : 20 fps
X1700 : 20 fps
Winner : Tie.
CONCLUSION : as in the GF7600 vs X1600 case, the GF and Radeon are very similar cards in terms of performance, but this time nVidia has a little more power. BUT it's not the 30-40% some of you guys think (nVidia fanboys, do you hear me?!), it's about 10%, maybe 15% at best. I think we can make this thread sticky for a while; Chaz, are you there?
That's all, so merry Christmas & happy New Year (even to you, nV and ATI fanboys).
cheers!
-
Notebook Solutions Company Representative NBR Reviewer
Thanks for that! The other thread was very successful. But can you test Oblivion too, please?
Charlie -
Nice thread, I'll link people to it when they want to know which card is better.
Could you run 3DMark06 too, just so we can see how accurate that benchmark is? -
Sorry, no Oblivion available. Will run the 3DMarks in a moment.
edit.
3DMark05 :
GF7700 : 4250
X1700 : 4126
3DMark06 :
GF7700 : 2380
X1700 : 2109 -
Charles P. Jefferies Lead Moderator Super Moderator
Very informative post mastha, thanks for taking the time to compile that.
-
3DMark06 :
GF7700 : 2700
X1700 : 1850
I guess I'll go with your numbers since I know how you tested the cards.
edit
And I guess there's probably a 10% difference between the 7700/x1700 and the 7600/x1600, so you could probably increase the AA one notch with the 7700/x1700 and get the same FPS as you would with the 7600/x1600.
This is pretty important for people who are deciding whether or not it's worth spending ~$200 more for one extra notch of AA. It doesn't seem worth it. -
Yeah, those numbers are not accurate. There are different revisions of 3DMark06 that no one takes into account. Thank you mastha for setting the record straight again. =D
-
Exactly what I expected....
Not to be a pain, but if you could pick one or two of those games and adjust the settings to get FPS that most people consider 'playable' (50-60), then use the same settings on both cards again, we would get another important data point. -
-
ltcommander_data Notebook Deity
Thanks mastha212. Timely and insightful as always.
It's too bad ASUS doesn't use GDDR3, since with DDR2 at 400MHz both GPUs are memory-bandwidth bound, especially the 12-pipeline Go 7700. It must be a cost issue. Based on the V1Jp review on the front page, even that X1700 is mated with DDR2 memory at 396MHz. I wonder if anyone has a W1Jc or W2Jp with an X1700, to see whether those use GDDR3. -
I should add that the 3DMark06 scores aren't comparable unless you took steps to correct what I think is Futuremark's biggest screw-up yet (aside from perhaps making the CPU an integral part of a graphics card benchmark): 06 defaults to 1280x1024, so if your monitor can't support that, it starts scaling back. So basically, the G1 was running at 1280x1024 while the A8Jp was running at (I believe) 1280x854 - a sizable difference.
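To put that scaling difference into numbers (a small Python sketch; the 854-line vertical resolution is the poster's estimate above, not a confirmed figure):

```python
full = 1280 * 1024    # what the G1 rendered in 3DMark06
scaled = 1280 * 854   # what the A8Jp likely rendered after scaling back
print(f"The G1 pushed {full / scaled:.0%} as many pixels as the A8Jp")
```

Roughly a 20% difference in rendered pixels, which is why the 06 scores from the two machines aren't directly comparable.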
-
-
John Ratsey Moderately inquisitive Super Moderator
I've spent part of my day experimenting with different clock speeds for the X1700 in my Samsung X60plus. The default clock settings are 405/446.5 (core/mem), so Samsung is being conservative on the core clock speed but has the memory set relatively fast. Using ATI Tray Tools, I got the core speed up to 533 before artifacts started to appear, and the memory up to about 550 before the screen went blank.
Using Omega drivers 3.8.291, 3DMark06 gave the following results:
405/446.5 (stock) : 1813
449/450 : 1905 (+5%)
476/400 : 1947 (+7%)
476/500 : 2024 (+12%)
500/500 : 2077 (+15%)
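The percentage gains above can be reproduced from the raw scores (a quick Python sketch; the figures are John's as posted):

```python
# 3DMark06 scores at different X1700 clock settings (core/mem -> score).
stock = 1813  # score at the stock 405/446.5 clocks
runs = {"449/450": 1905, "476/400": 1947, "476/500": 2024, "500/500": 2077}
for clocks, score in runs.items():
    gain = (score - stock) / stock * 100
    print(f"{clocks}: {score} (+{gain:.0f}%)")
```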
Increasing the core speed increased the peak power consumption, from 59W at the mains socket with the stock settings to 63W at 500/500. However, I should point out that I have the T7200 CPU undervolted to 1.0V, which reduces the peak CPU power consumption by about 9W, so the increase in GPU power is within the overall design of the cooling system.
Research now over, I'll be leaving the settings at the defaults, but it does appear that there is some scope for tweaking the X1700.
John -
ltcommander_data Notebook Deity
John, what are the specs of your X60plus?
A stock W3J with a C2D T7200 and the X1600 gets around 2200 in 3DMark06.
http://forum.notebookreview.com/showpost.php?p=1721046&postcount=25
I believe the stock clocks are 450MHz/450MHz so it's strange that your X1700 at 500/500 is slower. -
And I sincerely hope when you say as your "primary screen" you mean only screen - if you were using Dualview or whatever ATi's equivalent is, then a lot of games take huge performance hits.
Also, I notice a distinct lack of any benchmarks of games that can take advantage of larger amounts of video memory - try BF2142, for example. Also, the Doom 3 timedemo is probably a better pure-OpenGL benchmark than Prey. You might also consider benchmarking some things (Doom 3 is also good for this) on Linux. You're using Direct3D benchmarks on Windows - admittedly the most common use of a GPU, but it only tells half the story. Anyone who thinks ATi's OpenGL support or Linux/OSX/BSD drivers are anywhere near nVidia's is just plain wrong.
Also, has anyone confirmed the number of pixel pipelines in each card? The news post about it here at NBR said the x1700 has 12 pipes, and I've heard the Go7700 has 16...which would make sense given ATI's MR x1800 has 16 and nVidia's Go 7800/7900 series have either 20 or 24 (depending on the model). How about vertex shaders? It's a shame to hear ASUS is using GDDR2 memory though... -
ltcommander_data Notebook Deity
Well, ASUS is not using GDDR2. They appear to be using regular DDR2. GDDR2 never really caught on after nVidia's attempt. If the memory is clocked above 400MHz then it's almost certainly GDDR3, if it's 400MHz or under, it's almost certainly DDR2 to save cost.
In terms of the configuration: the X1600 and X1700 are architecturally identical. The layout is 4 pipelines arranged as 4 TMUs with 12 PS and 5 VS. The only difference is that the X1700 is produced on a 90nm strained silicon process. The Go 7600 is configured as 8 pipelines with 8 TMUs, 8 PS, and 5 VS. The Go 7700 is configured as 12 pipelines with 12 TMUs, 12 PS, and 5 VS. Obviously, the Go 7700 should be a lot faster, like the desktop 7600GT is over the X1600XT, but the Go 7700 is limited by memory bandwidth, especially when DDR2 is used, along with whatever "mobilization" was done.
The MR X1800 is 12 pipelines, 12 TMUs, 12 PS, and 8 VS. The MR X1800XT is 16 pipelines, 16 TMUs, 16 PS, and 8 VS. nVidia's high-end mobile have 20/24 pipelines (1:1 TMUS ratio) as you mentioned and 8 VS.
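A quick way to eyeball those configurations side by side (a Python sketch; the figures are as stated in this post, not independently verified):

```python
# (pipelines, TMUs, pixel shaders, vertex shaders) per GPU, per the post above.
configs = {
    "X1600/X1700": (4, 4, 12, 5),
    "Go 7600":     (8, 8, 8, 5),
    "Go 7700":     (12, 12, 12, 5),
    "MR X1800":    (12, 12, 12, 8),
    "MR X1800XT":  (16, 16, 16, 8),
}
for gpu, (pipes, tmus, ps, vs) in configs.items():
    print(f"{gpu}: {pipes} pipes, {tmus} TMUs, {ps} PS, {vs} VS")
```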
-
-
Thank you so much for setting the record straight, sincerely.
I knew these cards were on par (as close to par as you can get), and I knew the rumours about the 7700 killing the x1700 were false, because the memory of the two cards tested the other time was different. -
3DMark is the last thing I would use to compare GPUs.
I have a Go 7600 with 128MB VRAM and it scores 1850 in 3DMark06; with a bit of overclocking it reaches 2050, equivalent to the X1700.
freaks -
Thanks for posting your findings, mastha. It's great to have these comparisons to use as a reference.
Much appreciated. -
LOL, after seeing the 3DMark06 scores, I have to agree. My underclocked 7600 gets 2300, which puts it on par with the 7700. 3DMark needs to be more accurate. -
My 7700 scored 2700 in 3DMark, and overclocked it got 3100.
So I think the 3DMark06 test was recently changed.
I assume if you tested your 7600 again with the new test it would score lower.
I don't think comparing these scores to scores made a month ago is going to be very useful.
I don't recommend actually using 3DMark anymore, lol; I think it's actually a waste of time. If the test has actually changed, how can you compare these scores to the ones made a month ago?
Thanks, mastha, for your data. But all of the games are at settings where the ATI card would catch up.
With AA and AF turned off, the nVidia card would have much higher fps.
If you want to believe an ATI card can run the same games, you can change the settings to its advantage. If you wanted to present data showing these cards had similar fps, you would pick settings just like you did: settings that give them both low fps, with AA and AF enabled.
So a real comparison would show several different settings. Maybe someone will do that.
Anyhow, it's more data than I had a minute ago, so it's useful. Thanks. -
ltcommander_data Notebook Deity
-
Stamar, but tell me: who turns off AA and AF when games can be played with both turned on?! The GF probably has more raw power, but I think we should compare these cards with the eye candy and effects enabled.
-
Well, I do, I turn those things off. I use high details, no AA, no AF, and I play at 1440x900. My fps are more in the 50s for most recent games, including Oblivion.
Your tests reflect how you play games.
I couldn't tell you how everyone does their settings. But if you have 6x AA on in all of your games, you might like ATI cards more. You learn to change your settings over time for how your hardware performs best....
The 3DMark06 scoring was changed in the last month. It is probably MORE accurate now than it was when I used it.
It subtracted 350 points from the nVidia scores and added 300 points to the ATI scores.
That difference is huge, lol; it changes 1900 to 2200 and 2700 to 2350.
I'm not going to take the test again; I'm going to assume that if I took it again I'd get the same score as the G1. -
On turning off AA I agree, but it's better to lose a few FPS than to drop AF; the difference is huge...
-
One thing that's different about this comparison is that, so far as I know, the 7700 and x1700 are still only found in notebooks.
And for the most part they are only in Asus notebooks; there's also an x1700 in a Samsung notebook.
So there are almost no hardware comparisons being made. Someone who actually has two Asus notebooks to test, like mastha, is rare. -
I think, to be fair, you should compare the G1 and G2...
If you want to test it accurately,
set 3DMark05 and 3DMark06 to MAX resolution and MAX detail (all candies on).. -
Um, well, the x1700 in the G2 is different from the x1700 that's in the A8JP, V1JP, F3JP, and W2JP.
It's a 512MB dedicated x1700, and it is overclocked, but just a little.
You could compare those two, though; they are actually about the same speed.
But the average NBR consumer isn't going to be too interested in the G2; there aren't enough owners of that machine. -
Here are the scores from my G2... all I can say is that I'm not really pleased with Asus putting the x1700 in this 'gaming machine'. I'd say the 15'' G1 performs better.
3DMark05 - 4500 pts
3DMark06 - 2316 pts
Stock drivers, no OC; I ran it again today. I get different scores, +/- 50 pts, each time I run 3DMark.
@stamar
You said it comes a bit overclocked... do you think there is room to OC a little more? If so, this would be my first OC; do you think ATITool is enough? What would be a safe limit? Thanks, I'd appreciate some advice. -
I don't own this hardware, so I'm the wrong person to ask.
I don't think the RAM is going to go above 500; I think you get the core up to the RAM speed and that's it.
I don't predict you can overclock your card very much.
Here is a benchmark showing games on the Go 7700 and the x1700 in the G2.
Much bigger difference than mastha's comparison.
http://www.anandtech.com/mobile/showdoc.aspx?i=2899&p=1
It also shows the stock 3DMark06 scores that I got. I think this is just a quirk in 3DMark06 that sets the vertical resolution to 1024, or as much as you have; in the case of the A8Js it only goes to 900, but on the G1 it goes to 1024. -
Yes, I just read the article, and all I can feel is a sour taste... I should've waited a little bit longer. Thanks.
-
ltcommander_data Notebook Deity
-
Well, the only thing I'm really annoyed about (actually I think it's just in my mind, because paradoxically I'm not) is that they announced it as the gaming machine, so it should be the best Asus laptop for gaming... whereas 'the choice of graphics chip definitely makes the G2P unworthy of the "gaming laptop"' label...
-
Lol, I wouldn't be annoyed, man. The G2 is just a G1 with a different GPU, a bigger screen, and better build quality. It's not like the extra $100 was a major rip-off; you're still getting a killer price/performance ratio (there's no way they'd put an x1800/7900 in there and keep it in the same price range).
Go install Catalyst 6.12 right now. Download ATITool and run "find max core" and "find max memory", then run 3DMark05/06. If you don't score 5200-5400 in 3DMark05 and just under 3000 in 3DMark06, then I will be truly disappointed. 4500 on stock drivers/stock speeds is NOTHING to be disappointed about; that's better than I expected, considering the x1700 = x1600 + HD support, and the x1600 only gets around 4000.
edit: but I still prefer the G1 over the G2, just because the 7700 is a better card (albeit not by much) and because I hate 17" screens, unless of course they come with an x1800/7900 series card. But the G2 isn't as bad as you say. -
Can you make comparisons against the 7600 vs x1600 thread? (I'm a bit too lazy to round up the numbers.)
How much more fps could you get going from a 7600 to a 7700, or from an x1600 to an x1700? Or could you increase the AA by one increment? -
Well, you are one of the few people who own a G2.
Everyone owns a G1, but yours is rare. It looks very nice. Isn't the LCD lid aluminum? It appears to be a nicer chassis than the G1's. I mean, it doesn't compare well to it in some ways, but it's still a very good computer.
Complete a review for this site and they will pay you $40-60.
Many people are interested in benchmarks of your machine. Download Everest and tell us the LCD model.
Most people don't even know you have a different version of the x1700 than the one in the other machines. Your scores are better than they think; most people think your machine is actually slower. -
Anandtech compares the x1700 (Asus G2P) and the 7700 (Asus A8JS).
http://www.anandtech.com/mobile/showdoc.aspx?i=2899&p=13 -
I guess a 7200rpm drive really makes a difference...
If only there were an X1700 in an Acer TM 8210 with a 7200rpm HDD...
Charles P. Jefferies Lead Moderator Super Moderator
The hard drive has no impact on in-game performance; this is from the GPU guide:
-
beefdonkey Notebook Enthusiast NBR Reviewer
I have an A8JP w/ x1700 stock drivers
Core 473 MHz / Memory 396 MHz
3DMark05 - 4205
3DMark06 - 2153 (1280x768 by default)
3DMark06 - 2376 (1024x768)
Also, how did you OC the x1700? I tried to install the Omega drivers and it won't let me. -
Hi, I am the owner of an Asus F3Jp notebook: x1700, dual-core 2GHz Core 2 Duo, 2048MB RAM. In your report you tested the game Prey on the notebook with the x1700.
I bought this game today, but I can't play it.
Instead of running, the program shows me a window telling me that the combination of graphics adapter and driver is not compatible.
Which driver version did you use to run Prey?
I currently have Catalyst 8.33 installed and have patched Prey to version 1.3, but no results.
Help me, please.
Regards, Tekky
My English isn't very good, sorry for that^^
-
deleted---ancient thread.
GF7700 vs X1700 !!! Real-life games benches !!!
Discussion in 'Gaming (Software and Graphics Cards)' started by mastha212, Dec 25, 2006.