Here are some benchmark results for the 8600M GT; I took the liberty of posting them here. They are benchmark results from the UK company Evesham for the Zieo N500-HD.
The main specs of the laptop are as follows:
Intel Core 2 Duo T7500, 2.2 GHz, 800 MHz FSB
2 GB 667 MHz RAM
160 GB 5400 RPM hard drive
512 MB GDDR3 8600M GT
Also 1 GB of Intel Turbo Memory
Vista Home Premium
Here are the results:
Prey
1024x768, 0xFSAA, 0xAF: 58.10 fps
1024x768, 2xFSAA, 4xAF: 39.83 fps
1024x768, 4xFSAA, 8xAF: 34.83 fps
1680x1050, 0xFSAA, 0xAF: 25.53 fps
1680x1050, 2xFSAA, 4xAF: 19.96 fps
1680x1050, 4xFSAA, 8xAF: 17.63 fps
Call of Duty 2
1024x768, 0xFSAA, 0xAF: 32.92 fps
1024x768, 2xFSAA, 4xAF: 27.62 fps
1680x1050, 0xFSAA, 0xAF: 18.35 fps
1680x1050, 2xFSAA, 4xAF: 15.38 fps
Counter-Strike: Source
1024x768, 0xFSAA, 0xAF: 68.98 fps
1024x768, 2xFSAA, 4xAF: 58.67 fps
1024x768, 4xFSAA, 8xAF: 37.74 fps
1680x1050, 0xFSAA, 0xAF: 46.05 fps
1680x1050, 2xFSAA, 4xAF: 31.85 fps
1680x1050, 4xFSAA, 8xAF: 18.12 fps
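Just to put those numbers in perspective, here's some quick arithmetic on the Prey figures above (a sketch, nothing more than the posted data):

```python
# Relative cost of resolution and AA/AF, from the Prey figures above.
prey = [
    ("1024x768",  "0xFSAA, 0xAF", 58.10),
    ("1024x768",  "4xFSAA, 8xAF", 34.83),
    ("1680x1050", "0xFSAA, 0xAF", 25.53),
    ("1680x1050", "4xFSAA, 8xAF", 17.63),
]

baseline = prey[0][2]
for res, aaf, fps in prey:
    print(f"{res} {aaf}: {fps:5.2f} fps ({fps / baseline:.0%} of baseline)")

# 1680x1050 with no AA is already down to ~44% of the 1024x768 baseline,
# so the resolution jump, not the AA, is doing most of the damage here.
```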
Seems that it is definitely (correct me if I'm wrong) sub-par to the 7900GS, but still a good card... However, reading from another thread, the Go 7600 got 53 fps in Prey at 1024x768 (in XP), whereas this was done in Vista (for the 8600)... and I'm pretty sure the drivers still suck for Vista (well, for Nvidia they do, don't they?), so we could see some improvement as the drivers mature...
-
Oh, also: anyone with results for these games in Vista, post your benchmarks if you can, and we'll see how they come up against this.
-
Yes Finally! =P
-
Hrmmm, yeah, those results don't seem very good. I have to think the card is capable of more than that.
If they are accurate though, it seems like if you play at a native resolution of 1680x1050, like on an Asus G1S, you wouldn't be getting anything all that fancy. -
I'll still hold to my theory that it won't be better than a 7900GS... I dunno how much more capable it will be, though. One thing's for sure: it won't be drastically better. Perhaps the fact that it's only 128-bit cripples it? (Again, feel free to set me straight if I'm wrong, lol.)
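On the 128-bit point, here's the rough math (a sketch; the memory clocks below are my assumptions from commonly reported figures, not confirmed specs):

```python
# GPU memory bandwidth = bus width (in bytes) x effective transfer rate.
# The clock numbers here are assumptions, not vendor-confirmed specs.

def mem_bandwidth_gb_s(bus_bits: int, effective_mt_s: int) -> float:
    return bus_bits / 8 * effective_mt_s / 1000

# 8600M GT: 128-bit GDDR3 at ~700 MHz (1400 MT/s effective)
print(mem_bandwidth_gb_s(128, 1400))  # ~22.4 GB/s

# Go 7900GS: 256-bit GDDR3 at ~550 MHz (1100 MT/s effective)
print(mem_bandwidth_gb_s(256, 1100))  # ~35.2 GB/s
```

If those clocks are in the right ballpark, the narrower bus costs the 8600M GT a big chunk of memory bandwidth no matter how strong its shaders are.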
-
Eww, and this is with a 512MB GDDR3 8600GT?
So it might suck a tad more with either the 512MB GDDR2 or the 256MB GDDR3 version? -
How did the 8600GS get higher FPS than the 8600GT in some tests?
-
usapatriot Notebook Nobel Laureate
So far I have been disappointed by the overall result of mobile DX10 cards.
-
Wow... that's pretty bad.
It's strange that at 1680x1050, 4xAA & 8xAF, Counter-Strike: Source and Prey have the same framerate (~18). How does that work out? Prey is a much more graphically demanding game than CS: Source (and 18 fps in CS at those settings seems ridiculously low, even for a laptop card). Hopefully it's just a Vista driver issue and performance will improve with time... -
I knew it was coming, but I still want to see a good mid-range card for DX10. I think the 8400GT will be really good, though. Actually, these scores are ridiculous; the Go 7400 scores better in CS: Source at those settings. I'm fairly positive these were tested using those bad Nvidia Vista release drivers.
This is a quote from the article:
"First, considering the new Core 2 Duo's have an 800MHz FSB the memory being clocked at 667MHz is a potential bottleneck"
Can someone explain why this would be a bottleneck?!? (I'm like 99% sure it isn't.)
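Here's the back-of-the-envelope version of why I'm 99% sure (assuming 64-bit buses and dual-channel DDR2-667, which is how these Santa Rosa machines usually ship; the dual-channel part is my assumption):

```python
# Compare CPU front-side-bus bandwidth to main memory bandwidth.
FSB_MT_S = 800       # 200 MHz bus, quad-pumped
DDR2_MT_S = 667      # DDR2-667, double data rate
BUS_BYTES = 8        # 64-bit data path on both
CHANNELS = 2         # dual-channel memory (assumed: paired SO-DIMMs)

fsb_bw = FSB_MT_S * BUS_BYTES / 1000               # GB/s, CPU <-> chipset
ram_bw = DDR2_MT_S * BUS_BYTES * CHANNELS / 1000   # GB/s, chipset <-> RAM

print(f"FSB: {fsb_bw:.1f} GB/s, RAM: {ram_bw:.1f} GB/s")
# FSB: 6.4 GB/s, RAM: 10.7 GB/s -> the RAM side already outruns the FSB,
# so DDR2-667 behind an 800 MHz FSB is not a bandwidth bottleneck.
```
-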
Reviews are still spouting crap. They should really learn the intricacies of these platforms before jumping to conclusions.
The 8600M-GT won't be as good as the 7900GS, but we can hope it'll also consume much less power.
GDDR3 is based off of DDR2, not DDR3.
I don't know what the situation is with the mobile drivers, but on the desktop, performance isn't too bad in Vista - somewhere in the range of 5-10% slower than XP for most games. -
Sneaky_Chopsticks Notebook Deity
Wow, these graphics cards can play games, but with low framerates.
-
I don't see why everyone is so disappointed; the 8600GT is a very good card for a 14- or 15-inch laptop. If those cards were for a 17-inch laptop then I could see the point, but as it is I don't see why everyone is whining about the results.
-
Well, pc pulsar... funny you should say that, because this particular model was a 17-inch laptop, lol...
-
Hey, you all depressed me. Can this possibly be because of the drivers, or something else? Hope it is, cos if not...
-
Same, lol. I hope it's the drivers. Also, someone needs to benchmark this card using Windows XP. -
Why can't they just leave all the pipelines unlocked? Such a waste.
-
And nothing, if I'm right; just that the FSB:DRAM ratio will be at 5:4.
-
ViciousXUSMC Master Viking NBR Reviewer
From another thread; I'll repost it since it's directly relevant here.
So take that to heart, guys: just because these are the new cards doesn't mean they will play all your old games better than your old card. These are DX10 cards, made for the future, not the past.
So it's kinda early to jump on the DX10 bandwagon. Unfortunately, not every average Joe knows this stuff, so most of the new high-end notebooks will be coming with the DX10 cards to promote sales, when in reality most people would be much better off with a DX9 card. Oh, and yet another side note: the DX10 cards have newer HD video decoding. If you're an HD fanatic you may like this, as it's much easier on your CPU.
I really like the new G series from Asus; they come with the 8600GT and I'd be OK with that, but the C90 may be better, as I can at least put a DX9 card in there and let my 8600GT collect dust for now until DX10 is mainstream. -
It's actually quite the opposite: the first generation of DX10 cards actually perform better in DX9 than in DX10 (although that's likely more to do with immature DX10 drivers). -
What you should be comparing: the Go 7600GT/7700 against the 8600M GT.
With this setup, crank up the AA & AF and you'll see gains from the 8 series; with no AA & AF it will only be on par with the 7 series. This might be a result of the new board, since it is shader-oriented. Looks like what we had last time with the 7900 vs. X1900: if a game is shader-heavy, the ATI wins; if not, the Nvidia... -
ViciousXUSMC Master Viking NBR Reviewer
I'll call you out on this. It's been said many times that DX10 cards are a whole new ball game: gone are pixel pipelines, vertex shaders, and all that stuff, replaced by the new stream processors.
They have to "emulate" DX9 to run that content, as the cards don't process data the same way at all. Why do you think there have been all these driver nightmares for even the DX9 games, and not just DX10?
Let me post some resources for you to read up on.
http://news.softpedia.com/news/DirectX-10-and-so-it-ends-7762.shtml
Here is one by catweazle; he and I are like gurus on this stuff (heck, in fact I am one of the top guys at guru3d.com, a huge computer enthusiast site), but that's beside the point. Catweazle runs a computer knowledge database; it's his job to know this stuff, and this link goes to his post:
http://www.daniweb.com/blogs/entry353.html
Don't forget things like OpenGL not even existing in Vista anymore; that has to be emulated with a software layer as well.
If you prove me wrong, that's awesome: I just learned something and feel a bit smarter. If not, then I am glad I could teach you something. -
On the software side of things: while it was rumored a while back that OpenGL would have to be emulated through DirectX (with a huge performance cost), this has changed (see the Digg article which links to the MSDN development blog). The third option, "Windows Vista ICDs", is essentially native OpenGL support in Vista.
You are right in saying that DX10 isn't backward-compatible with DX9, but Vista sidesteps this issue and does indeed natively support DirectX 9 through a separate API called Direct3D 9Ex (formerly called DirectX 9.0L). Since Vista includes both the DX9Ex and DX10 APIs, there is no performance hit associated with running DX9 applications on Vista. In fact, Vista's Aero Glass GUI runs off of DX9 (which is why Vista requires a DX9-compatible card in its system requirements). -
ViciousXUSMC Master Viking NBR Reviewer
Just seems like common sense to me that the 3-6% performance drop we see in Vista is due to the stuff you just mentioned. Everything is done via software layers of some sort now rather than directly; I call it "emulation", with quotes, for a reason.
All of this doesn't change my original point, though, that DX10 is supposed to offer the same quality visuals with lower demands on system power.
So it's very possible that a slightly lower-scoring DX10 card can outperform a better DX9 card once we see some mainline DX10 games that are properly coded and the drivers have matured.
The question, though, is whether game developers will actually do things right and make a DX10 game run 20% better in DX10 than its DX9 version, or just code it 20% worse so it runs the same and they can be lazy.
That trend is awfully common: making us suffer with horribly coded games just because we have powerful hardware. -
Well, does anyone have these games and is playing them in Vista? If so, please do post your results if you can: your performance (fps) and also what GPU you're using...
Cheers -
It's funny how these guys compare an 8600GT with a 7900GS that has twice the memory bandwidth.
-
Hmmm... STALKER comes to mind, lol...
I agree with a lot of people that these DirectX 10 cards will be a lot better as the drivers mature.
Also, could the rig used for the benchmarks posted in the original post be bottlenecking? It uses the slower 5400 RPM hard drive, which should definitely slow performance, and only 667 MHz RAM. I have just ordered one of these laptops; it will have the faster hard drive, and I'm getting 800 MHz RAM to put in it (let's see if Santa Rosa can use it?!?), so I should definitely see better performance. -
5400 rpm won't make a difference except in loading times.
-
SymphonyX: they are comparing it to the Go 7900 mobile version; does that make a difference?
-
You don't make sense. You should rephrase that.
And duh!!! They should compare it to the mobile version of the 7900GS! It doesn't make sense to compare a desktop 7900GS to a GeForce Go 8600GT; that would spread the comparison benchmarks even farther apart. -
make sense?
-
The GeForce Go 7900GS is in a different league and has a different target market compared to mid-range cards like the 8600, 7600, X1600, etc. They should compare the 8600GT to the 7600GT, or even the 6600 Ultra if they're wondering how much performance has increased since then. And I'm referring to the mobile versions of the cards. BTW, it's also hard to find the GF Go 7600GT and 6600 Ultra. They didn't make a GF Go 6600GT, only the 6600 or 6600 Ultra.
-
Totally agree.
-
@ Chuck232
It's partially true with the newer games, since the newer cards have new features. Take, for example, the GeForce Go 7600: if you look at its 3DMark03 score, it isn't really too far from the GF Go 6600. In 3DMark05, however, the performance difference is much wider, bordering on the performance set by the previous high end, the GF Go 6800.
GF Go 7600
GF Go 6800
Notice how in 3DMark05, the GF Go 7600 is nearing the performance of the GF Go 6800 while in 3DMark03, it gets slaughtered. -
Please show me how the 8600GTS is so immature, even on the desktop. (And no, don't show me DX10 benches. That's just a quagmire right now.) The 8600/8500 series are memory bandwidth limited. New drivers will not help that substantially. Shader power is there - there's just no bandwidth to support it. Case in point: Oblivion + STALKER.
DX10 fully supports DX9 in the hardware. You go on a few posts later to pull out two articles from 2005. Good for you. That was when the DX10 spec wasn't even fully available. Unified shaders mean just that - shaders which are unified. They can now serve the purpose of any of vertex, pixel or geometry. They are mapped to each function as necessary, in the hardware. To say that DX10 cards are not compatible (in the hardware) with DX9 is either a complete, blatant, lie or misinformation.
EDIT: In fact, I'll add something more. One of the biggest features of these DX10-supporting cards, unified shaders, was predicted to bring performance enhancements to DX9 content as well, not the other way around as you have argued. With on-the-fly reprogramming of shaders, it is quite possible to increase performance in DX9 games over fixed-shader hardware. The idea is you have no unused resources: if a 20:12 pixel:vertex shader combo is best for that scene, then that'll outperform a fixed hardware implementation that only has, say, 8 vertex shaders available but 36 pixel shaders, many of which would not add to performance. (Assuming all else equal, and that's a pretty primitive example.)
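To make that concrete, here's a toy model (my own sketch, nothing like a real scheduler) of why a unified pool can beat a fixed pixel/vertex split when the scene's mix doesn't match the hardware:

```python
# Toy throughput model. The 36:8 fixed split and the 20:12 per-frame
# demand come from the example above; the unified total of 32 is mine.

def fixed_rate(pixel_units, vertex_units, demand_px, demand_vx):
    # Each pool only serves its own job type, so the scarcer pool limits you.
    return min(pixel_units / demand_px, vertex_units / demand_vx)

def unified_rate(total_units, demand_px, demand_vx):
    # Any unit can serve any job, so only the total matters.
    return total_units / (demand_px + demand_vx)

px, vx = 20, 12  # scene wants a 20:12 pixel:vertex mix per frame

print(fixed_rate(36, 8, px, vx))   # ~0.67 -> the 8 vertex units choke it
print(unified_rate(32, px, vx))    # 1.00 -> fewer total units, more frames
```
-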
Zepto did some benchmarks on their 6224 with an 8600M GT 512MB (GDDR2):
Counter-Strike: Source
1024x768, 4xFSAA, 8xAF: 109.00 fps
1280x800, 4xFSAA, 8xAF: 88.81 fps
1440x900, 4xFSAA, 8xAF: 72.13 fps -
wave said: "Zepto did some benchmarks on their 6224 with an 8600M GT 512MB (GDDR2)... looks a lot better... maybe new drivers do help."
Wonder how the 256MB GDDR3 would perform. -
wave said: "Zepto did some benchmarks on their 6224 with an 8600M GT 512MB (GDDR2)... looks a lot better... maybe new drivers do help."
Thanks heaps.
Petrov. -
The source is this:
http://www.forumdeluxx.de/forum/showthread.php?t=360222&page=2
Post 58 has the CS:Source info (you need to scroll down a lot).
Zepto also posted some overclocking scores for 3DMark06 in posts 60 and 66.
Those CS scores seem TOO good. I think it's more of a funny lie or something. How can drivers give a boost from 18 fps (the first post here) to 70 (post 58 there)??
Utopia?
wave said: "The source is this: http://www.forumdeluxx.de/forum/showthread.php?t=360222&page=2 (post 58 has the CS:Source info; Zepto also posted some overclocking scores for 3DMark06 in posts 60 and 66)."
Thanks. Sad that I can't access that forum from work. Any chance you can tell me what the 3DMark scores were for the 6224 (and was it overclocked, and at what resolution)?
Just trying to get a sense of whether GDDR3 vs. GDDR2 is a big deal or not.
Thanks,
Peter. -
The 6224 gets around 3800 in 3DMark06 not overclocked, but has been overclocked to 4500.
-
Petrov said: "Thanks. Sad that I can't access that forum from work. Any chance you can tell me what the 3DMark scores were for the 6224 (and was it overclocked, and at what resolution)? Just trying to get a sense of whether GDDR3 vs. GDDR2 is a big deal or not."
That is the best score that Zepto posted. He said it was the highest he could push it, or close to that. This is with a prerelease 6625WD and a beta BIOS. He keeps saying that, so I will say it too. No idea how much of a difference a final BIOS will make. -
B4TCH said: "The 6224 gets around 3800 in 3DMark06 not overclocked, but has been overclocked to 4500."
Thanks very much for that. Am I right in thinking it's still lagging the G1S by a good 700-900 points?
Petrov. -