The 128-bit bus does actually make some sense, because most of the laptops these 8600M GTs will be put in have fairly low resolutions anyway. Most 15.4" panels are 1280x800 or 1440x900, so the cards will not be reaching for textures large enough to need a wider bus. These really shouldn't be in 1920x1200 resolution screens; those will need "8800M" style cards, and THOSE can utilize larger bus sizes.
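To put rough numbers on the resolution side of that argument, here is a quick back-of-the-envelope pixel-count comparison (purely illustrative, not tied to any specific panel):

```python
# Quick pixel-count comparison for common panel resolutions (illustrative only).
resolutions = {
    '1280x800 (15.4" WXGA)': (1280, 800),
    '1440x900 (15.4" WXGA+)': (1440, 900),
    '1680x1050 (WSXGA+)': (1680, 1050),
    '1920x1200 (WUXGA)': (1920, 1200),
}

base = 1280 * 800
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the 1280x800 workload")
```

A 1920x1200 panel pushes roughly 2.25x the pixels of a 1280x800 one, which is why the higher-end cards and wider buses make more sense at those resolutions.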
When exactly did bus size become a big deal with people?
-
-
Well, the GeForce Go 7900 range comes with a 256-bit data bus.
-
-
Meaker@Sager Company Representative
In the end, all that matters is total memory bandwidth; we could see GDDR4 versions to help a bit. Memory bandwidth has gone up more slowly than raw horsepower, but I guess simple PCBs save more than cheaper mem *shrugs*.
TBH my opinion of the 8600 series is underwhelming horsepower supported by even more underwhelming mem bandwidth. -
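If you want to put rough numbers on the bandwidth point, peak memory bandwidth is just bus width times effective data rate. A minimal sketch; the clocks below are only illustrative, since actual 8600M GT memory clocks vary by vendor and memory type:

```python
def memory_bandwidth_gb_s(bus_width_bits, effective_data_rate_mhz):
    """Peak theoretical bandwidth: bus width in bytes times effective data rate."""
    return (bus_width_bits / 8) * effective_data_rate_mhz * 1e6 / 1e9

# Clocks below are illustrative only; real 8600M GT memory clocks vary by vendor.
print(memory_bandwidth_gb_s(128, 800))    # 128-bit DDR2 at 400 MHz (800 MT/s)   -> 12.8 GB/s
print(memory_bandwidth_gb_s(128, 1400))   # 128-bit GDDR3 at 700 MHz (1400 MT/s) -> 22.4 GB/s
print(memory_bandwidth_gb_s(256, 1400))   # 256-bit bus at the same data rate    -> 44.8 GB/s
```

Same bus width, faster memory gets you most of the way; doubling the bus width at the same clock doubles the peak figure again, which is where the 256-bit cards pull ahead.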
I may be repeating myself, but does anyone know the impact of DDR2 graphics memory on 8600GT performance?
-
-
masterchef341 The guy from The Notebook
and....
The Go 7900 range isn't in 15" laptops.
15" panels have similar DPI to larger monitors, so they end up with comparatively low resolutions (1440x900 or less - still more res than 720p HD).
Most people running the 7900s are also on 1920x1200 monitors, or at LEAST 1680x1050.
That's more where the need for that 256-bit bus comes in. -
-
I think everyone keeps looking at the GDDR2 8600GT as a step down from the G1S. It is in fact the other way around: the bulk of 8600GS and GT cards seem to be GDDR2, with the Asus G1S and C90 models the exception. The Dell and Sager 8600GTs bench 600-800 points higher than the GS cards do, so I guess I just don't see the "crippled" aspect that everyone is talking about. The G1S is a step up over the mainstream 8600GT; you may see 5-10 fps more in performance, but you do lose some battery life with it, and in the end it will go extinct right around the same time as the rest of the 8600GTs.
And in the end, what do you do about it? One of two things: buy the Asus GDDR3 version or wait for the next generation, where maybe they all finally go to GDDR3. It's a mystery why they haven't yet, so who knows when they will all get the hint. -
masterchef341 The guy from The Notebook
Yeah, you are right. The performance difference is minimal, probably about 10%. It sounds like a lot because it's 10%, but at 30 fps, 10% = 3 frames per second, and at 60 it's still only 6 frames per second... all the GTs are faster than the GSes, regardless of memory clock.
-
You can't just boil the total performance difference down to a single percentage. Certain cases will result in more or less of a difference. In a scene that is heavy on memory-bandwidth-taxing elements (anti-aliasing, high-resolution textures, large amounts of textures, etc.), the difference will be extreme. In other scenes, the difference won't be as big.
-
Asus G1S's run their temps around 100°C in games :O Is it worth getting that hot for the extra performance?
-
The G1S's temperature monitor reads the GPU core, not the 8600M's memory modules. The reason for the high temperature is lackluster cooling on the core, not higher clocks on the memory.
-
Hmm, I would've thought it had something to do with it, since people who overclocked the underclocked ones got higher temps, but they also overclocked the core, so that might have been the cause.
-
masterchef341 The guy from The Notebook
I was just trying to give a practical estimate. Of course it depends; that's why I said "probably".
In practical, average situations you will probably see a few fps difference. I'm just trying to get people to step back and see the bigger picture.
Words like "extreme!" and "minimal!" give impressions that may or may not actually be justified in the real world - extreme varies from one person's perception to the next, as does minimal. I might think 3-6 frames per second is no big deal. To someone else, that could be the difference between "smooth image" and "choppy image"... who knows.
Just trying to frame the picture here. -
There are a lot of architectural differences between the G70 and G80 cores, with the latter able to ... 'do more with less', sort of, but it's still a big difference. Add to this the fact that Doom 3 is hardly a new engine in this day and age, and hardly as stressful as other games, and it should show you how memory bandwidth, or the lack thereof, can cut performance. -
masterchef341 The guy from The Notebook
Those are some solid numbers. Good. I won't argue with that. Still, I wasn't off by too much if you give me a decent margin of error. You are looking at about 59 vs 71 fps. You can extrapolate and say a similar situation with 30 fps on the low end would give you about 36 fps on the high end. I was close, and I didn't have any benchmarks.
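For anyone following along, the extrapolation is just a ratio. A minimal sketch, using the 59/71 fps figures quoted above and a hypothetical 30 fps low end:

```python
# Scale a hypothetical 30 fps baseline by the ratio seen in the quoted benchmarks.
gddr2_fps, gddr3_fps = 59, 71      # benchmark figures quoted above
speedup = gddr3_fps / gddr2_fps    # ~1.20, i.e. roughly a 20% gap

baseline = 30                      # hypothetical low-end framerate
print(f"{speedup:.2f}x -> about {baseline * speedup:.1f} fps")   # ~36.1 fps
```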
-
The Forerunner Notebook Virtuoso
What is up with this thread? Why does it always show a new post even though no one has written a new post?
-
What I got from another member was that it goes to the top when someone votes on it.
-
ViciousXUSMC Master Viking NBR Reviewer
A vote on a poll will bump it to the top. -
How much of a jump in performance should we expect from good drivers in the future?
-
It's hard to gauge; some cards in the past have received driver updates that increased performance by insane amounts (30-60%), while some cards only gain about 5-10% over their lifetime. I'm putting my money on the hope that DX10 improvements will further enhance modern unified-shader-architecture graphics cards. -
masterchef341 The guy from The Notebook
It's not that some cards only get a 5-10% performance increase over their lifetime; it's that some games will only see a 5-10% performance increase with a certain generation of GPU.
Other games will see drastic performance increases. Bigger-budget games are more likely to get driver attention.
New drivers are pretty much certain to widen the spread between the new GPUs and the previous gen.
All the time you see drivers that either improve the new tech exclusively or, for example, improve the old tech by 15% and the new tech by 25% in a certain game. -
masterchef341 The guy from The Notebook
Which might put the performance difference at maybe 3-6 frames per second instead of 10-20... note the previous post.
Of course, it totally depends on the situation, and those numbers alone are useless without being relative to anything. I'm thinking of your average gaming situation, rocking about 40 fps. But again, certain situations will obviously change the difference a lot.
8600M GT opinions
Discussion in 'Gaming (Software and Graphics Cards)' started by selu_99, May 29, 2007.