The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    8600M GT opinions

    Discussion in 'Gaming (Software and Graphics Cards)' started by selu_99, May 29, 2007.

  1. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
The 128-bit bus does actually make some sense, because most of the laptops these 8600M GTs will be put in have fairly low resolutions anyway. Most 15.4" panels are 1280x800 or 1440x900, so the card will not be reaching for textures large enough to need a wider bus. These really shouldn't be in 1920x1200 screens; those will need "8800M"-style cards, and THOSE can use the larger bus widths.

    When exactly did bus size become a big deal with people?
     
  2. FreeloaderUK

    FreeloaderUK Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
Well, the Go GeForce 7900 range comes with a 256-bit data bus.
     
  3. calvarez

    calvarez Notebook Consultant

    Reputations:
    0
    Messages:
    145
    Likes Received:
    0
    Trophy Points:
    30
Actually, the most expensive operation (time-wise) a card can do is moving data from system memory to graphics memory and vice versa. The bus is very critical (that is one of the main points of shader programs: they move some of the computation onto the graphics card precisely so you don't have to send as much data across the bus).
     
  4. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,910
    Trophy Points:
    931
In the end all that matters is total memory bandwidth, so we could see GDDR4 versions to help a bit. Memory bandwidth has gone up more slowly than raw horsepower, but I guess simple PCBs save more than cheaper memory *shrugs*.

TBH my opinion of the 8600 series is underwhelming horsepower supported by even more underwhelming memory bandwidth.
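The point above, that total bandwidth is the figure that matters rather than bus width alone, can be sketched with a quick back-of-the-envelope calculation (peak bandwidth = bus width in bytes × effective data rate; the clocks below are illustrative, not tied to any specific 8600M GT SKU):

```python
# Peak memory bandwidth = (bus width in bytes) * effective (DDR) data rate.
# Illustrative clocks only; actual 8600M GT variants differed by vendor.
def peak_bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective clock."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# A narrow 128-bit bus at a high effective clock matches a
# wide 256-bit bus running at half the effective clock:
print(peak_bandwidth_gb_s(128, 1400))  # 22.4 GB/s
print(peak_bandwidth_gb_s(256, 700))   # 22.4 GB/s
```

This is why bus width by itself is a misleading spec: a fast, narrow bus and a slow, wide bus can deliver the same total bandwidth.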
     
  5. atiesh

    atiesh Notebook Enthusiast

    Reputations:
    5
    Messages:
    47
    Likes Received:
    0
    Trophy Points:
    15
I may be repeating myself, but does anyone know the impact of DDR2 graphics memory on 8600M GT performance?
     
  6. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Read this thread, especially Chaz's comment at the bottom.
     
  7. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    yes.

    and....

    the go 7900 range isn't in 15" laptops.

15" panels have similar dpi to larger monitors, so they end up with comparatively low resolutions (1440x900 or less - still more res than 720p hd)

    most people running the 7900's are also on 1920x1200 res monitors, or at LEAST 1680x1050.

that's more where the need for that 256-bit bus comes in.
     
  8. link1313

    link1313 Notebook Virtuoso

    Reputations:
    596
    Messages:
    3,470
    Likes Received:
    0
    Trophy Points:
    105
Probably quite a bit, since the card was already very memory bandwidth limited; normally you wouldn't expect a card's performance to fall that much (especially at a low resolution) from lower memory speeds alone.
     
  9. sco_fri

    sco_fri Notebook Evangelist

    Reputations:
    75
    Messages:
    451
    Likes Received:
    0
    Trophy Points:
    30
I think everyone keeps looking at the GDDR2 8600M GT as a step down from the G1S. It is in fact the other way around: the bulk of 8600M GS and GT cards seem to be GDDR2, with the Asus G1S and C90 models the exception. The Dell and Sager 8600M GTs bench 600-800 higher than the GS cards do, so I just don't see the "crippled" aspect that everyone is talking about. The G1S is a step up over the mainstream 8600M GT - you may see 5-10 fps more in performance - but you lose some battery life with it, and in the end it will go extinct right around the same time as the rest of the 8600M GTs.

And in the end, what do you do about it? One of two things: buy the Asus GDDR3 model, or wait for the next generation, where maybe they all finally go to GDDR3. It's a mystery why they haven't yet, so who knows when they will get the hint.
     
  10. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
yeah, you are right. the performance difference is minimal - probably about 10%. it sounds like a lot because it's 10%, but at 30 fps, 10% = 3 frames per second, and at 60 it's still only 6 frames per second... all the GTs are faster than the GSes, regardless of memory clock.
     
  11. Osserpse

    Osserpse Notebook Evangelist

    Reputations:
    39
    Messages:
    336
    Likes Received:
    0
    Trophy Points:
    30
    You can't just round the total difference in performance off by a percent. Certain cases will result in more or less of a performance difference. In a scene that is heavy in memory bandwidth-taxing elements (anti-aliasing, high resolution textures, large amounts of textures, etc), the difference will be extreme. In other scenes, the difference won't be as big.
     
  12. osso002

    osso002 Notebook Evangelist

    Reputations:
    33
    Messages:
    419
    Likes Received:
    0
    Trophy Points:
    30
Asus G1S's run their temps around 100°C in games :O Is it worth getting that hot for the extra performance?
     
  13. Osserpse

    Osserpse Notebook Evangelist

    Reputations:
    39
    Messages:
    336
    Likes Received:
    0
    Trophy Points:
    30
    The G1s's temperature monitor is based on its GPU, not the 8600m's memory modules. The reason for the high temperature is lackluster cooling on the core, not higher clocks on the memory.
     
  14. osso002

    osso002 Notebook Evangelist

    Reputations:
    33
    Messages:
    419
    Likes Received:
    0
    Trophy Points:
    30
Hmm, I would've thought it had something to do with it, since people who overclocked the underclocked ones got higher temps - but they also overclocked the core, so that might have been the real cause. o_O
     
  15. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    oh, but i can.

i was just trying to give a practical estimate. of course it depends - that's why i said "probably".

in practical, average situations you will probably see a few fps of difference. i'm just trying to get people to step back and see the larger picture.

words like "extreme!" and "minimal!" give impressions that may or may not actually be justified in the real world - extreme varies from one person's perception to the next, as does minimal. i might think 3-6 frames per second is no big deal. to someone else, that could be the difference between "smooth image" and "choppy image"... who knows.

    just trying to frame the picture here.
     
  16. Osserpse

    Osserpse Notebook Evangelist

    Reputations:
    39
    Messages:
    336
    Likes Received:
    0
    Trophy Points:
    30
    By extreme I mean anywhere from 10-20 frames per second. I performed a little experiment with my 6800, which has a 256bit memory bus, so the memory only has to be clocked at half the rate of the 8600m GT to achieve the same bandwidth (note: not performance). When the memory was clocked at 700MHz (comparable to the bandwidth on the 8600m GT at 1400MHz), Doom 3's timedemo achieved 71.5 fps on average at 1280 by 1024 on high quality settings, but no anti-aliasing. When the memory was clocked at 400MHz (comparable to the 8600m GT at 800MHz), the timedemo achieved 58.9 fps. Oh my, it appears there's a lack of performance somewhere... hmm. If I were to apply anti-aliasing, the difference would've increased a lot.

There are a lot of differences in architecture between the 6-series core and the G80, with the latter able to ... 'do more with less', sort of, but it's still a big difference. Add to this the fact that Doom 3 is hardly a new engine in this day and age, and hardly as stressing as other games, and it should show you how memory bandwidth, or the lack thereof, can cut performance.
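As a sanity check on the figures in this experiment, the relative performance loss can be computed directly (a small sketch; the fps values are the timedemo results quoted above):

```python
# Osserpse's Doom 3 timedemo: cutting the 256-bit card's memory clock from
# 700 MHz to 400 MHz (a ~43% bandwidth cut) dropped the average framerate.
high_fps, low_fps = 71.5, 58.9
drop_pct = (high_fps - low_fps) / high_fps * 100
print(f"{drop_pct:.1f}% performance loss")  # 17.6% performance loss
```

So a little under half the bandwidth cost roughly a sixth of the framerate here, and with anti-aliasing enabled the gap would widen further.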
     
  17. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
those are some solid numbers. good. i won't argue with that. still, i wasn't off by too much if you give me a decent margin of error. you are looking at about 59 vs 71 fps. you can extrapolate and say a similar situation with 30 fps on the low end would give you about 36 fps on the high end. i was close, and i didn't have any benchmarks.
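The extrapolation in this post is just applying the measured fps ratio to a lower baseline (a sketch assuming the ratio stays roughly constant across workloads, which real games won't strictly obey):

```python
# Scale the measured Doom 3 fps ratio down to a hypothetical 30 fps baseline.
ratio = 71.5 / 58.9            # ~1.21x between the two memory clocks
print(round(30 * ratio, 1))    # 36.4
```

That is, the same relative gap that separates 58.9 from 71.5 fps separates roughly 30 from 36 fps, which is where the "3-6 frames per second" estimate comes from.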
     
  18. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
What is up with this thread? Why does it always show a new post even though no one has written a new post?
     
  19. osso002

    osso002 Notebook Evangelist

    Reputations:
    33
    Messages:
    419
    Likes Received:
    0
    Trophy Points:
    30
What I got from another member was that it goes to the top when someone votes on the poll.
     
  20. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466

    A vote on a poll will bump it to the top.
     
  21. spexc31

    spexc31 Notebook Evangelist

    Reputations:
    12
    Messages:
    473
    Likes Received:
    0
    Trophy Points:
    30
    how much of a jump in performance should we expect from good drivers in the future?
     
  22. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    I think that when these drivers are fully mature we will see anywhere from 10-30% increase in power. When the DX10 API is fully mature I would also anticipate tacking on another 10-30% increase in power.

    It's hard to gauge, some cards in the past have received driver updates that increased performance by insane amounts (30-60%), while some cards will only increase about 5-10% over their lifetime. I'm putting my money on the hope that DX10 improvements will further enhance modern unified shader architecture graphics cards.
     
  23. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
you can't make the card faster. you can only make the card run a game faster.

it's not that some cards get only a 5-10% performance increase over their lifetime; it's that some games will only see a 5-10% performance increase with a certain generation of GPU.

other games will see drastic performance increases. bigger-budget games are more likely to get driver attention.

new drivers are pretty much certain to widen the spread between the new GPUs and the previous gen.

all the time you see drivers that either improve the new tech exclusively or, for example, might improve the old tech by 15% and the new tech by 25% in a certain game.
     
  24. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
also - you tested that with a card two generations back. it's very possible that the newer generations can do more with less: cutting the memory clock might have less of an effect on an 8-series or even 7-series card than on the 6 series.

which might put the performance difference at maybe 3-6 frames/sec instead of 10-20... ;) note previous post.

of course, it totally depends on the situation, and those numbers alone are useless without being relative to anything. i'm thinking of your average gaming situation rocking about 40 fps. but again, certain situations will obviously change the difference a lot.
     