The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    One take on the DDR2 vs GDDR3 8600m gt

    Discussion in 'Gaming (Software and Graphics Cards)' started by masterchef341, Jul 28, 2007.

  1. rafiki6

    rafiki6 Notebook Consultant

    Reputations:
    40
    Messages:
    235
    Likes Received:
    0
    Trophy Points:
    30
    This entire thread is pointless. I just want to make a few corrections:

    1. The maximum threshold a human can detect is actually 12 FPS, not 60 like someone mentioned earlier. Anything more just means you are more prepared for the future.

    2. A graphics card is essentially obsolete in about 2 years anyway. Nowadays it seems to be even less, about 1-1.5 years. So I don't care if you own a GDDR3 or a DDR2 variant of the 8600M GT; in about a year your gaming performance will fall off significantly.

    3. The 8600M is supposed to be a mainstream card. Currently it is one of the higher-end cards from Nvidia, but in about 2-3 months Nvidia will be releasing en masse their 8700M GT (already in Toshiba notebooks) and their 8800M line. By the end of this year and early next year we will begin seeing the 8900M, and the 9000 series will be released by the end of next year at the latest. So the 8600M/8700M are mainstream and will end up being like the 7600/7700 Go cards are today. Go for the cheaper of the two; it will be future-proof for about a year or a year and a half.
     
  2. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Care to back this up with references? I think everyone here will agree that 12 FPS is absolutely horrid.
     
  3. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    1. that is blatantly incorrect. there is no threshold of perception of fps. you can certainly contrive a situation where 12 fps would be the limit of perception, though. for instance, if you were perfectly still, or if it was completely dark... 12 fps would be plenty. in 3d games, you tend to be able to perceive smooth motion at about 30 fps. you will generally be able to tell with EASE the difference between 30 and 60 fps also.

    http://www.100fps.com/how_many_frames_can_humans_see.htm

    2. that is the curse of computer tech. you will still be able to play the latest games in two years, but you will almost certainly have to lower the resolution and settings. that doesn't mean the games won't still look good, though. a go 7600 can still play all the latest games. it's obsolete in the sense that you shouldn't buy one at this point, but it's not useless. it's still a good card, and it will play crysis and ut3, and those games will look good too!

    3. the 8600m gt is the fastest card available in a 15" notebook. that is its purpose. larger notebooks can use more powerful graphics cards like the 8700m gt and the 8800m series, when it becomes available. the 8800m still doesn't have an official release date, but the general consensus is that it will be released by about the end of the year. that makes it unlikely that an 8900m series will appear before year's end or early next year. no 8900m series has even been speculated about or announced by the graphics community at this point, either.
     
  4. oblomschik

    oblomschik Notebook Evangelist

    Reputations:
    20
    Messages:
    558
    Likes Received:
    0
    Trophy Points:
    30
    Any chance you can run newer game benchmarks, for say the Lost Planet demo, FEAR, or Oblivion? They will tax the video card harder than CS:S. It would be interesting to see the difference. Also, it's kind of silly for people to say "who cares about aliasing and such, I am not going to turn it on"; that stuff helps image quality, IMO, especially at lower resolutions.
     
  5. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Well, some people just want the game to look okay. I know people who are perfectly happy to game at 640x400, all settings low or off, as long as the game runs. It's not silly, it's just that they're not as demanding as some people. If you play a game for the strategy or the story line, image quality just isn't that important.
     
  6. oblomschik

    oblomschik Notebook Evangelist

    Reputations:
    20
    Messages:
    558
    Likes Received:
    0
    Trophy Points:
    30
    Well, it is silly when they state that the speed difference does not matter for ANYONE. Also, IMO, for strategy games, resolution and smooth frame rates do matter a lot, since otherwise it's a big advantage to the opponent. Just wait till SC2 comes out and see all the people scream who can't run it at a good res/fps.
     
  7. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    For RTS, rez very rarely matters to me, since I often don't care about what the graphics look like. Smooth frame rates matter, but you can almost always get smooth frame rates if you lower settings and resolution enough. I agree that it does matter to many people, but IMO, it's just as silly to say that these settings and resolution differences matter to EVERYONE, which some gamers are apt to say.
     
  8. hlcc

    hlcc Notebook Evangelist

    Reputations:
    113
    Messages:
    634
    Likes Received:
    0
    Trophy Points:
    30
    2. why are you so sure a go 7600 can play games like Crysis nicely? anyways, the GeForce 7 series is one year old; it's the GeForce 6 series that's about 2 years old.
    3. sure, it's the best mobile GPU for a 15-incher, but in the whole computer gaming industry an 8600M GT is nothing more than a downclocked mid-range card that's scheduled to be replaced by the end of this year or early next year. The GeForce 9 series is scheduled to launch by the end of this year.
     
  9. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    when the geforce 9 mobile cards come out, the geforce 10 series will be just around the corner.

    but if i wait until the geforce 10 series, the geforce 11 series will probably be out by the end of that year!!! i might as well just get an 11 series!!!

    why stop there?

    by the time the geforce 11 series comes out, you should probably just wait for the 12 series which will introduce directx next! its going to be way better!

    but really those cards won't be able to take advantage of all the directx next eyecandy, so you should probably just hold out for the second generation of directx next GPUs, which will be much better at it. so just hang on for the geforce 13 series, and you will have your GPU. of course it will still only be top of the line for a short time, but you will have made the right decision.

    *implodes*

    check out the "official crysis requirements" thread for info on system requirements for crysis.
     
  10. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    But some people don't have access to the whole computer gaming industry, they only have access to that portion which can be installed in a 15.4" laptop. In that case, it is unhelpful to the point of being rude to say that the 8600M-GT is anything short of the absolute best graphics experience available to them. And no, it is not officially scheduled to be replaced at all, as I don't think nVidia has officially scheduled a mobile 9 series card to fit in a 15.4" laptop as of now.
     
  11. windchill

    windchill Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    Sorry if this is a stupid question ^^;.. could someone please answer it though? Thanks!
     
  12. Ultim4

    Ultim4 Notebook Evangelist

    Reputations:
    112
    Messages:
    325
    Likes Received:
    0
    Trophy Points:
    30
    Further to this point, there'll always be some new behemoth game around the corner, too. (e.g. right now it's Crysis) Just buy for now, unless there's a game you know you want to wait for.
     
  13. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    At lower settings, those two cards will probably perform almost identically. However, at higher rez and texture settings, the 128MB GDDR3 will still outperform the 512MB DDR2, since it's the memory bandwidth that's limiting you, not the amount of VRAM.
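
    A rough sketch of the bandwidth gap (assuming the 128-bit memory bus the 8600M GT is usually listed with, and the ~700 MHz GDDR3 vs ~400 MHz DDR2 memory clocks that come up later in this thread):

```python
# Peak memory bandwidth = effective data rate x bus width.
# Both DDR2 and GDDR3 are double data rate, so effective rate = 2 x memory clock.
BUS_WIDTH_BITS = 128  # assumed 8600M GT memory interface width

def bandwidth_gb_s(mem_clock_mhz: float) -> float:
    transfers_per_s = 2 * mem_clock_mhz * 1e6
    return transfers_per_s * (BUS_WIDTH_BITS / 8) / 1e9

gddr3 = bandwidth_gb_s(700)  # ~22.4 GB/s
ddr2 = bandwidth_gb_s(400)   # ~12.8 GB/s
print(f"GDDR3: {gddr3:.1f} GB/s, DDR2: {ddr2:.1f} GB/s "
      f"({ddr2 / gddr3:.0%} of the GDDR3 figure)")
```

    In other words, the extra VRAM on the 512MB DDR2 card doesn't change how fast texture data can actually be moved; the GDDR3 card can move it roughly 75% faster.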
     
  14. Ilovelamp

    Ilovelamp Notebook Consultant

    Reputations:
    52
    Messages:
    225
    Likes Received:
    0
    Trophy Points:
    30
    If you are referring to StarCraft 2, then I don't think it'll be a huge problem. Blizzard has repeatedly stated that it's going to make the game so people with "average" PCs can still play it. I think they said you would only need a 128 MB card. They have the right marketing idea. There's no point making a game which only a handful of people can play. It makes much more sense to release a game that is accessible to a much bigger audience.
     
  15. joeyscl

    joeyscl Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15

    Just like 1 is 50% of 2
    not 1% ;)
     
  16. LBThorn

    LBThorn Notebook Consultant

    Reputations:
    2
    Messages:
    251
    Likes Received:
    0
    Trophy Points:
    30
    This guy is absolutely RIGHT!!! I have the 8600GT and it is wonderful. Of course it will be out of date in several years. EVERYTHING DOES BECOME OBSOLETE SOONER OR LATER!! But for now it is a GPU that just nails every game's graphics for me. I am very pleased. So for the few people who complain about or ridicule this card, I would just like to say that you are wrong. You are wrong for saying that it's not worth it. It hasn't even been six months and I am already hearing complaints. Geeze! That's my point of view.
     
  17. JCMS

    JCMS Notebook Prophet

    Reputations:
    455
    Messages:
    4,674
    Likes Received:
    0
    Trophy Points:
    105
    Yeah it's a good card right now.

    I really think those DDR2 cards are somewhat stupid right now. Nvidia meant them to be 256MB GDDR3, not 128MB GDDR3, 256MB DDR2, or 512MB DDR2!!!!
     
  18. 10ten92

    10ten92 Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    If 115 is 100%, then 83/115 = 72.17% ≈ 72%, and 100% - 72% = 28%.

    So 28% is correct. By the way, I am Korean.
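
    (A quick sketch for anyone following the percentage back-and-forth: the 83 and 115 figures being argued over can be stated several ways, which is where the confusion came from.)

```python
a, b = 83.0, 115.0  # the two scores being compared earlier in the thread

print(f"{a:g} is {a / b:.1%} of {b:g}")                # ~72.2%
print(f"{a:g} is {(b - a) / b:.1%} lower than {b:g}")  # ~27.8%, i.e. the "28%"
print(f"{b:g} is {(b - a) / a:.1%} higher than {a:g}") # ~38.6%, a different number
```

    Whether the gap is "about 28%" or "about 39%" depends on which card you take as the baseline, which is the language mix-up masterchef341 points out a few posts down.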
     
  19. Ultim4

    Ultim4 Notebook Evangelist

    Reputations:
    112
    Messages:
    325
    Likes Received:
    0
    Trophy Points:
    30
    ...Then clearly that settles it...
     
  20. link1313

    link1313 Notebook Virtuoso

    Reputations:
    596
    Messages:
    3,470
    Likes Received:
    0
    Trophy Points:
    105
    and its settled.
     
  21. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    we had already figured this out. it was a language thing, not a number thing. thanks though.

    look closely at the words he used originally, they do not match up to the math properly.

    but i won't deny that the koreans in my class were generally exceptionally talented at mathematics. it might be the fact that they lived on campus, or different math training earlier, or that koreans are just good at math! i don't know.
     
  22. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    That's racism!
     
  23. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Seriously people, why are we still talking about this?
     
  24. Bona Fide

    Bona Fide Notebook Deity NBR Reviewer

    Reputations:
    94
    Messages:
    754
    Likes Received:
    0
    Trophy Points:
    30
    I haven't looked at pages 2-7 but did you really spend 8 pages arguing about who's right?
     
  25. 10ten92

    10ten92 Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    Korea's middle school math level is about the same as other countries' college level.
     
  26. meh_cd

    meh_cd Notebook Evangelist

    Reputations:
    140
    Messages:
    422
    Likes Received:
    0
    Trophy Points:
    30
    Uh, good for you? People already figured it out regardless.
     
  27. 10ten92

    10ten92 Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    So what? it's none of your bs... :)
     
  28. meh_cd

    meh_cd Notebook Evangelist

    Reputations:
    140
    Messages:
    422
    Likes Received:
    0
    Trophy Points:
    30
    You came into a thread, solved a math problem that other people already solved, and then threw around the fact you were Korean. Your ethnicity/nationality really makes no difference in this case.
     
  29. Odin5578

    Odin5578 Notebook Evangelist

    Reputations:
    21
    Messages:
    542
    Likes Received:
    0
    Trophy Points:
    30
    I ordered a Dell Inspiron 1520 and it comes with a 256MB 8600M GT. Does anyone know if it's 700 or 400MHz?

    Thanks,
    -Taylor
     
  30. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    400MHz, almost certainly.
     
  31. MttFrog13

    MttFrog13 Notebook Guru

    Reputations:
    0
    Messages:
    65
    Likes Received:
    0
    Trophy Points:
    15
    can somebody for the love of god provide me with the 3DMark score for an 8400M GT? I searched every page of this thread and didn't find it. I checked notebookcheck.net and didn't find it, even though I found the 8600s and the 8400M GS. I even checked the Futuremark online database of 3DMark scores. I'm not saying it's the best way to compare cards, but what other unified way is there to analyze all the cards across all platforms? Unless someone has every computer to run whatever test they want, we kind of have to rely on 3DMark.

    Can somebody please, please tell me the 3DMark06 score of the 8400M GT? I just want to compare it to the HP 8600M GS I am about to get. If you have something against 3DMark, I don't mind, but can you please provide me with another way to compare these two cards?
     
  32. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
  33. Business Man

    Business Man Notebook Consultant

    Reputations:
    4
    Messages:
    279
    Likes Received:
    0
    Trophy Points:
    30
    Anyone know any UK laptops that have the GDDR3 version? I can only find the DDR2 version of this gfx card.
     
  34. MttFrog13

    MttFrog13 Notebook Guru

    Reputations:
    0
    Messages:
    65
    Likes Received:
    0
    Trophy Points:
    15
  35. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    http://forum.notebookreview.com/search.php?searchid=1606654

    The 8400M-GT is the same card as the 8600M-GS, just with lower clock speeds. It will handle games at roughly the same level as the 8600M-GS, because the bus width and number of stream processors are the same. It's very similar to the performance difference between the Go7600 and the Go7700.
     
  36. 10ten92

    10ten92 Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    If you don't understand that joke, that's just sad, dude, and there's nothing more to say to you.

    pity :cry:
     
  37. Seijun

    Seijun Guest

    Reputations:
    1
    Messages:
    92
    Likes Received:
    0
    Trophy Points:
    0
    How will the performance of the DDR2 differ from DDR3 on a WXGA (1280x800) screen? Anybody know?
     
  38. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    There will probably be about a 10-15% performance difference in most games, maybe more if you're using memory-intensive textures or shading/lighting.
     
  39. meh_cd

    meh_cd Notebook Evangelist

    Reputations:
    140
    Messages:
    422
    Likes Received:
    0
    Trophy Points:
    30
    There is no joke. You're just being obnoxious. "lol asians are good at math"
     
  40. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Agreed. Repped.
     
  41. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Quoted for truth.
     
  42. Seijun

    Seijun Guest

    Reputations:
    1
    Messages:
    92
    Likes Received:
    0
    Trophy Points:
    0
    How noticeable would an increase like that be, if the two were side-by-side?
     
  43. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    If you were actively benchmarking, then yes, it would be noticeable in the scores. However, if you were just playing the same game on two machines side by side, the DDR2 version might have to have a texture setting turned down one notch, and to be honest I seriously doubt you'd notice the difference if you were just playing the game. The picture quality would be very similar; it's just that the GDDR3 would be able to handle a slightly clearer picture, or maybe the same picture quality with slightly smoother gameplay. The difference isn't huge at that kind of resolution.
     
  44. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    in this particular game scenario?

    impossible to detect. Most monitors display 85 fps max, so they are both displaying the same image.

    if it was a difference of, say, 60 frames per second and 80, it would be extremely difficult to detect. you might be able to see a difference side by side. maybe.

    If there was a difference between like 28 and 35 frames per second, that would be easy to detect with the human eye.
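
    One informal way to see why the low-end gap is easier to notice (a quick sketch, not a claim about exact perception thresholds): the per-frame time difference is much larger at 28 vs 35 fps than at 60 vs 80 fps.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for low, high in [(28, 35), (60, 80)]:
    gap = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} vs {high} fps: {frame_time_ms(low):.1f} ms vs "
          f"{frame_time_ms(high):.1f} ms per frame (a {gap:.1f} ms gap)")
```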
     
  45. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Well, that could be the case if you're playing a really hardware intensive game.
     
  46. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Yes, but the point is that neither the GDDR3 nor the DDR2 version will handle a really hardware-intensive game that well. If you want smooth gameplay, you just crank the settings down one notch, and you're suddenly getting 45 and 55 fps, which may be a detectable difference, but both would look very smooth.
     
  47. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Actually that doesn't seem to be masterchef's point.
     
  48. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    You're correct, that isn't masterchef's point; that's my point. The two cards differ only in memory bandwidth (due to the lower-clocked VRAM), which a) matters less and less as you go to lower resolutions, and b) has no effect on shader power. So at a fairly low resolution (1280x800) and equal settings, the difference will depend on how texture-heavy the game is. However, for modern games at low resolutions, the lower-clocked DDR2 memory shouldn't bottleneck your card by more than 10-15%. As I said, the picture quality produced by both cards will be the same, so as long as you choose settings that give you decently playable framerates on the GDDR3 version (~45 fps) and then compare that with a computer running the same game, at the same settings, on the DDR2 card, you'd see that it has the same picture clarity with a slightly lower frame rate (maybe ~40fps). It's true that the GDDR3 card may be able to push the settings one notch higher than the DDR2 card, due to its ability to handle larger textures faster; however, my point is that in real-world gaming scenarios, at lower resolutions, it would not detract from your gaming experience to have a DDR2 card rather than a GDDR3 card.

    Note: Of course, I could be completely wrong about all of this. As next-gen games develop, it's possible that the minimum memory bandwidth for new games will fall exactly where it cuts the DDR2 card off from being able to play them. But that is highly doubtful. In the end, you have to realize that the raw shader power of the two cards is equal, and as long as you're not pushing your game settings to the limits, you'll get a very good experience out of either card.
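
    A back-of-the-envelope way to sanity-check that 45 to ~40 fps estimate (purely illustrative: the memory-bound fraction below is an assumption, not a measured number, and the bandwidth ratio comes from the 700 vs 400 MHz clocks discussed earlier):

```python
def ddr2_fps(gddr3_fps: float, bw_ratio: float, mem_bound_frac: float) -> float:
    """Amdahl-style estimate: only the memory-bound share of each frame slows down.

    bw_ratio       -- DDR2 bandwidth relative to GDDR3 (~12.8 / 22.4 = 0.57).
    mem_bound_frac -- share of frame time limited by memory bandwidth;
                      assumed to be small (~15%) at 1280x800.
    """
    frame_time = 1.0 / gddr3_fps
    slowed = frame_time * ((1 - mem_bound_frac) + mem_bound_frac / bw_ratio)
    return 1.0 / slowed

print(round(ddr2_fps(45, 12.8 / 22.4, 0.15), 1))  # ~40.4 fps, roughly a 10% hit
```

    Push mem_bound_frac up (higher resolutions, heavier textures) and the gap widens toward the 15%-plus end.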
     
  49. Seijun

    Seijun Guest

    Reputations:
    1
    Messages:
    92
    Likes Received:
    0
    Trophy Points:
    0
    Thanks guys, makes me feel better about getting the Sager 2090 over the ASUS G1s (the ASUS being more expensive and having poorer battery life). We will probably get another nb in 6 or 7 years anyway to upgrade with the times.
     
  50. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    i'll second that, both ddr2 and gddr3 cards are easily in the same performance class. the performance gap should close at lower resolutions and aa settings.
     