The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Differences? 8800M GTX and 9800M GT

    Discussion in 'Sager and Clevo' started by wr0ck, Jul 15, 2008.

  1. kobe_24

    kobe_24 Notebook Deity

    Reputations:
    292
    Messages:
    1,088
    Likes Received:
    5
    Trophy Points:
    56
    Waiting on the answer to this, thanks!
     
  2. Hauser

    Hauser Notebook Enthusiast

    Reputations:
    11
    Messages:
    16
    Likes Received:
    0
    Trophy Points:
    5
    The desktop 8800 GT uses a G92 GPU (Graphics Processing Unit) as its base: the same GPU as in the 9800 GTX, with no changes besides the clocks and some deactivated units.

    There are only some optimizations over the original G80 GPU, which was used only in the 8800 GTX, 8800 Ultra and 8800 GTS (the 320 MB/640 MB versions, NOT the 512 MB/1024 MB versions). The desktop 8800 GT already has these optimizations, since it uses a G92 GPU. Same thing for the 8800M GTX. The G94 GPU used in the 9600 GT is simply a halved G92.

    Between the GPUs of an 8800 GT and a 9800 GTX there is no difference in GPU architecture, only deactivated unit blocks, and thus no difference in the architecture of the shader units.
    The whole thing is just PR.
     
  3. Hauser

    Hauser Notebook Enthusiast

    Reputations:
    11
    Messages:
    16
    Likes Received:
    0
    Trophy Points:
    5
    You have an 8800M GTS, where half of the units of the underlying G92 GPU are deactivated (16 ROPs, 64 shader units, 32 TMUs, 32 TAUs). The 9600 GT, based on the G94, has the same parameters and is simply a "physically" halved G92 for cheaper production; on the desktop 9600 GT no units are deactivated. In terms of active units they are equal, so only the clocks determine how much faster the 9600 GT is:

    500 core / 1250 shader / 800 memory (8800M GTS) vs. 650 core / 1625 shader / 900 memory (9600 GT)

    The 8800M GTX has the same clocks as the 8800M GTS but 1/3 more units. Given the 9600 GT's clocks, the performance should be very comparable.
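    A quick back-of-envelope on those numbers (a rough sketch only; raw units × clock ignores memory bandwidth, ROPs and drivers, and the unit counts and clocks are simply the ones quoted in the post):

```python
# Crude shader-throughput proxy: active shader units x shader clock (MHz).
# This deliberately ignores memory bandwidth and other bottlenecks.
def shader_throughput(units, shader_clock_mhz):
    return units * shader_clock_mhz

gts_8800m = shader_throughput(64, 1250)   # 8800M GTS
gt_9600   = shader_throughput(64, 1625)   # desktop 9600 GT
gtx_8800m = shader_throughput(96, 1250)   # 8800M GTX

print(gt_9600 / gts_8800m)    # 9600 GT ~30% ahead of the 8800M GTS on clocks alone
print(gtx_8800m / gt_9600)    # 8800M GTX ~15% ahead of the 9600 GT on paper
```

    On this crude measure the 8800M GTX's extra units slightly more than cover the 9600 GT's clock advantage, which fits the "very comparable" claim above.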
     
  4. Hauser

    Hauser Notebook Enthusiast

    Reputations:
    11
    Messages:
    16
    Likes Received:
    0
    Trophy Points:
    5
    Sorry, it was not meant offensively on my side.

    Sure, there is a 40 % difference.................in the shader clocks....... :D :D Sometimes it is more efficient to have a higher clock than more execution units.

    But the 8800M cards were already "generation 9", or better said, they were actually the first cards based on the G92 GPU. :D

    I believe the 9800M GT will be based on the G94b (the desktop 9600 GT shrunk to 55 nm), and the clocks will decide whether it is faster than an 8800M GTX or not.
     
  5. ThmsLngbrd

    ThmsLngbrd Notebook Guru

    Reputations:
    14
    Messages:
    52
    Likes Received:
    0
    Trophy Points:
    15
    At Multicom.no the 8800M GTX costs 1670 NOK (300+ dollars) more than the 9800M GT. Thank you all for helping me save money! ;)
     
  6. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    I didn't want to post here, guys, but I figured I should clear this up, since there are a lot of laptop junkies posting about hardware as if they know something, and I am upset that people are believing this junk. It's not true.

    And I don't know why people think the 8800M GTX is equaled by the 9800M GT.
    Let's get something straight: I've been building PCs for 8 years and I'm 20 years old now. I started quite young and I'd like to say I'm quite the hardware aficionado. Hell, I've just built a GTX 280 SLI system with my QX9650 and Striker II Extreme, watercooled with about 600 dollars in parts, so I'd like to think I know what I'm talking about.

    What people are saying is incorrect; the 9800M GT is nowhere near the 8800M GTX. I understand paladin44 is doing his job saying the differences are statistically insignificant. That is not true, Paladin, but mistakes happen and I forgive you, since you're only advertising what your suppliers tell you.

    Let me break it down this way.
    The "64 cores" referred to in the newly-loved 9800M GT means 64 shader units. These are the same kind of G92 shaders as the 96 in the 8800M GTX; there are simply fewer of them.
    The 8800M GTX's statistics are the following:
    96 shader units (or cores)
    500 MHz core clock
    1250 MHz shader clock
    1600 MHz (effective) memory clock
    51.2 GB/s memory bandwidth
    24.1 GTexel/s texture fill rate
    256-bit memory bus

    The 9800M GT statistics (per that blooper by XoticPC releasing the specs of the card in that GPU-Z shot, which we never see in the next batches of screenshots):
    64 shader units (or cores)
    500 MHz core clock
    800 MHz memory clock
    1250 MHz shader clock
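    For what it's worth, the bandwidth and fill-rate numbers in that list can be sanity-checked from the clocks. A rough sketch; the 48-TMU figure is my assumption for a 96-shader G92 configuration, not something stated in the thread:

```python
# Memory bandwidth = bus width (bytes) x effective memory clock (GHz)
bus_bytes = 256 / 8            # 256-bit bus
mem_clock_ghz = 1.6            # 1600 MHz effective
bandwidth_gbs = bus_bytes * mem_clock_ghz     # 51.2 GB/s, matching the list

# Texture fill rate = TMUs x core clock (GHz)
tmus = 48                      # assumed for a 96-SP G92 part
core_clock_ghz = 0.5           # 500 MHz
fill_gtexels = tmus * core_clock_ghz          # 24.0 GTexel/s, close to the quoted 24.1
```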

    Now why on earth would these notebook sellers tell you that the "difference" is insignificant? Simple: if you use 3DMark to prove this, we know that a 9800M GT sits on a Montevina platform, which uses DDR3. Trust me when I say DDR3 gives about 4-5,000 more 3DMarks, depending on the CPU and the memory FSB.
    When I had a DDR2 desktop rig and upgraded to a DDR3 rig, the only things that changed were my Q6600 to a QX9650, and my DDR2-6400 at low timings (4-4-4-12, which matter to 3DMark) to DDR3-12800 at 7-7-7-20. I went from 14,500 to about 20,000. Think I'm kidding? I'm not.

    I know a lot of people want a DDR3 laptop so they can say "hehe yea I got DDR3, a whole TWO GIGABYTES :eek:" and be quite proud. I'd like to say good for you. My desktop runs 2x2 GB Dominators at 1800 with 8-8-8-24 timings, so I must say I have it pretty good.
    The only reason I am here now debating a Montevina purchase is the return of my M15x to Alienware and the availability of the quad core. I know what DDR3 offers, and contrary to common belief, memory bandwidth does help in a significant number of games. If you're too lazy to Google, then I can't help you. Just because a few games don't benefit from the higher headroom doesn't mean that no games will; a good example is looking at the benchmarks available at all the high-end hardware sites.

    Why am I posting this? Because I don't want you to get misled. A 3DMark score of 10,000 with DDR3 on a 9800M GT and an equivalently clocked Montevina CPU will not be equivalent to a Montevina system with an 8800M GTX; I'd hedge my bets on a difference of about 20-30 percent.

    The way I'd like to compare the two video cards is by looking at benchmarks of the 9600 GT and the 8800 GT. Yes, I know the 8800 GT has 112 shaders as opposed to the 8800M GTX's 96, but it should give you a healthy idea of how different these two video cards are.

    Montevina is a great platform and I myself will be getting it, but don't let yourself believe you need DDR3, and with that compromise in mind, don't think that a 9800M GT on Montevina will offer better GAMING performance than an 8800M GTX on Santa Rosa. You have to remember: in games that actually benefit from DDR3, the Montevina 9800M GT will maybe, at BEST, tie the Santa Rosa 8800M GTX. Put them in a game that doesn't benefit from memory speed, such as, say, Oblivion, and you will see that the 8800M GTX outmuscles the 9800M GT.

    You guys should stop just believing what one retailer or someone else tells you; sometimes we make mistakes or are misinformed. There is a big difference between these GPUs, as there should be. Do a little looking for yourself and try to see this, because I just want you guys to know what you're getting into.

    All the best.
     
  7. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    Interesting; in this wall of text you do not have a single link to prove anything you say >.>. I'm not saying you are wrong, but you shouldn't say someone else is misleading people and not offer any proof of it.
     
  8. Dreidel

    Dreidel Notebook Evangelist

    Reputations:
    144
    Messages:
    315
    Likes Received:
    0
    Trophy Points:
    30
    If you go to the Sager section of the forums, you will notice that Justin@XoticPC posted benchmarks of the 9800m GT scoring 9600+ in 3dMark06 @ 1280x1024 resolution.
     
  9. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Right, but 3DMark is affected by DDR3 performance, as I've stated, Dreidel.
    Tell them to bench the 9800M GT in a DDR2 system to level the playing field and see the discrepancy :)
     
  10. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    Your reasoning is flawed; memory does NOT have that much to do with 3DMarks.
    The difference you saw came from the processor upgrade, not the memory upgrade.
    It's been stated in many places, even on this forum, that 3DMark is highly CPU-bound.

    I have a q6600 + 4gb of ddr2 + an 8800gts on my home rig.
    My roomate has a q6600 + 4gb of ddr3 + the same exact 8800gts.
    His scores beat mine by around 50 points.
     
  11. MKang25

    MKang25 NBR Prisoner

    Reputations:
    179
    Messages:
    1,715
    Likes Received:
    0
    Trophy Points:
    55
    DDR3 does jack for 3DMark; no idea what you're talking about with a 4K increase..... My PC has a 2.7 GHz dual-core, 4 GB of DDR3 and an 8800 GT; versus my 5793 there is only a 1.5K point difference, which comes from the 0.2 GHz difference in clocks and the GPU.
     
  12. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Here are the benchmarks for gaming in general, DDR3 vs. DDR2, on the newer chipsets. They are desktops, but do not misunderstand the purpose of what I'm trying to say:
    http://hardocp.com/article.html?art=MTUxMyw1LCxoZW50aHVzaWFzdA==

    Psycroptik, you are correct; I was mistaken about 3DMark. I meant all SYNTHETIC benchmarks, such as Everest, SuperPi and whatnot. And when you say I got that much from a processor upgrade, what's the difference here? I went from an overclocked FSB at 1200 to a stock FSB at 1333, and from 8 MB cache to 16 MB. The difference between Santa Rosa and Montevina chips is that the front-side bus is greatly increased, from 800 to 1066, this time also running 1:1 with the memory (2:1 if you count DDR3 being twice as fast, but who cares), which increases the score as well. You are right in saying that the memory speed alone doesn't affect it, but going from a 65 nm Q6600 to a 45 nm QX9650 at stock doesn't result in that big a difference unless you're also changing other parameters of the system (e.g. chipset and memory), thus affecting the synthetic benchmark scores, because as you know, running 1:1 significantly improves performance. Here, let me show you:
    http://www.anandtech.com/mb/showdoc.aspx?i=3283&p=7
    This shows roughly a 600-point increase with DDR3 alone; so if the score was 9,000-something, subtract 600.
    I'm almost certain that if they bench the DDR2 systems, the 8800M GTX will win by a bigger margin than you think. The specs prove it.
     
  13. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    A 0.2 GHz clock difference explaining 1,500 3DMarks? You realize there's more than just a 0.2 difference behind that kind of discrepancy, right?
     
  14. MKang25

    MKang25 NBR Prisoner

    Reputations:
    179
    Messages:
    1,715
    Likes Received:
    0
    Trophy Points:
    55
    Also, the 8800M GTX is considerably weaker than my 8800 GT.
     
  15. someguyoverthere

    someguyoverthere Notebook Evangelist

    Reputations:
    123
    Messages:
    401
    Likes Received:
    0
    Trophy Points:
    30
    There's no way simply switching to DDR3 adds 5,000 3DMarks, especially if you consider the higher latency. Also, I'm assuming you're talking about system RAM, since GPU RAM is GDDR RAM.

    All of that is frankly academic. What matters to me is FPS in games. If 3DMarks are even a half-decent way of predicting that, then who cares how the 9800M GT manages to equal the 8800M GTX, as long as it does.
     
  16. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    You know all that about computers and you still bought an M15x?

    I call bluff ;)
     
  17. DRTH_STi

    DRTH_STi can't.stop.buying.laptops

    Reputations:
    142
    Messages:
    751
    Likes Received:
    13
    Trophy Points:
    31
    Noel might have a point here. Put simply: maybe whatever gains DDR3 is supposed to bring are being soaked up by the weakness of the 9800M GT, giving us a benchmark number roughly level with the 8800M GTX. If the cards were equal, then adding DDR3 should give the 9800M GT higher 3DMarks.

    Think about it, too: shouldn't we be getting faster benchmark scores because of the faster platform?
     
  18. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    Let's start with the first link. The only bars we care about are the blue ones, since those are all the same CPU, just different memory (DDR2 or DDR3).

    [IMG]

    One of the DDR2 boards here does badly; the other is comparable to the DDR3. Nothing to go crazy over here.

    [IMG]

    Here they all do about the same, going against what you said a little more now.

    [IMG]

    We're still seeing the same; that EVGA nForce board with DDR2 doesn't do so well, though.

    [IMG]

    Still very comparable, even in Crysis.

    I think you might have been reading this wrong and/or comparing the green bars with the blue. The green bars are all different CPUs.
     
  19. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Err,
    the 8800M GTX is not far off from your 8800 GT, champ:
    http://hothardware.com/Articles/NVIDIA_GeForce_8800M_Preview/?page=2

    As I see it, being 16 SPs short, plus being downclocked on the shader, memory and core (it would run too hot for a notebook otherwise), and so coming up about 9.6 GTexel/s short in texture fill rate, is okay with me.
    What I was saying is that the gap between the 9800M GT and the 8800M GTX is not statistically insignificant. All signs point to the GTX being better; Vantage isn't fair to use just yet, not until both are benched on the same board and processor.
    As far as I can see, the GTX is much faster, as it should be, and it is also more expensive. The 8800M GTX is not deprecated until the 9800M GTX is officially released; that's how NVIDIA works.
     
  20. Ad@m

    Ad@m Notebook Guru

    Reputations:
    24
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    I wouldn't be surprised if it is weaker, because the 9800M GT is so much cheaper than the 8800M GTX.
     
  21. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    Now you're saying a 600-point difference? 4-5K or 600, which is it?
    Also, this link doesn't show a 600-point difference. It might be the wrong one, or you're reading it wrong.

    If you even read the first paragraph, it says:
    "Futuremark's 3DMark06 benchmark scores depend solely on two subsystem evaluations: how powerful is your 3D graphics card(s) and just how good is your CPU at crunching numbers? In fact, memory bandwidth and access latencies have no noticeable effect on the final score. Of course, there are games that also show little benefit from improvements in the memory subsystem. Regardless, 3DMark06 is and always will be a synthetic benchmark and we put more stock in actual gaming performance results."
     
  22. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    Now I'm really lost... first it's not even a close comparison, and now it sounds like you're saying the 9800M is better.
     
  23. anexanhume

    anexanhume Notebook Evangelist

    Reputations:
    212
    Messages:
    587
    Likes Received:
    2
    Trophy Points:
    31
    Being completely legitimate: if this is the case and the 9800M GT is more like the 8800M GTS (both share the 64 number), then what the heck is the 9800M GTS? Furthermore, if the estimates that the HD 3870 sits between the 8800M GTX and GTS are correct, it would turn out to be the best-performing card for this particular config with DDR3, and for 195 less. I'm a little skeptical about it all, though. Too bad we can't bench the system for a while (especially with the 3870 in there).
     
  24. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    If you read my wall of text, psyc, I said that in some games DDR3 has a clear advantage and in some games it does NOT. I did say that in my original post. In some games you will see the difference and in some you won't. Vantage is not a fair comparison either, since it likes high memory bandwidth.
     
  25. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    I read it, gave me a headache though.
     
  26. v_c

    v_c Notebook Evangelist

    Reputations:
    124
    Messages:
    635
    Likes Received:
    0
    Trophy Points:
    30
    Nvidia's naming conventions give me a headache...
     
  27. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    You're telling me.
    I'm trying to find some fully buffered DIMM benchmarks of an equivalently clocked Xeon versus a DDR3 Core 2 counterpart to illustrate my point in 3DMark06, because it's not just a socket change; it's a chipset/socket/memory change.
    I could understand it if it were just two Q6600s on two different motherboards with different memory, since their socket and ability are the same. However, when a new chipset with a new socket and memory is introduced as a whole, without an alternative (you cannot get DDR2 Montevina), this will show that there is more of a difference clock-for-clock than we think. We just need Chaz or someone to step up and show everyone this, because he has the means to benchmark a 9800M GT on the same testbed as an 8800M GTX.
     
  28. metromike

    metromike Notebook Consultant

    Reputations:
    60
    Messages:
    161
    Likes Received:
    0
    Trophy Points:
    30
    @ Noel -

    You seem pretty knowledgeable about this stuff. If the gigaflops of the 9800M GT and the 8800M GTX are the same as reported by NVIDIA, shouldn't the cards perform the same? The 8800M GTX and 9800M GT are both rated at 360 gigaflops, and the 8800M GTS and the 9800M GTS are both rated at 240 gigaflops. I think it was noted earlier that the 9000 series is supposed to use shaders that are 40% more efficient than the previous generation (8000 series). Couldn't this account for the performance increase at the same number of stream processors?

    Perhaps the 9800M GTS is using last-generation stream processors or something; I don't know how else, unless it's just an underclocked 9800M GT, it could be rated at only 2/3 the processing power of the 9800M GT.

    I am not saying that DDR3 isn't playing a role in the 3DMark06 score. I'm just offering an alternative possibility as to why the GT is getting similar marks to the 8800M GTX with half the SPs.

    Can Justin or Paladin explain this gigaflops difference? It's vexing me and probably a lot of other people out there.
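    On the gigaflops question: for the G8x/G9x generation the rated figure is commonly computed as shader units × shader clock × 3 ops per clock (a MADD plus a MUL per shader per cycle). A hedged sketch using the clocks quoted in this thread; the 3-ops factor and the per-card numbers are my assumptions, not official NVIDIA statements:

```python
def marketing_gflops(shader_units, shader_clock_ghz, ops_per_clock=3):
    # ops_per_clock=3 assumes the dual-issue MADD+MUL counting used for
    # this generation's peak-throughput marketing figures.
    return shader_units * shader_clock_ghz * ops_per_clock

print(marketing_gflops(96, 1.25))    # 360.0 -> matches the 8800M GTX rating
print(marketing_gflops(64, 1.25))    # 240.0 -> matches the 240-gigaflop cards
print(marketing_gflops(112, 1.25))   # 420.0 -> matches the 9800M GTX figure
```

    Notably, 64 shaders at 1.25 GHz only comes to 240 GFLOPS; a 64-SP 9800M GT could only be rated at 360 with a roughly 1.875 GHz shader clock, so the official 360 figure would seem to imply 96 shaders.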
     
  29. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    We already have benchmarks up from Justin @ XoticPC for the 9800M GT, and I can tell you that the scores are almost identical to the 8800M GTX's on the old DDR2 platform.
     
  30. Dreidel

    Dreidel Notebook Evangelist

    Reputations:
    144
    Messages:
    315
    Likes Received:
    0
    Trophy Points:
    30
    Not to mention the drivers for the 9800M GT are premature, and newer drivers could contribute to a better benchmark.
     
  31. Anomaly10

    Anomaly10 Notebook Evangelist

    Reputations:
    83
    Messages:
    311
    Likes Received:
    0
    Trophy Points:
    30
    I think the best way to solve this would be to have someone at Sager fire up Crysis on an NP8660 and crunch out some framerates for us =P
     
  32. diabolical

    diabolical Notebook Consultant

    Reputations:
    0
    Messages:
    108
    Likes Received:
    0
    Trophy Points:
    30
    ^^ agree... lol
     
  33. Badday17

    Badday17 Notebook Guru

    Reputations:
    0
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    ^^
    ^^ I agree too......................
     
  34. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Gigaflops is a term for the amount of work a piece of hardware can handle.
    Both are listed at 360 gigaflops, but that doesn't necessarily mean they have the same texture fill rate (which is more important), although they likely have similar memory fill rates.

    About your question on more efficient shaders: this is actually incorrect. The claim that the shaders are more "efficient", per se, is a bit of a lie; looking at NVIDIA's new nomenclature, they've just started calling the shaders "cores". You've also got it reversed: the lower-shader-count GPU, when the two compete against each other, will fall big-time short in shader-intensive games, e.g. Crysis. To see the analogy, compare an 8800 GT to a 9600 GT.

    In general, what I expect to see when these two are pitted against each other is that as the amount of filtering increases (resolution, AA, etc.), the 8800M GTX will start to pull away. In some games the 9800M GT will tie, if those games happen to suit its architecture.

    The GT200 does have a "more efficient" shader, but don't expect to see that fabrication anytime soon. It's the GTX 260 and 280 that you see on shelves today. What NVIDIA did with the GT200 shaders is that there are now 3 streaming multiprocessors per Texture Processing Cluster, and there are now 10 TPCs as opposed to 8.
    This gives a higher shader count of 240 (10 × 3 × 8), which isn't really "more efficient", just more.
    So per TPC we have:
    24 shader processing units
    8 texture fetch units
    10 × 24 = 240 shaders
    10 × 8 = 80 texture fetch units
    plus a 512-bit memory bus.
    This all adds up to a huge and overpowered single-GPU solution.

    Also, regarding your "last generation" versus "current generation" shaders: there is really no such thing, my friend. From the G80 architecture (the 8800 GTX and the 8800 GTS 640 and 320), the G92 was a die shrink with onboard HD decoding.
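    The GT200 unit counts in that post reduce to simple multiplication; a sketch using the figures as stated above (the poster's numbers, treated as assumptions rather than verified specs):

```python
tpcs = 10          # texture processing clusters on GT200
sms_per_tpc = 3    # streaming multiprocessors per TPC
sps_per_sm = 8     # shader (stream) processors per SM
tmus_per_tpc = 8   # texture fetch units per TPC

shaders = tpcs * sms_per_tpc * sps_per_sm   # 10 x 3 x 8 = 240
tmus = tpcs * tmus_per_tpc                  # 10 x 8 = 80
print(shaders, tmus)   # 240 80
```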
     
  35. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Can you please shut up about this?
    Please google unified driver architecture!

    THERE IS NO SUCH THING AS PREMATURE DRIVERS, THEY'RE UNIFIED DRIVERS THAT SUPPORT PRESENT AND PAST NVIDIA GPUS HENCE THE PRAISE
     
  36. metromike

    metromike Notebook Consultant

    Reputations:
    60
    Messages:
    161
    Likes Received:
    0
    Trophy Points:
    30
    Noel,

    Thanks for taking time to go through all of that.

    Does texture fill rate contribute to the amount of work the card can handle (gigaflops)? If it does, then by conservation mustn't something balance out in terms of processing on the 9800M GT? What would that be? Could this card be better for games that aren't texture intensive (RTS games, etc.) or are you thinking that the 8800M GTX will perform better in pretty much all cases?

    I wish XoticPC and PowerNotebooks offered the 8800M GTX as an option at the very least... :-\

    (Though I know it's not under their control)
     
  37. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Texture fill rate is something I consider vital; it's basically the rate at which textured pixels can be rendered to the screen. E.g. if you're playing a game like Crysis, a higher texture fill rate will usually translate into better frame rates. Texture fill rate has been becoming less important since the introduction of shader cores, though, since they offload the stress; there was a good definition, courtesy of GPUReview, that is not available anymore.
    Keep in mind it's a little old, but you should get it if you compare today's video cards to the ones in there; e.g. a 6 series would be equivalent to a 9 series in this day and age, and so on.
    Eurocom offers the 8800M GTX in the 860TU; I may get it, but I'm not sure. I may go back to my original plan and just wait it out until I find a better-looking laptop. The Clevo is awfully heavy (nearly 8 pounds with battery) and the same size as an M15x, but a lot less good-looking. I may or may not get it, depending on my ability to force a quad in there.
     
  38. Nirvana

    Nirvana Notebook Prophet

    Reputations:
    2,200
    Messages:
    5,426
    Likes Received:
    0
    Trophy Points:
    0
    So, the 9800M GT is not close to the 8800M GTX because:
    1) the new system uses DDR3,
    or
    2) one card is actually slower than the other?
     
  39. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    One system is using a completely new platform with a new socket. This is more than just a change in RAM, or a comparison of, say, a Q6600 on DDR3 versus DDR2. This Socket P platform is designed for one memory interface, DDR3, which allows Intel to change the way the processor and memory communicate, since the speed is now established at a baseline of at least 1066 MHz (DDR3's lowest JEDEC-approved speed). ALSO, the 9800M GT is statistically slower than the 8800M GTX in every facet I've seen so far, so I'd like to know how people are deducing they're the same.
    An X9000 benchmark on a 9800M GT isn't really fair either, since that chip is much faster (2.8 GHz I think, or 3.0? Either way, compared to our beloved T9300 that's at least 300 megahertz more per core).
     
  40. v_c

    v_c Notebook Evangelist

    Reputations:
    124
    Messages:
    635
    Likes Received:
    0
    Trophy Points:
    30
    Yep, that is correct. I've seen a benchmark of an 8800M GTS with an X9000 score around the same, and similar scores in SM2 and SM3.

    The 9800M GT will definitely not perform as well as the 8800M GTX in real-world applications (i.e. gaming). It will land somewhere between the 8800M GTS and 8800M GTX, but probably closer to the GTS.

    The 9800M GTX, on the other hand, will be quite a lot better than the 8800M GTX: 112 stream processors and 420 GFLOPS.
     
  41. NoelGallagher

    NoelGallagher Notebook Consultant

    Reputations:
    39
    Messages:
    147
    Likes Received:
    0
    Trophy Points:
    30
    Absolutely, v_c, I'm with ya, man.
    I'm just fed up with people taking some words as gospel; I'm trying to help everyone get educated on what a video card's primary attributes actually are.
     
  42. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    It all depends on how "watered down" the 9800M is when we get it.
     
  43. Guiming

    Guiming Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    Which card is more future-proof, though? I do game, but I don't care that much about running every single game at the highest resolution with 132432141324x antialiasing and other stuff; I just want a gfx card that will last me a few years and still run most games I throw at it well.
     
  44. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    I don't think they are selling the 8800M series anymore here in the US. So I think that leaves you with the 9800M GT or the 9800M GTX.
     
  45. Guiming

    Guiming Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    I realise that; I just want to know if I'm getting more or less for my money, since the prices are comparable.
     
  46. Badday17

    Badday17 Notebook Guru

    Reputations:
    0
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    I went with the 9800M GTX in my order (NP5976) at Xotic, but I would like to see some game benchmarks on the 9800M GT.
    I wanted to be more future-proof. I hope I spent my money wisely. Compute cores: 112, gigaflops: 420.....
     
  47. DROT

    DROT Notebook Guru

    Reputations:
    0
    Messages:
    66
    Likes Received:
    0
    Trophy Points:
    15
    I don't have time to wait until the early August ETA, so it's the 9800M GT for me :)
     
  48. Nirvana

    Nirvana Notebook Prophet

    Reputations:
    2,200
    Messages:
    5,426
    Likes Received:
    0
    Trophy Points:
    0
    can someone answer my question in post#88?
     
  49. MKang25

    MKang25 NBR Prisoner

    Reputations:
    179
    Messages:
    1,715
    Likes Received:
    0
    Trophy Points:
    55
    One card is slower than the others.
     
  50. psycroptik

    psycroptik Notebook Consultant

    Reputations:
    117
    Messages:
    246
    Likes Received:
    0
    Trophy Points:
    30
    Everything we've seen so far leads us to believe that the 9800M GT is on par with the 8800M GTX. We need some Crysis benchmarks at different resolutions to really tell. My guess is the 9800M GT will outperform the 8800M GTX at higher resolutions by 5-10%.
     