The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    So I have been reading a bit.. correct me if I'm wrong.

    Discussion in 'Other Manufacturers' started by DIGev, Jun 15, 2007.

  1. DIGev

    DIGev Notebook Consultant

    Reputations:
    2
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30
    I brought this up before in another thread, but I'm starting this one because I want to argue from what I now know.

    The matter is about the nVIDIA 8600M GT GPU that comes with the 6625WD/6224W/6324 models of the Znotes.

    Zepto has confirmed that these will be shipping with 512MB of GDDR2. On nVIDIA's website the specs all show GDDR3, but of course I now appreciate that the builders choose what goes on the card, so GDDR2 models are possible; in this case, Zepto's. Let me show you some numbers:

    8600M GT - Stream Processors 32 :: Core Clock 475 MHz :: Shader Clock 950 MHz :: Memory Clock 700 MHz (1400 MHz) :: Texture Fill Rate 7.6 billion texels/s :: Max Memory Bandwidth 22.4 GB/s (128-bit)

    Desktop 8600 GT - Stream Processors 32 :: Core Clock 540 MHz :: Shader Clock 1190 MHz :: Memory Clock 700 MHz (1400 MHz) :: Texture Fill Rate 8.64 billion texels/s :: Max Memory Bandwidth 22.4 GB/s (128-bit)

    8700M GT (supposed specs) - Stream Processors 32 :: Core Clock 625 MHz :: Shader Clock 1250 MHz :: Memory Clock 800 MHz (1600 MHz) :: Texture Fill Rate 10 billion texels/s :: Max Memory Bandwidth 25.6 GB/s (128-bit dual-thing)
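
    Just to sanity-check those bandwidth figures: they follow directly from the effective (double data rate) memory clock times the bus width. A quick Python sketch, using only the clocks and bus widths quoted above:

    ```python
    # Memory bandwidth = effective DDR clock x bus width in bytes.
    # DDR transfers data twice per clock, so effective rate = 2 x base clock.

    def bandwidth_gb_s(base_clock_mhz, bus_bits):
        effective_transfers = 2 * base_clock_mhz * 1e6   # transfers per second
        return effective_transfers * (bus_bits / 8) / 1e9  # bytes/s -> GB/s

    for name, clock_mhz, bus_bits in [
        ("8600M GT / desktop 8600 GT", 700, 128),
        ("8700M GT (supposed)", 800, 128),
    ]:
        print(f"{name}: {bandwidth_gb_s(clock_mhz, bus_bits):.1f} GB/s")
    # 8600M GT / desktop 8600 GT: 22.4 GB/s
    # 8700M GT (supposed): 25.6 GB/s
    ```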

    Ok, so just looking at the numbers we can see that the 8600M GT is slightly slower than the desktop 8600 GT, and the 8700M GT seems to be the best of them all. Naturally the desktop version could be better, but consider that the 8600M GT only needs 43W to run at stock speeds; an 8800 Ultra needs 175W. I checked the 3DMark06 score of a desktop with an 8600 GT (similar setup to a 6224W or 6625WD: 2GHz C2D, 2GB RAM) and it gets a score of about 5000 at 1280x1024 resolution. Considering that desktop components, especially the CPU, are generally more powerful because of the luxury of power, it might be correct to say that an 8600M GT is indeed as powerful as an 8600 GT if clocked at the same speeds, no?

    The performance of the 8600M GT is really awesome for a mobile card, and according to this http://service.futuremark.com/compare?3dm06=1791961 (Zepto claims 3500 points for 3DMark06 at 1440x900), it is the equivalent of a Go 7900 GS, a card with a 256-bit bus width, whereas the 8600M GT has 128-bit!

    So here is my point (finally): Zepto says the 8600M GT coming with their laptops features GDDR2 clocked at 400MHz (800MHz effective). That cuts the memory bandwidth nearly in half. I understand that similarly clocked GDDR2 and GDDR3 show no real difference in real-world performance (or so I have been told). However, since GDDR3 can run at lower voltages, it can reach clock speeds higher than GDDR2 (2x, as we can see); at that point the power consumption may be the same, but the memory bandwidth is almost twice as good. Is it me, or is putting GDDR2 instead of GDDR3 on the 8600M GT a big mistake in terms of performance/power?
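
    To put numbers on "nearly in half" (same formula as in the sketch above, 128-bit bus in both cases):

    ```python
    # Zepto's GDDR2 at 400 MHz (800 MHz effective) vs the reference
    # GDDR3 spec at 700 MHz (1400 MHz effective), both on a 128-bit bus.
    gddr2 = 2 * 400e6 * (128 / 8) / 1e9   # 12.8 GB/s
    gddr3 = 2 * 700e6 * (128 / 8) / 1e9   # 22.4 GB/s
    print(f"GDDR2: {gddr2:.1f} GB/s, GDDR3: {gddr3:.1f} GB/s")
    print(f"GDDR2 keeps {gddr2 / gddr3:.0%} of the bandwidth")  # ~57%
    ```

    So it's a 43% cut, which is close enough to half for me.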
     
  2. wave

    wave Notebook Virtuoso

    Reputations:
    813
    Messages:
    2,563
    Likes Received:
    0
    Trophy Points:
    55
    You are right about pretty much everything you said.

    But I think the GDDR2 will make less of a difference in games than it did in 3DMark06. Also, it is nothing to get too upset about. You are comparing this to the G1S, right? I am pretty sure that the Zeptos and the G1S will be able to play the same games at the same settings. The G1S might get a few extra frames in some games, but it won't be able to use higher settings at which the GDDR2 would make the Zeptos unplayably slow.

    Some games might even run better on the Zepto because of its 512MB compared to the 256MB in the G1S.
     
  3. DIGev

    DIGev Notebook Consultant

    Reputations:
    2
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30
    I'm not upset over this; I'm aware that it doesn't show so much in real-game performance. My quandary is the following: if GDDR3 uses less power and emits less heat while performing similarly to or even better than GDDR2, then what sense does it make, especially in the case of laptops, to put in GDDR2, when the entire idea is to work towards better performance/power?

    1440x900 isn't exactly a large resolution (close to 1280x1024), but for people with 6625WDs, which have a 1680x1050 resolution, this might be a problem if they want to run games at native resolution. How much use is 512MB of GDDR2 when GDDR3 could put that memory to MUCH better use?
     
  4. Petrov

    Petrov Notebook Deity

    Reputations:
    212
    Messages:
    861
    Likes Received:
    78
    Trophy Points:
    41
    Like most things in life - price! The Zepto reps have previously indicated they felt GDDR2 gave an acceptable real-world price vs performance trade-off (indeed, they may well be right!). No one at Asus has graced these forums to explain why they used 256MB instead of 512MB, but I suspect it's the same reason (rather than a technical limitation). I'm speculating, of course.

    Petrov.
     
  5. DIGev

    DIGev Notebook Consultant

    Reputations:
    2
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30
    Price? Zepto sells DDR2-800 for the notebooks when Santa Rosa can't even take advantage of 800MHz; you only get slightly better timings out of it. Is that price/performance? No, because DDR2-800 on the Zepto site costs way, way too much for the increase in performance; approximately €150 more than having 2GB of DDR2-667. Are better timings worth €150? I think opting for 512MB GDDR3 8600M GTs instead of stocks of DDR2-800 would have a greater price/performance ratio. Of course I may be wrong because I'm not so sure how much GDDR3 costs compared to DDR2-800. Say Zepto offered you this choice:

    1. Upgrade your 2GB of DDR2-667 to DDR2-800 for €150: you get better timings.
    2. Upgrade your 8600M GT from 512MB GDDR2 to 512MB GDDR3 for €150.

    Right now Zepto gives option 1... why not option 2? Again, I may be wrong about the price difference between 512MB GDDR3 and GDDR2, but it doesn't seem to me it could be more than €200.
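
    To show how little those "better timings" actually buy, here is a rough sketch; the CL5 rating for both speeds is my assumption for typical modules of the era, not Zepto's published spec:

    ```python
    # Absolute CAS latency in ns = CL cycles / I/O clock.
    # DDR2-667 runs its I/O clock at 333 MHz, DDR2-800 at 400 MHz.
    # CL5 for both speeds is an assumed typical rating, not Zepto's spec.
    def cas_latency_ns(cl_cycles, io_clock_mhz):
        return cl_cycles / io_clock_mhz * 1000  # cycles / MHz -> ns

    print(f"DDR2-667 CL5: {cas_latency_ns(5, 333):.1f} ns")  # ~15.0 ns
    print(f"DDR2-800 CL5: {cas_latency_ns(5, 400):.1f} ns")  # 12.5 ns
    # Roughly a 17% drop in absolute CAS latency for the extra €150.
    ```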
     
  6. Thorne

    Thorne Notebook Evangelist

    Reputations:
    15
    Messages:
    415
    Likes Received:
    27
    Trophy Points:
    41
    I've understood that it's significantly more expensive to use GDDR3 instead of GDDR2.

    And from various articles and threads I've read, 256MB of GDDR3 is something like 1/3 faster than 512MB of GDDR2. I wonder, will even the 8600M GT be able to fully use all that 512MB of memory? The 7600 certainly can't. :D
     
  7. DIGev

    DIGev Notebook Consultant

    Reputations:
    2
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30
    The first commercial product to claim to use "DDR2" technology was the NVIDIA GeForce FX 5800 graphics card. However, it is important to note that the GDDR-2 memory used on graphics cards is not DDR2 per se, but rather an early midpoint between DDR and DDR2 technologies. Using "DDR2" to refer to GDDR-2 is a colloquial misnomer (knowing that DDR2 is not like GDDR2, but since it's common usage it's OK to say DDR2... just in case people didn't get this part). In particular, the performance-enhancing doubling of the I/O clock rate is missing. It had severe overheating issues due to the nominal DDR voltages. ATI has since developed the GDDR technology further into GDDR3, which is more true to the DDR2 specifications, though with several additions suited for graphics cards.

    GDDR3 is now commonly used in modern video cards. However, further confusion has been added to the mix with the appearance of budget and mid-range graphics cards which claim to use "DDR2". These cards actually use standard DDR2 chips designed for use as main system memory. These chips cannot achieve the clock speeds that GDDR3 can but are inexpensive enough to be used as memory on mid-range cards.

    - Wikipedia
     
  8. Z-MAN

    Z-MAN Notebook Enthusiast

    Reputations:
    4
    Messages:
    18
    Likes Received:
    0
    Trophy Points:
    5
    Every country, the same things.

    Stupid
     
  9. Petrov

    Petrov Notebook Deity

    Reputations:
    212
    Messages:
    861
    Likes Received:
    78
    Trophy Points:
    41

    For me, without doubt, 2 - I'm hearing you! I wonder if another reason was the availability of GDDR3... maybe Asus hoovered it all up for the G1S, which is why no one else seems to be using it on their cards yet. Again, I am speculating, but that's what nerdy tech forums are for, right? ;)

    Petrov.
     
  10. wave

    wave Notebook Virtuoso

    Reputations:
    813
    Messages:
    2,563
    Likes Received:
    0
    Trophy Points:
    55
    Something like this is not possible. You would need two kinds of barebone, because you need different motherboards. The GDDR is packaged with the GPU, which is soldered onto the motherboard.

    Changing the RAM is much easier. Just put in a different stick of RAM.
     
  11. DIGev

    DIGev Notebook Consultant

    Reputations:
    2
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30
    I thought the Santa Rosa platform came with an MXM module (or whatever it's called), so the GPU is NOT soldered to the motherboard. That's how Dell, I think, was able to provide graphics card upgrades.

    Anyway, that is not my point. They are shipping with GDDR2, but the issue is why stocks of DDR2-800 were bought over stocks of GDDR3 512MB 8600M GTs, when clearly the latter provides the greater benefit. I would rather have just the choice of a GDDR3 512MB 8600M GT and DDR2-667, because later I can upgrade the RAM to DDR2-800 if I think I need better timings.
     
  12. Thorne

    Thorne Notebook Evangelist

    Reputations:
    15
    Messages:
    415
    Likes Received:
    27
    Trophy Points:
    41
    Well, no company's perfect. :)

    And MXM is an option to use, not a standard, in Centrino Pro (Santa Rosa) laptops.
     
  13. DIGev

    DIGev Notebook Consultant

    Reputations:
    2
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30
    True...

    Even now the new Znotes are really powerful machines. I'm not a tech junkie, so maybe there is more to this issue (if it even is one) that I'm unaware of, so I could be completely wrong. Though after much reading I still think, in my own opinion, that a GDDR3 solution would have been better than stocking up on DDR2-800s. Gah, I'm too used to complaining about things... I mean, I have ordered my 6224W... I couldn't change it now. I like speculative complaining.
     
  14. Thorne

    Thorne Notebook Evangelist

    Reputations:
    15
    Messages:
    415
    Likes Received:
    27
    Trophy Points:
    41
    Well, of course I'd love it too if Zepto had used GDDR3 in the new models.

    But I think one of Zepto's selling tricks is to keep prices lower than their competitors', and therefore they couldn't take the more expensive GPU memory.

    They had those PC6400 sticks even when I bought my laptop (roughly 3 months ago), and I believe they noticed that they don't sell all that well, so they have to somehow get rid of them. That's just my guess.
     
  15. wave

    wave Notebook Virtuoso

    Reputations:
    813
    Messages:
    2,563
    Likes Received:
    0
    Trophy Points:
    55
    If you read the HP review from yesterday you will see that it also uses 800MHz RAM. I am not sure how it is supported. I asked Andrew for CPU-Z screenshots and some memory benchmarks. He said he is on the road this weekend but will do them next week. We will know then.