The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Why is X1400 better than 7400? (No Vague answers plz)

    Discussion in 'Gaming (Software and Graphics Cards)' started by latestgood, Sep 5, 2006.

  1. latestgood

    latestgood Notebook Consultant

    Reputations:
    30
    Messages:
    251
    Likes Received:
    0
    Trophy Points:
    30
    Hello,

    A lot of people are saying that the X1400 is better than the 7400. People keep saying that it has better features. This is a very vague answer and I really don't know why it's better. It would be great if someone could explain to me in detail why it's better.

    I know that the 7400 has a 64-bit memory bus, but it has higher core and memory clock speeds than the X1400. Even though the 7400 is 64-bit, it runs at higher speeds.

    Thank you~
     
  2. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    A lot of a video card's performance can be tied to its memory bus. The difference between 128-bit and 256-bit is night and day. 128-bit is the accepted standard for "good."

    64-bit seriously cripples a card's performance, and oftentimes the performance deficit is very nearly linear: take two otherwise identical cards, one with a 64-bit bus and one with a 128-bit bus, and the 64-bit one will perform little more than half as well as the 128-bit one. Mercifully, this linear drop doesn't translate as heavily to the difference between 128-bit and 256-bit. It's also worth noting that as technologies mature, a part with half the memory bus can still outperform an older part (e.g. the GeForce 7600GT is on par with a GeForce 6800GT, and faster in some cases).
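
    To put rough numbers on that, peak memory bandwidth is simply bus width times effective memory clock. Here is a minimal sketch in Python; the 700 MHz effective clock below is a made-up placeholder for illustration, not the actual spec of either card:

        # Peak memory bandwidth = (bus width in bits / 8) bytes * effective memory clock
        def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
            bytes_per_transfer = bus_width_bits / 8  # bits -> bytes
            return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

        # Same hypothetical 700 MHz effective (DDR) clock on both buses:
        print(peak_bandwidth_gb_s(64, 700))   # 64-bit bus  ->  5.6 GB/s
        print(peak_bandwidth_gb_s(128, 700))  # 128-bit bus -> 11.2 GB/s

    Halving the bus halves the peak bandwidth, which is why the drop is so nearly linear whenever the card is bandwidth-bound.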

    The 7400 also has half as many Raster Operators (ROPs, the last stage of the graphics pipeline) as the X1400, and relies much more heavily on TurboCache than the X1400 relies on HyperMemory.
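
    The ROP gap can be sketched the same way: peak pixel fillrate is ROPs times core clock. The ROP counts below just follow the half/double relationship described above, and the 450 MHz core clock is again a placeholder, not a verified spec:

        # Peak pixel fillrate = ROPs * core clock
        def pixel_fillrate_mpix_s(rops, core_clock_mhz):
            return rops * core_clock_mhz

        # Same hypothetical 450 MHz core clock on both parts:
        print(pixel_fillrate_mpix_s(2, 450))  # half the ROPs   ->  900 Mpix/s
        print(pixel_fillrate_mpix_s(4, 450))  # double the ROPs -> 1800 Mpix/s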
     
  3. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    To be fair though, TurboCache is a much better implemented technology than HyperMemory. That, and I hate ATI's drivers. Maybe things will change now that AMD is in charge, but ATI's drivers have issues that I've never run into with NVIDIA.

    What Pulp didn't say:

    The X1400 has a 128-bit memory bus, and the Go 7400 has a 64-bit one. So that's a big point in the X1400's favor.
    You'll also note that Pulp's notebook has an NVIDIA GPU in it as well ;)

    Even better: NVIDIA vs. ATI, including almost all relevant specs of the different cards.
     
  4. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    The only thing I have to say is this: I believe ATI recently released drivers that, *through code alone*, were able to improve the performance of some of their GPUs by 10-20%.

    That implies two things: 1) they don't write the best drivers out there, and 2) they are at least learning... maybe someday their drivers will be top notch.

    P.S.: Recently, when I was in Best Buy shopping to upgrade a PCIe GPU, they recommended NVIDIA over ATI solely for driver-related reasons.
     
  5. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231

    The 6.8 release added almost 100 3DMarks to my 6.7 score. They are getting better with their drivers. I just hope that AMD doesn't do anything to limit Intel/ATI performance when compared to AMD/ATI performance. Truth be told, even though I'm more ATI-sided, TurboCache is much better used than HyperMemory. But I look at it this way: the dedicated VRAM is what makes the biggest impact.
     
  6. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    Nvidia and ATI both improve performance in games through driver releases; it's called optimization, and it's perfectly normal to do so. Just like, say, Windows programmers are able to tighten up their code to improve performance, that's what Nvidia and ATI do when they release new drivers. When these performance increases come at a cost to image quality, though, that's a different matter. Both companies have done this in the past (Nvidia infamously with their FX range of cards, as they were so piss-poor the only way they could compete with the 9800 series was to cheat), but neither company has resorted to such tactics for quite some time now.
     
  7. KGann

    KGann NBR Themesong Writer

    Reputations:
    317
    Messages:
    2,742
    Likes Received:
    0
    Trophy Points:
    55
    I used to have an FX5500 back in the day, and it ran dang good! It ended up melting, and PNY wouldn't allow me to RMA it... but it was a good card in its day!