The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ATI Mobility Radeon X2300

    Discussion in 'Gaming (Software and Graphics Cards)' started by Xolith, Mar 5, 2007.

  1. Xolith

    Xolith Notebook Enthusiast

  2. Greg

    Greg Notebook Nobel Laureate

  3. HavoK

    HavoK Registered User

    Basically, it's nothing special at all...
     
  4. FREN

    FREN Hi, I'm a PC. NBR Reviewer

    Don't buy the A8JR. Buy the A8JP instead. It has the Radeon X1700, which is a much better card than the X2300, for only about $50 US more.

    The X2300 isn't DX10 compatible - it's just a refresh of the X1300, which, as you can tell by the numbering, is weaker than the X1700 (in fact, the X1700 is more than twice as powerful).

    Don't let ATI's marketing scheme fool you - there's nothing special about the X2300.
     
  5. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    There are some interesting details about the X2300 that people don't know.

    The typical/regular X2300 is basically the same as an X1400 from what I hear, just with a bit of modification toward being used with Vista. However, the X2300 HD is supposed to be a whole other animal, as it's slated to use a unified shader architecture and be fully DX10 capable.
     
  6. Xolith

    Xolith Notebook Enthusiast

    Can't seem to find the A8JP in the UK. Thanks for your help, as I probably would have bought the Asus because the X2300 sounded newer and better than the X1400.
     
  7. mujtaba

    mujtaba ZzzZzz Super Moderator

    It seems to be equal to GeForce Go X1450
    (GeForce 7100 is nothing but a rebranded 6200)
     
  8. HavoK

    HavoK Registered User

    Now that is one seriously confusing post....

    GeForce X1450?? What's the 7100 got to do with it either?
     
  9. link1313

    link1313 Notebook Virtuoso

    Alright, stop saying it's an X1450 if you don't know what you're talking about.

    First off, we haven't even seen benchmarks for an X1450 or an X2300, so stop with the 'it seems to be a GeForce Go X1450'. What's a GeForce X1450, and how do we even know the X2300 ~ X1450? Give me a link or something; don't just speculate.
     
  10. FREN

    FREN Hi, I'm a PC. NBR Reviewer

    Do you mean Radeon X1450 ... ?
     
  11. Mayer

    Mayer Notebook Enthusiast

    So the MR X1900 is still the latest and greatest from ATI??

    A little beside the point... but will DX10 games still be playable with DX9 cards, like the X1900? Is Crysis a DX10 game?
     
  12. jak3676

    jak3676 Notebook Consultant NBR Reviewer

    You can look up the specs on ATI/AMD's website. They list the number of shaders, memory, etc. It is a rebadged X1450; there's no room to dispute it.

    X2300 specs
    http://ati.amd.com/products/mobilityradeonx2300/specs.html

    X1450 specs
    http://ati.amd.com/products/mobilityradeonx1450/specs.html

    The Go 7100 thing is just Nvidia doing the same kind of thing. Just as ATI is releasing a previous-generation part (X1450) under a new-generation name (X2300), the Nvidia Go 7100 is a rebadged Go 6200 with the same specs. Both companies have done this before, and both will likely do it again. It's just marketing (although slightly misleading for the uninformed). There's nothing wrong with either part, so long as you recognize it for what it is and is not.
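    A quick way to see the rebadging is to diff the two spec sheets field by field. Below is a minimal sketch in Python; the spec values are illustrative placeholders, not ATI's published figures, so fill in the real numbers from the two pages linked above:

        # Minimal sketch: compare two GPU spec sheets to spot a rebadge.
        # NOTE: the values below are placeholders for illustration, NOT
        # ATI's published figures -- copy the real ones from the spec pages.

        x1450_specs = {
            "pixel_pipelines": 4,         # placeholder value
            "vertex_shaders": 2,          # placeholder value
            "memory_interface_bits": 64,  # placeholder value
            "directx_level": 9,
        }

        x2300_specs = {
            "pixel_pipelines": 4,         # placeholder value
            "vertex_shaders": 2,          # placeholder value
            "memory_interface_bits": 64,  # placeholder value
            "directx_level": 9,           # not DX10, despite the X2x00 name
        }

        def diff_specs(a, b):
            """Return only the fields where the two sheets differ."""
            return {k: (a[k], b[k]) for k in a if a[k] != b[k]}

        print(diff_specs(x1450_specs, x2300_specs)
              or "Identical specs: same part, new name.")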

    As far as running DX10 games on a DX9 card (or a DX9 OS for that matter):
    For the next few years, game makers will have to release their games to support both versions. All of the upcoming games that are currently being discussed (Crysis, Alan Wake, etc.) will be playable on DX9 hardware (and OS). There will likely be some graphical bells and whistles that do not work on DX9, however. People like to jump from my last statement and assume it means a low-end DX10 card will be better than a high-end DX9 card. This is not true for high-end games. If you have a low-end card (little memory, few shader units, weak memory interface, etc.), the card will not be able to render the game's graphics at high settings (possibly not even medium settings if you get a really low-end card). The new DX10 bells and whistles will need to be disabled in order to free up the GPU for basic drawing and shading.
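    To illustrate the dual-path support described above: a game can detect the supported DirectX level at startup and simply drop the DX10-only effects on DX9 hardware. This is a rough sketch only; the effect names are hypothetical, not from any real game:

        # Rough sketch of dual DX9/DX10 support: detect the supported level
        # and disable effects the hardware can't run. Effect names here are
        # hypothetical examples, not from any actual game.

        DX10_ONLY_EFFECTS = {"geometry_shader_particles", "volumetric_fog"}

        def build_effect_set(supported_dx_level, requested_effects):
            """Keep every requested effect the card can actually run."""
            if supported_dx_level >= 10:
                return set(requested_effects)
            return set(requested_effects) - DX10_ONLY_EFFECTS

        # On a DX9 card the DX10-only effect is silently dropped:
        print(build_effect_set(9, {"shadows", "volumetric_fog"}))   # {'shadows'}
        print(build_effect_set(10, {"shadows", "volumetric_fog"}))  # both kept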

    A low-end DX10 card will be better than a low-end DX9 card, and a high-end DX10 card will be better than a high-end DX9 card. But if you are looking to play high-end games at high resolutions, you need to compare the number of shading units, memory, clock speed, memory interface, and so on, not just the DX support level. It's kinda like trying to compare a T2500 to a T5200: there's a lot more you need to look at than just the generation of the chip.
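    To make the point concrete that raw specs, not the DX level, decide performance, here is a back-of-the-envelope comparison. The scoring formula and every number in it are invented for illustration; real performance depends on the game, drivers, and architecture:

        # Back-of-the-envelope illustration: a high-end DX9 card can out-muscle
        # a low-end DX10 card. The formula and all numbers are invented for
        # this example; real performance varies by game, drivers, architecture.

        def rough_score(shader_units, core_mhz, mem_bus_bits, mem_mhz):
            """Crude proxy for raw power: shading rate times memory bandwidth."""
            return (shader_units * core_mhz) * (mem_bus_bits * mem_mhz)

        low_end_dx10 = rough_score(shader_units=8, core_mhz=450,
                                   mem_bus_bits=64, mem_mhz=400)
        high_end_dx9 = rough_score(shader_units=48, core_mhz=600,
                                   mem_bus_bits=256, mem_mhz=700)

        # True: DX support level alone tells you little about speed.
        print(high_end_dx9 > low_end_dx10)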

    Sorry if this seems like ranting. I see lots of people just about to make this mistake.
     
  13. guy_from_oz

    guy_from_oz Newbie

    Hey there,

    I just wanted to say that I bought an Asus A8Jr, which has the X2300 in it, and the card does seem to be worth something. For one thing, I can play C&C 3 on it at 1280x800 with all details set to medium, running at a constant 35 fps. Also, it can play Battlefield 2142 at 1024x768 with all details turned down low but view distance high, at a steady 60 fps. However, it isn't DX10 compatible, which really annoys me :mad: