The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    x2300 is not DirectX 10 compliant

    Discussion in 'Gaming (Software and Graphics Cards)' started by Mr._Kubelwagen, Feb 7, 2007.

  1. Mr._Kubelwagen

    Mr._Kubelwagen More machine now than man

    Reputations:
    398
    Messages:
    744
    Likes Received:
    0
    Trophy Points:
    30
    Ok, so I've been reading a lot about the x2300, available in the A8Jr, and it has been touted by forum contributors as being the "first mobile DX10 video card". Unfortunately, according to ATI's own site, the X2300 is only DirectX 9.0c capable.

    This leads me to wonder if ATI released the x2300 for the sole purpose of confusing hopefuls like us into buying this card for its reputed "DirectX 10 compatibility". Just because it's part of the x2000 line doesn't mean it's DX10 compliant.

    I just hope nVidia or ATI release something soon, and stop trying to pass off their "new" technology as being state of the art.
     
  2. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    This kinda sucks. While I didn't expect much from the x2300, I at least expected it to be a low-end DX10 card, but now it just seems like a rather useless remake of the x1300. They should just call it the x1350, but I guess "x2300" just sounds more advanced.
     
  3. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
  4. kungfu11

    kungfu11 Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    0
    Trophy Points:
    5
    Wow, sucks... so what is it? Something better than the x1300 and x1400?
     
  5. crappyGPU

    crappyGPU Notebook Consultant

    Reputations:
    102
    Messages:
    248
    Likes Received:
    0
    Trophy Points:
    30
    Too bad, I was hoping to see a mobile DX10 card review :(
     
  6. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    To me it sounds like it only consumes less power but is still as weak as the x1300.
     
  7. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Gah, that's ridiculous... At least NVidia never pulls that trick. When they launch a new card, it's always easy to see which generation it belongs to and which feature set it has. 6xxx? SM3.0 support, all of them. 8xxx? Every single one is DX10.

    Only ATI mixes the numbers around: "Ok, X2xxx will be DX10, *except* for the X2300"... It was the same with the 9200, as I remember. Or was it the 9000? And probably a few others... (Unless of course the entire X2xxx series will be DX9 only, which would be... disappointing.)
     
  8. Intensity

    Intensity Notebook Geek

    Reputations:
    5
    Messages:
    79
    Likes Received:
    0
    Trophy Points:
    15
    Well, there was the GeForce4 MX440. Bought it when I was a newbie, thought it was a good graphics card, but the rest is history.
     
  9. mujtaba

    mujtaba ZzzZzz Super Moderator

    Reputations:
    4,242
    Messages:
    3,088
    Likes Received:
    516
    Trophy Points:
    181
    Same goes for me :( ...
    Don't worry, both companies use the dirtiest tricks available when it comes to saving their necks / making loads of money. Few will forget what nVidia did with the GeForce FX series. Though ATI's record has been much cleaner, they sometimes pull this kind of stuff, especially when people are starving for DX10 laptops...
     
  10. HavoK

    HavoK Registered User

    Reputations:
    706
    Messages:
    1,719
    Likes Received:
    0
    Trophy Points:
    55
    The GeForce 4MX was always intended to be a value-end product. It did what it said on the tin fairly well, regardless. It was never really touted as a DX8 card at the time. This X2300 is ridiculous though - as Jalf mentioned, it's not the first time ATI has messed around with series numbers...

    So basically it's an X1300 1.5... although one slightly positive thing is that it'll increase your battery life. It's hard to sing such praise after all the misleading, though.
     
  11. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Well, technically this is not being touted as DX10 either, if you want to ignore the nomenclature; the 4MX was worse, since it was GeForce 2 based. At any rate, yes, I disagree with both of these things.
     
  12. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    True about the GF4MX. That was the same dirty trick. But at least that was ages ago. Since GF5, they haven't done that. But apparently the trend is still alive and kicking with ATI...
     
  13. MGW

    MGW Notebook Guru

    Reputations:
    31
    Messages:
    69
    Likes Received:
    0
    Trophy Points:
    15
    I think it sucks, and this could maybe damage ATI's image. A lot of people will not be pleased by this, since they were expecting it to be DX10, especially since we are so close to the intro of DX10...
     
  14. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Dunno, a lot of people won't care. And those who do care already know it's a DX9 chip, so they won't buy it in the first place, so where's the loser?

    Other than us poor sods who try to keep up with the GPU market, that is... ;)

    But I guess it's no worse than Intel confusing the market by naming their CPUs "Core Duo" when everyone is talking about dual-core...
     
  15. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,090
    Trophy Points:
    931
    What a letdown this information is; thanks for posting it, though. At least we find this out early.

    I had the mobile version of the GeForce 4 MX 440 in my old Athlon 64-based HP laptop (zv5000z), and that was garbage. I have terrible memories of that whole experience. The fact that HP touted the zv5000z as a gaming notebook and only used a DX7 graphics card in it bothered me. Of course, I found that out a month or so after I bought it. I spent $1400 on that machine and couldn't even play the newer DX8/DX9 games (at the time, I was ignorant about graphics cards in general).
    Enough of my rant . . .
     
  16. FREN

    FREN Hi, I'm a PC. NBR Reviewer

    Reputations:
    679
    Messages:
    1,952
    Likes Received:
    0
    Trophy Points:
    55
    I played Half-Life 2 fine on my GeForce 4 MX desktop that's now almost 4 years old :p

    It's a shame to see the X2300 is only an X1350, or an X1375. Still, I can't wait for the GeForce Go 8600 and 8400 to come out. The desktop 8600 and 8300 are coming out in March. My guess is DX10-compatible notebook graphics cards will be available when Santa Rosa is launched.
     
  17. Lysander

    Lysander AFK, raid time.

    Reputations:
    1,553
    Messages:
    2,722
    Likes Received:
    1
    Trophy Points:
    55
    Well, at least they didn't advertise it as a DirectX 10 card; then we would have been really disappointed. Could be worse, but a shame nonetheless.
     
  18. Lil Mayz

    Lil Mayz Notebook Deity

    Reputations:
    599
    Messages:
    1,463
    Likes Received:
    0
    Trophy Points:
    55
    LOL Chaz... my nVidia GeForce 400 still races through CS:S, firing M4 rifles and so on at a resolution of 1600 x 1200... at 40 FPS.

    It looks like ATi did this just for publicity. Let's hope nVidia produce what millions of customers are looking for...
     
  19. Mr._Kubelwagen

    Mr._Kubelwagen More machine now than man

    Reputations:
    398
    Messages:
    744
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, I can't wait for the Go 8600 either. I'm starting a semester at university in the fall and would love to have an 86/8700 in my laptop. Assuming it's DX10, of course.

    This situation almost reminds me of how a bunch of people are thinking of suing nVidia for not releasing proper drivers for the 8800 for Vista. "The $650 coaster".
     
  20. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    My laptop also had a G4MX 420, and it's the part that became obsolete. I think the CPU, which was an AMD 1800+, would still have served me today, 4 1/2 years later.

    When it came out, ATI had the 9000 mobile GPU, and I should have spent more for that.

    Benchmarks even showed me the G4MX was faster, as I recall.

    This is going to be bad marketing in the long run, because when the x2600 comes out, I think consumers are going to look twice at it because it might be BS.
     
  21. FREN

    FREN Hi, I'm a PC. NBR Reviewer

    Reputations:
    679
    Messages:
    1,952
    Likes Received:
    0
    Trophy Points:
    55
    It will be fully DX10 compatible. The desktop GeForce 8600 (80 nm) is shipping in two versions:

    (1) The 8600GT (~$150), which is supposed to be a replacement for the 7600GT. It'll have 256 MB of GDDR3 and a core clock of 350 MHz.

    (2) The 8600 Ultra (~$180), which is supposed to be a replacement for the 7900GS. It'll have 512 MB of GDDR3 and a core clock of 500 MHz.

    I'm guessing we'll see something similar for the laptop cards. Maybe nVidia will ship the Go 8600GT as the Go 8400. Only time will tell :p
     
  22. XIII_GT

    XIII_GT Notebook Consultant

    Reputations:
    -3
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30
    This sucks... every company/game dev I loved has started using all these dirty tricks... Valve, ATI, Ubisoft, Bungie, and a whole bunch more...
     
  23. mujtaba

    mujtaba ZzzZzz Super Moderator

    Reputations:
    4,242
    Messages:
    3,088
    Likes Received:
    516
    Trophy Points:
    181
    I had a desktop version too: slow, horrible, and idiotic. On top of that, when I installed the network adapter, I started experiencing terrible lag. After a while I found out that on that mobo (A7N8X-X), one of the PCI slots and the AGP slot shared the same IRQ. I moved the network card to another slot and the IRQ problem was fixed...
     
  24. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    It's not true that nVidia hasn't been playing the branding trick recently. The prime example is the "new" GeForce 7100GS that was released a few months ago. This card was released as part of the Aero-compatible cheap-upgrade-for-the-masses push and is not related to the GeForce 7000 series (G7x core). The GeForce 7100GS uses an NV44 core and is basically a rebranded GeForce 6200.

    Admittedly, though, ATI is a lot worse at this. They started with the Radeon 7000 (Radeon VE), which lacked a hardware T&L engine. Then they had the Radeon 9000/9100/9200/9250, which were not DX9 but DX8.1. Then came the X300/X550/X600, which were just rebranded Radeon 9600s and so only supported DX9.0 and not DX9.0b.

    A unique case on the mobile front was the Mobility Radeon 9700, which was actually just an overclocked Mobility Radeon 9600 instead of being related to the desktop Radeon 9700. Although I suppose they could be forgiven, since the follow-on Mobility Radeon 9800 was actually based on the desktop X800, so it's a DX9.0b card pretending to be a DX9.0 card. Probably the only time a company has disadvantaged its own product unnecessarily.
     
  25. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    When word of the A8Jr and its x2300 came out, it was exciting news, since it seemed to mean that DX10 notebooks were coming sooner than expected. A few weeks back I began hearing that the x2300 wasn't DX10, and now this confirmation that it is indeed a DX9 GPU is a bit disappointing for those who are waiting for DX10 notebooks to show up.

    Thanks for sharing that bit of information.
     
  26. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Hm, didn't know about that one... That sucks too. (But in their defense, at least the differences in feature set between the 6 and 7 series were very minor.)
     
  27. Juz_Follow_ATI

    Juz_Follow_ATI ATI all the way

    Reputations:
    148
    Messages:
    985
    Likes Received:
    0
    Trophy Points:
    30
    To those of you who think the Mobility Radeon X2300 is DX10: well, it's not, so stick that into your heads. ;) The ATI X2300 HD might be DX10, but I can't confirm that.

    And for those of you who can't tell between DX10 and DX9 graphics cards, all you have to do is see whether it has unified shaders (DX10) or separate pixel and vertex shaders (DX9). Well, that's how I tell whether it's DX10 or 9.

    Good luck with buying your notebooks! :)
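
    For what it's worth, the unified-shader distinction can also be checked in code rather than from a spec sheet: on Vista, creating a *hardware* Direct3D 10 device only succeeds on a DX10-class (unified shader) GPU, so a DX9 part like the X2300 fails the call no matter which runtime is installed. A minimal C++ sketch, assuming the DirectX SDK headers are available (the file name and messages are just illustrative):

        // dx10check.cpp -- try to create a hardware D3D10 device.
        // Success implies a DX10-class (unified shader) GPU; failure on
        // Vista with current drivers implies a DX9-only part like the X2300.
        #include <d3d10.h>
        #include <cstdio>

        #pragma comment(lib, "d3d10.lib")

        int main() {
            ID3D10Device* device = NULL;
            HRESULT hr = D3D10CreateDevice(
                NULL,                        // default adapter
                D3D10_DRIVER_TYPE_HARDWARE,  // require real hardware support
                NULL,                        // no software rasterizer
                0,                           // no creation flags
                D3D10_SDK_VERSION,
                &device);

            if (SUCCEEDED(hr)) {
                std::printf("Hardware DX10 device created: unified shader GPU.\n");
                device->Release();
            } else {
                std::printf("No hardware DX10 device: DX9-class GPU or driver.\n");
            }
            return 0;
        }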
     
  28. otakuoverlord

    otakuoverlord Notebook Evangelist

    Reputations:
    23
    Messages:
    420
    Likes Received:
    0
    Trophy Points:
    30
  29. Notebook Solutions

    Notebook Solutions Company Representative NBR Reviewer

    Reputations:
    461
    Messages:
    1,849
    Likes Received:
    0
    Trophy Points:
    55
    Thanks for the post. I had already read about it, but I was never sure. It really sucks that the X2300 is not DX10!
     
  30. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, this sucks, but it probably wouldn't have been powerful enough to play DX10 games anyway.
     
  31. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Then again... I never get a GPU that is Xx300 or even Xx400... maybe the X2400 is DX10...
     
  32. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    I don't think I ever seriously expected it to be. I mean, how in the world would ATI release a mobile DX10 card before a desktop one?
     
  33. HavoK

    HavoK Registered User

    Reputations:
    706
    Messages:
    1,719
    Likes Received:
    0
    Trophy Points:
    55
    Haven't there already been a few long threads on this from last week?
     
  34. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, there have been; in fact, I started one after I found out about this.
     
  35. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    What exactly is the difference between the x2300 and the x1450? They appear to have the same specs based on what I can see from ATI's website.
     
  36. Juz_Follow_ATI

    Juz_Follow_ATI ATI all the way

    Reputations:
    148
    Messages:
    985
    Likes Received:
    0
    Trophy Points:
    30
    I just posted this because I wanted to tell people how to tell whether a graphics card is DX10 or DX9. You just have to see whether it has unified shaders (DX10) or the original vertex and pixel shaders (DX9). Also, ATI's DX10 X2300 might be called the X2300 HD or something.

    -Juz_Follow_ATI
     
  37. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    I'm wondering why AMD wouldn't just call it the x1350 or x1375. It's just gonna be confusing, since the x2300 is DX9 but the x2600 is DX10.
     
  38. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Because they wouldn't be able to sell a new x1000-series card this late in the game; they need to give it a new name to confuse people into buying it.
     
  39. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Ya, that must be the case. I saw an ATI roadmap that showed the x2300 was the first DX10 GPU they were going to release.
    So I think it's going to be even more confusing if they release an x2300 with DX10, but that must be what they are going to do. The desktop one is called the R610.
     
  40. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55
    Hah, judging from the system specs of the people who've posted so far, you guys should be happy about this news. It means the graphics card industry, both ATI and nVidia, is having trouble implementing their DX10 strategies in notebooks, and your current hardware will be king of the hill for a little while longer. My prediction is we won't see DX10 in notebooks until the Santa Rosa platform debuts.

    So unless you were planning on buying a notebook anytime soon, this is nothing to be concerned about.
     
  41. mujtaba

    mujtaba ZzzZzz Super Moderator

    Reputations:
    4,242
    Messages:
    3,088
    Likes Received:
    516
    Trophy Points:
    181
    I don't think so; budget gaming laptops will not feature Santa Rosa, and they come out quite early, for example the A8Js.
    ATI was more patient; nVidia tried, as they said, to "hit ATI while it's on the ground" and released the 8800 too early, and now they are facing driver problems. Apart from that, the ATI R600 is better designed, and the leaked benchmarks say so too.
     
  42. toxicGen

    toxicGen Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    [attached image: DirectX version readout showing DirectX 10]

    Does this mean the X2300 is DX10?

    I bought my A8Jr recently.
     
  43. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    I do not know. I do not think so.
     
  44. Daidojih

    Daidojih Notebook Consultant

    Reputations:
    24
    Messages:
    164
    Likes Received:
    0
    Trophy Points:
    30
    I think that just tells you the DX version... (I'm pretty sure) it's not relevant to whether or not the x2300 is DX10 compliant. (Someone with more experience can come yell at me now :p )
     
  45. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    That just tells you the highest DX version that is installed on your system. I'm assuming you have Vista, since Vista comes with DX10 preinstalled. However, that does not mean your card supports it. In most cases, the card will use DX9.0L (the legacy version), which also comes with Vista.
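
    To see what the hardware itself reports, rather than what the installed runtime says, you can query the Direct3D 9 device caps; the shader model numbers come from the driver, not from the DirectX version on the system. A minimal C++ sketch, assuming the DirectX 9 SDK headers (the file name and output wording are just illustrative):

        // capscheck.cpp -- report the shader models the GPU driver exposes.
        // dxdiag shows the installed runtime (DX10 on Vista); GetDeviceCaps
        // shows what the card actually supports.
        #include <d3d9.h>
        #include <cstdio>

        #pragma comment(lib, "d3d9.lib")

        int main() {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return 1;

            D3DCAPS9 caps;
            if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                             D3DDEVTYPE_HAL, &caps))) {
                std::printf("Vertex shader model: %lu.%lu\n",
                            D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                            D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
                std::printf("Pixel shader model:  %lu.%lu\n",
                            D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                            D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
            }
            d3d->Release();
            return 0;
        }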
     
  46. bombardior

    bombardior Notebook Consultant

    Reputations:
    6
    Messages:
    113
    Likes Received:
    0
    Trophy Points:
    30
    toxicGen, can you tell me where you bought it? I saw it on Newegg, but that one only has 1 GB of RAM.
     
  47. HT2

    HT2 Notebook Consultant

    Reputations:
    0
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    How can we test whether the x2300 has unified shaders or not?
    Can toxicGen run a simple DX10 game demo or benchmark?
     
  48. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,090
    Trophy Points:
    931
    That just tells you your DirectX version as posted; my X700 shows the same thing (see attachment).
     

    Attached Files:

    • DX10.jpg
      File size:
      61.7 KB
      Views:
      247
  49. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    You can look on ATI's spec page for it. It has pixel pipelines and vertex shaders like all the x1000 series cards.
     
  50. binnn4

    binnn4 Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    0
    Trophy Points:
    5
    ATI just released the x2300 mobile in the Asus A8Jr.
    What do you think about the new GPU?
     