Ok, so I've been reading a lot about the X2300, available in the A8Jr, and it has been touted by forum contributors as being the "first mobile DX10 video card". Unfortunately, according to ATI's own site, the X2300 is only DirectX 9.0c capable.
This leads me to wonder if ATI released the X2300 for the sole purpose of confusing hopefuls like us into buying this card for its supposed "DirectX 10 compatibility". Just because it's part of the X2000 line doesn't mean it's DX10 compliant.
I just hope nVidia or ATI release something soon, and stop trying to pass off their "new" technology as being state of the art.
-
Mr._Kubelwagen More machine now than man
-
This kinda sucks. While I didn't expect much from the X2300, I at least expected it to be a low-end DX10 card, but now it just seems like a rather useless remake of the X1300. They should just call it the X1350, but I guess the X2300 just sounds "more advanced".
-
usapatriot Notebook Nobel Laureate
What a load of crap then.
Crap. That sucks.
http://www.techzonept.com/showthread.php?t=130446
http://www.google.com/search?hl=en&q=x2300+not+dx10&btnG=Google+Search -
Wow, that sucks... so what is it? Something better than the X1300 and X1400?
-
Too bad, I was hoping to see a mobile DX10 card review
-
-
Gah, that's ridiculous... At least nVidia never pulls that trick. When they launch a new card, it's always easy to see which generation it belongs to and which feature set it has. 6xxx? SM3.0 support, all of them. 8xxx? Every single one is DX10.
Only ATI mixes the numbers around: "Ok, X2xxx will be DX10, *except* for the X2300"... It was the same with the 9200, as I remember. Or was it the 9000? And probably a few others... (Unless of course the entire X2xxx series will be DX9 only, which would be... disappointing) -
Well, there was the GeForce4 MX440. I bought it when I was a newbie and thought it was a good graphics card, but the rest is history.
-
...
Don't worry, both companies use the dirtiest tricks available when it comes to saving their necks or making loads of money. Few will forget what nVidia did with the GeForce FX series. Though ATI's record has been much cleaner, they sometimes pull this kind of stuff, especially when people are starving for DX10 laptops... -
So basically it's an X1300 1.5... although one slightly positive thing is that it'll increase your battery life. It's hard to sing such praise after all the misleading, though. -
Meaker@Sager Company Representative
-
True about the GF4MX. That was the same dirty trick. But at least that was ages ago. Since GF5, they haven't done that. But apparently the trend is still alive and kicking with ATI...
-
I think it sucks, and this could damage ATI's image. A lot of people will not be pleased by this, since they were expecting it to be DX10, especially with the introduction of DX10 so close...
-
Dunno, a lot of people won't care. And to those who do care, well, they already know it's a DX9 chip, so they won't buy it in the first place, so where's the loser?
Other than us poor sods who try to keep up with the GPU market, that is...
But I guess it's no worse than Intel confusing the market by naming their CPUs "Core Duo" when everyone is talking about dual-core... -
Charles P. Jefferies Lead Moderator Super Moderator
What a letdown this information is, thanks for posting it though. At least we find this out early.
I had the mobile version of the GeForce4 MX 440 in my old Athlon 64-based HP laptop (zv5000z); that was garbage. I have terrible memories of that whole experience. The fact that HP touted the zv5000z as a gaming notebook and only used a DX7 graphics card in it bothered me. Of course, I found that out a month or so after I bought it. I spent $1400 on that machine and couldn't even play the newer DX8/DX9 games (at the time, I was ignorant about graphics cards in general).
Enough of my rant . . . -
I played Half-Life 2 fine on my GeForce4 MX desktop that's now almost 4 years old.
It's a shame to see the X2300 is only an X1350, or an X1375. Still can't wait for the GeForce Go 8600 and 8400 to come out. The desktop 8600 and 8300 are coming out in March. My guess is DX10-compatible notebook graphics cards will be available when Santa Rosa launches. -
Well, at least they didn't advertise it as a DirectX 10 card; then we would have been really disappointed. Could be worse, but a shame nonetheless.
-
It looks like ATI did this just for publicity. Let's hope nVidia produces what millions of customers are looking for... -
Mr._Kubelwagen More machine now than man
Yeah, I can't wait for the Go 8600 either. I'm starting a semester at university in the fall and would love to have an 8600/8700 in my laptop. Assuming it's DX10, of course.
This situation almost reminds me of how a bunch of people are thinking of suing nVidia for not releasing proper drivers for the 8800 on Vista. "The $650 coaster". -
My laptop also had a GF4 MX 420, and it's the part that became obsolete. I think the CPU, an AMD 1800+, would still have served me today, 4 1/2 years later.
When it came out, ATI had the Mobility Radeon 9000, and I should have spent more for that.
Benchmarks at the time even showed the GF4 MX was faster, as I recall.
This is going to be bad marketing in the long run, because when the X2600 comes out I think consumers are going to look twice at it, suspecting it might be BS too. -
(1) The 8600GT (~$150), which is supposed to be a replacement for the 7600GT. It'll have 256 MB of GDDR3 and a core clock of 350 MHz.
(2) The 8600 Ultra (~$180), which is supposed to be a replacement for the 7900GS. It'll have 512 MB of GDDR3 and a core clock of 500 MHz.
I'm guessing we'll see something similar for the laptop cards. Maybe nVidia will ship the Go 8600GT as the Go 8400. Only time will tell. -
This sucks... every company and game dev I loved has started using all these dirty tricks... Valve, ATI, Ubisoft, Bungie, and a whole bunch more...
-
-
ltcommander_data Notebook Deity
Admittedly, though, ATI is a lot worse at this. They started with the Radeon 7000 (Radeon VE), which lacked a hardware T&L engine. Then they had the Radeon 9000/9100/9200/9250, which were not DX9 but DX8.1. Then came the X300/X550/X600, which were just rebranded Radeon 9600s and so only supported DX9.0 and not DX9.0b.
A unique case on the mobile front was the Mobility Radeon 9700, which was actually just an overclocked Mobility Radeon 9600 rather than being related to the desktop Radeon 9700. Although I suppose they could be forgiven, since the follow-on Mobility Radeon 9800 was actually based on the desktop X800, so it's a DX9.0b card pretending to be a DX9.0 card. Probably the only time a company has undersold its own product unnecessarily. -
When word of the A8Jr and its x2300 came out it was exciting news considering it seemed to mean that DX10 notebooks were coming out sooner than expected. A few weeks back I began hearing that the x2300 wasn't DX10--and now this confirmation that it is indeed a DX9 GPU is a bit disappointing for those who are waiting for DX10 notebooks to show up.
Thanks for sharing that bit of information. -
-
To those of you who think the Mobility Radeon X2300 is DX10: it's not, so get that into your heads.
The ATI X2300 HD might be DX10, but I can't confirm that.
And for those of you who can't tell the difference between DX10 and DX9 graphics cards, all you have to do is see whether it has unified shaders (DX10) or separate pixel and vertex shaders (DX9). Well, that's how I tell whether it's DX10 or 9.
Good luck with buying your notebooks!
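(If you'd rather check in code than read spec sheets, here's a rough sketch of the same idea; it's only an illustration, not something ATI or this thread provides, and it assumes a Vista machine with the DirectX SDK installed. Trying to create a hardware Direct3D 10 device only succeeds on a unified-shader, DX10-class chip, so a DX9-only part like the X2300 should fail the check.)

// Rough sketch (illustration only): probe for a DX10-class card by trying
// to create a hardware Direct3D 10 device. Link against d3d10.lib.
#include <windows.h>
#include <d3d10.h>
#include <stdio.h>

int main()
{
    ID3D10Device* device = NULL;
    HRESULT hr = D3D10CreateDevice(
        NULL,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // require real hardware support, not the reference rasterizer
        NULL,                        // no software rasterizer module
        0,                           // no creation flags
        D3D10_SDK_VERSION,
        &device);

    if (SUCCEEDED(hr)) {
        printf("Hardware D3D10 device created: unified-shader, DX10-class card.\n");
        device->Release();
    } else {
        printf("No hardware D3D10 device (hr=0x%08lx): DX9-class or older card.\n",
               (unsigned long)hr);
    }
    return 0;
}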
-
I believe this sums up my disappointment and anger at this news.
Sadness, and fury. -
Notebook Solutions Company Representative NBR Reviewer
Thanks for the post. I had already read about it, but I was never sure. It really sucks that the X2300 is not DX10!
-
Yeah, this sucks, but it probably wouldn't have been powerful enough to play DX10 games anyway.
-
Then again... I never get a GPU that's an Xx300 or even an Xx400... maybe the X2400 is DX10...
-
I don't think I ever seriously expected it to be. I mean, how in the world would ATI release a mobile DX10 card before a desktop one?
-
Haven't there already been a few long threads on this from last week?
-
-
What exactly is the difference between the x2300 and the x1450? They appear to have the same specs based on what I can see from ATI's website.
-
I just posted this because I wanted to tell people how to tell whether a graphics card is DX10 or DX9. You just have to see whether it has unified shaders (DX10) or the original separate vertex and pixel shaders (DX9). Also, ATI's DX10 X2300 might be called the X2300 HD or something.
-Juz_FOllow_ATI -
I'm wondering why AMD wouldn't just call it the X1350 or X1375. It's just gonna be confusing since the X2300 is DX9 but the X2600 is DX10.
-
Because they wouldn't be able to sell a new X1000 series card this late in the game; they need to give it a new name to confuse people into buying it.
-
So I think it's going to be even more confusing if they release an X2300 with DX10, but that must be what they are going to do. The desktop one is called the R610. -
Hah, judging from the system specs of the people who've posted so far, you guys should be happy about this news. It means the graphics card industry (both ATI and nVidia) is having trouble implementing its DX10 strategy in notebooks, and your current hardware will be king of the hill for a little while longer. My prediction is we won't see DX10 in notebooks until the Santa Rosa platform debuts.
So unless you were planning on buying a notebook anytime soon, this is nothing to be concerned about. -
I don't think so. Budget gaming laptops will not feature Santa Rosa, and they come out quite early; the A8Js, for example.
ATI was more patient; nVidia tried, as they said, to "hit ATI while it's on the ground" and released the 8800 too early, and now they are facing driver problems. That's apart from the fact that ATI's R600 is better designed, and the leaked benchmarks suggest this too. -
Does this mean X2300 is DX10?
I bought my A8Jr recently. -
usapatriot Notebook Nobel Laureate
I do not know. I do not think so.
-
I think that just tells you the DX version... (I'm pretty sure) it's not relevant to whether or not the X2300 is DX10 compliant. (Someone with more experience can come yell at me now.)
-
That just tells you the highest DX version that is installed on your system. I'm assuming you have Vista, since Vista comes with DX10 preinstalled. However, that does not mean your card supports it. In most cases, the card will use DX9.0L (the legacy version), which also comes with Vista.
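(If you want to see what the card itself reports, as opposed to which runtime Vista installed, here's a small sketch using the Direct3D 9 device caps. It assumes the DirectX SDK and is only an illustration, not an official tool; a DX9-class part such as the X2300 should report vertex and pixel shader 3.0 here even though dxdiag says "DirectX 10".)

// Rough sketch (illustration only): print the shader model the installed
// card's driver exposes through Direct3D 9 caps. Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3D 9 runtime not available.\n");
        return 1;
    }

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // The caps report the highest shader versions the hardware driver supports.
        printf("Vertex shader %lu.%lu, pixel shader %lu.%lu\n",
               D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
               D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
               D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
               D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    } else {
        printf("Could not query device caps.\n");
    }

    d3d->Release();
    return 0;
}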
-
toxicgen, can you tell me where you bought it? I saw it on Newegg, but that one only has 1 GB of RAM.
-
Can toxicGen run a simple DX10 game demo or benchmark? -
Charles P. Jefferies Lead Moderator Super Moderator
That just tells you your DirectX version as posted; my X700 shows the same thing (see attachment).
-