I was looking at Dell PCs and I came across the Intel graphics card, the GMA 3000. I know this is nowhere near ATI or Nvidia, but how does this one do?
I saw Intel's demo of it and they showed gameplay of some game and it looked good...
Or will this card lag when you play Minesweeper?
This is the PC that had this card:
http://www.dell.com/content/products/features.aspx/cto_xpsdt_210?c=us&cs=19&l=en&s=dhs
-
I will never trust Intel when it comes to GPUs... I've been burned by them twice already.
My guess is that it is a modified 950 that will support Vista and not use any software emulation of features... regardless, I doubt it will be stronger than an ATI integrated 200M -
Dustin Sklavos Notebook Deity NBR Reviewer
From what I've read, Intel is having trouble assembling a good driver for it, and as a result, its current performance is actually sub-GMA 950.
Yuck.
Supposedly it will be much improved when they do release a good driver for it, and I remain hopeful it will at least be a solid IGP. After all, Intel finally learned how to do hardware T&L! -
I'm not sure, but I don't think the GMA 950 is worse than the Xpress 200M; they're probably about equal.
             3DMark03   3DMark05   3DMark06
GMA 950        1300       450        170
Xpress 200M    1100       450        140
As for the GMA 3000: there are two models (chipsets), the GMA X3000 and the budget version, the GMA 3000.
G965 Express - GMA X3000
DirectX 9c, DirectX 10 and OpenGL 1.5
Hardware vertex shader model 3.0
Hardware pixel shader model 3.0
32-bit and 16-bit full precision floating point operations
Up to 8 multiple render targets
Occlusion Query
128-bit floating point texture format
Bilinear, trilinear and anisotropic mipmap filtering
Shadow maps and double sided stencils
Q965 Express - GMA 3000
DirectX 9c and OpenGL 1.4 plus
Software vertex shader model 2.0/3.0
Hardware pixel shader model 2.0
32-bit and 16-bit fixed point operations
Up to 8 multiple render targets
Occlusion query
128-bit floating point texture format
Bilinear, trilinear and anisotropic mipmap filtering
Shadow maps and double sided stencils
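Since the two lists above differ mainly in what's done in hardware versus in the driver, here's a minimal sketch (my own illustration, not from this thread) of how a Windows app could check what a given chip's driver actually exposes, using the standard Direct3D 9 caps query. On something like the GMA 3000, which does vertex shading in software, the hardware vertex shader version reported here would be 0.0, so games fall back to software vertex processing:

```cpp
// Sketch: query D3D9 device caps to see hardware shader support.
// Build on Windows with the DirectX SDK; link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // Shader versions are packed DWORDs; the SDK macros extract major/minor.
    // A part with software-only VS (like the GMA 3000) reports VS 0.0 here.
    printf("Hardware VS %u.%u, PS %u.%u\n",
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

    // Hardware T&L shows up as a separate device cap.
    printf("Hardware T&L: %s\n",
           (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```

Note that the caps reflect the installed driver, not just the silicon, which is exactly why early driver releases can make a chip look weaker than its hardware actually is.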
X3000 seems nice and it has Hardware T&L
It's time for Intel to start making good GPUs... -
...Intel cannot make good GPUs. They merely make ones that will display the desktop and play the occasional chess game.
I'm sorry, but I'll never forgive Intel for making the GMA 900... I'm glad I'm getting rid of that notebook -
To be fair, Intel did just hire a raft of engineers from 3DLabs very recently, basically their whole chip design department, trying to keep the group together. One of them is a friend of a friend.
So they're serious about making graphics. Whether they manage to not piss off the engineers with their corporate crap remains to be seen.
-
Charles P. Jefferies Lead Moderator Super Moderator
I don't have high hopes for the GMA X3000. Anything with shared-only memory isn't suitable for 3D. -
-
Charles P. Jefferies Lead Moderator Super Moderator
If there's a potential that it could be done well on PC, cool, but it doesn't exist yet. -
The 3000 is a small stepping stone above the 950. It finally includes hardware T&L, but so did the first GeForce. It also brings DX9 capabilities, but nowhere near the performance needed to run those games.
-
Hmm... Integrated cards have a long way to go, but there is a future in them...
The X200M is better than the GMA 950 in gaming performance, but the difference is not significant.
Chazzz
As for the GMA X3000, Intel just makes little steps (Extreme <<< 900 < 950 <<<< 3000), but the direction is right. And also, they are not guilty that people buy the GMA 900, want to play the newest games, and then ask how to upgrade it.
I don't have any hopes for the GMA; that's why I'll be happy to see anything better than the 950.
So, if you don't have Great Expectations, you can't be disappointed.
Actually, I'd also be happy to see a high-high-end dedicated card with max battery life and no heat at all, but...
By the way,
How is the Xpress 1250? Do you know something about it? -
-
The Xbox 360 doesn't use shared memory in the sense that PCs do. Rather, on the 360, the GPU doubles as the memory controller. So it'd be more accurate to say that the GPU has dedicated memory, and the CPU uses shared memory, leeching off the GPU.
-
Dustin Sklavos Notebook Deity NBR Reviewer
The difference is night and day. The GMA 950 can't deliver a stable framerate to save its life. The X200M can. I've seen the X200 run beautifully, but I've never seen the GMA 950 game very well.
Half of it's compatibility, and half of it's just superior hardware. But the difference is VERY significant. -
As for the GMA 950, it does well in 3DMark, but it gets beaten pretty soundly by the Xpress 200M and GeForce 6100 when it comes to actual gaming framerates. -
OK, agreed!
I haven't tried the X200M or GMA 950; actually, I wouldn't use integrated GPUs for games, but...
I said that the difference between the X200M and GMA 950 is not significant because I've been asked, so I've read several reviews, and the gaming performance of the X200M was slightly better than the GMA 950 (0-10%, depending on the game). 3DMark is not so reliable; I've just mentioned the results. However, I agree. -
There is a much bigger difference than 10% between the GMA 950 and the Radeon 200. And bear in mind this is the shared version of the Radeon 200; the dedicated memory version will be a lot faster again.
Link to review -
Dreamer and brain_stew,
NEVER argue with the chaz
Then you just get owned.
Anyway, you're both incorrect, sorry.
The x200m is infinitely > GMA 950.
And even if the benchmarks show only a 10% difference, the gaming performance gap is much greater, since benchmarks are by no means accurate gaming performance tools.
Also, the x200m has several variations: some have no dedicated memory, some have 32MB dedicated memory and the rest shared, and some have 128MB dedicated, each being more powerful than the former. -
OK, I said I admit the x200 is better, so...
but!
Here is a review, a bit different from what I read months ago, but who cares. Both cards are not for gaming, just no way. And if you think that the difference between two such bad results can be so dramatic, OK, I agree.
http://www.anandtech.com/video/showdoc.aspx?i=2427
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2269&p=19
So to sum up:
some games on the Xpress 200 are barely playable
the same games on the GMA 950 are not so playable
so remarkable!
Enjoy playing games on the X200M. -
Also, the Xbox doesn't run a huge OS like XP, which is a huge hog on resources. If you remember the days of Windows 3.1/DOS, an older system could run them at good speeds. I'd imagine if M$ still supported Windows 3.1, running it on current hardware everything would fly. It would take about 2 secs to load 3.1...
I also remember playing older games such as Championship Manager 97 booted into DOS mode, and it was about 2x faster than when it was played in Windows 95. -
Don't get me wrong, I'm not underestimating the difference between the setup in a PC and a console; all I want to do is make clear that shared memory can offer good performance and does have potential. AMD/ATI's merger and focus on integrating a GPU into the CPU die, or even creating socketed GPUs without their own memory, shows that this is an area that is going to become ever more important and successful. -
ltcommander_data Notebook Deity
The original Dell link contains the GMA 3000, which is used in the 946GZ/Q965/Q963 chipsets. The GMA 3000 is basically identical to the GMA 950; the only difference is that it is manufactured on a 90nm process instead of 130nm, which means it is clocked at 667MHz rather than 400MHz. It still doesn't have hardware T&L or hardware VS. The GMA 3000 is a small step up from the GMA 950.
The GMA X3000 is completely different though, which is why their similar naming is confusing. Its only similarity is that it is clocked at 667MHz like the GMA 3000. However, the GMA X3000 uses unified shaders (likely 8, with 4 TMUs), which will make Intel beat ATI (the R600 looks delayed to next year) in introducing that feature to PCs. The GMA X3000 also has complete hardware support for T&L, PS3.0, and VS3.0. The architecture is also supposed to be capable of HDR with AA, like ATI's. In theory, with 8 unified shaders clocked at 667MHz, its performance should easily surpass any previous IGP and be as fast if not faster than ATI's Xpress 1250 (the new RS600 with the X700-based core; no word on clock speed, but it looks like a 4PS/2VS configuration). Compared to discrete graphics solutions, performance is probably at X600/X1300 HM level.
In terms of shared memory, Intel's tile-based architecture with its larger internal buffers means it needs less bandwidth than other architectures like the Xpress 1250, which were originally designed to have the benefits of discrete memory. The GMA X3000 also looks to have some type of multithreading to work around stalled threads, which is probably similar to ATI's "Ultra-threading Dispatch Processor". The Fast Memory Access feature in the G965 chipset should also help in optimizing bandwidth.
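To put rough numbers on the shared-memory question (my own back-of-envelope figures, not from the post above): a G965-class board with dual-channel DDR2-800 tops out around 12.8 GB/s, and the IGP has to split that pool with the CPU, which is why bandwidth-saving tricks like tiling and Fast Memory Access matter so much:

```cpp
// Back-of-envelope peak bandwidth for a shared-memory IGP.
// Assumed config (illustrative only): dual-channel DDR2-800.
#include <cstdio>

int main() {
    const double transfers_per_sec = 800e6; // DDR2-800: 800 MT/s per channel
    const double bytes_per_xfer    = 8.0;   // 64-bit channel width
    const double channels          = 2.0;   // dual channel

    double peak_gb_s = transfers_per_sec * bytes_per_xfer * channels / 1e9;
    printf("Peak system memory bandwidth: %.1f GB/s\n", peak_gb_s); // ~12.8

    // A discrete card with dedicated memory keeps all of its bandwidth to
    // itself; the IGP shares this pool with the CPU, textures and all.
    return 0;
}
```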
The problem, as others have mentioned, is that, as always, Intel is slow on the driver uptake. The tests that show the GMA X3000 being slower than the GMA 950 were using drivers that had no hardware T&L, VS, or PS support. With the GMA 950 having hardware PS and well-developed software emulation for T&L and VS, it obviously had an advantage over the GMA X3000.
The driver schedule is as follows:
http://www.hkepc.com/bbs/itnews.php?tid=627088&starttime=0&endtime=0
14.21: Advanced Pixel De-interlacing, Proc Amp, HDMI, DVMT 256MB support (early benchmarks)
14.24: General performance and code optimization (current driver)
14.25: Hardware Pixel Shader 3.0 support
14.26: Hardware Vertex Shader 3.0 and Hardware T&L support (release driver)
The GMA X3000 definitely has great potential, but a lot of it is up to how much effort Intel decides to put into successive driver revisions/optimizations. Hopefully, the added 3DLabs personnel and pressure from AMD/ATI will force them to put more effort into their IGPs. Now, for desktops, X600/X1300 HM-level performance isn't remarkable, although it is an IGP; the real potential for the GMA X3000 is in its Santa Rosa mobile version. If it can offer MR X600/MR X1300-level performance with the low power levels of IGPs, it could really be a big hit. And since the GMA X3000 is the lowest common denominator for the Centrino platform, it could potentially mean that every laptop has at least MR X1300-level performance, which is not bad at all. Especially if Intel follows through on their hints that DX10 support will be activated on their unified shaders once Vista is launched. -
The specs of the X3000 do look promising (for an integrated device). As stated, if Intel can get some decent drivers together for it, it could be a respectable integrated option.
-
Dreamer, are you blind?
LOOK AT THE BENCHMARKS FROM YOUR LINK!!!
Sure, in some games, the GMA and x200 are about the same level, but in some (e.g. Far Cry) the difference is huge. The difference between playing with a GMA at 10fps and an x200 at 200fps is what "night and day" means.
The x200 almost doubles the GMA's fps in Half-Life. 20fps from the GMA is a choppy game, but the x200's 38fps is VERY playable.
The x200 beats out the GMA on that benchmark in every game except Doom 3, and probably a lot of other games not in the benchmark.
And though some may consider a 6fps difference small, it can make or break how enjoyable a game is.
EDIT: Oh yea, there is a thread around here about how some people are ENJOYING their gaming experience with the x200. Too bad they didn't put F.E.A.R. in as a test; I'm sure the GMA would've choked on that one. Please don't even put the GMA on the same level as the x200. -
So, read again carefully, think again, and you could find an explanation,
otherwise ask me if I'm blind again.
I'm not gonna talk about that any more. Think whatever you want.
Find in the dictionary the definition of the word "playable".
Actually, you could find other useful things there too.
Hint: When you double a very low score, the result is again a low score.
Someone said something about benchmarks...
After searching the dictionary you can (try to) play Far Cry on the Xpress 200M; if you don't have one, I'll send you one as a present.
Games are usually made to be more than just playable.
This conversation is pointless if you don't see. -
Charles P. Jefferies Lead Moderator Super Moderator
This is getting a bit too heated for my tastes, so if it could be toned down/reverted to a normal conversation, that would be great.
Everyone has their own definition of 'playable'. -
so what does 1 typo and
I'm guessing by your definition that a game is unplayable unless you play it at 1600x1200 with HDR and 16x AA and AF.
While there are people like you, who think that graphics are what a game is all about, equally many will find that it is a game's plot or gameplay that makes it a great game.
Anyway, I think the GMA will always be a bottom-barrel GPU, and even though the X3000 will be DX10 compliant, it will amount to nothing, as it probably won't be able to play any games that do require DX10. I do think that once a fleshed-out driver is out for the card, it will do much better than the current GMA 950 and the x200 in DX9 games. -
I'm not talking about grammar at all. Generally speaking, there are more interesting things in the dictionary you should try.
-
OK, let's calm down a bit. No need to bicker over this.
I have used both a GMA 900 and a fully-shared ATI X200M. The difference is indeed night-and-day.
First, a brief explanation of the differences between the 900 and the 950: the GMA 950 is fundamentally very similar to the 900, with only a few slight core optimizations and a faster clock. The performance difference you see in benchmarks comes partially from the fact that the GMA 900 is benchmarked on Pentium Ms running DDR2-533 RAM, while the GMA 950 is benched on Core Duos running DDR2-667 RAM. Well, no kidding integrated graphics perform noticeably better on noticeably faster systems.
The 3000 is a more significant step above the 950, with much better compatibility and a redesigned core, but the X3000 is the real winner, because like the X200M, Go 6150, X1250M, etc., it has hardware T&L, a vital feature for getting acceptable performance from modern games.
The X200M (the fully-shared one, not the dedicated one, which would work even better), as I mentioned, has hardware T&L already, as well as SM2.0 support. It can run F.E.A.R. acceptably well, at 640x480 with minimum detail and DX8 shaders on, but it still looks pretty incredible. The GMA 900 chokes even with pixel doubling turned on. Halo: 1280x768, SM1.4, medium-high graphics detail, fully playable on the X200M; on the GMA 900, the resolution has to go down to 640x480 at the same settings, or the details have to drop substantially. UT2004? 1280x768, medium settings on the X200M, but it has corruption issues even at 640x480 on the GMA 900. These comparisons are a little unfair: the GMA 900 machine had a Celeron M 1.6GHz, whereas the X200M had a Turion 64 2.0GHz, but they were otherwise equal.
The fact is that the X200M has better compatibility and better performance than the GMA 950, but to answer the original question: the GMA 3000 will not be noticeably better than the 950, though the X3000 should offer decent performance in a wider range of games due to its hardware T&L. -
ltcommander_data Notebook Deity
Now, the GMA 3000 on the Q965/Q963 is weird. The fact that it's also dubbed the GMA 3000 makes it seem related to the 946GZ and 945G, but that doesn't make much sense. The Q965/Q963 share nearly all other features with the G965, such as the new memory controller and Fast Memory Access. If the Q965/Q963 actually use the same IGP as the 946GZ/945G, then that means Intel developed two chipsets separately, the Q965/Q963 and the G965, with separate IGPs. That seems like an inefficient use of resources and adds complexity. The other explanation is that the GMA 3000 in the Q965/Q963 is actually the same IGP as the GMA X3000 in the G965, only with the unified shaders locked in PS mode only. That also doesn't make sense, since if the hardware is there already, why not use it? Granted, it could be to use defective parts, but then you might as well make the Q963 the defect chipset and give the Q965 the full GMA X3000 of the G965. It's all very weird.
-
It's a cut-down X3000: no hardware T&L and only up to Shader Model 2. But it does have HD support.
Info here. -
ltcommander_data Notebook Deity
That article doesn't really say that the GMA 3000 is a cut-down GMA X3000, though. It just says that the Q963 is a cut-down Q965, but they both already use the GMA 3000. It just seems that too many features would have to be cut from the GMA X3000 to make the GMA 3000. Things like iDCT support for MPEG2 decode and VC-1 hardware decode are not related to games and are useful to corporate platforms too, for presentations, yet they are not present in the GMA 3000. Similarly, OpenGL support is also useful outside of games, yet that was cut back too. I can't wait for someone to do an in-depth review and architectural analysis. -
Thanks for the info, lowlymarine. I think I know enough about GPUs, but thanks.
As for the X200M, I admitted that several posts ago, but my strong position is that even the best trash is still trash. Grading like trash, better trash, best trash is a bit stupid for me.
Here is my position about GPUs: http://forum.notebookreview.com/showthread.php?t=78144&page=2&pp=10 -
Dustin Sklavos Notebook Deity NBR Reviewer
Make no mistake, I think there's a good argument for eye candy as entertainment unto itself, otherwise I wouldn't be running 7600s in both my machines. But those 7600s are just making already excellent games (Far Cry, Quake 4) that much more enjoyable. But Quake 4 was still a heck of a lot of fun at 640x480 on my old X600.
What you have to understand is that these parts aren't designed to be hardcore gaming parts. They're designed to be inexpensive, battery efficient graphics adaptors capable of light gaming. The X200M and its more robust nVidia cousin the Go 6100 are impressive bits of engineering given their intended purpose. These are incredible inroads in this market; when's the last time an IGP existed that had this kind of power compared to the games on the market?
Any improvement in the IGP market is only good for everyone, because by raising the lowest common denominator, you raise the average level of hardware that developers have to shoot for. No, they aren't for the hardcore gamer, but you need to stop thinking of yourself as the sole demographic. Some people may just want to play a quick game of World of Warcraft on the road. And God forbid, Unreal Tournament 1999 is still a great game.
And to be entirely frank, I don't think someone as inarticulate and aggressive as you've been has any business passing judgment on anyone else's intelligence. I'll be less diplomatic than the mods here: grow up or get lost. These forums are as strong as they are because the people that post here are intelligent, informed, and most of all, mature. If you're not going to contribute something useful or ask a useful question, at least politely remain silent. -
Charles P. Jefferies Lead Moderator Super Moderator
dreamer, hmmmmm, I don't know how I got involved in your argument, but I do not pick favorites and I'm generally a forgiving person. If you have a problem with another forum member, then private message me; we do not need that sort of stuff on the forums.
If this thread goes off on another tangent like that again, I'll lock it. -
It's amazing. I entered notebookreview.com barely understanding RAM, processor speeds, and other such jargon. I have learned so much just from reading conversations on the forum, and you guys have helped me so much with figuring out what brand of notebook I want to buy (Asus), but you guys still have debates sometimes on another level that I just can't understand.
*but I'm working on it!