The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    intel done it right this time?

    Discussion in 'Gaming (Software and Graphics Cards)' started by m4rc, Sep 13, 2006.

  1. m4rc

    m4rc Notebook Evangelist

    Reputations:
    109
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
  2. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    i will never trust intel when it comes to GPUs...i've been burned by them twice already

    my guess is that it is a modified 950 that will support Vista and not use any software emulation of features...regardless I doubt it will be stronger than an ATI Integrated 200m
     
  3. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    The problem is that actual benchmarks of it are scarce at best.

    From what I've read, Intel is having trouble assembling a good driver for it, and as a result, its current performance is actually sub-GMA 950.

    Yuck.

    Supposedly it will be much improved when they do release a good driver for it, and I remain hopeful it will at least be a solid IGP. After all, Intel finally learned how to do hardware T&L!
     
  4. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    I'm not sure, but I don't think that the GMA 950 is worse than the Xpress 200M;
    they're probably just about equal.

    3DMark          '03     '05     '06
    GMA 950        1300     450     170
    Xpress 200M    1100     450     140


    As for gma 3000

    There are two models (chipsets): the GMA X3000 and the budget version, the GMA 3000.

    G965 Express - GMA X3000

    DirectX 9c, DirectX 10 and OpenGL 1.5
    Hardware vertex shader model 3.0
    Hardware pixel shader model 3.0
    32-bit and 16-bit full precision floating point operations
    Up to 8 multiple render targets
    Occlusion Query
    128-bit floating point texture format
    Bilinear, trilinear and anisotropic mipmap filtering
    Shadow maps and double sided stencils

    Q965 Express - GMA 3000

    DirectX 9c and OpenGL 1.4 plus
    Software vertex shader model 2.0/3.0
    Hardware pixel shader model 2.0
    32-bit and 16-bit fixed point operations
    Up to 8 multiple render targets
    Occlusion query
    128-bit floating point texture format
    Bilinear, trilinear and anisotropic mipmap filtering
    Shadow maps and double sided stencils

    X3000 seems nice and it has Hardware T&L

    It's time for Intel to start making good GPUs...
     
  5. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    ...intel cannot make good GPUs. they merely make ones that will display the desktop and play the occasional chess game.

    i'm sorry, but i'll never forgive intel for making the GMA900...i'm glad i'm getting rid of that notebook
     
  6. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    To be fair, Intel did just hire a raft of engineers from 3DLabs very recently. Basically their whole chip design department, trying to keep the group together. One of them is a friend of a friend ;) So they're serious about making graphics. Whether they manage to not piss off the engineers with their corporate crap remains to be seen.
     
  7. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    The GMA950 can hardly be called a proper GPU. It does not even have hardware T&L. It is far behind the X200M - the X200M is actually a cut down X300 core.

    I don't have high hopes for the GMA X3000. Anything with shared-only memory isn't suitable for 3D.
     
  8. brain_stew

    brain_stew Notebook Consultant

    Reputations:
    5
    Messages:
    275
    Likes Received:
    0
    Trophy Points:
    30
    I wouldn't go that far. If done properly, a graphics chip using shared memory can work well; if it couldn't, then the Xbox 360 wouldn't be the beast it is. Sure, the situation is much different, but it does illustrate the fact that memory shared between the CPU and GPU can produce cutting-edge graphics if done properly.
     
  9. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    We're talking PCs here, though. If you can name any shared memory chip that games well, I will be interested to hear about it. Simple fact is that there isn't one.
    If there's a potential that it could be done well on PC, cool, but it doesn't exist yet.
     
  10. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    the 3000 is a small stepping stone above the 950. It finally includes hardware T&L, but so did the first GeForce. It also brings DX9 capabilities, but nowhere near the performance needed to run those games.
     
  11. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    Hmm... Integrated cards have a long way to go, but there is a future in them...

    The X200M is better than the GMA 950 in gaming performance, but the difference is not significant.

    Chazzz
    As for the GMA X3000, Intel just makes little steps (Extreme <<< 900 < 950 <<<< 3000), but the direction is right. And it's not their fault that people buy a GMA 900, want to play the newest games, and then ask how to upgrade it.

    I don't have any hopes for the GMA; that's why I'll be happy to see anything better than the 950.
    So, if you don't have Great Expectations you can't be disappointed.

    Actually I'd also be happy to see a high-high-end dedicated card with max battery life and no heat at all, but...

    By the way

    How is the Xpress 1250? Do you know anything about it?
     
  12. nix

    nix Notebook Consultant

    Reputations:
    88
    Messages:
    298
    Likes Received:
    0
    Trophy Points:
    30
    It is different with PCs because games don't directly communicate with the graphics card as they do on the Xbox. The GMA 3000 is just Intel's attempt to create a Vista-capable IGP. I wouldn't recommend using Intel graphics either; they are just too far behind nVidia and ATI. Even my older nVidia GeForce 6100 IGP will handily outperform the new Intel GMA 3000.
     
  13. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    XBox 360 doesn't use shared memory in the sense that PCs do. Rather, on the 360, the GPU doubles as the memory controller. So it'd be more accurate to say that the GPU has dedicated memory, and the CPU uses shared memory, leeching off the GPU.
     
  14. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    Oooohhh yes it is. I've used the X200M. And I've used the GMA 950.

    The difference is night and day. The GMA 950 can't deliver a stable framerate to save its life. The X200M can. I've seen the X200 run beautifully, but I've never seen the GMA 950 game very well.

    Half of it is compatibility, and half of it is just superior hardware. But the difference is VERY significant.
     
  15. particleman

    particleman Notebook Enthusiast

    Reputations:
    0
    Messages:
    27
    Likes Received:
    0
    Trophy Points:
    5
    What makes the Xbox 360 the beast it is, is embedded DRAM. The Xbox 360 has RAM right on the die, and this eDRAM is faster than anything even on high-end discrete graphics cards. This is how the 360 is able to get FSAA at zero performance hit. The drawback of embedded DRAM is that it costs a ton to make, and it only works well in fixed circumstances. For example, the Xbox 360 only has 10MB of embedded DRAM; this is just enough for the 720p framebuffer with FSAA. On PCs, where higher resolutions are a must, eDRAM isn't practical yet, as it just costs too much if you wanted it to be useful for higher resolutions.
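A quick back-of-envelope check of why 10MB of eDRAM is such a tight fit for 720p. This is only a sketch, assuming 4 bytes of color plus 4 bytes of depth/stencil per pixel; the real Xenos buffer layout may differ.

```python
# Does a 720p framebuffer fit in 10 MB of eDRAM?
# Assumption: 32-bit color + 32-bit depth/stencil = 8 bytes per pixel.
width, height = 1280, 720
bytes_per_pixel = 4 + 4

framebuffer = width * height * bytes_per_pixel
print(framebuffer)                 # 7372800 bytes
print(framebuffer / 2**20)         # ~7.03 MiB -> fits in 10 MB without AA
print(framebuffer * 2 / 2**20)     # ~14.06 MiB -> 2x multisampling overflows 10 MB
```

Under these assumptions a plain 720p buffer fits comfortably, while multisampled rendering overflows the eDRAM, which is exactly why the fit is described as "just enough" and why higher PC resolutions make eDRAM impractical.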

    As for the GMA 950 it does well in 3Dmark but it gets beaten pretty soundly by the xpress 200M and Geforce 6100 when it comes to actual gaming framerates.
     
  16. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    OK Agree!

    I haven't tried the X200M or the GMA 950; actually, I wouldn't use integrated GPUs for games, but...

    I said that the difference between the X200M and the GMA 950 is not significant because I was asked, so I read several reviews, and the gaming performance of the X200M was slightly better than the GMA 950 (0-10%, depending on the game). 3DMark is not so reliable; I just mentioned the results. However, I agree.
     
  17. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    There is a much bigger difference than 10% between the GMA 950 and the Radeon 200. And bear in mind this is the shared-memory version of the Radeon 200; the dedicated-memory version will be a lot faster again.


    Link to review
     
  18. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    Dreamer and brain_stew,

    NEVER argue with the chaz

    then you just get owned

    anyway, you're both incorrect, sorry.

    x200m is infinitely > GMA950

    and even if the benchmarks show only a 10% difference, the gaming performance gap is much greater, since benchmarks are by no means accurate gaming performance tools.

    also, the x200m has several variations: some have no dedicated memory, some have 32MB dedicated memory with the rest shared, and some have 128MB dedicated memory, each being more powerful than the former.
     
  19. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    Ok I said I admit x200 is better, so ... :cool:

    but!

    Sometimes night and day can be equally bad (approximately). :confused:

    yes I do know that and I know that there are dedicated cards too. Compare them to GMA !? :)

    Ooo GOD forbid! ... I'm really sorry. :rolleyes:

    Here is a review, a bit different from what I read months ago, but who cares. Both cards are not for gaming, just no way. And if you think that the difference between two such bad results can be so dramatic, OK, I agree.

    http://www.anandtech.com/video/showdoc.aspx?i=2427

    http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2269&p=19

    So to sum up

    some games on Xpress 200 are barely playable
    the same on Gma 950 are not so playable

    so remarkable!

    enjoy playing games on X200M.
     
  20. ez2remember

    ez2remember Notebook Evangelist

    Reputations:
    28
    Messages:
    494
    Likes Received:
    0
    Trophy Points:
    30
    Also, the Xbox doesn't run a huge OS like XP, which is a huge hog on resources. If you remember the days of Windows 3.1/DOS, an older system could run them at good speeds. I'd imagine if M$ still supported Windows 3.1 on current hardware, everything would fly. It would take about 2 secs to load 3.1...

    I also remember playing older games such as Championship Manager 97 booted into DOS mode, and it was about 2x faster than when it was played in Windows 95.
     
  21. brain_stew

    brain_stew Notebook Consultant

    Reputations:
    5
    Messages:
    275
    Likes Received:
    0
    Trophy Points:
    30
    The original Xbox didn't have eDRAM, yet it too had a shared memory architecture and offered great performance.

    Don't get me wrong, I'm not underestimating the difference between the setup in a PC and a console; all I want to do is make clear that shared memory can offer good performance and does have potential. AMD/ATI's merger and focus on integrating a GPU into the CPU die, or even creating socketed GPUs without their own memory, shows that this is an area that is going to become ever more important and successful.
     
  22. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    The original Dell link contains the GMA 3000, which is used in the 946GZ/Q965/Q963 chipsets. The GMA 3000 is basically identical to the GMA 950; the only difference is that it is manufactured on a 90nm process instead of 130nm, which means that it is clocked at 667MHz rather than 400MHz. It still doesn't have hardware T&L or hardware VS. The GMA 3000 is a small step up from the GMA 950.

    The GMA X3000 is completely different, though, which is why their similar naming is confusing. Its only similarity is that it is clocked at 667MHz like the GMA 3000. However, the GMA X3000 uses unified shaders (likely 8, with 4 TMUs), which will make Intel beat ATI (the R600 looks delayed to next year) in introducing that feature to PCs. The GMA X3000 also has complete hardware support for T&L, PS3.0, and VS3.0. The architecture is also supposed to be capable of HDR with AA, like ATI's. In theory, with 8 unified shaders clocked at 667MHz, its performance should easily surpass any previous IGP and be as fast as, if not faster than, ATI's Xpress 1250 (the new RS600 with the X700-based core; no word on clock speed, but it looks like a 4PS/2VS configuration). Performance relative to discrete graphics solutions is probably X600/X1300HM level.
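Taking the rumored figures above at face value (8 unified shaders and 4 TMUs at 667MHz were unconfirmed at the time, so treat these as assumptions), the theoretical peak rates work out roughly as follows:

```python
# Rough theoretical peak rates from the rumored GMA X3000 specs.
# Assumptions: 8 unified shaders, 4 TMUs, 667 MHz core clock,
# one operation per unit per clock - none of this was confirmed by Intel.
clock_mhz = 667
unified_shaders = 8
tmus = 4

shader_ops = unified_shaders * clock_mhz   # millions of shader ops per second
texel_fill = tmus * clock_mhz              # peak fill rate in Mtexels/s
print(shader_ops)   # 5336
print(texel_fill)   # 2668
```

Even at one op per clock per unit, those raw numbers are well above anything earlier Intel IGPs could manage, which is what drives the "surpass any previous IGP" expectation, drivers permitting.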

    In terms of shared memory, Intel's tile-based architecture with its larger internal buffers means it needs less bandwidth than other architectures like the Xpress 1250, which were originally designed to have the benefits of discrete memory. The GMA X3000 also looks to have some type of multithreading to work around stalled threads, which is probably similar to ATI's "Ultra-threading Dispatch Processor". The Fast Memory Access feature in the G965 chipset should also help in optimizing bandwidth.

    The problem as others have mentioned is that, as always, Intel is slow on the driver uptake. The tests that show the GMA X3000 being slower than the GMA 950 were using drivers that had no hardware T&L, VS or PS support. With the GMA 950 having hardware PS and well developed software emulation for T&L and VS it obviously had an advantage over the GMA X3000.

    The driver schedule is as follows:

    http://www.hkepc.com/bbs/itnews.php?tid=627088&starttime=0&endtime=0

    14.21: Advanced Pixel De-interlacing, Proc Amp, HDMI, DVMT 256MB support (early benchmarks)
    14.24: General performance and code optimization (current driver)
    14.25: Hardware Pixel Shader 3.0 support
    14.26: Hardware Vertex Shader 3.0 and Hardware T&L support (release driver)

    The GMA X3000 definitely has great potential, but a lot of it is up to how much effort Intel decides to put into successive driver revisions/optimizations. Hopefully, the added 3DLabs personnel and pressure from AMD/ATI will force them to put more effort into their IGPs. Now, for desktops, X600/X1300HM-level performance isn't remarkable (although it is impressive for an IGP); the real potential for the GMA X3000 is in its Santa Rosa mobile version. If it can offer MR X600/MR X1300-level performance with the low power draw of IGPs, it could really be a big hit. And since the GMA X3000 is the lowest common denominator for the Centrino platform, it could potentially mean that every laptop has at least MR X1300-level performance, which is not bad at all. Especially if Intel follows through on their hints that DX10 support will be activated on their unified shaders once Vista is launched.
     
  23. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    The specs of the X3000 do look promising (for an integrated device). As stated, if Intel can get some decent drivers together for it, it could be a respectable integrated option.
     
  24. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    Dreamer, are you blind?

    LOOK AT THE BENCHMARKS FROM YOUR LINK!!!

    sure, in some games, the GMA and x200 are about the same level, but in some (ie farcry) the difference is huge. the difference between playiong with a GMA at 10fps and a x200 at 200 fps is what night and day is

    the x200 almost doubles the GMA's fps in halflife. 20 fps from GMA is a choppy game, but at the x200's 38fps is VERY playable

    the x200 beats out the GMA on that benchmark on every game except doom 3 and probably a lot of other games not in the benchmark

    and though some may consider a 6fps difference small, it can make or break how enjoyable a game is

    EDIT: oh yea, there is a thread around here about how some people are ENJOYING their gaming experience with the x200. Too bad they didn't put FEAR as a test, i'm sure the GMA would've choked on that one. Please don't even put the GMA on the same level as the x200.
     
  25. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    playiong - that's exactly what I'm talking about

    So, read again carefully, think again and you could find an explanation
    otherwise ask me if I'm blind again.

    I'm not gonna talk about that any more. Think whatever you want.

    Find in the dictionary the definition of word "playable"
    Actually you could find other useful things there too.

    Hint: When you double a very low score the result is again low score.

    Someone says something about benchmarks...

    After searching in dictionary you can (try to) play Far cry on Xpress 200M if don't have I'll send you one for a present.

    games are usually made not to be only playable

    this converetion is pointless if you don't see
     
  26. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    This is getting a bit too heated for my tastes, so if it could be toned down/reverted to a normal conversation, that would be great.

    Everyone has their own definition of 'playable'.
     
  27. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    wow, because of two typos i should go and read a dictionary and that my point is invalidated even though everyone understands what i meant

    so what does 1 typo and
    1+ grammar mistake amount to?

    i'm guessing by your definition that a game is unplayable unless you play it at 1600x1200 with hdr and 16x aa and af.

    while there are people like you, who think that graphics are what a game is all about, equally many will find that it is a game's plot or gameplay that makes it a great game

    anyway, i think the GMA will always be a bottom-barrel gpu and even though the x3000 will be dx10 compliant, it will amount to nothing as it probably won't be able to play any games that do require dx10. i do think that once a fleshed-out driver is out for the card, it will do much better than the current GMA 950 and the x200 in dx9 games.
     
  28. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    I'm not talking about grammar at all. Generally speaking, there are more interesting things in dictionaries; you should try them.
     
  29. lowlymarine

    lowlymarine Notebook Deity

    Reputations:
    401
    Messages:
    1,422
    Likes Received:
    1
    Trophy Points:
    56
    OK, let's calm down a bit. No need to bicker over this.

    I have used both a GMA 900 and a fully-shared ATi X200M. The difference is indeed night-and-day.

    First, a brief explanation of the differences between the 900 and the 950: the GMA 950 is fundamentally very similar to the 900, with only a few slight core optimizations and a faster clock. The performance difference you see in benchmarks comes partially from the fact that the GMA 900 is benchmarked on Pentium Ms running DDR2-533 RAM, while the GMA 950 is benched on Core Duos running DDR2-667 RAM. Well, no kidding integrated graphics perform noticeably better on noticeably faster systems.

    The 3000 is a more significant step above the 950, with much better compatibility and a redesigned core, but the X3000 is the real winner, because like the x200M, Go6150, x1250M, etc., it has Hardware T&L, a vital feature for getting acceptable performance from modern games.

    The x200M (the fully-shared one, not the dedicated one, which would work even better), as I mentioned, has hardware T&L already, as well as SM2.0 support. It can run F.E.A.R. acceptably well - at 640x480 with minimum detail and DX8 shaders on - and it still looks pretty incredible. The GMA 900 chokes even with pixel doubling turned on. Halo: 1280x768, SM1.4, medium-high graphics detail, fully playable on the x200M; on the GMA 900, the resolution has to go down to 640x480 at the same settings, or the details have to drop substantially. UT2004? 1280x768, medium settings on the x200M, but it has corruption issues even at 640x480 on the GMA 900. These comparisons are a little unfair: the GMA 900 machine had a Celeron M 1.6GHz, whereas the x200M had a Turion64 2.0GHz, but they were otherwise equal.

    The fact is that the x200M has better compatibility and better performance than the GMA 950, but to answer the original question: the GMA 3000 will not be noticeably better than the 950, though the X3000 should offer decent performance in a wider range of games due to its hardware T&L.
     
  30. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Is the GMA 3000 actually redesigned from the GMA 950? That's what got me confused. The term GMA 3000 is used for both the 946GZ and the Q965/Q963. Since the 946GZ is just a 90nm shrink of the 945G, I would assume its GMA 3000 is unchanged, just clocked higher thanks to the new process.

    Now the GMA 3000 on the Q965/Q963 is weird. The fact that it's also dubbed the GMA 3000 makes it seem related to the 946GZ and 945G, but that doesn't make much sense. The Q965/Q963 share nearly all other features with the G965, such as the new memory controller and Fast Memory Access. If the Q965/Q963 actually use the same IGP as the 946GZ/945G, then that means Intel developed two chipsets separately, the Q965/Q963 and the G965, with separate IGPs. That seems like an inefficient use of resources and adds complexity. The other explanation is that the GMA 3000 in the Q965/Q963 is actually the same IGP as the GMA X3000 in the G965, only with the unified shaders locked in PS mode. That also doesn't make sense, since if the hardware is there already, why not use it? Granted, it could be to use defective parts, but then you might as well make the Q963 the defect chipset and give the Q965 the full GMA X3000 of the G965. It's all very weird.

    The caveat is that the GMA 3000 will generally be coupled with Core 2 Duo chips. Since, like the GMA 950, it uses the processor for VS and T&L, the added power of the Core 2 Duo over NetBurst chips should give it a bit of a boost. Still, you'd be better off pairing a Core 2 Duo with a more capable IGP.
     
  31. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    You're right, the GMA 3000 on the 946GZ is just a GMA 950 on a die shrink, with the speed bumped up to 667MHz.
     
  32. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Any idea about the GMA 3000 on the Q965/Q963? Is it actually an overclocked GMA 950 stuck onto a P965, or is it a crippled GMA X3000 from the G965?
     
  33. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    It's a cut-down X3000: no hardware T&L and only up to Shader Model 2. But it does have HD support.

    Info here.
     
  34. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Hmm. Since the GMA X3000 uses programmable shaders, that's an interesting way to cut things down: it just means they are programmed for SM2.0 instead of SM3.0. The HW T&L is just a program too, since I don't think any current graphics chips actually have dedicated HW T&L units anymore; they are just emulated by the VS, I believe. I wonder if that means someone can figure out how to flash the BIOS to reprogram the GMA 3000 and reactivate the GMA X3000 functionality. It could also mean that the GMA 3000 on the Q965/Q963 is faster than the GMA 3000 on the 946GZ. Since the GMA X3000 looks to have 8 unified shaders, even if they are locked in SM2.0 PS mode, that still gives the Q965 8 PS to the 946GZ's 4 PS.

    That article doesn't really say that the GMA 3000 is a cut-down GMA X3000, though. It just said that the Q963 is a cut-down Q965, but they both already use the GMA 3000. It just seems that too many features would have to be cut from the GMA X3000 to make the GMA 3000. Things like iDCT support for MPEG2 decode and VC-1 HW decode are not related to games and are useful on corporate platforms too, for presentations, yet they are not present in the GMA 3000. Similarly, OpenGL support is also useful outside of games, yet that was cut back too. I can't wait for someone to do an in-depth review and architectural analysis.
     
  35. Dreamer

    Dreamer The Bad Boy

    Reputations:
    2,699
    Messages:
    5,621
    Likes Received:
    0
    Trophy Points:
    205
    thanks for the info lowlymarine, I think I know enough about GPUs, but thanks
    as for the x200m, I admitted that several posts ago, but my strong position is that even the best trash is still trash. Grading like trash, better trash, best trash is a bit stupid to me.

    here is my position about GPUs http://forum.notebookreview.com/showthread.php?t=78144&page=2&pp=10
     
  36. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    One man's trash is another man's treasure...the more casual gamer simply doesn't NEED a powerful dedicated GPU and can get along just fine with an X200M. Yeah, my buddy was impressed at seeing World of Warcraft running on my Mobility X600 when he was just plugging along on a 200M, but it didn't make the game any less enjoyable. In my honest opinion, most games out there - with rare exception - that require a good GPU to be enjoyable aren't really worth the time anyhow.

    Make no mistake, I think there's a good argument for eye candy as entertainment unto itself, otherwise I wouldn't be running 7600s in both my machines. But those 7600s are just making already excellent games (Far Cry, Quake 4) that much more enjoyable. But Quake 4 was still a heck of a lot of fun at 640x480 on my old X600.

    What you have to understand is that these parts aren't designed to be hardcore gaming parts. They're designed to be inexpensive, battery efficient graphics adaptors capable of light gaming. The X200M and its more robust nVidia cousin the Go 6100 are impressive bits of engineering given their intended purpose. These are incredible inroads in this market; when's the last time an IGP existed that had this kind of power compared to the games on the market?

    Any improvement in the IGP market is only good for everyone, because by raising the lowest common denominator, you raise the average level of hardware that developers have to shoot for. No, they aren't for the hardcore gamer, but you need to stop thinking of yourself as the sole demographic. Some people may just want to play a quick game of World of Warcraft on the road. And God forbid, Unreal Tournament 1999 is still a great game.

    And to be entirely frank, I don't think someone as inarticulate and aggressive as you've been has any business passing judgment on anyone else's intelligence. I'll be less diplomatic than the mods here: grow up or get lost. These forums are as strong as they are because the people that post here are intelligent, informed, and most of all, mature. If you're not going to contribute something useful or ask a useful question, at least politely remain silent.
     
  37. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    dreamer, hmmmmm, I don't know how I got involved in your argument, but I do not pick favorites and I'm generally a forgiving person. If you have a problem with another forum member, then private message me; we do not need that sort of stuff on the forums.

    If this thread goes off on another tangent like that again, I'll lock it.
     
  38. jblock

    jblock Notebook Consultant

    Reputations:
    71
    Messages:
    106
    Likes Received:
    0
    Trophy Points:
    30
    It's amazing: I entered notebookreview.com barely understanding RAM, processor speeds, and other such jargon. I have learned so much just from reading conversations on the forum, and you guys have helped me so much with figuring out what brand of notebook I want to buy (Asus), but you guys still have debates sometimes on another level that I just can't understand. :)


    *but I'm working on it!