The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Intel GMA X3100 Graphics Performance Review

    Discussion in 'Notebook News and Reviews' started by Dustin Sklavos, Dec 24, 2007.

  1. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56

    The performance of Intel's GMA X3100 is a question on a lot of people's minds, at least if our forums here are any indication. And why shouldn't people be curious? The GMA X3100 is ubiquitous in retail stores, where it's next to impossible to find a notebook with dedicated graphics for under a grand and a half. If you're buying Intel, odds are pretty good you're buying a GMA X3100.

    When I went to the Game Developers Conference in February of 2007, I was pleasantly surprised by what appeared to be a renewed interest in gaming from Intel. They were very excited by their GMA X3000 series, which the X3100 is a part of, and why wouldn't they be? Here's the first graphics core they've made that's actually designed to do something other than put an image on your screen or run Aero Glass. They demonstrated a real desire to make their integrated graphics competitive with the parts from ATI and nVidia, and to raise the bar for gaming developers so that the "lowest common denominator" they had to code for would be a good one. They even took me behind closed doors and demonstrated their IGP running Half-Life 2: Episode One at medium settings at 1024x768, very playably.

    So GDC has come and gone; about ten months have passed, and several new drivers for the GMA X3100 have materialized. The question becomes: has Intel made good on their promise?


    TEST SETUP

    To test the Intel GMA X3100, I'm using the HP dv2615us outfitted with 2GB of RAM. Specifications are as follows:

    • Intel Core 2 Duo T5250 (1.5 GHz, 667 MHz FSB)
    • 2GB DDR2-667
    • Intel GMA X3100 with latest drivers, version 15.7
    • 160GB 5400rpm Hitachi SATA Hard Disk
    • DVD-RAM w/ Lightscribe
    • Intel Wireless 3945ABG w/ Bluetooth
    • Windows Vista Home Premium 32-bit

    The processor is one of the low men on Intel's totem pole, but it's nearly as common as the GMA X3100 in retail right now. It's become a very popular baseline part, and that's understandable: even at 1.5GHz, systemwide performance is formidable, as I demonstrated in my last article where I pitted it against a Turion 64 X2 clocked at 1.9GHz.

    The GMA X3100 itself is theoretically a pretty formidable piece of technology, at least for an integrated graphics part. It is presently the only IGP to use the unified shaders specified by Microsoft for DirectX 10 support, featuring 8 scalar shading units that can process video, vertex, and texture operations. It's also the first part from Intel that supports transform and lighting in hardware. While it doesn't feature its own memory, it can address up to 384MB of system RAM for video memory, and it allocates the amount of RAM it uses dynamically to ensure maximum performance.
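
    To illustrate what that dynamic allocation means in practice, here's a minimal sketch of a DVMT-style scheme (the tier values are illustrative assumptions, not Intel's published allocation table; only the 384MB cap comes from the paragraph above):

```python
# Minimal sketch of DVMT-style dynamic video memory sharing.
# NOTE: the tier values below are illustrative assumptions, not Intel's
# published table; only the 384MB cap comes from the spec above.

def max_shared_vram_mb(total_ram_mb: int) -> int:
    """Upper bound the IGP may claim from system RAM (hypothetical tiers)."""
    if total_ram_mb >= 2048:
        return 384   # cap quoted in the review
    if total_ram_mb >= 1024:
        return 256
    return 128

for ram in (512, 1024, 2048, 3072):
    print(f"{ram}MB installed -> up to {max_shared_vram_mb(ram)}MB video memory")
```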

    Presently, Intel's most recent driver enables hardware vertex shader processing, though this can be switched to software processing via a roundabout and frankly tedious method. The drivers for this hardware have seen constant updates and improvements over the past ten months, so we're going to see what those improvements have yielded.

    As a sidenote, I'm handling this review a little differently than my last two IGP reviews. Here, I'm going to focus a bit more on the hard numbers I was able to glean from this hardware, so feedback on whether you like this approach better or worse than my previous one is appreciated. Part of the reason I've chosen to do this is that, frankly, there's a lot of conflicting information on the forums about the GMA X3100's performance, and I wanted to be able to just deliver straight facts.

    DOOM 3 v1.3


    If you've been reading me for a while, you're probably aware Doom 3 is one of my favorite whipping boys for graphics hardware benchmarks. Years after the fact, the very scalable (no, really, I've run it very playably on a Radeon 8500DV) Doom 3 can still offer a decent amount of punishment for hardware that's not ready for it. At the same time, a lot of technology it uses has since become the norm (stencil shadows, heavy bump mapping, specular lighting).

    For Doom 3, switching between software and hardware vertex shader processing yielded no difference in performance, which I find particularly interesting: anecdotally, it's been my theory that the reason nVidia's IGPs always murder ATI's (even the recent X1200 series) in this game is that ATI doesn't include a hardware vertex shader like nVidia does.

    All benchmarks were run with the game set to 640x480 Low Quality. As the game was completely unplayable with everything enabled, I toggled options to see how it would perform.

    w/o Shadows                          11.5 FPS
    w/o Specular Lighting                 9.4 FPS
    w/o Shadows and Specular Lighting    11.9 FPS


    The game is unplayable. At certain points during the demo it ran smoothly, but the instant the environment got even the slightest bit complex, the framerate took a heinous nosedive.

    Given the reputed lousy DirectX 8 performance of the GMA X3100, it's unlikely that even switching to one of the other codepaths in the game can produce the performance required to make the game playable.

    UNREAL TOURNAMENT 2004 v3369


    Unreal Tournament 2004 is my other favorite whipping boy; I employed uMark to run several benchmarks on this classic. If you can't play Unreal Tournament 3, this is always available for dirt cheap. If you can... well, you're probably still better off buying this.

    I used uMark to benchmark the game in the Icetomb and Inferno maps with 7 bots enabled. I've found these to be two of the most demanding maps in the game.

    Benchmarks were run with all settings at Normal, all checkboxes checked, and shadows set at "blob." Benchmarks labeled "HS!" were run with all settings at their highest level, to the point where the game's announcer actually says "holy s#!+" when you set the last one. (Note: this isn't a joke, as most UT2k4 fans can attest.)

    Resolution        Icetomb FPS   Icetomb Min. FPS   Inferno FPS   Inferno Min. FPS
    640 x 480         41            12                 35            8
    640 x 480 HS!     31            11                 37            8
    800 x 600         40            12                 34            8
    800 x 600 HS!     30            11                 35            6
    1024 x 768        35            12                 29            9
    1024 x 768 HS!    24            6                  27            5

    As you can see, the GMA X3100 actually acquits itself pretty favorably in UT2004, delivering solid performance all the way up to 1024x768 at medium settings. This is a massive improvement from the GMA 950, which couldn't scrape these numbers the last time I played with one. So bad was that chip's performance that the game actually ran better and smoother under the software renderer.

    Unfortunately, this isn't all bread and roses. While the GMA X3100 offers very playable performance in UT2004, the framerate is erratic. It's playable, but it's not smooth. I don't expect miracles out of integrated hardware, but after reviewing nVidia's 7150M and the Go 6150 before it, I've come to know what to expect from graphics hardware. The high highs and LOW lows of the X3100 can make gameplay a pretty bumpy ride and as you can see from the benchmarks, a 35 frame per second average doesn't do anyone any favors when the framerate can drop as low as an unplayable 6 frames per second.
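
    To put numbers on why the average alone misleads, here's a quick illustrative calculation (invented frame times, not measurements from this test):

```python
# Why an average FPS number can hide unplayable dips (invented frame times).
frame_times_ms = [20] * 90 + [167] * 10   # 90 smooth frames, 10 heavy ones

total_seconds = sum(frame_times_ms) / 1000
average_fps = len(frame_times_ms) / total_seconds   # ~29 FPS overall
worst_fps = 1000 / max(frame_times_ms)              # ~6 FPS during the dips

print(f"average: {average_fps:.0f} FPS, worst moment: {worst_fps:.0f} FPS")
```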

    FAR CRY v1.4


    Far Cry is one of my favorites for benchmarking, but the flipside is that even though technically the game scales well, the visual difference is always painfully pronounced. At low settings, Far Cry can be pretty depressing to look at, and you'll sit there wondering where all the foliage went.

    For benching Far Cry, I used HardwareOC's Far Cry benchmark v1.7 running Ubisoft's stressful Volcano demo. The benchmark program has two settings for image quality, "Minimum" and "Maximum." While clearly there isn't a lot of granularity here, it should suffice.

    Resolution        Result (FPS)
    640x480 Min       51.71
    640x480 Max       23.56
    800x600 Min       50.67
    800x600 Max       21.01
    1024x768 Min      46.63
    1024x768 Max      17.07


    Far Cry carves out another win for the GMA X3100, offering very playable performance with some minor image enhancements. All told, this thing should run Far Cry at 1280x800 with medium/low details, no sweat. This is probably the biggest win I've seen for an IGP, so at least here we're seeing the X3100 start to live up to its potential. Far Cry fans will be very happy with the X3100.

    F.E.A.R. v1.08


    F.E.A.R. is another game that scales surprisingly well and is easy to test.

    First, an amusing anecdote: running the game with the graphics card set at Minimum actually results in WORSE performance than running it with the graphics card set at Low. This can very likely be attributed to the Minimum setting using DirectX 8 shaders instead of DirectX 9's, which the GMA X3100 handles much better.

    With the computer and graphics card set at "Low," I achieved the following numbers:

    Resolution    Avg FPS   Min FPS   Max FPS
    640x480       39        20        59
    800x600       32        20        52
    1024x768      24        13        35
    1280x800      20        10        27

    Below 1024x768, the X3100 acquitted itself VERY well and, unlike in other games, actually produced fairly smooth framerates. I've watched this benchmark a thousand times, and on the X3100 it always slowed down and sped up precisely as I'd expect from any AMD or nVidia part. F.E.A.R. fans are in for a treat here: the X3100 runs it very well for an IGP, as long as you don't enable Volumetric Lighting or Shadows, which historically seem to be the Achilles' heels of IGPs everywhere.

    HALF-LIFE 2: LOST COAST


    And here we come to what was hands down the most aggravating disappointment of the entire suite of benchmarks. Half-Life 2: Lost Coast was pretty stressful when it came out, but Valve's Source engine is in my opinion the most well-developed, scalable engine on the market. I'm not a huge fan of the games, but Source is an impressive piece of technology.

    It also bears mentioning that Half-Life 2 was Intel's bread and butter at GDC. Their big win was playable performance in Episode One, which inherited the technology pioneered in Lost Coast.

    So I set everything to its lowest setting and ran the video stress test. Here's what happened:

    • At 640x480, it produced an average of 20.35 frames per second.
    • At 800x600, it produced an average of 19.15 frames per second.
    • At 1024x768, it produced an average of 16.39 frames per second.

    This is horrendous. Lost Coast isn't THAT much more stressful than vanilla Half-Life 2 at these settings. I've read forum reports of people getting solid performance in Half-Life 2, but that clearly didn't reproduce here, where framerates were erratic at best and performance was dismal.

    IMAGE QUALITY

    I'm sure by now you've noticed a conspicuous lack of the screenshots that usually accompany my reviews. Simply put, image quality on the X3100 is, much like the framerates, erratic. I've played very little on it where some kind of artifact didn't creep up, be it the odd polygon here or nasty texture flicker there. About the only game I ran on it that seemed to look right was Far Cry.

    And then there's the erratic performance to discuss. While a couple of games like Far Cry and F.E.A.R. delivered as well as any competing ATI or nVidia part, the majority of my experiences with the X3100 have been disappointing. In some games, framerate slowdown appears random. In others, it goes to hell in places an ATI or nVidia part wouldn't struggle with.

    CONCLUSION

    I've heard from multiple sources that Intel's graphics driver team is hopelessly overworked and understaffed. We'll just call Intel's graphics driver team "Jim." Jim, I know you're trying your best, but you really need some help, and you really need to prioritize.

    The forums here are quick to point out that the X3100 offers much better performance in XP. At first glance, that seems reasonable, but the more you noodle it, the more insane it sounds. Simply put, the X3100 was released AFTER Windows Vista. So logically, Vista-based notebooks with X3100s are going to radically outnumber the XP-based ones. Splitting resources between the two platforms just seems like suicide: focus should be on Vista at this point, because Vista's what's shipping. Does the X3100 put a picture on the screen in XP like the GMA 950 did? Fantastic! Let's move on.

    The real crime here is that, to me at least, the promise Intel made in February hasn't been fulfilled. Performance has improved, sure, but if Intel seriously cares about competing with ATI and nVidia, they're really going to have to kick it up a notch; they can't just build a GPU and "hope" it works.

    The forums are useless for figuring out the X3100's performance because it varies so wildly; it's inconsistent and random. You can't even throw out a general performance estimate for it; games are strictly case-by-case. Half-Life 2, for example, should run absurdly well given the great performance Far Cry - a much more stressful, much less optimized game - puts out. Yet it doesn't.

    When I buy a game in the store, I can be reasonably certain that when I run it on my desktop at home on my GeForce, it'll work. It'll look like it's supposed to and it'll run consistently. If worse comes to worst, I can usually change the driver I'm running to one that WILL work. The support structure is there. The GMA X3100 is such a crapshoot right now that it's not even worth fighting with. I think I was actually insulted by how poor its performance in Lost Coast was.

    Of course, I do have a solution for all the casual gamers out there. If you must have Intel (and at this point, that's certainly reasonable), you're in luck. HP and Dell both have great mainstream notebooks on the market right now at reasonable prices that you can get a dedicated graphics card in. It doesn't have to be great, it just has to work fairly well. nVidia's GeForce 8400M GS is all over the place right now. Before applying an academic discount, you can get a dv2500t from HP with the 8400M GS for $874. From Dell, you can get it in an Inspiron 1420 for $800. And if you're willing to go up to a 15.4" notebook, the price really just tanks.

    If you're going to game even casually on your laptop, unfortunately, I don't think you can settle for the X3100. While technically the hardware is there, it's not worth gambling on, hoping that eventually Intel will magically make it suck less. There just aren't any guarantees.

    Film students/filmmakers have an old joke: "fix it in post." It's funny because if your production starts living by this motto, by the time you get to post-production, your project is going to be borderline unsalvageable.

    Apparently Intel was planning to fix it in post. Good luck with that.

     
    Last edited by a moderator: May 12, 2015
  2. John Ratsey

    John Ratsey Moderately inquisitive Super Moderator

    Reputations:
    7,197
    Messages:
    28,841
    Likes Received:
    2,166
    Trophy Points:
    581
    Thanks for another of your analytical reviews.

    I find that the X3100 is fine for my gaming needs (which stop at Solitaire :biggrin: ).

    John
     
  3. Lysander

    Lysander AFK, raid time.

    Reputations:
    1,553
    Messages:
    2,722
    Likes Received:
    1
    Trophy Points:
    55
    Well, even at the current terrible state of drivers, I can play Defcon and DoW. Which is enough, for now, I suppose.

    Damn, I really do hope they bring out better drivers. Good thing is though, when the X4500 is released, it'll use the same driver set, which means that X3100 users will get the same optimisations the X4500 users get.

    Or it means the X4500 gets crap all too...
     
  4. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    I say this thread gets stickied in the Gaming forum in big red bold letters that state "X3100 QUESTIONS HERE!".

    Nice review, Pulp.

    Conclusion: If you are going to game, do yourself a favor and get something better than the X3100.
     
  5. wesrubix

    wesrubix Notebook Guru

    Reputations:
    0
    Messages:
    62
    Likes Received:
    0
    Trophy Points:
    15
    I find I get better performance out of my X61 with games (and in general) when I use Vista (Business)--I know, shocking.

    Intel's X3100 / GMA965 is sufficient for gaming so long as you can tolerate lower detail settings.

    I'm actually not surprised I got better performance in Vista: Intel's drivers for Vista are newer than the XP ones. Doubly frustrating, though: the X3100 is DirectX 10 capable, but Intel hasn't released drivers that enable it yet.

    If you have a beefy machine and plenty of RAM (e.g. 3 gigs), the GMA will do you well enough, because you can increase the memory footprint that much more--it's proportional to your total RAM.

    I think GMA is worth the gain in portability, battery life, and price difference for people who do "light" gaming--which is to say they don't care if they can count the hairs on an opponent's face.
     
  6. lokster

    lokster Notebook Deity

    Reputations:
    63
    Messages:
    1,046
    Likes Received:
    0
    Trophy Points:
    55
    Awesome review. If anyone brings up the X3100, I'll refer them here.

    Its gaming capabilities go as far as Minesweeper. Dismal, IMO. Intel isn't good for gaming, and they should really stick to processors.
     
  7. Airman

    Airman Band of Gypsys NBR Reviewer

    Reputations:
    703
    Messages:
    1,675
    Likes Received:
    1
    Trophy Points:
    55
    I second Usapatriot's idea to make this a sticky.

    Nice job, Pulp. I obviously have no interest in the X3100, but seeing how well it can run F.E.A.R. was quite a surprise for me. It was nice to see you use it as a benchmark; those are decent frame rates.
     
  8. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,080
    Trophy Points:
    931
    Nice article Pulp, a good read. Now we have something to fall back on for evidence that the X3100 is not suitable for people who want to game a lot. It is okay for those who only want to join in a game with their friends every once in a while I suppose. It CAN play games, that is for sure, but it has basically no headroom for future gaming and as noted, the driver performance is erratic.

    I made a thread over the summer on the X3100's performance; it did reasonably well with HL2, which was very playable at my settings:
    http://forum.notebookreview.com/showthread.php?t=148045
     
  9. stabile007

    stabile007 Notebook Guru

    Reputations:
    62
    Messages:
    53
    Likes Received:
    0
    Trophy Points:
    15
    Another excellent review, Pulp. It's like you answered all the questions I had in three concise articles. In the end, my decision to get a Turion-based laptop with a GeForce 7150 has been reinforced.
     
  10. justanormalguy

    justanormalguy Notebook Consultant

    Reputations:
    117
    Messages:
    232
    Likes Received:
    0
    Trophy Points:
    30
    Very good review pulp. Great read. Answered a few questions I had in the back of my mind!

    Thanks!
     
  11. XPS1330

    XPS1330 Notebook Deity

    Reputations:
    81
    Messages:
    967
    Likes Received:
    0
    Trophy Points:
    30
    The performance you're getting seems fishy...

    I'm using 15.7.1364 drivers, with

    T7100 (1.8GHz)
    1GB of RAM :O
    Vista 32
    120GB 5400rpm


    and I get these frames:

    NBA LIVE 2003: 26-38 FPS, max, with AF, 1024x768
    HL2 demo: 14-59 FPS, med-high, no AA/AF, 1280x600 (doesn't stay at 59 FPS for long)
    COD4: 9-58 FPS, low (with some on med), no AA/AF, 800x600 (doesn't stay at 58 FPS for long)
    Battlefield 1942 demo: 38-40 FPS, no AA/AF, 1024x768
    Transformers demo: 21-36 FPS, bloom on, speed blur on, etc., 1280x600
    UT2004 demo: 17-38 FPS, highest settings for demo, 1280x600
    Tomb Raider Anniversary: 19-30 FPS, all special effects on, 1280x600

    I think the overall performance I'm getting is quite amazing for an IGP, and I don't think the x3100 is at all a crapshoot ;)

    P.S. On wifi, I get four and a half hours of battery life before it dies out, so you know, nVidia isn't always the clear choice.



     
  12. osso002

    osso002 Notebook Evangelist

    Reputations:
    33
    Messages:
    419
    Likes Received:
    0
    Trophy Points:
    30
    Do you have the latest update of DirectX 9? It doesn't come with Vista, and it made a big difference on my 8400 g
     
  13. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Another great review Pulp.

    I just wanted to point out that it isn't unreasonable for Intel to be focusing on XP performance. While the GMA X3100 may have come out after Vista, the GMA X3000 came out before. In all likelihood, XP is the most common OS paired with the GMA X3000, so when the driver team develops drivers for the GMA X3000 architecture, XP and the GMA X3000 are the baseline for development, and the results are then adapted to Vista and the GMA X3100 if any changes are needed. And Intel's CEO has been on record as saying he doesn't feel Vista will really pick up until SP1 is released, so it doesn't surprise me if XP remains the focus of development until Microsoft stops selling it mid-next year.
     
  14. jak3676

    jak3676 Notebook Consultant NBR Reviewer

    Reputations:
    13
    Messages:
    184
    Likes Received:
    0
    Trophy Points:
    30
    Nice review - well done as always. I think its sporadic performance has led to a lot of the confusion around this IGP. Nice that you could clear some of that up.

    I get tired of everyone harassing all of us IGP users. Until someone makes a dedicated card that will improve my battery life and costs less than nothing, I'll always be stuck with an IGP. I know I'm not the only one in that boat. I'm sure an 8400 is a nice dedicated card, but if my budget suddenly increased by $100 (the cost of swapping an IGP for an 8400 from either Dell or HP), I'd rather upgrade just about anything other than the GPU - maybe just buy a 2nd battery.

    Last time I checked, the IGP market was still about an order of magnitude bigger than the dedicated card market, and Intel still sells more graphics solutions than ATI and Nvidia combined. The situation has been improving over the last few generations, to the point where there are even some modern games that are playable. In comparing IGPs, Nvidia and ATI may be similar to New England while Intel is more like the Miami Dolphins, but at least they are finally playing in the same league. Last generation (GMA 950) it didn't even seem like Intel was trying to compete - it was more like a major league to minor league comparison. If you go back two generations (GMA 900) it was more like pro league to little league. At least Intel is heading in the right direction.

    If you buy your laptop to play games, then no IGP will ever really work for you. But for those of us who mainly use our laptops for office work and email, an IGP does fine, and if you get a decent IGP you can even play the occasional game too. Intel finally has what I'd consider a decent IGP. I've always stuck to older last-gen games and haven't had many problems. If Far Cry, F.E.A.R. and HL: Episode One are playable, but Doom 3, UT2004 and HL: Lost Coast are not - that's fine by me. That's still a lot more options than Minesweeper or Solitaire.
     
  15. Deify88

    Deify88 Notebook Consultant

    Reputations:
    9
    Messages:
    130
    Likes Received:
    0
    Trophy Points:
    30
    Great review. It makes me wish I had picked up an HP with a 7150 as opposed to what I have now, but then again, I didn't buy this laptop for games (school first, games second :D).
     
  16. jak3676

    jak3676 Notebook Consultant NBR Reviewer

    Reputations:
    13
    Messages:
    184
    Likes Received:
    0
    Trophy Points:
    30
    Deify - darn you, you got the laptop that I was just trying to find. I ended up getting a T-1616 (14.1", AMD TL-58 @ 1.9 GHz, ATI X1270). I couldn't find any of the T-6816's in stock anywhere. Mind if I ask what you paid and how the battery life is?
     
  17. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    In a previous article, though, I pointed out that an AMD chip on an nVidia IGP will give crap battery life compared to an Intel on a GMA X3100.

    And on the bright side, you have a mobile part that's capable of producing decent performance in some games, and apparently still has a decent amount of headroom left in driver optimizations. Every new driver from Intel is gonna be like Christmas for you; that's gotta be pretty cool.
     
  18. applevalleyjoe

    applevalleyjoe Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Fine review... sticky material! Question, though I'm not sure it can be answered here: if the X3100 is a chip, why can't it replace the existing 900 or 940 series GMAs already in place in some laptops? I've been told that it can't be done, but not why it can't be done.
     
  19. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    The X3100 isn't just a chip; it's integrated into the northbridge of Intel's 965 motherboard chipset. It can't be done because the X3100 is part of something else, something much bigger and more complex.
     
  20. jak3676

    jak3676 Notebook Consultant NBR Reviewer

    Reputations:
    13
    Messages:
    184
    Likes Received:
    0
    Trophy Points:
    30
    An AMD CPU with an ATI IGP generally gives equally poor battery life, for what it's worth. It has little to do with the IGPs, however; that comes more from the CPU.
     
  21. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Pulp, are you up for comparing the mobile GMA X3100 against the desktop GMA X3000?? I want to see what the real reason is for the difference in performance between the two. From user posts and my tests, the difference is too great.

    I had some theories:

    1. I was initially thinking that the combination of the CPU and platform differences, together with a weaker IGP on the X3100, was causing the difference. That makes sense with software T&L/VS, because the graphics rely more on the CPU (hence a greater difference between CPUs), but in hardware mode it doesn't make sense. Unless Intel made specific optimizations that make certain CPUs much faster than others when running on Intel IGPs.

    I concluded that originally because when I went from a Celeron D 2.53GHz to a Core 2 Duo E6600, my frames in WoW increased by 3x. 3x is about the difference in performance between the two CPUs in CPU-intensive single-threaded apps.

    Well, I guess I was wrong. I got a bunch of hints, and it's much simpler than I thought.

    2. The difference in performance between the X3000 and the X3100 is actually because of differences in platform (i.e., memory bandwidth) and core clock speed.

    Noxxle's (T7300, 2 gigs of PC2-5300 RAM, 5-5-5-12/1.8V) 3DMark05 results with XP and 14.32 drivers:

    Software T&L: 868
    Hardware T&L: 690

    My (E6600, 2GB PC2-6400, 5-5-5-15) results with the same drivers and OS:
    Software T&L: 932
    Hardware T&L: 998

    The difference in software mode is only 7.4%, which surprised me, as the CPU/memory differences should have made a bigger impact, or at least so I thought initially.

    But look at the hardware mode: I get 44.6% higher scores. It's theoretically also the mode where the CPU is less of a determinant of graphics performance, because more of the graphics processing is done on the GPU. It must mean that the difference is almost entirely because of core clock + memory bandwidth. (Remember what being GPU-bound means??)

    From Quake 3 Arena tests, I got 10-11% higher fps using DDR2-800 instead of DDR2-667. The rest of the total 44.6% difference is the difference between the X3100 and the X3000: the X3000 clocks 33% higher.

    Makes sense??
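
    As a back-of-the-envelope check, here's that decomposition written out (my own illustrative arithmetic, assuming the clock and bandwidth factors multiply independently and using only the numbers above):

```python
# Sanity check: do core clock and memory bandwidth multiply up to the
# observed hardware-mode gap? (Illustrative arithmetic only.)

hw_x3100 = 690    # Noxxle's 3DMark05 score, hardware T&L (mobile X3100)
hw_x3000 = 998    # my 3DMark05 score, hardware T&L (desktop X3000)
observed = hw_x3000 / hw_x3100 - 1                # ~44.6%

clock_factor = 1.33       # X3000 core clock ~33% higher
bandwidth_factor = 1.105  # ~10-11% from DDR2-800 vs DDR2-667 (Quake 3 test)
predicted = clock_factor * bandwidth_factor - 1   # ~47%

print(f"observed gap:  {observed:.1%}")   # observed gap:  44.6%
print(f"predicted gap: {predicted:.1%}")  # predicted gap: 47.0%
```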


    Another game, Need for Speed: Underground 1

    Me:
    1280x1024 All settings High
    20-25 fps (never went below 20, never went above 30)
    800x600 low
    51 fps with FRAPS, done with 1 lap race

    kyblik (another notebookreview user):
    hi
    i have just bought my new laptop - FSC Esprimo M V5505 (T2330,2GB DDR2,x3100) and i have som problems with playability of all games. i use Win XP(just installed) and i have installed version 14.31.1 drivers for x3100... but when i start for example, need for speed: underground (800x600,lowest details) a have only 10-25 fps! thats even worse than on my old pc (celeron 1,2ghz,384mb,TNT2 32mb !!! )


    My new conclusion is that the memory bandwidth + core clock speed differences can account for a 30-50% performance difference between the X3000 and the X3100. So when you see benchmarks of the X3000, that's not what you'll get with the mobile X3100.

    Oh, and the default processing for NFS:U is hardware.

    As far as I remember, the IGP Intel tested at GDC was the G965, which is the X3000.
     
  22. k3l0

    k3l0 Notebook Consultant

    Reputations:
    40
    Messages:
    281
    Likes Received:
    0
    Trophy Points:
    30
    Intel's GPU naming scheme is rather confusing.
    http://en.wikipedia.org/wiki/Intel_GMA#Table_of_GMA_graphics_cores_and_chipsets

    The GMA X3000 (G965 desktop chip) is more powerful than the GMA 3100 (G31/G33/Q33/Q35 desktop chips). But neither is as powerful as the recently-released GMA X3500 (G35) desktop chip. And the mobile GMA X3100 has the newer Shader Model 4 capabilities, like the desktop GMA X3500.
     
  23. duffyanneal

    duffyanneal Notebook Deity

    Reputations:
    539
    Messages:
    981
    Likes Received:
    0
    Trophy Points:
    30
    While I agree that the X3100 is not a gamer's graphics solution, it has more potential than what the OP demonstrates. The test machine is about the worst case you could find; moving to unmatched RAM would take it to the bottom of the barrel. As most know, the integrated graphics shares machine RAM, and so it also shares memory bandwidth via the memory bus. The T5250 CPU used in this test is severely neutered: it has half the L2 cache of a standard C2D, and the memory bus is 667 MHz as compared to the now-standard 800 MHz. The slower memory bus speed and reduced cache size act as a bottleneck for the integrated graphics. The wider/faster the pipe, the better the graphics output from the X3100. In addition, the reduced cache size means there are going to be fewer cache "hits" for the CPU, so it is going to have to access RAM more often, which ties up the already congested pipe. The last nail in the coffin is the CPU speed, which is definitely bottom shelf.
     
  24. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Although I don't disagree with you, duffyanneal - the hardware on the X3100 is indeed much weaker than the X3000, the desktop version - what I am getting at is not so much how the games run as how different the X3100 is from the X3000. Hardware mode on the X3100 seems to be slower in lots of games, even on XP. I know that with the X3000, the difference is usually much smaller, or hardware mode runs better.

    In my post above, you can see 3DMark05 results compared between me and a poster named Noxxle. He has a T7300 CPU, which is an 800MHz FSB part with 4MB L2 cache. You can see that software mode only yields a 7% difference from moving to an E6600 CPU with a 1066MHz FSB and DDR2-800.

    DDR2-533 in dual channel is enough to saturate the 1066MHz FSB, yet the Santa Rosa platform with its 800MHz FSB uses DDR2-667. That's because the IGP gains performance from the extra bandwidth. The gains from the FSB really benefit only the CPU, as long as the total memory bandwidth equals or exceeds the FSB's.
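
    For what it's worth, here's the peak-bandwidth arithmetic behind that claim (a rough sketch assuming a 64-bit FSB and two 64-bit DDR2 channels, ignoring protocol overhead):

```python
# Rough peak-bandwidth arithmetic (64-bit FSB, two 64-bit DDR2 channels).
GB = 1e9

fsb_1066 = 1066e6 * 8      # ~8.5 GB/s front-side bus
ddr2_533 = 533e6 * 8 * 2   # ~8.5 GB/s dual channel -- already saturates the FSB
ddr2_667 = 667e6 * 8 * 2   # ~10.7 GB/s -- the extra headroom can feed the IGP

print(f"FSB 1066:    {fsb_1066 / GB:.1f} GB/s")
print(f"DDR2-533 x2: {ddr2_533 / GB:.1f} GB/s")
print(f"DDR2-667 x2: {ddr2_667 / GB:.1f} GB/s")
```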

    Yet using hardware mode increases the difference to a surprisingly large 45%. That's because more of the hardware on the IGP is being used, so it's more sensitive to core clock and memory bandwidth increases.

    You should know from video card tests that low-end IGPs running demanding games bottleneck the CPU. Faster CPUs in that case don't really benefit performance much; hence, reviewers run the games at 640x480 low or something like that. Yet even in software mode, where the transform & lighting and vertex shading are done by the CPU rather than the hardware, the difference is only 7%, despite the massive gap in CPU and memory. The core clock + memory bandwidth + CPU differences should add up to more than 7%, right?? And even older discrete graphics that didn't support hardware T&L, like the TNT2, would hit situations where the video card becomes the bottleneck.

    Now we are talking about hardware mode, where the CPU is even less relevant and the GPU matters even more.

    I can see a couple of games that I can run that X3100 users can't. The difference cannot be the CPU alone. The IGP performance spectrum is at such a low end that the CPU doesn't matter past a certain point. You'd never recommend an E6600 over an E6300 when using low-end graphics like an X1300 and playing a demanding game like BioShock, right?? The difference isn't big enough, because the CPU is bottlenecked by the GPU.
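
    To illustrate the bottleneck point, here's a toy model (the per-frame costs are invented, purely for illustration):

```python
# Toy bottleneck model: frame rate is limited by the slower of the CPU
# and GPU per-frame work. Numbers are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000 / max(cpu_ms, gpu_ms)

gpu_ms = 50  # a slow IGP imposing a 20 FPS ceiling
print(fps(cpu_ms=12, gpu_ms=gpu_ms))  # 20.0 with a slower CPU
print(fps(cpu_ms=8,  gpu_ms=gpu_ms))  # 20.0 with a faster CPU: no gain
```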

    It's a little different for Intel IGPs because they can run in both software and hardware mode. The driver sometimes even does that on its own, and you can modify it yourself! However, some games can't run in software mode, and switching from hardware mode to software mode isn't easy for the average user, in which case it'll be running in hardware mode.

    It seems that we are all forgetting that mobile variants of GPUs are significantly slower than the desktop parts. It's true with the X3x00 too.
     
  25. vbrookie

    vbrookie Notebook Consultant

    Reputations:
    2
    Messages:
    255
    Likes Received:
    0
    Trophy Points:
    30
    Great article.
    I am wondering about battery life. People who consider buying an X3100 notebook don't look into games; they most likely think about battery life. It would've been super if you had compared a similarly specced system and measured the battery life too. I too considered the X3100 on a Vostro before I pulled the trigger on an M1530, because of battery life for the most part.
    Anyway, thanks for the great read.
     
  26. gilo

    gilo Notebook Deity NBR Reviewer

    Reputations:
    166
    Messages:
    707
    Likes Received:
    0
    Trophy Points:
    30
    Hmmm, but can the x3100 run King's Quest I in full 16 colors?
     
  27. stabile007

    stabile007 Notebook Guru

    Reputations:
    62
    Messages:
    53
    Likes Received:
    0
    Trophy Points:
    15
    Wha---you silly fool, of course it can't! 16 colors!? What next? Crysis 2 on an X3100? Please... you need a 70 jiggahertz processor at least for that, and at least 2TB of RAM... and a GeForce 12K Video Processing Unit with a 12kW PSU. And Windows Skynet... which of course is DirectX 20 only.
     
  28. stabile007

    stabile007 Notebook Guru

    Reputations:
    62
    Messages:
    53
    Likes Received:
    0
    Trophy Points:
    15
    But then most people with an X3100 either a) don't care about video speed and will get a super fast CPU in a more expensive laptop designed to fit their needs, or b) don't have a large laptop budget and are trying to get the cheapest unit they can get away with, in which case the most likely scenario is that they'd have a T5250 processor or slower. Which isn't even that bad.

    If you want good performance, have money to spend on a faster processor, and want to play games, then you would shift resources to a dedicated card and balance the rest of the system around a decent video card. But if you are stuck with a $700 budget, then your choices really are a GeForce 7150M, an X1270, or an X3100. So I think this review is perfect, because it goes after the primary target market - people with super tight budgets who want to know what they can get for their funds. If they don't care about playing games, then they should get an Intel-based machine, as it offers better battery life and performance at the expense of gaming performance. If they want to game, then they should get an AMD-based machine with a GeForce 7150M or X1270, as that will give them the best gaming performance. If they want to game and must have Intel, then they will need to increase the budget and start looking at a machine with an 8400M or better.
     
  29. Wooky

    Wooky Notebook Evangelist

    Reputations:
    60
    Messages:
    692
    Likes Received:
    0
    Trophy Points:
    30
    I have a silly question. Doom 3 is OpenGL, not Direct3D. So what does the poor DirectX 8 performance of the GMA X3100 have to do with it?
    Nice review. Unfortunately, mainstream notebook graphics cards are still a long way from where I expected them to be by now, and this review proves it.
     
  30. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Nice point. I didn't notice that. Doom 3 is OpenGL. It seems that the OpenGL portion of the Intel driver is much worse than the DirectX part. Remember how ATI always lagged behind Nvidia in OpenGL performance?? I guess it's the same with Intel.

    Actually, the X3100 isn't even a mainstream part; it's an integrated GPU, which is even lower-end/cheaper :p.
     
  31. lokster

    lokster Notebook Deity

    Reputations:
    63
    Messages:
    1,046
    Likes Received:
    0
    Trophy Points:
    55
    The T-1616 isn't so bad ^^ gaming-wise it's way better than the x3100 ^^
     
  32. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    It's not a silly question, actually, and I'm happy to explain my remark.

    While Doom 3 is OpenGL, it has separate rendering codepaths for different generations of DX-class hardware, most of these being DX8 or lesser parts. Doom 3's ideal codepath takes advantage of DX9-class hardware - and DX9 is about the only place the X3100 can excel.

    So if the X3100's DX8 performance is largely in the toilet (as word on the street says it is), switching the codepath to something optimized for DX8 hardware would probably hinder more than help. :)
     
  33. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    I'm sorry, I'm not interested in doing that comparison. I don't have access to an X3100 anymore, and honestly, it doesn't feel that relevant to me.

    The X3000 is clocked something like 216MHz higher (or roughly 50%) and has access to substantially more powerful hardware.

    All we're doing is proving that Intel should probably focus on getting more running on the IGP and less on the processor.
     
  34. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    Excuse me? Would you mind showing some respect?

    I do get it. That's why the comparison is worthless; the X3000 has access to substantially faster hardware - that is, the subsystem surrounding it - so the faster processor and RAM can pick up the slack. Desktop procs and RAM are ridiculously faster than their portable counterparts.
     
  35. lokster

    lokster Notebook Deity

    Reputations:
    63
    Messages:
    1,046
    Likes Received:
    0
    Trophy Points:
    55
    Fanboy ^^ As different as you say they are, basically speaking, Intel cards suck for gaming. If you want to game, or hope to, DON'T get Intel.

    Again, Pulp, thanks for this nice comprehensive review, which should answer all a newbie's questions. :cool:
     
  36. jman888

    jman888 Notebook Consultant

    Reputations:
    0
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    My experience with the X3100 on my laptop is awesome. Serious Sam 1 & 2 play well, along with HL2.
    [Screenshot: HL2 at max settings]
     
  37. WhiteLady

    WhiteLady Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    0
    Trophy Points:
    5
    Hi,

    First of all, sorry if this is not the right place to ask this question, please have a bit of patience as I'm a newbie!

    I'm not an expert on computers and would like to know if a Sony BX51VN equipped with an Intel X3100 GPU will be enough for me... I basically don't play any games, although I do some basic photo and video editing (in video, really basic).

    In case other specs help:
    Intel® Core™ 2 Duo T7250 2 GHz
    2048 MB DDR2 667 MHz. (2x 1024 MB)

    Thanks a lot for your help!
     
  38. MadHater

    MadHater Notebook Deity

    Reputations:
    229
    Messages:
    1,743
    Likes Received:
    34
    Trophy Points:
    66
    You'll be OK with the X3100; it is strong enough for the tasks you are going to do on your laptop.

    For your kind of work, the processor and RAM are much more important than the graphics.
     
  39. DanJB

    DanJB Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Hey.

    Just read your review (good review by the way) and thought I would comment.

    I don't know if anyone here has done any kind of development on these cards, but I have had major problems when rendering large polygons, where they shear and large parts of them often become invisible (if anyone would like screenshots, I can supply them).

    I contacted ACER (I have an Aspire 7720 running Vista Home Premium), who tried telling me that the application I was running was not Vista compatible, so I tested on another Vista machine running an ATI card and also one running an nVidia card. Those cards ran the application fine.

    I then contacted Intel, who say that this effect has not been reproduced in testing and that it is an issue with how the chipset has been integrated into the system.

    Bottom line: I am stuck in the middle, with ACER and Intel each denying responsibility. I would avoid the Aspire 7720 for anything other than basic office tasks.
     
  40. pundit

    pundit Notebook Consultant

    Reputations:
    11
    Messages:
    124
    Likes Received:
    0
    Trophy Points:
    30
    A quick question regarding the X3100 ...

    Would it be sufficiently powerful for Photoshop image manipulation - more to the point, using Photoshop to manipulate/enhance digital photos - or would a dedicated card such as an nVidia NVS 140 be better? (Or perhaps even an FX 570M.)

    I'm not into much gaming (the odd bout of Civilization 2 or SimCity to keep me occupied on a plane or sitting around waiting for the next flight out - but nothing extreme - I have a desktop for that kind of "work").

    I'm looking at the Thinkpad X61s with LV7700 (1.8GHz), 3GB RAM and a 7200rpm hdd with the X3100 ...

    Cheers!
     
  41. yggdrasil

    yggdrasil Notebook Geek

    Reputations:
    2
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    For 2D work (Photoshop, etc.) the X3100 is excellent. It's only the 3D gaming performance that leaves something to be desired.
     
  42. mario666

    mario666 Notebook Consultant

    Reputations:
    41
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    Hi,

    I'm going to buy a Dell Vostro, and I was wondering if there is any reason to upgrade from the X3100 GMA to the GeForce 8400M GS (128MB) graphics card, considering that I don't intend to play games on it. Are there any other advantages to having a dedicated graphics card (e.g. better with a high-resolution display?), or will it just turn power into heat? I'm assuming that in normal use (i.e. not playing games), it will consume more energy than the integrated GMA?

    Thanks in advance.
     
  43. yggdrasil

    yggdrasil Notebook Geek

    Reputations:
    2
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    If you don't plan on playing 3D games or using other 3D applications (CAD, etc) the GMA X3100 is the best choice. It will consume less power, generate less heat and the difference in 2D performance is rarely noticeable.
     
  44. mario666

    mario666 Notebook Consultant

    Reputations:
    41
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    I'll save my pennies and stick with the X3100 then.

    Thanks for the quick reply, yggdrasil!
     
  45. ash11

    ash11 Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    Is it possible to upgrade an integrated graphics card?? I doubt it!!
     
  46. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    You cannot upgrade an integrated graphics card once you have the laptop. When ordering, you basically choose between one of two motherboards: one with integrated graphics, one with dedicated.
     
  47. Wes87

    Wes87 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Hello everyone

    I just bought a Dell XPS with an Intel GMA X3100 graphics accelerator and, as you may know, this particular Dell comes with Vista Home Premium and Vista-ONLY drivers. Since I don't like Vista, I wanted to install XP Professional, but I had a problem: I can't find a driver for the graphics card.

    Can anyone help me out, please? :confused:
     
  48. neo-cortex

    neo-cortex Notebook Geek

    Reputations:
    1
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    Go to Intel's website to find the driver.

    http://downloadcenter.intel.com/Det...&OSFullName=Windows* XP Professional&lang=eng

    This should be what you're looking for.
     
  49. Clutch

    Clutch cute and cuddly boys

    Reputations:
    1,053
    Messages:
    2,468
    Likes Received:
    28
    Trophy Points:
    66
    What I am planning on doing is getting a sub-$1000 laptop (with warranty).
    If I get the Intel graphics, will it matter which laptop I get it in?

    As in, will the fps on an HP dv2700 be the same as on a Sony Vaio CR (assuming the same RAM and CPU)?
     
  50. corujo712

    corujo712 Notebook Evangelist

    Reputations:
    26
    Messages:
    348
    Likes Received:
    0
    Trophy Points:
    30
    Will the X3100 be able to run Guitar Hero III on a Mac? The specs say the minimum is an ATI X1600. I am going to upgrade my RAM from 2GB to 4GB, if that makes a difference in available VRAM.

    Is the X3100 better than the X1600?
     