The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    X3100 screenshots

    Discussion in 'Gaming (Software and Graphics Cards)' started by StarScream4Ever, Jul 11, 2007.

  1. StarScream4Ever

    StarScream4Ever Notebook Consultant

    Reputations:
    3
    Messages:
    210
    Likes Received:
    0
    Trophy Points:
    30
    Has anyone tried to play games on Intel's newest mobile graphics card? If so, how's the performance? Screenshots with frame rates would be nice.
     
  2. CodeMonkeyX

    CodeMonkeyX Notebook Deity

    Reputations:
    118
    Messages:
    1,168
    Likes Received:
    0
    Trophy Points:
    55
    It's not looking great. I have read some things about image-quality problems and average performance in games. But at the same time, Intel is lagging behind with new driver releases, so new drivers might help performance in the future.

    I guess it depends on which games you play and how important high-quality graphics are to you. But personally, if I were seriously planning on gaming, I would look into a dedicated ATI or Nvidia card.
     
  3. Onimun

    Onimun Notebook Consultant

    Reputations:
    67
    Messages:
    247
    Likes Received:
    0
    Trophy Points:
    30
  4. StarScream4Ever

    StarScream4Ever Notebook Consultant

    Reputations:
    3
    Messages:
    210
    Likes Received:
    0
    Trophy Points:
    30
    That's from the beta driver, right? I heard it was about 30 fps.
     
  5. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    My testing with the beta driver showed that, at the settings where it's playable, there is no performance difference in most games. One game, CoD2, showed a difference, but the performance fluctuation is so vast that I can't really judge what the problem is. Rendering problems occur when it's running REALLY slow for prolonged periods at under 3 fps, but if it's running normally, there are no problems.

    Of course, the level of "playability" I am talking about may not match other users' standards. Prey got 18 fps in the first act, and Quake 4 got 22 fps. The FEAR MP demo got an average of 57 fps at 640x480 on medium with DX8 shaders off. Turn on DX8 shaders in the options and crank it up to 800x600, and it dropped to an average of 13 fps.

    CoD2 is fine at 640x480 with everything low and Force DX7 on. It gets around 45-50 fps. It gets decent frame rates (25-35 fps) at medium settings and 800x600. Disable Force DX7 and it plummets to 15-20 fps at 640x480 with everything low. Crank up the resolution and settings and it'll be unplayable, even if the settings are put back to minimum, until the game is restarted.

    Intel needs to improve the performance vastly before I think it becomes a competitive IGP. It's still slower than the GMA 950. It has better compatibility, yes, but there is no point with such low performance.
     
  6. HavoK

    HavoK Registered User

    Reputations:
    706
    Messages:
    1,719
    Likes Received:
    0
    Trophy Points:
    55
    I would definitely count that as superior to the GMA 950!

    FEAR - At least it runs smoothly on low-medium settings. On the GMA 950, at absolute bare settings, I could barely hit 25 fps idle, averaging about 18. And DX8 shaders should help your frame rate, not impede it, so I'm not quite sure what's going on there....

    COD2 won't even start on the 950....

    Hopefully the final drivers will offer much better performance...
     
  7. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    It seems that I misunderstood the description of the DX8 shader setting in FEAR. It actually says: "Force DX8 shaders even if the hardware supports DX9." Since DX9 shading is more graphically intensive than DX8 shading, I got very confused when it ran faster with DX9 shading. Maybe the unified shaders handle it better than running on individual shaders, at least for FEAR. It's not true with CoD2, though.

    I have a Core 2 Duo E6600, btw, with dual-channel DDR2-800. I should have quite a bit of an advantage over your mobile system, which only runs dual-channel DDR2-667 and a slower CPU. Before you say the CPU won't matter in graphics-intensive situations: it matters a lot for the G965.