The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    intel x3100 IGP tweaking thread

    Discussion in 'Gaming (Software and Graphics Cards)' started by nav-jack, Jul 17, 2008.

  1. nav-jack

    nav-jack Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    This is my intro thread and also a normal post all in one. I have an Acer laptop with these specs:

    Intel Celeron 1.8 GHz
    1 GB RAM
    and of course, an X3100.

    I don't have the option to upgrade this laptop, so I'd really like to use the tweaking skills I picked up on my old desktop build (an Athlon 64 with a 9800 with 512 MB of RAM). I was around for the big, BIG 3DMark 2001 scene, where squeezing out 10 more 3DMark points was amazing, and I'm let down that I cannot, for the life of me, find a single tweaking program for this IGP. Even my old Compaq with an ATI 200M is oddly faster in some areas (though it has heat issues), but at least it let me overclock, underclock, and use the normal ATI tweaks. I just ran 3DMark 2001 with the new drivers Intel released.
    My report is in the attachments.

    EDIT: I've noticed more and more that the big issue with any laptop IGP is fill rate. Any time you get a lot of transparencies or polygons you hit an almost brick wall. I know you can't tweak your way around the math of it, but anything beyond what you see in the Intel graphics panel would be interesting, or even a third-party driver.
     

    Attached Files:

  2. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    Why are you running a new card on a seven-year-old benchmark? It won't show its "powers" on such an old test. The GMA 950, for example, scores better there, while in 3DMark06 the X3100 owns the GMA 950.
     
  3. nav-jack

    nav-jack Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    If I can't really run anything with DirectX 9 in it (Team Fortress 2), why would something newer do worse on old tech? I'm hating computing more and more. Bring back the GeForce 3 Ti 200. I want some old-fashioned stability and predictability, not cons and marketing.
     
  4. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    You need a lesson in technology. If you hate computers, get a console.

    Newer tech is not meant for running old stuff like that; it's optimized for the new stuff.
     
  5. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Unfortunately I see the Celeron as a bigger problem than your X3100...
     
  6. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    Yes I think I can give you a tweak:

    The X3100 has 8 unified shaders. If it does Hardware Transform & Lighting (HW T&L), it has to use some of those shaders for vertex shading. Let's assume 2, so only 6 are left for pixel shading.

    You can switch to Software T&L and have the CPU do the vertex shading. In that case all 8 shaders are available for pixel shading, which is often faster for older games, since vertex shading is not computationally intensive.

    An Intel document that describes this in detail can be found here:
    http://www.intel.com/assets/pdf/whitepaper/318888.pdf

    You may also have a look at this posting to see how to switch between them:
    http://softwarecommunity.intel.com/isn/Community/en-US/forums/permalink/30229561/30255386/ShowThread.aspx#30255386

    The default is HW T&L; in the driver .inf, SW T&L has been selected for these apps:

    HKR,, ~3DMark03.exe, %REG_DWORD%, 1
    HKR,, ~3DMark05.exe, %REG_DWORD%, 1
    HKR,, ~3DMark06.exe, %REG_DWORD%, 1

    Thus I assume you would also get higher 3DMark01.exe scores by enabling SW T&L. In case you don't want to mess with the registry, just rename your 3DMark01.exe -> 3DMark03.exe and try again.

    I even have benchmarks for this. Look at this benchmark list for Vista x32:
    http://www.computerbase.de/forum/showthread.php?t=309411
    You will notice that the 3DMark01 score drops as soon as HW T&L was introduced, but the 3DMark06 scores stay roughly the same. That's because 3DMark06 is included in the list of titles that run SW T&L by default (see the Intel thread above).
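    The HKR lines above live in the driver's .inf; on an installed system, HKR expands to the display adapter's software key in the registry. Here is a minimal sketch that generates a .reg file adding your own per-app SW T&L overrides in the same format. The key path is an assumption (adapter instance 0000 under the standard display class GUID) — verify it with regedit on your own machine before importing anything.

```python
# Sketch: build .reg text that mirrors the driver .inf's per-app SW T&L
# entries. ASSUMPTION: the adapter's software key is instance 0000 under
# the standard display class GUID -- confirm with regedit first.

# Hypothetical expansion of HKR for the display adapter:
DRIVER_KEY = (r"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control"
              r"\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000")

def sw_tnl_reg(exe_names):
    """Build .reg text forcing SW T&L (dword 1) for each executable."""
    lines = ["Windows Registry Editor Version 5.00", "", f"[{DRIVER_KEY}]"]
    for exe in exe_names:
        # The driver matches on "~<exe name>"; dword:1 selects SW T&L,
        # the same convention as the ~3DMark0x.exe lines above.
        lines.append(f'"~{exe}"=dword:00000001')
    return "\n".join(lines) + "\n"

print(sw_tnl_reg(["3DMark01.exe", "hl.exe"]))
```

    Save the output as a .reg file and double-click it to merge; delete the values (or set them to 0) to go back to HW T&L.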
     
  7. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    Sorry to say this, but you are wrong. The Celerons are a lot better than they used to be, and the X3100 is the bottleneck here. He would get slightly better performance with a dual-core CPU, especially in 3DMark, but for real-life gaming I doubt he would notice it at all.
     
  8. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Agreed, for gaming the CPU definitely isn't the thing to worry about here. I didn't read the thread thoroughly, my fault.
     
  9. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    A tweak for you: install RivaTuner to get a little application called D3DOverrider. It lets you enable triple buffering for all your D3D games; it's a must if you use vsync.
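    For anyone wondering why triple buffering matters with vsync: with double buffering, a frame that misses a refresh has to wait for the next vblank, so the frame rate snaps down to refresh/2, refresh/3, and so on, while triple buffering gives the GPU a spare buffer to keep rendering into. A toy model of that arithmetic (not measured data, just the scheduling rule):

```python
import math

REFRESH_HZ = 60.0
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def fps_double_buffered_vsync(render_ms):
    """Double buffering + vsync: each frame waits for the next vblank,
    so the frame interval rounds UP to a whole number of refreshes."""
    intervals = math.ceil(render_ms / VBLANK_MS)
    return 1000.0 / (intervals * VBLANK_MS)

def fps_triple_buffered_vsync(render_ms):
    """Triple buffering: the GPU renders into a spare buffer instead of
    stalling, so throughput is capped only by render time and refresh."""
    return min(1000.0 / render_ms, REFRESH_HZ)

# A 20 ms frame just misses one vblank: double buffering snaps to
# 30 fps, while triple buffering keeps the full 50 fps.
print(fps_double_buffered_vsync(20.0))
print(fps_triple_buffered_vsync(20.0))
```

    On a slow IGP that hovers just under the refresh rate, that snap-down is exactly the stutter D3DOverrider's forced triple buffering avoids.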
     
  10. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    In theory you are right.

    However, a 533 MHz FSB CPU limits BOTH the core clock speed of the IGP and the memory clock on the GM965/GL960.
     
  11. nav-jack

    nav-jack Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    I've done tons of research on old games and Source engine games... there seems to be a lack of hidden surface removal, or anything to do with geometry optimization based on what is perceptually visible.

    EDIT: Say the camera viewpoint sees a building, with trees, more buildings, and a road behind it, and a door to a building is blocking the view. The X3100 will still render everything behind that door and building, as long as the game engine has already loaded the geometry and is not a streaming engine. The Source engine SDK stress test shows this best.
     
  12. nav-jack

    nav-jack Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    Double post, blah blah blah... I'm going to install v15.4.4 final (7.14.10.1295) for Vista and post my results. So far the X3100 is faked into Vista compatibility and is not much better than a pre-T&L 3D card, yet it's "new" and integrated. I don't care if people defend it as a cheap alternative for laptops; it's a step back of many, many years dressed up as a new laptop product. This is why, in most games that let me, I run in FULL SOFTWARE rasterization mode.
     
  13. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
  14. nav-jack

    nav-jack Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    Where is the point in pointing out the obvious to me? So far the old drivers are faster than when I tried forcing software T&L in Half-Life 1 based games/mods. It almost seems like nobody here really knows anything about this. I'm going to rule out any possibility of this laptop running ANYTHING DirectX 9+ based: Source engine, Doom III, F.E.A.R. (LithTech). Gaming online doesn't work for me at 5 fps or less. What I want now is to make this card perform for the old games, without a hiccup.

    1. In Half-Life 1, during the intro voiceover, any dynamic light that 'turns on' in the level gives my system a good lag hit for a second.
    2. I still have an odd geometry processing issue: there is no perceptual ordering when rendering geometry in any game or software (Cinema 4D or GoldSrc engine games).
    3. Intel has cheated its way into making this IGP Vista friendly, and I want some help with that one. This is NOT Vista compatible in any way.
    4. I want people to wake up and stop being slow... I mean, c'mon.
     
  15. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    HL2 runs at 30-plus FPS on my laptop with all settings at highest; Half-Life 1 runs worse though.
     
  16. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    The X3100 does have hidden surface removal. It uses a recent technique called Early Z.



    1. Integrated GPUs are made to save cost and power. You can't have all three of performance, low power, and low cost; it achieves the latter two by sacrificing the first, performance.
    2. It is fully Vista capable. "Vista capability" is the most overblown thing ever. All you need to run Vista smoothly is hardware DX9-capable shaders, dual-channel memory, and a good CPU.
    3. The X3100 is FULLY DX10 capable. There is no such thing as a partially DX10-capable GPU; either you fully meet the spec or you don't.
    4. Intel's driver development isn't very good. I notice it just from setting up their motherboards. The hardware is awesome; the driver is... nvm.
    5. Most people don't fully know what T&L is. T&L stands for Transform & Lighting, the official term for a DX7-era feature. Nowadays it's done by vertex shaders, which are more flexible and programmable; T&L is a non-programmable, fixed-function forerunner of the modern vertex shader.
    6. The X3100 is optimized for modern shaders, but because it's an IGP they couldn't spend many transistors on it, which sacrifices performance. The reason Low and High settings run similarly on the X3100 in some games is that it doesn't execute old, simple shader instructions any faster than new ones.
    7. The combination of under-funded driver development and IGP constraints, together with Intel's ultra-fast CPUs, means the CPU-run software mode can sometimes be faster in games that don't use modern shaders.
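    To make the Early Z point concrete: the win is that occluded fragments are depth-rejected BEFORE the pixel shader runs, instead of after. A toy model of one pixel's bookkeeping (not the actual hardware pipeline, just the counting), using the "door in front of scenery" example from earlier in the thread:

```python
# Toy model of Early Z vs. late Z: count pixel-shader invocations for a
# single pixel when several overlapping surfaces are drawn. Smaller
# depth = closer; surfaces listed in draw order.

def render(surfaces, early_z):
    """surfaces: list of depth values in draw order. Returns the number
    of pixel-shader invocations for this one pixel."""
    depth_buffer = float("inf")
    shader_runs = 0
    for depth in surfaces:
        visible = depth < depth_buffer
        if early_z:
            if not visible:
                continue            # fragment rejected BEFORE shading
            shader_runs += 1        # shade only surviving fragments
        else:
            shader_runs += 1        # late Z: shade first...
            if not visible:
                continue            # ...then throw the result away
        depth_buffer = depth
    return shader_runs

# Door at depth 1.0 drawn first, scenery behind it at 2.0, 3.0, 4.0:
print(render([1.0, 2.0, 3.0, 4.0], early_z=False))  # shades all 4
print(render([1.0, 2.0, 3.0, 4.0], early_z=True))   # shades only the door
```

    Note the catch: Early Z only saves work when occluders land in the depth buffer before the stuff behind them. If the engine submits back-to-front, every fragment passes the depth test and still gets shaded, which may be why the "everything behind the door" cost is still visible in practice.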