This is my intro thread and a normal post all in one. I have an Acer laptop with these specs:
Intel Celeron 1.8 GHz
1 GB RAM
and of course, an X3100.
I don't have the option to upgrade this laptop, so I'd really like to put to use the tweaking skills I built up on my old homebuilt computer with an Athlon 64 and a 9800 with 512 MB RAM... I was around for the big, BIG 3DMark2001 scene, where squeezing out 10 more 3DMark points was amazing. And I'm let down that I cannot, for the life of me, find a single tweaking program for this IGP. Even my old Compaq with an ATI Xpress 200M is oddly faster in some areas (though it has heat issues), but at least it let me overclock, underclock, and use the normal ATI tweaks. I just ran 3DMark2001 with the new drivers Intel released.
Here is my report in the attachments.
EDIT: I've noticed more and more that the whole issue with any laptop IGP is fillrate. Any time you get a lot of transparencies or polygons you hit an almost brick wall. I know you can't tweak that out of the math of the whole thing, but anything beyond what's exposed in the Intel graphics panel would be interesting, or even a third-party driver.
-
Why are you running a new card on a 7-year-old benchmark? It won't show its "powers" on such an old test. The GMA 950, for example, scores better there, while on 3DMark06 the X3100 owns the GMA 950.
-
Because newer tech is not meant for running old stuff like that; it's optimized for the new stuff. -
Unfortunately, I see the Celeron as a bigger problem than your X3100...
-
Yes, I think I can give you a tweak:
The X3100 has 8 unified shaders. If it does Hardware Transform & Lighting (HW T&L), then it has to use some of these shaders for vertex shading. Let's assume 2, so there are just 6 left for pixel shading.
You can switch to Software T&L and have the CPU do the vertex shading instead. In that case all 8 shaders are free for pixel shading, which is often faster for older games; vertex shading is not computationally intensive.
An Intel document that describes this in detail can be found here:
http://www.intel.com/assets/pdf/whitepaper/318888.pdf
You may also have a look at this posting to see how to switch between them:
http://softwarecommunity.intel.com/isn/Community/en-US/forums/permalink/30229561/30255386/ShowThread.aspx#30255386
The default is HW T&L, and SW T&L has been pre-selected for these apps:
HKR,, ~3DMark03.exe, %REG_DWORD%, 1
HKR,, ~3DMark05.exe, %REG_DWORD%, 1
HKR,, ~3DMark06.exe, %REG_DWORD%, 1
Thus I assume you would also get higher 3DMark01.exe scores by enabling SW T&L. If you don't want to mess with the registry, just rename your 3DMark01.exe -> 3DMark03.exe and try again.
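If you'd rather add the entry yourself than rename the exe, here is a minimal .reg sketch following the same pattern as the HKR lines above. On an installed display driver, HKR resolves to the adapter's software key under the Display class GUID; the instance number 0000 is an assumption on my part, so check which 00xx subkey has a DriverDesc naming your Intel chipset before importing, and use the exact name of the benchmark's executable:

Windows Registry Editor Version 5.00

; assumption: 0000 is the Intel adapter's instance; verify via DriverDesc first
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
"~3DMark01.exe"=dword:00000001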
I even have benchmarks for this. Look at this benchmark list for Vista x32:
http://www.computerbase.de/forum/showthread.php?t=309411
You will notice that the 3DMark01 score drops as soon as HW T&L gets introduced, but the 3DMark06 scores stay roughly the same. That's because 3DMark06 is on the list of titles that run SW T&L by default (see the Intel thread above). -
-
A tweak for you: install RivaTuner to get a little application called D3DOverrider. It lets you enable triple buffering for all your D3D games; it's a must if you use vsync.
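For reference, this is roughly what triple buffering means at the D3D9 API level. A sketch of what a game would request itself if it cooperated (the function and setup here are illustrative, not D3DOverrider's actual code; D3DOverrider forces the equivalent on games that don't ask for it):

#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Create a vsynced, triple-buffered device for an existing window.
IDirect3DDevice9* CreateTripleBufferedDevice(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // take the desktop format in windowed mode
    pp.BackBufferCount      = 2;                       // 2 back buffers + front buffer = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync on, where triple buffering pays off

    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, // vertex work on the CPU, same idea as SW T&L
                      &pp, &dev);
    return dev;
}

With only double buffering plus vsync, a missed refresh stalls the GPU until the next one; the third buffer lets it keep rendering instead, which is why the tool matters on a slow IGP.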
-
However, running on a 533 MHz FSB CPU limits BOTH the core clock speed of the IGP and the memory clock on the GM965/GL960. -
I've done tons of research in old games and Source engine games... there seems to be a lack of hidden surface removal, or of any geometry optimization based on what is actually visible.
EDIT: if the camera viewpoint sees a building with trees and buildings behind it and a road, and a door to a building is blocking the view, the X3100 will still render everything behind that door and building, as long as the game engine has already loaded the geometry and is not a streaming engine. The Source engine SDK stress test shows this best.
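To make the point concrete, here's a hedged C++ sketch of what skipping that hidden geometry looks like with D3D9 occlusion queries. The DrawOccludeeBounds callback is hypothetical, and GoldSrc/Source don't literally do this on the X3100; it just shows the kind of check whose absence I'm describing:

#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// dev is an existing device; DrawOccludeeBounds is a hypothetical callback
// that draws a cheap bounding box for the object with color writes disabled.
bool IsProbablyVisible(IDirect3DDevice9* dev,
                       void (*DrawOccludeeBounds)(IDirect3DDevice9*))
{
    IDirect3DQuery9* query = NULL;
    if (FAILED(dev->CreateQuery(D3DQUERYTYPE_OCCLUSION, &query)))
        return true;               // no query support: just draw the object

    query->Issue(D3DISSUE_BEGIN);
    DrawOccludeeBounds(dev);       // render the proxy geometry only
    query->Issue(D3DISSUE_END);

    DWORD visiblePixels = 0;
    // Busy-wait for the result; a real engine would reuse last frame's answer.
    while (query->GetData(&visiblePixels, sizeof(visiblePixels),
                          D3DGETDATA_FLUSH) == S_FALSE) {}

    query->Release();
    return visiblePixels > 0;      // 0 pixels passed the depth test -> skip the real draw
}
-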
Double post, blah blah blah... I'm going to install the v15.4.4 final (7.14.10.1295) for Vista and post my results. So far the X3100 is faked into being Vista compatible and is not much better than a pre-T&L 3D card, yet it's "new" and integrated. I don't care if people back it up as a cheap alternative for laptops; it's a step back of many, many years dressed up as a new product for laptops. This is why, in most games that let me, I run in FULL SOFTWARE rasterization mode.
-
Once you know how to switch between Software T&L and Hardware T&L, as described just a few posts above:
http://forum.notebookreview.com/showpost.php?p=3634860&postcount=6
you will get the full performance the 15.4.4 drivers have to offer:
http://www.computerbase.de/forum/showthread.php?t=309411 -
Where is the point in pointing out the obvious to me? The old drivers are still faster than what I got when I tried forcing software T&L in Half-Life 1 based games/mods. It almost seems like nobody here really knows anything about this. I'm going to totally rule out any possibility of this laptop running ANYTHING DirectX 9+ based: Source engine, Doom 3, F.E.A.R. (LithTech). Online gaming just doesn't work for me at 5 fps or less. I want to make this card perform, now, for the old games, without a hiccup.
1. In Half-Life 1, during the intro voiceover, any dynamic light that 'turns on' in the level gives my system a good lag hit for a second.
2. I still have an odd geometry processing issue; there is no perceptual ordering for rendering geometry in any game or software (Cinema 4D or GoldSrc engine games).
3. Intel has cheated its way into making this IGP Vista friendly; I want some help with that one. This is NOT Vista compatible in any way.
4. I want people to wake up and stop being slow... I mean, c'mon. -
HL2 runs at 30-plus FPS on my laptop with all settings at highest; Half-Life 1 runs worse, though.
-
1. Integrated GPUs are made to save cost and power. You can't have all three of performance, low power, and low cost; it achieves two by sacrificing one, in this case performance.
2. It is fully Vista capable. Vista capability is the most overblown thing ever. All you need to run Vista smoothly is hardware DX9-capable shaders, dual-channel memory, and a good CPU.
3. The X3100 is FULLY DX10 capable. There is no such thing as a partially DX10-capable GPU; either you fully meet the spec or you don't.
4. Intel's driver development isn't very good. I noticed it just from setting up their motherboards. The hardware is awesome; the driver is... never mind.
5. Most don't fully know what T&L is. T&L stands for Transform & Lighting, the official term for a DX7-era feature. Nowadays it's done by vertex shaders, which are more flexible and programmable. T&L is a non-programmable, fixed-function version of the modern vertex shader (see the sketch after this list).
6. The X3100 is optimized for modern shaders. But because it's an IGP, they couldn't put in lots of transistors, which sacrifices performance. The reason Low and High settings run similarly on the X3100 in some games is that it doesn't execute old, simple shader instructions any faster than it does the new ones.
7. The combination of under-funded driver development, the fact that it's an IGP, and Intel's ultra-fast CPUs lets the CPU-run software mode sometimes run faster in games that don't use modern shaders.
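To put item 5 in code: a minimal D3D9 sketch of the two paths, purely illustrative. The fixed-function one is what "T&L" meant in the DX7 era; the shader comment below it is the modern replacement:

#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// DX7-era fixed-function path: hand the driver matrices and lights,
// and the non-programmable T&L unit does the rest.
void UseFixedFunctionTnL(IDirect3DDevice9* dev,
                         const D3DMATRIX& world,
                         const D3DMATRIX& view,
                         const D3DMATRIX& proj)
{
    dev->SetVertexShader(NULL);                 // no programmable shader bound
    dev->SetTransform(D3DTS_WORLD, &world);     // the "Transform" part
    dev->SetTransform(D3DTS_VIEW, &view);
    dev->SetTransform(D3DTS_PROJECTION, &proj);
    dev->SetRenderState(D3DRS_LIGHTING, TRUE);  // the "& Lighting" part
}

// The modern equivalent is a vertex shader: the same transform, written
// as a small program (HLSL, compiled and bound via SetVertexShader):
//
//   float4x4 WorldViewProj;
//   float4 main(float4 pos : POSITION) : POSITION
//   {
//       return mul(pos, WorldViewProj);
//   }

On a unified-shader part like the X3100 there is no dedicated T&L unit at all, which is exactly why the driver can choose to run that vertex work on the CPU instead (the SW T&L toggle discussed earlier in the thread).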