Yes, if you paid squat in the first place.
We know the X3100 is a piece of crap, but why do people always need to point it out? Most of us do have other gaming means, but every time a new set of drivers is released we get something extra; it's at least something.
Regardless, when the chip first came out, it was at around 11 FPS with HDR. I'm guessing 15-25 is at least a bit better
-
Who was pointing out that the card is a "piece of crap"? Those are your words, buddy, not mine. I was simply pointing out the fact that to me those framerates are not "excellent". Now, I understand that they are "excellent" in comparison, but you just stated they were "excellent".
I don't think the card is a "piece of crap". I think people misunderstand the point of the card. -
As I said in another thread, it's not a bad chip and there are some good ideas behind it. -
StarScream4Ever Notebook Consultant
True, this card isn't all that bad, unless you're a gamer who knows you shouldn't play games on an integrated card but does anyway and complains about it. Intel is just taking a step towards making casual gaming possible for those who can't afford a laptop powered by a GeForce 8600M GT or higher, but just want to play simple games at reasonable settings. I do most of my gaming on a PC, but every time a new driver is out for the X3100, it's a good thing to test it to see if the card's performance is going up. -
We all know that this card can't be compared to Nvidia, for example, but it's a good card and it can be improved with better drivers. Me, for example, I only use it to play Halo and Age of Empires III, because I know it won't manage newer games very well, so I just want the X3100 to play those games with the best performance possible.
-
Fair enough, but this is a chip that struggles on all the worst games yet pulls a decent and very playable framerate on a game with HDR. Real or fake, that's impressive, especially considering that when this chip first came out it couldn't even keep around 11 FPS.
And importantly, it is indeed excellent for such a cheap chip. For me, it cost £430, but anything with an nVidia or ATi solution would've cost me at least £550+.
Considering my brother was paying for it because my desktop was screwing up on me at all the right times, I didn't feel I needed to rip him off. People have been saying drivers won't do anything; truth is, they have done a LOT. And the proof is there: Far Cry is a very stressful game, for example, but it plays quite well, much better than Doom 3, which is a lot more optimised yet is trash on the X3100. In other words, it will remain crap, but something is better than nothing -
I did a couple of tests in 3DMark05
with _3dmark05=1 ~3dmark05=1
3DMark Score
799 3DMarks
CPU Score
5277 CPUMarks
with _3dmark05=0 ~3dmark05=1
3DMark Score
782 3DMarks
CPU Score
4825 CPUMarks
Which is good proof of IntelUser's point about engaging HW T&L.
Now, about the ~ prefix: DO NOT CHANGE IT! When changed to 0, it brings a complete texture mess to 3DMark. You can try it yourself for a laugh))
It's just more proof that Intel is slowly but surely improving HW T&L driver support. -
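If anyone wants to peek at those values without digging through regedit, here's a rough Python sketch. Big caveat: the exact registry home of the Intel driver's per-app flags is an assumption on my part (the GUID is the standard display-adapter device class, but the trailing 0000 subkey index varies per machine), so treat it as a starting point, not gospel.

import winreg

# ASSUMED location of the Intel GMA driver's per-application flags;
# the {4D36E968-...} GUID is the standard display-adapter class, but
# the trailing "0000" subkey index differs between machines.
KEY_PATH = (r"SYSTEM\CurrentControlSet\Control\Class"
            r"\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000")

def read_flag(name):
    # Read one of the per-app DWORDs, e.g. "_3dmark05" or "~3dmark05".
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, _vtype = winreg.QueryValueEx(key, name)
        return value

print("_3dmark05 =", read_flag("_3dmark05"))
print("~3dmark05 =", read_flag("~3dmark05"))

And per the warning above: look at the ~ one, but don't touch it. -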
Use CyberLink PowerDVD 8 Deluxe. Click on Configuration/Video and check 'Enable hardware acceleration (Intel Video...)'.
-
The latest 15.9 (7.14.10.1472) driver for Vista has a noticeable performance gain over previous drivers.
The Half-Life 2 demo is playable on max settings at 1280x800 with AA and AF on.
Counter-Strike: Source is playable at 1280x800 with default settings. -
NJoy, I have a request. Can you run the 3DMark2001/2003 fillrate tests for me? Just the fillrate.
(You can try the 3DMark2005 fillrate tests, but they have to be registered, while 01 and 03 don't.)
Yep. The X3x00 (X3000/X3100/X3500) doesn't excel in older games that are "pixel bound". Lots of the MMORPGs fall into that category, and software mode runs them faster. Hardware will be faster when the game/app is shader intensive, and games like Far Cry/Half-Life 2 have lots of that. Age of Empires also performs much better in particular battles in hardware mode (ship battles get 3-5 fps in software regardless of settings, and hardware gets 2-3x better than that). -
Last time I tried to run 3DMark03 it crashed. Gonna install it again now and do just a fillrate test.
-
Okay:
1024x768
Single-Texturing - avg 7-8fps
Multi-Texturing - avg 17-18fps
1440x900
Single-Texturing - avg 4fps
Multi-Texturing - avg 10fps -
Also did a full test: 1292 3DMarks. Poor, very poor. And I've never seen such bad framerates in the SM 2.0 tests.
-
I can't believe this thread is still alive.
I personally gave up on the X3100 when I found out it couldn't play BF1942 smoothly at 640x480, all lowest settings, on the city maps.
8600GT here I come, baby -
Yet the X3100 on Vista can play BF2 very well on Medium.
-
The X3100 seems to have a real problem playing old games; my CS:S runs faster than CS 1.6.
I read somewhere it's because of the way its pixel pipelines work -
StarScream4Ever Notebook Consultant
The X3100, strangely, is able to perform well on recently released games ('04-present), but anything from '03 and earlier seems to lag a bit.
-
Starscream: (Rep points for IntelUser)
-
Thanks, but I want the actual results. It will show the results in MTexels/s.
-
Aw, sorry... here they are:
1024x768
ST 483.8MTexels
MT 929.6MTexels -
I'm having a problem with the X3100 drivers in my HP dv6600t. On XP it shows that max memory is 128MB, but on Vista it's 256MB (hate Vista). My roommate has an HP Compaq with the X3100 that shows a max memory of 384MB. I tried all the XP drivers out there; nothing works. Please help.
Could this happen because of some other driver I installed? Which one?
*note: my RAM is 1GB, same as my friend's -
This is the Maya PLE view problem:
http://www.laptopvideo2go.com/forum/index.php?showtopic=18054
edit: scratch that, it seems to be fixed in 15.9 -
It's not going to matter. Are you just doing it to show off?
-
NJoy: Thanks a lot. From the fillrate scores, here's what I think.
I benchmarked a couple of games and 3DMarks with different memory/CPU/drivers earlier this year. The differences from the lowest setting to the highest are quite significant. From there, I noticed that the Core 2 Duo results had almost no variation in the multi-texturing benchmark across 3 different drivers and dual and single channel memory.
So it must show the results without being affected by outside influences, which makes it a good candidate for figuring out which version of the IGP people have.
Results: The average score for the G965 in the multi-texturing benchmark of 3DMark2005 is 1600MTexels/s. The G965 has a 667MHz core clock, the GM965 has 500MHz, and the GL960 has 400MHz.
The core clock of your IGP should be (your MT fillrate / my MT fillrate) x 667MHz, which comes out to 930/1600 x 667MHz = 388MHz core.
According to that you have a GL960, or yours is performing like one. It might be possible that Intel de-clocks the GM965 to GL960 level when it's running on a non-Core 2 Duo CPU, but I haven't seen that stated anywhere except in their claim that it's "optimized for Core 2 Duo".
The drivers I tested were 14.29, 14.31 pre-beta, and 14.31 beta, with dual and single channel DDR2-800. I currently have the 14.32.3 drivers and the results are still in the 1600MTexels/s range, not deviating more than 2%.
(Yours is technically a Core 2 Duo based one, but marketing-wise it's not) -
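(If you want to replay that arithmetic with your own numbers, here's a tiny Python sketch of the same estimate. The 1600MTexels/s and 667MHz reference figures are IntelUser's desktop G965 numbers from above; plug in your own MT fillrate.)

# Estimate the IGP core clock by scaling a known reference clock
# by the ratio of multi-texturing fillrates.
REF_MT_FILLRATE = 1600.0  # MTexels/s, desktop G965 (IntelUser's figure)
REF_CORE_CLOCK = 667.0    # MHz, desktop G965 core clock

def estimate_core_clock(mt_fillrate_mtexels):
    return mt_fillrate_mtexels / REF_MT_FILLRATE * REF_CORE_CLOCK

print(round(estimate_core_clock(929.6)))  # NJoy's 3DMark03 MT run -> 388 MHz
-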
Thanks for your efforts, IntelUser!
The situation gets more interesting))) I just did the 3DMark05 fillrate test; it shows:
ST - 525.5
MT - 957.8
I also ran PC Wizard to get exact info about the chipset and video. It reports a GM965 with the X3100 running at 500MHz. Hmmm.
Trying to figure out what the problem is... my CPU is actually a T5200 with half the cache and a 533MHz FSB. Therefore, my 667 RAM operates at 533 too. Driver - the recent official 15.9 (7.15.10.1472).
I suspect memory bandwidth is the bottleneck... -
What are the current 3DMark 05 scores?
-
I listed the results for my system on the previous page, with different registry values
-
Makes sense, but only for ST. I've told you that in MT, memory bandwidth doesn't impact the scores much. Here are the scores I recorded before.
Core 2 Duo E6600, Single Channel DDR2-800
14.29-1590
14.31pb-1586
14.31b-1587
Dual Channel DDR2-800
14.29-1620
14.31pb-1621
14.31b-1620
As you can see, memory bandwidth impacts are negligible. However, when I tested with Celeron D 2.53GHz...
Single Channel DDR2-800(runs on DDR2-533 mode)
14.29-1384
14.31pb-Not tested
14.31b-Not tested
Dual Channel DDR2-800(runs on DDR2-533 mode)
14.29-1473
14.31pb-Not tested
14.31b-Not tested
It is VERY possible that the IGP also runs at a reduced clock speed with 533MHz FSB processors. I know your CPU is based on Merom, but the official name is Pentium Dual-Core. Marketing-wise it is different, which points to another possibility of purposeful downclocking. -
Probably. I have two options to think about at the moment. Originally, I was thinking about getting a new CPU for this machine, but some time later I will need to work with Alias 3D software, where the X3100 will suck big time. If I get the chance, I'll get rid of this lappy and get something with dedicated graphics.
-
I did some tests
Software VS-GMA X3000/E6600/14.32.3/DC DDR2-800
3DMark2005 score:1064
Game Tests
GT1-5.7
GT2-2.8
GT3-4.9
CPU Tests
CPU Score-7138
CPU Test 1-3.9
CPU Test 2-5.8
Feature Tests
ST fillrate-859.1
MT fillrate-1682.1
Hardware VS
3DMark2005 score: 865
Game Tests
GT1-4.9
GT2-2.2
GT3-3.8
CPU Tests
CPU Score-4518
CPU Test 1-2.7
CPU Test 2-3.4
Feature Tests
ST fillrate-857.7
MT fillrate-1682.1
Notes: Hardware VS has significant rendering problems. The 14.31.1 driver had similar scores for hardware and software VS, but performance also drops in 14.32.3 for hardware. I don't use 14.33 or 14.33.1 because they crash in WoW; I'm hoping 14.34 will bring the best of both worlds. -
StarScream4Ever Notebook Consultant
real gaming performance plz...
-
You know how it performs. Just take what you get on your X3100 and multiply it by 30-50%, since I have a desktop G965. I only have a couple of games that can be benchmarked without using unreliable methods like FRAPS.
-
The sucky thing is that the X3100 doesn't do anything nicely. Older games that would work on a GeForce2 MX work worse, yet newer games are only barely playable at reasonable settings.
The drivers have improved drastically; the fact that this plays HL2 with full HDR nicely is a good sign of what it's capable of -
Well, I don't know what to say to that.
On my old computer with a GeForce4 MX440, I can't get over 15 fps in CS:Source,
but with my X3100 I usually get 30+.
Of course, the CPU and RAM might be a bigger bottleneck on the MX440 -
Neo-Cortex: That won't be true. The GeForce4 MX440 has dedicated RAM while the X3100 uses shared memory; that alone will make the X3100 more sensitive. At 15 fps, the impact of CPU/memory on performance will be almost nothing. AKA GPU-bound.
-
The MX440 is a weak card itself; it's basically a rebranded TNT2 Ultra with no support even for SM 2.0
-
Makes me wonder if we've hit close to peak performance with the new 15.9 drivers. I'm pretty surprised how much faster CS:S runs now. With modifications to my autoexec.cfg file, I'm consistently getting 100+ FPS on low-texture maps at 1280x800 resolution. It really makes the game a joy to play.
-
StarScream4Ever Notebook Consultant
If Intel has already started to shift gears to work on their next-gen integrated card, like the X4xxx series, then yup, this might be the beginning of the last drivers for the X3100.
-
Regardless, games like KOTOR2 ran at a nice framerate at low res with almost all settings high on my GeForce4 420 Go (worse than a 4 MX, BTW), yet on the X3100 it almost died. I think we still have a long way to go before we hit the peak; the Vista drivers are still a whole lot behind the XP drivers.
Oh, and what are your CS:S settings? It didn't run very nicely for me. Also, is it worth feeding it more RAM? -
Maybe he meant 100fps while staring at a wall at the respawn point xD
-
Haha, if that were true, I wouldn't have written it. Here's my autoexec.cfg file. I had to rename it to autoexec.txt for attachment reasons.
By the way, the map I played on with 100+ FPS was Iceworld, with 8+ players. On cs_office, I consistently get 40 FPS and higher with 52 players.
I also use these launch options: -dxlevel 70 -width 1280 -height 800
Oemenia, can you clarify what you mean by "feeding more memory"?
I set the game to preload certain textures to increase performance, and I have 2GB of DDR2 667MHz RAM in my ThinkPad.
-
Unfortunately, I cannot check it right now... because I've completely forgotten the details of my Steam account... lol
-
Oh, OK, what are your CS:S settings? What I mean by feeding it more memory is giving it more system RAM
-
I've never explicitly allocated memory for CS:S through my autoexec settings, but I did notice a big performance boost when I moved from 1GB to 2GB of memory on Vista.
As for my settings, they're all set and detailed in my autoexec.cfg file. I run the game at 1280x800 resolution (which is native for my Thinkpad). -
^^ Wow, awesome. I had tried everything and the framerate was still lame, so high res and nice settings sounds good.
Just one question: where do I place your .txt file? I still think you're misunderstanding me; I mean video RAM fed from your system memory. The X3100 supports up to something like 350MB? -
Oh, I understand. I set the driver memory footprint to high, which allows up to 384MB of memory for graphics.
The X3100's memory usage is dynamic and set by the GPU; it doesn't take up 384MB at all times.
You have to rename the .txt file to .cfg and place it in your CS config folder in Steam. -
I don't seem to have a file called autoexec.cfg as it is; just making sure you're talking about CS:S?
C:\Program Files\Steam\steamapps\[email protected]\counter-strike source\cstrike\cfg
Would that be the place to put it in?
As for the mem, I understand now. Thanks again -
Yes, that's the folder.
You're right that autoexec.cfg is not in there by default; that's because the autoexec.cfg file is user-created. The game looks for it before starting and, if it's there, loads the settings.
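For anyone who can't grab the attachment, a minimal autoexec.cfg for weak hardware might look something like the sketch below. These are common Source engine cvars; the exact contents of the attached file will differ, so treat the values as illustrative and tune to taste.

// autoexec.cfg - illustrative low-end preset for CS:S
fps_max 100        // cap the framerate
mat_picmip 2       // lowest texture detail
mat_specular 0     // turn off specular highlights
mat_antialias 0    // no antialiasing
mat_trilinear 0    // bilinear filtering only
r_rootlod 2        // lowest model detail
r_shadows 0        // no dynamic shadows
mp_decals 0        // no bullet hole/blood decals
-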
Thanks, I tried it. I played at 1280x800; with around 10 characters on screen the framerate dips, but on normal occasions it runs smoothly.
Also, I played around with the settings before using your .cfg; will this affect me? -
No, that won't affect you; the autoexec.cfg file will override them.
As for the framerate dipping, make sure you set -dxlevel 70 in your launch options in Steam. That will help dramatically.
In my opinion, the game still looks very good with those tweaks in place.