Intel just released new graphics drivers: 15.8 for Vista and 14.33 for XP. Here are my short impressions of the games I tested*.
Positives:
3DMark05: 1069, the highest score ever for my system. I can't seem to switch between hardware and software mode any longer, though, and I can't check which was used via System Details in 3DMark. The score is supposed to be with hardware mode, if it's activated properly.
Call of Duty 4 Demo-No longer needs to be run in Safe Mode
Crysis SP Demo-Previous drivers had a bug where character movement wasn't shown properly: the whole body would sometimes move out of sync with the head, and you wouldn't see arms and legs moving, just the whole body sliding around. It no longer has that problem.
Doom 3 Demo-Significant performance improvement at higher quality settings. I'm pretty sure I haven't seen it run this fast before; estimating 50%+.
Need for Speed: Underground-26 fps at 1024x768 High, which seems to be around 30% faster than before
Prey Demo-20-30% faster
Quake 4 Demo-20-30% faster
World in Conflict Demo-It used to crash the whole system; now it loads the game (yay!). The in-game benchmark at 1024x768 Low gives:
Avg: 12, Min: 6, Max: 38
Negatives:
Crysis-Setting aside the character bug being fixed and the performance improvements, there is a bug I find very annoying: the water looks like it's interlaced, and it gets worse at higher water quality settings. Everything else is OK.
Need for Speed: ProStreet-Enabling smoke still crashes the game/system.
Unreal Tournament 2007-Has significant texture corruption; it's not playable because of it. White/grey flashing textures all over, and guns are interlaced just like the water in Crysis.
Overall loading time seems to have increased.
Link for Vista 15.8: http://downloadcenter.intel.com/fil...32-bit+version&lang=eng&strOSs=156&submit=Go!
Link for XP 14.33: http://downloadcenter.intel.com/fil...XP+Professional&lang=eng&strOSs=44&submit=Go!
(Not a DX10 driver yet)
*Performance shown may or may not indicate the performance of your system.*
-
Please do not flame this thread, and I'd prefer guys like Unreal to stop posting false or inaccurate information. I'd like it to stay open.
-
StarScream4Ever Notebook Consultant
Nice, I'm going to test it further.
-
Upon further inspection, Doom 3 just seems to have had its maximum fps ceiling raised. It still runs better than it used to. Hopefully this is the start of a trend, not the other way around.
Crysis also performs better. There was part of the cutscene in the very beginning where it would go below 1 fps. Now it gets 4-5 fps. -
Wow... up to a 50% performance increase, and maybe more, just from a driver change?
-
Just tested 3DMark05 on Vista. Even though it doesn't mean much, my Vista Experience score went down from 3.5 to 3.4, and my 3DMark05 score stayed about the same, though I did notice better fps in the firefly scene. Score: 878
-
Does Doom 3 have a benchmark utility?
-
Pro:
None
Cons:
Doom 3 is around 50% slower than the August version of the driver. Parts with lighting are reaching a new low. (If you want to play Doom 3, try the August WinXP version of the driver and set seta r_skipPostProcess "1" in the autoexec.cfg; I swear you'll get really good fps. See the snippet at the bottom of this post.)
Unreal Tournament 3 doesn't work anymore (which I don't play anyway, since it was too slow to run)
Other games perform the same as before (NFS:U 2)
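For reference, a minimal autoexec.cfg with that tweak would just contain the single line below. I believe it goes in the base folder under your Doom 3 install and can be created as a plain text file if it isn't there already, but double-check that against your own setup:
seta r_skipPostProcess "1"
-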
I am having serious problems with 14.33. I got 878 in 3DMark05 (I was getting 915 before).
Also, in hardware mode I got 571 points! I was getting around 700 previously. To test hardware mode, simply rename 3DMark05.exe to something else and then run it.
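For example (just one way to do it; the new name here is arbitrary, and you can rename it back when you're done), from a command prompt in the 3DMark05 install folder:
ren 3DMark05.exe 3DMark05test.exe
3DMark05test.exe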
Perhaps the drivers didn't install properly. I noticed a ton of graphical glitches when running in hardware mode. In fact, it was so bad you could barely tell what was going on.
*edit* After reinstalling, hardware mode in 3DMark05 still looks very, very strange. Can someone else confirm this please? -
I'll switch back to the 14.31.1 release as soon as I have time.
I'll be curious to see benchmarks of 14.31.1 vs. 14.33. If anyone has time, it would be interesting to see. -
StarScream4Ever Notebook Consultant
Same here, I'm gettin graphic glitches when running COH:OF and CivIV. (sigh) so much for gettin my hopes up. =/
-
That's weird; I will post screenshots.
EDIT: Maybe it's because I am running demo versions?? I don't know. -
I thought the performance should have been indicative of both the IGPs because they are basically the same architecture.
EDIT: I guess I forgot how to change software/hardware manually. The 1069 score was with software mode. Hardware mode does show image corruption, same with 3DMark01.
Doom 3 only got its max fps significantly increased; the average fps didn't seem to change much, maybe 5-10%. I used to get only 15-20 fps as the max, now I get 40-50. Whether that changes anything is another question. -
Some screenies:
CoD4 (just for the heck of it; it doesn't crash at default settings anymore)
Crysis-No more character glitches, but the water is interlaced. It used to show no hands on the gun, just a hand in the top middle of the screen, and the other characters didn't look or move right. There used to be situations where it literally became a slideshow. Although the average fps didn't improve, the minimum fps did (at least in the cutscene at the beginning: the part of the aircraft cutscene showing the white guy on the left would always lag the game like hell, but now it doesn't).
1:
2:
Doom 3 1024x768 Low
Doom 3 1024x768 High
Upon further testing, average fps did not seem to change too much for Doom 3, but the highest fps is higher.
World in Conflict-IT RUNS IN XP!! (ignore the FRAPS fps for this one)
-
StarScream4Ever Notebook Consultant
Nice, yeah, I see it now in Crysis, the water glitches.
-
The new drivers still didn't fix the issue with scrolling in CoH:OF. Ah well.
-
I don't get it; other than the specific ones mentioned, performance does not seem to be worse. All the other games seem to run just as well as they did before for me. I noticed that the forum with desktop X3000 users is much more positive. Perhaps the X3100 is too weak.
It's not amazing, but it's not TERRIBLE either. I, for one, am glad that things are starting to change. Maybe it wasn't a good idea to compare the two after all. -
Anyway, yeah, I didn't notice any real difference in performance at all with the new drivers, and I'm still getting the DirectX error in CoD 4 as well, but oh well. -
What settings did you set it at? -
I crashed in the demo on the 14.32.3 driver, so I went back to older drivers to check it out. With everything on low it's OK, but set it to "Optimal System Settings" and it'll just crash. I get no DX message.
-
I will try it out in a few days.
-
Intel's presentation about Eaglelake/Cantiga says that vertex processing performance will be 2x the performance of Gen 4 (GMA X3x00).
Never buy Intel's first gen graphics.
-
I didn't notice much difference with the drivers. So far I've only done 3DMark01 and 06 and tested Perfect World.
Hey Unreal, what is your 3DMark score in 06 or 01, by the way? -
Woah, the X3100 can run Crysis?
-
StarScream4Ever Notebook Consultant
heh, yea Crysis in slideshow
-
StarScream4Ever Notebook Consultant
We all do have another notebook with dedicated cards; we're testing the X3100's performance after every new driver release.
-
Ah hell, now I will provide a description of my setup with every test I do.
This is my system:
Core 2 Duo E6600
2x1GB DDR2-800 5-5-5-15 Transcend
WD360GD Raptor+ Seagate 160GB SATA 2 8MB
Intel DG965WH
Intel GMA X3000
X3000: 667MHz core clock
X3100: 500MHz core clock (GM965), 400MHz (GL960)
Memory bandwidth and core clock speeds are critical to performance. The Intel IGP is very constrained by memory bandwidth, more than any other IGP before it. It is possible that the X3100 is limited by pixel throughput and memory bandwidth before the limitations of the vertex shader kick in. That way, no matter how fast the Execution Units perform, you won't get much performance benefit. I heard that software mode runs really well compared to hardware mode on the X3100, which isn't as much the case with its bigger brother, the X3000. Running vertex shaders on the CPU frees up the GPU for other tasks, like doing more pixel processing. -
Why would anyone buy a desktop with the X3000 when a dedicated graphics card is just as cheap and 10 times better?
-
Actually, every discrete card costs something to add to the system. Realistically, I'd have to shell out at least $50 for extra graphics, which some users may care about (in this case, me). I also like having no small high-speed fans spinning, and lower power consumption.
To give an example, I played World of Warcraft at 800x600 at Low-Med settings with a Celeron D 2.53GHz, 1x DDR2-533, and GMA950, until one day I realized I could stand to go a little lower on performance. So I made it 800x600 High-Med settings (notice I put High before Med). The frames would dip as low as 5-6 fps, with the average standing around 13-15 fps, but I did play for a fair bit of time. I also finished the demo version of Prey (okay, I admit that was too far...).
Spending some money to go from playable fps to an even higher one?? I couldn't care less. So, it's justified.
Back to the topic. -
"I also finished demo version of Prey(okay I admit that was too far...)."
hahaha that wa a good one
I also played Wow well on my old laptop GMA 945 Celeron 1.6 Ghz and 1 GB ddr2 -
The Celerons in laptops never used NetBurst (P4) variants. The Celeron D for desktops, on the other hand...
I also used single-channel memory; I wouldn't be surprised if my previous system actually had lower performance than yours XD. -
I love these new drivers, I can finally play World in Conflict!
-
StarScream4Ever Notebook Consultant
Well, I tested Crysis (full version) out with the new driver. Yeup, like IntelUser said, there is a huge glitch that causes the water to have a gradient effect. And cutscenes are still painfully slower than the gameplay! Average FPS was a dreadful 5-10! I managed to record the first few scenes from the game, when you first land on the island. I'll try to record more video and post it up soon, unless I decide it's not worth further damaging the quality of Crysis itself.
Here's the link: http://www.youtube.com/watch?v=p6X3K2Rkh1E -
Crap, I had to roll back to the 14.32.3 driver. Though the performance in general didn't seem to be worse, I was crashing like mad in WoW. It would freeze up on me every 5 minutes or so in certain zones.
So I guess here's how my read of the version numbers goes. In a driver version, say 14.31 vs. 14.31.1, the first number (14) means it's for XP, the second one means it's a new release, and the last one means it's a bug fix for that release.
14.31 had the hardware T&L switching problem, and 14.31.1 fixed that. 14.32 had the default resolution problem, and 14.32.3 fixed that. 14.33 has a crashing problem, so I guess I'll have to wait for 14.33.1, ARGH! -
Of course a dedicated card is faster, and yes, it doesn't cost much (35-75), but for people who don't play much it makes all the difference whether a game runs acceptably at 800x600 or as a slideshow.
An IGP should be able to handle up to 800x600 / 1024x768 with standard settings at 30fps, and PCIe cards should handle higher quality/res/speed, so it's pretty normal that we want decent graphics/performance.
Nevertheless, I think IGPs should be only for laptops (to save TONS of battery) or for work/Media Center PCs (where good video is needed), and as such, I understand we 950/X3100 users should ask for more from Intel. X3000 owners: don't wait, buy a 35 card please.
My laptop with X3100 + C2D T7100 at 1.8GHz + 2GB 667MHz is slower (in games) than my old Centrino 1.6 with ATI Radeon 9700 64MB. I expected my X3100 with such a CPU/memory to be *at least* as fast as my friend's laptop (Centrino 2.0 + ATI X700).
I'm waiting for that driver that enables full hardware performance! I hope it doesn't take long, Intel! -
rmcrys: A couple of games, I've heard, are much slower on the X3100 than on the X3000. Combine the CPU and GPU differences, and some games that are playable on the X3000 become unplayable on the X3100. Need for Speed: Underground is one example: I heard that an X3100-based laptop gets something like 20 fps at 640x480 Low, while I can get higher than that at 1024x768 High.
A substantial number of games that are playable on the X3000 aren't on the X3100. So it's either higher battery life + slow, or a little bit cheaper + acceptable, which IMO doesn't make one more desirable than the other. -
-
Has anyone tried Far Cry with the new driver? Are there still texture and draw-distance problems?
Just putting it out there -
@Mikebob
I've been playing Far Cry since the June or July driver release, and it runs @ 30fps!
Patch the game to the latest version.