I believe the G35/GM965 won't get most of the new features arriving with the 15.9 driver. AVC support isn't in the hardware at all, and except for possibly one, most of the power-saving features aren't in the GM965 either.
MPEG hardware acceleration and partial VC1 acceleration are in the G35/GM965, and it looks like the 15.9 driver will enable them. From what I understand about DX10, again, the G35/GM965's hardware support for it looks either poorly done or minimal, and some of it will fall back to software. It's the G45/GM45 that will be a true DX10 architecture with new power-saving features and full HD video acceleration (AVC/MPEG/VC1).
From rumors (which I find off the net, not that I'm close to Intel or anything...), Intel basically abandoned G35 development, which is why almost nobody talks about that chipset and there is only one motherboard for it. They have moved most of the engineers to the G45 chipset, and so far it looks good. At least Gigabyte currently seems to be happy with it.
-
I think I have figured out why the X3000 architecture is slow, not just in modern games but even in older games like Civ 4.
An old Anandtech review about the R300(Radeon 9700) gave me some hints: http://www.anandtech.com/showdoc.aspx?i=1656&p=6
This is why:
"The R300 features four programmable vertex shader pipelines, much like the Matrox Parhelia. The difference between the R300 and the Parhelia is that the R300 has a much improved triangle setup engine, so while the Parhelia can boast incredible vertex throughput rates, the GPU is still limited by its triangle set-up engine. This reason is why, in games with very low triangle counts, that the Parhelia was not able to outperform even the GeForce4 with fewer vertex shader pipelines; with a triangle set-up engine no more powerful than the GeForce4s, and running at a lower clock speed, the Parhelia did not have the triangle throughput power to backup its vertex shader pipelines."
Older integrated Intel graphics did not feature hardware vertex shaders or transform & lighting, for obvious reasons: die size, heat, and cost.
People were happy when the G965/GM965 arrived with talk that it would have hardware VS/T&L. Over a year after its introduction, people couldn't be more unhappy about it.
It's simple: X3000-based IGPs have a poor VS/T&L implementation. Just as Intel saved die size by not implementing hardware VS/T&L on Extreme Graphics and the older GMA parts, Intel saved die size on the X3x00 by including a low-performance hardware T&L/VS unit. The ONLY advantage of hardware T&L/VS on the X3x00 is compatibility.
Let me show you how the VS/T&L on the X3000 performs. What Intel created with the X3000/X3100 is this:
1. Poor polygon throughput and poor vertex shader performance mean that in older games (and newer games at low resolutions), performance is low.
2. Low fillrate (1.06 GTexels/s single-textured, 2.13 GTexels/s multi-textured for the 667 MHz X3000) and a low number of unified Execution Units guarantee that performance is low in older games at high resolutions and in newer games with new graphics features.
3. The conclusion is that you have a GPU that performs worse than the GMA 950 in older games but faster in newer ones (which doesn't matter, because the improvement is still not enough).
4. If you cannot tolerate below 40 fps, the ONLY 3D advantage the X3000/X3100 gives over the GMA 950 is compatibility. -
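For what it's worth, the fillrate figures quoted in point 2 can be sanity-checked against the clock speed. Here's a quick sketch in Python; the inputs are the numbers from that point, and the per-clock rates are derived from them:

```python
# Sanity-check of the X3000 fillrate figures quoted above.
clock_hz = 667e6    # X3000 core clock
single = 1.06e9     # quoted single-textured fillrate (texels/s)
multi = 2.13e9      # quoted multi-textured fillrate (texels/s)

print(round(single / clock_hz, 2))  # ~1.59 texels per clock, single-textured
print(round(multi / clock_hz, 2))   # ~3.19 texels per clock, multi-textured
print(round(multi / single, 2))     # multi-textured rate is ~2x single
```

So the multi-textured figure is almost exactly double the single-textured one, but the raw per-clock rate itself is modest, which fits the picture above.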
Did Intel screw up the design of the X3100, or does it simply represent the most powerful chip possible under very tight wattage constraints?
I will probably never buy a notebook with discrete video (too hot & power-hungry) so I just have to hope Intel gets it right in the next generation. -
It has to do with the die size/power consumption/transistor count constraints I mentioned in the previous post. The G35 chipset has 45 million transistors for the entire GMCH, which includes the PCI Express controller, GPU, and memory controller. For the G965 chipset I think it was somewhere around 35 million, with 70% of the die taken up by the GPU core. So we are looking at a 25-million-transistor GPU (even more amazing, the transistor count of the G965 is rumored to be 3x that of the 945G chipset with GMA 950). The AMD 780G's HD3200 IGP has 205 million transistors!!
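To put those transistor budgets in perspective, here's a back-of-the-envelope sketch in Python. The 35M, 70%, and 205M figures are the estimates quoted above, not official Intel/AMD numbers:

```python
# Rough comparison of IGP transistor budgets from the figures above.
g965_gmch = 35e6        # estimated total G965 GMCH transistors
gpu_fraction = 0.70     # share of the die reportedly taken by the GPU core
hd3200 = 205e6          # AMD 780G HD3200 IGP transistor count

g965_gpu = g965_gmch * gpu_fraction
print(round(g965_gpu / 1e6, 1))     # ~24.5 million transistors for the GPU
print(round(hd3200 / g965_gpu, 1))  # HD3200 has roughly 8x that budget
```

An order-of-magnitude gap like that makes the performance gap a lot less surprising.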
Intel's presentation for their 5th-generation IGP core (1st: Extreme Graphics, 2nd: Extreme Graphics 2, 3rd: GMA 900/950, 4th: GMA X3000/X3100/X3500, 5th: GMA X4500) says performance will be 2x the current IGP in 3DMark06. I just hope the areas where it's performing really badly improve by more than that. They also say texture filtering performance will be better and VS performance will be 2x per clock (and it's 65 nm; the GM965 is 90 nm), so here's hoping something good comes out of it! -
StarScream4Ever Notebook Consultant
So I guess we can close the chapter on X3100 performance?
-
BTW, did Vista SP1 do anything for anyone??
(14.33.1 is out, 15.8.1 is not, hopefully soon) -
Do the motherboard's drivers influence the quality of the video displayed by the X3100?
-
ToxicBanana Notebook Consultant NBR Reviewer
-
You are correct about not letting them off the "hook". The X4500 will use basically the same drivers. -
I am having some seriously weird problems with the X3100 on a Sony VAIO CR21S.
The thing is, everything works fine except video playback: XviD & DivX.
I tested on both Vista Ultimate with all the updates and the latest drivers, and XP SP2 with everything.
The image appears blocky. I tested both with video codecs and without, with BSPlayer & VLC. Some videos look worse than others.
Is there any way to adjust the video settings in the BIOS, like on a normal PC?
I don't know what else to do. Any help would be highly appreciated. I am getting desperate.
Thank you! -
-
To avoid sifting through all 52 pages... which drivers perform best for Windows XP? I have a MacBook with XP installed to play some games... the X3100's performance under Leopard is probably half that under Windows. It's pathetic.
-
Updated hardware, on the other hand (like the X4500), gives features no driver will ever be able to provide. The X4500 has more Execution Units, a higher clock speed, more bandwidth (when equipped with faster memory), and a better architecture (like faster Vertex Shaders!!). Now that's something that can't be "unlocked". You can't unlock something when there is nothing to unlock in the first place. -
Yes, but that isn't to say the X3100 isn't capable. The reason it sucks is simply down to drivers. Heck, stuff like KOTOR 2 runs like crap, whereas Far Cry and FEAR run quite well on reasonable settings. What I'm saying is that the drivers will make it a LOT better; that doesn't mean anything on par with the X4500, which as you said is mostly just a faster version, and as we know, lowering settings hardly helps the X3100.
-
How badly does KOTOR 2 run?? Is it worse frame-wise than Far Cry??
-
StarScream4Ever Notebook Consultant
-
That IS a driver issue, unfortunately. I was thinking you'd get something like 30 fps at least.
I also believe TF2 and Portal can run better, and maybe Doom 3 and other Doom 3 based games. -
StarScream4Ever Notebook Consultant
-
Should I upgrade to the latest drivers? I have the one which was released late December.
-
I managed to get Half Life 2 Episode 2 to work on my T61 14.1 inch.
-
What's the story with the GMA 3100 and X3100? What are the differences, and how much faster is the latter? I have a machine here, but it's the older GMA 3100; might see what I can get out of it... I think they did release drivers enabling hardware features for the GMA 3100...
-
I have the Mobile Intel(R) 965 Express Chipset Family connected to a 1440x900 external display, but I have a kind of "refresh problem". It's like having transparent horizontal lines across the whole screen. Has anyone had this problem?
-
A major problem I've experienced with the GM965 Vista drivers (past and present) is the inability to sync properly to an external monitor at its native resolution.
I have a Samsung lnt4071f lcd tv which has a native resolution of 1920x1080. I cannot for the life of me get my notebook to extend its desktop onto the external display at the correct resolution. The highest resolution choice appears to be 1600x1200.
I've tried manually setting the horizontal and vertical refresh rates via 3rd-party software, but it doesn't resolve the problem. I've contacted Intel directly about it, but they say they are unable to reproduce the problem in the lab. -
Yes, it seems like a refresh problem, but the VGA output supports my display perfectly. A 12" notebook without the ability to use an external display is useless.
-
Desktop
G965=GMA X3000
Q965/Q963/946GZ=GMA 3000
G35=GMA X3500
G33=GMA 3100
Laptop
GM965=GMA X3100
GL960=GMA X3100(lower clock)
The difference between the versions with an X and without: the X versions (X3000/X3100/X3500) have full DX9 SM3.0 support with hardware vertex shaders, while the non-X versions have no hardware support for them.
-
Thanks for that. I was getting confused - I thought X3100 was a rebranding for the mobile version. At least now I know not to bother wasting my time trying out games before I get a new card for said machine.
Why does the G965 get the X3000 and the G33 the GMA 3100? Or is the desktop X3000 not actually part of the same family as the X3100, and just happens to share a similar name? Or is the G33 just cheapo... -
-
I just found this on the Intel IGP forum (scroll down to the bottom of the topic). It looks to be a beta version of the 15.9 driver. I haven't installed it yet; read on!
http://softwarecommunity.intel.com/isn/Community/en-US/forums/9/30229561/ShowThread.aspx#30229561
http://www.computerbase.de/downloads/treiber/grafikkarten/intel_grafiktreiber/ -
Use Google's translator to translate from German to English:
http://www.google.com/translate_t
http://www.computerbase.de/news/tre...l/intel_erster_directx-10-treiber_gma-grafik/ -
Just tested in 3DMark05 and got the highest score I've ever had (900 3DMarks).
-
http://www.computerbase.de/downloads/treiber/grafikkarten/intel_grafiktreiber/
I usually got 750 on average, but my highest score before was 830. Of course, 3DMark is just a benchmark; to really see if there is a performance increase, I'll have to test a few games. So far I've tried WoW, which ran fine before, but it seems I may have gained 2-3 fps, nothing significant. -
According to the links there are some image quality issues with the beta 15.9 driver so they are not recommending it to everybody.
Apparently the fastest driver for most games is 15.4.4, which is the last one to use software T&L. In some cases the speedup with 15.4.4 is dramatic compared with later drivers. -
On average I get about 20-30 fps, but in higher-detail areas the fps occasionally dips as low as 16. That doesn't happen too often and doesn't last long; fps goes as high as 30+, especially inside buildings. I've noticed that the view distance dramatically affects fps. If you set the view distance to High, you may get as low as 10 fps (again, only in certain higher-detail areas), while if you set it to Low, you get a constant framerate of about 25-35+ fps. I have my view distance set to about Medium, so I get 20-30 fps on average, going as high as 30+.
Settings: Terrain distance (view distance): Medium, Terrain detail: Low, Spell detail: High, Environment detail: Far (High), Ground clutter & radius: Low, Texture resolution: High, Texture filtering: Low, Specular lighting: enabled, Resolution: 1280x800
Also, again, this is just the beta driver, so don't be surprised if any problems arise. Time to test more games... -
From that link about 15.9 beta driver:
The games that run better are the ones that already run pretty well: Far Cry, CS: Source, Half-Life 2.
Those games got a 25-40% gain according to the site.
It would be more accurate to test the gain using hardware mode in 3DMark, since most games run in hardware mode; that makes it more meaningful.
Don't expect better 3DMark scores, because 3DMark scores represent the best-case scenario and probably won't improve. -
Tried out the new drivers. Company of Heroes finally works properly! No more scroll lag! Gonna mess around with CoD4 and Portal, will post results. -
Just tested 3DMark05 in hardware mode; there are still artifacts. Got a score of 548.
-
I'm still on the 15.8 Intel drivers, and thought I'd let people know Adobe Flash has been upgraded to 9.0.124.0. With this version, with hardware acceleration enabled in Flash settings, it no longer gives a grey screen.
So maybe I was wrong
Or maybe it just degrades gracefully now if it doesn't find supported HW accel -
So whats the general consensus, any improvements so far?
-
In addition to setting a relatively low view distance, I've found that it's not a good idea to set texture filtering higher than the 2nd-lowest notch. A higher texture filter setting will not lower the frame rate much, but it induces a very noticeable delay in mouse movement, especially in areas high above the ground.
Setting Terrain Detail to high only seems to cost about 1 fps on my system.
Specular Lighting thankfully only costs about 1 fps also. Not sure I could live without this effect.
I tried installing the 15.4.4 drivers, the last ones with software-only T&L, but I experienced a 15% drop in FPS compared with the 15.9 drivers.
The only annoying thing is that Vista keeps asking if I'm sure if I want to run the driver executables (igfxsrvc.exe, etc) each time I boot up. -
The problem with setting low view distance in games like MMORPGs is that it actually sacrifices gameplay a bit. It's actually useful to have far view distances.
-
StarScream4Ever Notebook Consultant
So when the final 15.9 driver is released, are we expecting it to be the driver we've all been waiting for since the darn thing launched?
-
This further proves that one day we'll be able to play games from 2005, 2006, and 2007 on Low settings at a playable framerate. Heck, shader-heavy games like Republic run at a playable framerate, whereas KOTOR 2 is crippled. Again, once the drivers get good, we can do SOME gaming with this chip. -
StarScream4Ever Notebook Consultant
-
Unfortunately, KOTOR is still trash, though considering the kind of game it is, it's still playable. Max Payne 2 also maintains high framerates but constantly dips, making it look like it's going into slo-mo on its own. -
That's a good sign. Maybe we'll actually see some of the improvements that should have been there from the beginning. The drivers following 15.6/14.31 didn't seem to change performance much for the most part.
Intel X3100 users rejoice! It's finally here! New Pre-Beta Drivers
Discussion in 'Gaming (Software and Graphics Cards)' started by epictrance4life, Jun 7, 2007.