The only game I play right now is CS:S on my desktop PC, which has an ATi X300 graphics card. Would the X3100 perform better than my X300? I'm talking about raw hardware performance. Like many others, I'm waiting for Intel to release some good drivers for the X3100.
The specs on my desktop are:
Pentium 4 HT @ 3.6 GHz
1 GB DDR-400
ATi X300
Laptop (FZ145E):
Core 2 Duo T7100 @ 1.8 GHz
2 GB DDR2
Intel X3100
-
They are both very slow cards, but as Intel updates the X3100 drivers, the X3100 will probably outperform the X300. You still won't be playing any new games at decent settings, though. The X3100 should be fine for games that are two years old or older.
-
I don't really play new games anymore. All I play now is CS:S and I'm able to run that on my desktop @ 1280x1024 with mid/high settings and get playable fps.
The only game I might get in the future is GTA4; I've always been a fan of the GTA series. I'll be going off to college in 2 years and don't want gaming to become a big part of my life. -
CS:S will run well on the X3100, but I doubt GTA4 will be playable at decent settings; it might run at low resolution and details.
-
Charles P. Jefferies Lead Moderator Super Moderator
I don't think we should jump to any conclusions... I have yet to test my X3100. Somehow I doubt it is going to beat an X300.
-
ltcommander_data Notebook Deity
http://deadmoo.com/articles/2006/09/28/intels-new-onboard-video-benchmarked
Well, on Linux the X3100 manages to beat the X550 (which is an overclocked X300) in UT2004 benchmarks. Either that is some true dedication by Intel Linux users or a very disappointing effort by ATI Linux users. And this was from 10 months ago too, well before Intel decided to release hardware VS and T&L for Windows.
The above shows that the X3000/X3100 definitely have potential. However, this is completely dependent on Intel releasing decent drivers to unlock that performance, which they probably will; the question is how patient you are. Intel supposedly won't officially activate all the hardware features of their IGPs until the fall, and then there is still plenty of optimization to do. For an Intel IGP the X3100 is amazing, but if you really want to hit the ground running you'd best look for something from ATI or nVidia. -
Well, I really hope they do release the drivers soon. I'm not much of a gamer anymore; the only game I play right now is CS:S, and I don't even play that as often as I used to. Because I don't play the newest games, or many games for that matter, I found it kind of pointless to shell out the extra cash for a dedicated GPU.
Also, I mainly chose this FZ model because it was on sale on Labor Day for $1200, down from $1400. The specs seemed pretty good and I loved the way it looked, so I bought it. -
From a benchmark standpoint, the X3100 seems on par with the X300. It may perform better than the X300 sometimes, it may not; it depends on the drivers that Intel puts out.
If you're only going to be playing CS:S, then the X3100 should be plenty powerful regardless. -
ltcommander_data Notebook Deity
http://downloadcenter.intel.com/Det...&OSFullName=Windows* XP Professional&lang=eng
This link is for XP, but there is a Vista version available now. It is a pre-beta driver (unsupported), but it enables hardware T&L and VS 2.0 support. Intel only verifies acceleration on a few applications right now for trial purposes, and CS:S is not among them, but you can always try. You might still have to stick to a lower codepath, though, like DX8.1 instead of DX9.0.
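If you do end up trying that, a quick note (this is a standard Source engine launch option, nothing specific to the Intel driver): you can force CS:S onto the DX8.1 path by adding -dxlevel 81 to the game's launch options in Steam (right-click the game > Properties > Set Launch Options); -dxlevel 90 switches it back to the DX9 path. -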
I'm a little bit wary about installing these pre-beta drivers. I've been keeping up with the X3100 pre-beta drivers thread. I don't really mind waiting for the final version to come out. I heard they were coming out in August or September.
-
ltcommander_data Notebook Deity
If only Intel would hurry up.
-
www.notebookcheck.com says the X3100 performs around the level of the X300.
-
ltcommander_data Notebook Deity
(The upcoming GMA X3500 for the desktop G35 also looks to support OpenGL 2.0, versus the current OpenGL 1.5, in addition to DX10. It will also have universal VC-1 hardware acceleration (currently it only accelerates Microsoft's WMV9 implementation) and, most importantly, feature 10 unified shaders. The mobile version on the Cantiga chipset for Penryn in Q2 2008 will actually be clocked slower, at 475MHz versus 500MHz now, so hopefully the refined architecture and those 2 extra shaders can make up the difference.)
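As a rough back-of-the-envelope (just multiplying shader count by clock, which ignores any per-clock efficiency differences): 10 shaders x 475MHz = 4750, versus the current 8 x 500MHz = 4000, so roughly 19% more raw shader throughput on paper. -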
I doubt the X3100 is faster than the baseline X300, which features a 128-bit memory bus. The AMD 690G's Radeon X1250 barely manages to beat the X300, yet the X3100 is still slower than the 690G. -
ltcommander_data Notebook Deity
Cantiga is known to have 10 unified shaders at 475MHz, and since the desktop and mobile versions are generally architecturally identical except for clock speeds and other power-saving features, the X3500 on the G35 most likely has 10 unified shaders as well. The reasoning for a strange number like 10 probably stems from the G35 being directly based on the G965, which is somewhat given away by the fact that they have the exact same packaging, support only DDR2, and use the old ICH8 southbridge, unlike the lower-end G33, which has new chip packaging, supports DDR3, and uses the ICH9 southbridge. Most likely the G965 and G35 both physically have 12 shaders, but the G965 only has 8 activated for yield reasons (and it was slow to appear because of the transition to 90nm), while yields have finally improved enough for the G35 to activate 10, leaving 2 extra for leeway. -
"Mit Bearlake-G+ wird Intel die Anzahl der Unified-Shader-Einheiten von derzeit 8 auf 16 verdoppeln."
It basically says Bearlake G+ will have 16 unified shaders, up from 8.
Intel says in the G965 architecture document that its unified shaders are made to be easily scalable. Intel used to make low-end CPUs essentially crippled versions of the top-end CPUs, but now with Core 2 Duo the latest-stepping 2MB cache models actually have a smaller die size and transistor count, meaning it's a genuine 2MB chip. There is also talk that the X38 is a different chip from the P35. I don't see why the shader unit count couldn't be different.
As for the DDR2-only support and the same packaging as the G965, HKEPC explained why some time ago: motherboard manufacturers felt the pace of change for chipsets, packaging, and everything related was so fast that they wanted the G35 to remain pin-compatible with the G965. The result is that the G35 gets downgraded to DDR2 support and similar pairings to the G965. However, they said the G35 chip itself remains unchanged architecture-wise. -