The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    X3100 (laptop) vs X300 (desktop)

    Discussion in 'Gaming (Software and Graphics Cards)' started by r!vaL, Jul 11, 2007.

  1. r!vaL

    r!vaL Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    The only game that I play right now is CS:S on my desktop PC which has an ATi X300 graphics card. Would the X3100 perform better than my X300? I'm talking about performance hardware-wise. Like many others, I'm waiting for Intel to release some good drivers for the X3100.

    The Specs on my Desktop are:
    Pentium 4 HT @ 3.6 GHz
    1 GB DDR-400
    ATi X300

    Laptop (FZ145E)
    Core 2 Duo T7100 @ 1.8 GHz
    2 GB DDR2
    Intel X3100
     
  2. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    They are both very slow cards, but as Intel updates the X3100 drivers, the X3100 will probably outperform the X300. You still won't be playing any new games at decent settings, though. The X3100 should be fine for games that are two years old or older.
     
  3. r!vaL

    r!vaL Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    I don't really play new games anymore. All I play now is CS:S and I'm able to run that on my desktop @ 1280x1024 with mid/high settings and get playable fps.

    The only game that I plan on getting in the future is maybe GTA4. I've always been a fan of the GTA series. I'll be going off to college in 2 years and don't want gaming to become a big part of my life.
     
  4. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    CS:S will run well on the X3100, but I doubt GTA4 will be playable at decent settings; it might run at low resolution and details.
     
  5. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,088
    Trophy Points:
    931
    I don't think we should jump to any conclusions . . . I have yet to test my X3100. Somehow I doubt it is going to beat an X300.
     
  6. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    http://deadmoo.com/articles/2006/09/28/intels-new-onboard-video-benchmarked

    Well, on Linux the X3100 manages to beat the X550 (which is an overclocked X300) in UT2004 benchmarks. Either that is some true dedication by Linux Intel users or a very disappointing effort by ATI Linux users. And this was from 10 months ago too, well before Intel decided to release hardware VS and T&L for Windows.

    The above shows that the X3000/X3100 definitely have potential. However, this is completely dependent on Intel releasing decent drivers to unlock this performance, which they probably will, but the question is how patient you are. Intel supposedly won't officially activate all the hardware features of their IGPs until the fall, and then there is still plenty of optimization to do. For an Intel IGP the X3100 is amazing, but if you really want to hit the ground running you'd best look for something from ATI or nVidia.
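
    (Quick aside for anyone curious what their own Linux box is actually using: the usual check is glxinfo from mesa-utils, e.g. glxinfo | grep -i -E "opengl (vendor|renderer|version)", which prints the vendor, renderer string and OpenGL version the active driver exposes. A software renderer or an old Mesa version showing up there often explains odd benchmark numbers.)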
     
  7. r!vaL

    r!vaL Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    Well, I really hope they do release the drivers soon. I'm not much of a gamer anymore; the only game I play right now is CS:S, and I don't even play that as often as I used to. Because I don't play the newest games, or many games for that matter, it seemed kind of pointless to shell out the extra cash for a dedicated GPU.

    Also, I mainly chose this FZ model because it was on sale on Labor Day for $1200, down from $1400. The specs seemed pretty good and I loved the way it looked, so I bought it.
     
  8. FREN

    FREN Hi, I'm a PC. NBR Reviewer

    Reputations:
    679
    Messages:
    1,952
    Likes Received:
    0
    Trophy Points:
    55
    From a benchmark standpoint, the X3100 seems on par with the X300. It may perform better than the X300 sometimes, it may not; it depends on the drivers that Intel puts out.

    If you're only going to be playing CS:S, then the X3100 should be plenty powerful regardless.
     
  9. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Oh so you already have it. Then you should definitely try these drivers:

    http://downloadcenter.intel.com/Det...&OSFullName=Windows* XP Professional&lang=eng

    This link is for XP, but there is a Vista version available now. It is a pre-beta (unsupported) driver, but it demos hardware T&L and VS 2.0 support. Intel only verifies acceleration on a few applications right now for trial purposes, and CS:S is not among them, but you can always try. You might still have to stick to a lower codepath, though, like DX8.1 instead of DX9.0.
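
    (If you do end up needing the lower codepath for CS:S, the usual way is the Source engine's -dxlevel launch option: right-click the game in Steam, open its properties, and set -dxlevel 81 in the launch options (or -dxlevel 90 to switch back). It's a generic Source option rather than anything Intel-specific, and it's worth removing after one launch so it doesn't keep resetting your video settings.)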
     
  10. r!vaL

    r!vaL Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    I'm a little bit wary about installing these pre-beta drivers. I've been keeping up with the X3100 pre-beta drivers thread. I don't really mind waiting for the final version to come out. I heard they were coming out in August or September.

     
  11. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    I have heard the drivers seem a bit finicky. I guess the only defence is that 3DMark is not an approved application for the pre-beta driver. ;) If only Intel would hurry up.
     
  12. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    On a side note, those benchmarks were run with the desktop part (the X3000), which is probably slightly faster than the laptop X3100.
     
  13. FREN

    FREN Hi, I'm a PC. NBR Reviewer

    Reputations:
    679
    Messages:
    1,952
    Likes Received:
    0
    Trophy Points:
    55
  14. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Well, it's probably a wash. The desktop X3000 is clocked at 667MHz while the mobile X3100 is clocked at 500MHz. That said, the X3100 looks to be based on a revised design (bug-fixed rather than redesigned, though), since Intel now only supports (as in future support) DX10 on the X3100 while the desktop X3000 is resigned to DX9.0c. This may help the X3100 a bit more than the lower clock speed makes it appear. The X3100 also supports improved TV output via a driver utility that is also available for the GMA 3000 on the new G33 and the GMA X3500 on the upcoming G35, but not for the X3000.

    (The upcoming GMA X3500 for the desktop G35 also looks to support OpenGL 2.0, versus the current OpenGL 1.5, in addition to DX10. It will also have universal VC-1 hardware acceleration (currently it only accelerates Microsoft's WMV9 implementation) and, most importantly, feature 10 unified shaders. In the mobile version on the Cantiga chipset for Penryn in Q2 2008 it will actually be clocked slower, at 475MHz versus 500MHz now, so hopefully a refined architecture and those 2 extra shaders can make up the difference.)
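
    (Back-of-envelope, and assuming raw shader throughput simply scales with shader count times clock, which is a big if: 8 shaders x 500MHz = 4000 "shader-MHz" today versus 10 x 475MHz = 4750 for Cantiga, or roughly 19% more on paper despite the lower clock. As this whole thread shows, the drivers matter far more than that kind of arithmetic.)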
     
  15. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    There will be 16 unified shaders on the desktop G35/GMA X3500 with similar clock speeds to the G965.

    I doubt the X3100 is faster than the baseline X300, which features a 128-bit memory bus. The AMD 690G's Radeon X1250 barely manages to beat the X300, yet the X3100 is still slower than the 690G.
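
    (For some rough numbers on the bus-width point, hedged because X300 memory clocks varied by card: a plain 128-bit X300 with DDR at around 400MT/s effective gets about 128/8 x 400M = 6.4GB/s of memory bandwidth all to itself, whereas an IGP like the X3100 shares the system's DDR2 bandwidth with the CPU, so the bandwidth actually available to graphics is usually well below the dual-channel paper figure.)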
     
  16. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    http://www.vr-zone.com/?i=4139

    Cantiga is known to have 10 unified shaders at 475MHz in its implementation, and since desktop and mobile versions are generally architecturally identical except for clock speeds and other power-saving features, the X3500 on the G35 most likely has 10 unified shaders as well. The reasoning for a strange number like 10 probably stems from the G35 being directly based on the G965, which is kind of given away by the fact that they share the exact same packaging, only support DDR2, and use the old ICH8 southbridge, unlike the lower-end G33, which has new chip packaging, adds DDR3 support, and uses the ICH9 southbridge. Most likely the G965 and G35 both have 12 shaders, but the G965 only has 8 activated for yield reasons (it was slow to appear because of the transition to 90nm), while yields have finally improved enough for the G35 to activate 10, with 2 extra for leeway.
     
  17. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    If the G35 only has 10 shaders, it would almost spell doom performance-wise. It would still not be enough of an advancement over the two-generation-old GMA 950. Here's a site that talks about the G35: http://www.computerbase.de/news/har.../oktober/idf_gma_x3x00_treiber_leistung_dx10/

    "Mit Bearlake-G+ wird Intel die Anzahl der Unified-Shader-Einheiten von derzeit 8 auf 16 verdoppeln."

    It basically says Bearlake G+ will have 16 unified shaders, up from 8.

    Intel says in the G965 architecture document that its unified shaders are made to be easily scalable. Intel used to make low-end CPUs essentially crippled versions of the top-end CPUs, but now with Core 2 Duo, the latest-stepping 2MB cache models actually have a smaller die size and transistor count, meaning it's a legit 2MB chip. There is also talk that the X38 is a different chip from the P35. I don't see why the shader unit count couldn't be different.

    As for the DDR2-only memory support and the same packaging as the G965, HKEPC explained why some time ago. It says that motherboard manufacturers think the pace of change for chipsets, packaging and everything related is so fast that they wanted the G35 to remain pin-compatible with the G965. The result is that the G35 gets downgraded to DDR2 memory support and similar pairings to the G965. However, they said the G35 chip itself will remain the same architecture-wise.
     
  18. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Well, I guess it's hard to tell. But the G35 has not remained the same hardware-wise. It was originally roadmapped and presented as having fully integrated secure HDMI support, but that was cut when they changed to the G965 packaging. And 10 shaders isn't a big hardware impediment, since Intel still hasn't figured out how to fully activate the GMA X3000's 8 shaders yet. Once everything's done in hardware, there will definitely be a dramatic improvement over the GMA 950.