The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was compiled by NBR forum users between January 20 and January 31, 2022, to preserve the valuable technical information posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Intel GMA X3100 shader support?

    Discussion in 'Gaming (Software and Graphics Cards)' started by AmazingGracePlayer, Jul 16, 2007.

  1. AmazingGracePlayer

    AmazingGracePlayer Notebook Deity

    Reputations:
    236
    Messages:
    1,737
    Likes Received:
    12
    Trophy Points:
    56
    What's the shader support on the Intel Graphics Media Accelerator X3100? Does it support up to Shader Model 3.0?
    How does its gaming performance compare to the GeForce 7400?
     
  2. msiner

    msiner Notebook Consultant

    Reputations:
    5
    Messages:
    127
    Likes Received:
    0
    Trophy Points:
    30
    The 965 documentation lists "Hardware Pixel Shader 3.0" as a feature.
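    For context on how a game actually sees that feature: under Direct3D 9, the advertised pixel-shader version is read from the device caps as a packed DWORD. A minimal decoding sketch (Python used purely for illustration; the packing mirrors the D3DPS_VERSION macro from d3d9caps.h):

```python
# Decode a Direct3D 9 PixelShaderVersion DWORD (illustrative sketch only).
# The high word is 0xFFFF; major lives in bits 8-15, minor in bits 0-7.

def ps_version(major: int, minor: int) -> int:
    """Mirror of the D3DPS_VERSION(major, minor) macro from d3d9caps.h."""
    return 0xFFFF0000 | (major << 8) | minor

def decode(version_dword: int) -> tuple[int, int]:
    """Extract (major, minor) from a packed shader-version DWORD."""
    return (version_dword >> 8) & 0xFF, version_dword & 0xFF

# A card reporting "Pixel Shader 3.0" in its caps would expose this value:
caps_value = ps_version(3, 0)
print(decode(caps_value))  # -> (3, 0)
```

    This is only the packing convention; whether a given driver actually reports 3.0 in those caps is the question debated below.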
     
  3. AmazingGracePlayer

    AmazingGracePlayer Notebook Deity

    Reputations:
    236
    Messages:
    1,737
    Likes Received:
    12
    Trophy Points:
    56
    So it IS possible to play certain games that require Shader Model 3.0?
     
  4. msiner

    msiner Notebook Consultant

    Reputations:
    5
    Messages:
    127
    Likes Received:
    0
    Trophy Points:
    30
    That just means that it will support that feature. Whether or not the X3100 offers the complete performance needed for any specific game is another question. It can allocate something like 256 MB of memory, so that is not much of a bottleneck either. You just have to remember that the chipset has other duties to perform and it is sharing the memory with the CPU. So overall performance is where the integrated graphics can fall short. But it does seem like X3100 has at least the ability to support many features to some extent.
     
  5. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    It will support Shader 3.0. It should be approximately 60-65% as powerful as the GeForce 7400, depending upon the application.
     
  6. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Huh?? On what basis?? So far it's slower than the AMD 690G. Plus, it doesn't support Shader 3.0 because the drivers aren't ready yet, and even the 2.0 support is still in beta.
     
  7. msiner

    msiner Notebook Consultant

    Reputations:
    5
    Messages:
    127
    Likes Received:
    0
    Trophy Points:
    30
    First of all, you do know that he was saying the Intel is 35-40% SLOWER than the GeForce, right? Second, the hardware spec does say that it supports Shader 3.0. I cited the white papers in my initial post; please cite a source for your claim that the Shader support is not available.
     
  8. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Hmm yes, thanks for pointing that out, but I must say your tone comes across as confrontational. Anyway, I thought he said 60-65% faster.

    You DO understand what I said here, right?: Plus, it doesn't support Shader 3.0 because the drivers aren't ready yet.

    A white paper doesn't mean anything when the rest of the infrastructure doesn't allow the feature to work. Anyway, none of it really matters when the 7300GS is already a couple of times faster (4-5x): http://www.hkepc.com/hwdb/gf7300gs-3.htm
     
  9. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    That page doesn't have anything to do with this. I believe it tests a desktop ATI X1300, which is a completely separate card from the laptop Intel GMA X3100. And to be clear, I said the X3100 will be 60-65% as fast as the Go 7400, i.e. if the 7400 got 100 fps, the X3100 would get 60-65 fps.

    And while it's true that the driver support for it is not great, it does support Shader 3.0. I have run many games and apps that require Shader 3.0 on a D830 with the X3100, and it's fully compatible.
     
  10. msiner

    msiner Notebook Consultant

    Reputations:
    5
    Messages:
    127
    Likes Received:
    0
    Trophy Points:
    30
    @IntelUser
    I am not attacking anything. I emphasized the word slower to make sure there was no confusion, because there was obviously confusion with the way it was initially stated. I will trust the technical documentation unless I see another, at least semi-official, technical source stating otherwise. That is why I was simply asking that you cite a source for your claim that the driver is not ready. My wife has a D630 with the X3100, so I know that a driver exists, but I have not seen anything to suggest that it does not support the full capabilities of the hardware. If you do not cite a source, you could easily be spreading rumours with no validity. So once again I will politely ask that you provide a source for your claims about the X3100 drivers. The link you provided is not such a source and adds nothing to this discussion about shader support in the X3100.
     
  11. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    If you say so:
    http://www.intel.com/support/graphics/sb/CS-011910.htm

    1 Although the chipset hardware itself supports T&L, future graphics drivers will be required to utilize this feature.

    Solution ID: CS-011910
    Date Created: 25-May-2004
    Last Modified: 09-Jul-2007


    What does the above mean?? It means that at the time it came out, it did not support hardware T&L in the driver. Have you never heard of the lack of drivers for the X3000/X3100?? It looks like you are way behind the times, buddy, at least in IGP terms.

    It still does not, at least not officially, since this page:
    http://www.intel.com/support/graphics/intelg965/

    shows that the drivers are in the beta stage!! Could you at least check Intel's driver page to see that it's obviously not supported yet?? The production drivers are supposed to come out in August.

    There is a VR-Zone forums page with hundreds of replies regarding the driver support.

    And that was the pre-beta back in July 2007, almost a year after the chipset's release.

    So?? I have G965/GMA X3000 with E6600 CPU.

    You could have seen the reality simply by checking Intel's driver page. OK, so maybe you are calling the beta drivers proof of support for hardware T&L/vertex shaders. Except most people can't put up with the bugs and problems appearing in the driver, in addition to the "performance" part not being there yet for lots of games. We are all hoping that the final driver will bring the promised improvements and compatibility.

    Previously, before the pre-beta driver even came out, games like BF2/2142 and CoD2 refused to run at all, while with the pre-beta and beta drivers they run. 3DMark01 also showed only software T&L support, while with the beta drivers it shows hardware T&L. If you still want to say the drivers are ready, then I think I'll stop posting.
     
  12. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Well, you're right that it is a beta driver, but that doesn't change the fact that it does support Shader 3.0. Also, hardware T&L support is a completely different animal from Shader 3.0 support. And regardless of the state of the drivers, the answer to the OP's question is yes, since the X3100 hardware fully supports Shader 3.0, whether or not the drivers have matured yet.
     
  13. jessi3k3

    jessi3k3 Notebook Evangelist

    Reputations:
    211
    Messages:
    520
    Likes Received:
    0
    Trophy Points:
    30
    The games that are "supposedly" running Shader 3.0 most likely also support Shader 2.0 or 1.4 and below, and are falling back to that method of rendering. Would you list the names of the games or programs you are running?

    For example: The Elder Scrolls IV: Oblivion supports Shader 3.0. If it detects that your card does not support it, it falls back to Shader 2.0. If you don't have that either, it simply won't run at all. Now, there are mods out there that will give Oblivion Shader 1.4 and 1.1 rendering, but that's another story.
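    The fallback behaviour described above boils down to picking the highest shader path the card's advertised version satisfies. A hypothetical sketch (Python for illustration only; this is not any game's actual code, and the path list is an assumption modelled on the Oblivion example):

```python
# Pick the best shader path the hardware advertises (hypothetical sketch
# of the fallback behaviour described above, not a real game's code).
# Versions are (major, minor) pairs; preferred paths come first.
SUPPORTED_PATHS = [(3, 0), (2, 0), (1, 4), (1, 1)]

def choose_shader_path(card_version):
    """Return the highest shader model the card meets, or None."""
    for path in SUPPORTED_PATHS:
        if card_version >= path:  # tuple comparison: major first, then minor
            return path
    return None  # no supported path: the game refuses to run

print(choose_shader_path((3, 0)))  # -> (3, 0)
print(choose_shader_path((2, 0)))  # -> (2, 0): the silent fallback case
print(choose_shader_path((1, 0)))  # -> None
```

    The middle case is the point of the post above: a game that "runs with Shader 3.0" on a card may in fact have silently dropped to a 2.0 path.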
     
  14. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    My opinion is that while it's still at the beta driver stage, the support is pretty much null. And "officially" speaking, it does not support T&L/vertex shaders (yet). It's a beta. Usually Intel doesn't post beta drivers, but because people wanted it so much, they did this time. The performance doesn't suggest that it has hardware T&L/vertex shader support either; I didn't really get a performance increase in the games I tested. And considering the GeForce Go 7400 doesn't have the problem of a crappy driver, it's magnitudes better.

    Go to VR-Zone and read about the driver. The driver has enough bugs that it's not worthy yet. It's almost like the GeForce 5200 being DX9: since it's so slow, some games recognize it as an older part.

    I tested:
    -Warcraft III
    -World of Warcraft
    -Prey
    -Call of Duty 2
    -Quake 4
    -Quake 3
    -3dmark 01 to 06

    Besides the fact that Call of Duty 2 stays laggy once it starts lagging, the rest of the games yielded absolutely no improvement in performance, and the 3DMarks and Quake 3 actually went down in performance.

    Beta driver + not much improvement. I don't call that support; I'll recognize it when they release a production driver with all the bugs sorted out.
     
  15. msiner

    msiner Notebook Consultant

    Reputations:
    5
    Messages:
    127
    Likes Received:
    0
    Trophy Points:
    30
    @IntelUser
    Thank you for supplying a source for your claims. You should not get so defensive about supplying such a source. Language like that is neither helpful nor polite. Please adjust your attitude when posting to this forum.

    I believe the final answer in this thread is:
    While the Intel X3100 supports Shader 3.0 in hardware, the current "stable" drivers do not utilize this support. It is available in a separate beta driver, and should eventually come to a stable driver.
     
  16. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Sorry, I'll take note.

    Yep, basically. Actually, even the beta drivers don't support Shader 3.0. The GMA X3100 has full SM 3.0 support in hardware, including HDR, but even the beta drivers remain at 2.0.
     
  17. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    According to Intel and my own tests, it fully supports Shader Model 3.0 and HW T&L in the beta drivers.
     
  18. jessi3k3

    jessi3k3 Notebook Evangelist

    Reputations:
    211
    Messages:
    520
    Likes Received:
    0
    Trophy Points:
    30
    It's as if you completely disregarded my post...
     
  19. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    I'm sorry, I actually didn't notice your post; I was quite tired when I last checked this thread. To answer your question, most of the software I run can indeed revert to 2.0, so to test it out I loaded up 3D World Studio, and it works fine with shaders. Since that feature of the Leadwerks engine pretty much requires 3.0, I'm assuming Intel is telling the truth about the support.
     
  20. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Doesn't the GMA X3100 emulate shaders in software? I thought that's how Intel claimed 3.0 compliance even before hardware T&L was activated.
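    One way a program can distinguish hardware from software T&L under Direct3D 9 is the HWTRANSFORMANDLIGHT bit in the device caps. A minimal sketch of the check (Python for illustration only; the constant's value is taken from d3d9caps.h):

```python
# Illustrative check: does a D3D9 DevCaps bitfield advertise hardware T&L?
# Constant value from d3d9caps.h (D3DDEVCAPS_HWTRANSFORMANDLIGHT).
D3DDEVCAPS_HWTRANSFORMANDLIGHT = 0x00010000

def has_hardware_tnl(dev_caps: int) -> bool:
    """True if the device reports hardware transform & lighting."""
    return bool(dev_caps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)

# A driver exposing only software T&L would leave the bit clear:
print(has_hardware_tnl(0x00010000))  # -> True
print(has_hardware_tnl(0x00000000))  # -> False
```

    This is the same distinction 3DMark01 reports, which is why the posts above use its "software T&L" vs "hardware T&L" readout as evidence of what the driver exposes.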
     
  21. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Possibly, but I'm fairly certain that the Leadwerks engine wouldn't run in shader mode on it unless the Shader 3.0 support was hardware-based. Also, from monitoring my system at the time, it doesn't appear that the CPU is performing any extra emulation.