The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    Intel X3100 users rejoice! It's finally here! New Pre-Beta Drivers

    Discussion in 'Gaming (Software and Graphics Cards)' started by epictrance4life, Jun 7, 2007.

  1. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    The X3100 is epic, I love mine :D
     
  2. EdP0

    EdP0 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    hi

    I just installed Area 51 on my Dell Inspiron 1525 and it runs so horribly slow it can't be played. I have searched for the PC requirements for this game and found that it is compatible with Intel integrated graphics, so I don't know why I get slow gameplay. This game was released in 2005, so I was expecting my laptop could run it well. I think maybe the problem is Windows Vista.

    I have:

    Dell Inspiron 1525
    Windows Vista Home Premium 32-bit
    2.0 GB RAM
    Intel Pentium Dual-Core 1.73 GHz
    Mobile Intel(R) 965 Express Chipset Family
    GMA X3100

    help please !!
     
  3. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    The problem isn't Vista, it's the X3100. It only has enough pixel throughput to play the simplest games at a smooth framerate. That especially limits old games, whereas new games are slow simply by virtue of being more complex.
     
  4. Sadie48

    Sadie48 Notebook Guru

    Reputations:
    12
    Messages:
    56
    Likes Received:
    0
    Trophy Points:
    15
    They're grrrreat!
     
  5. EdP0

    EdP0 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5


    But if you search for the Area 51 PC requirements, you'll find that the intel GMA 950+ integrated graphics is listed among the recommended video cards for playing this game. That is what makes me think it's Vista or something else.
     
  6. Oemenia

    Oemenia Notebook Evangelist

    Reputations:
    0
    Messages:
    331
    Likes Received:
    5
    Trophy Points:
    31
    So yeah, will we X3100 users ever get new drivers?

    I think the chip still has plenty of juice left in it; I've had some decent gaming done with it. The peak being HL2EP1 with HDR on at 1024x768.
     
  7. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    Right. The X3100 is actually weaker than the GMA 950 in the only sense that matters for old games: pixel throughput. In many games (such as Half-Life), it's even weaker than the i940. Half-Life will run fantastically on an i940/950 without slowdown at high resolutions; the X3100 can't handle Half-Life 1. It's even weaker than my 7-year-old 16 MB ATI Radeon Mobility (166 core / 144 memory), probably around half the performance. Nope, definitely not Vista.
     
  8. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    I get 5-10 FPS in Anarchy Online, which is an ancient game, and the same in Diablo 2 -_-
     
  9. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    I don't think it's JUST Intel's fault for slow performance. Game developers also need to work with Intel to improve performance. Remember, the developers only worked on ATI/Nvidia graphics. It's also a radical departure from the GMA 950-based architecture.

    Take a look at some of the X4500 optimized games:

    http://software.intel.com/en-us/vid...el-on-intel-centrino2r-processor-technology-1

    http://herald.warhammeronline.com/warherald/NewsArticle.war?id=168

    Look at the first video; it runs rather well. Remember, the X4500 has a lot in common with the X3100, meaning it should also run decently on the X3100. The ATI/Nvidia IGPs have the advantage that their architecture is derived from their dedicated GPUs, which means optimization isn't as necessary.

    The good news for the IGP market from Larrabee isn't that a higher-performing core is coming to IGPs, but rather that Larrabee derivatives will eventually scale down to IGPs and give game developers a common platform to work on for both Larrabee-IGP and Larrabee-Discrete.
     
  10. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    Yes, there's indeed room for performance improvement on the game developer side. For example, check the section "Performance Tips when working with GMA X3000" in this document:
    http://software.intel.com/en-us/articles/intel-gma-3000-and-x3000-developers-guide

    E.g. "Favor advanced single-pass shaders over simple multi-pass shaders" just to mention one of them. And you may also browse this presentation from GDC Feb2008. Even as a non-Developer you should be able to tell by looking at slide 20:
    http://software.intel.com/file/1477

    Even given that, there's not too much to expect. One reason is the pure shader and clock advantage ATI/nVidia has. See the section "Competitive Integrated Graphics?" here:
    http://www.anandtech.com/mb/showdoc.aspx?i=3417&p=4
    We had similar discussions in this thread just a few pages back. Also keep in mind that intel produces the Series 4 chipsets in a 65nm production process, while AMD/ATI is already at 55nm.

    From a game developer's point of view: first-person shooter sales most likely won't rise if I optimize for the X3100/X4500, since gameplay won't reach a level where it's real fun to play (and competitive, if you intend to play online), even though the HL2E2 fps look quite promising (depicted in the Anandtech review linked above). But some family games along the lines of The Sims, Second Life, MMOGs, Spore, ... those sales indeed might rise.
     
  11. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    Well, if Second Life would at least start on x3100...
     
  12. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    Oh, by the way, changin' the CPU from T2330 to T7250 gave me quite a noticeable boost. Finally I can enjoy some CS:Source action.
     
  13. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    I've read that before.

    Since Second Life is not contained in intel's compatibility list for the X3100:
    http://www.intel.com/support/graphics/intelgm965/sb/CS-026200.htm
    I assume that either it simply doesn't run, or you have to switch from HW T&L -> SW T&L:
    http://forum.notebookreview.com/showthread.php?p=3634860#post3634860

    I did launch Second Life on an 855GM, which has software T&L, and it worked.
     
  14. ATG

    ATG 2x4 Super Moderator

    Reputations:
    3,306
    Messages:
    4,461
    Likes Received:
    344
    Trophy Points:
    151
    What do you mean? I just tried it out and it does start, and it seems playable at 800x600 (I only played for like 2 min though)..
     
  15. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    I play CSS at 41 FPS average on the laptop in my sig. Works fine.
     
  16. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    The last time I tried, it crashed a minute later with a bunch of error messages... maybe the updated drivers have changed the situation. Gotta give it another go some time. Thanks for the info.

    Mmm, it was sorta fine on the T2330... until the moment someone throws a smoke grenade or two...
     
  17. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    The Anandtech article misses some things. Sure, the X3100 architecture has 8 Execution Units (EUs), but each EU has 2 cores, and each core has 2 threads. The X4500 has 10 EUs / 20 cores / 50 threads.

    So according to what they (AT) are saying, it should be like a 2-wide, 10-EU version, not the single-wide one that Nvidia has, and from simple numbers alone Intel should have a 67% advantage over Nvidia's IGP.

    Look at the high end. The GeForce 8800 GTX has 128 SPs, the ATI Radeon 3870 has 320. Which is faster?? The 8800 GTX. They are simplifying it way too far.

    But I think it's clear that the implementations aren't as simple as we think and there's more to it.
     
  18. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Alright, I was really sure I could find the 16-core claim for the X3100, but I no longer can. However, "single instruction per EU" is not true.

    General purpose processing enhancement with GPU, using the X3000
    http://www.cis.udel.edu/~cavazos/cisc879/papers/exochi.pdf

    8 EU, 32 thread

    Another one with GMA X4500: http://www.ece.ubc.ca/~aamodt/papers/pact08_pangaea.pdf

    10 EU, 50 thread, 8-wide SIMD per thread.
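    Just multiplying those quoted figures out (a trivial sketch; the variable names are mine, the numbers are only the ones from the papers above):

    # Simple arithmetic on the figures quoted above; it says nothing about how
    # efficiently those threads/lanes are actually kept busy.
    x3000_threads_per_eu = 32 / 8        # exochi paper: 8 EUs, 32 threads  -> 4 threads per EU
    x4500_threads_per_eu = 50 / 10       # Pangaea paper: 10 EUs, 50 threads -> 5 threads per EU
    x4500_lanes_in_flight = 50 * 8       # 50 threads x 8-wide SIMD -> 400 data elements in flight
    print(x3000_threads_per_eu, x4500_threads_per_eu, x4500_lanes_in_flight)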


    What I think:

    1. The individual EUs are not comparable between Intel IGPs and Nvidia IGPs, just like they are not comparable with ATI's IGPs.
    2. Intel definitely has a greater plan for the "X3000"-derivative cores. One of the presentations says the architecture can support up to 128 EUs, whatever that means.
    3. Perhaps it's this focus on "general purpose processing" that makes the graphics part of their IGPs suffer.
     
  19. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    Yes, and that's why I just took a deeper look into the Gen4 architecture:

    Yes, the X3100 has 8 execution units (EUs), but I'm pretty sure each EU doesn't have 2 cores, unless you can come up with a reference. Let's just stick with the 8 EUs for a moment; I'll come back to the cores later. The threads in every shader, whether it's nVidia's, ATI's or intel's, are just there to avoid stalling:
    page 34:
    http://intellinuxgraphics.org/VOL_4_subsystem_core.pdf

    While those threads help keep the EUs busy, they don't actually raise throughput. It doesn't even make sense to compare nVidia's number of threads per shader with Intel's. The reason is that the topology of the two is completely different: for example, the intel EUs all have to share a single sampler unit, which performs texture filtering, whereas nVidia's shaders can do that on their own. There are even more functions external to the EUs which nVidia's shaders perform natively. That means there's more latency involved in certain operations when running on intel EUs, and thus very likely more threads are required to keep the EUs busy.

    To summarize: the number of threads doesn't tell you much about how fast an EU processes work.

    A surprising aspect is that each single intel EU can only perform a subpixel operation (e.g. shade the green channel of an RGBA pixel) per clock, while nVidia's shaders perform an operation on the entire pixel with all 4 RGBA values at once. You can read a summary of this below the table here:
    http://techreport.com/articles.x/12195/1
    That reduces intel EU performance to a quarter on a per-clock basis.
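    (A toy sketch of that divide-by-four arithmetic, using nothing beyond the 8 EUs and 4 colour channels mentioned above; purely illustrative:)

    # If each of the 8 EUs retires one RGBA channel per clock, a full 4-channel
    # pixel costs 4 clocks, so per-clock pixel throughput is EUs / 4.
    EUS = 8
    CHANNELS_PER_PIXEL = 4                                   # R, G, B, A
    pixels_per_clock_subpixel = EUS / CHANNELS_PER_PIXEL     # 2.0 pixels/clock under the one-channel-per-clock reading
    pixels_per_clock_fullpixel = EUS                         # 8.0 pixels/clock if an EU handled a whole pixel at once
    print(pixels_per_clock_fullpixel / pixels_per_clock_subpixel)   # -> 4.0, the factor from the post above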

    The above information is verified by source code snippets from the Mesa 3D Linux driver (the link to it is given in the techreport article). Furthermore, this principle is explained on page 281 of the already-mentioned intel document. One easily misunderstands how the intel EU works when reading about the SIMD EU instruction set and its 256-bit wide registers. Although intel doesn't mention explicit clock cycles in the specification, those SIMD instructions are really split into 8 and 16 bit data packets and processed one after the other (e.g. see p. 41 of the document, but there are more places where this becomes obvious).

    To summarize: the performance of the intel Gen4 architecture can hardly be predicted by looking at the raw technical topology. There are way too many indirections (I haven't even talked about the 3D rendering pipeline to EU mapping yet) in the entire software stack. That being said: the proof is in the benchmark. And not in a synthetic one (3DMark xy), but in games, applications and such.

    --

    Now let's get back to the 2 cores per EU. I have multiple explanations for why one could think that, and neither of them raises throughput:

    a) Some facts first:
    Gen4 has 8 EUs and 4 Threads/EU
    page 36:
    http://intellinuxgraphics.org/VOL_1_graphics_core.pdf

    each EU has 256 registers
    each register is 256 bit wide
    a thread can allocate at most 128 registers
    page 284
    http://intellinuxgraphics.org/VOL_4_subsystem_core.pdf

    One could think: each thread may occupy at most half (128 registers) of the EU's registers (256 registers), so the EU is partitioned into at least two pieces. The partitioning is correct, but one wouldn't call those partitions cores. The first reason is that the partitioning is dynamic in nature, and the second is that it's really just there to perform instant switching between threads (without register saving/loading) if a thread stalls due to calling an external function. Those partitions can't execute at once. It's like running multiple applications on a single-core CPU: it interleaves tasks.
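    (The register arithmetic behind that, as a tiny sketch; the figures are the ones quoted above, and the 64-register case is just a made-up example:)

    # Register partitioning per EU, from the numbers above (VOL_4, p. 284).
    REGS_PER_EU = 256            # registers in one EU's register file
    MAX_REGS_PER_THREAD = 128    # a single thread may allocate at most this many
    HW_THREADS_PER_EU = 4        # hardware threads per EU (VOL_1, p. 36)

    # At the maximum allocation only two threads fit, which is what can look like
    # "2 cores"; with smaller allocations all four hardware threads can be resident.
    print(REGS_PER_EU // MAX_REGS_PER_THREAD)   # -> 2
    print(REGS_PER_EU // 64)                    # -> 4, e.g. if each thread only needed 64 registers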

    b) The 3D rendering pipeline also has a certain topology. For example, pixel shading only works on 16 pixels at once (see also p. 277 of VOL_4):
    http://forum.beyond3d.com/showthread.php?t=32643
    Divided by 8 EUs, one might believe there are two cores per EU, but that's wrong. The mapping of pixel shader commands is more complicated and doesn't have anything to do with the EUs. It's mentioned a couple of times in the document that "the number of EU cores in the GEN4 subsystem is almost entirely transparent to the programming model" (p. 36, VOL_1).
     
  20. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Question to 7oby:

    On this site: http://www.hkepc.com/?id=1510&page=5&fs=idn#view

    The G45 on the Gigabyte motherboard, with a similar config to Anandtech's (seriously, Q9300 vs Q6600 isn't that big of a difference), gets 5.59 fps in Crysis at 1280x1024, Medium settings.

    Anandtech on the other hand gets 9.3 fps with 1024x768 Low quality http://www.anandtech.com/mb/showdoc.aspx?i=3432&p=4

    The 780G on the same Chinese site does not gain such a massive advantage over the G45.

    The few sites I have seen show drastic decreases going from Low to Medium settings on the 780G, while the G45 doesn't show such a big decrease, making it artificially more competitive at higher settings.

    While you do make a point in saying that Nvidia/ATI's solutions have the better-performing architecture, higher settings do not show as great an advantage for Nvidia/ATI over the G45 as lower settings do. I believe something other than the architecture alone is making such a huge difference. The same problem plagues the GMA X3000/X3100.

    Now why would that be??
     
  21. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    The site hkepc.com is currently down - at least the link doesn't work for me.

    But your point is that going from low to medium detail certainly has a negative impact on fps, yet it doesn't hurt the G35/G45 platform as much as it does the ATI/nVidia platform if you compare framerates on a relative basis. I'm not entirely sure whether this always applies - maybe there's a tendency.

    All I see is a recommendation to go to higher DX levels instead of lower ones in the intel slides I already mentioned:

    (slide image from the intel presentation linked above)

    together with the note on the same slide:
    From that one might conclude: if going DX8 -> DX10 helps intel platforms, maybe the same applies to going from low detail -> high detail. However, this step differs from game to game: in one game, higher detail settings just mean bigger textures and more mip levels; in another game, additional shadows, bump mapping and various shading effects are applied. It's hard to make a general rule out of that.

    In practice it doesn't make a difference anyway: since you can hardly play a game at high detail settings with intel graphics, it doesn't matter that you "lose less ground" compared to ATI/nVidia at higher settings.

    Let's compare some numbers from Anandtech's benchmarking:

    Crysis 1024 x 768, Low Quality
    AMD 780G: 26.2 fps
    G45: 9.3 fps
    G35: 7.9 fps

    Spore 1024 x 768, High Quality
    AMD 780G: 12.8 fps
    G45: 11 fps
    G35: 9.7 fps

    So the gap between ATI and intel fps in Spore is much smaller than in Crysis (the relative gaps are worked out in the small sketch below). But I can't tell whether
    • this is due to the High Quality settings in Spore compared to the Low Quality settings in Crysis?
    • or due to the fact that intel is kind of involved in Spore's game development? For example, there has been a beta driver version released by intel just to fix issues in the Spore game here - something I haven't seen much from intel before.
    • or is this just coincidence?
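    (The arithmetic behind "much smaller", sketched in Python from the Anandtech numbers above:)

    # Relative ATI-vs-intel gap in the two games, using the fps figures quoted above.
    crysis = {"780G": 26.2, "G45": 9.3}   # 1024x768, Low Quality
    spore  = {"780G": 12.8, "G45": 11.0}  # 1024x768, High Quality

    for name, fps in (("Crysis", crysis), ("Spore", spore)):
        print(f'{name}: 780G is {fps["780G"] / fps["G45"]:.1f}x the G45')
    # -> Crysis: 780G is 2.8x the G45
    # -> Spore: 780G is 1.2x the G45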
     
  22. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Lots of modern games aren't playable on the ATI/Nvidia integrated parts either. I'm just pointing out Crysis since it's the most commonly benchmarked game. And at Medium settings, it's not as far away from the competition as it is at Low.

    Rather than looking at the G45 and saying "it scales better" with higher settings, I think it's more that "the performance at low settings sucks / isn't optimized". It can achieve over 2x the advantage at medium settings yet less than 20% at low. The question is why??

    Certain games do perform well on the G45, as shown by the few G45-specific games I posted a couple of posts ago. My conclusion is that game developers also need to work with Intel more, and vice versa; the performance problems aren't purely Intel's fault alone.
     
  23. xo...

    xo... Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Hi guys, just wondering: does the X3100 support 1920x1080? I got the Dell S2409W and it doesn't seem to give me the 1080 resolution.
    Thx
     
  24. mooler

    mooler Notebook Consultant

    Reputations:
    9
    Messages:
    196
    Likes Received:
    0
    Trophy Points:
    30
    how old? what versions?
     
  25. othersteve

    othersteve Notebook Evangelist

    Reputations:
    57
    Messages:
    361
    Likes Received:
    16
    Trophy Points:
    31
    Hey guys,

    I have an interesting question for you. I am currently running the brand-new version of Photoshop (CS4) and I would love to have GPU acceleration. Supposedly all you need is OpenGL support. But the option is grayed out in my Preferences in spite of the upgraded drivers I've tried for my X3100! Does it indeed support OpenGL or am I missing a crucial step somewhere?

    Thanks!

    Steve
     
  26. cronos77

    cronos77 Notebook Consultant

    Reputations:
    2
    Messages:
    134
    Likes Received:
    0
    Trophy Points:
    30
    1st: Why do you need GPU acceleration for Photoshop?

    2nd: Try using an older driver version, something like 6.14.10.4831.

    I had problems with Chronicles of Riddick (WinXP) with the newer driver versions - it said that I didn't have OpenGL 1.3 (while I'm supposed to have 2.0).
    Only an older driver worked!

    Good luck
     
  27. othersteve

    othersteve Notebook Evangelist

    Reputations:
    57
    Messages:
    361
    Likes Received:
    16
    Trophy Points:
    31
    CS4 supports GPU acceleration for some nifty features such as smooth zooming and rotating. It's very nice.

    Anyhow, I will try your suggestion. Thank you!

    Steve
     
  28. rmcrys

    rmcrys Notebook Guru

    Reputations:
    1
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    Let's hope the X3100 is faster than your C2D at doing that. For some 3D functions, 3D hardware acceleration perhaps doesn't make up for the software path.

    Let us know if it's faster with the GPU (can you compare with CS3 too?). It would be nice if it accelerated filters too.
     
  29. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    That's indeed odd. For those who don't know the S2409W: it's a real 16:9 screen (and not 16:10).

    If you attach a Full HD LCD TV to an X3100 chipset computer, it will run 1920 x 1080 just fine. That being said, there are multiple options for how a screen may report its resolution capabilities to the graphics card. All use EDID
    http://en.wikipedia.org/wiki/EDID
    but there are different ways to encode the native resolution: in the Standard Timing Descriptor format, in Detailed Timing Descriptors, and via the Short Video Descriptor.

    Could you post the output of
    http://entechtaiwan.net/util/moninfo.shtm
    here? This might help to tell what's going on.

    In any case: there's a way to enforce custom resolutions in the intel driver. Should all else fail, you could do that:
    http://software.intel.com/en-us/articles/custom-resolutions-on-intel-graphics
    http://www.avsforum.com/avs-vb/showthread.php?t=947830
     
  30. othersteve

    othersteve Notebook Evangelist

    Reputations:
    57
    Messages:
    361
    Likes Received:
    16
    Trophy Points:
    31
    Well, regrettably, these functions don't even work without GPU acceleration... hence the source of my pain and longing for a working relationship between the X3100 and CS4.

    I tried downgrading the driver to no avail... now I'm thinking it's just CS4 that won't cooperate with the X3100. I wonder if there is some way to hack the settings file to make it give it a shot...

    But that's a subject for a different forum I suppose.

    Steve
     
  31. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    It's so sad that the X3100 can't play Diablo 2 in Direct3D at over 10 FPS, or Anarchy Online, which requires an 8 MB GPU, 32 MB RAM and 250 MHz.

    My super epic old desktop from 1998 ran those games flawlessly...

    Which means the X3100 is MUCH worse than 10-year-old hardware. Heck, I bet even a Voodoo 1 would play those games better than the X3100.

    Awesome!
     
  32. Yakumo

    Yakumo Notebook Consultant

    Reputations:
    5
    Messages:
    130
    Likes Received:
    0
    Trophy Points:
    30
    I've complained here before about the OpenGL support in After Effects not working with the X3100, and about demos of other apps failing in some parts too. I'm very sad but not entirely surprised to hear CS4 won't support the X3100 either.

    It's not beyond imagining that the implementation they've used is fairly ATI/Nvidia-specific, but that does make a mockery of the idea of an OpenGL 'standard', so I would really like to see the X3100 (& G45, but I don't have one of those) supported properly.
     
  33. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    I've got a GMA 945 with a Celeron 1.6, and versus the laptop in my sig it plays Anarchy Online, for example, at an average of 40 FPS, while the X3100 gives me 1-10 FPS.

    Same goes for Diablo 2.
     
  34. xo...

    xo... Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    I have attached the log.
    I already tried to "hack" the driver; after "hacking" it, it gives me the option of 1920x1080 @ 30Hz or 25Hz. If I choose that option the monitor will 'flash', so it seems like a problem with the Hz. I used PowerStrip and many different modelines, and it doesn't help at all. I emailed intel and they told me that I shouldn't have any problem because the X3100 should support 1080... which is not the case for me.

    Please have a look at my log and see if there is anything I can do to get it to work. Thanks
     

    Attached Files:

  35. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    "EnTech's PowerStrip doesn't work properly with Intel graphics", says "archibael", and that's still true. It just doesn't work, even though the PowerStrip changelog says otherwise. Otherwise the tool "DTD Calculator" would never have been invented.

    Anyway: your Moninfo.txt looks perfectly fine. I really don't understand why your chipset doesn't offer you 1920 x 1080. I guess you have the latest driver installed.

    But if you check out the "DTD Calculator" tool, you really can force the intel driver to offer a particular resolution. Use this modeline in DTD Calculator: paste the raw EDID into DTD Calculator, extract the 1920 x 1080 @ 60Hz resolution and click on "Write DTDs to Registry". Doing that will get 1920 x 1080 offered to you.

    Uninstall any PowerStrip hacks first. I have PowerStrip installed as well, but I didn't allow PowerStrip to write anything to the registry - it's really not compatible.
     
  36. xo...

    xo... Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    I have the latest driver installed already. I used DTD Calculator and pasted the raw EDID into it; after restarting the PC, the 1920x1080 won't appear any more. (DTD 02 3A 80 18 71 38 2D 40 58 2C 45 00 13 2A 21 00 00 1E.)

    1920x1080 only appears if I use DTD
    01 1D 80 18 71 1C 16 20 58 2C 25 00 13 2A 21 00 00 9E or
    FA 1C 80 18 71 1C 16 20 58 2C 25 00 13 2A 21 00 00 9E
    but then it appears in the intel panel as 30Hz or 25Hz, and the monitor keeps 'flashing' if I use those DTDs.
     
  37. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    That one was actually a good one (EIA/CEA-861B norm). I don't know why it doesn't work for you. Maybe try a slightly different one with 59.94Hz (double NTSC rate) instead of 60Hz:
    F3 39 80 18 71 38 2D 40 58 2C 45 00 13 2A 21 00 00 1E
    Also worth a shot, 50Hz:
    02 3A 80 D0 72 38 2D 40 10 2C 45 80 13 2A 21 00 00 1E

    The 50Hz version might introduce tearing when moving windows on the desktop.

    Not surprising - those two are interlaced ones. You should stick to progressive ones.
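    To see why those two DTDs come out interlaced at roughly 30Hz while the 02 3A... one is proper 1080p60, here's a quick Python sketch that decodes them, assuming the standard EDID 1.3 detailed-timing byte layout (the decode_dtd helper is just for illustration, not a tool you need):

    # Decode the 18-byte EDID Detailed Timing Descriptors quoted in this thread.
    def decode_dtd(hex_string):
        b = bytes.fromhex(hex_string.replace(" ", ""))
        pixel_clock_hz = (b[0] | (b[1] << 8)) * 10000       # bytes 0-1: pixel clock in 10 kHz units, little-endian
        h_active = ((b[4] & 0xF0) << 4) | b[2]               # horizontal active pixels
        h_blank  = ((b[4] & 0x0F) << 8) | b[3]               # horizontal blanking pixels
        v_active = ((b[7] & 0xF0) << 4) | b[5]               # vertical active lines (per field if interlaced)
        v_blank  = ((b[7] & 0x0F) << 8) | b[6]               # vertical blanking lines
        interlaced = bool(b[17] & 0x80)                      # byte 17, bit 7: interlaced flag
        field_rate = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
        frame_rate = field_rate / 2 if interlaced else field_rate
        lines = v_active * 2 if interlaced else v_active
        return h_active, lines, frame_rate, interlaced

    for dtd in [
        "02 3A 80 18 71 38 2D 40 58 2C 45 00 13 2A 21 00 00 1E",  # the one DTD Calculator extracts
        "01 1D 80 18 71 1C 16 20 58 2C 25 00 13 2A 21 00 00 9E",  # one of the two that "work"
    ]:
        w, h, hz, il = decode_dtd(dtd)
        print(f"{w}x{h} {'interlaced' if il else 'progressive'} @ {hz:.2f} Hz")
    # -> 1920x1080 progressive @ 60.00 Hz
    # -> 1920x1080 interlaced @ 30.03 Hz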

    I saw your posting in the intel forum. Let's see ...
     
  38. xo...

    xo... Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    I can get my monitor working perfectly fine..... on Ubuntu Linux, without any hassle setting things up or installing drivers. I'm trying to work out whether it's possible to use the Linux driver in Windows. Last hope....
     
  39. Yakumo

    Yakumo Notebook Consultant

    Reputations:
    5
    Messages:
    130
    Likes Received:
    0
    Trophy Points:
    30
    Got the CS4 trial.
    Help -> GPU brings up : http://kb.adobe.com/selfservice/viewContent.do?externalId=kb404898

    The X3100 does not comply atm, though technically it could if the driver team pulled their finger out. (Please? Cherries on top for all? :))
     
  40. Axelo

    Axelo Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    Has anyone played FIFA 09? And what's up with the performance?

    I played with drivers 6.14.10.4859 and sometimes I got a BSOD.
     
  41. Happyreaper

    Happyreaper Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5

    Yeah, really annoying. I've got the latest drivers (....4990) and can run the game at high settings at about 20 fps, but it always crashes/BSODs after about 10 mins of play. Anyone have solutions to this? I think it's only a matter of driver optimisation, as it was for Spore, and not that the X3100 can't handle it.
     
  42. dukka

    dukka Notebook Consultant

    Reputations:
    11
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
    I think the Intel driver needs a lot of optimisation, since I always get a BSOD within about 10 mins in many games such as MechWarrior 4, Tribes: Vengeance (old games), Dynasty Warriors 6... while I can play Devil May Cry 4 at the highest settings (just for fun :D with 2-4 fps) for 30 mins without a BSOD.
    By the way, I think Konami and Capcom games favour the X3100, since I've never encountered a BSOD in games like the Silent Hill series or the Devil May Cry series, and they are pretty playable.
     
  43. Axelo

    Axelo Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    How can I know the fps in a game? I think I got more than 20 fps in FIFA 09 with driver 4859 on XP.
     
  44. Axelo

    Axelo Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    I just downloaded Fraps... and with the config at 1024x768, quality on MEDIUM, shaders on level 2, 3D grass OFF, Referee OFF, Lineman ON,

    I get from 15 fps to 20 fps and sometimes (rarely) I get a BSOD.

    That's with the drivers 6.14.10.4859.

    Any other better config for FIFA 09?
     
  45. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    My god, how can 2-4 fps be considered a fun experience? Once you've played a game smoothly at 60 fps, anything less is just distracting and takes away from the fun - like watching a 1920s silent-film version of your favourite FPS.
     
  46. dukka

    dukka Notebook Consultant

    Reputations:
    11
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
    Don't be so harsh on me :) I just wanted to see how it looks at the highest settings, and I benchmarked it. It took me nearly 30 mins at such a low fps, but I was completely captivated by how it looks and surprised that it could last that long without a BSOD.
     
  47. anarky321

    anarky321 Notebook Deity

    Reputations:
    65
    Messages:
    1,190
    Likes Received:
    49
    Trophy Points:
    66
    nevar again, intel...nevar again
     
  48. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
  49. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    The X3100 is doomed and will probably never get much better with driver optimization. In a year's time there might be a driver that can run DMC 4 at 6 fps instead of 4, but does that really matter? -_-

    The card simply sux for all games older than 2001 and all games newer than 2005.
     
  50. dukka

    dukka Notebook Consultant

    Reputations:
    11
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
    I fear so, but there's a 90% chance I'll have to stick with the X3100 for one more year :((
     
← Previous page | Next page →