
    Intel X3100 users rejoice! It's finally here! New Pre-Beta Drivers

    Discussion in 'Gaming (Software and Graphics Cards)' started by epictrance4life, Jun 7, 2007.

  1. diegorborges

    diegorborges Notebook Enthusiast

    Reputations:
    1
    Messages:
    46
    Likes Received:
    0
    Trophy Points:
    15
    That's one interesting question too... I use the latest drivers for Vista.

    BTW, it should be interesting when Intel releases its own drivers for Windows 7 64-bit.
     
  2. UnReaL596

    UnReaL596 Notebook Consultant

    Reputations:
    6
    Messages:
    197
    Likes Received:
    0
    Trophy Points:
    30
    Own drivers? We haven't seen new XP drivers since... idk, 1/25 I think...
     
  3. stanny1

    stanny1 Notebook Consultant

    Reputations:
    2
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30
    Are there any new drivers for the X4500? I have been using the latest one (1688), which has been available since April, and it has been crashing constantly. If not, where can I find the previous drivers until this one is fixed?
     
  4. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Thank you for this info, which gave me incentive to look into this.

    x3100 configuration registers for faster clock rates

    [Register screenshots. Far left: the 4 bits to modify to increase/decrease the Core Display clock speed]

    Tested configuration: HP 2510P, PM965 chipset, DDR667 RAM, FSB=533, single-channel configuration

    I used baredit to try the faster and slower clocks below, and used Win7's 'winsat dwm' and 'winsat d3d' to measure performance and battstat to measure power consumption, to see if it makes a difference. There was no noticeable increase in performance. I'm definitely switching the right bits: when I set the 'gating' bit 14 to 0, the LCD completely blanks out. HWDirect can do these register modifications too.

    Out of curiosity, I also observed these registers before/after using the Intel Graphics Driver system-tray panel set to "Maximum Power Saving" and "no power saving" modes, to see if it switches to faster/slower clock rates, thinking it would downclock the GPU to save power. That would make sense given that Intel themselves quote a lower TDP for the slower clock. Nope, it doesn't set the slower Core Render clock.

    I.e.: Device 0/2/0, offset F0-F1h

    Bits 12:8 Graphics Core Display Clock select:
    00010 - 320/333Mhz <--- default on my system

    Bits 3:0 Core Render clock select
    0010 = 250/267 <---- this for better battery life (9.5W TDP)
    0011 = 320/333
    0100 = 400/444 <--- default on my system (12W TDP)
    0101 = 500/533 <--- this for better performance (13.5W TDP)
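
    For the curious, here's a minimal sketch of just the bit arithmetic involved (an illustration, not the tool itself): it decodes a 16-bit value read from device 0/2/0, offset F0h with baredit or HWDirect, and builds a new value selecting the 500/533 Core Render clock. The example value 0x0204 is made up to match the defaults quoted above.

    Code:
    # Python sketch - reading/writing the register itself is left to baredit/HWDirect
    RENDER_CLOCKS = {0b0010: "250/267", 0b0011: "320/333",
                     0b0100: "400/444", 0b0101: "500/533"}

    def decode_f0(value):
        display_sel = (value >> 8) & 0x1F   # bits 12:8 - Graphics Core Display clock select
        render_sel = value & 0x0F           # bits 3:0  - Core Render clock select
        return bin(display_sel), RENDER_CLOCKS.get(render_sel, "unknown")

    def select_render_500(value):
        return (value & ~0x0F) | 0b0101     # keep the other bits, pick 500/533

    val = 0x0204                            # example read: display=00010, render=0100
    print(decode_f0(val))                   # ('0b10', '400/444')
    print(hex(select_render_500(val)))      # 0x205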

    GPU-Z versus Everest reporting of Core Render Clock Speed
    I ran GPU-Z and it *always* shows a 500Mhz clock speed. Everest shows the correct speed of 400Mhz, as shown below. Changing the above Core Render bits to 500Mhz has Everest then report 500Mhz, so Everest appears to read the same bits for its reporting. However, benchmarking at the supposed 500Mhz still gives me the same performance as at 400Mhz, so I don't believe changing the bits does anything.


    XP versus Win7 drivers

    More interesting: the XP driver gives me a 1750MB/s video memory transfer, whereas the Win7 (or Vista) driver tops out at 1550MB/s (my system has only a single-channel configuration). This is as measured with 'winsat dwm'; winsat for XP comes with the Vista Upgrade Advisor download, or there's an extracted winsat binary here. I even compared the X3100 registers between the Win7 and XP installs by running baredit and looking at the *complete* register listing as above on both, and found they're the same for Win7 and XP. Certainly XP's interface is snappier than Win7's Aero 3D interface on my 2510P, so maybe that 13% difference in video memory transfer rate accounts for part of it? Disabling Aero puts the XP and Win7 desktops on par.

    Improving X3100 performance
    According to this, single-channel operates at 65% of the level of a dual-channel setup, so it's definitely worth adding another RAM module if you have an empty slot. As is a faster FSB via an overclock.

    The GMABooster author is looking at the X3100 vBIOS for clues on how to switch the Core Render clock speed successfully, though it's questionable what real benefit you'd get. None at all if you're already running a 500Mhz Core Render clock. If going from, say, 320 or 400 to 500Mhz, it's questionable what performance difference GMABooster would make, considering this tells us that for a GMA950 EEE netbook going from a 166Mhz to a 400Mhz Core Render clock speed (GMABooster):

    It increases the Windows 7 WEI by 0.1 for Desktop/Aero and 0.3 for 3D graphics.
    The best overall improvement would be with an ExpressCard/mini PCI-e DIY ViDock running an HD4670 to an external LCD. Let's see how that progresses..
     
    Last edited by a moderator: Feb 6, 2015
  5. hitman72

    hitman72 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    [Screenshots: register editor view with the changed bit highlighted in red]


    Is it this way for the 500/533 Graphics Core Display Clock??? If I press Write, the 101 goes back to 100. Please try to be clearer and simpler. Sorry for my poor English, ciao
     
  6. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    The bit you've highlighted in red is correct. Mine starts off as 100, then I make it 101 and do a write, and it accepts and updates the value.

    If yours doesn't work, it's worth noting the previously posted Supported Configuration image, with the supported frequencies depending on the chipset, RAM and FSB you have. I have a PM965 chipset, DDR2-667 RAM and FSB=533.

    You could look at the value of the Capability Identifier (summarized below) to see if you might be limited in your choice, or try a lower frequency (320, as bits 011) to see if it accepts that.

    18.1.37 - Capability Identifier

    0/0/0/PCI
    Address Offset E0-E9h
    Default value 000000000000010A0009h
    Access: RO
    Size: 80 bits

    Bit 30 = 0 : supports all memory frequencies
    Bit 30 = 1 : supports up to DDR2-533

    Bit 29-28 = 01 : capable of up to FSB 800Mhz
    Bit 29-28 = 10-11 : capable of up to FSB 667Mhz
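
    If it helps, here's a quick sketch of how those two fields decode - just the bit arithmetic, assuming the bit positions are counted from bit 0 of the value as listed; the 80-bit value itself still has to be read out with baredit or a similar register tool.

    Code:
    def decode_capability(value):
        # value: the Capability Identifier at offset E0-E9h, read as one integer
        mem_cap = "up to DDR2-533" if (value >> 30) & 1 else "all memory frequencies"
        fsb_cap = {0b01: "up to FSB 800Mhz",
                   0b10: "up to FSB 667Mhz",
                   0b11: "up to FSB 667Mhz"}.get((value >> 28) & 0x3, "not listed above")
        return mem_cap, fsb_cap

    print(decode_capability(0x000000000000010A0009))   # try it with the default value above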
     
  7. hitman72

    hitman72 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    I see now from your edited post that the bits to change are on Device 0/2/0, offset F0-F1h, and not on 0/2/1, offset F0-F1h. 101 is not accepted anyway, but 011 is accepted; I think 100 is the max value on my cheaper GL960. Thank you very much :)
     
  8. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    You have a GM965 chipset:

    GM965 = PM965 + X3100

    Which is also stated in the datasheet of your HP 2510p notebook. Unfortunately your notebook indeed only seems to support single-channel operation, since it is equipped with just one memory slot. And as I have previously shown, going from single- to dual-channel operation does make a big difference performance-wise.

    Switching clock speed this way doesn't work:

    p.343:
    http://www.intel.com/Assets/PDF/datasheet/316273.pdf

    Most likely you can only change these settings at boot-strap time in a safe manner. Or, if you run Linux, maybe from a remote shell while X is down.

    P.S.: My Bus.0/Dev.2/Fct.0 x0F0 is set to ... 101 already. I suppose that at least reading those registers reflects the actual core clock and ratio.

    P.P.S.: I just downloaded the current version of GPU-Z, which is v0.3.4 by now, and it displayed 500MHz, which seems to be correct. Maybe the author of GPU-Z has found this or a more reliable way of displaying the core/render speed by now.
     
  9. hitman72

    hitman72 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    GPU-Z displays 500Mhz on every X3100 PC (tested on a lot of notebooks). ciao
     
  10. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0

    OK... I proceeded to enable the 0/1/0 device used for link control.

    0/0/0/PCI, offset 54-57h
    bit 1 = 0: disable/hide 0/1/0
    bit 1 = 1: enable and make visible 0/1/0

    < did a scan in Device Manager and it finds a new device >

    0/1/0/PCI
    bit 4 = 0: normal operation
    bit 4 = 1: link is disabled

    Changed my Core Render clock to the faster 500Mhz mode. Then writing 1 to disable the link immediately changes it back to 0. Hmm... probably doesn't like that XP is running in its graphical mode? No change in performance from 'winsat dwm' either, so I can say it probably did nothing.

    The idea is to put the link down, make the changes to increase the Core Render clock speed, then bring the link back up, roughly as sketched below. Anyone else have any ideas?
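
    Spelled out, the sequence I'm attempting looks roughly like the following. Just a sketch: read_cfg/write_cfg are hypothetical stand-ins for whatever register tool you use (baredit, HWDirect, ...), and since the offset of the 0/1/0 link-control register isn't quoted above, it's left as a parameter.

    Code:
    def set_bit(read_cfg, write_cfg, dev, offset, bit, value):
        reg = read_cfg(dev, offset)
        reg = reg | (1 << bit) if value else reg & ~(1 << bit)
        write_cfg(dev, offset, reg)

    def try_render_clock_change(read_cfg, write_cfg, link_ctrl_offset):
        set_bit(read_cfg, write_cfg, (0, 0, 0), 0x54, 1, 1)              # unhide device 0/1/0
        set_bit(read_cfg, write_cfg, (0, 1, 0), link_ctrl_offset, 4, 1)  # put the link down
        f0 = read_cfg((0, 2, 0), 0xF0)                                   # Core Render clock, bits 3:0
        write_cfg((0, 2, 0), 0xF0, (f0 & ~0x0F) | 0b0101)                # select 500/533
        set_bit(read_cfg, write_cfg, (0, 1, 0), link_ctrl_offset, 4, 0)  # bring the link back up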
     
  11. raduque

    raduque Notebook Evangelist

    Reputations:
    95
    Messages:
    458
    Likes Received:
    39
    Trophy Points:
    41
    nando4, what parameters are you passing to winsat to keep its window from closing? I type "winsat dwm" in Start|Run and it just closes when it's done.
     
  12. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Run 'cmd', then at the command line type "winsat dwm".
     
  13. raduque

    raduque Notebook Evangelist

    Reputations:
    95
    Messages:
    458
    Likes Received:
    39
    Trophy Points:
    41
    I did that, and it just closes. I gave up trying to figure it out.
     
  14. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Ensure you have the latest Win7 updates installed; otherwise winsat runs in its own window and exits upon completion. In case you can't get around it, redirect the output to a file. I tried this and it works:

    Code:
    winsat dwm > c:\winsat-dwm.txt
    Then open c:\winsat-dwm.txt to see your video RAM timings.
     
  15. raduque

    raduque Notebook Evangelist

    Reputations:
    95
    Messages:
    458
    Likes Received:
    39
    Trophy Points:
    41
    OK, I figured it out. Gotta run the command line as admin. Just for comparison, my desktop got 22,679 MB/s; my laptop got 5507 MB/s.

    I can't do XP tests, because I haven't bothered with that OS in about 3 years.
     
  16. raduque

    raduque Notebook Evangelist

    Reputations:
    95
    Messages:
    458
    Likes Received:
    39
    Trophy Points:
    41
    No, but my mom's laptop has an X3100, and she likes to play shooters, so I have an interest in whatever you guys come up with. I was just throwing my numbers out there for the heck of it.
     
  17. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Win7RC1 X3100 performance is 11-15% slower than XP on a single-channel configured HP 2510P

    Benchmarking a single-channel-RAM X3100 on an HP 2510P shows XP has 11%-15% better Aero Desktop performance than Win7RC1 (that's if XP actually ran Aero Desktop), and this is regardless of whether Aero is turned on or off in Win7RC1. A GMA950 owner who has contact with a Microsoft tester suggests the final Win7 release will have full performance. A drawcard to invest in the upgrade, it seems.

    'winsat dwm' results on XP, Vista and Win7RC1 @1280x800
    For a HP 2510P [email protected] (FSB=533, pci-e=100)
    Code:
    OS.................FPS | Video Mem (MB/S)
    XP...............26.04 | 1378.02
    Vista 64-bit^1...24.31 | 1286.48
    Win7RC1 32-bit...22.54 | 1192.44 
    ^1 provided by NBR member 'pocketgeek', as I don't run Vista. All other values are the average of three runs. It appears the Aero Desktop score is the Graphics Performance (F/s) divided by 10. Even though Win7RC1 provides its own winsat, I used the Vista Upgrade Advisor, which provides a winsat binary (Dec 11, 2007, version 1.0) installed in c:\program files\Vista Upgrade Advisor, to ensure a consistent benchmark across all operating systems; the standalone binary is available here. You can confirm these findings on your own system if you dual-boot XP/Win7; 'winsat dwm -v' gives more info on what it's doing. The Win7RC1 readings were identical using the Win7RC1 or Vista X3100 32-bit drivers.

    Overclocked HP 2510P [email protected] (FSB=667, pci-e=109) @1280x800
    Code:
    OS.................FPS | Video Mem (MB/S)
    XP...............32.62 | 1726.04         
    Win7RC1 32-bit^1 29.30 | 1550.35
    Win7RC1 32-bit^2 33.99 | 1798.60
    Win7RC1 32-bit^3 34.20 | 1809.77
    
    ^1 using the WDDM 1.1 driver as supplied with Win7. Gaming graphics WEI=4.9.
    ^2 using Vista 32-bit series 4 drivers as posted here. Gaming graphics WEI=3.1(??)
    ^3 using Win7 32-bit series 4 drivers as posted here. [ NOT STABLE - Aero fails to work! ]

    The series 4 driver pushes my video bandwidth up to XP's levels, with the Win7 WEI Aero performance score up from 2.9 to a more bearable 3.1. Firefox now scrolls a fraction smoother when using Aero. I still prefer to disable Aero.

    An increase of the Core Render clock from 400 to 500Mhz could increase that yet again. Hard to say, since the memory timings appear to be the handicap, and my single-channel configuration likely contributes to these lower figures too. Video RAM bandwidth is shown to make a significant difference to overall graphics performance here. The possibility of optimized Win7 drivers and of GMABooster gaining X3100 500Mhz Core Render clock support (the author is working on it) means the X3100 may have some more fuel in the tank yet.

    I've heard a rumor that the Win7RC1 drivers/DirectX might be performance-handicapped pending the final release, the aim being a good incentive to buy the official release.
     
  18. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    I have some concerns about what you are measuring and the conclusions you are drawing:

    What exactly do the dwm numbers mean? For example the MB/s: is it the bandwidth CPU <-> GPU memory, or GPU core <-> GPU memory? I have serious doubts it is related to end-user performance in applications, as it is just one or two numbers derived from one of the industry's most meaningless benchmarks: WEI. Anyway, my numbers on DDR2-667 dual-channel, FSB800, T7500@2.2GHz are: 40.48fps/2142.23 MB/s @1280x800 and 38.47fps/2035.49 MB/s @1920x1200. So the numbers are already sensitive to the resolution you test at, since obviously higher resolutions require some of the bandwidth you want to measure.

    Furthermore: XP (XPDM), Vista (WDDM 1.0) and W7 (WDDM 1.1) have completely different driver stacks and even DirectX stacks. That might very well result in different performance. It's not a hard rule, but sometimes introducing flexibility and features results in lower overall performance. Consider the following:

    . You have an application that calculates the hypotenuse of a right triangle given the two other sides. Basically it calculates sqrt(a^2 + b^2).

    . Now you extend the application (= introduce a new feature) so that it is able to calculate the third side of any triangle. It now calculates sqrt(a^2 + b^2 - 2*a*b*cos(phi)).

    If you use the second application to calculate right triangles, it is slower than the first one. There's a cost associated with the introduction of new features. Now you say: well, you can branch on the angle to provide a short path for 90°:

    if (phi == 90°)
        return sqrt(a^2 + b^2)
    else
        return sqrt(a^2 + b^2 - 2*a*b*cos(phi))

    Sometimes optimizations like these are feasible, sometimes not. And to be precise: the "if (phi == 90°)" check itself steals a CPU cycle, which means even the optimized version is slower than the original on right triangles.

    Anyway, the point is: it is valid to compare the graphics performance of different systems (XP, Vista, W7) or subsystems (DirectX 9, 10, ...). You may observe different, even very different, performance numbers. However, you can't derive from those numbers anything about how optimized the implementations are. The previous example demonstrates the effect.

    New DirectX versions and driver models in general introduce new flexibility and features, which trade away some degree of performance. You can also think of security as a feature: any security gate may steal performance. Programming the GPU became highly virtualized in the latest Windows OS, while previously in WinXP you pretty much banged directly on the hardware once you had blocked it from use by other processes. That is, for example, one of the reasons why MS was able to introduce features such as the animated window previews in the taskbar and continuously playing videos in Windows+Tab.

    I remember Aero performance being perceivably slower, with judder in effect animations, while running on single-channel memory. Currently I don't have the time to repeat the test and provide videos. However it may mean: regardless of how much you lift CPU power, FSB speed or GPU render clock, your Aero performance might still not be good, since the bottleneck is somewhere else.

    P.S.: I'm away for a week - don't expect any further answer to these issues soon. Though I'm subscribed to this thread. If there's something interesting I'll post later.
     
  19. nicocarbone

    nicocarbone Notebook Enthusiast

    Reputations:
    1
    Messages:
    45
    Likes Received:
    0
    Trophy Points:
    15
    Something weird happened today. The Windows 7 beta upgraded my X3100 drivers without asking (I had the Vista drivers installed). Now I have 8.15.10.1749 WDDM 1.1, dated 06/05/2009. Besides the fact that Windows should ask me before such an update is done, the performance with this new driver is better: around 3 more FPS in COD2 DirectX9 and something similar in COD4.
    There are some drawbacks though. For example, there is no Intel Control Panel, and the driver is configured to fix the aspect ratio, so playing COD at 640x480 generates black bars on the sides.

    For comparison, I did a "winsat dwm" and my result was 2186.67 MB/s with dual-channel 667Mhz memory and an Intel Core 2 Duo 1.83 GHz.
     
  20. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    nicocarbone, would you be willing to run 'winsat dwm' with one RAM module removed? This would give an idea of how much Aero Desktop performance is lost in Win7 in a single-channel configuration.
     
  21. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Kind of off the current topic, but has anyone posted a benchmark of Windows 7 vs XP vs Vista which actually accurately represents the game performance difference?
    All the ones I have seen so far have actually shown Windows 7 to be worse than both, but every time I have used it, it has been obviously much faster.
     
  22. diegorborges

    diegorborges Notebook Enthusiast

    Reputations:
    1
    Messages:
    46
    Likes Received:
    0
    Trophy Points:
    15
    I think until now what you can trust is what you feel. But for me, yeah, 7 is faster.
     
  23. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Well, for me, every game has been faster, some by very noticeable amounts (COD4: XP 15-20fps, Win7 25-30fps on higher settings), but Source games like HL2 EP2 seem to show no difference, which is weird, as I have always thought of the Source engine as being incredibly well optimised.

    Also, does FRAPS decrease performance? After a lot of googling this, I have gotten very mixed answers.
     
  24. nicocarbone

    nicocarbone Notebook Enthusiast

    Reputations:
    1
    Messages:
    45
    Likes Received:
    0
    Trophy Points:
    15
    I did what you asked. I ran "winsat dwm" in single-channel mode and the result was 1422.39 MB/s.
     
  25. nicocarbone

    nicocarbone Notebook Enthusiast

    Reputations:
    1
    Messages:
    45
    Likes Received:
    0
    Trophy Points:
    15
    How did you configure COD4 to get this performance? Are you using a modified config file? On which map did you do the test?

    By the way, you can use the in-game fps meter in COD4 and compare it to fraps results to see if there is a difference. The command is cg_drawFPS 1
     
  26. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    XP: COD4, 640x400, everything on low, 15-20 FPS (if lucky; the view distance in Bloc reduced it to about 5!)

    Windows 7: 800x600, all texture settings medium or high, bilinear, glow on, model detail medium, water detail medium. This gives me about 25 fps on Shipment, 20-25 on Crash, 25 on Backlot, 20 on Bloc; other maps I can't remember, but I have only rarely seen the framerate go below 15, and that was when I was using the PEZBOTS mod, which creates multiplayer bots, which have to run off the CPU, and that lowers framerates dramatically.

    Single player: 15-30 FPS, and it only went below 15 once, in the mission where you get nuked.
     
  27. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    And no, I have never used a modified config file: all the variables seemed to have no impact on performance, but then again I'm not too experienced with config files... (Although my one great pride was getting Fallout 3 to run playably on XP... however, none of the characters had heads, and I essentially stole all the ideas from people on a site where they were trying to do the same thing...)
     
  28. nicocarbone

    nicocarbone Notebook Enthusiast

    Reputations:
    1
    Messages:
    45
    Likes Received:
    0
    Trophy Points:
    15
    I tested your config, and I can't get your performance results. (What driver are you using?) In Bloc, for example, it oscillates between 12 and 18fps, and the performance in single player is usually lower too. Maybe it's related to my lower CPU spec (1.83 vs 2.2 GHz), or maybe there is something wrong with my config. I am upgrading to the W7 RC this weekend and will test again.

    Have you tested your performance on the Killhouse map? This is the worst for me: it averages around 20fps on one side of the map, but drops to around 9fps on the other side.

    Does anyone know how to configure the Windows 7 drivers to use the whole screen when using an 800x600 resolution?
     
  29. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    So combining this with your previous post we have:

    Win7RC1 Intel Core 2 Duo 1.83 GHz, DDR2-667
    Code:
    Config          | Video Transfer Rate
    single channel  | 1422.39 MB/s
    dual channel    | 2186.67 MB/s
    
    So single-channel performance is about 65% of a dual-channel config (1422.39 / 2186.67 ≈ 0.65). Quite a significant difference in performance. Wish I had another RAM slot.
     
  30. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    nicocarbone, I'm using the latest Vista drivers on the Windows 7 RC.

    The processor speed may be to blame, as COD4 can be quite processor-intensive. And Killhouse is weird: there is no great view distance, or trees, or other complicated scenery, and yet the performance is almost the worst of all the multiplayer maps, but only when going in one direction! Walking from one end of the map to the other gives 20-30fps, while the other direction gives 13-15!
     
  31. nicocarbone

    nicocarbone Notebook Enthusiast

    Reputations:
    1
    Messages:
    45
    Likes Received:
    0
    Trophy Points:
    15
    Well, I am using the W7 drivers, maybe I should try the Vista ones. About Killhouse, that's exactly what happens to me; it is very weird. However, I tried it on my desktop PC (with an Nvidia 9800GT) and the inconsistency is still there, with around 85 fps in one direction and 65 in the other. It is of a lower degree than on the X3100, but still present. I don't know the reason, but it might not be related only to the X3100.
     
  32. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Yeah, I switched to the Vista drivers because ALL my games seemed to give black vertical bars on the left and right of the screen with the new drivers, but I'm afraid I never tested the performance on them, so I don't know whether they are any better or worse than the latest Vista drivers.

    My only theory about Killhouse is that the dust possibly has an effect on performance, as there seems to be quite a bit more visible dust in the air at the end which gives the lower FPS, but that could just be a coincidence.
     
  33. mrkitty5

    mrkitty5 Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    has anyone here tried bioshock?
     
  34. dukka

    dukka Notebook Consultant

    Reputations:
    11
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
  35. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    The desktop section for Intel mainboards contains updated Windows 7 drivers:

    Win7 32-Bit : v15.15.0.1808 dated 06/09/2009 ( Download)
    Win7 64-Bit : v15.15.64.1808 dated 06/09/2009 ( Download)

    compatible with

    . Intel(R) B43 Express Chipset
    . Intel(R) G41 Express Chipset
    . Intel(R) G43 Express Chipset
    . Intel(R) G45 Express Chipset
    . Intel(R) Q43 Express Chipset
    . Intel(R) Q45 Express Chipset
    . Mobile Intel(R) GL40 Express Chipset
    . Mobile Intel(R) GM45 Express Chipset
    . Mobile Intel(R) GS45 Express Chipset
    . Mobile Intel(R) GS40 Express Chipset
     
  36. moral hazard

    moral hazard Notebook Nobel Laureate

    Reputations:
    2,779
    Messages:
    7,957
    Likes Received:
    87
    Trophy Points:
    216
    Hi,

    Is there any way to overclock the X3100, either a software or a hardware overclock? My GPU is at 500mhz, FYI.

    Also, can anyone tell me what controls the GPU frequency, like the equivalent of the PLL controlling the CPU frequency?

    And has anyone overclocked the X3100 past 500mhz?
     
  37. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Intel have their Core Render Clock select at a max of 500Mhz as shown here, and the Graphics Core Display Clock select at 320/333. Though there do appear to be further reserved or undocumented bits that may extend that?? I tried with baredit but so far have not been able to successfully take the link down to confirm a change is possible on a live running system. The author of GMABooster figured out a way on GMA950 systems and is looking at the GM965 (X3100) too. If you figure out a way, please post :)

    Since X3100 performance ties in heavily with RAM throughput, FSB overclocking will increase performance. A dual-channel setup gives a noticeable increase in performance, as shown here. By extension, RAM with a lower CAS rating would help too.

    This would suggest to me that these drivers are not for the X3100, otherwise the list would have included the GL960/GM965. Happy to be corrected though.
     
  38. diegorborges

    diegorborges Notebook Enthusiast

    Reputations:
    1
    Messages:
    46
    Likes Received:
    0
    Trophy Points:
    15
    Why is Intel taking so much time to release new drivers for Win7???

    I ask because with the default Win7 drivers, or the updated ones, it says 64MB of system video memory, while the Vista drivers (the ones with the control panel and stuff) say 128MB. And they are better...
     
  39. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    That's correct. v15.15.x doesn't work with the GM965. v15.13.x and v15.9.x are for Gen 5.0 (X4500, ...) only as well.

    Maybe because they did such a good job that hardly any immediate bugfixes were necessary. The fact that they released v15.15.0 suggests this is a new-features release rather than a bugfix release.

    Intel graphics shared memory allocation changed from XP (XPDM) -> Vista (WDDM 1.0) -> Win 7 (WDDM 1.1).
    The change XP -> Vista is documented here in section "Dynamic Video Memory"
    http://software.intel.com/en-us/articles/intel-gma-3000-and-x3000-developers-guide/
    and in more detail here:
    http://www.microsoft.com/whdc/device/display/graphicsmemory.mspx

    It boils down to added flexibility and more demand-driven memory allocation. While previous Windows generations had a large, inflexible chunk allocated, the more recent versions allocate only as much as is required for your current tasks. This makes perfect sense if you want to be able to run Win7 on a small notebook or netbook, which might only have 1GB of memory. In that case it's very helpful to be able to allocate less than 128MB.

    Vista will already allocate as much memory as it requires. For the X3100 this can be as much as 358MB. In my current session it initially used 139MB. After launching Google Earth, it was 144MB. After extending my display to a second LCD, it was 212MB. After closing Google Earth and switching back to a single display, the GPU shared memory usage dropped to 130MB.

    Basically, Vista and Win7 do a meaningful memory allocation for shared graphics, beyond the initial 64MB or 128MB.
     
  40. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    You might want to get in touch with the Authors of GMABooster:
    http://www.gmabooster.com/contacts.htm

    They enabled render clock selection for the 945GM chipset, and the registers are basically the same (GCFC, see 8.1.35):
    http://www.intel.com/Assets/PDF/datasheet/309219.pdf

    Maybe they have a better understanding about how things work together and what else needs to be done to change renderspeed on the fly.

    Your posting here
    http://www.phoronix.com/forums/showthread.php?t=17589
    will not be heard by anybody involved in Intel Linux driver development. You could file a bug in their Bugzilla to get attention, though it's outside the scope of what their open-source team does.
     
  41. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    GMABooster author has already been contacted. Discussion summarized as follows:
    • refers to the Intel public datasheets as "datash***ts" because they are incomplete or simply wrong in many important places.
    • succeeded with GMABooster for the GMA950 only after moving from documented to undocumented things.
    • suggests that without a BSEL mod (without FSB 166) it might not be possible to set 500/533 MHz at all, then goes on to say: "However I prefer not to believe in this fact, despite it is clearly indicated in datasheet".
    • knows how to do this for the GMA950. For the X3100 it doesn't work, otherwise a GMABooster for the X3100 would already have arrived in April or May. Says my proposed method here does not work on the X3100, and the method he used with the GMA950 does not work either.
    • says the GMA950 is not tied to the FSB at all, while the X3100 is, so he thinks the X3100 needs a completely different method.
    • is working on GMA X3100 support right now, but hasn't succeeded yet.
     
  42. ondris

    ondris Notebook Enthusiast

    Reputations:
    3
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    So, we will (maybe) get a good increase for the X3100? New Windows 7 drivers + GMABooster :)
     
  43. netc0rd

    netc0rd Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Good day.

    Every time I see the X3100 in a GPU-Z screenshot it reports a bandwidth of 5.3 GPixel & GTexel, with a GPU clock of 667mhz, whereas my GPU-Z report reads a 4.0 GPixel & GTexel bandwidth, with a GPU clock of 500mhz.

    This is very much FSB-related, if I'm not mistaken? I'm on an Aspire 4315 with a Celeron M530 (1.73Ghz, 133 bus), currently running Win XP with the old 6.14.10.4935 drivers. Could this have anything to do with the low bandwidth report?

    If I upgraded the CPU to, say, a T7300 C2D with an FSB of 800Mhz, would that also boost the performance of the X3100?

    /Net
     
  44. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    GPU-Z isn't accurate. Use Everest's output to report the Core Render Clock. My system with FSB=533 runs a Core Render Clock of 400Mhz, matching the datasheet info here. An 800Mhz FSB would run a 500Mhz clock. So dual-channel RAM and an 800Mhz-FSB CPU would improve performance.

    Can you pinmod your existing Celeron M 530 CPU to run a 200Mhz FSB, as shown here?
     
  45. stanny1

    stanny1 Notebook Consultant

    Reputations:
    2
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30
    Any new drivers for the X4500? I am using the ones from April and it keeps crashing.
     
  46. netc0rd

    netc0rd Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Thanks for the tip! I'm now the proud owner of a Celeron M530 @ 2.16Ghz :)

    Hopefully the temperature won't be too much of a bother.

    EDIT: Still no change in GPU-Z, and I cannot find any "correct" bandwidth/clock report in Everest either. Just my luck, I guess.
     
  47. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Congratulations. A BIG change in performance going from a 133 to a 200Mhz FSB. Perhaps add some Arctic Silver to the CPU for faster heat dissipation; manufacturer-supplied heat pads tend to be only just adequate.

    'winsat dwm' in Vista/Win7 can show you the difference in video RAM throughput before/after the overclock. If using XP, you can use winsat.exe as extracted from the Vista Upgrade Advisor. With dual-channel RAM you'd have a significant X3100 performance upgrade there.

    Have you used the screenshot shown here to check the Core Render Clock speed? That is MORE accurate than Everest/GPU-Z. I can confirm that Everest reads the *same* bits to report the Core Render Clock speed, since when I alter them, it reports the different speed, matching the Intel datasheet extract as linked.
     
  48. netc0rd

    netc0rd Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Interesting. My video RAM transfer in dual-channel is now 4361.39 MB/s in Win XP (at 200 FSB). If I recall correctly, my video RAM transfer was approx. 3900 MB/s before the FSB change (133 original), so it all looks well and dandy :)

    I'd love to squeeze whatever performance I can out of the X3100, so I was wondering about your baredit experimentation:
    Apparently I'm on a GL960 chipset. Could I still change the values to a higher clock (0101 = 500/533) without the risk of plain bricking my laptop?
    ... baredit reminds me a lot of the SPD tool for memory timings, and to be frank I fear using those programs :p

    EDIT: Silly me. I just ran "winsat" on its own to get the results above. With the "winsat dwm" command it reports a bandwidth of 1844.18 MB/s.

    Well, that was quite some difference :)
     
  49. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    The experiment is only useful for reading the Core Render clock speed. Sure, you might be able to write the value, but it appears not to make any change unless the link is reset.
     
  50. 7oby

    7oby Notebook Evangelist

    Reputations:
    151
    Messages:
    457
    Likes Received:
    0
    Trophy Points:
    30
    Which drivers and OS do you use? v15.13.1 in Vista:
    http://forum.notebookreview.com/showthread.php?p=4738893#post4738893

    And what kind of crashing do you mean? Do you mean this one:
    "screen occasionally blanks then recovers"
    "The graphics driver has stopped responding, but has restarted"
    http://software.intel.com/en-us/forums/user-community-for-visual-computing/topic/65256/

    There are a
    Vista 32 Bit v15.13.3 dated 5/28/2009
    Vista 64 Bit v15.13.3 dated 5/28/2009
    but I'm afraid they won't fix the above problem.

    You might want to try v15.15.0:
    http://forum.notebookreview.com/showthread.php?p=4968557#post4968557
    or go back to v15.12.4 which you can get in the download section for the GM965 chipset:
    http://downloadcenter.intel.com/Default.aspx?lang=eng
     