The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to ensure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    DIY eGPU experiences

    Discussion in 'e-GPU (External Graphics) Discussion' started by master blaster, Sep 18, 2009.

  1. avlan

    avlan Notebook Guru

    Reputations:
    94
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    Which type of connection did you use? PCIe 2.0 x1 via mPCIe, or 1.0 x1 via ExpressCard?
     
  2. AgentYura

    AgentYura Notebook Guru

    Reputations:
    37
    Messages:
    60
    Likes Received:
    7
    Trophy Points:
    16
    I'm using x1 via ExpressCard.
     
  3. Faruk

    Faruk Notebook Evangelist

    Reputations:
    36
    Messages:
    416
    Likes Received:
    15
    Trophy Points:
    31
    Wow, those are impressive results! Glad to see it working well on the X201 :D I will probably wait until the GTS cards come out in September and then give this a shot... unless, of course, a really good deal pops up for the GTX 460.
     
  4. laolaoxu

    laolaoxu Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Does that mean a T410s (Nvidia NVS 3100M/Intel hybrid graphics) can run SLI with a GTX 4xx via ExpressCard?
     
  5. jamesbond007

    jamesbond007 Notebook Consultant

    Reputations:
    19
    Messages:
    108
    Likes Received:
    0
    Trophy Points:
    30
    @AgentYura, is it possible to post some game benchmarks?
     
  6. Max399

    Max399 Notebook Enthusiast

    Reputations:
    0
    Messages:
    18
    Likes Received:
    0
    Trophy Points:
    5
    I have a question: if I have Nvidia graphics in my laptop and an Nvidia card in a ViDock, is it possible to run both GPUs in SLI mode?
     
  7. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    No offense, though; these are great results, but I would expect as much. AgentYura is running an Intel i7 CPU, which drastically improves overall performance through the ViDock interface (bandwidth allocation time, latency, etc.). Many of us, up to now, have been benchmarking our setups on laptops with Core 2 Duo processors, which are a giant leap BACK from the i7. Your CPU is helping your overall scores A LOT, which is great!! I wish I could upgrade my CPU to see results like that. Still, it would be nice to see some real-world benchmarks to see exactly where your setup stands.

    Looking forward to seeing some data!!! =) Good luck
     
  8. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    No, they're two completely different I/O components on the motherboard... unfortunately, it wouldn't be possible.

    I know what you're saying, though: utilize the performance of both the onboard and external cards at once, kind of like a DIY ViDock SLI or something... unfortunately, not possible. =(

    Actually, if some coder could design some sort of hack-type application whereby the computer DID recognize both GPUs as viable, that would indeed be awesome, but alas, I doubt it can be achieved. =(

    Forgot to mention, I have the same setup as you: an HP HDX 16t with an onboard Nvidia GT 130M with 1GB RAM, using a ViDock-style setup with a GTX 470 via PE4H. The ExpressCard link really is an extreme bottleneck, and it would be nice if there were a way to "combine", so to speak, both GPUs at once... The closest we've gotten to optimization is the Optimus drivers, which unfortunately require a laptop with an Intel HD chipset, so that's not possible for me. The only other alternative is to try the x2 method, which in my case isn't possible either... so I'm stuck with x1 over ExpressCard 1.0 and a beefy GTX 470 for now =(
     
  9. Z4gRoS

    Z4gRoS Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    Wow!!! I have this Core i7 in my laptop, but with 3GB of DDR3-1066 RAM. Does that influence the performance of the DIY ViDock much? I'd like to buy one, but I don't have a credit card that works for international purchases :-( I have to wait until December... thank you for your contribution!!!
     
  10. AgentYura

    AgentYura Notebook Guru

    Reputations:
    37
    Messages:
    60
    Likes Received:
    7
    Trophy Points:
    16
    OK, I did some real game benchmarking. The post is updated.
    What surprised me is that turning motion blur on doesn't slow FPS at all (at least in Just Cause 2). Anyway, I'm playing without it, since I don't like the blur.

    What is your GPU? Can you do an Optimus setup? You don't need a credit card to buy a PE4H or PE4L; you can use a debit card or your bank account through PayPal.
     
  11. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Can you run the variable RE5 benchmarks? That's what's been used in the first post.

    Please ensure you've disabled the GTX 460's HDMI audio in Control Panel so sound is routed via the notebook sound card. That liberates some bandwidth as well, for the best possible FPS results.
     
  12. AgentYura

    AgentYura Notebook Guru

    Reputations:
    37
    Messages:
    60
    Likes Received:
    7
    Trophy Points:
    16
    OK, HDMI sound is now disabled. Here are the results:
    Resident Evil 5 (full game), variable mode, default settings:
    DX9 - 100.6
    DX10 - 90.9

    Always glad to help, nando4; thanks to your thread I now have this awesome thing.

    Update: I have just tried Just Cause 2 with HDMI sound disabled. Just a small improvement, from 39.74 to 40.94 on Dark Tower.
     
  13. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    You should've maxed out the settings in RE5, i.e. 1920x1200, motion blur on, all effects set to high, then see the results. I think my results were around 47 fps variable... but you can see that i7 is helping you out tremendously.
     
  14. AgentYura

    AgentYura Notebook Guru

    Reputations:
    37
    Messages:
    60
    Likes Received:
    7
    Trophy Points:
    16
    1600x900 is the highest resolution my monitor can handle. You should try default settings and tell us the result.
     
  15. Z4gRoS

    Z4gRoS Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    I have a GeForce 310M, which uses a revised version of the 210's core (the GT218), so it's basically the same thing. As I understand it, that means it's suitable for Optimus. I don't have any card that works for international purchases, and the banks in my country have no agreement with PayPal. The closest workaround is to "contract" someone in another country to buy it and ship it to me, but they charge 8% or more, so I'd better wait... though my fingers are itching to try the configuration xD
     
  16. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30

    Already did; I posted the results.

    What's quite interesting about the ViDock interface is that, at least in my scenario, scaling down performance settings in games has little to no effect on FPS. Sometimes it seems the HIGHER I set the settings, the BETTER the card performs, or it performs just as well as if the settings weren't applied: for instance AA, AF, motion blur, changes of resolution, etc. I did a bench of Just Cause 2, all settings on high, with AA OFF, and averaged around 25-27 FPS. Then I maxed out the AA to 32xQ AA, I believe, and FPS dropped by only 2!! Then I decided to drop the resolution to 1280x800, thinking it'd breeze through the bench at that res, but instead the results were only about 5-6 FPS higher... So my configuration is quite fickle. In most cases it's better to simply max everything out, because even if you don't, you'll still get the same FPS, or worse than your onboard GPU, which defeats the purpose of having a DIY ViDock in the first place.

    However, the bottom line is that I could never in my wildest dreams have run Batman, Dirt 2, L4D2, Battlefield: Bad Company 2, or Crysis Warhead at 1920x1200 with settings maxed and full AA on my onboard GT 130M. Now I get a playable average of 25-30 FPS most of the time, which I am happy with. It's no gamer's delight, but whatever. Disappointingly, my setup seems to have a difficult time with the Call of Duty games, MW1 and MW2; they run quite choppy, with speed bursts at times. Oh well, can't have your cake and eat it too =)
     
  17. Eggs Scrambled

    Eggs Scrambled Notebook Evangelist

    Reputations:
    56
    Messages:
    394
    Likes Received:
    4
    Trophy Points:
    31
    I'm sure you may have thought of this, but I just wanted to mention that those findings probably have to do with how quickly the card can push frames through the ExpressCard interface. Hence, if you could double the interface bandwidth, you would either see more normal scaling, or everything could very well double.
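
    To put rough numbers on it, here's a quick sketch (assuming whole uncompressed 32-bit frames cross the link, which oversimplifies real PCIe traffic, so treat it as an intuition pump only):

    Code:
    # Crude upper bound on the frames/sec the link alone could carry,
    # assuming every rendered frame crosses the ExpressCard link
    # uncompressed at 32 bpp. Real traffic is mostly commands and
    # textures, not whole frames.
    def fps_ceiling(link_mb_per_s, width, height, bytes_per_pixel=4):
        frame_mb = width * height * bytes_per_pixel / 1e6
        return link_mb_per_s / frame_mb

    for name, mb_s in [("ExpressCard 1.0 (PCIe 1.0 x1)", 250),
                       ("PCIe 2.0 x1", 500)]:
        print(f"{name}: ~{fps_ceiling(mb_s, 1920, 1200):.0f} fps at 1920x1200")

    By that crude measure, the 1.0 x1 link tops out around 27 fps at 1920x1200, and doubling the bandwidth doubles the ceiling, which matches the intuition above.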
     
  18. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    Therein lies the solution, young Eggs Scrambled.

    But alas, by what means would such bandwidth be achieved???

    Probably only by waiting for a standardized implementation of ExpressCard 2.0, in which case we'd all have to get new laptops. =( 'Till then, it's FPS ranging from 10 to 80 in some games, lol. Even with Optimus and advanced drivers, the same problem would, I imagine, still persist due to ExpressCard bandwidth, like you said.

    To summarize: at least I can run games at 1920x1200 playably now. That's all I really wanted in life. :D
    Now I can die peacefully.
     
  19. luongkyl

    luongkyl Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Hey,

    I need a little help here. I have a Lenovo Y650 (P8600) and I'm trying to set up the DIY ViDock. I boot up my laptop and go into the DIY ViDock Setup. I think I read somewhere that Lenovo whitelists their ports, so I have to anti-whitelist them. I do that, then plug in the ExpressCard adapter, power on the ViDock, and press F5 to detect. However, the graphics card isn't detected. I've also tried booting up normally, then putting the computer on standby and powering on the ViDock, but all I get is a black screen on my monitor and no signal on my external CRT.

    What I was wondering is: might I need a replacement mini HDMI cable?

    Any help for this noob is appreciated, thanks!
     
  20. junusa

    junusa Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    So I got a GTX 460, a PE4H, and an Ultra mATX, and made all the connections,
    with:
    LG P300
    Core 2 Duo T8300 2.4GHz
    4GB RAM
    Win7 64-bit

    I tried the boot disk like a million times with Setup versions 1.0e and 1.0e2,
    but I kept ending up at a Windows 98 command prompt,
    and I'm not that good with the prompt, so I tried grub4dos but didn't understand it very well.
    I tried without the Setup, but both my laptop screen and monitor stayed blank.
     
  21. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    Hey, I have basically the same setup as you, with some minor differences in laptop specs, but overall it's all the same (I have a GTX 470, but returned a GTX 460 to get it, and before that I had a Radeon 5750 and returned that as well to get the 460, LOL!!). So I've had my share of experiences with all the cards, and they've all worked. We're also both running Win7 x64, so you should have no problem firing this thing up.

    Try troubleshooting:

    1) Download the DESKTOP Forceware drivers (258.xx) for the GTX 4xx. Don't run setup.exe; instead, extract it with 7-Zip or WinRAR.
    2) Go to Device Manager, and turn on your PSU AND PE4H so everything is ON, but with the ExpressCard not in the slot yet.
    3) Stick it in the slot now, with the computer just at the desktop screen, and see what happens.
    4) This is where you'll have to tell me what it does at this point, but for me at least, Windows initially recognizes the device as a "VGA Controller", though it won't work yet. You need to click on the VGA Controller, then Update Driver, then Browse Computer for Drivers, and point it to the folder you extracted your Forceware drivers to. Windows will do its thing for a bit, installing the drivers manually, then it may say a reboot is required. (A script sketch of steps 1 and 4 is at the bottom of this post.)
    5) Unplug the ExpressCard from the slot and reboot. During the Windows start-up screen (make sure everything is ON), plug in the ExpressCard. The monitors (desktop and laptop) may flash a few times, but after 4-6 seconds the display will shift to the external one.

    I tried the boot-up Setup files, but they didn't help, as my problem wasn't related to port or memory allocation or re-allocation, but rather a device driver issue, as I soon discovered. (nando assumed this right off the bat, BTW, lol.)

    This is what I always do whenever I buy a new card to test, and it always seems to work. One time I was stumped, it kept giving me error 43, so on a whim I unplugged the EC (ExpressCard adapter) and re-plugged it immediately after, and VOILA! The screen suddenly lit up beautifully.

    To me, it sounds like a driver issue on your end, which I had the same problem with initially on the Nvidia card (the Radeon fired right up on initial insertion, and the Catalyst drivers installed fine).

    If this doesn't work, then please give us more details on the errors you're encountering so we can all better diagnose the problem and try to figure out the best solution. But the #1 thing is that you need the DESKTOP drivers. It doesn't matter if you have onboard GPU drivers installed; they get installed to different folders anyway. PM me if you need more help.
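
    If you'd rather script steps 1 and 4, here's roughly what that looks like (Windows with Python 3; the package name, extract path and INF layout below are examples only, so check your own 258.xx download; needs 7-Zip on the PATH and an elevated prompt for pnputil):

    Code:
    # Sketch of steps 1 and 4: extract the desktop Forceware package and
    # stage its display INFs in the driver store so "Update Driver ->
    # Browse my computer" can find them.
    import glob
    import subprocess

    pkg  = r"C:\Downloads\258.96_desktop_win7_64bit_english_whql.exe"  # example name
    dest = r"C:\nvidia_258"

    # Step 1: extract the package instead of running setup.exe.
    subprocess.run(["7z", "x", pkg, f"-o{dest}"], check=True)

    # Step 4: add every display INF to the Windows driver store.
    for inf in glob.glob(dest + r"\Display.Driver\*.inf"):
        subprocess.run(["pnputil", "-i", "-a", inf], check=True)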
     
  22. AgentYura

    AgentYura Notebook Guru

    Reputations:
    37
    Messages:
    60
    Likes Received:
    7
    Trophy Points:
    16
    OMG, so complicated. I follow these steps; maybe you should try them too:
    1. Make sure the DIY ViDock is on.
    2. Make sure the ExpressCard is out of your laptop.
    3. Turn on the laptop.
    4. Wait for Windows to be ready before plugging in. (If the ExpressCard is plugged in with the DIY ViDock enabled, I get a blue screen when Windows tries to load.)
    5. Done. (Within the next 10 seconds it's ready to play.)
     
  23. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    OMG SO COMPLICATED? If he doesn't install the correct DESKTOP Forceware drivers, then no matter what he sticks in his ExpressCard slot, NOTHING is going to make it work. That's why I made sure to point out that he needs to download and MANUALLY install the desktop 258 drivers.

    Your method won't work at all if the drivers aren't installed. At best the computer will treat it as a "VGA Controller" and won't feed a signal to it, and if it does, the resolution will most likely be 640x480, the default resolution a display reverts to on an initial Windows installation, or until the actual GPU drivers get loaded.
     
  24. AgentYura

    AgentYura Notebook Guru

    Reputations:
    37
    Messages:
    60
    Likes Received:
    7
    Trophy Points:
    16
    Well... it worked for me. No need to get angry ;)
     
  25. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Without the DIY ViDock attached:

    1. Load the Optimus drivers and ensure your 8600M works using them. The 8600M is in the INF, so it's supported. This step ensures the loaded NVidia driver supports both your 8600M and the GTX 460, which Nautis' Optimus driver does.

    2. Reboot, do standby, attach the GTX 460, resume. Is the GTX 460 now being detected? If not, disable the 8600M in case it's conflicting, then repeat the standby/resume to see if the GTX 460 registers in Device Manager. Do a 'scan for hardware changes' just in case.

    As for DIY ViDock Setup 1.x, revert to using the disk image install. The USB method works too, as long as all the files are correctly copied: there must be an autoexec.bat and a config.sys in the USB thumbdrive's root directory.

    The LG P300 is two generations older tech, plus it has an NVidia 8600M onboard, so it may require more attention to get going. Recent notebooks, especially those with an Intel IGP, appear to be plug and play, as seen in the Lenovo X2xx/T4xx examples.

    @AgentYura - could you do some Optimus versus non-Optimus benchmarks? We see that 3DMark06/Vantage scores increase substantially with the Optimus enhancements, but how much more FPS do real-world games see? I saw a minor improvement in FFXIV but a pretty big improvement in DMC4 here. I speculate DX9 games would see significant improvements, given that the DX9-centric 3DMark06 scores increase by 2-3 times.
     
  26. junusa

    junusa Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    I got to step 5 and installed all the drivers, but it just blinked and the screen didn't shift.

    I have a different onboard GPU; it's an 8400M GS. Would the Optimus drivers still work? I'm downloading them now.
     
  27. junusa

    junusa Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    It's in Korean, but I can see that it recognizes the GTX 460. The same problem repeats every time I reboot: it blinks and the laptop screen goes off, but my monitor won't turn on. It still says there's a problem with the graphics card.
    I tried the Optimus driver, but I couldn't find the software and couldn't turn it off.
     

    Attached Files: [screenshot]

  28. ithildin

    ithildin Notebook Geek

    Reputations:
    78
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    People here are having mixed experiences enabling their PE4H with a GTX 460. While AgentYura and others have managed to start up their cards by hot-plugging or by suspending and resuming Windows, I haven't been able to get mine to start up properly that way. It either fails with an error similar to yours, or starts up briefly and then crashes (BSOD). I've noticed that even turning on my DIY ViDock at BIOS POST can cause it to mis-initialize. The solution I found is to power up the external GPU and PE4H 1-2 seconds before I start up my laptop, and never suspend or hibernate; then everything works fine.

    It's bizarre: the Radeon HD 5750 never had such issues. I could hot-plug it without a hitch. It's possible that I need to reconfigure my bootdisk to a more suitable memory allocation, like Nando previously suggested.
     
  29. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    AgentYura's extraordinary results are quite easily explained: he has an i7 processor, and of course the DDR3 RAM doesn't hurt, either. The majority of systems that have been tested and benched have had the (by today's standards) mediocre-performing Core 2 Duo, a slowly dying breed. The i5, i7, quad-cores and hexa-cores basically blow away any Core 2 Duo. Times they are a-changin'.

    In fact, I'd be disappointed if his setup produced any lesser benchmarks.

    Oh, on a side note, I'd like to ask everyone with DIY ViDocks: do you turn your cards off when you put your system on standby, or do you leave them on? Personally, I always turn mine OFF (and the power supply, via SWEX) when I put my computer in sleep mode.
     
  30. Eggs Scrambled

    Eggs Scrambled Notebook Evangelist

    Reputations:
    56
    Messages:
    394
    Likes Received:
    4
    Trophy Points:
    31
    The Core i7s may be an improvement over the C2D, but you're giving them way too much credit for better FPS and such. A 2.8GHz C2D, for example, is better than a 2.4GHz (2.9 turbo) i5 at straight-up floating-point calculations, but a 2.5GHz (3GHz turbo) i5 edges out the 2.8GHz C2D.

    Clock for clock, yes, the i5/i7 will beat a C2D, but if we're talking about severely bottlenecked GPUs stuck in an ExpressCard slot, major differences in FPS are likely not as much due to the CPU as they are to the efficiency of the ExpressCard slot.
     
  31. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    Yes, I realize what you are saying regarding the ExpressCard bottleneck, but regardless of that, if you have a CPU that processes data faster and more efficiently, that will obviously give better benchmark results, as the data coming in through the ExpressCard port is processed faster. The DIY ViDock experience as a whole will improve if every aspect of the PC is better, whether it be RAM, CPU clock, etc.

    Anyway, if that's not the explanation for AgentYura's impressive results, then what is?
     
  32. Eggs Scrambled

    Eggs Scrambled Notebook Evangelist

    Reputations:
    56
    Messages:
    394
    Likes Received:
    4
    Trophy Points:
    31
    You're misunderstanding what I'm saying. If you look at nando's chart, it looks as if merely using a newer CPU is what improves the score so drastically. The real improvement is just using a better CPU, regardless of generation. The other CPU that hits 10k under Optimus is around 2.2GHz, and AgentYura's CPU is the tippity-top of the line, the latest i7 dual core clocked at 2.66GHz with turbo going as high as 3.33GHz o_O. Any upgrade that drastic will result in a huge performance differential.

    My point was that if someone gets in here and uses a P8700 (2.8GHz) overclocked to around 3-3.2GHz (relatively easy, assuming an unlocked PLL or such), the difference won't be as drastic. So it's not about an architecture change; it's about a raw power change. I just don't want people coming in here thinking they want a laptop that supports a ViDock, and concluding they have to get a latest-generation notebook just for the i5/i7 architecture, as opposed to something last-gen but with a good C2D inside.
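
    Just to put the raw-clock point in numbers (base and turbo clocks as quoted above; this deliberately ignores IPC differences):

    Code:
    # Clock differential between the two ~10k Optimus datapoints.
    other_cpu  = 2.2    # GHz, the other CPU that hits 10k under Optimus
    yura_base  = 2.66   # GHz, AgentYura's i7 base clock
    yura_turbo = 3.33   # GHz, AgentYura's i7 max turbo
    print(f"base clock advantage:  {(yura_base / other_cpu - 1) * 100:.0f}%")   # ~21%
    print(f"turbo clock advantage: {(yura_turbo / other_cpu - 1) * 100:.0f}%")  # ~51%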
     
  33. ithildin

    ithildin Notebook Geek

    Reputations:
    78
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    That's why we favour benchmarks focusing on the GPU rather than general system performance. If you look again at Nando's performance data, his "creaky old" Core 2 Duo [email protected] is keeping up with AgentYura's i7 in both 3DMark Vantage GPU marks and DMC4 scene 4 FPS. My "ancient" [email protected] can keep up with JohnnyEM's Asus M60J with an i7-720QM (a quad core!).

    Of course, a machine equipped with an i7 will feel snappier and generally perform a lot better, but in the context of a DIY ViDock the CPU is not the major performance determinant; ExpressCard/mPCIe bandwidth, GPU architecture and driver optimization seem to be.

    Optimus.
     
  34. pterodactilo

    pterodactilo Notebook Consultant

    Reputations:
    11
    Messages:
    225
    Likes Received:
    7
    Trophy Points:
    31
    Hello, I want to use a ViDock with a Precision M4500 notebook. Is it possible to display the signal on the internal LCD?
     
  35. jamesbond007

    jamesbond007 Notebook Consultant

    Reputations:
    19
    Messages:
    108
    Likes Received:
    0
    Trophy Points:
    30
    @pterodactilo It is possible, but with frame rate caps. There is a full explanation in the first post, under mini-FAQ question number 3.
     
  36. antichamp

    antichamp Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Hi.
    I've been trying to read through all of this, but there are just hundreds of pages in these threads, so I was wondering if someone could help me.

    I know from opening up my laptop (Acer 5920G) that there is an unused mPCIe slot, and it also has an ExpressCard slot.
    When I check, though, only one unused PCIe port shows up:

    [Everest screenshots: PCIe port listing]

    Does anyone know how to figure out which port is which?
     
  37. ithildin

    ithildin Notebook Geek

    Reputations:
    78
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    It's hard to tell how the PCIe lanes are wired in a particular machine just by using software. The easiest way of finding out is by moving mPCIe cards around and connecting a device to the ExpressCard slot. Everest should be able to tell you which port the device is connected to, and you can then figure out how they're all wired.

    For instance, in your case, try moving your Intel WiFi card to the unused mPCIe slot, boot up Windows, and check which port it shows up on in Everest. As for the ExpressCard slot, can you get your hands on any ExpressCard peripheral, like a WiFi adapter or a USB/FireWire card? If you plug any of those into the ExpressCard slot, Everest should recognize the device and tell you which port that slot is wired to. Have a go at it and let us know what you find.
     
  38. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    Is there any way (with adequate soldering and technical knowledge, of course) to detach the laptop's onboard GPU chip and solder in some sort of makeshift device that would route data from that old GPU "port" or socket to the PE4H with the external graphics card attached?

    Or, in other words, what is currently the absolute best method of achieving the highest bandwidth transfer rates with a DIY ViDock?
    I'm assuming x2 with the Optimus drivers has proven to be the best interface so far, right?
     
  39. ithildin

    ithildin Notebook Geek

    Reputations:
    78
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    A few people here on NBR have thought of hooking up a desktop GPU to the laptop GPU's MXM slot. The GPU would have to be supported by a more complex device than the PE4H, but it could then make use of the full x16 2.0 bandwidth and even use the laptop's LVDS link to display the output on the LCD panel. Unfortunately, this idea hit a roadblock: the detailed MXM specs required to design such a device are not open to the general public.
     
  40. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    I see, thanks for that info.

    On another note, I've been doing some stress testing with FurMark using my PE4H and GTX 470, due to some "blackouts" during gameplay. I'm using stock clock speeds, so it couldn't be due to OCing, which leads me to think it may have something to do with heat levels.

    In several stability tests with FurMark, with settings at 1920x1200 fullscreen, 0xAA, each test got to about the 2:00 mark and "blackscreened" at 64C each time. I had the fan speed at 100% for the duration of each test and no background programs running (except the firewall). So it seems as though, for whatever reason, once 64C is hit, it automatically shuts down. I'm using a Corsair VX450 as the PSU, so I doubt it's a power supply issue. What further supports this theory is that several other tests I ran with the card overclocked (715/1720) performed without error, and temps never passed 61C during each test.

    Any ideas what can be done to work around this issue? Perhaps it's the PE4H shutting down due to overloading? I just don't know; this card should be able to reach well into the 80C+ zone and perform fine.

    In a benchmark test, the GTX performed quite well, considering that other results with comparable cards scored the same or lower on much more powerful rigs overall. The results of a bench run at 1920x1200 fullscreen can be seen here.

    EDIT: Just performed another quick FurMark benchmark at 1920x1200 with 8xAA, a 30000 ms test (pretty quick), averaging about 50 FPS. Temps never exceeded 60C. Then I did another test, changing the AA from 8x to 32xAA. Temps never exceeded 60C, yet halfway through the test it "blackscreened" again, freezing up like it does when temps pass 64C. So now I'm just stumped...

    The only other thing I can think of is that I have my PSU plugged into a surge protector with about six other power-hungry devices on it. Does that have a wattage-reduction effect due to overload??? Maybe that's a stupid theory, but to me it makes sense, in a way. I will try plugging the Corsair PSU directly into its own 110V wall outlet and replicate the test to see if the results are the same.
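
    For reference, here's a rough +12V budget check (rail rating and TDP from the spec sheets; the FurMark factor and the PE4H/fan allowance are guesses):

    Code:
    # Rough +12 V budget for a GTX 470 on a Corsair VX450. FurMark is a
    # power virus and can push a card past its rated TDP, hence the factor.
    rail_watts_12v = 33 * 12        # VX450: single +12 V rail, 33 A -> ~396 W
    gtx470_tdp     = 215            # W, NVIDIA's rated board power
    furmark_factor = 1.3            # assume ~30% over TDP under FurMark
    misc_12v_load  = 15             # W, rough allowance for the PE4H and fans

    load = gtx470_tdp * furmark_factor + misc_12v_load
    print(f"available: {rail_watts_12v} W, estimated draw: {load:.0f} W, "
          f"headroom: {rail_watts_12v - load:.0f} W")   # ~100 W spare

    So on paper the PSU has headroom even under FurMark, which means a sagging surge strip or shared wall circuit looks like the likelier suspect; the dedicated-outlet test should confirm or rule that out.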
     
  41. SmellyGeekBoy

    SmellyGeekBoy Newbie

    Reputations:
    20
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Hi all, I've been following this for a few weeks now and decided to take the plunge. My new graphics card arrived today and all is running perfectly! I can't believe how well this all works!

    I'm running an ASUS Radeon 4670 1GB DDR3 in a PE4H, connected via the ExpressCard slot to my Acer Aspire 5720. I'm very lucky because the ExpressCard is actually on port 1 and port 2 is empty, so I've disabled port 2 and set 2x on port 1 for the X1E performance boost.

    I'm currently running Win7 64-bit, with DIY ViDock Setup v1.0e2 chainloaded from a USB stick.

    Benchmarks (integrated Intel X3100 vs external Radeon 4670):

    Before:
    3DMark Score 665
    SM 2.0 Score 216
    SM 3.0 Score 257

    After:
    3DMark Score 4825 (626% improvement)
    SM 2.0 Score 2393 (1008% improvement)
    SM 3.0 Score 2696 (949% improvement)

    Resident Evil 5 Benchmark Edition (all settings on HIGH):
    Before: 3.0 FPS
    After: 26.9 FPS (797% improvement)

    Windows Experience Index (Win7 64-bit):
    Before: 3.1
    After: 5.3
    With the graphics subscore going from the system's lowest (3.1) to its highest (6.7).

    All for under £150... I'd say that's a result ;)
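
    For anyone checking the maths, the percentages are just (after - before) / before:

    Code:
    # Recompute the improvement figures quoted above from the raw scores.
    scores = {
        "3DMark":  (665, 4825),
        "SM 2.0":  (216, 2393),
        "SM 3.0":  (257, 2696),
        "RE5 FPS": (3.0, 26.9),
    }
    for name, (before, after) in scores.items():
        print(f"{name}: {(after - before) / before * 100:.0f}% improvement")
    # -> 626%, 1008%, 949%, 797%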

    Thanks for all the help and advice, everyone; I couldn't have done it without you. Apologies for the crappy pic, but I'm sure you all know what one of these looks like by now... ;)

    [photo: PE4H with Radeon 4670]
     
  42. antichamp

    antichamp Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Hey, that Acer looks a lot like mine.
    Would you mind posting a screenshot of your PCIe config from Everest?

    Also, what do you mean by improving performance by disabling another port?
     
  43. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, for some reason Radeons score amazingly well in all the 3DMarks, along with Windows Experience Index scores, etc., while nVidia scores like crap... Ironically, nVidia performs much better in real-world gameplay for some reason. I've experienced this personally.
     
  44. SmellyGeekBoy

    SmellyGeekBoy Newbie

    Reputations:
    20
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Have a look at Nando4's post here. It's one of the options on the DIY ViDock Setup bootdisk.

    Here are screenshots of my PCIe config before and after running the bootdisk... As you can see, port 2 is missing and port 1 is running in "x2" mode (although it's not really true x2, as explained in Nando4's post).

    Really? I'd heard the opposite, but I'm no expert, and there certainly is a lot of conflicting advice out there. I think the nVidia cards would definitely perform better in games that support PhysX, although I read somewhere in this thread that the Radeons are less affected by the lower bandwidth available in a ViDock setup.

    I actually went with the 4670 because it's extremely cheap (mine was under £50) and because it's the model that Villagetronic offered in the original ViDock, so I guess they knew what they were doing. In the end my boss expressed an interest in the project and bought the card for me anyway, so I can always upgrade later on if I'm not happy with it. :cool:

    I'm not really a serious gamer; I just wanted to play Portal at higher than the 10 fps or so I got before!
     
  45. SmellyGeekBoy

    SmellyGeekBoy Newbie

    Reputations:
    20
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    That x16 port intrigues me... Is that the MXM slot? Some Googling tells me that your laptop has an 8600M GT, so perhaps the MXM slot uses the resources that ports 1-3 would usually use.

    The 5720 doesn't have a "gaming" GPU; in fact, there is physical space in the casing and solder pads on the motherboard for an MXM slot, but the connector itself isn't actually there.

    Like I said in my previous post, I got very lucky with my port configuration. I guess you could beg/steal/borrow an ExpressCard from somewhere to test it out and work out which port is which?
     
  46. PanzerHauptmann

    PanzerHauptmann Notebook Consultant

    Reputations:
    50
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    30
    Your setup is great compared to the performance you used to get, for sure, but trust me, I have first-hand experience: I've gone through three cards already, all tested on the same system/interface.

    The first card was a Radeon 5750. It scored nearly 10000 in 3DMark06, compared to 5000 with the onboard GT 130M, and scored 7.1 for 3D gaming on the Windows Experience Index. All the "tests" were outstanding; I was thrilled...

    Then I tried to play a few games at moderate-to-high settings (nothing crazy: 1920x1200, no AA, some bells and whistles here and there). My FPS were worse than my ONBOARD GPU!!! Dirt 2, for instance, was utterly unplayable with the 5750, at around 10-15 FPS. Same went for Modern Warfare 2, MW1, BF:BC2, Batman, CoD4, etc. FPS in the teens all around.

    Then I got a GTX 460. It's renowned for its insane overclockability, and it performs as such... I ran 3DMark06 and scored 5000, less than what my internal GPU scored. Windows Experience Index? 6.1. LOL.

    I wanted to max things out, so I decided to return the 460 and get the 470... All the benchmark data was the same: crappy, crappy, crappy. Until I fired up Gears of War... then Dirt 2, BF:BC2, L4D2, and Batman once again. It was quite pleasant to max out all the settings and get a buttery-smooth 30-40 FPS average in all of them. Dirt 2 I can max out and get a constant 30 FPS or so.

    DX10-rendered games tend to really excel with the nVidia cards and the ViDock setup, DX11 even better. Batman looks amazing at 32xAA, as does Mafia II with the GTX 470. I'll admit that newer, "sandbox"-style 3D games (Mafia II, Just Cause 2, etc.) have trouble at times, with FPS dropping into the 15-20s, which is playable but not very enjoyable.

    Good luck, though, and congrats on your achievements! ;)
     
  47. antichamp

    antichamp Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Yeah, the x16 port is (at least I think) the MXM; it sits on the northbridge.

    My main concern was why I have an open mPCIe slot and an open ExpressCard slot while only one open port shows up in Everest.
     
  48. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    Let that be a warning to all who place emphasis on benchmarks alone
     
  49. antichamp

    antichamp Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Is it safe to assume that port 1 is my ExpressCard slot, and that I can do 2x with ports 1 & 2?

    I don't have any cards handy to plug in for testing.

    Original:
    [Everest screenshot]

    After WiFi card switch:
    [Everest screenshot]
     
  50. ithildin

    ithildin Notebook Geek

    Reputations:
    78
    Messages:
    88
    Likes Received:
    0
    Trophy Points:
    15
    No, sorry. I don't want to sound discouraging, but PCIe port 1 seems to always be left enabled, even in laptops where the pins on the southbridge are not connected; i.e., my own laptop's port 1 is not wired, even though it shows up in Everest... frustrating, I know. Other than plugging something into the ExpressCard slot to find out which port it's wired to, you could try to get your hands on your mainboard schematics. Try looking for them on the web; I managed to find the one for my laptop.
     