Which type of connection did you use? 2.0 x1 via mPCIe or 1.0 x1 via ExpressCard?
-
I'm using x1 via ExpressCard.
-
Wow, those are impressive results! Glad to see it working well on the X201.
I will probably wait until the GTS cards come out in September and then give this a shot... unless, of course, a really good deal pops up for the GTX 460.
-
Does that mean a T410s (Nvidia NVS 3100M/Intel hybrid graphics) can run SLI with a GTX 4xx via ExpressCard?
-
@AgentYura, is it possible to post some game benchmarks?
-
I have a question: if I have Nvidia graphics in my laptop and an Nvidia ViDock, is it possible to run both GPUs in SLI mode?
-
PanzerHauptmann Notebook Consultant
Looking forward to seeing some data!!! =) Good luck -
PanzerHauptmann Notebook Consultant
I know what you're saying though: to utilize both the onboard and external card's performance capabilities, kinda like a DIY ViDock SLI or something... unfortunately, not possible. =(
Actually, if some codemaker could design some sort of hack-type application whereby the computer DID recognize both GPUs as viable, that would indeed be awesome, but alas, I doubt this can be achieved.. =(
Forgot to mention, I have the same setup as you: an HP HDX 16t with an onboard Nvidia GT 130M w/ 1GB RAM, using a ViDock with a GTX 470 via PE4H. The ExpressCard link is really an extreme bottleneck, and it would be nice if there were a way to "combine", so to speak, both GPUs at once... The closest we've gotten to optimization is the Optimus drivers, which unfortunately aren't possible unless your lappy features an Intel HD chipset. The only other alternative is to try the x2 method, which in my case isn't possible either... so I'm stuck with x1 over ExpressCard 1.0 and a beefy GTX 470 for now =( -
-
OK, I did some real game benchmarking. The post is updated.
What surprised me is that turning motion blur on doesn't lower FPS at all (at least in Just Cause 2). Anyway, I'm playing without it since I don't like that blur.
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
Please ensure you've disabled the GTX 460's HDMI audio in Control Panel so sound is routed via the notebook soundcard. That frees up some bandwidth as well, giving the best possible FPS results. -
OK, now HDMI sound is disabled. Here are the results:
Resident Evil 5 (full game), variable mode, default settings:
DX9 - 100.6
DX10 - 90.9
Always glad to help you, nando4; thanks to your thread I now have this awesome thing.
Update: I have just tried Just Cause 2 with HDMI sound disabled. Just a small improvement, from 39.74 to 40.94 FPS on Dark Tower. -
PanzerHauptmann Notebook Consultant
-
1600x900 is the highest resolution my monitor can handle. You should try default settings and tell us the result.
-
-
PanzerHauptmann Notebook Consultant
Already did; the results are what I posted.
What's quite interesting about the ViDock interface is that, at least in my scenario, scaling down performance settings in games has little to no effect on FPS. It seems, sometimes, the HIGHER I set the settings the BETTER the card performs, or it performs just as well as if the settings weren't applied... for instance AA, AF, motion blur, change of resolution, etc. I did a bench of Just Cause 2, all settings on high with AA OFF, and averaged around 25-27 FPS. Then I maxed the AA out to 32QsAA, I believe, and FPS dropped by 2!! Then I decided to drop the resolution to 1280x800, thinking it'd breeze through the bench at that res, but instead the results were only about 5-6 FPS greater... So my configuration is quite fickle. In most cases it's better to simply max everything out, because even if you don't, you'll still get the same FPS, or worse than your onboard GPU would give, which defeats the purpose of having a DIY ViDock in the first place.
However, bottom line, point blank is: I could never in my wildest dreams run Batman, DiRT 2, L4D2, Battlefield: Bad Company 2, or Crysis Warhead at 1920x1200 with settings maxed and full AA on my onboard GT 130M. Now I can get a playable average of 25-30 FPS most of the time, which I am happy with. It's no gamer's delight, but whatever. Disappointingly, my setup seems to have a difficult time with the Call of Duty series games, MW1 and MW2.. they run quite choppy, with speed bursts at times.. Oh well, can't have your cake and eat it too =) -
Eggs Scrambled Notebook Evangelist
I'm sure you may have thought of this, but I just wanted to mention that those findings probably have to do with how quickly the card can send frames back through the ExpressCard interface. Hence, if you could double the interface bandwidth, you would either notice more normal results, or everything could very well be doubled.
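To put rough numbers on that, here's a back-of-envelope sketch. The 250/500 MB/s figures are the nominal usable rates of a PCIe 1.0 vs 2.0 x1 link; real traffic is more than finished frames, so treat these as ceilings, not predictions.

```python
# Back-of-envelope model: if every rendered frame has to travel back
# across the x1 link (e.g. to drive a display on the laptop side), the
# link itself caps the frame rate, regardless of CPU or GPU speed.
def link_limited_fps(bandwidth_mb_s, width, height, bytes_per_pixel=4):
    frame_mb = width * height * bytes_per_pixel / 1e6  # one RGBA frame
    return bandwidth_mb_s / frame_mb

for label, bw in [("ExpressCard 1.0 (PCIe 1.0 x1, ~250 MB/s)", 250),
                  ("ExpressCard 2.0 (PCIe 2.0 x1, ~500 MB/s)", 500)]:
    print(f"{label}: ~{link_limited_fps(bw, 1920, 1200):.0f} FPS at 1920x1200")
```

The ~27 FPS ceiling on the first line is suspiciously close to the 25-30 FPS averages reported above, and doubling the link doubles the ceiling, which matches the "everything could very well be doubled" guess.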
-
PanzerHauptmann Notebook Consultant
But alas, by what means would such bandwidth be achieved???
Probably only by waiting for a standardized implementation of ExpressCard 2.0, in which case we'd all have to get new laptops. =( 'Till then, it's FPS ranging from 10 to 80 in some games lol. Even with Optimus and advanced drivers, the same problem, I would imagine, still persists due to ExpressCard bandwidth, like you said.
To summarize: at least I can run games at 1920x1200 playably now. That's all I really wanted in life.
Now I can die peacefully. -
Hey,
I need a little help here. I have a Lenovo Y650 (P8600) and I'm trying to set up the DIY ViDock. I boot up my laptop and go into the DIY ViDock Setup. I think I read somewhere that Lenovo whitelists their ports, so I have to anti-whitelist them. I do that, then plug in the ExpressCard adapter, power on the ViDock and press F5 to detect. However, the graphics card isn't detected. I also tried the method of booting up normally and then putting my computer on standby and powering on the ViDock, but all I get is a black screen on my monitor and no signal on my external CRT.
What I was wondering is: might I need a replacement mini HDMI cable?
Any help for this noob is appreciated, thanks! -
So I got a GTX 460, a PE4H and an Ultra mATX PSU, and made all the connections,
with:
LG P300
Core 2 Duo T8300 2.4GHz
4GB RAM
Win7 64-bit
I tried the boot disk like a million times with Setup versions 1.0e and 1.0e2,
but I would always end up at a Windows 98 command prompt,
and I'm not that good with the prompt, so I tried grub4dos but didn't understand it very well.
I tried without Setup, but both my laptop and monitor would be blank -
PanzerHauptmann Notebook Consultant
Try troubleshooting:
1) Download the DESKTOP Forceware drivers (258.xx) for the GTX 4xx. Don't run setup.exe; instead, extract it with 7-Zip or WinRAR.
2) Go to Device Manager, and turn on your PSU AND PE4H, so everything is ON but the ExpressCard isn't in the slot yet.
3) Now stick it in the slot, with the computer just at the desktop screen, and see what happens.
4) This is where you'll have to explain to me what it does, but for me, at least initially, Windows recognizes the device as a "VGA Controller", though it won't work at this point. You need to click on the VGA Controller, then Update Driver, then Browse my computer for driver software, and point it to the folder you extracted your Forceware drivers to. Windows will do its thing for a bit, installing the drivers manually, then it may say a reboot is required. (A scripted alternative to this step is sketched after the list.)
5) Unplug the ExpressCard from the slot and reboot. During the Windows start-up screen (make sure everything is ON), plug in the ExpressCard. The monitors (desktop and laptop) may flash a few times, but after 4-6 seconds the display will shift to the external one.
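If the Device Manager clicking in step 4 gets tedious across repeated card swaps, here's a minimal sketch of an alternative. It assumes pnputil.exe (which ships with Vista/7) run from an elevated prompt, and the extraction path and INF name are hypothetical examples that will differ per driver package:

```python
# Sketch: pre-stage the extracted desktop Forceware INF in the Windows
# driver store so the "VGA Controller" can be matched to it automatically.
# Paths below are hypothetical; adjust to wherever 7-Zip unpacked setup.exe.
import subprocess

EXTRACT_DIR = r"C:\NVIDIA\Forceware_258xx"          # hypothetical extraction dir
INF = EXTRACT_DIR + r"\Display.Driver\nv_disp.inf"  # INF name varies by package

# pnputil -i -a: add the package to the driver store and install it on
# any matching devices.
subprocess.run(["pnputil", "-i", "-a", INF], check=True)
```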
I tried the boot-up Setup files, but they didn't help, as my problem wasn't related to port or memory allocation or re-allocation, but rather was a device driver issue, which I soon discovered. (nando assumed this right off the bat, BTW, lol.)
This is what I always do whenever I buy a new card to test, and it always seems to work... one time I was stumped, it kept giving me error 43, so randomly I decided to unplug the EC (ExpressCard adapter) and re-plug it immediately after, and VOILA! The screen suddenly lit up beautifully..
To me, it sounds like a driver issue on your end too, which I had initially with the Nvidia card (the Radeon fired right up upon initial insertion and the Catalyst drivers installed).
If this doesn't work, then please provide more details of the errors you are encountering so we can all better diagnose the problem and try to figure out the best solution. But the #1 thing is you need the DESKTOP drivers. It doesn't matter if you have onboard GPU drivers installed; they get installed to different folders anyway. PM me if you need more help. -
OMG, so complicated. I follow these steps; maybe you should try them too:
1. Make sure the DIY ViDock is on.
2. Make sure the ExpressCard is out of your laptop.
3. Turn on the laptop.
4. Wait until Windows is ready before plugging it in. (If the ExpressCard is plugged in with the DIY ViDock enabled, I get a blue screen while Windows is trying to load.)
5. Done. (Within the next 10 seconds it's ready to play.) -
PanzerHauptmann Notebook Consultant
Your method won't work at all if the drivers aren't installed. At best, the computer will treat it as a "VGA Controller" and won't feed a signal to it, and if it does, the resolution will most likely be 640x480, the default that things revert to during an initial Windows installation or until the actual GPU drivers get loaded. -
Well... it worked for me. No need to get angry
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
1. Load the Optimus drivers and ensure your 8600M works using them. The 8600M is in the INF, so it is supported. This step is to ensure the loaded Nvidia driver supports both your 8600M and the GTX 460, which Nautis' Optimus driver does.
2. Reboot, do a standby, attach the GTX 460, and resume. Is the GTX 460 now being detected? If not, disable the 8600M just in case it's conflicting, then repeat the standby/resume and see if the GTX 460 registers in Device Manager. Do a 'scan for hardware changes' just in case.
As for DIY ViDock Setup 1.x, revert to using the disk image install. The USB method works too, as long as all the files are correctly copied: there must be an autoexec.bat and a config.sys in the USB thumbdrive's root directory.
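A trivial sketch to confirm the USB copy worked (the drive letter is an assumption; adjust it to your thumbdrive):

```python
# Sanity-check the DIY ViDock Setup USB stick: autoexec.bat and
# config.sys must sit in the thumbdrive's root directory.
import os

DRIVE = "E:\\"  # hypothetical drive letter of the USB stick
for name in ("autoexec.bat", "config.sys"):
    path = os.path.join(DRIVE, name)
    print(f"{path}: {'OK' if os.path.isfile(path) else 'MISSING'}")
```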
@AgentYura - could you do any Optimus versus non-Optimus benchmarks? We see that 3dmark06/Vantage scores increase substantially with the Optimus enhancements, but how much more FPS do real-world games see? I saw a minor improvement in FFXIV but a pretty big improvement in dmcv4 here. I speculate any DX9 games would see significant improvements, given that the DX9-centric 3dmark06 score increases by 2-3 times. -
-
It's in Korean, but I can see that it recognizes the GTX 460. The same problem repeats every time I reboot: it blinks, the laptop screen goes off, but my monitor won't turn on. It still says there is a problem with the graphics card.
I tried the Optimus driver, but I couldn't find the software and couldn't turn it off.
-
-
It's bizarre - the Radeon HD 5750 never had such issues. I could hot-plug it without a hitch. It's possible that I need to reconfigure my boot disk to a more suitable memory allocation, like Nando previously suggested. -
PanzerHauptmann Notebook Consultant
AgentYura's extraordinary results are quite easily explained: he has an i7 processor, and of course the DDR3 RAM doesn't hurt, either. The majority of systems that've been tested and benched have had the (by today's standards) mediocre-performing Core 2 Duo, a slowly dying breed. The i5, i7, quad-cores and hexa-cores just basically blow away any C2D. The times they are a-changin'.
In fact, I'd be disappointed if his setup had produced any lesser benchmarks.
Oh, on a side note, I'd like to ask everyone with DIY ViDocks: do you guys turn your cards off when you put your system on standby, or do you leave them on? Personally, I always turn mine OFF (and the power supply via SWEX) when I put my computer to sleep. -
Eggs Scrambled Notebook Evangelist
The Core i7s may be an improvement over the C2D, but you're giving them way too much credit for better FPS and such. A 2.8GHz C2D, for example, is better than a 2.4GHz (2.9GHz turbo) i5 at straight-up FP calculations, but a 2.5GHz (3GHz turbo) i5 edges out the 2.8GHz C2D.
Clock for clock, yes, the i5/i7 will beat a C2D, but if we're talking about severely bottlenecked GPUs stuck in an ExpressCard slot, major differences in FPS are likely due less to the CPU than to the efficiency of the ExpressCard slot. -
PanzerHauptmann Notebook Consultant
Anyway, if that's not the explanation, then what is (for AgentYura's impressive results)? -
Eggs Scrambled Notebook Evangelist
Any upgrade that drastic will result in a huge performance differential.
My point was that if someone gets in here and uses a P8700 (2.8GHz) overclocked to like 3-3.2GHz (relatively easy, assuming an unlocked PLL or such), the difference won't be as drastic. So it's not about an architecture change; it's about a raw power change. I just don't want people coming in here, thinking they want a laptop that supports a ViDock, and thinking they have to get a latest-generation notebook just for the i5/i7 architecture, as opposed to something last-gen but with a good C2D inside. -
Of course a machine equipped with an i7 will feel snappier and generally perform a lot better, but in the context of a DIY ViDock the CPU is not the major performance determinant; ExpressCard/mPCIe bandwidth, GPU architecture and driver optimization seem to be.
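That claim fits in a few lines (a toy model with illustrative numbers of my own, not measurements): the frame rate you see is roughly capped by the slowest stage, so once the link is the smallest number, a faster CPU changes nothing.

```python
# Toy bottleneck model: observed FPS is roughly the slowest of the
# CPU, the link and the GPU. All numbers below are illustrative.
def observed_fps(cpu_fps, link_fps, gpu_fps):
    return min(cpu_fps, link_fps, gpu_fps)

LINK = 27  # rough x1 ExpressCard 1.0 ceiling at 1920x1200 (see earlier sketch)
GPU = 90   # what a GTX 470 might manage unconstrained (illustrative)

print(observed_fps(cpu_fps=45, link_fps=LINK, gpu_fps=GPU))  # C2D rig -> 27
print(observed_fps(cpu_fps=70, link_fps=LINK, gpu_fps=GPU))  # i7 rig  -> 27, no gain
```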
-
Hello, I want to use a ViDock with a Precision M4500 notebook. Is it possible to display the signal on the internal LCD?
-
@pterodactilo It is possible, but with frame rate caps. There is a full explanation on the first page of the thread, under mini-FAQ question number 3.
-
Hi.
I've been trying to read through all of this, but there are just hundreds of pages across these threads, so I was wondering if someone could help me.
I know from opening up my laptop (Acer 5920G) that there is an unused mPCIe slot, and it also has an ExpressCard slot.
When I check, it only shows one unused PCIe port:
Does anyone know how to figure out which port is which? -
It's hard to tell how the PCIe lanes are wired on a particular machine just by using software. The easiest way of finding out is by moving mPCIe cards around and connecting a device to the ExpressCard slot. Everest should be able to tell you which port the device is connected to, and you can then figure out how they're all wired.
For instance, in your case, try moving your Intel WiFi card to the unused mPCIe slot, boot up Windows and check which port it shows up in under Everest. As for the ExpressCard, can you get your hands on any ExpressCard peripheral, like a WiFi adapter or a USB/FireWire card? If you plug any of those into the ExpressCard slot, Everest should be able to recognize the device and tell you which port that slot is wired to. Have a go at it and let us know what you find.
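If Everest isn't handy, a rough substitute for the hot-pluggable ExpressCard case is to diff Windows' device list before and after plugging the peripheral in. The sketch below leans on the standard wmic tool; unlike Everest it only shows what appeared, not the port number, so the wiring still has to be inferred manually.

```python
# Rough Everest substitute: snapshot the PnP device list, plug the
# ExpressCard peripheral in, snapshot again, and diff the two sets.
import subprocess

def pnp_devices():
    out = subprocess.run(
        ["wmic", "path", "Win32_PnPEntity", "get", "Name,DeviceID"],
        capture_output=True, text=True, check=True).stdout
    return {line.strip() for line in out.splitlines() if line.strip()}

before = pnp_devices()
input("Plug the ExpressCard device in, wait a few seconds, press Enter...")
after = pnp_devices()
print("Appeared:", *sorted(after - before), sep="\n  ")
``` -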
PanzerHauptmann Notebook Consultant
Is there any way (with adequate soldering skills and tech knowledge, of course) to detach the laptop's onboard GPU chip and solder on some sort of makeshift device that would route data from that old GPU "port" or socket and send it to the PE4H with the external graphics card attached?
Or, in other words, what is the absolute best method of achieving the highest bandwidth transfer rates with a DIY ViDock currently?
I assume x2 w/ Optimus drivers has been proven the best interface so far, right? -
-
PanzerHauptmann Notebook Consultant
On another note, I've been doing some stress-level testing with FurMark on my PE4H and GTX 470, due to some "blackouts" during gameplay. I'm using stock clock speeds, so it couldn't be due to OCing, which leads me to think it may have something to do with heat levels.
After several stability tests with FurMark, with settings at 1920x1200 fullscreen and 0xAA, each test got to about the 2:00 mark and "blackscreened" at 64C each time. I had fan speed at 100% for the duration of each test and no background programs running (except the firewall). So it seems as though, for whatever reason, once 64C is hit it automatically shuts down. I'm using a Corsair VX450 as the PSU, so I doubt it's a power supply issue. What further supports this theory is that the several other tests I ran with the card overclocked (715/1720) performed without error, and temps never passed 61C during each test.
Any ideas what can be done to work around this issue? Perhaps it's the PE4H shutting down due to overloading? I just don't know; this card should be able to reach well into the 80C+ zone and perform fine.
In a benchmark test, the GTX performed quite well, considering other results with comparable cards scored the same or lower on much more powerful rigs overall. The results of a bench run at 1920x1200 fullscreen can be seen here.
EDIT: Just performed another quick FurMark benchmark with settings at 1920x1200 and 8xAA. A 30000ms test (pretty quick), averaging about 50 FPS. Temps never exceeded 60C. Then I did another test, changing the AA from 8x to 32xAA. Temps never exceeded 60C, and halfway through the test it "blackscreened" again, freezing up like it does when temps pass 64C. So now I'm just stumped...
The only thing I can think of is that I have my PSU plugged into a surge protector with about six other power-hungry devices plugged into it. Does this have a wattage-reduction effect due to overload??? I have no idea; maybe that's a stupid theory, but to me it makes sense, in a way. I will try plugging the Corsair PSU directly into its own 110V wall outlet and repeat the test to see if the results are the same.
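One way to pin down whether it really trips at 64C would be to log the temperature once a second during the FurMark run; the last line written before the blackscreen then records the temperature at failure. A sketch below - it assumes a driver package new enough to ship nvidia-smi with --query-gpu support; if yours doesn't, GPU-Z's built-in sensor logging does the same job.

```python
# Sketch: append the GPU core temperature to a file once a second.
# Stop with Ctrl+C; after a blackscreen/reboot, the last logged line
# shows the temperature at the moment of failure.
import subprocess, time

with open("gpu_temp.log", "a", buffering=1) as log:  # line-buffered
    while True:
        temp = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True).stdout.strip()
        log.write(f"{time.strftime('%H:%M:%S')} {temp} C\n")
        time.sleep(1)
``` -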
Hi all, I've been following this for a few weeks now and decided to take the plunge. My new graphics card arrived today and all is running perfectly! I can't believe how well this all works!
I'm running an ASUS Radeon 4670 1GB DDR3 in a PE4H connected via the ExpressCard slot to my Acer Aspire 5720. I'm very lucky because the ExpressCard is actually on slot 1 and slot 2 is empty, so I've disabled slot 2 and set x2 on slot 1 for the x1E performance boost.
I'm currently running Win7 64-bit, chainloaded from a USB stick running DIY ViDock Setup v1.0e2.
Benchmarks (Integrated Intel X3100 vs external Radeon 4670):
Before:
3DMark Score 665
SM 2.0 Score 216
SM 3.0 Score 257
After:
3DMark Score 4825 (625% improvement)
SM 2.0 Score 2393 (872% improvement)
SM 3.0 Score 2696 (949% improvement)
Resident Evil 5 Benchmark Edition (All settings on HIGH):
Before: 3.0 FPS
After: 26.9 FPS (796% improvement)
Windows Experience Index (Win7 64-bit):
Before: 3.1
After: 5.3
With graphics going from lowest (3.1) to highest (6.7).
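(For anyone checking the percentages, they're just (after - before) / before; a one-line sketch using the overall 3DMark pair:)

```python
# Improvement percentage: (after - before) / before * 100
def improvement(before, after):
    return (after - before) / before * 100

print(f"{improvement(665, 4825):.1f}%")  # 3DMark overall -> 625.6%, the ~625% above
```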
All for under £150... I'd say that's a result
Thanks for all the help and advice everyone - I couldn't have done it without you. Apologies for the crappy pic, but I'm sure you all know what one of these looks like by now...
-
Hey, that Acer looks a lot like mine.
Would you mind posting a screenshot of your PCIe config from Everest?
And also, what do you mean by improving performance by disabling another slot? -
PanzerHauptmann Notebook Consultant
-
Here are screenshots of my PCIe config before and after running the boot disk... As you can see, slot 2 is missing and slot 1 is running in "x2" mode (although it's not really true x2, as explained in Nando4's post).
I actually went with the 4670 because it's extremely cheap (mine was under £50) and because it's the model that VillageTronic offered in the original ViDock, so I guess they knew what they were doing. In the end my boss expressed an interest in the project and bought the card for me anyway, so I can always upgrade later on if I'm not happy with it.
I'm not really a serious gamer; I just wanted to play Portal at higher than the 10 FPS or so I got before! -
The 5720 doesn't have a "gaming" GPU; in fact, there is physical space in the casing and solder pads on the motherboard for an MXM slot, but the connector itself isn't actually there.
Like I said in my previous post, I got very lucky with my port configuration. I guess you could beg/steal/borrow an ExpressCard device from somewhere to test it out and work out which port is which? -
PanzerHauptmann Notebook Consultant
My 1st card was a Radeon 5750. It scored nearly 10000 in 3dmark06, compared to 5000 with the onboard GT 130M GPU, and scored 7.1 for 3D gaming in the Windows Experience Index. All "tests" were outstanding; I was thrilled....
Then I tried to play a few games with some moderate-to-high settings (nothing crazy: 1920x1200, no AA, some bells and whistles here and there). My FPS were worse than my ONBOARD GPU!!! DiRT 2, for instance, was utterly unplayable with the 5750, getting around 10-15 FPS. Same went for Modern Warfare 2, MW1, BF:BC2, Batman, CoD4, etc.. FPS in the teens all around.
Then I got a GTX 460. It's renowned for its insane overclockability, and it performs as such... I ran 3dmark06 and scored 5000, less than what my internal GPU scored. Windows Experience Index? 6.1. LOL.
I wanted to max things out, so I decided to return the 460 and get the 470... All the benchmark data was the same. Crappy, crappy, crappy. Until I fired up Gears of War... then DiRT 2, BFBC2, L4D2, and Batman once again. It was quite pleasant to max out all the settings and get a buttery-smooth 30-40 FPS average in all of them.. DiRT 2 I can max out and get around 30 FPS constant.
DX10-rendered games tend to really excel with the Nvidia cards & ViDock setup.. DX11 even better.. Batman looks amazing at 32xAA, as does Mafia II with the GTX 470. I'll admit it, newer "sandbox"-style 3D games (Mafia II, Just Cause 2, etc.) will have trouble at times, with FPS dropping into the 15-20s, which is playable but not very enjoyable..
Good luck though, and congrats on your achievements...! -
Yeah, the x16 is (at least I think) the MXM; it sits on the northbridge.
My main concern was why I have an open mPCIe slot and an open ExpressCard slot while only one open port shows up in Everest. -
-
Is it safe to assume that port 1 is my ExpressCard slot, and that I can do x2 with ports 1 & 2?
I don't have any cards handy to plug in to test with.
Original:
After WiFi card switch:
-