I'm going to buy a PE4L/PE4H adapter.
Is there any difference between the PE4H ver 2.4 and the PE4L ver 1.4 if I'm going to use ExpressCard 1.0 x1?
-
No. The only differences in this case are that the PE4H provides a better stand due to its longer board, has more activation timeout options, and can be powered with a standard laptop PSU of up to 19V. If you plan to use an ATX PSU, only the stand difference remains. For most users planning an x1 setup, the PE4L provides all the functionality needed.
-
OK, thanks for the explanation. Those timeout options, are they something important (am I going to have problems without them)?
-
Just gave it another try. Same results, but this time I noticed a really quick BSOD.
-
Depends on your laptop. It's just a convenience feature, nothing important. If you need to use the sleep-and-wakeup method, you don't use this at all. And if your BIOS takes no more than 7 seconds to POST (that's the maximum timeout on the PE4L; the PE4H has an additional, longer setting), you have no use for the PE4H's extra timeout either.
-
Could this perhaps be something to do with DIY ViDock Setup 1.0e8?
I'm chainloading into Windows 7 and using 36-bit compaction. -
IDK exactly which card you have, but this card - Newegg.com - EVGA 01G-P3-1430-LR GeForce GT 430 (Fermi) 1GB 128-bit DDR3 PCI Express 2.0 x16 HDCP Ready Video Card - on Newegg lists a 22A minimum. I know you said you bought a new PSU, but you apparently bought the wrong one (just get the one nando4 recommended, as it works for him, me, and many others without fuss). This is the PSU: Newegg.com - CORSAIR Builder Series CX430 CMPSU-430CX 430W ATX12V Active PFC Power Supply
Your PSU is almost certainly the issue (as it was for prikko here, only a few pages back).
-
Nando told me this PSU would be alright. He said he's running a 460 on a 300W unit at 16A. Well, mine's a 450W at 16A and only running a 430. It should be ample. But I'll give it a go. Hopefully that'll fix it.
-
Like I said, what is your exact video card? The card I quoted is a 430, but it might have different power requirements than yours... but if the card calls for 22A (like the one I quoted did), you will need 22A at minimum (16A isn't 'close enough' for it to work).
The 22A requirement (again, that's from the card I quoted, not necessarily yours) applies to the +12V rail alone; the +5V and other rails do not matter. You cannot just look at the overall wattage and think it has enough oomph, as that figure includes all the rails, not just the +12V.
Also, the PSU that I am using (and recommended, and that nando4 usually recommends) has 28A on the +12V rail, more than enough headroom to power your card running at full tilt. -
Not all PSUs are created equal, and BS stickers or sneaky games with rail ratings are common. I also think you misinterpreted some of nando's posts.
"450W" with 16A on 12V is a cheap piece of junk, sorry.
Only 192W of that "450W" is available on 12V, which is downright insulting to other modern computer PSUs that don't play dirty tricks.
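For anyone following along, the arithmetic behind that 192W figure is just volts times amps on the +12V rail. A minimal sketch (the amperage numbers are the ones quoted in this exchange, and the card-box "22A minimum" is a whole-system recommendation, not the card's own draw):

RAIL_VOLTAGE = 12.0

def rail_watts(amps):
    # Usable power on the +12V rail: volts * amps.
    return RAIL_VOLTAGE * amps

print(rail_watts(16))  # 192.0 W -- the "450W" unit with only 16A on +12V
print(rail_watts(28))  # 336.0 W -- the Corsair CX430 recommended in the thread

# A GT 430 that needs roughly 60W by itself fits either rail on paper, but the
# card-box minimum assumes a whole desktop (CPU, drives, fans) hangs off +12V too,
# and a weak 192W rail with poor regulation leaves little margin for any of it.
-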
Like I said a few pages ago, yes, the PSU is important, but please don't overinflate the power ratings! 22A is the recommendation for the entire system, if you were running a full desktop computer on it. Only very high-end GPUs can draw 20A alone.
"450W" with only 16A has got to win the prize for the sketchiest PSU labeling I have ever seen, though. If you have a unit like that, chances are it's built so terribly that the voltage regulation, among other things, is horrible; no surprise your card can't run on it even if the card's power draw is way below spec. -
Those were my exact thoughts. It isn't running an entire rig, just a card. The specifications Agent 9 is talking about are the same as mine, but in the small print on the box, it says that's what's needed for a full blown gaming rig. It says on the box that 300W and 22A minimum are needed on a system running a 3.2GHz i7. Well, I'm not even running that.
But yeah, I sent a message to the dude who sold me the PSU, and he's refunded the money for it, so I'll just buy another one. -
Alright, so I'm looking at this one. It should be alright, methinks.
-
A 430 GT needs 60W... -
If anyone is interested in picking up an adapter, I have one on sale on ebay. Search up "PE4L - PCIe adapter". I no longer need it as college is over for me.
PE4L - PCIe adapter | eBay -
Will this work with my Lenovo T500?
-
I bought a PE4H + PM3N for my Dell Latitude D630, along with an HD5670 and a fresh install of Win7. It works, but it seems like I'm not getting the numbers I should be. In 3DMark Vantage it scores 2800 on the Performance preset; the published numbers suggest the card can do 5000-6000. That's about 55%.
The other thing is, I can't get x1E to work. I have the PM3N in slot one, I removed the WWAN card from slot two, I bridge compact, set x1E (also tried x2) at odd port 1, chainload into Win7, Starting Windows comes up, then a blue screen with STOP: 0x0000005 and some similar numbers. According to what I've read, I can run at x1E, so any help would be welcome.
Thanks. -
The setup utility will show whether the port the card is on is set to x2, and whether the card is actually running at x2. For Nvidia cards it's possible for the port to be set to x2 while the card still runs at x1; I don't know about ATI cards. Also, some cards crash when you change the port to x2 and require the computer to be restarted. After setting the port to x2, check how the utility says the card is configured.
-
Hello, I am new to this forum. I have been trying to get the DIY ViDock to work for the past 2 months, and I am at a loss.
I have gone through all 300 pages of information. Some of it seems a bit complicated, but I have still tried, and to no avail: I have had no luck with nando4's solutions 1-6 for error 12.
I have a Toshiba Satellite P505D-S8935.
Originally I used the Gigabyte ATI 5750 that was recommended, but could not get it recognized using Setup or the ExpressCard switch-out method.
I have now switched to a Palit GeForce 450. In Setup 1.x it recognizes the card but freezes on startup. In Device Manager I still get the same error message, "this device isn't using any resources because there is a problem", whether on startup, via Setup, or with the switch-out method. I don't know what else to do at this point. Can someone help?
Thank you -
Apparently there are issues with the GTX 400 series of Nvidia graphics cards: desktop computers will lock up or freeze when using one of these cards. Would that be an issue for a DIY ViDock? I might decide to go with an AMD 6850 over an Nvidia GTX 460 just to avoid the lockup/freeze issue.
-
I also wondered about the VPC-Z's WWAN PCIe slot, but AIDA/Everest doesn't show it; I don't know why. Is it because it's not electrically linked to port 1?
If only it were wired like port 4, so we could try an x2 system with EC2C + PM3N ^^
Otherwise, what about a tool like this?
(It doesn't exist; I just mocked it up from existing pics of a PM3N and an XCEX.)
I thought about it when I saw this from hwtools.net
I don't really understand why anyone would need that tool, since it has to be inserted into the SD slot just so you can insert an SD card into the inserted tool.
It would be much better to get an HDMI adapter like this one ^^
-
-
The WWAN slot (Gobi 2000) is connected to USB 2.0 only. The VPC-Z only has 3 PCIe ports; the advanced BIOS also shows only 3, and I don't think more than 3 are wired on the mobo. x2 on this machine is only possible with ports 1+2, by unsoldering the RICOH card-reader chip / modifying the card-reader/HDMI daughterboard.
Seconded that this adapter doesn't make sense. BTW: SD and PCIe are anything but electrically compatible.
-
Haha, OK ^^
I give up on doing anything other than my current x1.Opt setup ^^ -
This seems like a really cool way to overcome the problem that we all have with laptops, having to choose between power and portability!
Is this list (ty #301) still complete/optimal in regard to what you need to make this whole thing work? There was also a list in the original post that seemed a little more basic.
1. PE4L
2. GFX of choice (any limitations here that I should know about?)
3. 12V/120W AC adapter with 2.5mm/5.5mm (or 5mm) barrel end
4. Molex splitter/gender changer, with female ends
5. PCI-E to molex (male) adapter
And now to the things that I don't know..
What are the ways that you can connect a vidock-setup to your laptop?
How much is performance hurt by the fact that it's not integrated into the computer like it's "supposed" to be?
Reading the whole 350+ pages seems like quite the task, so I'll save that for later.
-
Thank you for the reply.
Originally I could not get the PE4L to recognize the 5750, and it always gets whitelisted no matter what card I use, even when I swap the ExpressCard out. I've also tried every possible configuration trying to get the card out of the "this device isn't using any resources because there is a problem" error on the Resources tab of the video card. Do you have any other ideas? It was weird because the NVIDIA card was recognized but also gave me the same error. I am going to try the new video card as well; if you have any other suggestions, let me know.
Thank you -
User Retired 2
Intel has locked down their Series-6 chipset. The x16 dedicated graphics port and the x2/x4 width settings for the mPCIe/ExpressCard PCIe ports can only be set via the flash descriptor, which requires pretty advanced modding to change. Setup 1.x can no longer change these.
wicked20 said: ↑ Setup 1.x's (Video card->Hybrid Graphics.dedicated=off) is not working for my L502x.
Does anyone know when Setup 1.x might support this?
Could someone w/ Nvidia Hybrid share similar experiences or problems and solutions?
I also tried 32-bit W7, but same problem w/ "Nvidia Display settings are not available"
The experimental solution to get an x1.Opt setup on your IGP+NVidia hybrid graphics Series-6 system would be:
- Disable the NVidia component of your hybrid graphics in Win7's device manager (see the sketch after this list).
- Install the Series-6-chipset-aware Setup 1.0f-pre7 from here. Do a 'Video cards.Initialize' on the desktop GTX460. If using 4GB of RAM, perform a 36-bit PCI compaction in Setup 1.x and force your IGP and dedicated graphics to 32-bit; this in effect hijacks the 32-bit resources used by the onboard NVidia GPU. If using 2GB, just perform a 32-bit PCI compaction on all the GPUs.
- Install the 270.61 Verde driver + nando4's nvam.inf from here.
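If you prefer to script that first step instead of clicking through Device Manager, Microsoft's devcon utility (from the WDK/Support Tools) can disable a device by hardware ID. A minimal sketch, assuming devcon.exe is on your PATH and you run from an elevated prompt; the DEV_XXXX value is a hypothetical placeholder for your onboard NVidia GPU's actual hardware ID (read it from Device Manager -> Details -> Hardware Ids, and don't use a broad *VEN_10DE* pattern or you'll also catch the external card):

import subprocess

# Hardware ID of the onboard (hybrid) NVidia GPU -- placeholder, not a real device ID.
ONBOARD_GPU_ID = r"PCI\VEN_10DE&DEV_XXXX*"

# 'devcon disable <hwid>' does the same thing as Device Manager's "Disable";
# a reboot may be requested before the change fully takes effect.
subprocess.run(["devcon.exe", "disable", ONBOARD_GPU_ID], check=True)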
There's a problem with using the PE4L 1.5 or PE4H 2.4 PCI Reset delay on drivers prior to 270.51. So set that to the 0sec position while testing AND use 'Video Cards->Initialize' with Setup 1.0f-pre7, OR load the latest Verde 270.61 + nvam.inf from here.
theremin123 said: ↑ I have now switched to a Palit GeForce 450. In Setup 1.x it recognizes the card but freezes on startup. In Device Manager I still get the same error message, "this device isn't using any resources because there is a problem", whether on startup, via Setup, or with the switch-out method. I don't know what else to do at this point. Can someone help?
Thank you
Ensure that chainloading to Win7 presents no errors. Then enable one thing at a time: (i) x1E port mode and (ii) 32-bit PCI compaction on your video cards. A WWAN card typically uses USB pins, so it does not need to be removed.
radar816 said: ↑ The other thing is, I can't get x1E to work. I have the PM3N in slot one, I removed the WWAN card from slot two, I bridge compact, set x1E (also tried x2) at odd port 1, chainload into Win7, Starting Windows comes up, then a blue screen with STOP: 0x0000005 and some similar numbers. According to what I've read, I can run at x1E, so any help would be welcome.
-
A NVidia GTX 580 x1.Opt implementation - where are the limits of Optimus?
________________________________________________________________
We already know that Optimus brings a significant speed boost to our systems by compressing data streams over PCIe 1x. Now I went out and got a GTX 580 card (MSI N580GTX TwinFrozr II) and hooked it up to my DIY ViDock system. The GTX 580 is currently the strongest single-GPU card.
My intention was to figure out how far Optimus is able to scale performance with high end cards. In this post, you'll find my different benchmark results and some brief observations.
Hardware Setup
- Laptop: Sony Vaio VPC-Z11
- i7-620M (2.66 GHz, during all benchmarks, it ran at constant 3.06-3.19 GHz Turboboost)
- HM57 Chipset
- 8 GB DDR3 (running @ 1066MHz)
- Intel HD Graphics
- Geforce GT330M deactivated in BIOS (static switching mode)
- PE4L-EC2C hooked up to a 650W Corsair PSU
- 1.2m mHDMI cable between PE4L and EC2C
- Windows 7 Pro x64
GPU-Z info about GTX580 in DIY ViDock:
Note that the Series-5 chipset claims a PCIe 2.0 connection but runs at only half the 2.0 bandwidth (= PCIe 1.1 bandwidth, 2.5 GT/s).
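For a rough sense of what that link speed means in raw throughput, here's a back-of-the-envelope sketch (it ignores protocol overhead beyond the 8b/10b line encoding):

def lane_bandwidth_mb_s(gt_per_s):
    # GT/s -> usable MB/s per lane: 8 data bits per 10 transferred bits, 8 bits per byte.
    return gt_per_s * 1e9 * (8 / 10) / 8 / 1e6

print(lane_bandwidth_mb_s(2.5))  # ~250 MB/s per lane: what this x1 link actually delivers
print(lane_bandwidth_mb_s(5.0))  # ~500 MB/s per lane: what a full PCIe 2.0 x1 link would give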
Driver Setup:
Intel: 2342 x64
NVidia: 270.61 x64 WHQL w/ nando4's inf, default driver settings, PhysX set to CPU
Connecting DIY ViDock:
- Boot Win7
- Switch on 12V and 5V on PE4L board, as well as 12V via PCIe connectors
- GPU fans start spinning
- Plug in EC2C card, install drivers, reboot (No sleep-wakeup required)
- Remove EC2C when powering up the laptop (black screen lockup when connected on BIOS POST)
- Bootup and reconnect to the running machine, set your monitor connected to DIY ViDock as primary screen
- No DIY ViDock setup tool needed.
Benchmark Results
DX9
3DMark06 (default settings, 1280x1024):
  Overall: 16270 | SM2.0: 6769 | SM3.0: 9095 | CPU: 3227
  Proxycon: 55.75 FPS | Firefly: 57.06 FPS | Canyon: 111.75 FPS | Deep Freeze: 70.15 FPS
RE5 DX9 (1280x800, variable, default settings):
  Area 1: 200.4 FPS | Area 2: 95.5 FPS | Area 3: 101.7 FPS | Area 4: 99.9 FPS | Average: 117.5 FPS
DX10
3DMark Vantage (default settings, DX10, 1280x1024):
  Main score: P13847 | GPU score: 17956 | Scene 1: 55.14 FPS | Scene 2: 49.97 FPS
dmcv4 DX10 (default settings):
  Scene 1: 218.00 FPS | Scene 2: 193.11 FPS | Scene 3: 230.74 FPS | Scene 4: 128.79 FPS
Power consumption and temperatures
(max. Temp/max. Power is under full load conditions)
           max. Temp       max. Power   max. Clocks        Idle Clocks    Idle Power
GTX 580    56°C (132°F)    330W         800c/1024m/1600s   50c/67m/101s   45W
i7-620M    84°C (183°F)    -            3.19GHz            1.33GHz        -
Notes:
- CPU load during all benchmarks and gaming never went over 70% on two cores (w/ and w/o HT enabled). I think this could mean that compression of the data stream is saturated, and a stronger CPU (i7-640M, or SB CPUs running x1.Opt over a PCIe 1.0 connection) may not improve performance any further.
- Overclocking the card (up to 20%) does not give noticeable performance boosts in DX9; in DX10 a slight but not very significant boost (5-8%) can be measured.
- Deactivating the Nvidia HD Audio controllers gave a slight performance improvement, most likely due to freeing some PCIe bandwidth.
- Most games and benchmarks rarely use more than 500-600 MB of VRAM, and the limited bandwidth shows when new textures are needed, resulting in fps drop-offs/stuttering.
- I was not able to make Optimus output to the internal laptop screen; can anyone confirm?
- Plug-and-play works great without sleeping the PC, but there seems to be a hardware reassignment after detection of the GPU, so network adapters get disconnected and reconnected (takes about 5-10 secs).
- The setup worked out of the box and is ultra stable so far. With this setup I played through Crysis 2 over the last few days at FullHD resolution with extreme details; it runs great.
Conclusion:
The GTX 580 is an awesome GPU and seems to get the most out of the x1.Opt link with a Core-i CPU. Anyhow, DX9 performance does not show a noticeable gain over a GTX460 (which is 3x cheaper and consumes only ~33% of the power). In DX10, this Fermi chip is able to show its great potential and leaves weaker GTS/GTX 4x0 cards in the dust - a 30% DX10 performance gain over a GTX460 is realistic. If you have an Intel Series-5 chipset (Core-i CPU with Intel HD Graphics), a GTX460 still gives you the best bang for the buck atm.
Acknowledgements:
Thanks and kudos for all the great work in this topic go to:
nando4, Nautis, MikjoA, all others who worked on x1.Opt stuff and posted their setups, and not to forget NVidia. -
King of Interns
Is there any ATI equivalent of Optimus in the pipeline?
-
Nice writeup. It would be interesting to see how a 580 scales with PCIe overclocking. Feel free to send me your 580 to do x2 testing!
pyr0 said: ↑ A NVidia GTX 580 x1.Opt implementation - where are the limits of Optimus?
Also, I made a post on the Nvidia Optimus forums asking about getting around the Optimus hardware requirements: Optimus with 2 Nvidia cards? (no Intel IGP) - NVIDIA Forums -
Guys, my current Acer Aspire 7720 is dying on me, and I figured it is time to get a new notebook.
I want the focus to be on an SB system which has decent GPU capabilities of its own but will make optimal use of my ViDock when I want to use it for gaming etc.
My current interest goes out to the Clevo W150HR/W170HR.
These systems have everything I need: a relatively low price (less than 1000 euros), a decent midrange dGPU with Optimus for improved portability/battery life on the iGPU, and most importantly an Optimus VD setup.
The problem I have with these systems is the number and accessibility of the mPCIe slots. The W150HR shows two slots: one for a half-height WLAN card with PCIe and USB interfaces and one for a mini-card with a USB interface only. The W170HR shows only one slot: one for a half-height WLAN card, again with PCIe and USB interfaces.
Can anyone provide some advice on the usability of a VD with these systems, more specifically whether the PE4H+PM3N will work with them? I am not familiar with half-height mini-cards, but Google shows these are half the size (duh) of normal mini-cards, and I am wondering if it will fit.
If these systems are not suitable, are there any other SB systems which would meet my requirements? -
Hello everyone,
I am at a loss with no knowledge of hardware.
I am really eager to do a ViDock, but I have no ExpressCard slot. On the first page of this thread, item 3 says that even without an ExpressCard slot it's possible to do a ViDock via the mPCIe slot.
But how do I find out whether I have one or not??
What does it even look like??
Additional information I can give:
I have no HDMI port.
From AIDA I found out:
PCI-E 2.0 x1 port #1 Empty
PCI-E 2.0 x1 port #2 In Use @ x1 (Dell DW1501 Wireless-N WLAN Half-Mini Card Network Adapter)
PCI-E 2.0 x1 port #3 In Use @ x1 (Atheros AR8132 PCI-E Fast Ethernet Controller)
PCI-E 2.0 x1 port #5 Empty
I have SD card slot.
There is an ATI HD4330 512 VRAM in my laptop. (4 GB RAM)
No internal graphics support.
So how do I do the ViDock??
Please give me a solution. -
pyr0, great job! Thank you for sharing your amazing results. I'm thinking about getting rid of my GTX 460, since I have lag/stutter in some games such as GTA 4 or Prototype(!). Could you please try playing GTA 4 and share your experience?
-
@Pyro
I did a test on the internal display with an Asus GTX 580 and got a score of 10554 in 3DMark06, so it is working. One strange thing I have noticed, though, is that I get about a 1k higher score if I use Setup 1.x to configure the PCIe ports as x2. Maybe this is because the Ricoh controller gets removed from the bus...
Result -
I got it working as well, but I'm having crappy performance with it. I'm using the 270.61 drivers, and I can only get it to output to the internal LCD when booting with the GPU connected (I plug it in right after POST, when RAID is checked).
trylle said: ↑ @Pyro
I did a test on the internal display with an Asus GTX 580 and got a score of 10554 in 3DMark06, so it is working. One strange thing I have noticed, though, is that I get about a 1k higher score if I use Setup 1.x to configure the PCIe ports as x2. Maybe this is because the Ricoh controller gets removed from the bus...
What I don't understand is what the Ricoh controller has to do with x2 mode. It is on port 2 and the EC is on port 3. If you wanted to set up x2, you would need p1+2 or p3+4 - am I missing something?
When I reconfigure the ports to x2 mode, the Ricoh controller disappears; only port 1 and port 3 remain. I am only using a single cable to the x1 port on the dock, so I think it is still x1 even though the ports are configured as x2.
Unless I am mistaken, I would need a second cable to port 4 if I were to get x2 running, and I don't think this is possible for us. So I am also at a loss as to why I get better performance this way...
I also have to disconnect the dock while the BIOS boots. Same as you, I can connect it after the BIOS is done, or I can connect it with Windows suspended.
#2 could be a usable mPCIe port... as for the others, you'd have to take a look at them.
Gamer90 said: ↑ PCI-E 2.0 x1 port #1 Empty
PCI-E 2.0 x1 port #2 In Use @ x1 (Dell DW1501 Wireless-N WLAN Half-Mini Card Network Adapter)
PCI-E 2.0 x1 port #3 In Use @ x1 (Atheros AR8132 PCI-E Fast Ethernet Controller)
PCI-E 2.0 x1 port #5 Empty
It looks like this:
http://tombeauchamp.com/misc/ssd and wlan slot.jpg
http://www.lckdanny.com/images/dell_1555/18.jpg -
wicked20 said: ↑ Some progress: I got the GTX460 running 3DMark w/ an external HDTV, but it crashed halfway. Strange that I cannot get that working again; instead I get a BSOD at login, or error 43, or the other previous problems.
wicked20 said: ↑ Hi,
I'm hoping more users can help test and improve since many newer laptops could be similar to mine.
History: Got HD5870 working w/ Setup 1.x 3DMark06 = 18024
Now I'm using only 2GB RAM,
Device Manager recognizes GTX460 fine but using Nvidia Property says:
"Nvidia Display settings are not available"
"You are not currently using a display attached to an NVIDIA GPU"
W7-64-Ult can sometimes output 1080p from the GTX460, but it causes a BSOD.
W7-64-Home Prem (shipped w/ the system) never outputs 1080p from the GTX460, and no BSOD. However, using the new 270.61 from laptop2go, Nvidia Properties recognizes the GTX460 only as PhysX. I tried all 4 previous versions of the drivers: notebook, desktop, and Nando-modded inf. I can use MSI Afterburner to control the GTX460 fan speed.
We appreciate any ideas.
Much thanks again.
Could someone w/ Nvidia Hybrid share similar experience or problems and solutions?
I also tried 32bit W7 but same problem w/ "Nvidia Display settings are not available"
Much thanks.
It seems inconsistent: sometimes I get 256M free in Setup 1.x and other times 512M, doing the same thing???
Could someone help me modify the inf:
525m hardware ids:
VEN_10DE&DEV_0DF5
I may have not done it correctly for desktop, notebook, laptop2go.com and 270.61 Verde driver + nando4 nvam.inf versions.
Much thanks to all.
Since someone got this working with the L502X, does this mean I can use a ViDock with the Thinkpad T420? Or should I go with something that isn't Sandy Bridge to ensure I can use a ViDock? I'm close to pulling the trigger on a T420 with Intel HD graphics, but I'm not sure if it'll work with a ViDock, so I'm hesitating.
-
Got my Acer Extensa 5420 running an HD 3450 card, just for testing purposes. Everything works great except I am getting some "flashing" from the screen: anytime I move anything quickly on the screen, it goes black for a moment - that includes opening new windows or even moving the mouse. I don't know if it is an issue with my card or with my setup; it didn't seem to do it before I loaded updated drivers from ATI. Hopefully when I test with a different card I won't have the issue.
-
I'm still getting a STOP: 0x0000005c error when I try to boot after enabling x1E/x2 on my Dell Latitude D630/HD5670.
I'm running Windows 7 and have the latest drivers, and the card works in x1 mode. I took my wireless card out of slot two, and slot one has the PM3N.
I boot into ViDock
Chainload...OK
Set slot one to x1E...OK
Chainload...OK
Compact (bridge mode)...OK
Chainload to Win7...Blue Screen.
My other question is if x1E performance is equal to x2 performance. I want to know if I should keep trying for x1E or just spend $20 to get another PM3N and hook up x2. -
User Retired 2
Your D630 can do x1E, x2 and x2E. x2E is an x4 link using only the first 2 lanes and is 15-30% faster than x2, which is 15-30% faster than x1E, which is 15-30% faster than x1. See JamesBond007's P8400+HD5770@x2E implementation results.
radar816 said: ↑ My other question is if x1E performance is equal to x2 performance. I want to know if I should keep trying for x1E or just spend $20 to get another PM3N and hook up x2.
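To put those per-step figures in perspective, they compound; a quick back-of-the-envelope sketch (just multiplying out the 15-30% ranges quoted above, so these are rough bounds, not measurements):

# Each link-width step (x1 -> x1E -> x2 -> x2E) is quoted as 15-30% faster than the previous one.
low, high = 1.15, 1.30
for steps, name in [(1, "x1E"), (2, "x2"), (3, "x2E")]:
    print(f"{name}: {low**steps:.2f}x to {high**steps:.2f}x over plain x1")
# x1E: 1.15x to 1.30x over plain x1
# x2: 1.32x to 1.69x over plain x1
# x2E: 1.52x to 2.20x over plain x1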
Based on your last message, it appears that the Bridge compaction is causing the BSOD. I suggest doing a 32-bit compaction using only the onboard + external GPU, ensuring you have a \config\devcon.txt present. The latest Setup-1.0f-prex from here has deprecated Bridge compaction in favor of a selective GPU compaction and is recommended for your purpose.
Added to nando4's 270.61 64-bit nvam.inf here. If you need to further mod that file, just search for "Nando4" to see the three sections where edits were made.
wicked20 said: ↑ Could someone help me modify the inf:
525m hardware ids:
VEN_10DE&DEV_0DF5
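For anyone needing to do the same for a different mobile GPU, the general shape of such an inf edit is to add the device to a device-list section and give it a display name under [Strings]. This is purely an illustrative sketch - the actual section names and the install section it points at vary between driver releases, so copy the pattern from an existing GeForce 500M line in the modded nvam.inf rather than pasting this verbatim:

; Illustrative only -- match the section and install-section names used in your nvam.inf.
[NVIDIA_Devices.NTamd64.6.1]
%NVIDIA_DEV.0DF5% = Section001, PCI\VEN_10DE&DEV_0DF5    ; GT 525M, added alongside the existing entries

[Strings]
NVIDIA_DEV.0DF5 = "NVIDIA GeForce GT 525M"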
If those games are optimized for DX9, then you may find an HD5770 or better outperforms a GTX460. A GTX580 will not get rid of the lag and at best gives a 30% improvement over a GTX460 at substantially more cost. I would suggest picking up a cheap HD57xx/HD68xx for such occasions instead.
AgentYura said: ↑ I'm thinking about getting rid of my GTX 460, since I have lag/stutter in some games such as GTA 4 or Prototype(!). Could you please try playing GTA 4 and share your experience?
Hi, I saw this topic over a year ago when it was a new idea, and just recently came back to see the progress. In the past, I was hesitant about this solution because it would only allow the use of an external display, but I noticed there is now an option for using the internal display with Optimus. My question is: with the system in my signature, can I use the Optimus method? If so, could someone explain what I would need to buy/do, as even after reading the explanation in the first post, I still have no idea of the steps for making this work. Can I just disable my ATI graphics card in the BIOS, or is the Setup 1.x thing necessary?
-
Optimus only caters to laptops with Intel HD integrated graphics and external Nvidia GTX 4xx/5xx cards.
XFlameWithin said: ↑ My question is: with the system in my signature, can I use the Optimus method? [...] Can I just disable my ATI graphics card in the BIOS, or is the Setup 1.x thing necessary?
-
Yes, that's true, but the T500 has switchable graphics, and you can turn off the dedicated GPU and use only the integrated 4500M. Would that allow for Optimus?
ruhtraeel said: ↑ Optimus only caters to laptops with Intel HD integrated graphics and external Nvidia GTX 4xx/5xx cards.
-
Much thanks.
nando4 said: ↑ Added to nando4's 270.61 64-bit nvam.inf here. If you need to further mod that file, just search for "Nando4" to see the three sections where edits were made.
Sadly, it's still not working: same results as before.
Do you know if your modded nvam.inf can be used with desktop drivers? What would we need to do to use desktop drivers - rename nv_disp.inf to nvam.inf?
Why do desktop drivers have nv_disp.inf instead of nvam.inf?
Much thanks again. -
Now this is confusing: you're recommending ATI over Nvidia. I know, it's stupid that I still haven't bought the whole thing, for a particular reason. Thank you nando4 for making a GPU upgrade for my low-performance, high-cost portable computer possible.
nando4 said: ↑ [...] If those games are optimized for DX9, then you may find an HD5770 or better outperforms a GTX460. A GTX580 will not get rid of the lag and at best gives a 30% improvement over a GTX460 at substantially more cost. I would suggest picking up a cheap HD57xx/HD68xx for such occasions instead.
-
I think it should work...
XFlameWithin said: ↑ Yes, that's true, but the T500 has switchable graphics, and you can turn off the dedicated GPU and use only the integrated 4500M. Would that allow for Optimus?
If it were me, I would buy it, set it up, and try to get it working... but that's me. -
So I bought a PE4H and GTX460 in an attempt to build a ViDock for my new Thinkpad X220...
Everything seemed to be working fine after the initial setup. I installed the Verde 270.61 drivers and the graphics card showed up in Device Manager. I ran some tests, played around for a few hours, then rebooted, and since then it's been a consistent BSOD after every Windows login.
The blue screen says something about drivers mismanaging system PTEs. I guess I missed something in the setup, just not sure what.
On a possibly related note, I tried to follow nando's instructions on overwriting the nvam.inf file, but can't find any file by that name to overwrite. Searched and nothing. Still need to get the Optimus drivers working, if I can get past the BSOD monster.
Anyway, hope you guys can help. I've been searching the forum for answers but frankly, a lot of this stuff is over my head. I could use some direction troubleshooting. Thanks