The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Z1 + ViDock + GTX 570 > FTW (the Z2/PMD killer)

    Discussion in 'VAIO / Sony' started by ComputerCowboy, Jul 29, 2011.

  1. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Sadly my ViDock 4+ has now completely failed. I was using the delayed-start workaround (plugging in the ExpressCard just after powering on the laptop), and the card was working OK until last night, when:

    1. The 2 x monitors (running from the ViDock's DVI) switched off on their own. These 2 monitors were the only ones connected at the time, and the laptop was running in Stamina mode.
    2. I heard crackling noises coming from the ViDock
    3. Smoke was rising from the ViDock

    I turned off all ViDock power quickly. Inside the ViDock there is evidence that one of the components has burned out! I have contacted vidock support, and am waiting for a response. Not happy :(
    [image: the burned component inside the ViDock]
     
  2. Qaenos

    Qaenos Notebook Consultant

    Reputations:
    0
    Messages:
    134
    Likes Received:
    0
    Trophy Points:
    30
    This is really bad news. Anxiously awaiting word on what the cause was. I'm following this thread and the potential Z2/LightPeak ViDock solution closely, as I really want to pull the trigger on the Z2 if it can be used the way you guys have these Z1s with the ViDock 4+. However, if all this hacking is unsafe, i.e. causing component failures like this one, it might just not be worth it...
     
  3. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    ViDock 4+ is underpowered for a GTX570

    Not surprising if you were driving a GTX570 - they often require in excess of 225W peak power, as shown here. The ViDock 4+, being a 225W device, would be overdriven, hence what appears to be a power fuse blowing.

    Good enough reason to seek a refund?

    For that reason, as well as the ViDock 4+ being found here and here to not be pci-e 2.0 compliant contrary to the advertising, you may consider seeking a refund and implementing a DIY eGPU instead. It will be cheaper, have no power issues, soon be pci-e 2.0 compliant upon release of a new cable, and has a $20 mPCIe connectivity option. The last is useful for attaching it to, say, a 900P 13" Asus UX31 ultrabook if it has an accessible mPCIe slot.

    See the DIY eGPU experiences thread if interested. There are several Z11-Z13 DIY eGPU implementations shown there.
     
  4. TofuTurkey

    TofuTurkey Married a Champagne Mango

    Reputations:
    431
    Messages:
    1,129
    Likes Received:
    2
    Trophy Points:
    56
    This thread no longer makes me want to spend my monies :|

    I'm sorry for your loss :( I hope the monitors and GPU card are ok.
     
  5. Qaenos

    Qaenos Notebook Consultant

    Reputations:
    0
    Messages:
    134
    Likes Received:
    0
    Trophy Points:
    30
    I hope I am not hijacking the thread, but is the DIY eGPU solution for the Z2 in the same state as the ViDock solution, i.e. waiting for Light Peak specs / cables / adapter?
     
  6. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    bplus have expressed interest in making a Thunderbolt eGPU solution, for which, as you know, the transmitter/receiver chips and specifications are difficult to obtain. LP is in the same category. bplus haven't detailed any plans of making a Sony LP solution.

    Since bplus are very close to getting pci-e 2.0 transmission going with an active cable, the quickest way to get a solution would be lifting the HD6650M dGPU off the PMD, attaching an interposer chip in its place with 4 mHDMI connectors, then threading 4 pci-e 2.0 cables, one per lane, out of it to connect to a PE4H. You'd also want to thread the DVI/HDMI from the video card to attach to the DVI lines if wanting to drive the internal LCD. This I detailed here.

    The hardest part would then be designing the interposer chip and getting the IR BGA gear to swap the interposer in place of the HD6650M. Such a solution bypasses the whole LP layer and works on the final pci-e and DP signals that are sent/received by the eGPU.
     
  7. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Quite down in the dumps about all this, after all the effort and expense and after being so close to getting this working.

    The only positive I can draw from this is that I was lucky I was sitting there at the time. I have a tendency to go out or go to sleep leaving my electrical devices running, so who knows what could have happened. I was not running video at the time; I was installing SQL Server, and the installer window was the only window visible on the desktop, so I was hardly pushing the video card to its limits.

    Got to start again on another path. Now that I've experienced 2 x WQXGA, I think that ultimately I want to go for 3 U3011s. From what I understand, the only single card capable of driving 3 x WQXGA is the GTX590, and nobody has even got one of those working with a DIY ViDock, due to PCI memory allocation issues that I can't get my head round. So basically the options are:

    1) DIY ViDock (using my GTX570, assuming it still works), and then later try driving a 3rd Dell U3011 via the laptop's HDMI at 35Hz as per ComputerCowboy's workaround. Previous instability issues mentioned in this thread could well have been due to the ViDock 4+ struggling to cope with the GTX570.
    I'm certainly no electrical engineer, and get the feeling that the DIY route might be another minefield; I prefer my electronics to be neat and tidy, with a relatively low risk of them burning me to death.

    2) Give up and get a desktop. Fit it with either a) the existing GTX 570 plus another GTX 570, or b) a GTX590, and remote desktop to the Vaio Z1 over the LAN using the new multi-monitor features in Remote Desktop that I know work in W7 Ultimate/Enterprise.

    Any thoughts/suggestions?
     
  8. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I suspect you got a bad dock from the start; you were having issues before it burnt out.

    Village told me that the GTX570 was supported; if they got it wrong, it is their fault.

    Remote desktop over Gigabit ethernet should provide an OK experience until you want to play video.

    If you want more power for your GTX570 you don't need a DIY dock, just get an old ATX PSU and run the card off of that...

    In the mean time you have the monitors and the Z1, have you tried running just one with my U3011 35Hz driver? Note that you will need an HDMI>DVI cable, or the Z1 dock and a DVI cable.

    I've been running the U3011 without the ViDock since I figured this out, it runs great... but 24P video was a little choppy... so I made a U3011 24P driver. The only rub on the 24P driver is that the mouse movement on the desktop is not as smooth.
     
  9. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Easily done. Just buy a modular power supply and attach the cables neatly. Eg: Newegg.com - SILVERSTONE Strider Plus ST50F-P 500W ATX 12V v2.3 & EPS 12V 80 PLUS BRONZE Certified Modular Active PFC Power Supply. You'd have 12V/34A*0.8=326W. Plenty to drive your GTX570. It has an optional $20 short cable kit if the supplied ones are too long.

    Could also get a GTX570 backplate if you're concerned it might touch something when running out in the open on a DIY eGPU's PE4H.
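    The 12V-rail arithmetic used in this thread can be written out as a small calculation. This is only a sketch of the rule of thumb above (rail volts x rated amps x a derating factor); the 34A and ~298W figures are the ones quoted in this thread for the ST50F-P and the GTX570, not measurements.

```python
# Rule-of-thumb check of whether a PSU's 12V rail can cover a GPU's
# peak draw: usable watts ~= rail volts * rated amps * derating factor.
# Figures below are the ones quoted in this thread, not measurements.

def usable_12v_watts(rail_amps, derating=0.8):
    """Usable continuous power on the 12V rail after derating."""
    return 12.0 * rail_amps * derating

def has_headroom(rail_amps, gpu_peak_watts, derating=0.8):
    """True if the derated 12V rail exceeds the GPU's peak draw."""
    return usable_12v_watts(rail_amps, derating) >= gpu_peak_watts

print(usable_12v_watts(34))    # 326.4 (the ST50F-P's 12V/34A rail)
print(has_headroom(34, 298))   # True: covers a ~298W GTX570 peak
```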
     
  10. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    @nando4... where is the 411 on how to set up a delayed start on the DIY eGPU? If I were to build one I'd want a delayed start, and I'd like it to start/stop the ATX PSU.

    Also where is the proof that a GTX 590 won't run? Has anyone tried it?
     
  11. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    The ViDock4+ has the advantage of powering down the ATX PSU when you power the system down. The DIY eGPU version needs you to do that manually by flicking the switch on the SWEX.

    There is a 0, 7 or 15 second PCI reset delay slider switch on the latest PE4H 2.4 (see left hand side). I found the 15 second selection worked great during a cold boot on the 2530P, which would otherwise hang if it saw the eGPU attached. The delay however didn't activate on the 2530P if I did a warm reboot.

    A GTX590 is in essence two GTX580s on a single bus, internally SLIed. I did have user NewNET try to get one going in the DIY eGPU thread, but it would consistently start up with an error 12, which once resolved became an error 43. We tried quite a few things but never resolved it. The same user had a GTX580 working without issue.

    If the aim of the installation is simply to drive multiple monitors, then there are ATI/AMD options. Would need to check which can run the high-res dual-link DVI arrangement. Keep in mind that, unlike the Optimus setup, these ATI/AMD cards have no pci-e compression, so fullHD video would stutter. There simply isn't enough bandwidth to transmit it over the x1 link.

    - any HD54xx or better card can drive 3 LCDs, the third requiring a special adapter
    - ATI 5870 Eyefinity edition, providing up to 6 LCDs off a single card
    - ATI FirePro 2450, 2450x1, and 2460, to drive up to 4 monitors from a low-power card
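    The x1 bandwidth point can be put into rough numbers. This is a back-of-envelope sketch assuming decoded FHD frames cross the link uncompressed at 24-bit colour; the ~250 MB/s figure is the usual x1 1.0 rate (2.5 GT/s with 8b/10b encoding), and protocol overhead is ignored.

```python
# Back-of-envelope: uncompressed FHD frame traffic vs a PCIe x1 1.0 link.
# 2.5 GT/s with 8b/10b encoding gives roughly 250 MB/s of raw bandwidth;
# protocol overhead (which only makes things worse) is ignored here.

X1_GEN1_BYTES_PER_SEC = 250e6

def frame_stream_bytes_per_sec(width, height, fps, bytes_per_pixel=3):
    """Raw bandwidth of an uncompressed 24-bit frame stream."""
    return width * height * bytes_per_pixel * fps

fhd60 = frame_stream_bytes_per_sec(1920, 1080, 60)
print(round(fhd60 / 1e6))              # 373 MB/s of frame data
print(fhd60 > X1_GEN1_BYTES_PER_SEC)   # True: exceeds the x1 link
```

    On these assumptions 1080p60 needs roughly 1.5x what the link can carry, which is why some form of link compression (as Optimus does) matters.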
     
  12. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    FHD Video is a must
    ATI/AMD is a no go for me anyway, I don't like them

    You bring up an interesting point. The way I have my 570 set up, I don't think it does any Optimus stuff, but I can play HD video scaled up to WQXGA with no stutter. (The video in question is either MKV rips, no transcoding, or straight off the disc with WinDVD BD.) Is it doing some sort of compression even though I didn't specifically install any Optimus components?
     
  13. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    It sounds to me like you have the Optimus pci-e compression engaged. It's not something that NVidia ever advertised; rather, it was something I discovered while doing some Optimus-related tests for Nautis. If you've installed the Verde driver with a custom nvam.inf containing your GTX570, then you'd get the x1.Opt pci-e compression.

    You can check if the pci-e compression is engaged by running 3dmark06. If you get ~5k then pci-e compression is disabled. If you get >12k then it's enabled. It's quite a big performance difference between the two.

    I believe NVidia underquote the power requirements for the GTX470/GTX570 or greater cards. I'd only feel comfortable running a GTX560Ti on a VD4+ rated at a max of 225W. That's coming from someone who blew up 2 'budget' ATX PSUs driving a GTX470 while stress testing it with FurMark. The VD would need two of its 150W adapters to be able to drive a GTX470/570 at full 3D load without any issues.
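    The score check above can be written as a tiny helper. The thresholds are the informal figures from this thread (~5k without compression, >12k with x1.Opt), not anything from official NVidia documentation.

```python
# Heuristic from this thread: classify whether x1.Opt pci-e compression
# is engaged from a 3DMark06 score. Thresholds are the informal figures
# quoted in this thread, not an official specification.

def optimus_link_state(score_3dmark06):
    if score_3dmark06 > 12_000:
        return "compressed (x1.Opt)"
    if score_3dmark06 <= 6_000:
        return "uncompressed x1"
    return "indeterminate"

print(optimus_link_state(15_000))  # compressed (x1.Opt)
print(optimus_link_state(5_000))   # uncompressed x1
```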
     
  14. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Yep, I would really want a delayed start too.

    From www.HWtools.net



    Also check out NewNet's experiences here:

    http://forum.notebookreview.com/gaming-software-graphics-cards/418851-diy-egpu-experiences-457.html


     
  15. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I get ~15K on 3DMark06, I guess that means compression is engaged
     
  16. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Still waiting for a response from Village. They are not quick in responding.

    Is this assuming Village replace the ViDock 4+ card? i.e. run it within the ViDock 4+ case with the ViDock card, with an ATX PSU powering the GTX570?

    Will give it a try, but quite keen to get back to 2 x U3011s (and at some point adding 1 more)
     
  17. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I can forward you their response telling me that the GTX570 would work if it helps.

    [image: Village's reply confirming the GTX570 would work]

    They didn't tell you directly, but they told me; I ran it, posted here, and you followed. It seems reasonable that they should send you a new unit. You didn't abuse the product in any way.



    I know you want it all, I am the same way, I am just saying, in the mean time you may as well run one of them. I've been running without the ViDock for a couple days and I haven't had any problems yet.

    Frankly I am interested in what Village has to say. Are they going to retract support for said card? I can't have a fire on my desk. I'm not in the habit of shutting down my computer when I leave my office.
     
  18. TofuTurkey

    TofuTurkey Married a Champagne Mango

    Reputations:
    431
    Messages:
    1,129
    Likes Received:
    2
    Trophy Points:
    56
    It's Labor Day today, a holiday for many people.
     
  19. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    What is a 2530P? A warm reboot delay would be nice, we know it can be done because the ViDock does it. Is anyone working on this?

    I would think it would also be possible to wire the ATX PSU to the PE4H board and have it turn ON/OFF?

    These are the types of things that make the DIY route less favorable to me. If I had figured out that I can run WQXGA with an EDID hack I probably wouldn't have even gotten the ViDock. I bought the U3011 first, couldn't get it running, so I ordered the ViDock with overnight shipping.


    I'm not sure that matters, I think support is from China.
    I heard a rumor somewhere that the guy was having some health problems or something. As you can see from my post above, they used to be pretty snappy with their responses; something changed.
     
  20. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    It's a HP laptop
     
  21. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Well.. given you don't really need the VD4+, it's insufficiently powered to drive a GTX570, and it's not pci-e 2.0 compliant as advertised, perhaps consider a refund and call it a day? Can put the bickies towards a Z2 and TB solution if and when it is available.. or acquire more Cognac :) Your credit card might have some buyer-protection type insurance, so could compensate you if VT present obstacles to refunding you.

    You could prove to yourself that the VD4+ is underpowered. Just time how long Furmark runs on the GTX570 before the VD4+'s fuse blows.

    As for the DIY eGPU, to have it auto power on/off with the system would require patching a signal that goes LOW (GND) when the system turns on to the ATX PSU, in the same way as the paperclip trick. You can see PS_ON (green) is connected to GND (black) to switch it on. Two candidate signals that could do that are:

    * the low hotplug detect (DETECT) signal on the PE4H
    * add a logic gate to invert the 3.3V (HIGH) signal received by the PE4H (more complex)

    I haven't tested either, but I did suggest to bplus several weeks ago to consider incorporating such a feature in their next hardware revision. Got neither a yay nor a nay response from them.
     
  22. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    All they would really have to do is put an ATX plug on the PE4H, or make some sort of daughterboard with an ATX plug which could be wired to the PE4H.

    You keep talking about PCIe 2.0 and you have an obvious dislike for Village, but does the Z1 even support PCIe 2.0 over ExpressCard?
     
  23. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
  24. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
  25. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I don't think that thing requires any additional power! You could run that from a basic ViDock, or a DIY ViDock with ease.

    I am extremely tempted to buy the four port one and give it a go.
     
  26. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    The Series-5 chipset equipped Z1 doesn't support PCI-e 2.0 5GT/s speed. You'd need a Series-6 chipset for that, eg: Lenovo X220, Toshiba R830, Sony Z2. I make a point of it because you've been delivered tech that is underspecced. I've made a similar point about Lenovo T61 (SATA-II capped) and Toshiba R830/R840 (SATA-III capped).

    Yeah.. those don't have pci-e power connectors, meaning the 75W provided to the slot is sufficient to power them. Though if they don't do some link compression, your FHD+ videos will stutter.

    I know you were interested in the Z2 but it has limited external monitor support. Another option may be to wait and see what the Displaylink DL-3000 USB 3.0 devices offer. They are scheduled for release in Q3-2011. There's one such product on ebay already USB3.0 to HDMI converter, UP to 1080p Solution 10048 | eBay. Not sure if they'll go beyond 1080P though.
     
  27. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I am aware of the upcoming DisplayLink USB 3.0 solutions. It says on the DisplayLink site that they will offer WQXGA in the future. HD Video would suck on DisplayLink also.

    I was thinking of trying the Matrox card, anyway.

    The thing about multi-monitor and HD video is I don't need them both at the same time. I only have one monitor running when I watch a movie, I usually use my projector to watch movies and there is no ViDock involved in that at all. So basically I can try the Matrox card and if I like it, I can either ditch the GTX570... or have one dock for GTX570 and one for the Matrox card. I am more likely to just get a second dock.

    I know I can get at least one WQXGA display going on the Z2 with the EDID/DTD hacking technique. In theory I could get two, one from the PMD and one from the laptop, we shall have to wait and see how the AMD card likes the hack.
     
  28. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    It does on USB 2.0 solutions. USB 3.0 [on a Series-6 chipset] has 10 times the bandwidth, so will fare a fair bit better. If you got a USB 3.0 expresscard for your system it would be 5 times the bandwidth. Displaylink also has some link-level compression to extend that bandwidth further.
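    Those ratios can be sanity-checked against the raw signalling rates. A sketch assuming USB 2.0 at 480 Mb/s, native USB 3.0 at 5 Gb/s, and a USB 3.0 ExpressCard capped by the Z1's x1 1.0 link at 2.5 Gb/s; protocol overhead is ignored, so these are upper-bound ratios only.

```python
# Raw signalling rates behind the USB 2.0 vs 3.0 DisplayLink comparison.
# Overheads are ignored, so these are upper-bound ratios only.

USB2_BPS = 480e6            # USB 2.0 high speed: 480 Mb/s
USB3_BPS = 5e9              # USB 3.0 SuperSpeed: 5 Gb/s (native port)
EXPRESSCARD_X1_BPS = 2.5e9  # a USB 3.0 ExpressCard sits behind the x1 link

print(round(USB3_BPS / USB2_BPS))           # 10 -> "10 times the bandwidth"
print(round(EXPRESSCARD_X1_BPS / USB2_BPS)) # 5  -> "5 times" via ExpressCard
```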
     
  29. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    That is not my impression of DisplayLink. I have DisplayLink on my VAIO VGP-UPR1... but I've never used it.

    I used to have a MIMO monitor and I found the performance to be pretty crappy. This was a couple years ago, maybe DisplayLink is better now.
     
  30. beaups

    beaups New Jack Hustler

    Reputations:
    476
    Messages:
    2,376
    Likes Received:
    4
    Trophy Points:
    56
    ^You might be missing the point. Currently displaylink is limited to usb 2.0 speeds....3.0 could bring bandwidth to make performance reasonable. Could...
     
  31. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I understand that USB3 DisplayLink is better than USB2 DisplayLink.
    When the high-resolution adapters finally arrive I will almost certainly try one; what do I have to lose? A hundred bucks? Seems like a fine gamble to me. Right now DisplayLink @ WQXGA is vaporware.
     
  32. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Thanks for the info nando4!

    If I wanted to keep things as silent as possible with a DIY ViDock setup, would a Silverstone Nightjar 500W Fanless ST50NF or a Silverstone Nightjar 400W Fanless SF40NF suffice?

    NVidia state that the minimum recommended system power for a GTX570 is 550W. Is this because PSUs are normally used to power the CPU as well?
     
  33. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    Yes. NVidia's power estimates are based on total desktop system power. Techpowerup measured a GTX570 and found it required 298W peak power.

    So from your PSUs, the 400W one says 82-86% efficient. It's 12V/27A, so max would be 265W, not enough. You'd need the pricier 12V/38A one. However...

    A fanless AU$179 400W Seasonic X400 is 12V/33A@80%=316W. That's enough to drive a GTX570.
    A fanless AU$199 460W Seasonic X460 is 12V/38A@80%=364W. That's the one to go for if you'd be overclocking the GTX570.

    Those Seasonic PSUs can be had cheaper for $20 less from IT Estate. They also have a modular cabling system.

    Actually, the Seasonic X400 review measured it at 90% efficiency so the 400W version would be 12V/33A@90%=356W. So there's even overclocking headroom. I'd still pay the extra $20 for the 460W version if looking to overclock to the card's max capability.
     
  34. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Nice one! Just before I do this, say for instance I wanted to later change the card to ATI HD 5870 Eyefinity 6 (so as to drive 3 x WQXGA) would that same 460W power supply still cut the mustard?
     
  35. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    A HD5870 requires 212W peak, so the Seasonic 400W exceeds the requirements. A HD5450 requires 13W peak (!!). The latter could be driven by a $20-delivered 19V/90W AC adapter (PE4H) or a $13-delivered 12V/60W adapter (PE4L). A passively cooled HD54xx would be an inexpensive way of quietly extending your displays. Your existing VD4+ could drive either without a problem.
     
  36. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Great info nando4! Appreciated. So the VD4+ could have driven either of those cards without blowing up. Darn.

    I've still heard nothing back from village support after 3 days. I'm starting to lose hope of getting anything from them, so might have to contact PayPal if nothing happens in a couple of days.
     
  37. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    The thing that impressed me about Village from the start was that they responded within hours, both pre-sale answers and post-sale support. Something has changed, no support response for days? Another reason to go DIY.
     
  38. jedip

    jedip Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
    Can anyone tell me, if I use a VAIO Z116GH (Core i5 spec), can it run this setup (ViDock + GTX570) with no problems (lag, hangs, screen flicker)? Especially using the ViDock to drive the internal (1600*900) notebook screen.

    Thank you
     
  39. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    See a few pages back: I discovered (despite ViDock telling ComputerCowboy that a GTX570 is supported in a ViDock 4+) that the card in the ViDock will blow, as the ViDock 4+ doesn't have enough power for a GTX570. Mine blew after 6-7 days of use, and there was lots of nasty smoke. I'm planning on building a DIY dock now for the GTX570. The ViDock 4+ is an expensive paperweight. Village have not responded to my support request or emails after 6 days. Should I get a response, I will certainly let everyone know on this thread. As it stands now, I certainly wouldn't recommend buying anything from them.
     
  40. pyr0

    pyr0 100% laptop dynamite

    Reputations:
    829
    Messages:
    1,272
    Likes Received:
    36
    Trophy Points:
    56
    Why does it have to be a GTX570? For the last time: there are many cards that cost half as much, have lower power requirements and give the same performance when run with the Z1. The Z1 is capped at PCIe x1 @ 2.5Gb/s, which bottlenecks everything above a GTX x60. I have tested a GTX580 and it is NOT substantially faster than a 560/460, and it does not give extra functionality like more video outputs. 2x WQXGA is supported even with passively cooled (=silent) low-end Fermis, which can be run with an old 90W laptop PSU. For watching HD media and doing office work, this is perfectly enough and does not heat up your workplace so much. If you want to game, you won't benefit from using a 570/580 over a 560, which maxes at about 200W so can be run by the VD4+ without any issues.

    VT's answer to CC's question of whether a 570 could be run is not accurate, and I'll tell you why: they looked up the specification of the 570 on the NVidia page (they provided a link). These specifications relate to NVidia's reference design. Since many OEMs do factory overclocks and change the designs, the final product has different specifications (in the case of overclocked cards, higher wattage etc.), so the already tight power reserve of the VD4+ with the reference design gets exceeded by most 570s. This is the case for cyclone5; I am pretty sure he got an EVGA Superclocked or whatnot card. If you ask VT whether an overclocked 570 xy card works with the VD4+ and they say yes, they lie. One could modify the 570's VBIOS and do some underclocking/undervolting to push down the power requirements, but then there is no point in actually buying a 570.

    Folks, go with a 560/460 and you will have no issues.

    Unfortunately, nobody hears me in these topics where I try to share the experience I made with 3 DIY eGPU builds and 5 years of electrical engineering. I am out.
     
  41. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,860
    Likes Received:
    10
    Trophy Points:
    0
    I hear you and agree. Anything more than a reference-clocked GTX560Ti on a 225W VD4+ is a time bomb waiting to go off. A GTX560Ti is more than enough for the x1 1.0 Optimus link. I'd go so far as to suggest going for a GTX460. They can often be overclocked to GTX560Ti levels and can be had for bargain prices as gamers upgrade.

    Now the DIY eGPU community will soon see x1 2.0 Optimus performance on Series-6 systems (eg: Lenovo X220, Toshiba R830/R840), which may give a GTX570 more breathing space (bandwidth) to show its superior performance.
     
  42. jedip

    jedip Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
    Thanks for your answers (cyclone5 and pyr0).
    My question may not have been clear. The point is: is a Core i5 with a 1600*900 screen powerful enough to drive a ViDock (with any model of PCI-e card for PC) without problems, especially driving the internal screen? Sorry for the silly question.
    At first I was going to buy a Z2, but after reading this thread (thanks, ComputerCowboy), I think I will buy a Z1. But in my country I can't find a Z1 with a Core i7.
     
  43. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    @jedip
    Your computer will run a ViDock or DIY ViDock

    @pyr0
    This GTX570 business may have been started by me. I got the 570 because it was the best card the ViDock4+ was supposed to be able to run. It wasn't about need; I just like to max everything out so there is no possible upgrade in the future.
    FWIW: I listen a lot to what you say. You were the one who told me what the WWAN PCIe slot can and can't do, so I didn't have to find out the hard way.
     
  44. TofuTurkey

    TofuTurkey Married a Champagne Mango

    Reputations:
    431
    Messages:
    1,129
    Likes Received:
    2
    Trophy Points:
    56
    Awwwww....

     
  45. beaups

    beaups New Jack Hustler

    Reputations:
    476
    Messages:
    2,376
    Likes Received:
    4
    Trophy Points:
    56
  46. jedip

    jedip Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
    Computercowboy
    I want to run a ViDock. So is Core i5 power enough?
     
  47. pyr0

    pyr0 100% laptop dynamite

    Reputations:
    829
    Messages:
    1,272
    Likes Received:
    36
    Trophy Points:
    56
    Yes it is. With everything properly configured, you will get around 13-14k 3DMarks06 with a GTX460 or 560 and up card. This is around 2.5 times faster than the stock 330M in the Z.
     
  48. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
  49. jedip

    jedip Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
    Thanks to pyr0.
    I think I will get a Z1 with a ViDock instead of a Z2
     
  50. cyclone5

    cyclone5 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    but what if you want to expand and turn your home study into an international airport departure hall?
     