The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Z1 + ViDock + GTX 570 > FTW (the Z2/PMD killer)

    Discussion in 'VAIO / Sony' started by ComputerCowboy, Jul 29, 2011.

  1. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    sure link?
     
  2. maven1975

    maven1975 Notebook Evangelist

    Reputations:
    51
    Messages:
    609
    Likes Received:
    6
    Trophy Points:
    31
    Welcome to Steam

    Install the Steam application and then you will find it in the demo section. :)
     
  3. Brianho1337

    Brianho1337 Notebook Evangelist

    Reputations:
    18
    Messages:
    427
    Likes Received:
    0
    Trophy Points:
    30
    The 7.9 WEI score is very impressive no doubt. If my Z can still handle the upcoming Modern Warfare 3 then I'd be more than happy to stick with my current setup. Dragonfable is another game I play which I'm pretty sure my Z can handle :D

    I sure hope to see some pictures of your mad VAIO science lab, professor computercowboy :p
     
  4. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    ^ I added the pictures to the first post.
     
  5. kvnchg

    kvnchg Notebook Guru

    Reputations:
    3
    Messages:
    67
    Likes Received:
    0
    Trophy Points:
    15
    ComputerCowboy, I am very interested in following in your footsteps, but I don't need all the ports on the GTX 570. I am only interested in playing some of the newer games.

    As you have said, the bottleneck is on the ExpressCard, so could there be a cheaper graphics card that would perform just as well? Any information on how much the ExpressCard is slowing down the graphics card would be greatly appreciated.
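    For a rough sense of the bottleneck being discussed here: ExpressCard carries a single PCIe 1.x lane, while a desktop card sits in a PCIe 2.0 x16 slot. A back-of-envelope sketch of the raw link bandwidth (illustrative only; real throughput is lower once protocol overhead is counted):

```python
# Rough PCIe bandwidth estimate for an ExpressCard eGPU link.
# ExpressCard exposes one PCIe 1.x lane; a desktop slot is PCIe 2.0 x16.
# PCIe 1.x/2.0 use 8b/10b encoding: 8 data bits per 10 line bits.

def pcie_bandwidth_mb_s(gt_per_s, lanes, encoding_efficiency=8 / 10):
    """Usable bandwidth per direction, in MB/s (before protocol overhead)."""
    return gt_per_s * 1e9 * encoding_efficiency / 8 / 1e6 * lanes

expresscard = pcie_bandwidth_mb_s(2.5, lanes=1)   # ExpressCard: PCIe 1.x x1
desktop     = pcie_bandwidth_mb_s(5.0, lanes=16)  # desktop: PCIe 2.0 x16

print(f"ExpressCard x1: {expresscard:.0f} MB/s")  # ~250 MB/s
print(f"Desktop x16:    {desktop:.0f} MB/s")      # ~8000 MB/s
print(f"Ratio: ~{desktop / expresscard:.0f}x")
```

    So the ExpressCard link offers roughly 1/32 of a desktop slot's raw bandwidth, which is why a top-end card can't stretch its legs in bandwidth-heavy workloads.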
     
  6. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    No idea, although pyr0 suggests that you can do pretty well with a 460...

    see his post on page 4 of this thread
     
  7. kvnchg

    kvnchg Notebook Guru

    Reputations:
    3
    Messages:
    67
    Likes Received:
    0
    Trophy Points:
    15
    Thank you so much for the quick reply. After reading pyr0's post I went over to the eGPU experience thread and looked at the various Sony VPC Z1 benchmarks they had:

    System
    13" Sony_VPC-Z13 i7-640M 2.80 8.0 [email protected] 15867
    13" Sony_VPC-Z11 i7-620M 2.66 8.0 [email protected] 15623
    13" Sony_VPC-Z11 i7-620M 2.66 8.0 [email protected] 16270
    13" Sony_VPC-Z12 i7-620M 2.66 8.0 [email protected] 15600


    For some reason they are all higher than your 14949. In that thread it also states that the ViDock and the eGPU solutions should get the same performance, so I am kind of confused as to what to make of all this. Anyways, they all scored around the same, so I think I will take pyr0's and your advice and get the GTX460.

    Lastly, have you tried overclocking your GTX 570? I would greatly appreciate it if you posted your results.
     
  8. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    Well, as I have said before, I don't really care about the game performance. I don't play games. I opted not to buy the "Superclocked" 570HD, which was the same price as the one that I have, because I didn't want more heat, more noise, and a shorter life for the graphics card. The only reason I got this setup is to run high resolution displays. There may be something I can do to get my 3DMark scores up higher, but I don't really care. The ViDock runs cool and quiet, and it runs my U3011 at native resolution.

    I downloaded Steam and Homefront... but I couldn't get the demo to find any servers. Too bad because I was going to try it at 2560x1600. Anyone have any ideas why the Homefront demo won't find any games?
     
  9. Louche

    Louche Purveyor of Utopias

    Reputations:
    92
    Messages:
    894
    Likes Received:
    0
    Trophy Points:
    30
    That's pretty cool. The Z1/ViDock/GTX 570 wouldn't work for me because the i7-620M doesn't have enough power for at least one of my tasks, and I can't imagine needing anywhere near the graphics power or port capabilities of the Killer system. Nonetheless, it's still really cool and I very much appreciate ComputerCowboy taking the time to post all of the technical details. I can see where any number of people would be better off with the Z1/ViDock system.
     
  10. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    You are saying a Z2 is better for you? The Second Generation i7 isn't that much faster than the first gen is it? I'll probably end up getting a Z2 at some point, but I am in no particular hurry.

    What daily task needs the power of the 2nd Gen i7? How are you accomplishing this task right now?
     
  11. fhsieh

    fhsieh Notebook Consultant

    Reputations:
    69
    Messages:
    134
    Likes Received:
    1
    Trophy Points:
    31
    Second gen i7 has revised turbo boost, faster memory support (1333MHz vs 1066MHz), and the new GMA HD 3000 graphics with QuickSync. For general purpose computing you probably won't get a big performance boost from the extra MHz, but for certain tasks the memory and QuickSync could potentially speed things up quite a bit.
     
  12. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I know that Sandy Bridge and faster RAM are better, and I know the second gen i7 is faster... what I am wondering is what the Z2 will run that the Z1 won't run?
     
  13. Louche

    Louche Purveyor of Utopias

    Reputations:
    92
    Messages:
    894
    Likes Received:
    0
    Trophy Points:
    30
    The Z2 would definitely work better for me. As I mentioned, for me the high-powered graphics and super-hi-res monitors don't matter. I need the GTX 570's power even less than you need a new watch (at least you can make some use of that). Either the Z1's graphics card or the SB graphics should be plenty for what I do.

    On the other hand, I'll be looking to do some fairly sophisticated real-time audio processing on a substantial number of high resolution audio streams. The maximum number of simultaneous streams is dependent on my processing power. Pre-SB, what I'm doing took quad core power. Sure it's possible to simulate the bounce technique used with old four-track cassette decks and a few other tricks to get by with a lower power processor, but that takes time and creates several other pains that can be worked through but are not desirable.

    Since I don't need a ViDock and would only occasionally use the PMD, the real choice for me is between a maxed-out Z1 or a similarly configured Z2. Given that, it's a pretty simple choice.

    That said, I would still be tempted to take a Z1 over the SA no matter what the processing power.
     
  14. maven1975

    maven1975 Notebook Evangelist

    Reputations:
    51
    Messages:
    609
    Likes Received:
    6
    Trophy Points:
    31
    It will run a whole new set of proprietary drivers that the community here will have to sort through and straighten out. That's about all I can see.
     
  15. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    The SA is ok, I played with the maxed out signature version at the SonyStyle store... but the Z1 and the Z2 just have that special top-end feel. The Z1 and Z2 are much lighter and have the 1080P screen. I still want a Z2, and I am sure I'll get one eventually; right now my money is much better spent getting stuff for the Z1. I want to get two Intel x18s to replace the stock RAID and I want to get an Intel 600GB 2.5" drive to replace the Blu-ray burner. I also want a second DELL U3011. The cost of all of that is about as much as a new Z2, but I am getting a lot more workstation than I would ever hope to get out of the Z2.

    Yes and all for what? A crap ATI card? Meh. When/if I get a Z2 I probably won't get the PMD, if I do, I'll probably sell it.
     
  16. Louche

    Louche Purveyor of Utopias

    Reputations:
    92
    Messages:
    894
    Likes Received:
    0
    Trophy Points:
    30
    I played with an SA at a store but they had a Z1 next to it and there was just no comparison -- as you know. If there wasn't a Z right there, I would probably have a better impression of the SA, though I see no reason to compromise for a 1600x900 display.

    You're going to end up with a truly cool machine. My ideal current-technology machine would probably be a Z1 with an SB chip and a slice battery option. I would be glad to ditch the graphics card as a compromise.
     
  17. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I don't care that much about internal graphics... but it is nice that I can plug just my Z into my 3D projector and watch 3D Blu-ray without anything else. That said, the amount of time I actually spend watching 3D vs just 2D content is very small. If I had to plug in some external unit to make the 3D work the few times I use it, that wouldn't be that big of a deal.

    The Intel 3000 is supposed to be able to drive 3D displays I think, although I am pretty invested in nVidia 3DVision so I like the fact that the Z1 has the 330M built in.

    What I am really hoping for is that a company like Village will come along and make something that plugs into the Z2 and allows for a full PCIe Graphics card. That would be truly awesome.
     
  18. Louche

    Louche Purveyor of Utopias

    Reputations:
    92
    Messages:
    894
    Likes Received:
    0
    Trophy Points:
    30
    Movies for me are just an occasional diversion, and 3D isn't something I bother with even at theaters, other than to remind me that I really don't care. On the other hand, although the BD storage capacity is overkill for me, the burner will be nice to have.
     
  19. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I spent $500 on the BD burner for the Z1 and I use it just about never. My external Polaroid BD drive rips movies twice as fast as the internal drive. I am ready to ditch the BDR and put a mega SSD in its place.
    As far as 3D goes, I have some waterproof 3D rigs; it is really cool to watch the footage from them on the projector, but I never get any time to take them out.
     
  20. Louche

    Louche Purveyor of Utopias

    Reputations:
    92
    Messages:
    894
    Likes Received:
    0
    Trophy Points:
    30
    I could easily get away with an external burner, one reason why the PMD doesn't bother me much. The Z2's burner is only about $200 over a DVD drive, so it's not a bad price. If the Z2 could be had without the PMD other than through Conics (which charges a bizarrely outrageous amount for a 512GB SSD), I would go that route and just buy a burner.
     
  21. corrado85

    corrado85 Notebook Consultant

    Reputations:
    4
    Messages:
    216
    Likes Received:
    0
    Trophy Points:
    30
    nice job cowboy.
     
  22. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    Thanks corrado... I didn't really do much, just got the money and slapped it all together. There was a little bit of tweaking to get it "just right", nothing serious. I am quite pleased with the result. Can't wait to expand to dual WQXGA monitors, or even three. I am not sure if it can run three WQXGA, or if I could fit three on a desk. The viewing angles of IPS would make for the possibility of wrapping three around me. Or maybe I could mount them all in portrait mode.
     
  23. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    nVidia doesn't support 3 displays from one GPU; you need a dual-GPU card for that. AMD supports 3-6, depending on the exact GPU. The new AMD muxless drivers give some hope to this + AMD :)
     
  24. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    One GTX 590 does three in 3D surround... but it looks like you are correct about the 570. According to this nVidia Datasheet it has two display pipelines (page 2). The 590 Datasheet mentions quad display support and specifically three DVI-D WQXGA.

    I wonder if Village will make a ViDock 5? One that supports the Dual 8Pin connectors that the 590 takes. I suppose I could also just get another power supply and rig something up. I'll probably wait until I have at least maxed out what the 570 can do.

    One thing I noticed is that the DisplayPort on the 570 is a little finicky. It doesn't remember the display settings, DVI-D works perfectly every time. I guess that is not that big of a deal since it looks like this thing maxes out at two displays and it has two DVI-D ports.

    All in all I am a little disappointed that the card can only drive two displays. I saw all of those ports and got excited, thinking I could just plug something in to each one of them.
     
  25. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    The GTX590 has two GF110 GPU dies, hence it's a dual-GPU card :D I've noticed the same thing with DP on my U2711 with several AMD GPUs. Maybe it's just the nature of high-resolution DP connections. (older types, that is)
     
  26. pyr0

    pyr0 100% laptop dynamite

    Reputations:
    829
    Messages:
    1,272
    Likes Received:
    36
    Trophy Points:
    56
    I think if you don't game, a GTX570 is already overkill. A cheaper, lower-end, more compact and cooler GPU with lower wattage requirements (the ViDock 4+ can deliver up to 225W) would suffice for multi-monitor office/multimedia stuff. A GTX570 can easily go beyond that under full gaming load. A 570 on a x1 ExpressCard link has almost no advantages over a 460 that costs half as much, runs cooler, and fits in a ViDock 4 (smaller).

    I know what I am talking about here because I already hooked up a custom setup with a GTX580 to my Z, which marks the top end of pre-SB setups, as you can see in the DX9 table here: http://forum.notebookreview.com/gaming-software-graphics-cards/418851-egpu-experiences.html

    My GTX460 setup costs 1/3, needs 1/3 of the power of a 580, and is just 800 3DMark06 points behind. A 590 will be problematic since dual GPUs need double the PCIe memory window, which a Z will most probably not be able to accommodate (have not tested this so far due to lack of a 590).
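    pyr0's memory-window point can be sketched with some illustrative arithmetic. The BAR sizes below are assumptions picked for the example, not measured values for these cards:

```python
# Back-of-envelope sketch of why a dual-GPU card doubles the PCIe memory
# window the host BIOS must reserve. BAR sizes here are illustrative only.

MB = 1024 * 1024

def total_window(bars_per_gpu, gpus):
    """Total PCIe address space the card asks the BIOS to map, in bytes."""
    return sum(bars_per_gpu) * gpus

# Hypothetical per-GPU BARs: 256 MB framebuffer aperture + 32 MB MMIO
bars = [256 * MB, 32 * MB]

single = total_window(bars, gpus=1)  # e.g. GTX 570: one GF110 die
dual   = total_window(bars, gpus=2)  # e.g. GTX 590: two GF110 dies

print(f"single-GPU window: {single // MB} MB")
print(f"dual-GPU window:   {dual // MB} MB")
# If the laptop BIOS only reserves a fixed-size region for PCIe resources,
# the dual-GPU card may fail to have its resources assigned at all.
```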
     
  27. corrado85

    corrado85 Notebook Consultant

    Reputations:
    4
    Messages:
    216
    Likes Received:
    0
    Trophy Points:
    30
    I'd like to do the same... now if only I had time to mess with drivers :cool: and do a fresh install of Win7 for kicks.
     
  28. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I'm not saying I want to put a 590 in there tomorrow. I am happy with what I have for now. I would only want something different once I had two WQXGA monitors, then I might want a third one and I would need a different card.


    Getting the hardware is the easiest part; getting everything configured correctly is trial and error, except that I've already done it the vanilla-drivers way, and the Optimus way has been done also. So there isn't anything left to figure out.
     
  29. pokerart

    pokerart Notebook Enthusiast

    Reputations:
    0
    Messages:
    23
    Likes Received:
    0
    Trophy Points:
    5
  30. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    That looks like it would work fine with the ViDock 4 Plus.
    According to Village it is all about the power requirement. The page you linked me to says it takes two 6-pin connectors, so it should work great.

    EDIT: What is a little concerning is the text on the page
    That would imply that when you drive 3 displays that two of them need to be HDMI. Are you sure the tech said that the two DVI ports and the DisplayPort would all work at the same time?

    EDIT 2: look at this link http://www.zotac.com/pdbrochures/vga/master_zt-50706-10m_gtx-560-multiview_v1.pdf
    scroll down and look to the left under "connectors"

     
  31. pokerart

    pokerart Notebook Enthusiast

    Reputations:
    0
    Messages:
    23
    Likes Received:
    0
    Trophy Points:
    5
    Ah, that would be bad. You should check out the Zotac GTX 460 3DP if you were planning on doing 1x 2560 + 2x 1920x1200. DisplayPort only, with no adapter, if you want to drive 3 monitors. I talked to the live chat and they said that was possible.
     
  32. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I am not quite sure what you are getting at here. I am not questioning whether or not it can run three monitors; I am saying that it cannot run three WQXGA monitors. I know I can run two WQXGA monitors with what I have now... and maybe even another 1080P monitor off of the dock.
     
  33. chfshifter

    chfshifter Notebook Geek

    Reputations:
    0
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    Exactly how quiet is the ViDock? (idle/load) My VPCZ (top spec) fan gets very noisy at times with some activity going on. I'm using a Dell U2709 thru HDMI and another monitor thru a Toshiba dock.

    My concern is how quiet/noisy is the ViDock compared to the Z? Would appreciate if you can describe more (loaded or when idle). thanksss ;)
     
  34. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    The noise level of the ViDock is largely dependent on the graphics card you select and what you do with it. I have a GTX 570 which is pretty damn near top of the line, but I mostly just run 2D stuff, desktop applications. The only time the ViDock/GTX570 made much noise at all was when I ran 3DMark06. In my setup the Z is much louder, at least two or three times louder. The ViDock does NOT have any fans. The noise it makes is all about the card you put in it.

    All in all at its noisiest it was way less noisy than the Z1. Keep in mind that when the GPU is doing something intense the CPU is usually doing something big also. The Z1 "covers up" the sound of the dock almost all the time.
     
  35. chfshifter

    chfshifter Notebook Geek

    Reputations:
    0
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    That sounds nice! I thought the GTX570's fan would be pretty noisy. I've been following your posts for a long time on other Z threads (coz I'd love to run a high-res (Dell 30'') monitor as well!) Thanks for sharing ur success with the ViDock! ;)
     
  36. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    Yea no problem. I am quite happy with the result. It is nice to finally be running WQXGA, I've been working on it/thinking about it for a while. In the end it was kind of a leap of faith to drop $2K for the setup hoping that it would work out.
     
  37. SPEEDwithJJ

    SPEEDwithJJ NBR Super Idiot

    Reputations:
    865
    Messages:
    3,499
    Likes Received:
    1
    Trophy Points:
    106
  38. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I wish someone would make something like that for the Z2, or that a Lightpeak to Thunderbolt adapter comes along.
     
  39. chfshifter

    chfshifter Notebook Geek

    Reputations:
    0
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    What do you mean when you say you can hot plug/unplug? Does it mean you don't have any issues when trying to unplug the express card even with graphics-intensive apps opened? And that you can plug it back without any restarts/app shutdown?
     
  40. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    ^ you have to eject it using the system tray, like you would a USB device
    other than that... yes you can unplug it and plug it back in without a reboot
     
  41. chfshifter

    chfshifter Notebook Geek

    Reputations:
    0
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    ^Does it take ages for it to be ejected using the system tray? USB drives cannot be ejected when in use, so do I have to close apps before ejecting? If not, will, say, a YouTube video continue to play properly after ejecting? (as it won't play properly when switching from stamina to speed on the current Z1) thx!!
     
  42. Profy_X

    Profy_X Notebook Consultant

    Reputations:
    17
    Messages:
    212
    Likes Received:
    0
    Trophy Points:
    30
    Just added a ViDock 4+ to my Z and I'm so happy to have it.. :D It was quite a pain to install the graphics card inside, and it really disappointed me that they didn't include a how-to-install manual, but after some hours I managed to install it. For the moment I'm using my boss's graphics card from the office desktop, and I hope to buy one myself in the next couple of months, because for now I've spent quite some money on the upgrades :p
    By the way, thanx ComputerCowboy for mentioning the trick with booting up with the dock :D
     
  43. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    I've updated the original post to show the Z2+PMD WEI scores, and a screen cap of eject.

    It ejects really fast, less than five seconds. Youtube video turns green, but the audio will play. When you plug the card back in it takes less than 10 seconds to initialize it and put the screen on.


    Nice, I'm glad you like it. What are you doing drivers-wise? Do you have the BIOS hack?
     
  44. chfshifter

    chfshifter Notebook Geek

    Reputations:
    0
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    ^That's nice! Exactly which GTX570 are you using though?
    Thing is I want to use at least 3 external monitors so I am wondering if I should get a Radeon card instead...Or should I use a USB-DVI adaptor for the third screen if I stick with GTX570...
     
  45. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    What resolution are the monitors? Earlier in this thread there was a Zotac GTX560 card mentioned that can do 2x 1080P and 1x WQXGA

    I don't like ATI, personal preference. If you went GTX570+USB/DisplayLink you could plug the USB/DisplayLink adapter into one of the ViDock USB ports, that way you'd undock all the monitors at once.
     
  46. ComputerCowboy

    ComputerCowboy Sony Fanboy

    Reputations:
    502
    Messages:
    1,503
    Likes Received:
    27
    Trophy Points:
    66
    @Profy... what card are you using? How is it configured?
     
  47. chfshifter

    chfshifter Notebook Geek

    Reputations:
    0
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    A bit off topic, but I was wondering if putting acoustic absorption mats underneath the ViDock and the Z will do anything to improve the noise at all...=P
     
  48. exastify

    exastify Notebook Enthusiast

    Reputations:
    8
    Messages:
    37
    Likes Received:
    0
    Trophy Points:
    15
    Is there a significant difference between a vidock and a cheaper, no-enclosure or power supply included EGPU solution?
     
  49. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    That card uses a Matrox chip, lol. One output (DL DVI) can go all the way; the others are limited to what the chip can do, up to 2560x1600 (to be reconfigured in any way possible).
     
  50. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    Since it turned out to be near impossible to edit my last post:

    2560x1600 pixels, that is. So ~ 2x 1600x1200 is as high as it can get, in the splitting process, at 60Hz.
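    That splitting limit is easy to sanity-check: the budget is one 2560x1600 stream's worth of pixels per refresh, and two 1600x1200 panels fit under it while two 1920x1200 panels would not. A quick sketch:

```python
# Sanity check on the splitting limit: the chip's budget is the pixel count
# of a single 2560x1600@60Hz stream, reconfigurable into other layouts.

def pixels(w, h):
    """Pixels per frame for a w x h panel."""
    return w * h

budget = pixels(2560, 1600)      # 4,096,000 px per refresh
split  = 2 * pixels(1600, 1200)  # two 1600x1200 panels: 3,840,000 px

print(f"budget: {budget:,} px, 2x 1600x1200: {split:,} px")
print("fits" if split <= budget else "exceeds budget")  # prints "fits"
# Two 1920x1200 panels (4,608,000 px) would already exceed the budget.
```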
     