Sure, link?
-
-
Welcome to Steam
Install the Steam application and then you will find it in the demo section.
-
The 7.9 WEI score is very impressive, no doubt. If my Z can still handle the upcoming Modern Warfare 3 then I'd be more than happy to stick with my current setup. DragonFable is another game I play, which I'm pretty sure my Z can handle.
I sure hope to see some pictures of your mad VAIO science lab, professor computercowboy
-
^ I added the pictures to the first post.
-
ComputerCowboy, I am very interested in following in your footsteps, but I don't need all the ports on the GTX 570. I am only interested in playing some of the newer games.
As you have said, the bottleneck is on the ExpressCard, so could there be a cheaper graphics card that would perform just as well? Any information on how much the ExpressCard is slowing down the graphics card would be greatly appreciated. -
No idea, although pyr0 suggests that you can do pretty well with a 460...
see his post on page 4 of this thread -
Thank you so much for the quick reply. After reading pyr0's post I went over to the eGPU experience thread and looked at the various Sony VPC Z1 benchmarks they had:
System
13" Sony_VPC-Z13 i7-640M 2.80 8.0 [email protected] 15867
13" Sony_VPC-Z11 i7-620M 2.66 8.0 [email protected] 15623
13" Sony_VPC-Z11 i7-620M 2.66 8.0 [email protected] 16270
13" Sony_VPC-Z12 i7-620M 2.66 8.0 [email protected] 15600
For some reason they are all higher than your 14949. That thread also states that the ViDock and the DIY eGPU solutions should get the same performance, so I am kind of confused as to what to make of all this. Anyway, they all scored around the same, so I think I will take pyr0's and your advice and get the GTX 460.
Lastly, have you tried overclocking your GTX 570? I would greatly appreciate it if you posted your results. -
Well, as I have said before, I don't really care about the game performance. I don't play games. I opted not to buy the "Superclocked" 570HD, which was the same price as the one that I have, because I didn't want more heat, more noise, and a shorter life for the graphics card. The only reason I got this setup is to run high resolution displays. There may be something I can do to get my 3DMark scores up higher, but I don't really care. The ViDock runs cool and quiet, and it runs my U3011 at native resolution.
I downloaded Steam and Homefront... but I couldn't get the demo to find any servers. Too bad because I was going to try it at 2560x1600. Anyone have any ideas why the Homefront demo won't find any games? -
That's pretty cool. The Z1/ViDock/GTX 570 wouldn't work for me because the i7-620M doesn't have enough power for at least one of my tasks, and I can't imagine needing anywhere near the graphics power or port capabilities of the Killer system. Nonetheless, it's still really cool and I very much appreciate ComputerCowboy taking the time to post all of the technical details. I can see how any number of people would be better off with the Z1/ViDock system.
-
You are saying a Z2 is better for you? The second generation i7 isn't that much faster than the first gen, is it? I'll probably end up getting a Z2 at some point, but I am in no particular hurry.
What daily task needs the power of the 2nd Gen i7? How are you accomplishing this task right now? -
Second gen i7 has revised Turbo Boost, faster memory support (1333MHz vs 1066MHz), and the new Intel HD 3000 graphics with QuickSync. For general purpose computing you probably won't get a big performance boost from the extra MHz, but for certain tasks the memory and QuickSync could potentially speed things up quite a bit.
-
I know that Sandy Bridge and faster RAM are better, and I know the second gen i7 is faster... what I am wondering is what the Z2 will run that the Z1 won't run?
-
The Z2 would definitely work better for me. As I mentioned, for me the high powered graphics and super-high-res monitors don't matter. I need the GTX 570's power even less than you need a new watch (at least you can make some use of that). Either the Z1's graphics card or the SB graphics should be plenty for what I do.
On the other hand, I'll be looking to do some fairly sophisticated real-time audio processing on a substantial number of high resolution audio streams. The maximum number of simultaneous streams is dependent on my processing power. Pre-SB, what I'm doing took quad core power. Sure it's possible to simulate the bounce technique used with old four-track cassette decks and a few other tricks to get by with a lower power processor, but that takes time and creates several other pains that can be worked through but are not desirable.
Since I don't need a ViDock and would only occasionally use the PMD, the real choice for me is between a maxed-out Z1 or a similarly configured Z2. Given that, it's a pretty simple choice.
That said, I would still be tempted to take a Z1 over the SA no matter what the processing power. -
It will run a whole new set of proprietary drivers that the community here will have to sort through and straighten out. That's about all I can see.
-
The SA is OK, I played with the maxed-out Signature version at the SonyStyle store... but the Z1 and the Z2 just have that special top-end feel. The Z1 and Z2 are much lighter and have the 1080p screen. I still want a Z2, and I am sure I'll get one eventually, but right now my money is much better spent getting stuff for the Z1. I want to get two Intel X18s to replace the stock RAID, and I want to get an Intel 600GB 2.5" drive to replace the Blu-ray burner. I also want a second Dell U3011. The cost of all of that is about as much as a new Z2, but I am getting a lot more workstation than I would ever hope to get out of the Z2.
Yes, and all for what? A crap ATI card? Meh. When/if I get a Z2 I probably won't get the PMD, and if I do, I'll probably sell it. -
I played with an SA at a store but they had a Z1 next to it and there was just no comparison -- as you know. If there hadn't been a Z right there, I would probably have a better impression of the SA, though I see no reason to compromise for a 1600x900 display.
You're going to end up with a truly cool machine. My ideal current-technology machine would probably be a Z1 with an SB chip and a slice battery option. I would be glad to ditch the graphics card as a compromise. -
I don't care that much about internal graphics... but it is nice that I can plug just my Z into my 3D projector and watch 3D Blu-ray without anything else. That said, the amount of time I actually spend watching 3D vs just 2D content is very small. If I had to plug in some external unit to make the 3D work the few times I use it, that wouldn't be that big of a deal.
The Intel 3000 is supposed to be able to drive 3D displays, I think, although I am pretty invested in nVidia 3D Vision, so I like the fact that the Z1 has the 330M built in.
What I am really hoping for is that a company like Village will come along and make something that plugs into the Z2 and allows for a full PCIe Graphics card. That would be truly awesome. -
Movies for me are just an occasional diversion, and 3D isn't something I bother with even at theaters, other than to remind me that I really don't care. On the other hand, although the BD storage capacity is overkill for me, the burner will be nice to have.
-
I spent $500 on the BD burner for the Z1 and I use it just about never. My external Polaroid BD drive rips movies twice as fast as the internal drive. I am ready to ditch the BDR and put a mega SSD in its place.
As far as 3D goes, I have some waterproof 3D rigs; it is really cool to watch the footage from them on the projector, but I never get any time to take them out. -
I could easily get away with an external burner, which is one reason why the PMD doesn't bother me much. The Z2's burner is only about $200 over a DVD drive, so it's not a bad price. If I could get the Z2 without the PMD other than through Conics (which charges a bizarrely outrageous amount for a 512GB SSD), I would, and just buy a burner.
-
nice job cowboy.
-
Thanks corrado... I didn't really do much, just got the money and slapped it all together. There was a little bit of tweaking to get it "just right", nothing serious. I am quite pleased with the result. Can't wait to expand to dual WQXGA monitors, or even three. I am not sure if it can run three WQXGA, or if I could fit three on a desk. The viewing angles of IPS would make for the possibility of wrapping three around me. Or maybe I could mount them all in portrait mode.
-
nVidia doesn't support 3 displays from one GPU; you need a dual-GPU card for that. AMD supports 3-6, depending on the exact GPU. New AMD muxless drivers give some hope to this + AMD
-
One GTX 590 does three in 3D surround... but it looks like you are correct about the 570. According to this nVidia Datasheet it has two display pipelines (page 2). The 590 Datasheet mentions quad display support and specifically three DVI-D WQXGA.
I wonder if Village will make a ViDock 5? One that supports the dual 8-pin connectors that the 590 takes. I suppose I could also just get another power supply and rig something up. I'll probably wait until I have at least maxed out what the 570 can do.
One thing I noticed is that the DisplayPort on the 570 is a little finicky. It doesn't remember the display settings, DVI-D works perfectly every time. I guess that is not that big of a deal since it looks like this thing maxes out at two displays and it has two DVI-D ports.
All in all I am a little disappointed that the card can only drive two displays. I saw all of those ports and got excited, thinking I could just plug something in to each one of them. -
The GTX 590 has two GF110 GPU dies; therefore, it's a dual-GPU card.
I've noticed the same thing with DP on my U2711 with several AMD GPUs. Maybe it's just the nature of high resolution DP connections. (older types, that is)
-
I think if you don't game, a GTX 570 is already overkill. A cheaper, lower-end, more compact and cooler GPU with lower wattage requirements (the ViDock 4+ can deliver up to 225W) would suffice for multi-monitor office/multimedia stuff. A GTX 570 can easily go beyond that under full gaming load. A 570 on an x1 ExpressCard link has almost no advantages over a 460 that costs half as much, runs cooler, and fits in a ViDock 4 (smaller).
I know what I am talking about here because I already hooked up a custom setup with a GTX 580 to my Z, and it marks the top end of pre-SB setups, as you can see in the DX9 table here: http://forum.notebookreview.com/gaming-software-graphics-cards/418851-egpu-experiences.html
My GTX 460 setup costs 1/3, needs 1/3 of the power of a 580, and is just 800 3DMark06 points behind. A 590 will be problematic since dual GPUs need double the PCIe memory window, which a Z will most probably not be able to accommodate (I have not tested this so far due to lack of a 590). -
I'd like to do the same... now if I only had time to mess with drivers
+ do a fresh install of Win7 for kicks.
-
I'm not saying I want to put a 590 in there tomorrow. I am happy with what I have for now. I would only want something different once I had two WQXGA monitors, then I might want a third one and I would need a different card.
Getting the hardware is the easiest part; getting everything configured correctly is trial and error, except that I've already done it the vanilla-drivers way, and the Optimus way has been done also. So there isn't anything left to figure out. -
Cowboy, I just found out that the Zotac GTX 560 MultiView is the only single nVidia card (and the lowest-end one, since you don't game much) that drives 3x 2560x1600, after I spoke with the live chat guy. http://www.zotacusa.com/zotac-geforce-gtx-560-multiview-zt-50706-10m.html Does this card work with the ViDock?
-
That looks like it would work fine with the ViDock 4 Plus.
According to Village it is all about the power requirement. The page you linked to says it takes two 6-pin connectors, so it should work great.
EDIT: What is a little concerning is the text on the page
That would imply that when you drive 3 displays that two of them need to be HDMI. Are you sure the tech said that the two DVI ports and the DisplayPort would all work at the same time?
EDIT 2: look at this link http://www.zotac.com/pdbrochures/vga/master_zt-50706-10m_gtx-560-multiview_v1.pdf
scroll down and look to the left under "connectors"
-
Ah, that would be bad. You should check out the Zotac GTX 460 3DP if you were planning on doing 1x 2560 + 2x 1920x1200. DisplayPort only, and no adapter needed if you want to drive 3 monitors. I talked to the live chat and he said that was possible.
-
I am not quite sure what you are getting at here. I am not questioning whether or not it can run three monitors; I am saying that it cannot run three WQXGA monitors. I know I can run two WQXGA monitors with what I have now... and maybe even another 1080p monitor off of the dock.
-
Exactly how quiet is the ViDock? (idle/load) My VPCZ (top spec) fan gets very noisy at times with some activity going on. I'm using a Dell U2709 through HDMI and another monitor through a Toshiba dock.
My concern is how quiet/noisy the ViDock is compared to the Z. I would appreciate it if you can describe more (loaded or when idle). Thanks!
-
The noise level of the ViDock is largely dependent on the graphics card you select and what you do with it. I have a GTX 570 which is pretty damn near top of the line, but I mostly just run 2D stuff, desktop applications. The only time the ViDock/GTX570 made much noise at all was when I ran 3DMark06. In my setup the Z is much louder, at least two or three times louder. The ViDock does NOT have any fans. The noise it makes is all about the card you put in it.
All in all at its noisiest it was way less noisy than the Z1. Keep in mind that when the GPU is doing something intense the CPU is usually doing something big also. The Z1 "covers up" the sound of the dock almost all the time. -
That sounds nice! I thought the GTX 570's fan would be pretty noisy. I've been following your posts for a long time on other Z threads (because I'd love to run a high-res Dell 30'' monitor as well!). Thanks for sharing your success with the ViDock!
-
Yeah, no problem. I am quite happy with the result. It is nice to finally be running WQXGA; I've been working on it/thinking about it for a while. In the end it was kind of a leap of faith to drop $2K for the setup hoping that it would work out.
-
Slightly off-topic.
I think those VillageTronic guys are just awesome with their GPU solutions. I came across the following link yesterday:
External Thunderbolt PCI Expansion Chassis and Hub in Development - Mac Rumors
After reading that, I can't help but wonder how much the above GPU solution (when it is finally released) is going to "cannibalize" the VAIO Z2 notebook sales.
-
I wish someone would make something like that for the Z2, or that a Light Peak to Thunderbolt adapter comes along.
-
What do you mean when you say you can hot plug/unplug? Does it mean you don't have any issues when trying to unplug the ExpressCard even with graphics-intensive apps open? And that you can plug it back in without any restarts/app shutdowns?
-
^ You have to eject it using the system tray, like you would a USB device.
Other than that... yes, you can unplug it and plug it back in without a reboot. -
^ Does it take ages for it to be ejected using the system tray? USB drives cannot be ejected when in use, so do I have to close apps before ejecting? If not, will, say, a YouTube video continue to play properly after ejecting? (It won't play properly when switching from Stamina to Speed on the current Z1.) Thanks!!
-
Just added a ViDock 4+ to my Z and I'm so happy to have it.
It was quite a pain to install the graphics card inside, and not only that: it really disappointed me that they didn't include a how-to-install manual, but after some hours I managed to get it installed. For the moment I am using my boss's graphics card from the office desktop, and I hope to buy one myself in the next couple of months, because for now I've spent quite some money on the upgrades.
By the way, thanks ComputerCowboy for mentioning the trick with booting up with the dock.
-
I've updated the original post to show the Z2+PMD WEI scores, and a screen cap of the eject.
It ejects really fast, less than five seconds. Youtube video turns green, but the audio will play. When you plug the card back in it takes less than 10 seconds to initialize it and put the screen on.
Nice, I'm glad you like it. What are you doing drivers-wise? Do you have the BIOS hack? -
^ That's nice! Exactly which GTX 570 are you using, though?
Thing is, I want to use at least 3 external monitors, so I am wondering if I should get a Radeon card instead... Or should I use a USB-DVI adapter for the third screen if I stick with the GTX 570... -
What resolution are the monitors? Earlier in this thread there was a Zotac GTX560 card mentioned that can do 2x 1080P and 1x WQXGA
I don't like ATI, personal preference. If you went GTX570+USB/DisplayLink you could plug the USB/DisplayLink adapter into one of the ViDock USB ports, that way you'd undock all the monitors at once. -
@Profy... what card are you using? How is it configured?
-
A bit off topic, but I was wondering if putting acoustic absorption mats underneath the ViDock and the Z will do anything to improve the noise at all...=P
-
Is there a significant difference between a vidock and a cheaper, no-enclosure or power supply included EGPU solution?
-
That card uses a Matrox chip, lol. One output (DL DVI) can go all the way; the others are limited to what the chip can do, up to 2560x1600 (to be reconfigured in any way possible).
-
Since it proved near impossible to edit:
2560x1600 pixels, that is. So ~2x 1600x1200 is as high as it can get in the splitting process, at 60Hz.
Z1 + ViDock + GTX 570 > FTW (the Z2/PMD killer)
Discussion in 'VAIO / Sony' started by ComputerCowboy, Jul 29, 2011.