^ done.
By the way, for Village Tronic we've reached 35 people so far... but they're already moving, and I'm moving things along for them as well.
Sony support gave me a couple of email addresses to write to... I'll give it a try.
-
-
-
Karamazovmm Overthinking? Always!
-
1) 225 W = 2 pcs 6 pin
2) What Notebook do you own: z21 and a 2009 MBP
3) Purpose: Gaming
4) Yes
5) Product
Thanks again! -
Why not choose 2 × 8-pin = 375 W? It would make no difference until you have a bigger graphics card, and it gives us a larger range of graphics cards to choose from, now and later, since this is supposed to be an evolving GPU dock.
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
DIY eGPU implementations use ATX PSUs, as you know, so there you can match your PSU to your card. 375W (12V/31.25A) can be supplied by a high-grade PSU like the 12V/46A $50 Newegg.com - Diablotek PHD Series - 2nd Generation PHD650 650W ATX12V v2.2 SLI Ready CrossFire Ready Power Supply
If you need 225W or less, then you can get by with ATX PSUs that cost $25-$35.
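As a sanity check on those wattage figures: GPU power is drawn almost entirely from the PSU's 12V rail, so the required amperage is just watts divided by 12. A minimal sketch, using the numbers quoted above:

```python
# Rough 12V-rail sizing for an eGPU PSU: amps = watts / volts.

def amps_at_12v(watts):
    """Current on the 12V rail needed to deliver the given power."""
    return watts / 12.0

for watts in (225, 375):
    print(f"{watts} W -> {amps_at_12v(watts):.2f} A on the 12V rail")
```

375 W works out to 31.25 A, matching the 12V/31.25A figure above, and the 12V/46A PSU leaves comfortable headroom.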
@Z2 owners - we still don't have confirmation of a muxed or muxless setup. You can also disable the HD6650M's HDMI audio and use the notebook's soundcard instead for better FPS, since audio and video share the x4 link. -
When on the ATI card, the Intel graphics utility can still make adjustments (color, etc.) to the internal LCD only. So I think it's safe to say the Intel card is doing the display rendering.
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
It's probably muxless then.
Could you do a full sweep of comprehensive benchmarks running the internal versus the external LCD, like I've done for my DIY eGPU here? Ensure you disable the HD6650M's HDMI audio device so the x4 link carries only video traffic, and the external LCD must be connected off the PMD.
I would expect the external LCD performance to be faster, since in a muxless setup there is no additional load on the x4 PCIe link from carrying the internal LCD traffic. You may well find then that you exceed a Sony SA's HD6630M performance. -
-
-
> Dubya wasn't really there, originally, if I am not mistaken that is Louis XIII Cognac. The Louis XIII is aged 50 years. It is pretty good. I had a bottle once, cost me $2,500!
Understood. Only "pretty good" for $2.5K - Hmm... What would the Sun King have said after that... You, I, and Louche must assemble for dinner and fine wine one evening (perhaps not in Montpelier?).
Seriously, I much enjoy the repartee alongside these very valuable contributions.
Sundial -
That would be awesome!
The thing about Hine is that it is single-cask Cognac, similar to single-malt Scotch. So a $700 bottle of Hine is much more interesting than a $2,500 bottle of blended Remy. I love Cognac, but not like I love Scotch. A good bottle of Macallan or Lagavulin is much better bang for the buck. Pretty much any Islay malt 18 years or older will do the job. That smoky, peaty flavor, plus some Filet Mignon and Lobster Bisque.
I think we could have a rather nice dinner some day. -
I was reading that article about the VAIO Z more carefully, obviously translated... it may be an error of translation, but it doesn't seem so..
"LightPeak à surmonter les nombreux défis de cette manière, la production de masse de fibre optique pour être abordée, d'autant que je pouvais être un grand atout pour le développement futur VAIO pense. LightPeak idéal originel, parce que la pensée de venir les limitations de vitesses de transmission de fil de cuivre, mais qui était censé atteindre des vitesses plus élevées en changeant de transmission par fibres optiques. En raison de l'évolution de la technologie de fil de cuivre solide, qui peuvent atteindre la même fibre optique à 10 Gbps montré que Apple et Intel Thunderbolt. Toutefois, si vous essayez les vitesses de transmission plus élevés prochaine, il devrait venir le temps de passer complètement à renoncer à la fibre optique, quelque part dans le calendrier de cuivre."
"LightPeak at overcoming the many challenges in this way, the mass production of optical fiber to be addressed, especially as I could be a great asset for future development VAIO think. LightPeak original ideal, because the thought of coming the limitations of copper wire transmission speeds, but that was supposed to achieve higher speeds by changing optical fiber transmission. Because of the evolving technology of solid copper wire, which can achieve the same optical fiber to 10Gbps showed that Apple and Intel in Thunderbolt. However, if you try the next higher transmission speeds, it should come time to move completely to give up the optical fiber somewhere in the timing of copper."
So they state that optical fiber has to be given up since (in their opinion) it wouldn't match future copper wires... what the hell are they saying??? The technology to reach 1 Tb/s over a single optical fiber already exists... and it's the same technology! Instead of using 4 lasers, it's possible to use, for example, 20 lasers at different wavelengths and raise per-laser throughput from 12.5 to 50 Gb/s (as Intel has already designed), and that's it. Obviously it would cost a lot... but we're talking about 1000 Gb/s XD
In addition, the article states that the connection between the PMD and the laptop is 10 Gb/s bidirectional. But:
1) that's not something the Sony people quoted in the article said; it's a guess by whoever wrote the article
2) writing that copper wire will kill optical fiber is foolish, since we all know the next generation of Thunderbolt will use optical fiber, as Sony already does
3) the HD6650M in the PMD is too fast to be fed by only PCIe x2 2.0 / x4 1.0
4) the transmitter/receiver connector is identical to the ones in the pics we saw in other articles from Sony, and the speed there was at least 12.5 Gb/s per laser, not less. In addition, it's clearly visible that there are at least 4 "routes" going to the demuxer: this means 4 lasers and 50 Gb/s in that direction.
All this to say: the article is good on everything except the Light Peak implementation. -
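The wavelength-division arithmetic in the post is simple enough to sketch: total link capacity is just lasers × per-laser rate. Note the 20-laser / 50 Gb/s figures are the post's reading of Intel's research demos, not a shipping product:

```python
# Back-of-the-envelope WDM capacity: each laser runs at a different
# wavelength over the same fiber, so per-laser throughputs simply add.

def wdm_capacity_gbps(lasers, gbps_per_laser):
    return lasers * gbps_per_laser

print(wdm_capacity_gbps(4, 12.5))   # 4 lasers at 12.5 Gb/s -> 50.0 Gb/s
print(wdm_capacity_gbps(20, 50.0))  # 20 lasers at 50 Gb/s -> 1000.0 Gb/s (1 Tb/s)
```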
User Retired 2 Notebook Nobel Laureate NBR Reviewer
I've read that article, where they give GPU-Z screenshots of the HD6630M showing an x4 2.0 link, but the author corrects it to be x4 1.0. If that is really the max link speed over the copper LP link, then those evaluating the current Z2 might want to skip this iteration of it and buy some other runaround like a 12.5" Lenovo X220 that can do an x1 2.0 Optimus ExpressCard DIY eGPU setup. Or even an upcoming ultrabook with an accessible mPCIe slot.
An x1 2.0 link over an ExpressCard/mPCIe slot is 5GT/s, but NVidia's PCIe compression approximately doubles that to 10GT/s. NVidia's driver only engages on a *x1* link. It may well be that it checks whether it's a PCIe 2.0 link and doesn't engage there either. We're not sure, because no vendor has yet provided gear that can maintain an x1 2.0 link over an ExpressCard or mPCIe slot. -
(e.g. PCIe Speed Test from AMD APP Power Toys | AMD Developer Central )
This should clearly show if it's PCI-E 1.0 or 2.0. -
^ I tried it on my desktop, but I was only able to reach 4.5 GB/s downstream and 1 GB/s upstream... Theoretically I should reach 16 GB/s overall... I don't think it's useful in this case... but we could give it a try anyway.
EDIT: hmm... maybe it can be useful to determine whether it's x4 2.0 or x2 2.0. -
User Retired 2 Notebook Nobel Laureate NBR Reviewer
PCIe link GT/s figure = PCIe link frequency (GHz).
PCIe 1.x x16: 2.5GT/s per lane, and as such the bandwidth is:
16 lanes × 2.5Gbit/s × (8b/10b) = 4GB/s (unidirectional)
PCIe 2.0 x16: 5GT/s per lane, and as such the bandwidth is:
16 lanes × 5Gbit/s × (8b/10b) = 8GB/s (unidirectional)
Sony Z2 x4 link bandwidth
PCIe 1.x x4: 2.5GT/s per lane, and as such the bandwidth is:
4 lanes × 2.5Gbit/s × (8b/10b) = 1GB/s (unidirectional)
PCIe 2.0 x4: 5GT/s per lane, and as such the bandwidth is:
4 lanes × 5Gbit/s × (8b/10b) = 2GB/s (unidirectional) -
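That arithmetic can be bundled into one helper; the constants are the ones used above (8b/10b line coding, which holds for PCIe 1.x/2.0 but not 3.0, which uses 128b/130b):

```python
# Unidirectional PCIe payload bandwidth for a given link width and rate.

def pcie_gbytes_per_s(lanes, gt_per_s):
    raw_gbit = lanes * gt_per_s        # 1 GT/s = 1 Gbit/s of raw symbols per lane
    payload_gbit = raw_gbit * 8 / 10   # 8b/10b coding: 8 data bits per 10 line bits
    return payload_gbit / 8            # bits -> bytes

for lanes in (16, 4):
    for gen, rate in (("1.x", 2.5), ("2.0", 5.0)):
        print(f"x{lanes} PCIe {gen}: {pcie_gbytes_per_s(lanes, rate):g} GB/s unidirectional")
```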
So we need this:
@ someone willing to run some benchmarks with the PMD in two conditions: internal screen active with the external switched off / internal screen switched off with the external active
@ someone willing to run this AMD APP Power Toys | AMD Developer Central test with the PMD and an external screen -
^ I can do it Monday; I left the dock at the office, sorry.
-
Thank you Beaups!
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
-
CPU->GPU= 4.914 GB/sec, GPU->CPU= 375.162 MB/sec
(GPU-Z reports PCI-E 2.0 x16 @ x16 2.0) -
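One rough way to read such a measurement, sketched below: real DMA transfers land well below the 8b/10b ceiling (the x16 2.0 result above is ~4.9 GB/s against an 8 GB/s ceiling), so compare the measured figure against derated ceilings. The ~0.6 efficiency factor is an assumption extrapolated from that single data point, not a spec value; note also that x4 1.0 and x2 2.0 have identical bandwidth, so this test cannot tell those two apart.

```python
# Map a measured host->GPU rate (GB/s) onto the closest nominal PCIe link.
# ASSUMPTION: ~60% of theoretical bandwidth is achievable in practice,
# based only on the single desktop x16 2.0 measurement quoted above.

CEILINGS_GB_S = {"x4 1.0 / x2 2.0": 1.0, "x4 2.0": 2.0, "x16 2.0": 8.0}
EFFICIENCY = 0.6

def closest_link(measured_gb_s):
    return min(CEILINGS_GB_S,
               key=lambda link: abs(CEILINGS_GB_S[link] * EFFICIENCY - measured_gb_s))

print(closest_link(4.914))  # the desktop measurement above -> "x16 2.0"
```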
If you're really interested in seeing what a cognac can be, forget the watch and laptop you don't need and try a bottle of pre-phylloxera cognac.
BTW, the glass in my avatar does not contain cognac. -
You and Louche are clearly beyond me on Cognacs: I had some very nice (to me) Hine Antique XO once, but never such as Tres Vielle or Family Reserve. I'll fish out my bottle of Macallan and go for the filet mignon.
But are we not leaving something out? Someone pick the cigar. -
As far as cigars go I am a huge fan of the Acid Collection by Drew Estate. They are not anything like any other cigars... they are flavor infused with oils, wines and herbs.
If you want me to pick a more classic cigar, I'd have to go with Pepe & Manuel, I don't know where to find them anymore, I don't even know if they still exist. They are Cuban seed Dominican made cigars, as far as I am concerned they are about the closest non-Cuban I've had to a real Cuban. When I was younger I'd go to Montreal and buy real Cubans, I never tried to bring them home though. -
Old Lagavulin or Laphroaig are wonderful. Laphroaig 30 is my favorite. The selection of Hine or other cognacs in a liquor store, even in a very good one, is limited. That's why there are rare spirits dealers. For those with inclination and resources, it's possible to purchase Hine sold in the '30s and other antique (and sometimes banned) spirits.
I've found Cuban cigars to be grossly over-priced, often of suspect quality and generally not worth it. Instead, try a Fuente Fuente OpusX. -
So what is in your glass Louche?
-
There's OT and then there's what this thread has turned into...
-
In our defense, we're waiting with bated breath while Nando, Crystal, you, and others do the heavy lifting on this one - waiting for outcomes from those tests and passing the time with some small talk... -
At the moment, no news from Sony and Village Tronic...
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
- who would still want a Sony Z2-specific eGPU implementation option if the best performance was achieved running an external LCD?
- what price would they pay for such an eGPU option?
So we'll have to wait and see what beaups' results are.
Can someone take a photo of the Z2's mPCIe slot?
The current DIY eGPU hardware can be made to run at x1 1.0, which with compression becomes approx x1 2.0. If you have an accessible mPCIe slot (for wifi) then that is a way you could have an eGPU implementation today.
bplus are working on getting cabling to run at x1 2.0 speed, in which case, with the PCIe compression, you'd be getting ~x2 2.0 or x4 1.0 performance. A significantly cheaper option. Maybe good enough to tide things over until Sony moves to a more standard cabling solution like Thunderbolt? -
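To put numbers on that, here is a minimal sketch, assuming the ~2x compression factor cited above for NVidia's driver (a forum rule of thumb, not a published spec):

```python
# Effective throughput of a PCIe link when ~2x driver-side compression applies.

def effective_gb_s(lanes, gt_per_s, compression=2.0):
    wire_gb_s = lanes * gt_per_s * 8 / 10 / 8  # 8b/10b payload, bits -> bytes
    return wire_gb_s * compression

print(effective_gb_s(1, 2.5))  # x1 1.0 + compression -> 0.5 GB/s (~x1 2.0 wire rate)
print(effective_gb_s(1, 5.0))  # x1 2.0 + compression -> 1.0 GB/s (~x2 2.0 wire rate)
```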
I don't care about running the notebook LCD, I need an eGPU for external displays, I'm still in.
-
I am not sure if this is what you're looking for, but another Z2 user has started to take apart his computer and took shots; hope it helps...
http://forum.notebookreview.com/sony/603509-sony-vaio-z2-vivisection.html -
-
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
-
User Retired 2 Notebook Nobel Laureate NBR Reviewer
A comprehensive benchmark set like that may well spawn off a "How to optimize Z2 HD6650M gaming performance" thread, summarized as : use an external LCD and disable HD6650M audio device. That is if benchmarks confirm it's a muxless setup. -
In addition, to test whether it's x4 2.0 or 1.0, try this one: AMD APP Power Toys | AMD Developer Central
-
Meaning - if a fiber is dedicated to the GPU and another is dedicated to sending display data back to the internal display, couldn't the Sony setup be muxless AND exhibit no performance degradation on the internal display? Just curious. -
And I'm happy to run the benchmarks, but I believe we've already confirmed it's a muxless setup, based on the fact that while running on the ATI card, the Intel graphics utility still sees the internal display and can control color correction, etc. Right? Or would that confirm it's muxed? Either way, it should confirm something.
-
There are two LP/TB controllers on the PMD, each providing 4 PCIe lanes.
How is this multiplexed over the LP/TB channels?
Maybe it's possible to get x8 1.0 PCIe using the two LP/TB controllers? -
We still need to find 12 more people... I'm going to try to get some new people interested on some French forums.
-
x8 1.0 would be possible for the GPU only if the lasers in the optical fiber carry mixed data that is muxed/demuxed by the transmitter/receiver dies themselves; additional data could then be demuxed/muxed by the TB controller and redirected to the right place. So it depends on those details.
Do you already know some forums?
Don't know if this is new information, but the Intel TB chips have at least been identified (the ones used by Apple).
AnandTech - Eagle Ridge: The Cheaper, (optionally) Smaller Thunderbolt Controller -
Light Ridge is the same standard TB chip used on Z21
-
Looks like 10Gb/s for the GPU
-
-
Yeah... it's x2 2.0... and it would become ~x4 only with Optimus compression, hacking the PMD in the way we know from Nando4.
This is sad.
They decided to use only one laser for PCIe... -
User Retired 2 Notebook Nobel Laureate NBR Reviewer
-
You predicted my question XD
-
3DMark 11 gets the exact same score on an external display (hooked to the dock) vs. the internal display. Both tests scored P1252. That was the benchmark at 1280x720, scaled to 1920x1080 displays.
For the external test, I disabled the internal LCD.
For the internal test, I disabled and physically unplugged the external LCD.
Interestingly, 3DMark identifies my card as a 6770M.
My config is i7-2620/6/256GB SSD, FWIW.
ViDock 4G for Light Peak (Z21) - POSSIBLE.
Discussion in 'VAIO / Sony' started by Crystal1988, Aug 5, 2011.