Will the ViDock 4 power the laptop display? Or is an external one still needed? Apologies if it has already been mentioned!
-
King of Interns Simply a laptop enthusiast
-
moral hazard Notebook Nobel Laureate
From what I understand, the ViDock 4 will need an external screen.
Basically, the ViDock 4 will allow more powerful cards.
There is another ViDock thread in this forum which explains it all many times over. -
-
-
masterchef341 The guy from The Notebook
for that, we can use benchmarks (already provided to us by Steiner), and if we want, we can compare them to standard 4650 benchmarks to see how performance changes. -
Would getting the vidock to power the laptop screen need a hardware revision of the vidock, or would it just be a software thing?
-
masterchef341 The guy from The Notebook
it would need a hardware revision of the ViDock, designed for a different type of laptop specifically built to be able to drive the laptop's display externally.
-
I recall paladin saying that this feature should be possible with ATI in the future with the current hardware
-
Seems to me that if you can make the machine understand that there is another video card available, it could set up some sort of Crossfire.... -
Sweet. I thought it was only if you had an AMD/ATI combo set up though, not just ATI. If it's just an ATI card, SWEET! I never heard of any sort of Crossfire thing either. From what I hear, the laptop's GPU "disappears" to the computer. Then again, who am I to know?
If you're right, that is sweet.
-
I feel like using ViDock on the internal screen is possible,
Something like this http://snipurl.com/n5qe0 (switching between internal/ViDock instead of PhysX)
I hope Steiner32 could do it for us, when he has some free time for testing.
If Steiner is willing to do it, then good luck & thanks in advance
Edit: Sorry about this, I realized it won't work (LOL) - I will leave the link just in case (maybe?) the extended monitor thing might work -
I wouldn't count on internal display acceleration.
Think about it: for Crossfire or SLI, you need to do half of the computation on one card, half on the other, then combine the results somehow and output the signal through DVI in the usual way. (Back in the day, the good old Voodoo used to do this kind of stuff by having one card render one half of the screen and the other card the other half; simple, yet effective!)
For ViDock to accelerate the internal screen you would have to pull off a completely different kind of mumbo-jumbo. Instead of outputting through DVI, you would need to get the signal 'back' somehow through PCI-E and patch it through to the internal display. That would require some special hardware in your laptop. So there is no way to do it.
What you could actually do is write some fancy driver that offloads part of the computation to the ViDock, thus using it as an accelerator rather than a full-fledged graphics card. But then the ExpressCard would be the biggest problem! Think about how much data the graphics card actually outputs versus its input. For example, if you had a simple static scene, once you load the data to the GPU, you have little to no input traffic, while the output can be 300 frames per second. That's a huge amount of data. In this case, the EC would limit you very severely.
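To put rough numbers on that output stream, here's a minimal back-of-envelope sketch. The 1680x1050 resolution, 4 bytes per pixel, and the ~250 MB/s usable ExpressCard rate are my assumptions, not figures from the thread:

```python
# Raw size of the rendered-image stream that would have to come *back*
# through the ExpressCard (assumed: 1680x1050 display, 4 bytes per pixel).
width, height, bytes_per_pixel = 1680, 1050, 4
frame_mb = width * height * bytes_per_pixel / 1e6   # ~7 MB per frame

for fps in (30, 60, 300):
    print(f"{fps:3d} fps -> {frame_mb * fps:6.0f} MB/s of raw frames")

expresscard_mb_s = 250  # assumed usable rate of one PCI-E 1.x lane
print(f"ExpressCard budget: ~{expresscard_mb_s} MB/s")
```

Even at 60 fps the raw frames alone (~420 MB/s) would already exceed the whole ExpressCard budget, never mind 300 fps.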
Of course, that's the most stupid, direct approach one could take, and we could optimize. Since we're not using the ViDock GPU as a GPU, but rather as an accelerator, we could crunch the numbers in the ViDock, transfer the result to the internal laptop GPU and then use that to output our 30 (or 300) FPS. But still, we would have to get all the data necessary to render the whole scene for every change, and that might be a lot.
OK, I'm not saying it can't be done, because it can. But the question is: who's going to do it? To pull it off you'd need specialized drivers for the GPU. ATI won't do it (why would they; ViDock is an external product), so that leaves VillageTronic. I doubt they have the resources to spare to write their own ATI/NVIDIA drivers.
On a side note, ATI did something like this. There is one Fujitsu-Siemens notebook that has an external GPU which can accelerate the internal display. I'm not sure how it works, though. But that was a collaboration between FSC and ATI. If you had full source code for the ATI drivers, you could play with ideas like that. But I really, really doubt (although it would be nice if I were wrong!) that an external company, without direct help from ATI, can afford to do something like this. -
VillageTronic IS working with ATI directly on this. It is the ATI rep that has told them that it could be done.
-
-
Donald@Paladin44 Retired
There is no more info than what Steiner32 said. ATi is claiming it will work, so it is up to them to make it happen.
-
King of Interns Simply a laptop enthusiast
Would be great if NVIDIA helped too. Competition is always healthy and good for us, the consumers.
It means less time to wait and possibly cheaper solutions!
-
paladin44: thanks for the info! -
But ViDock bypasses the internal GPU. That'd be cool, but right now it can't happen.
Super sweet on the laptop screen and ViDock combo. Hope ATI can get some driver support there... or maybe they should just work on trying to keep their own up to date.
ATI + drivers... ehh.
I hope they do work on the screen though. In all actuality, if they can get that to work, more people would buy their GPUs, which gets them more $, which is a big plus for them. Also, more money from a source that NVIDIA can't touch is A SUPER PLUS, which they should be eager to exploit. -
masterchef341 The guy from The Notebook
It's just going to cost more precious ExpressCard bandwidth...
I hope to see them focus on getting a specialized external GPU port. Maybe I can get some more info on the Amilo. -
-
What you don't get is that a 4670's required bandwidth is 21.3 GB/s at max, but PCI-E 2x supplies 1 GB/s, which is what an ExpressCard 2.0 slot is - or am I confused? Which still doesn't explain how one would get that much data through and only get a 10% drop, as stated on the ViDock Facebook. Steiner, where are you...
-
tapper, that's the max theoretical bandwidth, which is nice if you need to load something quickly (a new enemy pops up and you need to load his textures, etc.). The big question is how much actually gets pushed through PCI-E when you're playing the game (and when! if it's during the loading phase, it may load longer, but you won't notice a difference during gameplay). This depends on the engine, so you can't really say "ViDock performs 10% worse" and expect it to be valid across the board.
Isn't EC 2.5 Gbps, which is the same as PCI-E 1x? -
2.0 is out now, which is the same as 2x apparently, and ViDock uses that. Thanks, rep+
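For reference, a quick sketch of the links being thrown around (the spec figures are usable per-direction rates after 8b/10b encoding; my guess is that the 21.3 GB/s quoted above is the 4670's local memory bandwidth, which never has to cross the bus at all):

```python
# Link-rate comparison (usable MB/s per direction, after 8b/10b encoding).
links_mb_s = {
    "PCI-E 1.x x1 / ExpressCard 1.x": 250,
    "PCI-E 2.0 x1 / ExpressCard 2.0": 500,
    "PCI-E 1.x x16 (desktop slot)": 4000,
    "HD 4670 local memory (my guess)": 21300,  # on the card; never crosses the bus
}
texture_batch_mb = 256  # hypothetical batch of textures/models to load
for name, rate in links_mb_s.items():
    print(f"{name:32s} -> {texture_batch_mb / rate:6.3f} s per {texture_batch_mb} MB")
```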
-
I think they finalized the standard in late March, and it was just announced. But you won't see any laptops with EC2 any time soon.
What VillageTronic means is that when you finally get EC2, the ViDock that you buy now will run faster and take full advantage of it. -
Donald@Paladin44 Retired
^--^
Exactly.
What so many of you are missing is that all your calculations and suppositions just don't matter. It works, and it works beautifully as can be seen by both of the videos that have been posted.
The ViDock2 is for the user that has a laptop with integrated video, or a lower-end discrete video card maybe a year or two old, possibly issued by their company or a gift from Grandma, that can now be a gaming laptop and can be upgraded generation after generation as new cards come out by only buying the new video card. It is for the traveler who wants a nice lightweight 12.1" laptop on the road, but once back home can turn it into a nice gamer. It is for anyone who DOESN'T have the uber gaming rig, for whatever reason, but still wants to game without having to buy a new laptop, and wants to be able to upgrade as new-generation cards come out without having to buy a whole new laptop to get a better card.
It is NOT for the filthy rich geek with a water-cooled desktop with GTX 285's in SLI plus a laptop with 4870's in Crossfire, who buys new ones every year or two... or the person that would be just as happy with a desktop and NO laptop (yup, there are still some of those left out there), or the user that wants to spend thousands every year or two to buy new rigs.
So, filthy rich geeks and those who would prefer a desktop...this is not for you...so quit trying to smash and bash it just because it doesn't fit your needs and let the rest of us enjoy our fun with needs that are different than yours...
Get it? -
Will this run Aion (same engine as Far Cry 1), maxed out, with the ViDock then?
* Inspiron 15 (1545) Laptop: Intel Core 2 Duo P8700 (2.53GHz/1066MHz FSB/3MB cache)
* Genuine Windows Vista Home Basic
System Price: $589.00
Operating System: Genuine Windows Vista Home Basic
Memory: 4 GB DDR2 SDRAM 800MHz (2 DIMMs)
Hard Disk Drive: 160 GB SATA Hard Drive (5400RPM)
Video: Intel Integrated Graphics Media Accelerator X4500HD
Web Camera: No Camera
Media Bay: 8X DVD +/- RW w/dbl layer write capability
Certified Refurbished: Certified Refurbished
Base: Inspiron 15 (1545) Laptop: Intel Core 2 Duo P8700 (2.53GHz/1066MHz FSB/3MB cache)
Hardware Upgrade: 6 Cell Primary Battery; 65W AC Adapter
Software Upgrade: Windows Live; Microsoft Works 9.0; 32BIT Operating System CD
System Color: Jet Black
Laptop Screen: 15.6" WXGA Laptop Screen Display with TrueLife
Network Interface Card: Dell 1397 802.11B/G Wireless Mini Card -
paladin44: exactly! I want to have a laptop only, so I'm hoping to pair up ViDock with either Thinkpad T400s or Vaio Z. Both pack a punch, but not in the GPU department; so ViDock would make a perfect addition
tapper: as tests have shown, you can expect ~6.3k 3D Mark 06 performance, which should be great for many games
But my take is this: if you can, get a 1 GB graphics card to run with the ViDock. I have no tests to back it up, but logic tells me that it WILL make a difference ;-) -
-
-
Thanks. I just don't want to make a mistake and be out $370. I'm on a limited budget, so the best bang for the buck is everything for me. Being upgradable is the only reason why I'm buying, and it's a desktop GPU.
-
Hey tapper, about the 4670 using 21.4 GB/s: I think (I know nothing about this stuff) Aule said that the information going into the GPU isn't exactly the whole 21.4 GB/s. I think what he was saying is that a lot of it is what is going out of the GPU, not in. So the GPU might not need a lot going in, but it needs a lot going out. Might be why it needs to go straight to an external monitor, because the ExpressCard can't take the 21.4 back in.
I could be 100% off course, but that's what I got out of it. PLEASE CORRECT ME! I don't want to be too far off. -
Randomdude: well, yes and no. The bus is 20 Gbps wide for a reason - you want to be able to load a lot of data very quickly (textures, models, etc). But you won't be doing it all the time; that's peak performance.
I'll give you an example: imagine rendering a static scene at 60 fps. Say you're playing an FPS game, you're in a room with no windows, a few things lie around, you're standing still. Nothing moves, nothing changes, like a photo. When you start rendering, you need to transfer some data (might be a lot) from memory to the graphics card. With 20 Gbps it's going to be almost 10x faster than with 2.5 Gbps. But the scene we're rendering is static, so once the data is there, the GPU simply crunches the numbers and outputs the same image 60x per second. So after loading the initial data, you don't transfer anything anymore.
Now, what happens when you move? All geometry, textures, shaders, everything is already in the GPU memory. So you just need to send a tiny bit of information that the position of the camera changed, and the GPU will produce a new image.
Now, you open the door and go outside.
That possibly means new stuff - a new chunk of game world to show - and it needs to be loaded. And the big question is (one that I can't answer - we need a game developer for that): how much is there to load?
Let's say there is 500 MB of data (that's a lot, but I'm using extremes to illustrate the point). With 20 Gbps that's 1/5 s; with 2 Gbps it's 2 s. So, ViDock would fall behind. If you needed to constantly swap huge chunks of data, the difference between ViDock and a desktop would be really noticeable.
But the nice thing is that this doesn't happen in real life. Before the game starts you load all the stuff, and if the amount of memory on your GPU is big enough, it just stays there until you finish (you load small amounts of stuff as you go and big chunks on loading screens). You might need to add something later on, then add something else, etc., but if it still fits, then you're good. If it doesn't, then you need to start swapping between computer memory and GPU memory. Both are blazing fast, so it's not noticeable. But ViDock limits that bandwidth.
So, final example: if you had a 128 MB GPU and a game that required 512 MB of data to render, on ViDock it would work like crap, while on normal PCI-E it would be MUCH better.
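Putting those numbers into a tiny script (a sketch only: the 500 MB chunk and the 128 MB / 512 MB split are the illustrative extremes from above, and 2.5 Gbps is the ExpressCard figure mentioned earlier in the thread):

```python
# Transfer-time arithmetic for the examples above (all figures illustrative).
def transfer_seconds(megabytes, gigabits_per_second):
    return megabytes * 8 / 1000 / gigabits_per_second

chunk_mb = 500
print(transfer_seconds(chunk_mb, 20))    # desktop-class bus: 0.2 s
print(transfer_seconds(chunk_mb, 2))     # the 2 Gbps figure: 2.0 s
print(transfer_seconds(chunk_mb, 2.5))   # ExpressCard 1.x: 1.6 s

# Swapping cost when the game needs more data than the GPU can hold:
vram_mb, needed_mb = 128, 512
overflow_mb = needed_mb - vram_mb          # 384 MB must keep shuttling over the bus
print(transfer_seconds(overflow_mb, 2.5))  # ~1.2 s of pure transfer per full swap
```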
I've got an idea - I'll try to post this question (peak data rate/peak amount) to some game developers, like Valve or id. They must have answers (everyone does extensive engine profiling) and might be willing to share them! -
Thanks, Aule. Yeah, I think I get it a little. Basically, if you have a 1 GB video card, it should *theoretically* work closer to normal (desktop) because it can hold more data. ViDock flops on the whole *new* information transfer.
Thanks a lot.
Oh wow, that actually answers a lot of questions I've had about stuff before.
Thanks. And yeah, it would be cool if you could ask some game dude peoples.
-
So, putting in a 2-gig (4870X2) card would give it even better results, because it could hold the most data at a time, right?
Also, this is WAY OFF TOPIC, but how exactly does a game get processed?
HDD to CPU to video card to screen?
And lastly, people have been throwing around the idea of Crossfire. Wouldn't that not work, because that would be a lot of info the two cards have to communicate, and with only the ExpressCard 2.0 being the single gateway to each other, it would suck. Right?
Again, I'm asking a lot that is probably off topic, but why not -
Games work on computers exactly like they do on an Xbox (I know exactly how that works, so I'm doing it that way). The game data is stored on a ROM disc or HDD. The OS of the computer/console then says, "Well, will you look at that, the user wants to use this information," and so it executes the code in that file, which starts the game. The most important information is then moved to the RAM, controlled by the CPU; this is when you hear your HDD spinning. After the info is moved, you'll see the game come up, and you can interact with it at the entrance screen.
Now say I want to play multiplayer. I, the user, input "multiplayer," which executes another order to connect to a server with other people, so we take a ride over the lines. (This is all CPU so far.) We head over to the NIC, and it lines up the data to be shot out to that server via your line of transportation. The server over there sends back a signal that it has received your info and is letting you join.
This is when your GPU actually starts doing its stuff. After you've joined, the server takes into account where you are and sends info to your computer about what's going on, which is then processed by your CPU. Once the CPU has figured out all that info by using the game's code, it starts to send the info to be displayed to your GPU; this is just small information, nothing big. The GPU then starts to work its magic: the processor on the GPU uses the information from the CPU to draw up the screen (depending on the settings, you get more complicated items), and the GPU uses its onboard RAM to process and line up each pixel's color and location. And you, the user, are sending information to the CPU through an input device, which then sends new info to the GPU, which then redraws the screen. Everything from connecting to the server onward is happening all at the same time, about 30+ times a second. This is pretty basic, but unless you want to end up on page 17 of this thread with carpal tunnel, this is what you're getting.
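A toy version of that flow as straight-line code (every function here is a hypothetical stand-in stub, not a real engine API):

```python
# Toy single-threaded game loop; real engines overlap all of these stages.
def load_from_disk(path):          # HDD/disc -> RAM (hear the HDD spin here)
    return {"textures": [], "models": []}

def gpu_upload(assets):            # RAM -> GPU memory, over PCI-E/ExpressCard
    pass

def poll_input_devices():          # user -> CPU
    return {"move": "forward"}

def exchange_with_server(inputs):  # CPU -> NIC -> server -> back
    return {"other_players": []}

def simulate(inputs, net_state):   # CPU runs the game code
    return {"camera": (0.0, 0.0, 0.0)}

def draw_and_present(scene):       # small per-frame update; GPU redraws the screen
    pass

assets = load_from_disk("level1.pak")
gpu_upload(assets)                 # the big one-time transfer
for frame in range(30):            # real games repeat this 30+ times a second
    inputs = poll_input_devices()
    net_state = exchange_with_server(inputs)
    draw_and_present(simulate(inputs, net_state))
```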
-
masterchef341 The guy from The Notebook
It all depends on how you optimize performance given the design. Take a game like Crysis. You are outdoors and have a huge range of vision. At any given moment, you may be looking at basically anything that exists in the game. Not surprisingly, Crysis on a GPU with a large amount of onboard memory (shared memory will kill the ViDock, I expect, but that is a non-issue) holds up pretty well. To optimize performance, they load up a ton of stuff into GPU memory and keep it there. At 1x PCI-E 2.0 (about twice as fast as ViDock?) you still get about 85% performance or more, possibly pushing into the 90s.
But then you look at a game like CoD4. Relatively smaller areas, with a lot of action and unique details in those areas. And CoD4 just absolutely suffers at 1x. Huge performance drops, like an 80% loss in performance. It is possible that CoD4 optimizes performance on normal machines by limiting the amount of info stored on the GPU. Since they expect to have access to a certain amount of bandwidth, they can do this to lower the system requirements without changing performance on the high end. Obviously this concept is defeated if you restrict your bandwidth. -
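One way to see why the two games could differ so much: model frame time as GPU compute time plus per-frame bus transfer time. Every number below is invented to show the shape of the effect, not a measurement of either game:

```python
# Toy model: frame time = compute time + per-frame bus transfer time.
def fps(compute_ms, per_frame_mb, bus_mb_s):
    transfer_ms = per_frame_mb / bus_mb_s * 1000
    return 1000 / (compute_ms + transfer_ms)

workloads = [("preloads everything (Crysis-like)", 0.5),
             ("streams assets per frame (CoD4-like)", 8.0)]
for name, per_frame_mb in workloads:
    full = fps(20, per_frame_mb, 4000)  # roomy desktop bus
    slow = fps(20, per_frame_mb, 500)   # one PCI-E 2.0 lane
    print(f"{name}: keeps {slow / full:.0%} of its framerate at 1 lane")
```

The preloading workload barely notices the narrow bus; the streaming one loses a big chunk of its framerate, which matches the pattern in those benchmarks.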
Well, that answers my question. Aion is like that, where it loads the entire map for each world and is outside with lots of visual effects everywhere. Thank you two. However, I believe once I get the ViDock I will try CoD4 and give you guys feedback. Hope I can install ExpressCard 2.0 in place of my 1.2; that would be very nice. To my knowledge it would just be a different adapter slot, maybe...
-
I will try to borrow the laptop again - and see how COD4 and Far Cry work. Will try that later today or tomorrow.
-
What laptop you got now?
-
-
masterchef341 The guy from The Notebook
My theory (really just speculation) has to do with WHY CoD4 gets a huge performance drop at PCI-E 2.0 1x (almost twice ViDock bandwidth), not whether or not it happens.
On some level, it just depends on the implementation and the engine.
But there is not necessarily a direct correlation between losing performance at 1x bandwidth and graphics.
Crysis holds up well under 1x, and other AAA games do also. CoD4 and FSX suffer huge losses. -
That's a good speculation, masterchef. (PS, thanks for not ripping on this about desktops any more; you've come to the dark side, my young grasshoppa. lol jk) I'd be curious how that works. Basically it all depends on the game. lol... this will be hit or miss with games for whoever gets it. However, it will probably only fail in a select few. Either way, it's gonna be better than an Intel 4500 or ATI 1300 lol. I'm still shooting for the most memory on a vidcard.
ATI 4870X2 with TWO gigs.
And back to tapper:
Basically it's HDD -> RAM -> CPU -> ExpressCard 2.0 -> GPU -> screen.
To me, it makes sense, and that is why ViDock works and why Crossfire doesn't work. With Crossfire, all the data needs to be shared over ONE ExpressCard 2.0 link! We are already talking about performance loss because it's only ExpressCard 2.0; Crossfire would actually shoot us in the heel. imo -
masterchef341 The guy from The Notebook
don't get me wrong, this thread has inspired me to take one of my old computers and upgrade it for $300 to dominate the ViDock. I haven't totally changed sides here. Still, I am curious about how the ViDock works and I also know the ViDock could come down a lot in price. If it came down sufficiently, I might jump on it.
ATI 4850 $100
Phenom II Triple Core $100
4 GB memory $50
Motherboard $50
Literally comes out to $298 + tax.
Case, power supply, keyboard, mouse, monitor, DVD drive and operating system I already have from my old build.
With the ViDock, you still need to have access to at least a monitor and mouse. So really, even starting from scratch, you just need a Case, power supply, keyboard, and DVD drive, and I believe we still have $100 left in the budget. -
Yeah, very very tempting, I know. I have thought about going desktop, but I need portability. I think a great idea would be to have an HDD that you could interchange: pop it out of your laptop and into your desktop easily. That would kick @$$. Get a cheap laptop for like $400-500 and then a good, decent desktop for like $700. Game at home and be portable on the go w/o syncing. Voila.
But that wouldn't work for some either, and it would require laptop companies to take up the design.
I just want USB 3.0, ExpressCard 2.0, a fast 32nm CPU, a 40nm DX11 GPU (ViDock), Windows 7, a crappish GPU for the laptop, an LED screen, and a 500GB+ 7200RPM HDD. lol, no biggie. Waiting till I graduate next year; then it'll be my own present.
Now to wait on the CoD4 tests. Steiner is awesome. Props man, I would've gotten a little annoyed after doing all this. Thanks
-
-
-
You could optimize by not preloading too much, but it's not an optimization to limit what gets stored in onboard RAM. Once it's there, it costs you nothing.
Can you test other UT3 games? Last Remnant has a free benchmark. You could try Bioshock or UT3 itself too.
CoD4 - I haven't seen any tests. Can you give me a link? Also, how did you test it? Did you use ViDock, or did you run a desktop with a slow PCI-E lane?
I'm curious: does the performance improve if you do a second run of the same map (during normal gameplay, not a benchmark of any sort)? What happens if you load the game and just stand still? What's your framerate?
Isn't PCI-E 1x the same as the EC in ViDock? I haven't had a desktop for years, so I'm not sure what 2.0 means here. Is that not what you get with EC?
Last Q: what's FSX? -
To Aule: ExpressCard 2.0 is basically 2x as fast as 1.0, and I am 95% sure that PCI-E x1 is the same thing as ExpressCard 1.0 -
masterchef341 The guy from The Notebook
Sorry for not being clear enough.
PCI-E 1x is basically the same thing as ExpressCard, bandwidth-wise.
The test I was looking at for CoD4 and Crysis performance was a desktop using a PCI-E 2.0 card restricted to 1 lane.
1 PCI-E 2.0 lane = 2 PCI-E 1.x lanes.
So, again, to be clear, the benchmarks that I was looking at showed that Crysis held 85-90%+ of its performance at approximately 2 lanes of PCI-E bandwidth. Call of Duty 4 did not hold up so well.
I can't find the link, but I posted it here in a different thread. It is a Tom's Hardware article where they explored and benchmarked reducing lanes on a PCI-E 2.0 graphics card. -
FSX = Flight Simulator X. Don't believe the hype about Crysis; THIS is the game that can easily surpass Crysis and GTA IV in its ability to bring PCs of any configuration to their knees.