I'm planning on getting laptops for me and my gf, and I've currently got my eyes set on either the Asus ULs or the Acer Timelines. Mine will be used for many things including light gaming, so one with dedicated graphics for me (Nvidia G210M or HD4330 - both DX10.1) is an easy choice - but for my gf I don't know whether I should save money and confusion and go with simple integrated graphics (Intel X4500), or if there are enough non-gaming activities that make use of the graphics.
My question is this - what activities are accelerated significantly through use of dedicated graphics, and what could I expect? E.g.:
Converting video (to common file type) with program XYZ time decrease of ZZ%
Watching DVDs or specific types of movie files?
Etc
I've read that Flash will be GPU accelerated soon too - will this be noticeable with just Intel integrated graphics, or will it only work with dedicated graphics?
-
Currently... nothing is accelerated substantially by the GPU but games, AFAIK.
-
masterchef341
games and video.
Get an integrated chip for her from ATI or Nvidia - no one deserves the shoddiness that is Intel integrated graphics.
9400M perhaps? -
Here's a random one for you... password cracking.
There are programs out there that use the GPU in addition to the CPU/cores to speed this up, especially since CUDA was released.
That's more of an informational tidbit for you - I don't expect you to do much password cracking. A rough sketch of what that offload looks like is below.
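For anyone curious, here's a hedged toy sketch in CUDA of why this parallelizes so well: each thread hashes one candidate and checks it against a target. The FNV-1a hash, the tiny word list, and the fixed candidate length are all made up for illustration - real crackers implement MD5/SHA-1/NTLM kernels and generate candidates on the device.

// Each thread hashes one fixed-length candidate and compares it to a target hash.
#include <cstdio>
#include <cuda_runtime.h>

#define CAND_LEN 8   // candidate slots, padded with '\0'
#define NUM_CAND 4   // tiny demo word list

__device__ unsigned int fnv1a(const char *s, int len) {
    unsigned int h = 2166136261u;               // toy hash, illustration only
    for (int i = 0; i < len && s[i]; ++i) {
        h ^= (unsigned char)s[i];
        h *= 16777619u;
    }
    return h;
}

__global__ void crack(const char *cands, unsigned int target, int *found) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < NUM_CAND && fnv1a(cands + i * CAND_LEN, CAND_LEN) == target)
        *found = i;                             // report which candidate matched
}

int main() {
    const char h_cands[NUM_CAND][CAND_LEN] = {"letmein", "qwerty", "hunter2", "dragon"};
    // Target: hash of "hunter2", computed on the host with the same toy function.
    unsigned int target = 2166136261u;
    for (const char *p = "hunter2"; *p; ++p) { target ^= (unsigned char)*p; target *= 16777619u; }

    char *d_cands; int *d_found, h_found = -1;
    cudaMalloc(&d_cands, sizeof(h_cands));
    cudaMalloc(&d_found, sizeof(int));
    cudaMemcpy(d_cands, h_cands, sizeof(h_cands), cudaMemcpyHostToDevice);
    cudaMemcpy(d_found, &h_found, sizeof(int), cudaMemcpyHostToDevice);

    crack<<<1, NUM_CAND>>>(d_cands, target, d_found);   // one thread per candidate
    cudaMemcpy(&h_found, d_found, sizeof(int), cudaMemcpyDeviceToHost);

    printf("match: %s\n", h_found >= 0 ? h_cands[h_found] : "(none)");
    cudaFree(d_cands); cudaFree(d_found);
    return 0;
}

The only reason this scales is that every candidate is independent, so a card with hundreds of stream processors can test hundreds of them at once.
-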
Mostly video compression via CUDA. Try Badaboom.
-
Almost 3 years later, same laptop, different story on the light gaming.
GTA IV
Ghostbusters
Fallout 3
Batman: Arkham Asylum
Red Faction: Guerrilla
Left 4 Dead
Orange Box
Etc, etc...
Just a word of advice: unless you have plans to do heavy gaming on another computer/desktop, always keep in mind that you might want more later on. -
Scientific Code.
-
Windows 7 does an incredible job of actually utilizing the GPU when viewing most types of media. Most video playback is GPU-accelerated when played through Windows Media Player in Windows 7, which takes a huge burden off your CPU when you are viewing hi-def H.264 or MPEG-2.
It will also use the GPU to transcode video for mobile devices, but you already mentioned that.
Flash 10.1 will finally allow people to watch things like Hulu in HD, even if they have mediocre CPUs. 10.1 hands H.264 decoding off to the GPU, if I have my facts straight (which I may/may not). I assume an X4500 would be able to do some decoding, provided it has an H.264 decoder built into it (I don't know if it does). Flash 10.1 won't be coming out for at least 3 or 4 months though, so it's hard to really stake a huge amount on something that far away.
Eventually the GPU will act as a full co-processor, essentially offloading ANY task that can be parallelized, including things like OS boot or working in Office, but that is several years away. A minimal example of what that kind of offload looks like is below.
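To make "offloading a parallelizable task" concrete, here's about the smallest possible CUDA sketch: a SAXPY (y = a*x + y) over a million floats, one thread per element. The sizes and constants are arbitrary - the point is just the shape of it: copy data over, let the GPU chew through it while the CPU does something else, copy the result back.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one element per thread
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                           // ~1M elements
    size_t bytes = n * sizeof(float);
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    int threads = 256, blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, dx, dy);     // CPU is free while this runs
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %.1f (expect 5.0)\n", hy[0]);
    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}

Anything with that structure - video frames, protein-folding steps, hash candidates - is a candidate for the GPU; anything that's one long chain of dependent steps isn't.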
-
You can also fry eggs with some video cards.
-
3D animation creation software.
-
You can play back HD content with the latest CoreAVC codec via CUDA. It works very well. That's my only personal experience with using a GPU in a non-gaming situation. Your girlfriend may not be playing games, but she'll surely want to watch a few movies.
-
You don't even need CoreAVC; Media Player Classic HC supports GPU acceleration using, I think, one of the FFmpeg codecs.
And Badaboom is pure tripe from a quality perspective - it doesn't even support 2-pass encoding or single-pass CRF. -
jenesuispasbavard
With Nvidia GPUs, you can use CUDA applications on projects like SETI@Home and Folding@Home to considerably speed up computation. I use my GTX 260M for SETI@Home and it's much faster than using just the CPU (although my CPU is only a P7350).
They're adding support for ATI GPUs "soon" (it's been "soon" for quite some time now - hopefully DirectCompute or OpenCL will solve all these issues). A toy example of the kind of work these clients hand to the GPU is below.
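For the curious, the work these clients offload is basically big data-parallel number crunching. Here's a hedged CUDA sketch of the pattern (the data is just a million ones - the real projects obviously do far more involved math): sum a large array with a per-block tree reduction in shared memory, then finish the handful of partial sums on the CPU.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void block_sum(const float *in, float *out, int n) {
    __shared__ float buf[256];
    int tid = threadIdx.x;
    int i = blockIdx.x * blockDim.x + tid;
    buf[tid] = (i < n) ? in[i] : 0.0f;
    __syncthreads();
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) buf[tid] += buf[tid + stride];  // tree reduction
        __syncthreads();
    }
    if (tid == 0) out[blockIdx.x] = buf[0];               // one partial sum per block
}

int main() {
    const int n = 1 << 20;
    const int threads = 256, blocks = (n + threads - 1) / threads;
    float *h_in = (float *)malloc(n * sizeof(float));
    for (int i = 0; i < n; ++i) h_in[i] = 1.0f;

    float *d_in, *d_out;
    cudaMalloc(&d_in, n * sizeof(float));
    cudaMalloc(&d_out, blocks * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

    block_sum<<<blocks, threads>>>(d_in, d_out, n);

    float *h_out = (float *)malloc(blocks * sizeof(float));
    cudaMemcpy(h_out, d_out, blocks * sizeof(float), cudaMemcpyDeviceToHost);
    double total = 0.0;
    for (int b = 0; b < blocks; ++b) total += h_out[b];   // finish on the CPU
    printf("sum = %.0f (expect %d)\n", total, n);

    cudaFree(d_in); cudaFree(d_out); free(h_in); free(h_out);
    return 0;
}
-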
Encoding, and for Nvidia chips, they will be able to run Flash HD off the GPU (I would also anticipate that sooner or later ATI will be able to do it too). I think we are going to see a lot more apps move to the GPU this year.
You don't want to be stuck with the garbage Intel calls a graphics chip; you will just be disappointed down the road. They have nothing remotely capable of any GP-GPU functions and won't until/if Larrabee sees the light of day. That's going to be a long time, if ever. -