In my quest for a new laptop, I've looked at graphics cards from both ATI and Nvidia (HD 2600/3600 vs GeForce 8600GT/9500GS). Both companies like to trot out various fancy tech specs about their cards, but one spec really has me scratching my head (well, that, and how some OEMs get away with sticking in DDR2 and claiming it's DDR3, or conflating TurboCache numbers with discrete memory): hardware decoding of various video formats. It seems ATI covers a lot more formats, including more obscure ones, than Nvidia. Is it that Nvidia just lets the CPU do more, or do they not report all the formats the card does handle, or do all cards decode video on the card (and ATI just advertises it more)? Do ATI's Avivo and Nvidia's PureVideo basically function the same, or does one actually force the CPU to do more work than the other?
I kinda would have thought the notebook would let the video card decode all video to begin with. I understand encoding or transcoding would be handled by the CPU.
2000 Series
3000 Series
Only difference I can see between the 2000 and 3000 series is that the 3000 has DisplayPort and integrated DTS decoding (and that's audio). Then again, I'm not an utter hound for numbers.
Now for Nvidia's PureVideo. They don't have the 9 series cards up on the site, but this is their page about PV.
-
Discrete graphics cards primarily handle 3D processing; video decoding has always been done by your CPU. Keep in mind that PureVideo HD only works in certain applications.
I think the general opinion on these HD hardware decoders is that a good software decoder (CoreAVC Professional) in Media Player Classic performs better than what you get from hardware decoding with PowerDVD or other supported software players. -
You don't get DDR3 memory in graphics cards right now anyway...
GDDR3 is a die-shrink of DDR2; I think that's what's getting people confused. GDDR3 runs cooler, thus allowing a higher clock speed! -
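To put the clock-speed point in rough numbers: peak memory bandwidth is just the effective transfer rate times the bus width in bytes. The transfer rates and 128-bit bus below are illustrative assumptions for a mid-range mobile card, not any specific card's specs:

```python
# Rough peak-bandwidth arithmetic for graphics memory. The transfer rates and
# bus width below are illustrative assumptions, not any specific card's specs.
def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfer rate (MT/s) * bus width in bytes."""
    return transfer_rate_mts * 1e6 * bus_width_bits / 8 / 1e9

ddr2_like = peak_bandwidth_gbs(800, 128)    # 800 MT/s on a 128-bit bus -> 12.8 GB/s
gddr3_like = peak_bandwidth_gbs(1400, 128)  # 1400 MT/s on the same bus -> 22.4 GB/s
print(ddr2_like, gddr3_like)
```

Same bus width, but the higher clock GDDR3 can sustain translates directly into more bandwidth, which is why the die-shrink matters.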
TheGreatGrapeApe Notebook Evangelist
These may help answer some of your questions ;
http://tibrium.neuf.fr/DXVASupport.html
http://www.tomshardware.com/reviews/avivo-hd-purevideo,1711.html
Both are powerful enough to do much of the decoding work, and the only area where nV falls short is VC-1; however, it's not as demanding as H.264.
The main issue, though, is that when you play Blu-ray and HD DVD discs, the system needs to do the decryption and then decode both video and audio, so the hardware assist helps on high-bitrate titles. -
Thanks. Interesting reads from everyone. So ATI apparently does have a bit of an advantage. My new laptop really does need to handle everything I throw at it well, and I'd be willing to sacrifice a little gaming performance for a big increase in HD content processing.
-
I don't think there will be that great a difference. Though in ATI's favor, with their hardware you can run Folding@Home and process more data faster than a PS3. Nvidia can't do that (yet).
Just make sure you are getting a decent dual-core processor before worrying about the GPU. Quite a bit of work is still done on the CPU, and you can't guarantee all video applications will work with your hardware acceleration. -
...I think you ignored my post.
Hardware HD processing runs slower than a quality software processor. -
I'd configure whatever laptop I wanted with a T8300 or 9300, so I think I'll have enough CPU power.
-
masterchef341 The guy from The Notebook
truth is, if you get a new laptop with a decent core 2 duo processor, it is going to do everything you throw at it well. i would get the nvidia gpu on the basis that it is faster in games (if you plan on gaming).
i think about it this way. any good processor can handle any video format. the purevideo / avivo stuff is really only to get richer color and do some advanced image processing on the frames before you see them to increase the quality. so if you start from a dvd and you are using the proper application, you can get really excellent quality out of the dvd. more than you would expect from a dvd.
but if you are watching videos that are transcoded from dvd, then you are already losing image quality, and even a good quality rip / transcode from dvd matched with the image processing software isn't going to match up to a dvd straight from source.
imo, you should pick a camp. either be a quality nerd and watch high quality source media and match it with the image processing software, or watch the transcoded versions and your processor will be able to more than handle it. -
TheGreatGrapeApe Notebook Evangelist
Lower CPU utilization is a goal for power consumption and heat concerns for many laptop users, as well as for fluidity of playback on the lower-end systems often found in laptops.
Sure, get a high-speed multi-core setup and it'll do the job just as well, but that ignores the benefits of hardware assistance.
An Aston Martin Vantage and a Honda Civic Hybrid can both go up to the same speed limits in the US (55/65/75, etc.), but the Hybrid will do it more efficiently, which is the goal here.
Your statement is equivalent to saying GPUs aren't needed because CPUs can render the graphics in software. -
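The efficiency argument can be sketched with back-of-the-envelope numbers. This assumes a crude linear model where CPU package power scales from an idle floor up to TDP with utilization; the 8 W idle and 35 W TDP figures are made up for illustration, not measured values:

```python
# Back-of-the-envelope power model: assume CPU package power scales linearly
# from an idle floor up to TDP with utilization. The 8 W idle and 35 W TDP
# figures are assumptions for illustration only, not measured values.
def cpu_power_watts(utilization: float, idle_w: float = 8.0, tdp_w: float = 35.0) -> float:
    """Crude linear estimate of CPU power draw at a given utilization (0..1)."""
    return idle_w + utilization * (tdp_w - idle_w)

software_decode = cpu_power_watts(0.70)  # decoding entirely on the CPU
hw_assisted = cpu_power_watts(0.30)      # GPU offloads most of the decode work
watts_saved = software_decode - hw_assisted
print(watts_saved)
```

Under these assumed numbers, dropping utilization from 70% to 30% saves on the order of 10 W, which is significant on a laptop battery even before counting the reduced heat.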
TheGreatGrapeApe Notebook Evangelist
And the HD2600/3650 and GF8600/GF9500 are evenly matched; their performance gap is far smaller than between the high-end desktop parts. You won't be missing out on much of anything; if you had chosen a GF8600GS you would be worse off on all counts, IMO. -
I've been looking at notebooks with both brands of graphics cards. Some of the people here have seen my thread in the appropriate subforum, but for those that haven't, I am currently favoring an HP 8150 or ThinkPad T61p. The problem is each has its own pros and cons, even if they are incredibly similar. I'd like the extra performance the GF8600GT (or its workstation counterpart) offers over the Radeon 2600 in games. It's not massive, but it's there. Thing is, everyday non-gaming usage, which I am more likely doing, seems to favor ATI's cards in battery life and now video decoding.
The obvious answer is to wait for the Radeon 3600, which basically equals or outperforms the 8600GT, and be done with it. I of course run the risk of not being able to find a computer with Windows XP (I already have a Vista license if I really wanted it, which I don't), or even one that will offer drivers if I buy my own copy of XP. I'm stuck in a fascinating game of wait and see. -
Second, if you also read up more on both ATI's and Nvidia's acceleration, you'll find they only work in certain programs: PowerDVD, sometimes Windows Media Player, etc.
I'll put it like this: assume that Cloverfield on Blu-ray runs in PowerDVD with no acceleration at 75% CPU usage. Now assume that with hardware acceleration that figure drops to 50%. That same title, if played in Media Player Classic with the CoreAVC software decoder, will run at 30-40% CPU usage. Of course I pulled those figures out of thin air, but it illustrates the current failure of hardware decoding.
The point I'm trying to make is that even the best hardware decoding (in its current implementation) can't overcome poor codecs; a good software codec right now will always yield lower CPU utilization. -
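Taking the admittedly made-up figures above at face value, the two approaches can be compared with quick arithmetic on how much load each shaves off the unaccelerated baseline:

```python
# Quick arithmetic on the hypothetical CPU-usage figures from the post above
# (75% baseline, 50% with hardware assist, 35% with a software decoder).
def load_reduction(baseline_pct: float, accelerated_pct: float) -> float:
    """Fraction of CPU load removed relative to the unaccelerated baseline."""
    return (baseline_pct - accelerated_pct) / baseline_pct

hw_assist = load_reduction(75, 50)   # PowerDVD with hardware acceleration: ~1/3 less load
sw_decoder = load_reduction(75, 35)  # Media Player Classic + CoreAVC: ~53% less load
print(hw_assist, sw_decoder)
```

On these assumed numbers, the software decoder removes a larger fraction of the load than the hardware assist, which is the comparison the argument rests on.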
TheGreatGrapeApe Notebook Evangelist
Second, VC-1 doesn't really require much CPU acceleration; it's H.264 that throttles CPUs, especially if it's an encrypted disc and you're doing audio decoding at the same time.
Anandtech's review shows that issue on a few titles; notice the H.264 decoding is the one that needs help the most:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3258&p=1
You show me a commercial disc comparison between the two solutions that supports your theories; and not ripped content, but a standard disc, as that's what most people will be using when discussing this, not ripping 30-50GB to disk first.
The OP was pretty specific about his questions, and you steered him into a CoreAVC commercial. -
Ask anybody who knows anything about HD decoding and they'll tell you that Media Player Classic with CoreAVC is the lowest CPU utilization, period, with a possible nod to the free cross-platform MPlayer. CoreAVC advertisement? I think not; they simply have the best software decoder on the market. ATI's and Nvidia's hardware decoding is incompatible with those players. Even Media Player Classic with ffdshow decodes it more efficiently than Cyberlink/WMP/etc.
I steered him away from his question because it's irrelevant. That's the part you can't seem to get. Hardware decoding is entirely irrelevant because it's still slower than every good media player's software decoding, and as I said earlier, the two are mutually exclusive. -
TheGreatGrapeApe Notebook Evangelist
No evidence, even from somewhere like Phoronix, which uses CoreAVC often, means you have nothing to support your claim other than your guess. You still haven't explained, even in theory, how the hardware acceleration 'runs slower', so without any evidence this is just your guess. And your myths about the influence of money are funny, as if the makers of CoreAVC aren't in it for the money either.
Surprisingly enough, on their purchase page they mention GPU support to come later, so even THEY see the benefit of hardware acceleration, even if you, as one of their cheerleaders, don't.
http://www.behardware.com/news/8117/coreavc-stronger-than-avivo-purevideo.html
So that review shows CoreAVC losing out to the GF7 with Cyberlink (and mentions the ATI cards not having support in that build), so unless you have something newer, I'd say CoreAVC still comes in behind hardware acceleration, until they finally deliver on that promised GPU support.
Laptop Video Cards: "Hardware" Decoding of Video Formats
Discussion in 'Gaming (Software and Graphics Cards)' started by LaptopGun, Apr 17, 2008.