The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Laptop Video Cards: "Hardware" Decoding of Video Formats

    Discussion in 'Gaming (Software and Graphics Cards)' started by LaptopGun, Apr 17, 2008.

  1. LaptopGun

    LaptopGun Notebook Evangelist

    Reputations:
    34
    Messages:
    362
    Likes Received:
    0
    Trophy Points:
    30
    In my quest for a new laptop, I've looked at graphics cards from both ATI and Nvidia (HD 2600/3600 vs GeForce 8600GT/9500GS). Both companies like to trot out various fancy tech specs about their cards, but one such spec really has me scratching my head (well, that, and why some OEMs get away with sticking in DDR2 and claiming it's DDR3, or conflating TurboCache numbers with discrete memory): hardware decoding of various video formats. It seems ATI covers a lot more formats, and a lot more obscure ones, than Nvidia. Is it that Nvidia just lets the CPU do more, or do they not report all the formats that the card does handle, or do all cards decode video on the card (and ATI just advertises it more)? Do ATI's Avivo and Nvidia's Pure Video basically function the same (or does one actually force the CPU to do more work than the other)?

    I kinda would have thought the notebook would let the video card decode all video to begin with. I understand encoding or transcoding would be handled by the CPU.

    2000 Series
    3000 Series
    The only difference I can see between the 2000 and 3000 series is that the 3000 has DisplayPort and integrated DTS decoding (and that's audio). Then again, I'm not an utter hound for numbers.

    Now for Nvidia's Pure Video. They don't have the 9-series cards up on the site, but this is their page about PV.
     
  2. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Discrete graphics cards primarily handle 3D processing; video decoding has traditionally been done by your CPU. Keep in mind that Pure Video HD only works in certain applications.

    I think the general opinion on these HD hardware decoders is that a good software decoder (CoreAVC Professional) in Media Player Classic performs better than what you get from hardware decoding in PowerDVD or other supported software players.
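    For anyone who wants to test this claim rather than argue from spec sheets, one rough approach is ffmpeg's benchmark mode, decoding to a null sink with and without a hardware-acceleration flag. This is only a sketch: the file name is a placeholder, and `dxva2` availability depends on your OS and how your ffmpeg was built.

    ```python
    # Sketch: build ffmpeg commands that time pure decode cost, with and
    # without DXVA2 hardware assistance. "clip.mkv" is a placeholder input;
    # the dxva2 hwaccel is Windows-only and assumes an ffmpeg build with
    # DXVA2 support compiled in.
    def decode_benchmark_cmd(infile, hwaccel=None):
        cmd = ["ffmpeg", "-benchmark"]
        if hwaccel:
            # -hwaccel must come before -i to apply to the input decoder
            cmd += ["-hwaccel", hwaccel]
        # decode every frame and discard it, isolating decoder cost
        cmd += ["-i", infile, "-f", "null", "-"]
        return cmd

    software = decode_benchmark_cmd("clip.mkv")
    hardware = decode_benchmark_cmd("clip.mkv", hwaccel="dxva2")
    ```

    Running both commands and comparing the reported CPU time (or just watching Task Manager during playback) would show how much work the GPU actually takes off the CPU for a given clip.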
     
  3. Johnny T

    Johnny T Notebook Nobel Laureate

    Reputations:
    6,092
    Messages:
    12,975
    Likes Received:
    201
    Trophy Points:
    481
    You don't get DDR3 memory in graphics cards right now anyway...

    GDDR3 is a die-shrink of DDR2, I think; that's what's getting people confused. GDDR3 runs cooler, thus allowing a higher clock speed! :)
     
  4. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
  5. LaptopGun

    LaptopGun Notebook Evangelist

    Reputations:
    34
    Messages:
    362
    Likes Received:
    0
    Trophy Points:
    30
    Thanks. Interesting reads from everyone. So ATI apparently does have a bit of an advantage. My new laptop really does need to do everything I throw at it well, and I'd be willing to sacrifice a little gaming performance for a big increase in HD content processing.
     
  6. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    I don't think there will be that great a difference. Though in ATI's favor, with their hardware you can run Folding@Home and process more data faster than a PS3. Nvidia can't do that (yet).

    Just make sure you are getting a decent dual core processor before worrying about GPU. Quite a bit of work is still done on the CPU, and you can't guarantee all video applications will work with your hardware acceleration.
     
  7. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    ...I think you ignored my post.

    Hardware HD processing runs slower than a quality software processor.
     
  8. LaptopGun

    LaptopGun Notebook Evangelist

    Reputations:
    34
    Messages:
    362
    Likes Received:
    0
    Trophy Points:
    30
    I'd configure whatever laptop I wanted with a T8300 or 9300, so I think I'll have enough CPU power :)

    Thanks for the heads up on that. I'll keep that in mind. So it sounds like I want a quality software decoder for the programs that support it (but do not support hardware), combined with an ATI graphics card for the programs that support it (but not the software ones). That seems to hedge my bets.
     
  9. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    truth is, if you get a new laptop with a decent core 2 duo processor, it is going to do everything you throw at it well. i would get the nvidia gpu on the basis that it is faster in games (if you plan on gaming).

    i think about it this way. any good processor can handle any video format. the purevideo / ativideo stuff is really only to get richer color and do some advanced image processing on the frames before you see them to increase the quality. so if you start from a dvd and you are using the proper application, you can get really excellent quality out of the dvd. more than you would expect from a dvd.

    but if you are watching videos that are transcoded from dvd, then you are already losing image quality, and even a good quality rip / transcode from dvd matched with the image processing software isn't going to match up to a dvd straight from source.

    imo, you should pick a camp. either be a quality nerd and watch high quality source media and match it with the image processing software, or watch the transcoded versions and your processor will be able to more than handle it.
     
  10. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Yep.

    As long as it does it properly, I don't see how it could be slower unless it's dropping frames. You can add features and methods in software to both, especially as VPUs and their support software improve.

    Lower CPU utilization is a goal for power consumption and heat concerns for many laptop users, as well as for fluidity of playback on the lower-end systems often found in laptops.
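    The battery-life side of that argument is simple arithmetic. A sketch with purely illustrative numbers (the wattages and the linear CPU-load-to-power model are assumptions for the sake of the example, not measurements):

    ```python
    # Illustrative only: assumed power draws, not measured values.
    def playback_hours(battery_wh, base_w, cpu_w_per_pct, cpu_pct):
        """Estimate runtime from battery capacity and a (simplified)
        linear CPU-utilization -> power-draw model."""
        draw_w = base_w + cpu_w_per_pct * cpu_pct
        return battery_wh / draw_w

    # 55 Wh battery, 12 W platform base draw, 0.15 W per % of CPU load
    sw_decode = playback_hours(55, 12, 0.15, 75)  # CPU-only decode
    hw_decode = playback_hours(55, 12, 0.15, 30)  # GPU-assisted decode
    ```

    Under these assumptions the hardware-assisted case runs roughly an hour longer on a charge, which is the whole point of offloading decode on a laptop.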

    Sure, get high speed multi-core setup and it'll do the job just as well, but that ignores the benefits of hardware assistance.

    An Aston Martin Vantage and a Honda Civic Hybrid can both go up to the same speed limits in the US (55/65/75, etc.), but the Hybrid will do it more efficiently, which is the goal here.

    Your statement is equivalent to saying GPUs aren't needed because CPUs can render the graphics in software.
     
  11. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    That's the best way to approach it, because there are benefits to both methods, with the obvious ones of hardware assistance being battery life, heat, and a guaranteed minimal level of performance, upon which anything more is a nice added benefit.

    And the HD2600/3650 and GF8600/GF9500 are evenly matched; their performance is far closer than with the high-end desktop parts, so you won't be missing out on much of anything. If you were to have chosen a GF8600GS you would be worse off on all counts, IMO.
     
  12. LaptopGun

    LaptopGun Notebook Evangelist

    Reputations:
    34
    Messages:
    362
    Likes Received:
    0
    Trophy Points:
    30
    I've been looking at notebooks with both brands of graphics cards. Some of the people here have seen my thread in the appropriate subforum, but for those that haven't, I am currently favoring an HP 8150 or ThinkPad T61p. Problem is, each has its own pros and cons, even if they are incredibly similar. I'd like the extra performance the GF8600GT (or its workstation counterpart) offers over the Radeon 2600 in games. It's not massive, but it's there. Thing is, everyday non-gaming usage, which I am more likely doing, seems to favor ATI's cards in battery life and now video decoding.

    The obvious answer is to wait for the Radeon 3600, which basically equals or outperforms the 8600GT, and be done with it. I of course run the risk of not being able to find a computer with Windows XP (I already have a Vista license if I really wanted it, which I don't), or of drivers not even being offered if I buy my own copy of XP. I'm stuck in a fascinating game of wait and see.
     
  13. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    If you read up on HD acceleration, at least Pure Video HD, it only supports H.264 acceleration, which covers only ~34% of all Blu-ray / HD DVD titles. Most are VC-1 or other formats. So at best, you'll only get acceleration on 34% of what you watch.

    Second, if you also read up more on both ATI's and Nvidia's acceleration, you'll find they only work in certain programs: PowerDVD, sometimes Windows Media Player, etc.

    I'll put it like this: assume that the Cloverfield Blu-ray runs in PowerDVD with no acceleration at 75% CPU usage. Now assume that with hardware acceleration that figure drops to 50%. That same title, played in Media Player Classic with the CoreAVC software decoder, will run at 30-40% CPU usage. Of course I pulled those figures out of thin air, but it's to illustrate the current failure of hardware decoding.

    The point I'm trying to make is that even the best hardware decoding (in its current implementation) can't overcome poor codecs; a good software codec right now will always yield lower CPU utilization.
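    Taking the admittedly invented figures above at face value, the comparison being drawn is just this arithmetic:

    ```python
    # The post's illustrative (explicitly made-up) CPU-usage figures.
    powerdvd_sw = 75   # PowerDVD, no hardware acceleration
    powerdvd_hw = 50   # PowerDVD with hardware acceleration
    mpc_coreavc = 35   # MPC + CoreAVC software decoder (midpoint of 30-40%)

    hw_saving = powerdvd_sw - powerdvd_hw     # points saved by the GPU
    codec_saving = powerdvd_sw - mpc_coreavc  # points saved by a better codec
    ```

    Under these numbers the decoder choice saves more CPU than the hardware assist does, which is the claim at issue; whether real measurements bear that out is exactly what the rest of the thread disputes.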
     
  14. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    And ATI's HD series accelerates VC-1, so what's your point, other than the one the OP already pointed out about there being a difference?

    Second, VC-1 doesn't really require much CPU acceleration; it's H.264 that throttles CPUs, especially if it's an encrypted disc and you're doing audio decoding at the same time.

    Anandtech's review shows that issue on a few titles; notice the H.264 decoding is the one that needs help the most:
    http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3258&p=1

    Handy then that those programs ship with pretty much every laptop out there, eh?

    Don't bother pulling numbers out of the air; pull them from a respected website to back up your claims. Otherwise, I pull the number 0 as the level of relevance of what is primarily a Linux solution, because they are stuck with software decoding, not being given full access to hardware acceleration yet. If the OP is only going to use Linux maybe he'd care, but he's still never going to achieve the same functionality as hardware acceleration and post-processing.

    Lower than baseline, but not necessarily lower than hardware, and since the acceleration hardware built into these VPUs doesn't keep them from using software solutions as well, your point is moot, and your original statements aren't supported by anything you've provided so far.

    You show me a commercial disc comparison between the two solutions that supports your theories; and not ripped content, but with a standard disc, as that's what most people will be using when discussing this, not ripping 30-50GB to disk first.

    The op was pretty specific about his questions and you steered him into a CoreAVC commercial. :confused:
     
  15. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I don't have test figures because you'll notice none exist. Why? Because it hurts their sales figures when suckers (such as the OP and you) believe that hardware HD decoding is actually useful. Also, review websites never actually use the software that really matters; I've yet to see anybody (Anandtech, Tom's Hardware, etc.) run tests with MPC, MPlayer, ffdshow, AC3Filter, etc., despite the fact that they are widely regarded as the best media playback software. I already told you that the hardware decoding only works in crappy software, so if you expect Media Player Classic or VLC to work with your fancy video card, it won't. It's all about the money; they have exclusivity deals with certain software manufacturers.

    Ask anybody that knows anything about HD decoding and they'll tell you that Media Player Classic with CoreAVC has the lowest CPU utilization, period, with a possible nod to the free cross-platform MPlayer. CoreAVC advertisement? I think not; they simply have the best software decoder on the market. ATI's and Nvidia's hardware decoding is incompatible with those players. Even Media Player Classic with ffdshow decodes more efficiently than CyberLink/WMP/etc.

    I steered him away from his question because it's irrelevant. That's the part you can't seem to get. Hardware decoding is entirely irrelevant because it's still slower than every good media player's software decoding, and as I said earlier, the two are mutually exclusive.
     
  16. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Sure thing bud.
    No evidence, even from somewhere like Phoronix, which uses CoreAVC often, means you have nothing to support your claim other than your guess. You still haven't explained, even in theory, how the hardware acceleration 'runs slower', so without any evidence this is just your guess. And your myths about the influence of money are funny, as if the makers of CoreAVC aren't in it for the money either.

    Surprisingly enough on their purchase page they mention GPU support to come later, so even THEY see the benefit of hardware acceleration, even if you as one of their cheerleaders don't. :p

    Actually, you can't play Blu-ray or HD DVD titles with those players without ripping them first, so once again the utility of software-only players is limited. For ripped content, like I said, it's less of an issue.

    Nah, I asked them and they all said you're wrong. Just like your other errors above.

    However, you miss the point; no one is asking whether PowerDVD > CA > Nero, etc., that's your own made-up discussion. The question is about the benefits of hardware acceleration, which do exist, and even the creators of your favourite product acknowledge that.

    Really? Interesting; it seems they say it's about to be added as a feature, so I guess it's not THAT incompatible. But then again, maybe their software is incompatible, since they've been promising VPU acceleration for two years now, ever since they first started getting beaten by nVidia's old GF7s in the early age of VPU-based H.264 acceleration (no one pretends the GF7 or X1K cards are anywhere near the level of the new workhorse UVD and PureVideo hardware):
    http://www.behardware.com/news/8117/coreavc-stronger-than-avivo-purevideo.html

    So that review shows CoreAVC losing out to the GF7 with CyberLink (and mentions the ATI cards not having support in that build), so unless you have something newer, I'd say CoreAVC still comes in behind hardware acceleration, until they finally deliver on that promised GPU support. :D

    I doubt he agrees, but then again it's probably that same group of yours that unanimously decided it was so. :rolleyes:

    No, what you seem to miss is that for the OP's question your detour is irrelevant, since it doesn't improve playback, and it cannot decode protected titles, so it limits him to ripped content or else the few unencrypted titles out there. No one cares about your software crusade over which is better: CyberLink, Nero, KMP, or CA; that wasn't the focus of the question, but it seems to be the only thing you can focus on. Once CA adds GPU acceleration, then what happens to your position? Does it suddenly change because now they do what the others do too, and then hardware acceleration matters?