The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    image quality on external display

    Discussion in 'Apple and Mac OS X' started by athenaesword, Jun 14, 2008.

  1. athenaesword

    athenaesword Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    hi guys,

    was wondering, if i hooked up the MacBook to a 50" HDTV, would it have inferior image quality compared to another notebook with a dedicated graphics card? ATI cards have been said to have superior image quality over Nvidia cards, so how does either compare to the integrated graphics that the MacBooks have?

    also, i'll obviously be running it out at the max resolution of 1920 x 1200, so will the MacBook be able to handle HD formats at over 10K bitrate at that res?
     
  2. Budding

    Budding Notebook Virtuoso

    Reputations:
    1,686
    Messages:
    3,982
    Likes Received:
    0
    Trophy Points:
    105
    Image quality mostly depends on the quality of the cable you use and the panel you connect to. Having a faster GPU does not affect the sharpness or colour of the displayed image, unless incorrectly calibrated.

    Unless you are going to output 3D rendered content on your external, your Mac will have no problems displaying content on your TV.
     
  3. SauronMOS

    SauronMOS Notebook Evangelist

    Reputations:
    173
    Messages:
    436
    Likes Received:
    0
    Trophy Points:
    30
    Keep in mind that OS X doesn't take full advantage of GPU features for video playback. Not that the MacBook has any for video.

    So decoding will be done entirely by the CPU. You'll end up with all of the neat compression artifacting and other stuff that you wouldn't see if the MacBooks had dedicated GPUs and took advantage of them like Windows does.

    Also, nVidia has held the image quality crown for more than half a decade now. They're the preferred cards among HTPC enthusiasts, and their drivers always have been, and most likely always will be, better than AMD/ATI's.
     
  4. athenaesword

    athenaesword Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    I was under the impression that even with Windows, unless you use PureVideo, the graphics card really doesn't do anything 99% of the time, if at all. Most of the decoding is done by the CPU, with little to no hardware acceleration from the GPU. So how is this any different between the Mac and a Windows PC? I guess i should just ask in practical terms: will I be able to run a 15Mbit H.264 encode on a MacBook connected to a 50" HDTV smoothly?

    I've got another question regarding resolution and CPU load. Will using a higher res like 1920 x 1200 on an external display require more CPU processing power while decoding an H.264 file, compared to the native display on the MacBook itself?

    @nvidia cards having superior image quality. that is somewhat different from what i've generally been hearing from people on forums regarding the "nvidia vs ati image" contests. i will not dispute this though, as I personally don't have much experience with ATI (been using nvidia cards for the last 5 years.)
     
  5. Fant

    Fant Notebook Evangelist

    Reputations:
    9
    Messages:
    399
    Likes Received:
    8
    Trophy Points:
    31
    Whether you are using an internal or external display should not affect performance. Your computer still has to decode a 1080p video file. In fact I'd say that if you run a resolution with the same horizontal resolution as the video (so that no scaling needs to occur), it would probably be faster than running at a lower res, which requires the CPU to rescale the video to fit.

    i.e.: running a 1920x1080 video at a 1920x1200 display resolution should result in no rescaling, and so the least amount of CPU work.
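    The no-rescale rule above can be sketched as a quick check (a toy helper, not anything a real player exposes; the actual cost of scaling depends on the software):

    ```python
    def needs_rescale(video_w, video_h, display_w, display_h):
        """Return True if the video must be resized to fit the display.

        A video whose width matches the display width can be shown
        pixel-for-pixel (letterboxed vertically), so no rescale is needed.
        """
        if video_w == display_w and video_h <= display_h:
            return False
        return True

    # 1920x1080 video on a 1920x1200 panel: fits pixel-for-pixel
    print(needs_rescale(1920, 1080, 1920, 1200))   # False

    # Same video on a 1280x800 panel: must be downscaled
    print(needs_rescale(1920, 1080, 1280, 800))    # True
    ```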
     
  6. Budding

    Budding Notebook Virtuoso

    Reputations:
    1,686
    Messages:
    3,982
    Likes Received:
    0
    Trophy Points:
    105
    To put it simply: you will have no problems running HD-quality MPEG-4 encoded videos on an external HD display using a MacBook. If you start rendering in 3D, however, there will be problems.

    Like I stated before, the difference in quality when displaying static images or movies will depend much more on your video cable and display than on your GPU. If you display objects rendered in 3D in real time using the latest rendering technologies, such as recently released computer games, then, assuming your GPU is fast enough, ATI will currently win slightly due to its support for the additional features in DX10.1.
     
  7. SauronMOS

    SauronMOS Notebook Evangelist

    Reputations:
    173
    Messages:
    436
    Likes Received:
    0
    Trophy Points:
    30
    DXVA and all of that are system wide in Windows. Any software can be written to take advantage of any available GPU video features.

    Windows' built-in decoders take advantage of it, as do ffdshow, VLC, and all the rest I can't think of right now.

    It makes a huge difference. I mean, I played some 720p H.264 videos on my Mac and it's pushing around 60% of 1 core (2.16GHz C2D). On my HP with a GeForce 8400M GS the CPU use spikes at 10% and hovers around 5%. It looks a lot better too.

    Same goes for DVDs. Hovers around 20% in OS X. In Windows CPU use spikes at about 5% and the video quality is much better.

    Well, everyone here is going to tell you that it will be able to play it. But that's not exactly the case. You'll have to try it and see for yourself. Decoding high bit-rate, high resolution H.264 isn't exactly an easy task. That's why you see modern dedicated GPUs as a requirement for WinDVD and PowerDVD to play back Blu-ray and HD DVD.

    I'm playing some music videos I bought from iTunes and the CPU use is hovering around 20-25% of 1 core. So yeah.

    When you get into the higher bit-rate and higher resolution arenas.. the efficiency of the software decoder becomes an issue. Some software might play it smoothly with high CPU use, others might not. All but one piece of video playback software for the Mac is single threaded, so it doesn't matter if you have two cores running at 2.4GHz.

    If it can play it (and it might), the question becomes: do you really want to play video with the CPU ramped up that high, the fans blaring, and the MacBook (MacBooks and MBPs are notorious for their heat) getting that hot?

    I wouldn't use the QuickTime trailers as a judge of your system's ability to play 1080p video either. Even if they are "1080p", the bit-rate is usually not much higher than 720p and the resolution isn't always a true 1920x1080. I just played the 1080p Get Smart trailer at quicktime.com and it used up most of 1 core, and my fans are now running at 6200RPM.

    If you already bought the MacBook, all you can do is try and see. If you haven't gotten it yet and one of your priorities is video playback, it's definitely better for you to look elsewhere and find a Windows system with a dedicated GPU.

    Well, OS X doesn't upscale video. It simply "stretches" it and makes the pixels bigger. So it shouldn't use more processor time. But you're already going to be running the fans at full speed anyway.

    The image quality thing is something the ATI fanboys made up years ago when ATI was the underdog compared to nVidia. Somehow it got accepted as "fact", much the same way people try to say the iPod is the worst sounding MP3 player. Somehow that got accepted as "fact" and now people think the iPod sounds bad. It's silly, really.

    Plus ATI's drivers have always been a disaster. I've tried several ATI cards over the years and every time something stupid with the drivers caused them to not work right in one way or another.

    There's also plenty of benchmarks and reviews online where you can see actual comparisons of what the two are capable of. nVidia always looks better when the latest GPUs are compared.

    The only place where ATI looks better than nVidia is with the Xbox 360. But that's because nVidia pretty much slapped 2 GeForce FX 5200s together, called it the "RSX", and gave it to Sony for the PS3.

    I'd stay a mile away from ATI if possible. Though I will say Gateway's $999 system with the C2D 2.4GHz, 160GB HDD, 3GB of RAM, and ATI HD2600 512MB is a VERY good deal. That Acer that newegg has for $1100 (same as the entry level MacBook!) has a 15.4" screen, 3GB of RAM, 2.1GHz Penryn C2D, DVD writer, 250GB HDD, 512MB GeForce 8600M GT.

    Talk about making me depressed about buying a Mac
     
  8. athenaesword

    athenaesword Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    thanks for clearing that up Budding ;)

    This probably sounds silly, but I'd like to clarify. Would a 1080p video look different on a 1280x1xxx resolution compared to a 1920x1200 res, given the same monitor size?

    i see. thanks!
     
  9. athenaesword

    athenaesword Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    i didn't realise Macs were so inferior at HD playback compared to Windows. I would think it's the lack of proper decoders that are compatible with the Mac, rather than utilisation of the GPU, just like how CoreAVC works much better on slower systems compared to FFDshow.

    you mentioned that all but one decoder is single threaded. why aren't you using the one that utilises both cores? have you tried booting up in XP and running the same HD files using FFDshow, and then comparing the CPU load and image quality to OS X? would appreciate it if you had the time to try it out and report the results! ;)

    If i were using the macbook merely as a processor to decode my HD files for my HDTV, then the fans shouldn't be a problem since i won't be sitting right next to it anyway, and there'll be speakers to cover up the ambient fan noise.

    may i know what bitrate 720p H.264 encode you were using to test and get 60% CPU load? that's really unreasonably high, even for OS X. My friend has a 2.4GHz MacBook. I passed him some 1080p Blu-ray encodes and he played them just fine using VLC. if you're getting 60% from a 720p file, then 1080p would definitely lag, assuming a 10k+ bitrate. it didn't happen to him.

    i agree with what you said about the iPods. I myself own a pair of UE Triple.Fi 10 Pros, which i use on an iriver clix2, so I know what sounds good and what doesn't. I've tried the same earphones on iPod touches multiple times and I'm not hearing any of that "dreadful, terrible sound" that everyone keeps ranting about. granted, it's not fantastic, but it sure as hell ain't ghastly either. Picture quality depends a lot on the decoder, however, which is why I asked if you were able to run the same files on the same MacBook using FFDshow in Media Player Classic and see if you get the same poor quality. If it improves, then it's more of a software, rather than hardware, limitation, and thus can be fixed. or i can just watch using XP on the MacBook.
     
  10. Budding

    Budding Notebook Virtuoso

    Reputations:
    1,686
    Messages:
    3,982
    Likes Received:
    0
    Trophy Points:
    105
    You will definitely have no problem when playing back HD content using VLC player on the Mac. QuickTime and DVD Player have rather poor decoders and are therefore significantly more CPU intensive than VLC.
     
  11. SauronMOS

    SauronMOS Notebook Evangelist

    Reputations:
    173
    Messages:
    436
    Likes Received:
    0
    Trophy Points:
    30
    CoreAVC is more of a joke than a legitimate alternative. Their copy protection solution and the way they treat their customers... I'd rather have no H.264 playback than use their products.

    Because I have a Windows PC with a dedicated GPU for all of my HD needs ;) Quite honestly, I only use my MacBook for a few things these days. Browsing the web, email, and syncing my iPods and iPhone. I think to myself "wow this is the perfect email and browsing machine" then I think "why did I spend $1400 on a computer to just do that?" heh.

    I don't have XP installed in Boot Camp any more. I did when the Mac was my only system. But now that I have the HP with a dedicated GPU, I don't use Windows on my Mac any more.

    However, with a MacBook, the image quality would be dependent on the software decoders being used, not the hardware. The MacBook's integrated GPUs have zero hardware support for any video codec that isn't MPEG-1 or MPEG-2. The MacBooks with the GMA 950 have support for HWMC (hardware motion compensation), and the X3100 throws iDCT support into the mix. All that really does is take some load off the CPU. You can boost the image quality with DVDs by using PowerDVD or WinDVD to adjust the color, sharpness, and other settings in software.

    A MacBook Pro or iMac (or $800 PC with a dedicated GPU) would be different because of the video hardware. Throw Vista on either one of those and you get DXVA 2.0 and a lot of neat hardware features like deblocking, deinterlacing, etc.

    Yeah, but the processors are going to get hot. Trust me, the MacBook (and MBP) get ridiculously hot under load. Do you really want your system to be running hot for hours at a time? That can't be good for the case (there are lots of posts at various forums regarding the MacBooks cracking along the bottom by the CPU and along the back exhaust port, and my first MacBook's hinge started to yellow from the heat).

    But were they REAL Blu-ray files? The full 45Mbps+ H.264 video you get on modern Blu-ray discs? Or were they downloaded .mkv files? Because those are usually at about 1/4 or less of the bit-rate of what you get with Blu-ray.
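    The bit-rate gap above is easy to make concrete with a little arithmetic (the 45 Mbps figure is from this post; ~10 Mbps is a rough stand-in for a downloaded 1080p encode, per the "10k+ bitrate" mentioned earlier in the thread):

    ```python
    def size_gb_per_hour(bitrate_mbps):
        """File size in GB for one hour of video at a given bit-rate (Mbit/s)."""
        bits = bitrate_mbps * 1_000_000 * 3600   # total bits in one hour
        return bits / 8 / 1_000_000_000          # bits -> bytes -> GB

    bluray = size_gb_per_hour(45)   # full Blu-ray H.264
    mkv = size_gb_per_hour(10)      # typical downloaded 1080p encode

    print(f"Blu-ray: {bluray:.1f} GB/hour")   # Blu-ray: 20.2 GB/hour
    print(f"MKV rip: {mkv:.1f} GB/hour")      # MKV rip: 4.5 GB/hour
    ```

    At those rates the rip carries under a quarter of the data of the disc, which is why the two are not comparable as a decoding workload.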

    And, again, even if the MacBook CAN play it, the CPU use is going to be ridiculously high.

    The files I'm using are 720p files I encoded from TV shows I recorded with an ATSC TV tuner. 4Mbps. Basically the same as you get with the Apple TV's HD movie rentals.

    I just ran one through Quicktime on the Mac. CPU use did hover back and forth between 55 and 60% of one core. VLC stayed around 50%. However, the fans were on full blast at 6200 RPMs and my CPU was hovering at 155-157F. Case bottom was at 100F after a few minutes.

    Keep in mind this is the 2.16GHz MacBook with the GMA 950.

    By contrast, my HP with a 2GHz C2D (Santa Rosa chipset) and a GeForce 8400M GS pushed the video in VLC along happily at about 15% CPU.

    In PowerDVD 8 with PureVideo enabled, CPU use spiked at 5% of one core, hovered around 3%, and dropped to 1% on occasion. Not only that, but the picture was cleaner, the colors were more realistic, and it just all-around looked much better.

    You have to honestly ask yourself: do you want to pay so much for a Mac that is going to run ridiculously hot playing video content, and it won't even do it as well as a PC that costs less? A PC that will most likely have an HDMI output, instead of having to deal with a mini-DVI to DVI adapter, a DVI to HDMI adapter, an HDMI cable, and optical with a mini-TOSLink adapter?

    There's other things to think about too. If you don't already have a Blu-ray player, any Turion X2 or Core 2 Duo based notebook with a dedicated GPU will most likely have an HDCP certified HDMI output. So you can get that $200 external Lite-On Blu-ray drive, or an internal one with an external USB case. If you don't already have an upscaling DVD player, that same notebook will use the hardware to properly upscale your DVDs as well as deblock, hardware deinterlace, etc. There's also a Hauppauge USB TV tuner coming out that can record HD video over component inputs. It has a hardware H.264 encoder on it. You also get Windows Media Center (which comes with built-in DVD decoders that take full advantage of the GPU), which (I know the Apple fanboys will flame me for this) walks all over Front Row in terms of functionality and even UI.

    Seriously, head over to newegg. For the same price as the entry level MacBook you can get an Acer with a 2.1GHz Penryn C2D, 512MB GeForce 8600M GT, 3GB of RAM, 250GB HDD, DVD writer, HDMI output, etc.

    haha I actually don't think the iPod touch sounds as good as the other iPods ;) I've compared it against my 80GB 5.5G iPod and 8GB 3G nano and those two blow it away. It doesn't sound bad, but it doesn't sound as good as those two. I prefer the iPod sound though. Clean, detailed, doesn't favor any frequency over the other. And I like iTunes.

    Well, like I said, if you don't already have the MacBook, then don't buy it. Get something like that Acer with the GeForce 8600M GT in it. Or go over to Gateway and pick up their $999 15.4" system with the ATI HD 2600 in it.

    Vista with a dedicated GPU will give you much better image quality than OS X ever could (regardless of hardware). You'll get hardware features that clean up compression artifacting, deinterlace the video in hardware, HARDWARE based upscaling (as opposed to the software based stretching in OS X), hardware color correction, etc. You also have to only deal with 1 cable as opposed to multiple cables and multiple adapters like you would with the Mac.
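    The difference between plain pixel-stretching and interpolated upscaling described above can be sketched on a toy 1-D row of grayscale values (this is an illustration of the two approaches, not how any particular player implements them):

    ```python
    def stretch_2x(row):
        """Nearest-neighbour 'stretch': each pixel is simply duplicated."""
        return [p for p in row for _ in (0, 1)]

    def bilinear_2x(row):
        """Linear interpolation: new samples are averaged from neighbours."""
        out = []
        for i, p in enumerate(row):
            out.append(p)
            nxt = row[i + 1] if i + 1 < len(row) else p
            out.append((p + nxt) // 2)   # midpoint between neighbours
        return out

    row = [0, 100, 200]
    print(stretch_2x(row))    # [0, 0, 100, 100, 200, 200] - blocky steps
    print(bilinear_2x(row))   # [0, 50, 100, 150, 200, 200] - smoother ramp
    ```

    Duplicated pixels keep the hard steps between values (the "bigger pixels" look), while interpolation fills the gaps with intermediate values; hardware scalers do the latter, typically with fancier filters than this two-tap average.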

    You'll also be able to run the PC at a lower power state, resulting in a much cooler all around system.

    I mean, trust me, the MacBooks get ridiculously hot. The newer Macs run "cooler" but the CPU running at 170F instead of 180F doesn't really mean much. Especially when the casing still gets just as hot and is still prone to cracking and discoloration as a result of heat.

    Seems video playback is your main concern and, honestly, for anyone who appreciates quality video playback, Macs are simply not an option. Trust me. This is coming from someone who had more than 250 movies rented in a year from Netflix back in the day (when they allowed you to log in and report movies being shipped back and get new ones shipped out immediately).