The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Intel Media Accelerator vs low-end ATI/Nvidia

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Tywin, Dec 14, 2008.

  1. Tywin

    Tywin Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    I notice that Intel Media Accelerators are coming with HDMI outputs on them. That's nice, but in my experience with desktops, the picture quality from Intel integrated video is atrocious just looking at the Windows desktop. Can anyone vouch for the picture quality of Intel integrated video? I may play DVDs or Blu-rays on a notebook, but I have absolutely zero interest in gaming. The most expensive video accelerator I would consider is an ATI 3400 or an Nvidia 9300.
     
  2. mr__bean

    mr__bean Notebook Evangelist

    Reputations:
    36
    Messages:
    449
    Likes Received:
    0
    Trophy Points:
    30
    I have never noticed any picture quality difference between an Intel, Nvidia or ATI card.

    Please explain what you mean by worse picture quality.
     
  3. DetlevCM

    DetlevCM Notebook Nobel Laureate

    Reputations:
    4,843
    Messages:
    8,389
    Likes Received:
    1
    Trophy Points:
    205
    Same here - picture looks the same. Intel X3100 vs NVidia 8400GS

    (Except in games - say Age of Empires 3)

    I think you mean true HD, is that possible? Resolutions like 1600*900 or 1920*1080.

    In that case a normal desktop should still look the same, as it's not really one of the most challenging tasks in terms of graphics.
     
  4. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    You can't really tell through VGA out. You'd need HDMI or DVI.
     
  5. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    Intel's newest integrated solution, the HD4500, is capable of fully rendering and processing HD content with the same quality as a dedicated video card.

    Dedicated cards are mostly good for gaming, but their performance in everything else is more or less similar to an integrated card's.
     
  6. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    What the OP is talking about is that the signal quality of dedicated cards is usually higher than integrated cards.

    Generally speaking it's true. But there can be differences between individual laptops as well.
     
  7. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    Ah OK, well the output from the actual HDMI might be worse on an IGP, but I think some manufacturers actually optimize their integrated GPUs for that purpose, so I think you're right that it depends.
     
  8. Hancock

    Hancock Notebook Enthusiast

    Reputations:
    2
    Messages:
    41
    Likes Received:
    0
    Trophy Points:
    0
    HDMI is digital. Signal quality does NOT vary like it does with VGA output since, again, it's digital.

    Amazing that people here actually thought HDMI is analog. It's almost mind-bending...
     
  9. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    It's not about analog versus digital. The color management on dedicated cards is often better than on integrated cards. In my opinion the image quality coming out of Nvidia HDMI is a lot better than the image quality of Intel IGP HDMI.

    Don't think anyone here does.
     
  10. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    From my experience with the laptops in my sig:
    The X4500 is definitely a much more powerful accelerator than the 7150Go, but its image quality on fonts and the like is poor. I don't know if it's just a driver issue, or if the G50 comes with a cheaper screen, or what - they're both Brightview 15.4" widescreens, but the X4500 definitely doesn't look very good on web page text or Explorer window text, and its gamma is out of whack too. The gamma can be fixed, but I don't know what the deal is with the fonts displaying poorly. The X4500 definitely turned a new page in performance for Intel's integrated offerings, though.
     
  11. Slaughterhouse

    Slaughterhouse Knock 'em out!

    Reputations:
    677
    Messages:
    2,307
    Likes Received:
    2
    Trophy Points:
    56
    I agree, the X4500 just doesn't look as good as a dedicated Nvidia GPU, but I have a feeling it is mostly a driver issue, because it is certainly powerful enough to have decent picture quality.
     
  12. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Signal quality isn't determined so much by the chip itself as by the filters on the motherboard and the components around it. The same IGP on a different motherboard model will sometimes vary a lot, because some motherboards do not have good filters.

    The same goes for HDMI. It is digital, but digital isn't 100% immune to image quality degradation.
     
  13. Hancock

    Hancock Notebook Enthusiast

    Reputations:
    2
    Messages:
    41
    Likes Received:
    0
    Trophy Points:
    0
    Yes, digital is 100% immune to image quality degradation on the output of HDMI. The bits coming out are perfect 1s and 0s; there is ZERO information loss. I highly recommend reading about digital technology, because you seem to believe that digital is similar to analog. Even if the digital signal is slightly altered (curve on rise and fall times), there will still be ZERO information loss, meaning perfect information transfer and ZERO image quality loss. I highly recommend getting an education in digital circuit technology before posting more misinformation.
     
  14. hendra

    hendra Notebook Virtuoso

    Reputations:
    157
    Messages:
    2,020
    Likes Received:
    6
    Trophy Points:
    56
    If you want Blu-ray, it is best to stick to an Nvidia or ATI card with at least 256MB of dedicated VRAM - something like an ATI 3470 or Nvidia 8400 GT at the very minimum. Intel may be catching up, but to be safe you should stick to a dedicated 3D accelerator. And don't forget about the screen too. In my experience, Sony XBrite Hi-Color is the best for watching Blu-ray. The color reproduction and black level are just amazing.
     
  15. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106
    Gosh, we are not talking about signal degradation; he is talking about what is actually rendered on the screen, like color profiles, font smoothing or deblocking in HD media. If one GPU is worse than another at dealing with those things, you'll see a difference in picture quality... No need to get so defensive.
     
  16. Hancock

    Hancock Notebook Enthusiast

    Reputations:
    2
    Messages:
    41
    Likes Received:
    0
    Trophy Points:
    0
    Please reread his post. He is talking about the actual signal structure coming out on the pins of the HDMI port, which is why he is talking about components on the motherboard apart from the GPU. Yes, these components make a difference in the analog world, but in digital they are completely transparent.
     
  17. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    @ Hancock, please reread the first post of this thread and Ayle's post.

     
  18. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Image quality isn't always about whether there's noise or whatever. The rendering of the colors matters too.
     
  19. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    You can't say that digital is 100% immune to image degradation. In NORMAL PRACTICE it is, but if you ran an unshielded cable near a strong source of RF or other interference, I guarantee you'll see problems. If you have enough of a signal quality problem, your 1s and 0s can be unreadable or just interpreted wrong. Unless you're transferring the data with a protocol that has checksums, and the equipment can adaptively use those checksums to repair data problems, it can happen (and I'm pretty sure HDMI is not that type of protocol).

    Easy real-world examples:
    GSM cell phones use digital cellular, but call quality varies depending on signal quality. Digital satellite: in bad weather the signal degrades and you can get sync problems, picture problems etc. Cat5 Ethernet: dropped packets. DVDs/CDs frequently see read errors; you may never notice these problems, but they're there. These are all digital mediums that suffer degradation when the digital information is incorrectly read or transferred. Most audio and video applications where digital signals are used are not an "all or nothing" proposition; the signal will still be output even if the data is partially corrupted in the transfer.

    But ultimately, in the scenario the OP is talking about, I think the problem is more rooted in how the graphics processors handle rendering the raw data they're fed. Certainly I'm not advocating voodoo science like 100 dollar HDMI cables, but if you buy a really inferior HDMI cable that isn't properly shielded and operate it in an environment with above-average RF noise, you'll see an image quality loss.
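    The point above - that a link without retransmission or error correction passes corrupted bits straight through to the display - can be sketched with a toy simulation (hypothetical Python; the frame size and error rate are made up for illustration, and this is not how HDMI's TMDS encoding actually works at the wire level):

```python
import random

def transmit(bits, error_rate):
    """Simulate a noisy digital link: each bit may flip with probability error_rate."""
    return [b ^ 1 if random.random() < error_rate else b for b in bits]

random.seed(42)  # deterministic demo
frame = [random.randint(0, 1) for _ in range(10_000)]  # one "frame" of pixel bits

# Over a clean link every bit survives - the sense in which "digital is perfect" holds.
clean = transmit(frame, error_rate=0.0)
assert clean == frame

# Over a noisy link some bits flip. With no retransmission, the sink simply
# displays the corrupted values (the "sparkles" seen with bad cables).
noisy = transmit(frame, error_rate=0.001)
flipped = sum(a != b for a, b in zip(frame, noisy))
print(f"{flipped} of {len(frame)} bits arrived wrong and were displayed anyway")
```

    A protocol with checksums and retransmission (like TCP) would catch and repair these flips; a one-way streaming link just shows them.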
     
  20. foxnews

    foxnews Notebook Consultant

    Reputations:
    145
    Messages:
    151
    Likes Received:
    0
    Trophy Points:
    30
    Hancock,

    here you are again. Are you still trying to "educate" people here with your "laptop shop" knowledge, as you did in the other threads?

    Why do you have to push people down in order to make your point? I feel bad for those who buy laptops from your "refurbished laptop shop".

    If you want to make a point, be constructive and professional in manner. Otherwise, you will make a fool of yourself. That is college 101.
     
  21. Tywin

    Tywin Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    This thread got long fast.

    I use Windows XP at home and at work with the Windows Classic desktop, so it looks like I'm using Windows 2000. The shade of blue of the system tray is way off using integrated video - much too dark. It's like it has difficulty with the processing/output of high-frequency light.
     
  22. Tywin

    Tywin Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    You don't have the faintest clue what you are talking about. HDMI is the UDP of digital video - it just spits out the data and doesn't care whether it is sent or read correctly. You don't use HDMI for professional digital video; you use BNC connectors. When the frequencies are that high, there are all sorts of things that can go wrong. In fact, the reason HDMI 1.3b was developed was that some manufacturers were making cables that met the 1.3a spec but were not capable of 1080p resolution. People making products that don't do what they are supposed to do. Imagine that.

    If you don't know anything about transmission line theory - and I know very little - then keep your mouth shut.
     
  23. DetlevCM

    DetlevCM Notebook Nobel Laureate

    Reputations:
    4,843
    Messages:
    8,389
    Likes Received:
    1
    Trophy Points:
    205
    Small question along the side:
    Can you stop this and get back to an objective discussion?
    I think this seems to go down the wrong path...
     
  24. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    Being that Hancock got banned like 2 days ago, I'd say the argument is over. Hopefully someone has some insight into what's wrong with Intel's drivers or whatever is causing the poor image quality on Intel GMA notebooks. I didn't see nearly the problems in 3D-rendered output that I did on the desktop/Explorer etc. I wonder if they focused too much effort on getting some gaming "cred" and just abandoned the basic necessities of the drivers?
     
  25. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    I believe the problem is in the color management of the drivers.

    For example: Nvidia supports digital vibrance setting while Intel doesn't.

    When I connected a Vaio Z with Nvidia 9300 through HDMI to my 22" LCD it looked better than anything I'd seen. Colors were popping off the screen.
     
  26. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    Any idea why the fonts look so terrible? That's my biggest issue with the thing, aside from the whacked-out gamma that I can get something to take care of.
     
  27. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    I don't know, but you might want to try enabling ClearType. That can make it a lot better.
     
  28. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    Actually, I just downloaded and ran QuickGamma, and just making some quick eyeball adjustments with its built-in gamma bars, it looks 1000% better. It looks like maybe it was so overdriven (I had to turn the gamma way down) that it was causing a shadow effect on the text.
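    The adjustment being made here follows the basic power-law gamma relationship. A hypothetical sketch (real drivers use per-channel lookup tables, and QuickGamma's exact curve isn't shown here):

```python
def apply_gamma(v, gamma):
    """Map a normalized channel value v (0.0-1.0) through a power-law gamma curve.
    gamma > 1 darkens midtones, gamma < 1 brightens them; black and white are unchanged."""
    return v ** gamma

# A mid-grey pixel under a few gamma settings:
for g in (0.8, 1.0, 1.8, 2.2):
    print(f"gamma {g}: 0.5 -> {apply_gamma(0.5, g):.3f}")
```

    Since only the midtones move, a gamma that is too far off can crush detail into shadows or wash it out, which is consistent with the "shadow effect" described above.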
     
  29. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    actually, it does...
     
  30. Phil

    Phil Retired

    Reputations:
    4,415
    Messages:
    17,036
    Likes Received:
    0
    Trophy Points:
    455
    Well the ones I worked with did not have that option.

    On what laptop do you have it, and can you show a screenshot?
     
  31. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    x3100, from my sig
    I can take a screenshot, but it's going to be in Russian. For some reason the installer decided that was the best option without even asking me. Not a big deal for me though ))

    [screenshot]