I notice that Intel Media Accelerators are now coming with HDMI outputs. That's nice, but in my experience with desktops the picture quality from Intel integrated video is atrocious even just looking at the Windows desktop. Can anyone vouch for the picture quality of Intel integrated video? I may play DVDs or Blu-ray on a notebook, but I have absolutely zero interest in gaming. The most expensive video accelerator I would consider is an ATI 3400 or an Nvidia 9300.
-
I have never noticed any picture quality difference between an Intel, Nvidia, or ATI card.
Please explain what you mean by worse picture quality. -
Same here - the picture looks the same: Intel X3100 vs Nvidia 8400GS.
(Except in games - say, Age of Empires 3)
I think you mean true HD - is that possible? Resolutions like 1600*900 or 1920*1080?
In that case a normal desktop should still look the same, as it's not really one of the most challenging tasks in terms of graphics. -
-
Intel's newest integrated solution, the HD4500, is capable of fully rendering and processing HD content with the same quality as a dedicated video card.
Dedicated cards are mostly good for gaming, but their performance in everything else is more or less similar to an integrated chip. -
What the OP is talking about is that the signal quality of dedicated cards is usually higher than that of integrated cards.
Generally speaking that's true, but there can be differences between individual laptops as well. -
Ah, OK - the output from the actual HDMI port might be worse on an IGP, but I think some manufacturers actually optimize their integrated GPUs for that purpose, so I think you're right that it depends.
-
HDMI is digital. Signal quality does NOT vary like it does with VGA output since, again, it's digital.
Amazing that people here actually thought HDMI is analog. It's almost mind-bending... -
It's not about analog versus digital. The color management on dedicated cards is often better than on integrated cards. In my opinion the image quality coming out of Nvidia HDMI is a lot better than the image quality of Intel IGP HDMI.
-
From my experience with the laptops in my sig:
The X4500 is definitely a much more powerful accelerator than the 7150 Go, but its image quality on fonts and the like sucks. I don't know if it's just a driver issue, or if the G50 comes with a cheaper screen, or what - they're both BrightView 15.4" widescreens - but the X4500 definitely doesn't look very good on web page text or Explorer window text, and its gamma is out of whack too. The gamma can be fixed, but I don't know what the deal is with the fonts displaying poorly. The X4500 definitely turned a new page in performance for Intel's integrated offerings, though. -
I agree, the X4500 just doesn't look as good as a dedicated Nvidia GPU, but I have a feeling it's mostly a driver issue, because the chip is certainly powerful enough to have decent picture quality.
-
Signal quality isn't determined so much by the chip itself as by the filters on the motherboard and the components around it. The same IGP on a different motherboard model can vary a lot, because some motherboards don't have good filters.
The same goes for HDMI. It is digital, but digital isn't 100% immune to image quality degradation. -
-
If you want Blu-ray, it is best to stick to an Nvidia or ATI card with at least 256MB of dedicated VRAM - something like an ATI 3470 or Nvidia 8400 GT at the very minimum. Intel may be catching up, but to be safe you should stick to a dedicated 3D accelerator. And don't forget about the screen too. In my experience, Sony XBrite Hi-Color is the best for watching Blu-ray. The color reproduction and black level are just amazing.
-
-
-
@ Hancock, please reread the first post of this thread and Ayle's post.
-
-
You can't say that digital is 100% immune to image degradation. In NORMAL PRACTICE it is, but if you ran an unshielded cable near a strong source of RF or other interference, I guarantee you'd see problems. With enough signal degradation, your 1's and 0's can be unreadable or just interpreted wrong. Unless you're transferring the data with a protocol that has checksums, and the equipment can use those checksums to detect and repair errors, it can happen (and I'm pretty sure HDMI is not that type of protocol).
Easy real world examples:
GSM cell phones use digital cellular, but call quality varies depending on signal quality. With digital satellite, bad weather degrades the signal and you get sync problems, picture problems, etc. Cat5 Ethernet drops packets. DVDs/CDs frequently see read errors. You may never notice these problems, but they're there. These are all digital mediums that suffer degradation when the digital information is incorrectly read or transferred. Most audio and video applications that use digital signals are not an "all or nothing" proposition: the signal will still be output even if the data is partially corrupted in transfer.
But ultimately, in the scenario the OP is talking about, I think the problem is more rooted in how the graphics processors handle rendering the raw data they're fed. I'm certainly not advocating voodoo science like $100 HDMI cables, but if you buy a really inferior HDMI cable that isn't properly shielded and operate it in an environment with above-average RF noise, you'll see image quality loss. -
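The checksum point above can be sketched with a toy example. This is plain Python, with a made-up `flip_random_bits` helper standing in for RF interference; real HDMI carries TMDS-encoded video with no retransmission, so treat this as an illustration of the principle, not of the actual protocol:

```python
import random
import zlib

def flip_random_bits(data: bytes, n_flips: int, seed: int = 42) -> bytes:
    """Simulate interference by flipping n random bits in a byte stream."""
    rng = random.Random(seed)
    buf = bytearray(data)
    for _ in range(n_flips):
        i = rng.randrange(len(buf))
        buf[i] ^= 1 << rng.randrange(8)
    return bytes(buf)

frame = bytes(range(256)) * 4       # pretend this is one frame's worth of data
checksum = zlib.crc32(frame)        # a sender using checksums would attach this

received = flip_random_bits(frame, n_flips=3)

# Without a checksum, the receiver just displays the corrupted pixels.
# With one, the error is at least detected and could be retransmitted.
print("corruption detected:", zlib.crc32(received) != checksum)
```

A link that only *sends* data (like HDMI) gets the first behavior: corrupted bits go straight to the screen as sparkles or artifacts instead of triggering a resend.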
Here you are again. Are you still trying to "educate" people here with your "laptop shop" knowledge, as you did on the other threads?
Why do you have to push people down in order to make your point? I feel bad for those who buy laptops from your "refurbished laptop shop".
If you want to make a point, be constructive and professional in manner. Otherwise you will make a fool of yourself. That is college 101. -
I use Windows XP at home and at work with the Windows Classic desktop, so it looks like I'm using Windows 2000. The shade of blue of the system tray is way off using integrated video - much too dark. It's like it has difficulty processing/outputting high-frequency light. -
If you don't know anything about transmission line theory, and I know very little, then keep your mouth shut. -
Small question on the side:
Can you stop this and get back to an objective discussion?
I think this is going down the wrong path... -
Seeing as Hancock got banned about two days ago, I'd say the argument is over. Hopefully someone has some insight into what's wrong with Intel's drivers, or whatever is causing the poor image quality on Intel GMA notebooks. I didn't see nearly the problems in 3D-rendered output that I did on the desktop/Explorer, etc. I wonder if they focused too much effort on getting some gaming "cred" and just abandoned the basic necessities of the drivers?
-
I believe the problem is in the color management of the drivers.
For example, Nvidia supports a digital vibrance setting while Intel doesn't.
When I connected a Vaio Z with an Nvidia 9300 through HDMI to my 22" LCD, it looked better than anything I'd seen. Colors were popping off the screen. -
Any idea why the fonts look so terrible? That's my biggest issue with the thing, aside from the whacked-out gamma that I can get something to take care of.
-
I don't know, but you might want to try enabling ClearType. That can make it a lot better.
-
Actually, I just downloaded and ran QuickGamma, and after some quick eyeball adjustments with its built-in gamma bars it looks 1000% better. It looks like maybe it was so overdriven (I had to turn the gamma way down) that it was causing a shadow effect on the text.
-
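For what it's worth, a gamma adjustment like QuickGamma's is just a power curve applied to each channel. A minimal sketch in Python (the 2.2 exponent is the common display assumption, not anything specific to Intel's driver):

```python
def apply_gamma(value: int, gamma: float = 2.2) -> int:
    """Map an 8-bit channel value through a power-law gamma curve.

    In this mapping, a higher gamma darkens midtones and a lower one
    brightens them. A default that is too low leaves midtones
    overdriven, which can bloom the anti-aliased edges of fonts into
    a shadow-like halo; turning gamma up pulls them back down.
    """
    normalized = value / 255.0
    corrected = normalized ** gamma
    return round(corrected * 255)

# A mid-gray pixel under different gamma settings:
for g in (1.0, 1.8, 2.2):
    print(g, apply_gamma(128, g))
```

Black and white stay fixed at 0 and 255 no matter the exponent; only the midtones move, which is why a gamma tweak fixes washed-out text without crushing the whole image.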
-
Which laptop do you have it on, and can you show a screenshot?
Intel Media Accelerator vs low end ATI/Nvidia
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Tywin, Dec 14, 2008.