Purchased a HDMI Monitor.. but..
Discussion in 'Alienware 17 and M17x' started by Sabaku, Aug 25, 2010.
-
It doesn't stretch across the entire display. It's a 1920x1080 Samsung Series 70 LED. When I hook it up over HDMI, hoping for better picture quality, it doesn't use the entire screen the way VGA does (also, auto adjust doesn't work with HDMI but does with VGA). I've tried GPU scaling, but it looks pretty much exactly the same as VGA, and I think I just wasted $30 on an HDMI cable haha. Am I doing something wrong, or does it really matter?
-
What resolution is it displaying at through HDMI?
-
It says 1920x1080 and won't go any higher. There's a black border all around the screen.
-
It's 1080p, so it is already at max resolution. As far as the black border goes, try to adjust it via the TV's viewing options.
-
It wouldn't let me; the only way I could figure it out was to use GPU scaling.
-
Does your monitor/TV show what resolution it's receiving?
For example, when I connect to my TV via HDMI, it tells me I'm on HDMI 1 or 2 (depending on the port #) and the resolution it's displaying.
When I connected for the first time, I didn't have any borders, but it was only displaying at 12xx by 780p, so I changed that to 19xx by 1080p and everything worked.
I did not have to mess with GPU scaling at all; the only thing I changed on the laptop was the resolution in the NVIDIA Control Panel.
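If you'd rather check or force the mode without hunting through the GUI, here is a rough sketch in plain Win32 C. It is vendor-neutral (not the NVIDIA Control Panel itself), and the 1920x1080 value is just assumed to be your panel's native mode; it reads the current mode and asks the driver for 1080p, which is essentially what the control panel does.

#include <windows.h>
#include <stdio.h>

/* Build with a Windows compiler and link against user32. */
int main(void) {
    DEVMODE dm = {0};
    dm.dmSize = sizeof(dm);

    /* Read the current mode of the primary display. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "Could not read the current display mode\n");
        return 1;
    }
    printf("Current mode: %lux%lu @ %lu Hz\n",
           dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Ask for the panel's native resolution (1920x1080 assumed here). */
    dm.dmPelsWidth  = 1920;
    dm.dmPelsHeight = 1080;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    LONG rc = ChangeDisplaySettings(&dm, CDS_TEST);  /* dry run first */
    if (rc == DISP_CHANGE_SUCCESSFUL) {
        ChangeDisplaySettings(&dm, 0);                /* actually apply it */
        printf("Switched to 1920x1080\n");
    } else {
        printf("Driver rejected 1920x1080 (code %ld)\n", rc);
    }
    return 0;
}

If the driver rejects the mode here, the problem is the driver/cable/EDID combination rather than the scaling slider.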
-
Same thing happened to me. It is easy to resolve; you just have to know where to go. And it is not intuitive AT ALL.
1. Open Catalyst Control Center.
2. From the drop-down menu at top left, select Desktops and Displays.
3. At the bottom of this window there will be a picture of a monitor for each display you have active.
4. Pick the one that represents your HDMI display. If you aren't sure which is which, you can right-click the picture and select Identify Display.
5. Right-click the display attached via HDMI and select Configure.
6. Across the top of the window there will now be a series of buttons named Attributes, Avivo Color, Scaling Options, etc.
7. Click on Scaling Options.
8. Move the slider to the right until your image fills the screen the way you want (see the note after these steps for what the slider is actually compensating for).
9. Click OK to save the changes.
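For what it's worth, the black border exists because the driver "underscans" the HDMI output by default, assuming the display is a TV that crops the edges. A quick sketch of the arithmetic the step 8 slider is undoing (the 10% figure is only an assumed example, not necessarily what Catalyst uses):

#include <stdio.h>

int main(void) {
    const int native_w = 1920, native_h = 1080;
    const double underscan = 0.10;   /* assumed 10% underscan, for illustration */

    /* The driver shrinks the image by the underscan factor and centers it,
       so everything outside the scaled area shows up as a black border. */
    int scaled_w = (int)(native_w * (1.0 - underscan));
    int scaled_h = (int)(native_h * (1.0 - underscan));

    printf("Image drawn at %dx%d inside a %dx%d panel\n",
           scaled_w, scaled_h, native_w, native_h);
    printf("Border: %d px left/right, %d px top/bottom\n",
           (native_w - scaled_w) / 2, (native_h - scaled_h) / 2);
    return 0;
}

Sliding the scaling control to 0% underscan puts the image back at the panel's full 1920x1080.
-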
As far as the difference between HDMI and VGA goes: VGA cannot carry audio, HDMI can. VGA is analog and HDMI is digital; your video card is natively digital and your flat panel is natively digital. So to use VGA, the card has to downconvert the signal to analog (which is not lossless), and the display then upconverts it back to digital once the signal reaches it.
In many situations you will not notice a difference, but sometimes you will. It is worth mentioning that digital signals do not care about refresh rate - a pixel is either "off" or it is "on". So things like tearing and scanline blurring just don't happen with HDMI.
In fact, I am right now looking at an HDMI screen and a VGA screen, both attached to my R2. The HDMI display is as crisp as can be - there is zero blurring of text. The VGA display is still fairly crisp; however, when compared side by side with the HDMI display, I can definitely notice a difference in text sharpness.
-
Go to Monoprice.com; you can get HDMI cables for a lot cheaper than $30.