The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Advantages of HDMI-out on Laptops?

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by F=ma, Jul 26, 2007.

  1. F=ma

    F=ma Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    I realize that if you have a Blu-ray player on your laptop, you benefit tremendously from having an HDMI out by being able to display at a full 1080p resolution; however, for notebook computers that don't have next-generation optical drives, or for normal non-HD movie playing, what advantages does HDMI provide over VGA other than being purely digital?

    Is the maximum resolution better/higher?


    Suppose I was using my notebook at home and HDMI'd it out to my 1080i/720p HDTV. Would I be able to tell a difference over VGA'ing out to it?
     
  2. RogueMonk

    RogueMonk Notebook Deity

    Reputations:
    369
    Messages:
    1,991
    Likes Received:
    0
    Trophy Points:
    55
    Yes, you would.
     
  3. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    The notebook would theoretically be able to display full HDTV resolution on your screen even if your original content wasn't HDTV.

    It would make HD game trailers look absolutely fantastic if you had HDMI. VGA would be so-so, about the same as digital television.

    HDMI is much better than VGA. S-Video is slightly better than VGA.

    If money is tight and you never plan on watching HD content and such or gaming on your big screen, then VGA is adequate (in other words, what I use because I don't even have HDMI out). HDMI pulls out all of the stops and turns your screen into an "extension of the computer itself".
     
  4. Akilae Hunter

    Akilae Hunter Notebook Consultant

    Reputations:
    7
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    Most HDTVs also have a DVI plug.
    A single DVI connector can push the HD resolution in pixels no problem, but there's other technical stuff about it that I don't quite get...

    How about comparing HDMI to DVI? I'd imagine DVI is its biggest competitor, not VGA.
     
  5. Rsaeire

    Rsaeire Notebook Guru

    Reputations:
    0
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    HDMI and DVI are very similar. The main differences are size, bandwidth, encryption and sound.

    Size – The HDMI connector is a third the size of DVI.

    Bandwidth – HDMI = 10.2 Gbit/s vs. DVI's 3.7 Gbit/s (single link) or 7.4 Gbit/s (dual link)

    Encryption – HDMI is compatible with the High-bandwidth Digital Content Protection (HDCP) digital rights management technology needed to watch HD-DVD or Blu-ray content.

    Sound – HDMI also allows the pass-through of 8-channel uncompressed digital audio in conjunction with a video signal.


    See also here for additional information.
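
    To put those figures in perspective, here is a minimal Python sketch of the arithmetic (assuming 24-bit colour and the standard 148.5 MHz pixel clock for 1080p60, blanking included; both are assumptions for illustration):

        # Rough estimate of the uncompressed 1080p60 video payload,
        # compared against the link figures quoted above.
        PIXEL_CLOCK_HZ = 148.5e6   # standard CEA timing for 1920x1080 @ 60 Hz
        BITS_PER_PIXEL = 24        # assumed 8 bits per RGB channel

        payload_gbit = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9
        print("1080p60 payload: %.2f Gbit/s" % payload_gbit)  # ~3.56 Gbit/s

        for name, capacity in [("DVI single-link", 3.7),
                               ("DVI dual-link", 7.4),
                               ("HDMI", 10.2)]:
            verdict = "fits" if payload_gbit <= capacity else "does not fit"
            print("%s (%.1f Gbit/s): 1080p60 %s" % (name, capacity, verdict))

    In other words, even single-link DVI has headroom for 1080p60; HDMI's extra bandwidth matters for higher resolutions, deeper colour, and the audio channels.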
     
    Last edited by a moderator: May 5, 2015
  6. Akilae Hunter

    Akilae Hunter Notebook Consultant

    Reputations:
    7
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    HA, most laptops would crap themselves if they ever had to fully crank out 1080p with all 8 audio channels.

    Down with HDCP, too!

    I want to kick the arse of the guy who thought up the HDCP DRM crap, with every player and every cable having to be compliant in just a certain way...
     
  7. Rsaeire

    Rsaeire Notebook Guru

    Reputations:
    0
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    Tell me about it! As far as I recall it was the studios who were unwilling to release their movies in a new format that would be as easily accessible as DVD. They really thought they knew better before DVD came out, with their encryption method CSS, and look where that got them. The same can be said for AACS in relation to HD-DVD and Blu-ray.

    They've been shoving DRM down our throats for approximately 10 years and the answer from the market is still a resounding "no thank you".
     
  8. F=ma

    F=ma Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    I'm currently running my computer to a 1080i/720p HDTV; the video output is at 1280x720 resolution. I'd like it to be more. What is the limiting factor?

    The screen resolution supported by the video card on my computer?
    The VGA cable?
    The fact that the HDTV itself is only 1080i/720p instead of 1080p?


    I would like to run at a higher resolution because I view the 50" TV from about 8 feet away, and small text is unclear. What would it take for me to run something like 1600x900ish or 1920x1080?

    A new video card? An HDMI cable? A 1080p TV? All of the above?
     
  9. matt_h1

    matt_h1 Notebook Deity NBR Reviewer

    Reputations:
    319
    Messages:
    1,667
    Likes Received:
    0
    Trophy Points:
    55
    VGA is substantially higher quality than S-Video.
     
  10. leftside

    leftside Notebook Geek

    Reputations:
    0
    Messages:
    85
    Likes Received:
    0
    Trophy Points:
    15
    > What would it take for me to run something like 1600x900ish or 1920x1080?
    Your TV needs to support those resolutions. A 1080i/720p HDTV is most likely 1280x720 resolution (or something very close to that).
     
  11. matt_h1

    matt_h1 Notebook Deity NBR Reviewer

    Reputations:
    319
    Messages:
    1,667
    Likes Received:
    0
    Trophy Points:
    55
    If your TV is 1080i, the maximum screen res you can have is 1920x1080, which is a lot higher than the 1280x720 res you have it at now. I'm not sure if VGA can do screen resolutions that high.

    If your TV is true 1080i then it should be able to accept 1920x1080. My mum's TV is a 26" and it does some odd resolution, 1366x768 I think.
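
    Whether VGA can drive a mode is really a pixel-clock question. A rough sketch of the arithmetic (the ~25% blanking overhead and the 400 MHz RAMDAC limit are assumptions, typical for video cards of this era):

        # Estimate the pixel clock a mode needs and check it against a
        # typical RAMDAC limit. Both constants are illustrative assumptions.
        BLANKING_OVERHEAD = 1.25   # ~25% of each frame is blanking
        RAMDAC_LIMIT_MHZ = 400.0   # common spec for mid-2000s cards

        def pixel_clock_mhz(width, height, refresh_hz=60):
            return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

        for w, h in [(1280, 720), (1920, 1080), (2048, 1536)]:
            clock = pixel_clock_mhz(w, h)
            ok = "within" if clock <= RAMDAC_LIMIT_MHZ else "beyond"
            print("%dx%d@60: ~%.0f MHz, %s a %.0f MHz RAMDAC"
                  % (w, h, clock, ok, RAMDAC_LIMIT_MHZ))

    By that estimate even 1920x1080@60 (~156 MHz) is comfortably within what a typical VGA output can generate; the practical limit is usually the TV, not the connector.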
     
  12. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Some laptops have component out with an adapter cable that plugs into a special port. In fact, many people seem to confuse this special port with s-video.
     
  13. F=ma

    F=ma Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    So right now, at 1280x720, I have medium/large-ish text that's not so readable.

    If I bump up to 1600x900 or 1920x1080, will I just end up with small text that's not so readable? Or will the readability get better?

    I just need better readability, and I'm wondering what my options are. I currently run a 1080i 50".

    Will getting a 60" 1080i instead of a 50" 1080i be better than just getting a 1080p 50"? Or does the progressive scan make more difference than the screen real estate?

    I'm sure ideally a 60" 1080p would be preferred, but I'm trying to work on a limited budget.
     
  14. ldiamond

    ldiamond Notebook Evangelist

    Reputations:
    3
    Messages:
    571
    Likes Received:
    0
    Trophy Points:
    30
    HDMI has more bandwidth, yes, but part of it is for sound...

    HDCP is only a protocol; DVI works with HDCP as well...


    Digital is better, unless you can't display it correctly... If the bottleneck is your display, DVI/HDMI won't change a thing!
     
  15. matt_h1

    matt_h1 Notebook Deity NBR Reviewer

    Reputations:
    319
    Messages:
    1,667
    Likes Received:
    0
    Trophy Points:
    55
    I was running my projector through S-Video for a few weeks before I could get a long enough VGA cable. My text went from blurry to crystal clear; same resolution, just a different cable.
     
  16. Rsaeire

    Rsaeire Notebook Guru

    Reputations:
    0
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    ldiamond - Yes, my mistake. HDCP can be used over a DVI connection.
     
  17. Phritz

    Phritz Space Artist

    Reputations:
    68
    Messages:
    1,276
    Likes Received:
    0
    Trophy Points:
    55
    Nearly all 720p TVs are 1366x768, dunno why though. 42" LG for me; I plug in via a DVI-to-HDMI cable and an optical cable to my receiver. I like that HDMI is all in one, but I want to separate the audio and video: video to my TV and audio to my receiver.

    Does anyone else's laptop know exactly what model TV they have? As soon as I plug in (both lappies), they both know exactly what model TV I have... through VGA and DVI->HDMI.
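
    The model detection works because VGA, DVI, and HDMI all carry a DDC channel over which the display sends a 128-byte EDID block (plus optional extensions) describing itself. A minimal sketch of how the vendor and monitor name are packed in that block (the sysfs path is Linux-specific and the connector name is a placeholder that varies per machine):

        # Pull the manufacturer ID, product code, and monitor name out of a
        # raw EDID block. On Linux the kernel exposes it under /sys/class/drm/.
        EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"  # hypothetical connector

        def parse_edid(edid):
            assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "bad EDID header"
            # Bytes 8-9: three 5-bit letters packed big-endian ('A' encodes as 1).
            word = (edid[8] << 8) | edid[9]
            vendor = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1)
                             for s in (10, 5, 0))
            product = edid[10] | (edid[11] << 8)  # little-endian product code
            # The monitor name lives in an 18-byte display descriptor tagged 0xFC.
            name = ""
            for off in (54, 72, 90, 108):
                desc = edid[off:off + 18]
                if desc[:3] == b"\x00\x00\x00" and desc[3] == 0xFC:
                    name = desc[5:18].decode("ascii", "replace").split("\n")[0].strip()
            return vendor, product, name

        with open(EDID_PATH, "rb") as f:
            print(parse_edid(f.read(128)))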
     
  18. Lil Mayz

    Lil Mayz Notebook Deity

    Reputations:
    599
    Messages:
    1,463
    Likes Received:
    0
    Trophy Points:
    55
    Even with the greater bandwidth advantages, you won't be able to tell the difference in picture quality between VGA and HDMI unless you've got a very top-end, high-resolution monitor. VGA supports 1080p resolutions, so you should be fine. Furthermore, very few laptops have HDMI outputs, HDMI cables are very expensive, and your sound quality would probably be better if you used a standalone audio interface.
     
  19. Lt.Glare

    Lt.Glare Notebook Evangelist

    Reputations:
    171
    Messages:
    500
    Likes Received:
    0
    Trophy Points:
    30
    Plus, again, HDMI uses DRM, and personally I think any and all technology that employs DRM should be boycotted. VGA and DVI do not.
     
  20. Rsaeire

    Rsaeire Notebook Guru

    Reputations:
    0
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    I agree with Phritz. I really do not want my audio and video on the same cable. I also do not understand the reason why audio was implemented within the HDMI spec in the first place. It makes sense if you have a home theatre setup with a separate surround speaker set, HD-DVD or Blu-ray set top player and a receiver, but since HDMI is being implemented on desktop graphics cards and laptops also, it really does not make sense in the latter situations. I imagine the people who benefit by having audio and video together within the one cable are few and far between.
     
  21. fox_91

    fox_91 Notebook Guru

    Reputations:
    2
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I was going to say, S-Video is slightly better than RCA (composite)... Is there that big a difference from VGA to HDMI, unless you are going to watch HD movies? VGA would be able to push out any reso your laptop will produce, and VGA is capable of putting out an HD signal, but it's just analog, which would make it bad for things like Blu-ray and HD-DVD. But is there really that much of a difference?

    I always kind of thought the point of HDMI was to carry audio and video in one cable. If you were just putting video through it, it's not much different from DVI, except that the connector is much smaller.
     
  22. FusiveResonance

    FusiveResonance Notebook Evangelist

    Reputations:
    143
    Messages:
    421
    Likes Received:
    0
    Trophy Points:
    30
    How about using a DVI-to-HDMI cable? So I'd be plugging the DVI end into my laptop while the HDMI end plugs into my TV. Would I get transmission of audio? Is the cable treated as a DVI cable or an HDMI cable? So if I were looking at specifications such as max res, do I look them up for a DVI cable or an HDMI one? I'm assuming it's DVI.
     
  23. mikeymike

    mikeymike Notebook Evangelist

    Reputations:
    70
    Messages:
    696
    Likes Received:
    0
    Trophy Points:
    30
    Nope, you're incorrect here.

    Both VGA and S-Video are capable of transmitting 480 analog lines vertical max!
    And some higher-end pro S-Video cables and devices can transmit 575i lines.


    And you're right here, fox. VGA is analog and can never be better than or equal to HDMI/DVI.
    People may not notice a visual diff with VGA, but in reality the VGA signal is converted. Also, VGA doesn't support HDCP, so it can never be as pure as HDMI/DVI. If you are playing HD content then you need to go through an HDCP-compliant DVI or HDMI connection. Plain and simple!
     
  24. fox_91

    fox_91 Notebook Guru

    Reputations:
    2
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15

    Ok, maybe the original spec for VGA couldn't put out over 640x480, but this can't be true any longer. I know for a fact I run my 360 over VGA at resolutions around 1380x768. VGA can put out an HD signal, just not the ones from an HD-DVD or Blu-ray, because of that encoding. A VGA monitor can only handle 640x480, but I don't think the cable itself is constrained to this resolution.

    Linky

    I understand what you are saying about VGA; yes, the reso is what you stated, but I believe VGA is used more generally now. Something like WSXGA+ (1680x1050) can be run over a VGA cable. It's not the cable that was called VGA, it was the display mode of 640x480, so yes, VGA is 640x480, but that's not what the cable is capable of.

    Video Standards

    The above link shows the video standards. I believe that most if not all of those can be put over a VGA cable... you will notice VGA is on there, but that's the video standard of 640x480, not the VGA cable's max resolution. (I suppose we could start calling the cable a "15-pin video adapter" but I don't know if it would catch on, lol.)
     
  25. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    I'm a bit confused by this talk of VGA being constrained to low resolutions. My girlfriend is running her 24" LCD monitor at 1920x1200 over VGA.
     
  26. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Most laptops would crap out displaying 1920x1080? Because that's all 1080p is. They have been doing it for years.
     
  27. ninjafish

    ninjafish Notebook Guru

    Reputations:
    1
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    I love the HDMI on my laptop; I can watch all my high-quality DVD rips on my 42" Samsung :D 720p

    It took me a while to find a resolution that looked good on the widescreen (also messing around with the different formats/zooms); I think I ended up with something under 1200xsomething.
     
  28. mikeymike

    mikeymike Notebook Evangelist

    Reputations:
    70
    Messages:
    696
    Likes Received:
    0
    Trophy Points:
    30
    Ok, let's look at the '15-pin video interconnect cable' a different way.

    Go research the 'bandwidth' of a 15-pin VGA cable. You will find out that the max is approx 200 MHz.
    Now what is the bandwidth of an HD movie or other HD content that will travel over such a cable?
    No matter how you look at it, VGA is not and was not designed to carry high res or high bandwidth over its interconnects. The VGA cable's specs are only designed to optimize the VGA output of 640x480.

    If you are playing an HD or Blu-ray movie and transmitting it over VGA, in essence that's like a Ferrari driving in a 20 mph zone constantly.
     
  29. lazybum131

    lazybum131 Notebook Evangelist

    Reputations:
    203
    Messages:
    532
    Likes Received:
    0
    Trophy Points:
    30
    There's a difference between the VGA display standard that is about resolution (i.e. VGA, XGA, WXGA, etc.) and the VGA connector.

    We're talking about the VGA connector here, which can output much higher than just 1080p. Most video cards have RAMDACs that can output up to 2048x1536 (QXGA) over VGA. It is definitely much higher quality than S-Video, which can't even output a low-res progressive signal. I'm no expert, but seeing how VGA can output much higher than 1080p to monitors, I don't think analog bandwidth with the actual connector or cable is a big issue.

    The quality of the electronics and the conversions from digital to analog and back again is probably a bigger factor, so it depends on the output device and the television. As far as HDMI/DVI vs. VGA, I'd say for the most part people wouldn't be able to tell a difference on a TV unless there was something wrong along the analog path.
     
  30. fox_91

    fox_91 Notebook Guru

    Reputations:
    2
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15

    I believe that is incorrect... Fine, you say a VGA cable has a bandwidth of 200 MHz, that's great. But a VGA cable can put out resolutions over 1080p. Now if I had a choice, would I use HDMI over VGA... Yes, and I would use DVI over VGA as well. But the argument was that S-Video is just as good as VGA. Now VGA in the true sense of the word, yes, is 640x480. A VGA cable however can carry much larger resolutions. I don't understand what the argument is... You can't argue with me and say a VGA cable can't carry an HD signal, because that's just wrong. Yes, it isn't a digital signal, and DVI or HDMI is better for the most part, but it can do it... period.
     
  31. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Go research 'bandwidth'. You will find out that MHz is not a valid unit for its measurement.
     
  32. mikeymike

    mikeymike Notebook Evangelist

    Reputations:
    70
    Messages:
    696
    Likes Received:
    0
    Trophy Points:
    30

    OK, I need to clear things up here. Firstly, VGA cannot, and I'll say it again, cannot transmit a pure uncompressed 1080p video signal either at its output connector or through VGA interconnects.
    As soon as an HD vid signal leaves your laptop via VGA it is 640x480 max.
    That's the VGA limitation, period.

    Every LCD, DLP, or plasma monitor has a built-in video scaler and upconverter, because there are just too many different types of resolutions and devices that can be connected to the monitor. This scaler is built within the TV or computer monitor itself.
    A video scaler, whether it be an ADC (analog-to-digital converter), a DAC (digital-to-analog) or a combo of them, does all the work to make you think that the VGA is actually transmitting 1080p.

    Once you connect your VGA interconnect cable to the back of your monitor, the resolution is 640x480. The video scaler will scale and upconvert the signal to 1280x1024 pixels (if your monitor's native res is such) or to whatever is closest to your external monitor's native res.

    A sure way to test this yourself is to connect your computer to an older external TV, which most likely won't have an internal video scaler, and you'll wonder why the picture is so crappy. Even if your laptop monitor is 1900x1200.

    With HDMI or DVI there is no scaling or upconversion whatsoever. Well, I shouldn't say none, but there's very little, as it's mostly just scaling to fit (in some cases).
     
  33. mikeymike

    mikeymike Notebook Evangelist

    Reputations:
    70
    Messages:
    696
    Likes Received:
    0
    Trophy Points:
    30
    Wanna bet???

    Methinks it's you who should go research 'bandwidth' to get some much-needed know-how.

    I suspect the only bandwidth measure you're aware of is the digital form.

    Analog bandwidth is measured in hertz, as in my noted 'MHz' (megahertz; Hz = cycles per second), as it relates to the VGA cable interconnect, because it's analog in nature.

    Digital bandwidth is measured in bits. I won't bother explaining this one, as it's one you already know.
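
    For what it's worth, the two units describe the same link: an analog cable's useful figure is the highest pixel clock it passes cleanly, and multiplying that by the bits per pixel gives the equivalent digital rate. A quick sketch of the conversion, treating the 200 MHz figure cited above as a pixel clock (24 bpp and ~25% blanking are assumptions):

        # Convert an analog bandwidth figure (max pixel clock, MHz) into an
        # equivalent digital rate and the pixels per frame it could carry
        # at 60 Hz. Constants are illustrative assumptions.
        BITS_PER_PIXEL = 24
        BLANKING_OVERHEAD = 1.25

        def equivalent_gbit(analog_mhz):
            return analog_mhz * 1e6 * BITS_PER_PIXEL / 1e9

        def pixels_per_frame_60hz(analog_mhz):
            return analog_mhz * 1e6 / (60 * BLANKING_OVERHEAD)

        for mhz in (200, 400):
            print("%d MHz analog ~ %.1f Gbit/s digital, ~%.1f Mpixels/frame at 60 Hz"
                  % (mhz, equivalent_gbit(mhz), pixels_per_frame_60hz(mhz) / 1e6))

    On those assumptions even a 200 MHz analog link works out to roughly 2.7 megapixels per frame at 60 Hz, which is more than 1080p's 2.1.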
     
  34. Wiz33

    Wiz33 Notebook Deity

    Reputations:
    54
    Messages:
    1,037
    Likes Received:
    19
    Trophy Points:
    56
    Wow! What a load of crap! You have no idea what you are talking about, do you? VGA as a video card connector will carry 1080p with no problem; all you need to do is set the LCD to native mode (no scaling) and set a resolution matching that of the screen, and you'll get a full screen. I'm driving one of my 24" LCDs with VGA at 1920x1200 right now, as I have both a desktop and a laptop hooked up to it. The only drawback is that, as an analog signal, it is easier for it to pick up electrical noise from other devices and cables.

    Now, VGA as a video format was 640x480 originally, but the term VGA connector has become a generic term for an analog video card output that spans a much higher resolution range.

    All PC video signals are digital to start with, but they were converted to analog and sent through a VGA connector because all CRT-type monitors are analog devices. With the advent of the LCD (which is a digital device), it's much more effective to go digital from the PC to the LCD, since it gives a much cleaner signal path and is less prone to video noise, which brings us to the DVI connector and cable.

    HDMI is basically DVI with an additional pathway for the audio signal, which offers a single-cable solution for HD content vs. the 4/5-cable solution of component video (3 video and 1-2 audio) or the 2/3-cable solution of a DVI connection (1 DVI video and 1-2 audio).
     
  35. lazybum131

    lazybum131 Notebook Evangelist

    Reputations:
    203
    Messages:
    532
    Likes Received:
    0
    Trophy Points:
    30
    Wowowowow :twitcy:, now I definitely know you have absolutely no clue what you're talking about. Have you ever tried to use a CRT monitor before? You must be really young if you haven't ever had to use VGA (aka RGB connector, D-sub 15).

    Digital fixed-pixel displays have scalers because they have a fixed number of pixels and would have to scale both digital and analog video signals to display properly on the screen. The scaler has absolutely nothing to do with the conversion from analog to digital. If the digital display is fed a non-native-resolution digital signal, a scaler would still be required.

    The RAMDAC on a video card converts the finite number information (digital) into analog form; it absolutely does not somehow downscale to 640x480. That would make absolutely zero sense: you would be implying that all CRT monitors that have been capable of displaying higher than 640x480 did so by automagically pulling information out of nowhere.

    Your test makes no sense either. Old CRT televisions are crap for connecting computers cuz their 'native resolution', as in the number of phosphor dots (subpixels), is extremely low compared to a computer monitor, and they are interlaced displays. It has nothing to do with the VGA connector.
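
    The DAC step is per-pixel, not per-mode: each 8-bit sample just becomes a voltage on the wire (VGA's RGB lines swing 0 to 0.7 V), so the resolution passes through untouched. A toy sketch of one channel:

        # Toy model of one RAMDAC channel: each 8-bit sample maps to a
        # voltage on the 0-0.7 V VGA swing. No downscaling happens; the
        # DAC just emits one sample per pixel at the pixel clock rate.
        FULL_SCALE_V = 0.7

        def dac(sample_8bit):
            return FULL_SCALE_V * sample_8bit / 255

        scanline = [0, 64, 128, 255]  # a few 8-bit red-channel samples
        print([round(dac(s), 3) for s in scanline])  # [0.0, 0.176, 0.351, 0.7]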
     
  36. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    I am aware of analog bandwidth meaning a range of frequencies, but I didn't know it could simply refer to the max frequency of a cable. All cables have a max frequency rating, including digital cables. I don't think it's a very good use of the word to apply it to a VGA cable or other multi-conductor cable, because the max frequency doesn't describe the max throughput when there is more than one conductor present. However, I was wrong and the term is indeed used in this situation.

    Edit: In terms of VGA, is the bandwidth you cited a max frequency or a range? On second thought, I still don't see how the term can apply to a single frequency and I think it is (commonly) misused. If it is a range then I completely understand.

    Yeah, definitely.
     
  37. Lt.Glare

    Lt.Glare Notebook Evangelist

    Reputations:
    171
    Messages:
    500
    Likes Received:
    0
    Trophy Points:
    30
    Regardless, unless you have a gigantic 60-inch TV that supports a resolution above 1800x1600 or whatever, with an HDMI/DVI cord you're not gonna REALLY notice a huge quantum leap in performance versus VGA.

    A digital video signal is, as of yet, still not really needed.

    I'd also like to add that, if the signal that comes through a VGA cable is only 640x480 and is blown up by the TV's hardware, I wish they would apply the same enlarging method to Photoshop/Paint/et al.
     
  38. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    Don't forget that you would need a super high resolution source as well. :p

    Digital is nice, however, in cases where there is noticeable EM interference on the cable.

    I would like to point out the sarcasm here, as some people may not catch it. I won't name any names...
     
  39. Wiz33

    Wiz33 Notebook Deity

    Reputations:
    54
    Messages:
    1,037
    Likes Received:
    19
    Trophy Points:
    56
    Another total myth. That may be true for traditional TV viewing where you sit a fair distance away from the screen, but with a lot of HDTVs now doubling as PC monitors, you tend to be closer to the screen and can easily pick up the little bit of video noise and such from a VGA connection (especially if you have a lot of other electronic gadgets around that may be giving off electrical noise).

    A perfect example is my office, where I have both my laptop and desktop connected to a Dell 2407. Since it only has 1 VGA and 1 DVI input, one of the PCs has to go analog. Most of the time you won't be able to see the difference between the two inputs, but under certain background pattern and color combos you can definitely pick out noise lines on the screen from the VGA input, while there is no such problem with the same image coming from the DVI.
     
  40. squeakygeek

    squeakygeek Notebook Consultant

    Reputations:
    10
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    I have never noticed interference on a VGA connection except once on my dad's PC when he used an extra-long, extra-cheap extension cable. If you have a halfway decent cable of a reasonable length, you usually won't have problems with interference. Not to say it doesn't happen; I definitely wouldn't call it a myth.

    Edit: On a side note, make sure you have the refresh rate set correctly. I once had problems with what looked like possible interference, but it turned out I just didn't have it set to the right refresh rate. The "wrong" rate was actually given as an option by the driver.
     
  41. Wiz33

    Wiz33 Notebook Deity

    Reputations:
    54
    Messages:
    1,037
    Likes Received:
    19
    Trophy Points:
    56

    The cables were good; it's just that the office environment has a lot of EM noise.
     
  42. Rsaeire

    Rsaeire Notebook Guru

    Reputations:
    0
    Messages:
    68
    Likes Received:
    0
    Trophy Points:
    15
    How did this thread turn from "HDMI-out on Laptops" to "let's all talk about VGA"? Seriously guys, let it go. All points have been made, and just to be clear I'll make them one last time.

    VGA resolution = 640x480

    VGA/D-sub 15 cable = an analogue cable which can connect, for example, a video card to a computer monitor.


    Now, back to the topic at hand. Who has HDMI-out on their laptops and who uses it? I can't imagine there are that many people who would have a laptop with HDMI-out as well as an HDTV/HD-capable monitor with HDMI-in.
     
  43. ls6

    ls6 Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    So what would happen if I connected a 1920x1200 WUXGA laptop to my 1900x1080 40" Sony LCD using an HDMI cable? Would it scale the 1920x1200 down to 1900x1080 or just crop the extra pixel lines? What about a 1680x1050 WSXGA+ laptop to the same LCD?
    Thanks
     
  44. ez2remember

    ez2remember Notebook Evangelist

    Reputations:
    28
    Messages:
    494
    Likes Received:
    0
    Trophy Points:
    30
    I'd be surprised if your Sony LCD TV is 1900x1080. It's most likely 1920x1080, which is "true" HDTV resolution. Anyway, from 1920x1200 to 1920x1080 it will downscale or crop, depending on whether it uses 1:1 pixel mapping. In most cases it will downscale.

    With your other scenario, 1680x1050, it's most likely to upscale. If your TV supports the option of 1:1 pixel mapping then you can choose either.
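
    The arithmetic behind those two options, as a small sketch: aspect-preserving scale-to-fit versus 1:1 pixel mapping (which crops whatever exceeds the panel):

        # Compare "scale to fit" with 1:1 pixel mapping on a 1920x1080 panel.
        PANEL_W, PANEL_H = 1920, 1080

        def scale_to_fit(src_w, src_h):
            s = min(PANEL_W / src_w, PANEL_H / src_h)
            return round(src_w * s), round(src_h * s)

        def one_to_one(src_w, src_h):
            # Pixels map directly; excess is cropped, deficit gets borders.
            return min(src_w, PANEL_W), min(src_h, PANEL_H)

        for src_w, src_h in [(1920, 1200), (1680, 1050)]:
            fit_w, fit_h = scale_to_fit(src_w, src_h)
            one_w, one_h = one_to_one(src_w, src_h)
            print("%dx%d -> scaled %dx%d, 1:1 shows %dx%d"
                  % (src_w, src_h, fit_w, fit_h, one_w, one_h))

    So scaling a 1920x1200 desktop preserves everything but shrinks it to 1728x1080 with pillarboxing, while 1:1 mapping stays pixel-sharp but crops 120 lines.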
     
  45. Wiz33

    Wiz33 Notebook Deity

    Reputations:
    54
    Messages:
    1,037
    Likes Received:
    19
    Trophy Points:
    56

    If your laptop properly detects the 40", you should be able to set a 1920x1080 desktop.
     
  46. ls6

    ls6 Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    OK, so if my laptop display is 1680x1050 and I connect the laptop to my 1080p LCD, I should be able to adjust the GPU to display 1920x1080?
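
    Right; the external output runs as its own head, so it isn't tied to the panel's 1680x1050. As a sketch of the kind of call involved on a Linux box (output names such as HDMI-1 and LVDS-1 are placeholders that vary by driver; on Windows the same setting lives in the graphics control panel):

        # Drive the external display at 1920x1080 next to the laptop panel.
        # Run `xrandr` with no arguments first to learn the real output names.
        import subprocess

        subprocess.run(["xrandr", "--output", "HDMI-1", "--mode", "1920x1080",
                        "--right-of", "LVDS-1"], check=True)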
     
  47. Wu Jen

    Wu Jen Some old nobody

    Reputations:
    1,409
    Messages:
    1,438
    Likes Received:
    0
    Trophy Points:
    55
    Don't forget the nice 'FEATURE' that Vista offers if you're using it. It automatically downscales your output to a lower resolution unless your equipment is all HDCP compliant.
     
  48. Blotto

    Blotto Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5
    More FUD in a thread that seems to be brimming with it. Vista does not downscale anything if your equipment is not HDCP compliant. Your Blu-ray/HD-DVD playback software (CyberLink PowerDVD etc.) will downscale if you are not HDCP compliant, just like any other hi-def player on the market or the same software on any supported OS. Normal desktop operation, games, and any non-AACS-protected media are not affected by any downscaling under any circumstance. People are *****ing about the DRM of the evil Windows Vista without realizing that the OS really doesn't do anything that WinXP MCE didn't, except add support for new technologies (OCUR, etc.) which intrinsically require some form of DRM.

    HDMI - Same video signal (as far as computers are concerned) as DVI.
    VGA - Same RGB signal as DVI/HDMI, just converted to analog for transmission. It has no 640x480 resolution limit and can display 1080p and beyond with the same quality as HDMI/DVI, provided the ADC, DAC, and cable are all of high enough quality.
     
  49. tritium4ever

    tritium4ever Notebook Consultant

    Reputations:
    7
    Messages:
    123
    Likes Received:
    0
    Trophy Points:
    30
    There really isn't much of an advantage to an HDMI output if you're not using it to output protected Blu-ray/HD-DVD content, aside from the smaller cable. DVI will provide equal quality for all intents and purposes, and VGA will most likely be indistinguishable assuming that your cable is of decent quality and of normal length (typically about 6'...15' monster VGA cables can go right into the garbage) and that your environment isn't abnormally saturated with all sorts of unusual EMI sources.
     
  50. Zetto

    Zetto Notebook Deity

    Reputations:
    71
    Messages:
    771
    Likes Received:
    0
    Trophy Points:
    0
    I dunno, 1920x1200 over DVI looks quite a bit sharper to me than the same res over a VGA connection. Just IMO.
     