I'd guess that the first time I asked this on this forum it didn't get an answer because it either got lost in the conversation about the laptop, or was just too dumb a question to answer?
Still, I'll try it again... with fingers crossed for an answer.
If I buy a notebook with the Radeon 6770M 1 GB card in it, and the laptop's maximum internal screen resolution is 1600x900 (or even 1366x768), will the laptop output a 1920x1080 signal to an external monitor?
According to AMD, the maximum resolution of the card depends on the connection, as seen here:
BUT, do laptop manufacturers somehow limit the external resolution of a laptop to the maximum resolution of the laptop's screen?
Thank you very much for any and all help you can offer!!!
-
The laptop's LCD has nothing to do with the video card's maximum resolution. Yes, it can easily output a 1080p signal.
-
Guessing you are an HT nut (based on your ID): do you use an HTPC in your HT?
-
Don't forget that refresh rate also matters. I think both VGA and HDMI can output 1080p at 60 Hz, but at 120 Hz (a true 120 Hz signal coming from YOUR GPU, not 120 Hz through interpolation) I know VGA can't do it, but HDMI might be able to.
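For a rough sense of why 120 Hz strains these links, you can estimate the pixel clock a mode needs. This is just a back-of-envelope sketch: the ~20% blanking overhead factor is an assumption (real CVT/CEA timings vary), but the link ceilings are real (165 MHz for HDMI 1.2/single-link DVI, up to a 340 MHz TMDS clock for HDMI 1.3):

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
    """Approximate pixel clock in MHz for a given display mode.

    The blanking_overhead factor is an assumed ~20% allowance for
    horizontal/vertical blanking; exact values depend on the timing
    standard (CVT, CVT-RB, CEA-861) actually used.
    """
    return width * height * refresh_hz * blanking_overhead / 1e6

# 1080p at 60 Hz vs 120 Hz
clk_60 = pixel_clock_mhz(1920, 1080, 60)    # ~149 MHz
clk_120 = pixel_clock_mhz(1920, 1080, 120)  # ~299 MHz

print(f"1080p60 needs ~{clk_60:.0f} MHz, 1080p120 needs ~{clk_120:.0f} MHz")
```

So 1080p60 fits comfortably under the 165 MHz ceiling of older links, while 1080p120 needs roughly double that, which only an HDMI 1.3-class connection (or dual-link DVI/DisplayPort) can carry.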
-
Since when do you need more than 60 Hz on an LCD monitor/TV? ... heck, even 50 works fine.
-
tilleroftheearth Wisdom listens quietly...
VGA (I'm pretty sure) can do 240 MHz (at least it could once upon a time...).
The reason you want more than 60 Hz is the same reason audio is oversampled and (usually) sounds better: less visual "digititis", much smoother motion.
-
- First, I assume you made a mistake writing 240 MHz instead of 240 Hz.
- Second, humans can interpret motion at a rate of about 46 Hz max, i.e. anything that changes faster than 46 times per second is not entirely visible to us.
- Third, the 50 and 60 Hz rates you see as common came from the film industry, which displayed at 24 frames per second in the beginning, as this was considered enough for us to perceive continuous motion. TV followed, and its signal was transmitted interlaced due to technology limitations: to produce one frame, the beam had to sweep twice from the top left to the bottom right of the screen, so at 50 Hz the TV was essentially outputting 25 full frames per second. The extra frame (from 24 to 25) actually caused a lot of noise about possibly influencing the human subconscious to do/want stuff, which some TV commercials exploited, but that's a different story. TV signal technology is heavily tied to the frequency of the power grid, so in Europe that is 50 Hz and in the US 60 Hz, and those became the standard refresh rates.
- Fourth, with CRT monitors (which succeeded the TV) it made sense to go above 60 Hz (still interlaced; non-interlaced came later), because a CRT displays only one pixel at a time, traced by the single beam emitted from the back of the tube, and with people sitting closer than they do to a TV, they could notice what was going on. LCD monitors, on the other hand, keep all their pixels displayed at the same time and only need to be refreshed so that the displayed frame changes.
- Fifth, don't confuse the eye interpreting motion with sensing flicker; those are two different things. You can sense flicker without understanding what's going on. On a CRT monitor at 60 Hz refresh, for example, you can sense flicker if you look away from the monitor while still trying to see what's displayed. Some people could notice flicker even when looking straight at the monitor. So manufacturers started increasing the refresh rate, and it was found that at about 85 Hz and up humans no longer sense flicker at all. None of this affects LCD monitors, as I explained a point ago.
- Sixth, 60 Hz is plenty of refresh rate for an LCD monitor, considering the ~46 Hz limit of human vision. 85 Hz is plenty for a CRT monitor so that people don't even sense flicker.
- Seventh, if I start a company making TVs, I'll advertise my new 300 Hz TVs that many people will consider superior and therefore buy. A couple of years later I'll roll out new 360 Hz TVs ..... LOL.
- Eighth, I'm going to go make myself a sandwich, as I think I've explained enough.
P.S. Don't get me started on the audio.
-
tilleroftheearth Wisdom listens quietly...
miro,
I feel so small (as an ant in NY!).
Yeah, I meant 240 Hz - the rest of the stuff you 'explain' is irrelevant to what some people can be very much tuned to (no matter what the 'specs' say about human vision).
Audio?
Yeah, let's not start...
Mmmm! Sandwich!
Laptop Graphic Card External Resolutions? 2nd try for Help, please?
Discussion in 'Hardware Components and Aftermarket Upgrades' started by govtdog, Jan 5, 2012.