The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    HDMI Refresh Rate Issue

    Discussion in 'Gaming (Software and Graphics Cards)' started by nu_D, Jan 29, 2008.

  1. nu_D

    nu_D Notebook Deity

    Reputations:
    741
    Messages:
    1,577
    Likes Received:
    1
    Trophy Points:
    55
    I have an 8400GS in my HP DV2500. I am outputting the HDMI signal to a 40in Panasonic 1080p Plasma Display.

    I have tried output resolutions from 480p all the way up to 1080p, but the highest refresh rate the NVIDIA Control Panel will let me select is 60 Hz.

    Is this the norm for an HDMI signal? It seems a little low to me. Is there any way I can increase the refresh rate to 100 Hz?

    Thanks.
     
  2. nic.

    nic. Notebook Evangelist

    Reputations:
    97
    Messages:
    649
    Likes Received:
    0
    Trophy Points:
    30
    Not sure, but most HDTVs I see out there only have a 50 Hz refresh rate, while some newer ones have 100 Hz, at least in my country.

    Don't really know this notebook-to-TV stuff... good luck anyway.
     
  3. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    MOST 1080p sets have a refresh rate of 60 Hz. Yes, this is a standard. Only very few 1080p TVs can run any higher, and those that do usually run at 120 Hz.

    It's normal, just make sure that it's NOT running any lower; if it's running at even 59 Hz, your signal will be 1080i, which will be blurry on your MUCH nicer 1080p TV. For reference, 1080i is 1440x1080 (the horizontal pixels are interpolated) and 1080i can only display 29.97 frames per second.
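    The arithmetic behind the frame-rate figure is easy to check: an interlaced signal delivers two half-height fields per full frame, so the effective frame rate is half the field rate. A minimal sketch (the 59.94 Hz field rate is the standard NTSC-derived value, not a number from this thread):

```python
# Interlaced video sends two half-height fields (odd lines, then even lines)
# per full frame, so the effective frame rate is half the field rate.
def frames_per_second(field_rate_hz, interlaced):
    return field_rate_hz / 2.0 if interlaced else field_rate_hz

print(frames_per_second(59.94, interlaced=True))    # 1080i: ~29.97 full frames/s
print(frames_per_second(60.0, interlaced=False))    # 1080p: 60 full frames/s
```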

    Btw, I know this because I own a 1080p TV myself, a 52", and know quite a bit about them.

    [Edit] Also, this doesn't look like a gaming hardware or software question, IMO.
     
  4. nu_D

    nu_D Notebook Deity

    Reputations:
    741
    Messages:
    1,577
    Likes Received:
    1
    Trophy Points:
    55
    Thanks for the help, guess it makes sense then. :)

    I would like to kindly disagree with your opinion.

    I looked through the options:

    What Notebook Should I buy? -----------> Nope.
    Hardware Components and Aftermarket Upgrades---------> Maybe.
    Accessories--------------> Nope
    Gaming (Software and Graphics Cards)-------> Maybe.
    Windows OS and Software---------Nope.
    .
    .
    .
    .
    You get the point. Basically, I chose this particular section because it said GRAPHICS CARDS. Now, as far as I can tell, my graphics card is outputting the signal to my television. No matter how you slice it, my graphics card is dealing with the issue. If you look at my question, it's: how do I increase the refresh rate of my card?

    Where do you want me to put it? I don't know. I looked, I saw graphics cards... my issue deals with a graphics card. If you want me to be reprimanded and sent to Guantanamo Bay, that's cool.

    But I do ask, where else should I have put it then?
     
  5. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    Nah, it's fine, I didn't really care; I never do. Post it wherever you wish. I was under the assumption that there was a better section for this, and I could swear there was, but never mind that. Don't go all out because of my mistake now! ;)
     
  6. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    The refresh rate will not change the format of the image; you can display a 1080p image at 59 frames per second. What becomes an issue is how the TV/monitor/display handles it. Usually each TV has presets on how to handle signals, so if it receives one it's unfamiliar with, it will likely revert to the preset closest to what it thinks will match. Obviously for your TV/monitor, that is to revert to however it handles 1080i, either dithered or interpolated for non-1080i monitors.

    Most HDTVs you see that say '120 Hz' aren't actually anywhere near capable of displaying at 120 Hz. The reason for that number (which is usually well beyond the pixel response BWB-WBW range) is that it allows support of the current major cadences for media, with 24, 30, and 60 all dividing evenly into 120. The only ones not fitting in well are the now pretty much outdated 16 fps, and the PAL rates of 25/50/100 Hz, which still aren't gaining as much traction as a single 30/60 standard for HD.
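    The divisibility claim is quick to verify: a cadence fits a 120 Hz panel only if each source frame can be shown for a whole number of refreshes. A small sketch:

```python
# Which source cadences map to a whole number of refreshes on a 120 Hz panel?
PANEL_HZ = 120
for fps in (16, 24, 25, 30, 50, 60):
    refreshes = PANEL_HZ / fps
    verdict = "even" if PANEL_HZ % fps == 0 else "uneven"
    print(f"{fps:>2} fps -> {refreshes} refreshes per frame ({verdict})")
```

Running it shows 24, 30, and 60 fps dividing cleanly (5, 4, and 2 refreshes per frame), while 16, 25, and 50 fps land on fractional values and therefore need uneven pulldown.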

    Actually no, 1080i is 1920x1080 just like 1080p; however, some companies cut corners when either displaying the image or when recording/delivering it, which is where the crippled 1440x1080 standard comes from. A true 1080i monitor will display two 1920x540 fields interlaced, giving you the full 1920x1080 frame. So 1920x1080 is the reference standard; it's the hardware that cuts it down.