The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    HDR or Bloom + AA, what's better?

    Discussion in 'Gaming (Software and Graphics Cards)' started by chrusti, Jan 23, 2008.

  1. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    Hands down, what gives Oblivion a better look with better performance?

    HDR, or Bloom + 2x AA?

    What do you guys think?
     
  2. Budding

    Budding Notebook Virtuoso

    Reputations:
    1,686
    Messages:
    3,982
    Likes Received:
    0
    Trophy Points:
    105
    HDR will definitely look better, but Bloom and AA will probably give you more frames per second. I would recommend HDR if your card can handle it.
     
  3. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    Depends on the resolution you are running at. If you are playing at native resolution, there is no point to AA. Just use HDR.

    If you need more FPS, bloom+AA is definitely nice.
     
  4. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    I see, I was thinking that at least in dungeons Bloom + AA would look better :eek:

    Anyways, thanks for the answers!
     
  5. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    I'd recommend you try the fake HDR mod, which will improve your lighting transitions and extremes while giving you the ability to run AA quickly (see the sketch below).

    http://timeslip.chorrol.com/FakeHDRDesc.html

    Of course HDR+AA is also nice if you can run it smoothly enough.

    Like Greg says, I'd prefer no AA at a higher, native resolution, and then add HDR, higher/better textures, or more AF (I've always preferred higher AF over AA in Oblivion).

    I personally ran texture mods with HDR and at least 4x HQ AF; having played it on many setups, that looks the best IMO.
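    A quick sketch of why a fake HDR approach pairs well with AA: it approximates HDR-style bloom and contrast as a post-process on the ordinary clamped 8-bit frame, so multisampling keeps working (Oblivion's real HDR rendered to float buffers that much of that era's hardware could not multisample). A toy Python/NumPy illustration with made-up constants, not the mod's actual shader code:

```python
import numpy as np

def fake_hdr(ldr, bloom_strength=0.4, contrast=1.3, threshold=0.8):
    """Approximate an HDR 'look' on an ordinary 8-bit (LDR) frame.

    ldr: float array in [0, 1], shape (H, W, 3).
    Real HDR renders the scene into float buffers before tone mapping;
    this fake version only reshapes the already-clamped LDR image.
    """
    # Bright-pass: keep only pixels above the threshold as a cheap bloom source.
    bright = np.clip(ldr - threshold, 0.0, None)
    # Very crude blur: average each pixel with its 4 neighbours a few times.
    blurred = bright.copy()
    for _ in range(4):
        blurred = 0.2 * (blurred
                         + np.roll(blurred, 1, axis=0) + np.roll(blurred, -1, axis=0)
                         + np.roll(blurred, 1, axis=1) + np.roll(blurred, -1, axis=1))
    # Add the glow back and push contrast around mid-grey to exaggerate extremes.
    out = ldr + bloom_strength * blurred
    out = 0.5 + contrast * (out - 0.5)
    return np.clip(out, 0.0, 1.0)

frame = np.random.rand(64, 64, 3)  # stand-in for a rendered LDR frame
print(fake_hdr(frame).shape)
```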
     
  6. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    I don't see where to enable AF in the game. Can you help me?
     
  7. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    Oblivion does not officially support AF. You can force it, however, using the nVidia Control Panel. Most computers can run 8x AF without a problem, as there is barely any performance impact (but there is eye candy to gain).
     
  8. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, you do it through the control panel (same with adding AA on top of HDR).

    I agree with Greg that 8x AF should easily be sufficient. Heck, even 4x HQ AF is dramatically better, and the 2x-to-4x step is almost the most noticeable transition, more than 0-to-2 and even 4-to-8 IMO, because 0-to-2 goes from blurry to still blurry but less so, whereas 2-to-4 really does go from blurry to sharp textures in things like the cobblestones and ground cover (see the sketch after the link below).

    Here's a good look at how the differences change based on AA, AF, resolution, etc.

    http://au.gamespot.com/features/6168650/index.html
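    To put rough numbers on those AF steps: anisotropic filtering spends up to N trilinear taps along the long axis of a pixel's texture footprint, so each doubling of the AF level roughly halves how far down the (blurrier) mip chain the hardware must go for oblique surfaces like floors. A simplified back-of-the-envelope model (real hardware mip selection is more involved):

```python
import math

def mip_level(footprint_major, footprint_minor, max_aniso):
    """Rough mip selection for a screen pixel's texture footprint.

    footprint_major/minor: extents of the pixel's footprint in texels.
    max_aniso: AF level (1 = plain trilinear, 2/4/8/16 = AF taps).
    Simplified model: AF splits the major axis across up to max_aniso
    taps, so the mip is chosen from the per-tap footprint instead of
    the whole (elongated) one.
    """
    ratio = footprint_major / footprint_minor
    taps = min(max_aniso, ratio)
    return max(0.0, math.log2(footprint_major / taps))

# A floor tile viewed at a grazing angle: footprint 16 texels long, 1 wide.
for af in (1, 2, 4, 8):
    print(f"{af}x AF -> mip {mip_level(16, 1, af):.1f}")
# 1x -> mip 4.0 (very blurry), 2x -> 3.0, 4x -> 2.0, 8x -> 1.0
```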
     
  9. PC_pulsar

    PC_pulsar Notebook Evangelist

    Reputations:
    5
    Messages:
    479
    Likes Received:
    0
    Trophy Points:
    30
    The preference for AA or HDR totally depends on the DPI. My DPI isn't very high because I have a 17-inch screen with only a 1440x900 resolution; that's why I prefer AA in Oblivion.
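    For reference, that density figure is easy to compute: PPI = sqrt(width^2 + height^2) / diagonal. A quick check of the screens mentioned in this thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(1440, 900, 17.0):.0f} PPI")   # 17" 1440x900  -> ~100 PPI
print(f"{ppi(1920, 1200, 15.4):.0f} PPI")  # 15.4" WUXGA   -> ~147 PPI
```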
     
  10. Soulburner

    Soulburner Notebook Evangelist

    Reputations:
    51
    Messages:
    399
    Likes Received:
    0
    Trophy Points:
    30
    You know for a "Super Moderator" I am very surprised at this...

    I have to disagree. For one, the resolution you play at on an LCD has very little, if anything, to do with the need for anti-aliasing. It all boils down to pixels per inch. Since you can't increase the resolution of a fixed-pixel display, you cannot increase the pixels per inch. The fact is that AA noticeably improves image quality on all LCDs at or below native resolution. For those running 1920x1200 on a 15.4" screen, the need will be much less (and the performance penalty will be large), but never is there "no point to AA". Since an LCD shows a digital, pixel-perfect image, we don't have the natural AA effect of a CRT to fuzz the edges for us.

    As per the topic, IMO it depends on the game. For example in Oblivion, Bloom + AA is much better because the HDR in that game seems to have been tacked on at the last second. It appears to have been overused with little regard to whether or not it makes sense. Overexposure from a bright spot on someone's face in a tavern? Get real. All it does is blur all the textures together. Dark Messiah? There is a fine example of how to use HDR properly.
     
  11. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    The edge-bleeding/blending of TV CRTs is not present in computer CRTs for this 'natural AA' you talk about, because the phosphor bleed is nowhere near the same, and the electrons are more controlled and evenly spaced thanks to aperture grilles and better masks; on a quality CRT the edge-bleeding is no more than the filter on an LCD. If your CRT is fuzzy, you should learn to adjust it or get a better one.

    The same argument would be made for the 'natural AA' of interpolation, but it would be equally wrong. AA is not blurring! And because of those very same interpolative properties, and the way AA relies on properly colour-balanced pixels, anything outside of native resolution kills the effectiveness of both AA and AF.

    I leave it up to the individual which one looks better, but your claim that there is never 'no point to AA' is equally personal, because an image that is pixel perfect would be degraded by the artificial addition of AA (see the sketch below). It's like boosting bass on music: lots of people like distortion, so some people may prefer a muddier AA'ed image, but it doesn't improve IQ on all LCDs like you contend.

    Perhaps you should just leave it as 'to each their own' and not worry about who's a "Super Mod" etc.
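    The distinction argued here can be shown on a toy diagonal edge: supersampling averages extra samples of the true scene inside each pixel (recovering real coverage information), while a blur only averages already-finished neighbouring pixels. A rough 1-D sketch, with a made-up edge function:

```python
import numpy as np

# A "scene": white above a diagonal edge y = 0.37 * x, black below.
def scene(x, y):
    return 1.0 if y > 0.37 * x else 0.0

def render(width, samples_per_axis):
    """Render one row of pixels at y=3, taking samples^2 samples per pixel."""
    row = []
    s = samples_per_axis
    for px in range(width):
        vals = [scene(px + (i + 0.5) / s, 3.0 + (j + 0.5) / s)
                for i in range(s) for j in range(s)]
        row.append(sum(vals) / len(vals))
    return np.array(row)

aliased = render(10, 1)  # 1 sample/pixel: hard 0/1 staircase
ssaa = render(10, 4)     # 4x4 supersampling: true partial edge coverage
blurred = np.convolve(aliased, [0.25, 0.5, 0.25], mode="same")  # post blur

print("no AA :", aliased.round(2))
print("4x4 SS:", ssaa.round(2))
print("blur  :", blurred.round(2))
# SSAA's intermediate values reflect true edge coverage; the blur just
# averages finished pixels, and (with this zero-padded kernel) even darkens
# the border pixel that was already correct.
```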
     
  12. ShinAkuma135

    ShinAkuma135 The King of Beasts

    Reputations:
    88
    Messages:
    1,618
    Likes Received:
    0
    Trophy Points:
    55
    Having spent 23/24 hours playing Oblivion... I can honestly say that HDR with forced AF looks fantastic. If you have a bigger monitor with a 1680+ res, it's even better.
     
  13. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    What looks better? Red or blue?
    HDR or Bloom? They do different things.
    It's apples and oranges. Try both and see what you like best.

    HDR is the most advanced effect, if that's what you're asking. But it may or may not look good (as said above, Oblivion's HDR is just silly. You can literally get blinded by people's faces in character creation)
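    Since "they do different things" is the crux: bloom clamps the frame to the displayable range first and then adds a glow around bright spots, while HDR keeps full-range values and tone-maps them down, so very bright things remain distinguishable and exposure can adapt. A toy sketch of the two pipelines, assuming a simple Reinhard-style tone map:

```python
import numpy as np

# Scene radiance in arbitrary linear units; the sun at 50 is far over 1.0.
radiance = np.array([0.02, 0.3, 0.9, 5.0, 50.0])

# Bloom pipeline: clamp to the displayable [0, 1] range first (information
# about *how* bright the sun is, is lost), then add a glow from bright pixels.
ldr = np.clip(radiance, 0.0, 1.0)
glow = 0.3 * np.convolve(np.where(ldr > 0.8, ldr, 0.0),
                         [0.25, 0.5, 0.25], mode="same")
bloom_out = np.clip(ldr + glow, 0.0, 1.0)

# HDR pipeline: keep full-range values and tone-map with exposure control,
# so 5.0 and 50.0 still map to *different* display values.
exposure = 0.5
x = radiance * exposure
hdr_out = x / (1.0 + x)  # Reinhard tone mapping

print("bloom:", bloom_out.round(3))  # 5.0 and 50.0 both saturate at 1.0
print("hdr  :", hdr_out.round(3))    # bright values stay distinguishable
```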
     
  14. Soulburner

    Soulburner Notebook Evangelist

    Reputations:
    51
    Messages:
    399
    Likes Received:
    0
    Trophy Points:
    30
    A CRT will not have as sharp an image as an LCD because you are using an analog signal. It just can't happen. Some high-end CRTs can come close, but you can't compare a fixed-pixel digital design to an analog tube. Many CRTs, whether they are TVs or computer monitors, have less of a need for AA than an LCD. Why? Because the LCD will expose flaws in the image much more easily. It's simply the nature of the technology. I've used enough of both types of screens to know the strengths/weaknesses of each.

    Compare it to mp3s - if you already have loss at the source, you can't hope to have the same end result.

    And I don't understand your "muddied image" point. I have never seen properly-implemented AA muddy an image, unless we are talking about the early GeForce or Radeon days from 6+ years ago.
     
  15. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    C'mon talk about over-generalization. :rolleyes:

    That would be like saying CD is better than vinyl and leaving it at that. You do realize that some LCDs have VGA input, so in that case you have a minimum of two conversions (D->A->D) versus only one from the RAMDAC. Wanna compare a single-link DVI to a BNC connection?

    Conditions matter a lot. What's the quality of the TMDSs, the quality of the DAC, and the transistor type and coating of the monitor? If the pixel pitch of your LCD is huge, like those of the 27" DELLs or older 17" models, then I bet you my 21" professional IBM P260 in front of me right now is much sharper on its shielded DVI-A connection running at 1920x1440. This monitor is carefully adjusted and calibrated, not used for the first 15 mins every day while it warms up (glad to have a Tim Hortons in the building), and turned off every night to ensure it remains at its best. It rivals most LCDs in sharpness, and definitely bests them in colour gradations, even the LED-lit ones, which matters for me and would matter when talking about AA.

    Obviously not; that "it's simply the nature of the technology" statement goes back to that same misconception above about vinyl and CDs, comparing things as if all LCDs and all CRTs are the same under all conditions.

    Which doesn't matter here; you're confusing the noise factor, as if there is no issue on LCDs regardless of panel and TMDS quality or even cable length, or as if the RAMDACs significantly degrade the image to create 'virtual AA'. You also seem to think all analogue connections have giant noise issues that can't be handled while digital connections are pristine. :rolleyes:

    Actually, I was talking about any version: where 8x AA applied to an interpolated image looks better to some than no AA at native resolution, and where other people prefer one technique or the other, including more recent implementations by both nV and AMD/ATi, things like Temporal AA and Wide/Narrow Tent CFAA, which soften the image, removing quality overall simply to make harsh lines less noticeable (see the sketch below). And that's the idea behind blurring: it's not AA, it's just blur. But like I said, some people prefer that, and I'm getting the feeling, based on your statements, that you are one of those people who can't see where AA wouldn't improve an image.

    Like I said, that's your preference, but it's not the fact you seem to think it is.

    Just like the OP asking for opinions about HDR or Bloom+AA, most of the time it comes down to personal preference.
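    For reference on those tent modes: a box resolve averages only the samples belonging to a pixel, while a tent resolve also weights in samples from neighbouring pixels, which is exactly why it smooths edges and softens everything else. A 1-D sketch with illustrative weights (not ATi's actual CFAA kernel):

```python
import numpy as np

pixels = np.array([0.0, 0.0, 1.0, 1.0, 1.0])  # one resolved value per pixel

def tent_resolve(vals, width):
    """Weight each pixel with its neighbours using a tent of the given width.

    width 0   -> plain box (pixel keeps its own value)
    width 0.5 -> 'narrow tent'-like, width 1.0 -> 'wide tent'-like.
    """
    w_side = width / (1.0 + 2.0 * width)   # neighbour weight
    w_mid = 1.0 - 2.0 * w_side             # centre weight
    padded = np.pad(vals, 1, mode="edge")  # repeat border values
    return np.convolve(padded, [w_side, w_mid, w_side], mode="valid")

print("box   :", tent_resolve(pixels, 0.0))
print("narrow:", tent_resolve(pixels, 0.5).round(3))
print("wide  :", tent_resolve(pixels, 1.0).round(3))
# The wider the tent, the further the edge's influence spreads into
# neighbouring pixels: the stair-step is smoothed, but legitimate detail
# near the edge is softened too.
```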
     
  16. Soulburner

    Soulburner Notebook Evangelist

    Reputations:
    51
    Messages:
    399
    Likes Received:
    0
    Trophy Points:
    30
    Interpolated? I really doubt that term applies to most PC gamers.

    Anyway, I was talking about CRTs in general, not high-end models, like I stated. Of course, if you shell out the big bucks, you will have something better than 90% of the CRT users out there.

    Still, this doesn't change the fact that Oblivion has a poor implementation of HDR. Smearing blotches of light in random places doesn't make a game look good. I like the game a lot, but I leave HDR off. Others may object, but try out some other games that have HDR, namely Source engine games such as HL2: E2 or Dark Messiah, and you will see the difference.