Hands down, what gives Oblivion a better look with better performance?
HDR, or Bloom + AA?
What do you guys think?
-
HDR will definitely look better, but Bloom and AA will probably give you more frames per second. I would recommend HDR if your card can handle it.
-
Depends on the resolution you are running at. If you are playing at native resolution, there is no point to AA. Just use HDR.
If you need more FPS, bloom+AA is definitely nice. -
I see, I was thinking that at least in dungeons bloom+aa would look better
Anyways, thanks for the answers! -
TheGreatGrapeApe Notebook Evangelist
I'd recommend you try the fake HDR mod, which will improve your lighting transitions and extremes while giving you the ability to run AA quickly.
http://timeslip.chorrol.com/FakeHDRDesc.html
Of course HDR+AA is also nice if you can run it smoothly enough.
Like Greg says, I'd prefer no AA at a higher, more native resolution, and then adding HDR, higher/better textures, or more AF (I've always preferred higher AF over AA in Oblivion).
I personally ran texture mods with HDR and at least 4X HQ AF; that looks the best IMO, having played it on many setups. -
I don't see where to enable AF in the game. Can you help me?
-
Oblivion does not officially support AF. You can force it, however, using the nVidia Control Panel. Most computers can run 8xAF without a problem, as there is barely any performance impact (but there is eye candy to gain).
-
TheGreatGrapeApe Notebook Evangelist
Yeah, you do it through the control panel (same with adding AA on top of HDR).
I agree with Greg that 8X AF should easily be sufficient; heck, even 4X HQ AF is dramatically better and almost the most noticeable step, more than 0X-2X and even 4X-8X IMO, because 0X-2X goes from blurry to still blurry but less so, whereas 2X-4X really does go from blurry to sharp textures on things like the cobblestones and ground cover.
Here's a good look at how the differences change based on AA, AF, resolution, etc.
http://au.gamespot.com/features/6168650/index.html -
The preference for AA or HDR totally depends on the DPI. My DPI isn't very high because I have a 17-inch screen with only 1440x900 resolution; that's why I prefer AA in Oblivion.
-
I have to disagree. For one, the resolution you play at on an LCD has very little, if anything, to do with the need for anti-aliasing. It all boils down to pixels per inch. Since you can't increase the resolution of a fixed-pixel display, you cannot increase the pixels per inch. The fact is that AA noticeably improves image quality on all LCDs at or below native resolution. For those running 1920x1200 on a 15.4" screen, the need will be much less (and the performance penalty will be large), but never is there "no point to AA". Since an LCD shows a digital, pixel-perfect image, we don't have the natural AA effect of a CRT to fuzz the edges for us.
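To put rough numbers on the pixels-per-inch point, here's a quick back-of-the-envelope sketch in Python (the diagonals are just the nominal panel sizes mentioned in this thread, so treat the figures as approximate):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density = diagonal resolution in pixels / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The 17" 1440x900 panel from the post above vs. a 15.4" 1920x1200 panel
print(round(ppi(1440, 900, 17.0)))   # ~100 PPI: jaggies are fairly visible, AA helps a lot
print(round(ppi(1920, 1200, 15.4)))  # ~147 PPI: jaggies are smaller, AA buys less per frame lost
```

The higher the density, the physically smaller each aliased edge step is, which is why the 15.4" WUXGA screens need AA less; the need just never drops all the way to zero.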
As per the topic, IMO it depends on the game. For example in Oblivion, Bloom + AA is much better because the HDR in that game seems to have been tacked on at the last second. It appears to have been overused with little regard to whether or not it makes sense. Overexposure from a bright spot on someone's face in a tavern? Get real. All it does is blur all the textures together. Dark Messiah? There is a fine example of how to use HDR properly. -
TheGreatGrapeApe Notebook Evangelist
The same argument would be made for the 'natural AA' of interpolation, but it would be equally wrong. AA is not blurring! And because of those very same interpolative properties, and the way AA works by relying on properly colour-balanced pixels, anything outside of native resolution kills the effectiveness of both AA and AF.
I leave it up to the individual which one looks better, but your claim that there is never 'no point to AA' is equally personal, because an image that is pixel perfect would be degraded by the artificial addition of AA. It's like boosting the bass on music: lots of people like the distortion, so some people may prefer a muddier AA'ed image, but it doesn't improve IQ on all LCDs like you contend.
Perhaps you should just leave it as 'to each their own' and not worry about who's a "Super Mod" etc. -
Having spent 23/24 hours playing Oblivion... I can honestly say that HDR with forced AF looks fantastic. If you have a bigger monitor with 1680+ res, it's even better.
-
What looks better? Red or blue?
HDR or Bloom? They do different things.
It's apples and oranges. Try both and see what you like best.
HDR is the most advanced effect, if that's what you're asking. But it may or may not look good (as said above, Oblivion's HDR is just silly; you can literally get blinded by people's faces in char creation). -
Compare it to mp3s - if you already have loss at the source, you can't hope to have the same end result.
And I don't understand your "muddied image" point. I have never seen properly-implemented AA muddy an image, unless we are talking about the early GeForce or Radeon days from 6+ years ago. -
TheGreatGrapeApe Notebook Evangelist
That would be like saying CD is better than vinyl and leaving it at that. You do realize that some LCDs have VGA input, so in that case you have a minimum of two conversions (D->A->D) versus only one from the RAMDACs. Wanna compare a single link to a BNC connection?
Conditions matter a lot. What's the quality of the TMDSs, the quality of the DAC, and the transistor type and coating of the monitor? If the pixel pitch of your LCD is huge, like those of the 27" DELLs or older 17" models, then I bet you my 21" professional IBM P260 in front of me right now is much sharper on its shielded DVI-A connection running at 1920x1440. This monitor is carefully adjusted and calibrated, not used for the first 15 mins every day while it warms up (glad to have a Tim Hortons in the building), and turned off every night to ensure it remains at its best. It rivals most LCDs in sharpness, and definitely bests them in colour gradations, even the LED-lit ones, which matters for me and would matter when talking about AA.
Like I said, that's your preference, but it's not the fact you seem to think it is.
Just like the OP asking for opinions about HDR or Bloom+AA, most of the time it comes down to personal preference. -
Interpolated? I really doubt that term applies to most PC gamers.
Anyway, I was talking about CRTs in general, not high-end models, like I stated.
Still this doesn't change the fact that Oblivion has a poor implementation of HDR. Smearing blotches of light in random places doesn't make a game look good. I like the game a lot, but I leave HDR off. Others may object, but try out some other games that have HDR, namely Source engine games such as HL2: E2 or Dark Messiah and you will see the difference.