I'm curious as to what the majority of gamers out there go for. If you're strapped for hardware, do you go for a higher resolution and low detail, or a lower resolution with all the detail?
Personally, I always stick to my native (1680x1050) and work my details around that. Anything lower than my native looks very blocky and stretched improperly.
-
Alienware-James Company Representative
-
I go for lower res and high details. Somehow I feel like I'm missing out when I disable all the details and particle effects and whatnot. I'll even go as far as 800x600 if I have to, but high resolution and AA never really mattered to me.
-
I go for a little bit of both. My native is 1920x1200, so I'll go to something like 1680x1050 and put medium to high detail. CoD4, for example: at that resolution I can max out detail and AA and get 90 frames (and that's with ONE 7950... can't wait to put in TWO 9800s).
-
I go for detail over resolution... I mean, if you've got maxed res and rubbish GFX, you lose the feel of the game.
Just for the record, when I'm not gaming I stick to max res: 1440x900 -
I go with low but same aspect-ratio resolutions, and higher detail.
-
It depends on the computer. On my desktop I push everything to native resolution and as many details as I can, simply because it doesn't matter if I turn the resolution down. Crysis gains maybe 4 FPS going from 1680x1050 down to 1024x640, but the framerate doubles going from DX10 Very High to DX9 High. 640MB of VRAM on a 320-bit bus but only 96 SPs = resolution-independent performance.
On my ThinkPad, it's the opposite. 1024x640 or so is playable at medium to high in most games. 1440x900 and 1280x800 are basically unplayable at any detail settings in modern games - 128MB of VRAM on a 64-bit bus doesn't do any favors for resolution.
On my alternate desktop, I only have a 1024x768 15" LCD hooked up, so it doesn't matter - anything will run at 1024x768 on a 7600GT. On my Gateway, no modern game will run at all, except UT3, and that requires me to run it at 640x400 with all minimum details; i.e., as low as it will go.
Note that I try to keep the aspect ratio no matter what. -
Higher detail, lower resolution.
-
Matthewrs_Rahl Notebook Consultant
Agreed with the majority here.
I'm an ex-hardcore gamer (10+ hours a day). Nothing beats high resolutions, with the exception of high detail settings!
I always took the hit since my hardware was never great. I NEVER played above 1024x768 (but never lower than 800x600). I agree you should keep the same aspect ratio, too. It just works out better this way.
I determined my res based on the Frames Per Seconds (FPS) and quality settings I wanted.
All my quality settings needed to be med/high (with the exception of smoke, which I usually kept low). All my frame rates needed to be a MINIMUM of a steady 60, preferably 80. From there, I would slowly crank up the resolution until I found my frame rates dropping. I used to play on some bad CRTs that only supported 60Hz in the past, so it was easy to determine my cut-off point (the moment you drop below 60, you simply go back a step in resolution). Anyhow, that is just my opinion. -
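That step-up-until-it-drops procedure can be written down as a tiny loop. This is just an illustrative sketch, not anyone's actual tool: the resolution list, the target FPS, and the `benchmark_fps` callable (a stand-in for running the game's benchmark at each resolution) are all made up for the example.

```python
# Sketch of the tuning procedure described above: lock in your quality
# settings, then step the resolution up until measured FPS falls below
# the target, and keep the last resolution that still held it.

RESOLUTIONS = ["800x600", "1024x768", "1152x864", "1280x960"]  # same 4:3 aspect ratio
TARGET_FPS = 60

def pick_resolution(benchmark_fps, resolutions=RESOLUTIONS, target=TARGET_FPS):
    best = resolutions[0]  # the lowest res is the fallback
    for res in resolutions:
        if benchmark_fps(res) >= target:
            best = res      # this res still holds the target; try one step up
        else:
            break           # dropped below target: go back a step and stop
    return best

# Example with made-up benchmark numbers:
fake_results = {"800x600": 95, "1024x768": 72, "1152x864": 55, "1280x960": 40}
print(pick_resolution(fake_results.get))  # prints 1024x768
```

With the fake numbers above, 1152x864 is the first step that falls under 60, so the loop settles on 1024x768, exactly the "go back a step" rule from the post.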
-
That's interesting... I mean, my card also keeps VERY cool, below 60; the D901C on max fans cools amazingly. My processor is an E6700... it's not really a laptop, it's a desktop/laptop hybrid if you will. But that 8800 is a newer generation than mine as well... Try setting "Dual Video Cards" to "yes" even though you only have one video card. A lot of people have been finding that trick very helpful!!
-
Thanks Doodles, and good point, that Blackhawk has got some beastly desktop hardware!
-
Because my display looks blurry at anything other than the native resolution, I tend to keep to a high(ish) resolution with reduced details. Works for me, so I'm happy.
-
I prefer native resolution over detail. For example, when I run Crysis at 1440x900 on my M1530 laptop, it looks awesome. It's so much blurrier when I try any other resolution.
-
1440x900, all low, I presume, as the 8600M GT can't run it much better than that.
*Edit* I do see the crazy overclock, so maybe a chance at low/medium. -
Higher res, lower detail. The only details that really matter are smoke and "viewing distance"; those usually get maxed out. Smoke because I can usually see through it better if it's drawn out correctly. Viewing distance is a no-brainer: it helps me shoot those cursed snipers.
-
KGann try the trick doodles said, and you'll be able to apply 2xAA
-
Personally, I agree with the native res thing, and will always run native res with lower details if possible, but it just isn't possible to run Crysis at 1440x900 on an 8600M GT with smooth gameplay during combat, even on low, irrespective of the overclock. -
I will try that when I load up CoD4. Wonder why it likes "Dual-Video Cards" on?
And Icaru, I was kind of thinking that too. -
For me it depends on the genre and the quality of a game's lower settings. I couldn't imagine playing Sins of a Solar Empire at a low resolution, but Crysis on low detail settings isn't even worth playing.
-
Personally I aim for native resolution first, then worry about the other things. I like my games to look crisp and sharp, even if the textures are poor.
-
I usually will set resolution first, then sort out the detail settings later.
-
someguyoverthere Notebook Evangelist
Right now I have a GeForce 5200, so I basically put everything on lowest and pray.
-
Wow, that's some old-school gaming, back in the GeForce FX days. (I remember being so proud of my FX5500 back in the day...)
-
1280x720 w/2AA @ med/high (Crysis, COD4 Multiplayer), and 1680x1050 w/2AA/4AA @ high (COD4 Singleplayer, older games). Other games fall somewhere in between.
-
Res for me as a competitive gamer is more important than detail level. I like to be able to aim precisely and see more.
-
Agreed, Protomenace. I have seen the 8600M GT in action, and owned one (though mine was defective, so I never really got to use it). They are great-performing cards and dominate the mid-range/performance segment, but 1280x720 med/high is unrealistic. Throw in AA, and you're just being absurd. With an 8800M GTS I run it at 1280x800, maxed, and get around 30FPS, so I can't believe the 8600M GT is that close. I also haven't started tweaking Crysis, which everyone has recommended. Once I get to tweaking, I should have no problem running at native.
-
You may have assumed I was getting ≥25FPS; however, Crysis (never tweaked) at the mentioned settings runs at 10-20FPS on my G1S. I've beaten it twice (Hard/Delta) with lower settings, so now I play for "eye candy" at the mentioned settings.
-
I tried to put Crysis on high settings at native res once... Let's just say I would have had a better experience browsing the screenshots section at IGN. Then, because I was particularly angry at my GPU, I went to very high. And that's the story of how I had to hold down my laptop's power button to shut it down for the first time.
-
Well, I guess since it wasn't playable at those settings, that makes sense. Most people don't post their settings at unplayable framerates.
-
I disagree with most of you.
The first thing I do with any game is set the Textures as high as my VRAM will allow.
Then a moderate resolution (Usually 1280*1024).
Then I make sure each of the other enhancements are on at least medium.
Personally, I don't see why someone would want to run a game like Crysis at 1600*900 on medium/low when they could run it maxed out at 1280*1024... the shaders, physics, and particle effects are what make it special, not 'crisp details'; any game can have 'crisp details'. Just my opinion. -
-
Hmm, don't think I could ever find 15FPS playable... but I guess I'm weird.
-
Matthewrs_Rahl Notebook Consultant
15fps. Good gawd. How can you stand it? Admittedly, I get a bit obsessive about FPS, but anything below a constant 40fps and I refuse to play it. It will drive me bonkers.
I aim for 80 but settle at 60 (because that is often the max refresh rate on many screens I've played with). -
Traditionally I've gone for detail first and resolution... well, often I haven't even cared about resolution. But I've also traditionally used CRTs, so low resolution wasn't as noticeable as on an LCD. So I used a nice mix of 640x480, 832x624, and 1024x768. I'm not sure I ever went above that even though my CRT monitor supported 1280x960 - I just set details higher if anything.
Now I'm generally able to get at least high settings at native resolution on all the games I have, so it's not so much an issue; the one time I couldn't (COD4), I compromised on both and went 1024x768 at medium settings. It's nice being able to run stuff near maximum for a change! -
ViciousXUSMC Master Viking NBR Reviewer
I find that hardware scaling these days is great, and I can barely tell the difference between a lower resolution and a higher one in a game as long as it's the same aspect ratio. But boy, you sure do feel the performance hit for that higher resolution!
So details always > resolution for me. The image is a bit sharper at full native, but that's all; the effects are much more than a sharp image. It's the lighting, the AF, the shadows, the particles. I think you would have to be blind to prefer resolution over details.
I finally get both, though, with my new setup: a quad at 3.6GHz and two 4850s have been able to run every game maxed out at 1920x1080 for me except Crysis, which only runs on High at a good frame rate, but I tried Very High and it looks the same anyway. -
I max out details first. Then I find the max res those settings will allow. If I don't have an acceptable resolution (anything lower than 1024x768), I start sacrificing details to bump up the resolution: I lower textures to no lower than medium and then start cutting out the shadows. I'm not happy gaming if I can't at least have things at medium at 1024x768.
-
I try to play at my native resolution (1680x1050) whenever possible; I crank everything up and disable the AA.
-
It all depends on the game and what I'm playing it on. Laptop LCDs are generally awful scalers. However, newer games are coded better, and I can actually scale (compare UT3 with Oblivion). So I go first for res, then details. Shadows are usually turned down.
When gaming on an external screen, I don't have these issues, as the screen scales so well, that I can use windows at lower resolutions. -
On external monitors you can do 800x600 without noticing much difference or blur (talking about old CRTs), but I don't want to decrease my sperm count, you know xD
-
Always details > resolution.
Unless it's a game like World in Conflict, where changing the resolution has a minimal effect on performance. -
resolution>detail has always been my motto.
-
ViciousXUSMC Master Viking NBR Reviewer
On my laptop it's 1280x800 and then whatever details I can get with good FPS; for my old desktop it was 1280x720; for the new desktop it's 1920x1080, all maxed.
That includes UT3, Zee; it runs at over 60fps with everything maxed at 1080p. -
That should offer some good lookin fraggin.
-
textures high
object detail medium
shadows low
shaders medium
physics high
game effects high
volumetric effects medium
water high
postprocessing high
particles high
sound medium
motion blur high (three quarters)
I use the 'natural mod' and get around 23-30 fps: very playable, and it DOES look awesome on my 1440x900 LG laptop screen. -
I, for one, would love to see you in a firefight with those settings keeping 23-30FPS. Any screenshots? That would probably help the 8600M GT community, as you must have a super PC. (I can only run Crysis at 1280x800 all high; I don't see how you're getting those framerates with almost my settings and half the card.)
And Zee, I wish I had a better CPU. With UT3 at 1440x900, maxed out, I only get around 30FPS.
Your preference when gaming (resolution vs. detail)
Discussion in 'Gaming (Software and Graphics Cards)' started by Alienware-James, Jul 8, 2008.