Obviously a lot of us are obsessed with maxing out games, but really, how important is it to have max settings versus the perfect frame rate?
For example, I was playing Crysis 2 with the DX11 high-res textures at Ultra, and I lowered it to Extreme, played for a bit, and couldn't see any difference between Ultra and Extreme (except for the frame rate boost with Extreme).
I always try to max games by reflex, but often when I'm involved in a game I'm not really taking note of the graphics. I have always preferred to play maxed at 30-45fps rather than lower settings to get 60fps locked, so I wanted to see how others feel about this.
-
Depends on the game. If it's a single player game usually I'm fine with 40+. If it's a multiplayer FPS like BF3, then it's gotta be 60... Though I don't always get what I want with my setup.
-
It really depends on the game. Does it feel fluid to you at 30fps? Does 30fps put you at a disadvantage in multiplayer? If the answers to those two questions are yes and no, then max the settings to your heart's content. I cannot stand something that feels choppy, so I set the settings accordingly.
-
120fps is the new 60fps... believe it.
-
I'm an FPS guy. I like effects, but if they hinder my gaming, I will turn them down. Most of my tweaks aren't even noticeable while gaming; I'd have to film both settings and compare them to see the difference, especially on a smaller screen.
-
For me it depends on the game, like Tijo mentioned. 30FPS in Crysis 1 felt smooth to me; 30FPS in Assassin's Creed was unplayable (for me).
-
I don't know why people are so fussed. I used to play Crysis on my 3650M (still do), and now Shogun 2 on medium settings at around 20 frames, and I'm happy with it.
-
The console versions of Crysis 2 drop to 20fps in places, but most console players probably don't even know or care...
-
There is no difference beyond what your eyes can discern (around 60fps, depending on who you ask). The reason 100+ feels smoother is that your framerate fluctuates during play, so even when there is a lull at a particularly demanding section, it doesn't dip below that discernible level. That is why averaging 100+fps feels so smooth... and it's also why I feel averaging 60fps isn't always enough, even though I don't think anything above 60fps matters. When I average in the danger zone of 60fps or below, it means that in some situations the FPS might actually drop to a rate where my eyes can tell.
I have really stopped caring about average FPS and only care about the minimum now. If my minimum is always above 30fps I am generally happy. I am on an older computer, an i7-920 @ 3.6 with a GTX570 @ 940, and in something like BF3 at 1080p ultra 4xAA it is usually smooth; once in a very long while I might get mid to high 20's, and I find that still acceptable.
As for answering the actual question, I am a sucker for high settings and I will pretty much run them at any cost. If I have to start reducing settings, that is when I look towards new hardware. For my last couple of desktop systems I could expect about 3 years or so of running everything maxed out before something had to be changed. On a notebook maybe 1.5-2 years is more reasonable?
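On that note, the average vs. minimum point is easy to see if you log per-frame times (a tool like Fraps can dump them) and crunch the numbers. A minimal sketch in Python; the frame times below are made-up illustration values, not measurements from anyone in this thread:
```python
# Made-up per-frame render times in milliseconds. The point: a healthy
# average can hide dips that your eyes will actually notice.
frame_times_ms = [10, 11, 12, 10, 35, 40, 11, 10, 12, 38]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
minimum_fps = min(fps_per_frame)
dips_below_30 = sum(1 for fps in fps_per_frame if fps < 30)

print(f"average fps: {average_fps:.1f}")        # ~52.9 - looks fine on paper
print(f"minimum fps: {minimum_fps:.1f}")        # 25.0  - the part you feel
print(f"frames under 30 fps: {dips_below_30}")  # 3
```
A run averaging over 50fps can still spend several frames in the 25fps range, which lines up with caring about the minimum rather than the average.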
-
Didn't find the right option to click, so I'll just write it: max settings, as long as I get 25fps+.
-
There is a noticeable difference between 30 and 60 fps, and 60 is easier to play with. However, I'd much rather look at fantastic graphics, because 30fps is completely playable anyway. Only some very fast-paced games like multiplayer FPS or hardcore StarCraft gain any notable benefit from more than 30 frames per second.
I usually just try to find the best graphics for around 25-30 fps. If I can get max graphics for more, then that's awesome.
-
For me the usual formula applies as well (60 for FPS, 40-45+ for other games), although some games feel very playable at a 30+ average (as long as there isn't too much dipping). The Witcher 2 played at a very playable 30+ in 720p with some tweaked high settings.
Currently playing Skyrim at 1920x1200 with FXAA, 15x AF and maxed settings (with many HD mods, pop mods, etc.) at a smooth 35 fps average in outdoor areas, a little lower in very woodsy areas and much smoother in dungeons.
-
Depends on the game, really. For most SP games I'm perfectly fine with 30+. For some online shooters, 50+ is really required. BF3 is sensitive to FPS, I'm finding; higher FPS seems to result in more accurate aiming.
-
Max settings, as long as I get 30fps+ ... That's the bare minimum for me!
-
This myth of "the eye can only see 30/60/72/100/120..." pops up time and time again; it simply is not true.
First and foremost, the human visual system does not 'see' in FPS.
TL;DR version - science has disproved this myth many times over; the theoretical max a healthy human eye can see is 50,000 fps.
The response time of the cones and rods [the cells in your eye that allow you to 'see'], plus transmit/receive time, can effectively be converted to an fps figure by measuring the time taken at each stage. This is very complex because of all the variables, such as brightness, contrast, light levels in general, colors used, etc., that all affect the outcome.
For simplicity - rods see motion, cones see color. The response time (or 'used and ready to be used again' time) through the bipolar ganglion cells has been measured at 20 µs (that's microseconds, not milliseconds).
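For what it's worth, the 50,000 fps figure in the TL;DR above appears to be just the reciprocal of that 20 µs response time. A quick back-of-the-envelope check, in Python only for convenience:
```python
# ~50,000 fps is simply 1 divided by a ~20 microsecond response time.
response_time_s = 20e-6               # 20 microseconds
theoretical_max_fps = 1.0 / response_time_s
print(theoretical_max_fps)            # 50000.0
```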
Another issue with the 'question' is people misunderstanding what they are asking, and HOW to ask it. {the first link explains it in-depth}
"How many frames per second can the human eye see?
This is a tricky question. And much confusion about it is related to the fact, that this question is NOT the same as:
How many frames per second do I have to have to make motions look fluid?
And it's not the same as
How many frames per second makes the movie stop flickering?
And it's not the same as
What is the shortest frame a human eye would notice?"
These are all VERY different questions.
For the sake of simplicity: yes, the eye can see more than N fps.
Let's not argue over proven facts though ^.^
How many frames per second can the human eye see?
AMO.NET America's Multimedia Online (Human Eye Frames Per Second)
AMO.NET America's Multimedia Online (Human Eye Frames Per Second 2)
Rods & Cones
Bipolar Cell Pathways in the Vertebrate Retina – Webvision -
masterchef341 The guy from The Notebook
If you're going to pick a number of frames per second that the human eye will no longer be able to differentiate, you need to start your ballpark estimation in the 300 range and work up from there.
Nevertheless, 60 frames per second is a good target for fluid motion and tight control for high action video games. -
At 15-20fps you might as well just watch a slideshow that's set on medium speed or something. I really can't fathom how anything below 30fps is playable; 25 is the absolute bare minimum I'll go if the gameplay is worth it. The same notion goes for those who need 60fps in multiplayer; I really just can't relate to that.
-
If maxed settings has me at 30fps+, I'm inclined to stay there.
If maxed settings has me at 45-50fps+, I will drop settings to achieve 60fps. -
Regardless of what everyone says, 60 fps appears to be at, or close to, the limit of what my vision can distinguish. Between 60 and 120 there are certainly diminishing returns, if not no discernible difference at all.
30 fps, though, is completely playable for most games. The thing I absolutely can't stand is a wildly fluctuating framerate somewhere between 30-40 on the low end and 60 on the high end. It makes the game feel slower than if I were playing at a constant 30, so I usually end up locking games like this down to 30, where the experience feels more fluid. Oblivion and Skyrim both come to mind as games I've done this to.
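For what it's worth, "locking it down to 30" mechanically just means padding every fast frame out to the same time budget, which is why the pacing feels steadier. A rough sketch of the idea only (not any particular tool's implementation; render_frame is a hypothetical stand-in for the game's per-frame work):
```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~33.3 ms per frame (~16.7 ms at 60)

def run_capped(render_frame):
    """Sleep off the leftover time each frame so pacing stays a steady 30
    instead of swinging between ~30-40 and 60."""
    while True:
        start = time.perf_counter()
        render_frame()               # hypothetical per-frame game work
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
```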
For a multiplayer FPS like BF3, a high framerate is essential, imo. I find it hard to aim with any kind of precision at 30 fps, and it's even worse with a fluctuating framerate. There is logic behind this, though: at 60fps you get twice as many increments to gauge where you're pointing as at 30fps, so aim will be more precise. In single-player or non-FPS games, though, it makes little difference.
-
Personally I like to run at around or just under 90Hz, since that's what my laptop screen can run at. Oddly, I found that my mom's Vostro 1700 screen runs at 120Hz, but it's not 1080p, so that could be why. Anyway, I tried going back and playing in the 40-45fps range and it's now become unbearable.
Sent from my DROID RAZR using Tapatalk 2
-
moviemarketing Milk Drinker
OK I went back and tried Witcher 2 at 1280x720 and I can actually play the damn game, finally! Pretty amazing difference. Is this just the most poorly optimized game since Crysis? Everything else runs great at 1080p.
-
Witcher 2 is the Crysis of its day. It will bring a system that otherwise runs everything great to its knees.
Sent from my Tricorder using Tapatalk
-
First game I played with this machine was Witcher 2 and I thought to myself, worth it.
-
It depends what max settings we're talking about here. Generally I find that texture quality does the most for what I perceive as good graphics, whereas anti-aliasing has almost no effect on how I perceive games. So I almost always play without anti-aliasing, but with otherwise as-high-as-possible settings, to eke out a consistent 30+ FPS (which usually means it'll hit a 45 FPS average). As people have said, different games need different FPS: I have no problem playing strategy games at 30FPS (think Civilization, not Starcraft), but action RPGs require at least a good 45 FPS to feel coherent.
-
30 is my minimum, if I really have to go that low. I do prefer higher than 40 though.
-
What's killing your performance is the limited bandwidth of the GDDR3. The GDDR5-toting 5870M has no problem with mostly Ultra settings at 900p, or even 1080p play with some sacrifices.
It's also a CPU-bound game, especially in towns/camps. If you overclock your CPU, you'll see a significant framerate increase.
To restate: if I max the game and it's in the 30-40fps range, I'll just sit content, but if I'm over 45fps and it's obvious that lowering just a few settings will allow 60fps gameplay, I'll make that sacrifice.
-
Kingpinzero ROUND ONE,FIGHT! You Win!
Single player games are fine if they're above 30fps, so it's your choice to run native res (let's say 1080p) with max details. If it drops below 30fps, you need to tweak the settings a bit.
If you play FPS games for fun (not professionally, and you're not addicted to your K/D ratio), then anything above 45fps should be fine.
However, for games like BF3 that demand good hardware, you may want performance that never dips below 50-55fps at native res to stay competitive with decent response.
Usually the general rule of thumb, as others have said, is that the framerate needs to stay above 60fps all the time.
-
John Carmack knows a little bit about framerates. Check out his QuakeCon Keynote.
-
I play my games at 45fps minimum... I don't use filters (AA/AF)... quality is important but gameplay is crucial... On a laptop there's also the noise of the fan in some situations; I prefer mid quality + 45fps if the fan is quieter (I work at night)...
-
moviemarketing Milk Drinker
For most games, AA has a big impact on my laptop, but 16xAF has almost no frame rate cost compared to 0xAF. -
When someone wants to pay me to play games, I'll care about 60 FPS then.
-
Sure, this is by no means a high-end card or system (not even a mid-range one by today's standards), but if it can play Skyrim on High (albeit with shadows set to low) fluidly, and comparable games run on mid-high settings, I see no reason why Witcher 2 cannot be played on this system, regardless of settings or ridiculously low resolutions, other than amateurish optimization by its devs. Games are not just bells and whistles; they should first and foremost be able to run properly when you lower their settings, especially when they belong to the RPG genre and aren't trying to be glorified tech demos like so many FPS games these days.
Mind you, I'm not crying too much over this, as I didn't find Witcher 2's gameplay to my taste either. Nevertheless, I really cannot agree it is optimized, let alone highly.
-
The only games that I 'require' 45FPS for are RTS, FPS and MMO PvP: anything that requires fast movements/reactions, since one second can mean the difference between a win and a loss.
-
moviemarketing Milk Drinker
Witcher 2 might look pretty on some insane 4xSLI desktop system, but it looks like total crap when you have to reduce your resolution down to 720p or lower, even though your PC runs every other game out there at 1080p. Played that way, it probably looks even worse than the Xbox version, and even at those low settings and drastically reduced resolution it is still laggy and the controls still feel rather unresponsive. Most of us are never going to see a smooth, responsive 60fps in Witcher 2, no matter what kind of settings or resolution we use.
-
The only problem I've encountered is that this method does not work with nVidia Optimus, as it keeps the dedicated GPU active 100% of the time.
Sent from my Tricorder using Tapatalk -
The simple truth is that the 330M fell well below the minimum requirements CDPR laid out during the game's development and prior to its release. They explicitly told you it would not run on your GPU. In fact, any game that lists an 'Nvidia 8800' as the minimum should not be expected to run on your card, but that doesn't equal poor optimization so much as an outclassed GPU.
The Witcher 2 was built from the ground up, for higher-end systems, on purpose. They said that from step one. So yes, I do believe that they optimized the game, but for a certain level of hardware, and it's unfortunate that the ceiling was far above your 330M's head.
Evidence: my 6970M can run the game on Ultra @ 1080p. I'm not including that as a brag, but to point out that the desktop card it came from (the 6850) was considered a mid-range, affordable offering at the time. A mid-range card maxing a game considered that intensive is a clear sign of optimization. Compare that to when Crysis came out and the mid-range Nvidia offering was an 8600 GTS. Yeesh.
With Skyrim's system requirements being significantly lower than TW2's, it's natural that it runs, in some form, on super low-end systems. But let's not conflate two separate arguments.
-
"The Witcher 2 was built from the ground up, for higher-end systems, on purpose. They said that from step one. So yes, I do believe that they optimized the game, but for a certain level of hardware, and it's unfortunate that the ceiling was far above your 330M's head."
is one HUGE contradiction, especially considering your previous claim that the game is highly optimized. I think you're probably confusing what the term 'optimized' means.
Skyrim's system requirements being 'lower' IS an example of a game that is actually highly optimized for a wide variety of hardware setups, since you can make it run on low-end systems but also scale it up to look extremely beautiful at the best graphical settings.
But we're discussing technicalities here, so let's drop it.
P.S.: I could run the game just fine on my desktop 5870; the 330M was an example of the game being poorly optimized.
-
This dilemma can actually ruin the enjoyment of a game for me somewhat. My old laptop had a 1080p screen with a 5650 (OC'd to 5730-level performance), and I could never decide if I preferred running at native resolution with settings turned down, or sacrificing resolution in order to crank up settings. Crysis at 1080p medium vs 720p high was a decision I just couldn't make! So when I went for my current laptop I thought to myself, just do away with the choice and get better future-proofing too. I went for a 768p screen combined with a DDR3 650M; it was either that or 900p with a 555M or 1080p with a 540M (for my budget). I wanted something as future-proof as possible (as I have no idea when I'll be able to afford a new laptop), so the choice was clear!