The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Max Settings VS 60fps

    Discussion in 'Gaming (Software and Graphics Cards)' started by paul2110, Jul 31, 2012.

  1. paul2110

    paul2110 Notebook Guru

    Reputations:
    8
    Messages:
    67
    Likes Received:
    0
    Trophy Points:
    15
    Obviously a lot of us are obsessed with maxing out games, but really how important is it to have max settings vs having the perfect frame rate?

    For example, I was playing Crysis 2 with the DX11 high-res textures at Ultra, and I lowered to Extreme, played for a bit, and couldn't see any difference between Ultra & Extreme (except for the frame rate boost with Extreme).

    I always try to max games by reflex, but often when I'm involved in a game I'm not really taking note of the graphics. I have always preferred to play maxed at 30-45fps rather than lower settings to get a locked 60fps, so I wanted to see how others feel about this.
     
  2. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Depends on the game. If it's a single player game usually I'm fine with 40+. If it's a multiplayer FPS like BF3, then it's gotta be 60... Though I don't always get what I want with my setup. :(
     
  3. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    It really depends on the game. Does it feel fluid to you at 30fps? Does 30fps put you at a disadvantage in multiplayer? If the answers to those two questions are yes and no, then max them to your heart's content. I cannot stand something that feels choppy, so I set the settings accordingly.
     
  4. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    120fps is the new 60fps... believe it.
     
  5. paul2110

    paul2110 Notebook Guru

    Reputations:
    8
    Messages:
    67
    Likes Received:
    0
    Trophy Points:
    15
    Anything over about 80fps feels too 'slippery' for me
     
  6. Drunken1

    Drunken1 Notebook Consultant

    Reputations:
    20
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    I'm an FPS guy. I like effects, but if they hinder my gaming, I will turn them down. Most of my tweaks aren't even noticeable while gaming. I'd have to film both settings and compare to see the difference, especially on a smaller screen.
     
  7. CrAzYsIm

    CrAzYsIm Notebook Evangelist

    Reputations:
    25
    Messages:
    542
    Likes Received:
    0
    Trophy Points:
    30
    For me it depends; like tijo mentioned, it depends on the game. 30FPS in Crysis 1 felt smooth to me; 30FPS in Assassin's Creed was unplayable (for me).
     
  8. Silverfern

    Silverfern Notebook Deity

    Reputations:
    96
    Messages:
    955
    Likes Received:
    1
    Trophy Points:
    31
    I don't know why people are so fussed. I used to play Crysis on my 3650M (still do), and now Shogun 2 on medium settings at around 20 frames, and I am happy with it.
     
  9. paul2110

    paul2110 Notebook Guru

    Reputations:
    8
    Messages:
    67
    Likes Received:
    0
    Trophy Points:
    15
    Fair play, there's no right or wrong answer. I played a LOT of DayZ on my old laptop at minimum graphics settings and only got 15-20fps (dropping to 10fps at times) and still enjoyed it.

    The console versions of Crysis 2 drop to 20fps in places, but most console players probably don't even know or care...
     
  10. NA1NSXR

    NA1NSXR Notebook Guru

    Reputations:
    2
    Messages:
    52
    Likes Received:
    0
    Trophy Points:
    15
    There is no difference beyond what your eyes can discern (around 60fps, depending on who you ask). The reason 100+ feels smoother is that your framerate fluctuates during play, so even when there is a lull at a particularly demanding section, it doesn't dip below that discernible level of FPS. That is why averaging 100+fps feels so smooth... and it's also why I feel averaging 60fps isn't always enough, even though I don't think anything above 60fps matters. When I average in the danger zone of 60fps or below, it means that in some situations the FPS might actually drop to a rate where my eyes can tell.

    I have really stopped caring about average FPS and now only care about the minimum. If my minimum is always above 30fps, I am generally happy. I am on an older computer, an i7-920 @ 3.6 with a GTX 570 @ 940, and on something like BF3 at 1080p ultra 4xAA it is usually smooth; once in a very long while I might get mid to high 20's, and I find that still acceptable.

    As for answering the actual question, I am a sucker for high settings and I will pretty much run them at any cost. If I have to start reducing settings, that is when I look towards new hardware. For my last couple of desktop systems I could expect about 3 years or so of running everything maxed out before something had to change. On a notebook, maybe 1.5-2 years is more reasonable?
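
    (A toy illustration of that average-vs-minimum point, in Python; the frame times below are made up, not from any real benchmark:)

        # Two runs with the same average fps but very different minimums.
        # Frame times are in milliseconds.
        steady = [16.7] * 60                 # a locked ~60 fps
        spiky = [10.5] * 54 + [75.0] * 6     # fast frames plus six big hitches

        def avg_fps(frame_times_ms):
            # Average fps over the run: frames rendered / total time.
            return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

        def min_fps(frame_times_ms):
            # Instantaneous fps of the single slowest frame.
            return 1000.0 / max(frame_times_ms)

        for name, run in (("steady", steady), ("spiky", spiky)):
            print(f"{name}: avg {avg_fps(run):.0f} fps, min {min_fps(run):.0f} fps")
        # steady: avg 60 fps, min 60 fps
        # spiky:  avg 59 fps, min 13 fps  <- same average, but the hitches show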
     
  11. JKnows

    JKnows Notebook Consultant

    Reputations:
    32
    Messages:
    148
    Likes Received:
    61
    Trophy Points:
    41
    Didn't find the right option to click, so I'll write it: max settings, as long as I get 25fps+.
     
  12. compuNaN

    compuNaN Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    There is a noticeable difference between 30 and 60 fps, and 60 is easier to play with. However, I'd much rather look at fantastic graphics, because 30fps is completely playable anyway. Only some very fast-paced games like multiplayer FPS or hardcore StarCraft actually gain any notable benefit from more than 30 frames per second.

    I usually just try to find the best graphics for around 25-30 fps. If I can get max graphics for more, then that's awesome.
     
  13. andros_forever

    andros_forever Notebook Deity

    Reputations:
    141
    Messages:
    954
    Likes Received:
    0
    Trophy Points:
    30
    For me the usual formula applies as well (60 for FPS games, 40-45+ for other games), although some games feel very playable at a 30+ average (as long as there isn't too much dipping). The Witcher 2 played at a very playable 30+ in 720p with some tweaked high settings.

    Currently playing Skyrim at 1920x1200 with FXAA, 16x AF and maxed settings (with many HD mods, pop mods, etc.) at a smooth 35 fps average in outdoor areas, a little lower in very woodsy areas and much smoother in dungeons.
     
  14. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Depends on the game, really. For most SP games I'm perfectly fine with 30+. For some online shooters 50+ is really required. I'm finding BF3 is sensitive to FPS; higher FPS seems to result in more accurate aiming.
     
  15. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    Max settings, as long as I get 30fps+ ... That's the bare minimum for me!
     
  16. mrzzz

    mrzzz Notebook Consultant

    Reputations:
    49
    Messages:
    179
    Likes Received:
    10
    Trophy Points:
    31

    This myth of "eye can only see 30/60/72/100/120......" pops up time and time again, it simply is not true.

    First and foremost, the human visual system does not 'see' in FPS.

    TLDR; version - Science has disproved this myth many times over, the theoretical max a healthy human eye can see is 50,000 fps.

    Response time of cones and rods [the cells in your eye that allow you to 'see'] and transmit/receive time can effectively be measured as fps by measuring the time it takes at each stage. This is very complex because of all the variables such as brightness, contrast, light levels in general, colors used, etc etc that all affect the outcome.
    For simplicity - Rods see motion, cones see color. The response time (or used and ready to be used again time) through bipolar ganglion cells has been measured at 20msec (thats micro-seconds, not mili)
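
    (Incidentally, that 20 µs figure is where the 50,000 fps ceiling in the TL;DR above comes from: treating one response every 20 µs as one 'frame',

        1 / 20 µs = 1 / 0.00002 s = 50,000 per second.)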

    Another issue with the 'question' is people misunderstanding what they are asking, and HOW to ask it (the first link explains it in-depth).

    "How many frames per second can the human eye see?

    This is a tricky question. And much confusion about it is related to the fact, that this question is NOT the same as:

    How many frames per second do I have to have to make motions look fluid?

    And it's not the same as

    How many frames per second makes the movie stop flickering?

    And it's not the same as

    What is the shortest frame a human eye would notice?"

    These are all VERY different questions.

    For the sake of simplicity: yes, the eye can see more than N fps.
    Let's not argue over proven facts though ^.^


    How many frames per second can the human eye see?
    AMO.NET America's Multimedia Online (Human Eye Frames Per Second)
    AMO.NET America's Multimedia Online (Human Eye Frames Per Second 2)
    Rods & Cones
    Bipolar Cell Pathways in the Vertebrate Retina – Webvision
     
  17. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    If you're going to pick a number of frames per second that the human eye will no longer be able to differentiate, you need to start your ballpark estimation in the 300 range and work up from there.

    Nevertheless, 60 frames per second is a good target for fluid motion and tight control for high action video games.
     
  18. m1_1x

    m1_1x Notebook Evangelist

    Reputations:
    27
    Messages:
    500
    Likes Received:
    0
    Trophy Points:
    30
    At 15-20fps you might as well just watch a slideshow that's set on medium speed or something. I really can't fathom how anything below 30fps is playable. 25 is the absolute bare minimum I'll go if the gameplay is worth it. The same notion goes for those who need 60fps in multiplayer; I really just can't relate to that.
     
  19. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    If maxed settings have me at 30fps+, I'm inclined to stay there.

    If maxed settings have me at 45-50fps+, I will drop settings to achieve 60fps.
     
  20. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Either you wrote this backwards or you thought this through in a weird way.
     
  21. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    It's insane how demanding The Witcher 2 is, even on the lowest settings. Skyrim runs great mostly maxed at 1920x1200 for you, but you mentioned you have to drop down to 1280x720 to get playable frame rates in The Witcher 2? I tried dropping down to 1600x900 for The Witcher 2 and it was still too laggy; perhaps I need to try 720p.
     
  22. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Regardless of what everyone says, 60 fps appears to be at, or close to, the limit of my vision to distinguish. Between 60 and 120 there are certainly diminishing returns, if any discernible difference at all.

    30 fps, though, is completely playable for most games. The thing I absolutely can't stand is a wildly fluctuating framerate somewhere between 30-40 on the low end and 60 on the high end. It makes the game feel slower than if I were playing at a constant 30, so I usually end up locking games like this down to 30, where the experience feels more fluid. Oblivion and Skyrim both come to mind as games I've done this to.

    For a multiplayer FPS like BF3, a high framerate is essential, imo. I find it hard to aim with any kind of precision at 30 fps, and it's even worse with a fluctuating framerate. There is logic behind this, though: you get twice as many increments to gauge where you're pointing at 60fps as opposed to 30fps, so aim will be more precise. In single-player or non-FPS games, though, it makes little difference.
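
    (Rough arithmetic behind that "twice as many increments" point; the flick duration below is a made-up illustrative figure:)

        # Rendered crosshair positions you get to react to during one quick
        # mouse flick, at 30 vs 60 fps.
        flick_ms = 250  # a fast flick; an illustrative guess, not a measured value
        for fps in (30, 60):
            frames_seen = flick_ms * fps / 1000
            print(f"{fps} fps -> ~{frames_seen:.0f} frames of feedback per flick")
        # 30 fps -> ~8 frames; 60 fps -> ~15 frames to see and correct your aim.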
     
  23. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Personally I like to run at around or just under 90Hz, since that's what my laptop screen can run at. Oddly, I found that my mom's Vostro 1700 screen runs at 120Hz, but it's not 1080p, so that could be why. Anyway, I tried going back and playing in the 40-45fps range and it's now become unbearable.

    Sent from my DROID RAZR using Tapatalk 2
     
  24. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    OK, I went back and tried The Witcher 2 at 1280x720, and I can actually play the damn game, finally! Pretty amazing difference. Is this just the most poorly optimized game since Crysis? Everything else runs great at 1080p.
     
  25. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Witcher 2 is the Crysis of its day. It will kick the butt of a system that will otherwise run everything great.

    Sent from my Tricorder using Tapatalk
     
  26. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    :D First game I played with this machine was Witcher 2 and I thought to myself, worth it.
     
  27. gdansk

    gdansk Notebook Deity

    Reputations:
    325
    Messages:
    728
    Likes Received:
    42
    Trophy Points:
    41
    It depends what max settings we're talking about here. Generally I find that texture quality contributes the most to what I perceive as good graphics, whereas anti-aliasing has almost no effect on how I perceive games. So I almost always play without anti-aliasing but with otherwise as-high-as-possible settings, to eke out a consistent 30+ FPS (which usually means it'll hit a 45 FPS average). As people have said, different games need different FPS. I have no problem playing strategy games at 30FPS (think Civilization, not StarCraft), but action RPGs require at least a good 45 FPS to be coherent.
     
  28. Evanescent

    Evanescent Notebook Deity

    Reputations:
    144
    Messages:
    993
    Likes Received:
    0
    Trophy Points:
    30
    30 is my minimum, if I really have to go that low. I do prefer higher than 40 though.
     
  29. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The Witcher 2 is highly optimized. It's just the most graphically intensive game of its time.


    What's killing your performance is the limited bandwidth of your card's GDDR3 memory. The GDDR5-toting 5870M has no problem with mostly Ultra settings at 900p, or even 1080p play with sacrifices.

    It's also a CPU bound game, especially in towns/camps. If you overclock your CPU, you'll see a significant framerate increase.
    No, I said what I meant.

    To restate: if I max the game and it's in the 30-40fps range, I'll just sit content, but if I'm over 45fps and it's obvious that lowering just a few settings will allow 60fps gameplay, I'll make that sacrifice.
     
  30. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Single player games are fine if they are above 30fps, so it's your choice to have native res (let's say 1080p) with max details. If it drops below 30fps, you need to tweak the settings a bit.
    If you play FPS games for fun (not professionally or addicted to your K/D ratio), then anything above 45fps should be fine.
    However, for games like BF3 that require good hardware, you may want performance that never dips below 50-55fps at native res, to be competitive with decent response.
    Usually the general rule of thumb, as users have said, is that FPS games need to be above 60fps all the time.
     
  31. PopeJamal

    PopeJamal Notebook Consultant

    Reputations:
    7
    Messages:
    104
    Likes Received:
    4
    Trophy Points:
    31
    John Carmack knows a little bit about framerates. Check out his QuakeCon Keynote.
     
  32. MegaBUD

    MegaBUD Notebook Evangelist

    Reputations:
    45
    Messages:
    670
    Likes Received:
    3
    Trophy Points:
    31
    I play my games at 45fps minimum... I don't use filters (AA/AF)... quality is important but gameplay is crucial... But on a laptop there's also the noise of the fan in some situations... I prefer mid quality + 45fps if the fan is quieter (I work at night)...
     
  33. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Do you find that AF impacts your frame rate a lot?

    For most games, AA has a big impact on my laptop, but 16xAF has almost no frame rate cost compared to 0xAF.
     
  34. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    When someone wants to pay me to play games, I'll care about 60 FPS then.
     
  35. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
    Yeah, of course, it's a really highly optimized game. So optimized, in fact, that my 330M can't run it even at 800x600 (or even 640x480!) on the lowest settings possible, regardless of location.

    Sure, this is by no means a high-end card or system (or even a mid-range one by today's standards), but if it can play Skyrim on High (albeit with shadows set to low) fluently, and comparable games run on mid-high settings, I see no reason why The Witcher 2 cannot be played on this system, regardless of settings or ridiculously low resolutions, other than amateurish optimization by its devs. Games are not just bells and whistles; they should first and foremost be able to run properly when you lower their settings, especially when they belong to the RPG genre and aren't trying to be glorified tech demos like so many FPS games these days.

    Mind you, I'm not crying too much over this, as I didn't find The Witcher 2's gameplay to my taste, either. Nevertheless, I really cannot agree it is optimized, let alone highly optimized.

    How can you lock a game down to 30 FPS? Do you use settings in the INI files, or are you altering something in your driver?
     
  36. Voodooi

    Voodooi AFK for a while...

    Reputations:
    1,850
    Messages:
    2,874
    Likes Received:
    1
    Trophy Points:
    55
    The only games that I 'require' 45FPS for are RTS, FPS and MMO PvP: anything that requires fast movements/reactions, since one second can mean the difference between a win and a loss.
     
  37. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    I have to agree with you on this. Anyone who wants to talk about optimization should watch John Carmack's QuakeCon keynote. He is a brilliant guy, and he goes into exhaustive detail about all the work they do on their games to get the absolute highest framerates possible on the existing hardware base, which is what optimization is all about.

    The Witcher 2 might look pretty on some insane 4x SLI desktop system, but it looks like total crap when you have to reduce your resolution down to 720p or lower, even though your PC runs every other game out there at 1080p; at that point it probably looks even worse than the Xbox version. And even at those low settings and drastically reduced resolution, it is still laggy and the controls still feel rather unresponsive. Most of us are never going to see a smooth, responsive 60fps in The Witcher 2, no matter what settings or resolution we use.
     
  38. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Depends on the game. If there is a .ini tweak or command line option that will do it, I use that. Otherwise, I use the "on-screen display server" software included with MSI Afterburner, which seems to work regardless of game (except GTA4, which it causes to crash, but GTA4 accepts command line input to do this anyway) and regardless of ATI/nVidia card. I turn off all of the actual on-screen display options and auto-detection of non-specified apps, and use it solely to limit the framerate of the apps I tell it to.

    The only problem I've encountered is that this method does not work with nVidia Optimus, as it keeps the dedicated GPU active 100% of the time.
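
    (For anyone wondering what a limiter like that does under the hood, here is a minimal sketch of the general technique in Python; this is not Afterburner's actual implementation, just the idea of sleeping off whatever is left of each frame's time budget:)

        import time

        TARGET_FRAME_TIME = 1.0 / 30  # cap the game at 30 fps (~33.3 ms/frame)

        def run_capped(render_frame):
            # render_frame is a hypothetical stand-in for the game's per-frame work.
            while True:
                start = time.perf_counter()
                render_frame()
                elapsed = time.perf_counter() - start
                # Sleep away the unused part of the frame budget so frames
                # are presented at a steady pace instead of as fast as possible.
                if elapsed < TARGET_FRAME_TIME:
                    time.sleep(TARGET_FRAME_TIME - elapsed)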


    Sent from my Tricorder using Tapatalk
     
  39. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    "it doesn't run up to my expectations on my card so..." or "well this game runs at..." are never the proper angles with which to attack the question of a game's optimization.

    The simple truth is that the 330M fell well below the minimum requirements CDPR laid out during the games development, and prior to its release. They explicitly told you it would not run on your GPU. In fact any game that lists an 'Nvidia 8800' as the minimum should not be expected to run on your card, but that does not equal poor optimization, inasmuch as it equals an outclassed GPU.

    The Witcher 2 was built from the ground up, for higher-end systems, on purpose. They said that from step one. So yes, I do believe that they optimized the game, but for a certain level of hardware, and it's unfortunate that the ceiling was far above your 330M's head.

    Evidence: My 6970M can run the game on Ultra @ 1080p. I'm not including that as a brag, but to point out that the desktop card it came from (the 6850) was considered a mid-range, affordable offering at the time. A mid-range card maxing a game considered that intensive, is a clear sign of optimization. Compare that to when Crysis came out, and the mid-range Nvidia offering was an 8600 GTS. Yeesh.

    With Skyrim's system requirements being significantly lower than TW2's, it's natural that it runs in some form, on super low-end systems. But let's not obfuscate two arguments.
     
  40. MegaBUD

    MegaBUD Notebook Evangelist

    Reputations:
    45
    Messages:
    670
    Likes Received:
    3
    Trophy Points:
    31
    Not that much... but still a performance hit... though in all honesty... I barely see the difference between 16xAF and none...
     
  41. NA1NSXR

    NA1NSXR Notebook Guru

    Reputations:
    2
    Messages:
    52
    Likes Received:
    0
    Trophy Points:
    15
    What are you hassling me for? I didn't make a single declarative sentence about what the human eye can discern.
     
  42. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Purely my observation, but at 0x AF textures look like a soupy mess. At 4x-8x they look good even from far away, and I can't tell the difference between 16x and 8x, so I just leave it at 8x most of the time.
     
  43. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
    This sentence right there, the whole thing:

    "The Witcher 2 was built from the ground up, for higher-end systems, on purpose. They said that from step one. So yes, I do believe that they optimized the game, but for a certain level of hardware, and it's unfortunate that the ceiling was far above your 330M's head."

    is one HUGE contradiction, especially if I consider your previous claim that the game is highly optimized. I think you're probably confusing what the term 'optimize' means :)

    Skyrim's 'lower' system requirements ARE an example of a game that is actually highly optimized for a wide variety of hardware setups, since you can make it run on low-end systems but also scale it to look extremely beautiful on the best graphical settings as well.

    But we're discussing technicalities, here, so let's drop it :)

    P.S.: I could run the game just fine on my desktop 5870; the 330M was just an example of the game's poor optimization.
     
  44. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    This dilemma can actually ruin the enjoyment of a game for me somewhat. My old laptop had a 1080p screen with a 5650 (OC'd to 5730 level of performance). I could never decide if I preferred running at native resolution with settings turned down, or sacrificing resolution in order to crank up settings. Crysis at 1080p medium vs 720p high was a decision I just couldn't make! So when I went for my current laptop, I thought to myself: just do away with the choice and get better future-proofing too. So I went for a 768p screen combined with a DDR3 650M. It was either that, or 900p with a 555M, or 1080p with a 540M (for my budget). I wanted something as future-proof as possible (as I have no idea when I will be able to afford a new laptop), so the choice was clear!