Okay, so I see people talking about getting frame rates of 80, 90, 100, even 120 FPS in various games with various graphics cards.
My question is this: if you have your display set to 60 Hertz, that means (if I am understanding it correctly) that the entire screen is updated 60 times a second. I'm not sure whether LCD screens have the equivalent of a "scanline" or "vertical retrace" the way CRT monitors do, but they still seem to have this Hertz setting, so I'm assuming there is some periodic update time.
So, if you have more than 60 frames per second, aren't some frames "drawn" but NEVER SEEN on your screen, because the display itself can't update that fast?
So my question is this: what is the point of an FPS higher than the Hz setting of your display? Or is this Hertz value meaningless with LCDs?
-
If your display is 60 Hz, then it doesn't matter past 60 FPS. Technically, yes, that does mean there are frames produced that never make it to the screen. That is where VSync comes into play: VSync caps your frame rate to the Hz of your monitor, which also saves a bit of power on the card, but people do not usually like their FPS capped at 60, or even at the newer 120 Hz mark.
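To make the capping idea concrete, here is a minimal timing sketch in Python (purely illustrative of what a cap does, not how a driver actually implements VSync; render_frame is a made-up placeholder): the loop finishes a frame and then waits for the next 1/60 s boundary before starting the next one, so it can never produce more than 60 frames per second.

```python
import time

REFRESH_HZ = 60                       # assumed monitor refresh rate
FRAME_TIME = 1.0 / REFRESH_HZ         # one refresh interval, about 16.7 ms

def render_frame():
    pass                              # placeholder for the actual rendering work

next_deadline = time.monotonic()
for _ in range(300):                  # run for roughly 5 seconds
    render_frame()
    next_deadline += FRAME_TIME
    remaining = next_deadline - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)         # idle until the next "refresh"; this is the cap
    else:
        next_deadline = time.monotonic()  # missed the deadline; don't try to catch up
```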
-
If you don't cap your frame rate to the sync rate, what's the point? Frame rate numbers higher than the sync rate would be pretty much imaginary then, right?
I.e. if your sync rate is 60 Hz, then there is absolutely no difference between an FPS of 100 and 10,000,000,000,000. It's just imaginary numbers after that, the GPU claiming it drew a frame that never had a pixel actually displayed.
(It would seem, anyway)
So why would people want to waste processing power to put up a number that doesn't actually mean anything? (It would just make the GPU run hotter, with no actual benefit.) I guess that's my question. -
The problem with VSync, in my own view, is that since the frame rate never passes 60, it sometimes drops to an FPS where I start to notice little hints of a "lagging" effect. But yes, anything past the refresh rate is basically wasted power. And who doesn't like a higher FPS??? XD People do say you get tearing if VSync is off, but I have yet to notice that.
-
that's why I set max_fps at 60
-
I keep my max_fps at 75, just for a little headroom.
-
I set mine to 60, but I get LOTS of tearing. I have also heard that VSync is horrible and causes false interpolation, etc., so I'm not turning it on.
300+ FPS in CSS with all settings high FTW! -
masterchef341 The guy from The Notebook
It's only useful if you want to compare the power of two graphics cards. If you cap the framerate at 60, you won't be able to see any difference when they can both go higher.
-
No. They both appear exactly the same on the screen - only the guy with 300 FPS is running a hotter GPU.
So why brag about it? It's like saying "My shirt has more invisible ultraviolet patterns than yours, although nobody can see them."
I don't get it. Is it just the thought that a higher number sounds better? Just a geek thing? My GPU runs hotter to produce imaginary images that no one can see? -
masterchef341 The guy from The Notebook
I don't think people brag about running 300 FPS in games, except maybe in the sense that a game with low spec requirements (like Half-Life 2) running fast means that other, more intensive engines will run fast too.
Same reason people brag about 3DMark scores. Is 5400 3DMark points a better experience than 2485 3DMark points? Nope, unless watching 3DMark is especially appealing to you. I find it rather boring; in the past (I refuse to run 3DMark anymore) I would just leave and come back when it was done.
Most people aren't overclocking their GPUs to get from 100 to 120 FPS. It's not worth underclocking the GPU to bring it down to 60, because that's inconvenient. A newer GPU that runs at 100 FPS isn't necessarily hotter than an older one that runs at 60, either. Keep that in mind. -
OK fellas, here it goes! If you ever saw an old movie go out of sync, you would see a bar in the middle of the screen. Now, if I set the game to play at 60 FPS to match the screen refresh, why do you think they are going to be in sync? VSync retards performance to prevent tearing. The important word here is "retard", meaning slow down, and of course you don't like it! So if you allow the video card to produce as much as it can, it is more likely to have a complete frame ready when the monitor wants it. If a GPU could produce ten frames for every one asked for, it would be much more fluid than a 1:1 ratio. VSync does not improve performance in an FPS way; it prevents tearing! You want your GPU to always have a full frame ready, not to sit and wait for VSync! As in life, more is always better!!
Edit: Except death and taxes! -
Because displaying 1 frame every 60th of a second looks better than displaying..... 1 frame every 60th of a second?
I fail to see how the nine invisible, never-seen "imaginary" frames that are never displayed make anything more fluid.
Yeah, I can buy that syncing the frame display to the refresh frequency as you described could, in theory, slow things down by an incredibly small amount - but what I was saying is that increasing the frame rate beyond that threshold never actually provides any benefit. -
Outputting to a monitor. Something that can go over 60 Hz.
-
Edit: if an I'm sorry I would do it but I can't in this, I'm sorry! -
Hmm. Okay. It sounds like this is a topic dear to the hearts of many people.
-
I'd go without VSync. You just get your FPS capped, and for what? A little power saving on the GPU? Not that much, because VSync keeps straining the battery anyway, so you would likely save maybe 5% of battery life. Going uncapped also means that if the FPS drops at any moment, you still have plenty of room. For example: I have 120+ FPS and it drops to, say, 80 or 70; it's still smooth play. But if you have VSync on, you are capped at 60 FPS, and a drop in a demanding situation means even fewer FPS and less consistency. Uncapped, the FPS flow is unharmed.
-
Hmm, that's a good point, I should start turning VSync ON. I can't even tell the difference between 25, 30, 40, 50, and 60 FPS; they all look and feel the same to me. To tell the truth, on my desktop I regularly turn down settings not because my GPU can't handle it, but because I absolutely can't tell the difference between max settings and med/high settings.
-
You are not getting it! I have no more to say, except that you don't get it!
-
But wouldn't running a game at 120 FPS make the game speed two times faster than it should be at 60 FPS? For example, in Company of Heroes offline mode I CAN really notice the difference between 120 FPS and 60 FPS, because with 120 FPS everything builds faster, everything moves faster, and everything gets killed faster...
-
Well, exceeding the 60 cap gives you more headroom in case of slowdown... but if it is capped at 60, would there be fewer cases of slowdown?
-
I'm sorry folks, but turning VSync on is for noobs; I notice a slowdown if I do. And look, 60 FPS is not fast enough for me. I notice speed increases up to 100 FPS; the game just runs faster and more fluidly over 60 FPS. And 30 FPS is horrid!
-
Unless you're outputting to a CRT monitor, you won't notice frame rates above 72 Hz. Why? Because most LCDs run at 60 Hz, with the slight exception of some being able to refresh at 72 Hz. I don't know how you notice speed increases up to 100 FPS on a laptop (maybe it's in your head), because your monitor will only display 60. -.-
-
I actually don't have my laptop yet, lol, but my reference is my 75 Hz LCD. The actual framerate may be similar, but the game may be processing faster at higher framerates, thus it's faster.
-
You aren't actually having more frames drawn - your system is just skipping them - and so things will speed up. -
Is it possible to adjust VSync? Like making the cap 75 frames per second instead of 60?
-
It doesn't matter how many frames are rendered by the GPU if they aren't shown.
You're not really in a position to pretend that "no one gets it", or that your explanations are wasted, when your explanations are wrong.
Higher framerates do not cause smoother gameplay.
It works exactly as others have tried to say. The GPU spends time (and power) rendering things that aren't shown. That does not give you smoother gameplay. Nor does it "speed things up".
Which is faster: rendering one frame every 1/60th of a second and then waiting until the full 1/60th has elapsed, or rendering, say, 200 frames within that time and then throwing 199 of them away?
As far as the game is concerned, they're typically exactly equally fast. They do the same amount of effective work. One solution also does a lot of wasted work, but it's wasted, it doesn't matter.
Depending on the exact framerate, and a bunch of other factors, this might cause tearing. It might also be unnoticeable (if the difference between subsequent frames is small enough).
If you enable vsync, you will not get any tearing, period. The GPU only presents a new frame when the monitor is ready to draw a new frame. (And what it does is precisely what you said: it prepares a frame, and *then* it waits for the monitor. So there is always a full frame ready.)
Without vsync, there isn't a full frame ready when the monitor starts drawing. There is a mishmash of the current and previous frames.
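To picture that "mishmash", here is a toy model in Python (purely illustrative; real scanout works on pixel buffers, not lists of strings): the monitor reads the image top to bottom, and if the game swaps in a new frame partway through, the top of the screen shows the old frame and the bottom shows the new one, with a visible seam (the tear) where they meet.

```python
# Toy model of tearing: scanout reads 8 "lines" while the game swaps buffers mid-scan.
LINES = 8
old_frame = ["old"] * LINES
new_frame = ["new"] * LINES

swap_at_line = 3  # the game presents a new frame while line 3 is being scanned out

displayed = old_frame[:swap_at_line] + new_frame[swap_at_line:]
print(displayed)
# ['old', 'old', 'old', 'new', 'new', 'new', 'new', 'new'] -> the "tear" sits at line 3
```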
You are right, this can happen. It normally doesn't.
What decent, sensible games do is use a timer for the game logic. The game speed is controlled by how much time has elapsed since the last frame. Not by a fixed amount every frame.
This means that the game speed is always exactly the same, regardless of whether you run at 0.5 or 2000 frames per second.
Almost all PC games do this today, simply because not doing it causes so many problems given the widely differing hardware the game has to run on. (Framerate-dependent game speed would make the game run too fast on a fast computer, and in slow motion on a slow computer. Neither is acceptable.)
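To make the timer idea concrete, here is a minimal delta-time game loop sketched in Python (the names position, SPEED, and render are made up for illustration, not taken from any particular engine): the game state advances by the amount of real time that has elapsed since the previous frame, so the object moves at the same speed whether the loop manages 30 or 300 iterations per second.

```python
import time

position = 0.0
SPEED = 5.0                      # units per second, independent of the framerate

def render(pos):
    pass                         # placeholder: draw the frame at this position

last_time = time.monotonic()
while position < 100.0:
    now = time.monotonic()
    dt = now - last_time         # seconds elapsed since the previous frame
    last_time = now

    position += SPEED * dt       # advance game state by elapsed *time*, not per frame
    render(position)
```

A slow machine takes bigger dt steps and a fast machine takes many small ones, but both reach position 100 after the same 20 seconds of wall-clock time.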
Ever heard of this phenomenon?
As I said above, almost all modern PC games run at exactly the same speed regardless of framerate. A few don't, and in those cases, yes higher framerates make a difference (not always for the better. Try running old DOS games without a frame limiter)
However, if the framerate can't be kept *at least* at the monitor's refresh rate, then it will go down to half the monitor's refresh rate (usually 30 fps), which may be noticeable. But no, on a 60 Hz screen, you do not notice a difference between 60 and 100 fps in modern games. You might think you do, which only proves that human beings are good at seeing what they want to see.
It isn't "for" anything else, such as controlling game speed.
Also, you are having more frames drawn. By the GPU, that is, which is usually the bottleneck. Without vsync, the GPU *will* draw every single frame, without skipping anything. What's more, it will also upload every single frame to the monitor. So it's not entirely accurate to say that "you only see every other frame". You just see half of every frame instead.
What it comes down to is this:
- In general, the game will run at exactly the same speed regardless of framerate.
- In general, the rendered graphics will be approximately as smooth regardless of vsync setting. (One exception is if your framerate dips below the monitor's refresh rate, in which case vsync will give you artificially low framerates, which sucks. Another exception (which will generally, but not always, favor vsync enabled) is tearing. Without vsync, you will see different frames on different parts of the screen: the top lines will show older frames than the bottom part. Often, the differences are so small that it isn't really noticed, and occasionally it might even give an illusion of smoother rendering. But just as often, it'll show up as visible tearing, which looks like crap.)
(The above is definitely true for CRTs. For LCDs, I'm not sure if they still update one pixel at a time, or if they load a single full frame and show that. If the former, they'll act as above. If the latter, they'll act exactly as if VSync was on, even when it's disabled. But in neither case will it translate into "smoother gameplay" or "more fluid rendering".)
However, one thing I want to underline because there seems to be a lot of confusion about it, is this:
Game speed and rendering speed are totally independent. The actual game's "refresh rate" is usually not at all affected by the framerate, and vice versa. The gameplay doesn't become more or less smooth, and it doesn't speed up or slow down. (It may do so, in some older games, and in a *very* few modern ones, but it doesn't have to, and conceptually, the two concepts, game speed and render speed, are completely independent) -
I loaded up Chuck Yeager's flight simulator (from 1987) onto my P4 2.8 GHz. The flight sim obviously did not have any independent timer. When I started, I essentially crashed instantly. There was no way at all to actually fly the plane anymore - the thing was expecting a 286 at about 16 MHz, so it ran about 175 times too fast, lol. -
That's weird... Because when I tried playing Company of Heroes and Command & Conquer 3: Tiberium Wars at higher settings with lower FPS (both in offline mode), the game speed for both actually slowed down. I'm sure of it because I actually looked at my watch and timed how long it takes for a structure to be constructed...
-
Alright guys, I registered on this forum just to get this off my chest. VSync, in the gaming community, is highly avoided because it reduces frame rate. This is because, while capping your FPS, the graphics card waits to render the next frame. Any reputable tweak guide advises disabling VSync for this reason. To quote http://www.tweakguides.com/,
"When VSync is enabled (ticked), your graphics card will synchronize with your monitor and only display whole frames - this means your maximum FPS will be capped at your monitor's maximum refresh rate at your chosen resolution, and more importantly in some areas your FPS may drop by as much as 50% if the graphics card has to wait to display a whole frame. Thus enabling VSync can have a major negative performance impact."
Furthermore, everyone knows that FPS is not a static number; it can fluctuate heavily depending on the situation. So even though the monitor is capped at 60-85 Hz, the frame rate can fluctuate from 40 to 80 or even more. In a competitive first-person shooter such as CS:Source, no one wants to be put at a disadvantage because of their frame rate, and that's where having a high average frame rate helps. If someone's average FPS is capped at 60 because of VSync, during a large-scale firefight it could easily dip into the 40s. However, if the average frame rate is 120, the player will have comfortable fluctuations of 80-140. -
So pretty much, if your frame rate is above 60, you want to try to lower it to match the 60 Hz screen refresh rate, otherwise you experience tearing. In Half-Life 2 I get around 80-90 FPS, which makes tearing noticeable, so I turn on VSync. If it were possible to get exactly 60 Hz for both the screen and the graphics card without fluctuation, games would be so much better. In Guild Wars I achieve this most of the time and it plays so smoothly, occasionally dropping +-10 either way.
Where you don't need to turn on VSync is if your frame rate is below 60, because then VSync locks the frame rate to 30 FPS, which in some cases is bad, especially in games with fast motion. 30 FPS is fine in games where the camera is fixed, but it can feel like a slowdown in other games.
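That 30 FPS lock follows from how plain double-buffered VSync works: a finished frame has to wait for the next refresh, so the effective frame rate gets rounded down to an integer division of the refresh rate (60, 30, 20, 15...). Here is a rough back-of-the-envelope sketch in Python of that quantization, assuming simple double buffering and no triple buffering:

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ   # about 16.7 ms per refresh

def vsync_fps(render_time):
    """Effective FPS with double-buffered VSync: each frame is held back
    until the first refresh after rendering finishes."""
    intervals = math.ceil(render_time / REFRESH_INTERVAL)
    return REFRESH_HZ / intervals

print(vsync_fps(1 / 70))   # GPU could manage 70 FPS -> displayed at 60
print(vsync_fps(1 / 59))   # GPU manages only 59 FPS -> displayed at 30
print(vsync_fps(1 / 25))   # GPU manages only 25 FPS -> displayed at 20
```

Triple buffering, where the driver supports it, avoids most of this penalty, which is why some guides suggest enabling it alongside VSync.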
It's all about balance. If the tearing is annoying, turn VSync on; if it isn't, leave it off. I find that if you turn VSync on in Counter-Strike Source, it takes longer for the crosshair to contract when crouching, making accurate aiming take longer. -
The thing is, though, in games like HL2, Oblivion, or any other single-player game, an FPS difference of 60-100 won't matter. However, playing a multiplayer game such as Unreal Tournament or CS:S, it's advantageous to turn VSync off.
For example, even if your GFX card is capable of hitting a consistent 100 FPS, if you hit an intense firefight or explosion, the frame rate could easily drop to less than 60. With VSync off, the monitor would display frame rates in the 40's or 50's, still allowing fluid gameplay. However, with VSync on, even at 59 FPS, the monitor will only display 30. Thus, granted that most of the time 40 frames are being wasted, it's still advantageous to keep VSync off so that you don't get killed every time something blows up because your aim gets sluggish. -
In C&C 3 the low FPS really does seem to slow down the game a bit, which is strange. But maybe this impression comes from the animation not being that fluid (this game is capped at 30 FPS, so low frame rates are LOW frame rates, below 20 FPS).
-
All I need is 24 FPS and I'm fine (the FPS a movie runs at). Anything more than that... don't really care.
-
Let's talk about anatomy and physiology...
How many FPS can the human eye see? Remember that the eye works through chemistry (rods and cones)... when you watch sports, martial arts, or a car race, you may sometimes see blur...
It's a similar thing with sound: very low or high frequencies can't be heard by the human ear, while some animals can hear them (special whistles for dogs).
My point is, maybe the cap in this issue is the human itself? It doesn't matter if your game (software and hardware) can produce lots of FPS if your eye is comfortable with less...
Just to add something to the thread =)
(The actual answer is very interesting! Search for it in good places, don't just Google it... and maybe we should "vsync" the games with our eyes? lol) -
masterchef341 The guy from The Notebook
You can see the difference between 30 and 60 FPS.
http://www.tweakguides.com/files/FPSCompare_v05_beta.zip -
Very generically, frame rate differences are easily noticeable by a human from 0 to 30 FPS; between 30 and 60 there is a slight variance, and the human eye can go as high as 80. Obviously a military sniper will see better than someone's grandma, but these numbers are averages. Thus it's recommended to game competitively at a minimum of 60 FPS. However, this doesn't have to be followed, and one could easily game at 30 FPS; you just have to put up with an occasional stutter.
-
fabarati said: ↑ Humans don't see the world in FPS, they see it continuously.