Mare
I came across an interesting article on this topic...
cheers
Frames Per Second: Fact & Fiction
Date: August 31, 2002
Author: Mikhail Ivanenkov
--------------------------------------------------------------------------------
One day I got to thinking about all this FPS (Frames Per Second) business. There is so much talk about what's right, what's wrong, what can and can't be seen. Considering I knew next to nothing beyond personal experience, I thought I'd do a little research and offer a little fact and fiction. Many people argue about what makes a difference and what doesn't, specifically what the human eye can perceive. Some claim 24 fps, others 30, some 60, some 200, some even upwards of 2000 and above. Feel free to add any numbers in between. The truth of the matter is, every one of these people is right and wrong in their own way. Why? Because that's not how the brain works. Try as I may to make the information "flow", it's difficult because everything is intertwined. Therefore, I've provided a few categories with things to consider.
Motion Blur vs. Sharpness
Here's something interesting you can try right now. Take your mouse and shake it slowly. Now shake it really fast, until you can't make out the outlines of the buttons. What's the frame rate? Is it low, because it's blurry and you can't make out the features? Or is it high, because it doesn't look choppy (don't you think it would be really freaky if fast-moving objects appeared at one point and then at another, with nothing in between)?
Let me answer that for you: it's neither. Simply put, as far as our brains are concerned, frames per second don't exist. Hypothetically speaking, if you could still make out the individual outlines while shaking the mouse really fast, that would only mean your eye was taking more "still shots" to keep things sharp, and you'd have to shake it even faster to see smooth motion. You can see where the catch-22 comes in.
Television
The most common "frame rate" quoted for film and television is 24, with broadcast standards ranging from roughly 25 to 30. I use quotes because TVs don't work the same way computer screens do. A television doesn't show you crisp, discrete snapshots; each frame already contains a range of motion. Everything appears fluid because fast motion is blurred by the camera's exposure time. Notice how still shots of action scenes aren't the crispest images in the world? Wonder why. But then how come you can make out static details on the screen? Why can you see the same cracks in the stones watching The Matrix as you can playing Q3A? See below.
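As a rough illustration of that difference (the numbers below are my own assumptions, not the author's: a 180-degree shutter on a 24 fps film camera and an object crossing one screen-width per second), here's a tiny Python sketch comparing the smear baked into a single film frame with the clean jump between two consecutive game frames:

# Why 24 fps film looks smooth while a 24 fps game does not: a film camera
# integrates light over its exposure time, so fast motion is smeared within
# each frame; a game renders an instantaneous snapshot, so the same motion
# shows up as discrete jumps with nothing in between.

FILM_FPS = 24
EXPOSURE = 0.5 / FILM_FPS   # assumed 180-degree shutter: ~20.8 ms of exposure per frame
SPEED = 1.0                 # assumed object speed: one screen-width per second

blur_per_film_frame = SPEED * EXPOSURE   # fraction of the screen smeared inside one frame
jump_per_game_frame = SPEED / FILM_FPS   # clean gap between two consecutive game frames

print(f"film frame smear: {blur_per_film_frame:.1%} of the screen width")
print(f"game frame jump : {jump_per_game_frame:.1%} of the screen width (sharp, nothing in between)")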
Brightness vs. Darkness
Because of the way the world is set up, light is much easier to recognize than its absence. To put it bluntly, it's a lot easier to notice a flash of light in a dark room than a moment of total darkness in a bright room. The difference isn't apparent until the event time drops to hundredths of a second: on a flicker-free TV set you can't see the black between refreshes even though the refresh rate is only 100 Hz (100 times per second), whereas tests on Air Force pilots have shown they can not merely notice but identify the type of aircraft in an image shown for only 1/220th of a second. Furthermore, the eye's sensitivity isn't uniform across your field of view (you can't notice the flicker of a 60 Hz monitor looking at it head-on, but it's quite obvious out of the corner of your eye), which pushes these thresholds even higher. Just please don't get me started on the whole subliminal message trip.
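Just to put those two durations side by side (a quick calculation of my own, not from the article):

# One cycle of a "flicker-free" 100 Hz set vs. the flash shown to the pilots.
# Both last only a few milliseconds, yet one goes completely unnoticed while
# the other carries enough detail to identify an aircraft type.
refresh_cycle = 1 / 100   # 10.0 ms per display cycle
pilot_flash   = 1 / 220   # ~4.5 ms per image

print(f"100 Hz refresh cycle: {refresh_cycle * 1000:.1f} ms")
print(f"1/220 s flash:        {pilot_flash * 1000:.1f} ms")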
Computers and Monitor Refresh Rates
A lot of people despise vertical sync, as it caps your maximum frame rate at your monitor's refresh rate (assuming we're talking about a CRT). Some things to consider: is running a game below your CRT's refresh rate smooth? Or does it look better synchronized? Or is it best to run above it? What about multiples (e.g. 170 fps on an 85 Hz refresh rate)? Personally, I haven't noticed any difference. Just don't confuse that with variations in FPS. Take fluorescent lights, for instance. Why do so many businesses use them? Because they save energy. Why? Because they don't provide a constant stream of light.
Well, according to the average person they do, but in reality they're just like your monitor: flickering too fast for you to tell the difference. There used to be fluorescent lights that caused headaches, for the same reason you won't be able to play for long on a monitor with a 60 Hz refresh rate or while using one of those dinky 3D glasses. Current LCDs can't show much more than 30-40 distinct images per second because of slow pixel response, but they hold each image instead of flickering. So things may look blurry, but they won't be choppy. And if you crank up all the features, you won't be getting 100+ frame rates anyway. Just a thought.
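To make the vsync cap concrete, here's a minimal Python sketch, under my own simplifying assumption of plain double-buffered vsync (no triple buffering): a finished frame has to wait for the next refresh, so the displayed rate snaps to whole fractions of the refresh rate.

import math

def displayed_fps(render_time_ms, refresh_hz=85):
    # With double-buffered vsync the frame interval rounds up to a whole
    # multiple of the refresh period, so an 85 Hz CRT can only show
    # 85, 42.5, 28.3, ... frames per second.
    refresh_ms = 1000 / refresh_hz
    intervals = max(1, math.ceil(render_time_ms / refresh_ms))
    return 1000 / (intervals * refresh_ms)

for t in (5.0, 11.0, 12.0, 25.0):   # hypothetical render times in milliseconds
    print(f"render {t:4.1f} ms -> {displayed_fps(t):5.1f} fps displayed")

Which is also why missing a refresh by a single millisecond can cut the displayed rate in half.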
FPS in Games
This is, of course, all about games. The reason you can't run them smoothly at 24 fps is that each frame is just that: a still frame, not a blurred range. Now, let me pull a number out of the blue, say 40 fps. OK, so 40 fps is sufficient, right? Yes and no. Let's say you're the proud owner of a Geforce4 Ti4600. You don't care about 200 fps, so crank the resolution up to 1600x1200, turn on 4x FSAA and 16x anisotropic filtering, plus all the in-game candy you can throw at it. Your rates drop from 150 down to 40. Naturally, it looks like crap. Why? Several reasons.
The more detail you have, the more "sampling" your eye does automatically (or, more accurately, the more you strain it). Try this: find an older video card (or underclock what you currently have) that will only run 1600x1200 at 40 fps without any special features. Chances are it'll look smoother, because you don't have anisotropic filtering enabled. More important, however, is adaptation. If you've played games at 40 fps all your life and haven't seen otherwise, it'll look pretty good to you.
The same goes for the person who's had the luxury of 100. Now swap their positions. The 40 fps person won't notice much, because to them 40 fps was smooth as is. The unlucky one is the person downgrading; a dramatic difference is in store for him or her. This is one reason a lot of people claim that anything over 60 fps doesn't matter: they've never spent much time significantly above 60 fps. I bet if you played at 250 fps (possible, but only with older games at low resolutions) for a couple of weeks and then dropped to 125, you would see a difference.
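One way to see why dropping from 250 to 125 is noticeable (my own back-of-the-envelope math, with an assumed object speed, not the author's): each frame is a sharp still, so an object simply jumps between positions, and halving the frame rate doubles the size of that jump.

# Positional jump between consecutive still frames for an object that
# crosses the screen in half a second (an assumed, fairly quick motion).
SPEED = 2.0   # screen-widths per second

for fps in (40, 60, 100, 125, 250):
    gap = SPEED / fps
    print(f"{fps:3d} fps -> each frame jumps {gap:.1%} of the screen width")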
So is there a limit?
Technically, yes. Practically, no. The technical "limit" would be anything from the speed of light, to how long it takes light to reach your eyes, to how fast your brain can interpret it. Other things, like internal chemical reactions and the state of your neural connections, should also be considered. But still, 300,000,000 m/s is pretty fast. What am I trying to say? This:
THERE IS NO PROVEN LIMIT TO THE FRAME RATE THE HUMAN EYE CAN PERCEIVE.
So what's enough? 30 fps? 300? 3000? Who knows. On a more practical scale, 1024x768 with 2-4x FSAA and 16x aniso at 50-100 fps is just fine in most games. If you have any questions or comments, let me know here, post in the forums, or both. Thanks for reading and enjoy the site!
cheers