Why do video game framerates need to be so much higher than TV and cinema framerates?
It seems that video games need to have something like 60 frames per second in order to be smooth and realistic. But TV and films only have about 24 or 25. Why the difference?
1. Responsiveness of input
There is a big difference in the feel of gameplay when input and response happen only 24 times per second versus 60 times per second, especially in fast-paced games such as first-person shooters.
Network buffers and input buffers are filled on separate threads, which means new state from the game server, or button presses from your gamepad, must wait until the next iteration of the game engine's "update loop". This wait can be as long as about 42 ms at 24 updates per second, but only about 17 ms at 60 updates per second. That 25 ms difference is roughly a quarter of the 100 ms gap in "lag" we experience between a 150 ms server connection and a 50 ms one.
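The worst-case wait described above is easy to sketch: an input that arrives just after an update tick has to sit in the buffer for a full frame interval. A minimal illustration (the function name is mine, not from any engine):

```python
def frame_interval_ms(updates_per_second: float) -> float:
    """Time between update-loop iterations, in milliseconds.
    This is the worst-case wait for an input that just missed a tick."""
    return 1000.0 / updates_per_second

wait_24 = frame_interval_ms(24)  # ~41.7 ms
wait_60 = frame_interval_ms(60)  # ~16.7 ms
print(f"24 Hz loop: worst-case input wait {wait_24:.1f} ms")
print(f"60 Hz loop: worst-case input wait {wait_60:.1f} ms")
print(f"difference: {wait_24 - wait_60:.1f} ms")
```

On top of network latency, this per-frame wait is pure overhead, which is why competitive players feel the jump from 24 to 60 updates per second so strongly.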
I think there is a piece of history you’re missing here, so allow me to try and fill it in.
If you google 60fps vs 24fps you’ll find endless threads of people asking what the difference is. Most people will tell you that 24fps has been the standard since the 20s, but there is little explanation as to why.
If we actually look back to the creation of film, we will notice that 24fps has not always been the standard. Edison himself originally recommended 48fps, stating that "anything less will strain the eye." Edison's films, however, did not follow this standard, nor did they seem to be standardized at all (single films could vary by more than 10fps over their running time). American Mutoscope, one of Edison's rivals, actually used 40fps, but the resulting camera weighed almost a ton.
However, these fast-paced films used up too much film stock (a luxury at the time), and by Victor Milner's time the standard was 16fps. Beyond practical considerations, many film buffs actually critiqued films faster than 16fps as being "too fast." The Birth of a Nation, for example, drops as low as 12fps in some sections.
The major problem with the period between 1910 and 1920 was that film speed varied so much. Even a single cinematographer's frame rates tended to vary between films. By the mid-20s, camera operators had started to pride themselves on their even speed and on being able to approximate 16fps (which they more usually measured in feet of film). Meanwhile, theaters had started demanding faster and faster speeds. While 16fps may have looked more professional, in a crowded theater (or a small one) the audience seemed better able to discern the film at 24 frames per second.
When Annapolis was shot in 1928, the studio was mandating 24 frames per second, even though many film crews did not appreciate the more frequent camera reloads (at 16 frames per second, about 1,000 ft of film lasts roughly 16 minutes). By the time motorized cameras became common, 24 frames per second had become a de facto standard.
It's important to note that this was not a technical limitation, nor was it (usually) a financial one. It was the end result of two opposing forces (camera crews and actors vs. studios and theaters) desiring different speeds.
So Why not 60?
It's worth noting that many TV standards (e.g. NTSC) use 59.94 fields per second (60 Hz/1.001); since two interlaced fields make one full frame, that works out to 29.97 frames per second. The rate was originally chosen as 60 fields per second to match the 60 Hz power source found in the US, but when color was added, the rate was divided by 1.001 to remove intermodulation (beating) between the new signal components, which would otherwise appear as flickering.
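The arithmetic behind those oddly specific numbers is simple (a quick sketch; the variable names are mine):

```python
# NTSC rates: a nominal 60 Hz field rate, divided by 1.001 when
# color was introduced to avoid beating between signal carriers.
field_rate = 60 / 1.001      # ~59.94 interlaced fields per second
frame_rate = field_rate / 2  # two fields per frame -> ~29.97 fps
print(f"{field_rate:.4f} fields/s, {frame_rate:.4f} frames/s")
```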
There is some evidence to suggest that human visual acuity drops off sharply after about 30 frames per second, though most human beings can still detect discontinuities in the motion illusion up to 60-75 frames per second. What's more, there is a large body of evidence that the human eye can detect jitter at over 300 frames per second (Steinmetz 1996). So it's a decent question to ask: why not 60? 60fps itself is an artifact of a different technology (television using 30 frames per second with interlaced fields).
Ok, so we were forced into 60fps, why keep our 24 fps standard?
When making home movies first became a realistic possibility (read: VCR camcorders; my father had one for years, and the thing actually took a VCR tape and wrote to it), they were optimized for TV production (i.e. 60fps). As a result, home movies had a vastly superior frame rate to standard film. Unfortunately, this quickly became associated with amateur production (which most home movies were). Consider movies that feature footage shot on a handheld camera: most people can instantly discern the much faster rate, but more surprising is that most people will tell you it looks lower quality.
The truth is, we think of 24fps as looking better because we’ve been trained to.
A number of directors have tried to break away from 24fps (Peter Jackson shooting at 48, James Cameron at 60), but they are almost always forced to show these movies at the old standard. People just think it looks better. Film speed (like many things) is a social phenomenon.
I hate to cite the Wikipedia entry for frame rate, but it makes sense:
In modern action-oriented games where players must visually track animated objects and react quickly, frame rates of between 30 to 60 FPS are considered acceptable by most, though this can vary significantly from game to game.
Watching film and television is an extremely passive activity.
Playing a video game, on the other hand, requires active participation.
However, then you have How Many Frames can the Humans See? which notes that the real issue is motion blur.
If you could see your moving hand very clearly and crisply, then your eye would need to make more snapshots of it to make it look fluid. If you had a movie with 50 very sharp and crisp images per second, your eye would make out lots of details from time to time and you would have the feeling that the movie is stuttering.
Just think of modern games: have you ever played Quake at 18fps? There is no motion blur in those games, so you need many more frames per second.
I’m no expert on the subject, but here is why it makes sense to me that real-world recordings can run at fewer fps than animations can while still being higher quality: An animated frame shows a single instant in time while a recorded frame shows a small interval of time. This is not the same as just blurring parts of the picture with motion in it.
This is why: suppose the interval of time captured in a recorded frame is 1 millisecond, and suppose the universe runs at, say, 1 billion fps (the actual figure would be the Planck time, but let's not digress). Then the recorded frame is the average of 1 million points in time, so that one frame is actually based on a tremendous amount of information. In contrast, the animated frame has just the information from a single instant in time. So don't think of the recorded frame as just one frame; think of it as a summary of a million frames.
From that perspective it makes a lot of sense that animation must run at a higher fps than recordings need to. You could simulate the effect by running the computer at 1 billion fps and average that down to just 24 fps. I’m sure 1 billion fps would be overkill for that purpose, and it would be interesting to know at what point diminishing returns kick in. That might be a very low number like 60 or 100.
So recorded frames are more blurred than animated ones. The blur in recorded frames carries a lot of extra information about what happens between frames, while just adding blur to an animated frame removes information. This is similar to the difference between blurring and anti-aliasing, except that we are working with time instead of space.
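The "average many sub-frames down to 24" idea can be sketched with a 1D signal standing in for a moving image. This is a toy model under my own assumptions (the helper names are illustrative): a game-style frame takes one instantaneous sample per frame, while a camera-style frame averages many sub-instants across the frame interval.

```python
import math

def sharp_frames(fps, duration, sample):
    """Game-style rendering: one instantaneous sample per frame."""
    return [sample(i / fps) for i in range(int(duration * fps))]

def blurred_frames(fps, duration, sample, subsamples=100):
    """Camera-style exposure: average many sub-instants per frame."""
    frames = []
    for i in range(int(duration * fps)):
        t0 = i / fps
        vals = [sample(t0 + k / (fps * subsamples)) for k in range(subsamples)]
        frames.append(sum(vals) / subsamples)
    return frames

# A fast-moving "object": its brightness oscillates at 30 Hz,
# faster than the 24 fps frame rate can point-sample cleanly.
signal = lambda t: math.sin(2 * math.pi * 30 * t)

sharp = sharp_frames(24, 1.0, signal)     # crisp but jumpy samples
blurred = blurred_frames(24, 1.0, signal) # each frame summarizes an interval
```

The sharp samples swing wildly between frames (the stutter the quote describes), while the averaged frames are much tamer: each one encodes what happened across the whole interval, which is exactly the "summary of a million frames" intuition above.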
You need the higher framerates because the resolution and clarity of monitors is much higher than TV and cinema, and you are trying to spot any tiny movement or detail as it may be critical in-game.
TV and cinema have tried to rely on blurring, and for slow pans or conversation it works just fine but a segment with dramatic action is already a nightmare. The judder/blurring ruined Avatar for me on the big screen.
You can’t blur a game if you want to provide the clarity and detail people expect on a computer, so you have to provide more frames in order to keep the illusion of movement.
Even more than input responsiveness, resolution, and ad-hoc standards, there are the framerate dips.
I asked myself your question a long time ago, and in my own research I found this one of the most convincing reasons:
In games, a sudden heavy load on the processors can happen instantly and unpredictably. Imagine an ambush in an FPS: in one frame, suddenly, dozens of NPC models and AIs must be computed, plus bullets, and so on.
This causes one of the most annoying things you can encounter while playing: a framerate drop that makes your game almost unplayable. A single framerate dip can ruin a whole team's run in a professional tournament final.
Those framerate dips do not occur in television or films.
In short, my point is: it's not the average framerate that matters in gameplay, nor the highest peaks, but the possible dips that can ruin the experience.
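This is why benchmarkers report "1% low" figures alongside the mean: the average can hide exactly the dips described above. A small sketch with made-up frame times (the numbers are hypothetical, and this simple "slowest 1% of frames" calculation is one common convention, not a formal standard):

```python
# Hypothetical per-frame render times from a session:
# mostly smooth 60 fps frames, plus a few 80 ms spikes (an "ambush").
frame_times_ms = [16.7] * 95 + [80.0] * 5

# Mean fps looks respectable...
mean_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# ...but the fps implied by the slowest 1% of frames tells the real story.
n_worst = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms)[-n_worst:]
low_fps = 1000.0 / (sum(worst) / len(worst))

print(f"mean: {mean_fps:.0f} fps, 1% low: {low_fps:.1f} fps")
```

A game averaging a comfortable framerate can still dip into single digits at the worst moment, which is precisely when the player needs it most.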
First off, film went from 16fps to 24fps because of sound: sound couldn't stay synced at 16fps, so 24fps was made the standard. It has been the standard ever since and hasn't changed, largely due to cost. Now, in the digital era, it could change, but cinematographers believe 24fps just has that "filmic look." Some filmmakers are considering higher frame rates. Peter Jackson filmed The Hobbit at 48fps, but since films are played back at 24fps, only certain theaters could show the movie at 48fps to capture that extra quality; all other theaters showed a converted version in which the extra quality of 48fps can't be seen. If you play something shot at 48fps back at 24fps frame for frame, it plays at half speed.
So higher frame rates give more quality, and in computer gaming there is no 24fps standard to keep games from running at higher rates. If the film/TV industry wants to ascend to a new level of quality, everything has to conform to a new standard: shot at a higher frame rate, then played back at that higher frame rate.
There is a very good reason for this. You see, movies are videos, so they take up storage space. Naturally, this means that longer movies take up more space. Now, the way movies work is that they are rendered as frames, then played at a certain framerate. As a result, a 30fps movie is larger than a comparable 20fps movie.
Because the human eye cannot really distinguish between framerates above about 30fps, it makes sense to run a movie at a lower fps: the file is smaller, it renders more quickly, and people have trouble telling the difference.
However, video games work differently. They are rendered in real time, so more fps does not mean more storage space. Therefore, it is fine to run a video game at a higher fps. Gamers also like to boast that they get higher fps than others, even when it is hard for them to see the difference.
So, while movies tend to use a low fps, games run at comparatively high fps. After all, it doesn't really cost them anything, so there's no need to limit it.
To think about this much more simply: film, and especially animation, is made frame by frame. In animation, the animators must paint and detail each and every frame, and those frames are then played back at 24-25 frames per second. Because everything is made per frame, any mistake will show; across those 25 frames, an animator may effectively be repainting nearly the same image over and over, changing it only slightly from one frame to the next.
Games aren't built like film or cinema. Instead of pre-made frames, the image is one big real-time play-through of a scene. And since games involve much more software and code, certain steps have to be taken for motion to look smooth; in this case, that means increasing the FPS.