Why 24p? Is it intrinsically better than other frame rates?
September 13, 2009 7:50 AM   Subscribe

Why 24p? Is it intrinsically better than other frame rates?

A few related questions:

As I understand it, 24 frames per second was decided on because early filmmakers found that it was the slowest frame rate at which motion on film still appeared smooth, so shooting at 24fps allowed the most efficient use of film stock. Am I right about the history here?

Yet now everywhere I look in the digital video world, 24p is assumed to be superior to any other frame rate. It seems to me that a higher frame rate would capture motion more effectively, since 24fps was originally chosen to save film, not because it looked the best. Is there any reason besides tradition that people choose 24p over other frame rates?

Lastly, what is the highest frame rate that humans are still able to distinguish? For example, people can tell the difference between 24fps and 30fps, but can they tell the difference between 1,000fps and 10,000fps?
posted by afu to Technology (17 answers total) 4 users marked this as a favorite
 
24p is widely considered the preferable frame rate because so much existing material was shot on film. Leaving content at 24p avoids nastiness like interlacing, frame interpolation, etc. If everyone switched to shooting at 60p, then 60p would be preferable. 24p also has a certain film "look": it looks just unrealistic enough to remind people that they're watching a movie. Some people prefer that; others curse the judder when they watch it on a 60 Hz display.
posted by Inspector.Gadget at 7:57 AM on September 13, 2009


Quote: As Charles Poynton explains, the 24 frame/s rate is not just a cinema standard, it is also "uniquely suited to conversion to both 50 Hz systems (through 2:2 pulldown, 4% fast) and 59.94 Hz systems (through 2:3 pulldown, 0.1% slow). Choosing a rate other than 24 frame/s would compromise this widely accepted method of conversion, and make it difficult for film producers to access international markets."
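
To make the cadences concrete, here's a toy Python sketch (letters stand in for film frames; real pulldown operates on interlaced fields of actual image data):

    def pulldown_2_3(frames):
        # 2:3 pulldown: each 24p frame alternately spans 2 then 3 interlaced
        # fields, so 4 film frames become 10 fields (5 interlaced video
        # frames): 24 fps * 10/4 = 60 fields/s (59.94 after NTSC's 0.1%
        # slowdown). 2:2 pulldown for 50 Hz is simpler still: every frame
        # becomes 2 fields, with the whole film run 4% fast at 25 fps.
        fields = []
        for i, frame in enumerate(frames):
            repeats = 2 if i % 2 == 0 else 3  # the "2:3" in the name
            fields.extend([frame] * repeats)
        return fields

    print(pulldown_2_3(["A", "B", "C", "D"]))
    # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] -- 10 fields from 4 frames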
posted by Brian B. at 8:00 AM on September 13, 2009


24p is so popular in digital because people are trained to think that it just looks closer to film than "TV." While 60i/30p has smoother motion, people associate it with "cheap" TV or consumer gear. Low-budget shooters want their output to "feel" like film; thus, 24p. See also: the wide variety of "film look" color correction software that sorta works (Magic Bullet), and crazy-but-cool things like the 35mm-to-digital adapters that people use to produce a more "film-like" depth of field with non-film cameras.
posted by Alterscape at 8:00 AM on September 13, 2009


Response by poster: It is also "uniquely suited to conversion to both 50 Hz systems (through 2:2 pulldown, 4% fast) and 59.94 Hz systems (through 2:3 pulldown, 0.1% slow)."

That goes against everything I have read. People hate both 2:2 pulldown (the 4% speedup and the resulting change in audio pitch are noticeable and annoying to many) and 2:3 pulldown (judder).

It's the future! Why not shoot in 300p, which would convert perfectly to both 50 and 60 Hz?

Guess it is just tradition.
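
(Though to be fair, the conversion math at 300fps really would be trivial; a quick Python sketch, helper name made up:)

    def decimate_300p(frames, target_fps):
        # 300 divides evenly by 50, 60, 25, and 30, so conversion is just
        # keeping every Nth frame -- no pulldown, no 4% speedup.
        assert 300 % target_fps == 0
        return frames[::300 // target_fps]

    one_second = list(range(300))              # 300 frames = 1 s of 300p
    print(len(decimate_300p(one_second, 50)))  # 50: keep every 6th frame
    print(len(decimate_300p(one_second, 60)))  # 60: keep every 5th frame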
posted by afu at 8:33 AM on September 13, 2009


I hate sub-par frame-rates and am frequently frustrated that nobody wants to move film into more realistic speeds. 24p looks cheap and poorly animated to me. If I were playing Bioshock at 24p, I would be sadly disappointed. There are video comparisons of 24p versus 60p out there, and the difference is striking. Some of us grew up in a world largely devoid of artifices like film grain and low fps, and find the idea that it looks "better" laughable.

Some new televisions with higher refresh rates attempt to correct for poor framerates with anti-judder technology that basically smooths out the video stream. I find that movies look much better when this setting is turned on.
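
Conceptually those sets synthesize in-between frames. A naive blend-based Python sketch (actual TVs estimate motion vectors rather than cross-fading, so treat this as illustration only):

    import numpy as np

    def interpolate(a, b, n_between):
        # Make n_between synthetic frames between frames a and b by linear
        # blending. 24p on a 120 Hz panel needs 4 new frames per source
        # frame, since 120/24 = 5 display slots per film frame.
        weights = np.linspace(0, 1, n_between + 2)[1:-1]
        return [(1 - w) * a + w * b for w in weights]

    a, b = np.zeros((4, 4)), np.ones((4, 4))  # toy 4x4 grayscale "frames"
    print(len(interpolate(a, b, 4)))          # 4 in-between frames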
posted by Phyltre at 8:43 AM on September 13, 2009


Pretty sure they use it so there are only a few reel changes during the showing of a two-hour movie.

Also, more "realistic" frame rates are just that: more revealing of flaws in the actors' faces and skin, amongst other things. Movies aren't trying to be real; they're trying to be better than real.
posted by Ironmouth at 9:10 AM on September 13, 2009


I hate sub-par frame-rates and am frequently frustrated that nobody wants to move film into more realistic speeds. 24p looks cheap and poorly animated to me. If I were playing Bioshock at 24p, I would be sadly disappointed.

This is about film, not video games. Film compensates for its comparatively low frame rate with its built-in motion blur. If you look at the actual film frames for a fast action sequence, the moving objects will be very blurry, just as they would if the objects were moving in front of you in real life.

Compare that to a video game where, if you looked at the individual frames, the fast-moving objects would appear perfectly sharp, as if frozen. Thus, video games use a much higher frame rate in order to produce motion blur in the eye rather than having moving objects pre-blurred. There is some support for motion blur in computer graphics, but it's expensive to calculate, so most games rely on a higher frame rate.

Here is a good overview of both film and computer animation motion blur and why it's so computationally expensive to simulate.
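
To see why it's expensive, consider the brute-force approach: render the scene several times within the shutter interval and average the results. A Python sketch (the toy render() and its numbers are stand-ins):

    import numpy as np

    def motion_blurred_frame(render, t, shutter=1/48, samples=8):
        # A 24fps film camera with a 180-degree shutter exposes each frame
        # for 1/48 s; that exposure is where film's built-in blur comes from.
        # The brute-force CG equivalent samples several instants across the
        # shutter interval and averages them -- i.e. 8x the rendering cost.
        times = [t + shutter * i / samples for i in range(samples)]
        return np.mean([render(ti) for ti in times], axis=0)

    def render(t):
        # toy "scene": a bright pixel moving right at 96 px/s on a 1D strip
        frame = np.zeros(16)
        frame[int(96 * t) % 16] = 1.0
        return frame

    print(motion_blurred_frame(render, t=0.0))
    # the pixel's energy is split across two positions instead of one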
posted by jedicus at 9:31 AM on September 13, 2009 [5 favorites]


Note also that, in high-FPS-preferred video games, you're typically not just watching for the sake of seeing what's going to happen, you're watching for the sake of aiming, dodging, etc. at high speed. Fast cuts between blurry shakycams were annoying enough when watching Jason Bourne try to survive a fight, but the important part of those scenes was the tension, not the specific geometry of the action. In a video game the tension comes from your interaction with such scenes, and making it hard to see exactly what's happening in the fraction of a second when you need to react would be outright unacceptable.
posted by roystgnr at 10:08 AM on September 13, 2009 [1 favorite]


Actually, 18 fps was the slowest frame rate at which motion appears smooth. 24 fps is the slowest frame rate that allows for sync sound.
posted by infinitewindow at 10:23 AM on September 13, 2009 [1 favorite]


Actually, 18 fps was the slowest frame rate at which motion appears smooth. 24 fps is the slowest frame rate that allows for sync sound.

Incidentally, this is why Chaplin and other old clips often appear so fast and choppy: they were shot at 18 fps but are played back at 24 fps in most archive footage, so the action runs about 33% fast.
posted by Brian B. at 10:32 AM on September 13, 2009


It's the future! Why not shoot in 300p, which would convert perfectly to both 50 and 60 Hz?

Or we could shoot at 50 *and* 60 fps (drop the frames that aren't on the 50 or 60 grid; 100 fps, I guess). Shoot, I should patent that.
posted by Palamedes at 11:02 AM on September 13, 2009


There are shutter speed and exposure considerations as well that limit the speed of capture: the higher the frame rate, the shorter the maximum exposure per frame, so you need more light.
posted by Large Marge at 1:16 PM on September 13, 2009


A few considerations...

P for Progressive - that's a big part of it. Progressively sampled video generally delivers greater clarity than interlaced video; it is effectively higher resolution.

24fps has a few pros: it is the existing standard for motion picture film (which, as has been pointed out, is a holdover from a long time ago), and people often feel its unique motion characteristics lend a more filmic quality to the image.

24fps is also the only existing frame rate standard with a very clear path into existing TV standards.

Personally I am not a big fan of 24p (or 25p, as is common in the PAL world I work in). I find the decreased temporal resolution unpleasant, and don't think the increased apparent spatial resolution balances it out.

Now, higher-rate progressive standards are a different issue! 50/60p are much better looking than their interlaced equivalents, but there is a huge raft of technical reasons they are not as commonly available yet. Ideally 1080p 50/60 will become the standard in the next few years. Presently 720p is the best we can really offer consistently at 50/60fps rates.
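
To picture the interlaced/progressive difference: interlaced video transmits each frame as two half-height fields sampled at different instants, while progressive sends every line of every frame. A toy Python illustration:

    import numpy as np

    def to_fields(frame):
        # Interlacing: even scanlines go in one field, odd scanlines in the
        # other, and the two fields are captured 1/50 or 1/60 s apart. Each
        # field carries half the vertical resolution of the full frame, which
        # is why progressive video resolves fine detail (and motion) better.
        return frame[0::2], frame[1::2]

    frame = np.arange(12).reshape(6, 2)  # a toy 6-line "frame"
    top, bottom = to_fields(frame)
    print(top.shape, bottom.shape)       # (3, 2) (3, 2): half the lines each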
posted by sycophant at 3:20 PM on September 13, 2009


Well, it's worth talking about Douglas Trumbull. He created something called Showscan, which shot and then projected large-format film at 60 frames per second, and the big, bright pictures looked nearly real. So something near 60 frames (perhaps 120?) would be needed; anything above that might not make any real difference. Some of the rides at Disney/Universal use derivatives of this technology.

Of course, there must have been technical limitations - the tensile strength of film, the ability to build a projector/camera mechanism to push film that fast, etc. But 60fps looks more real....

And that's a problem. We're used to and educated around 24 frames.
• We are 'used' to the strobing and motion blur from 24. We like that look - have you ever seen a child watch a movie for the first time? It blows television away; it's huge and moving, etc... By the time we're in our teens, film=big deal/money and TV/Internet=cheap.
• We have established workflows (whether some like it or not) to NTSC and PAL (in HD or SD)
• The video decks that do this work well downwards (from film in post)
• The video cameras fuel the fantasy that you can shoot video (cheap) and one day have a theatrical distribution (on film)

Some other things that occur to me...
First, if we were doing any effects work, 60 frames is 2.5 times the work of 24. Nearly every 'major' film has rotoscoping work done (drawn using computers nowadays). You're talking about two and a half times the work to be done.

Our data storage? At huge frame sizes? Yeah, best triple your data path and storage too. That means you'll have to buy tons of new hard drives and have huge data paths; uncompressed HD runs around 700 Mb/s, about 6 gigs a minute, so you're going to need some huge, superfast RAIDs. Full film (4K) quadruples that.
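
Rough numbers (a back-of-the-envelope Python sketch; 2 bytes/pixel approximates 8-bit 4:2:2 sampling):

    def uncompressed_rate(width, height, fps, bytes_per_pixel=2):
        # returns (MB/s, GB/min) for uncompressed video at the given rate
        mb_s = width * height * bytes_per_pixel * fps / 1e6
        return mb_s, mb_s * 60 / 1000

    for fps in (24, 60):
        mb_s, gb_min = uncompressed_rate(1920, 1080, fps)
        print(f"1080p{fps}: {mb_s:,.0f} MB/s, {gb_min:.1f} GB/min")
    # 1080p24: ~100 MB/s, ~6.0 GB/min; 1080p60 takes you to ~15 GB/min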

A major upside of 24: it gives us fewer frames, which means smaller files (for compression), or the same total size (as, say, 30fps) with 25% more data per frame. DVDs can store 24p material and add the necessary pulldown on playback for PAL and NTSC TVs.

Look, all of it comes down to this: we can't just jump in with the perfect system; look how difficult the digital conversion was in the States. You're talking about distribution; that's how all these systems make money. We're mired in the foundations (30 interlaced frames running at 60 Hz for timing; ditto 25/50 for PAL). Film was 4:3 until the proliferation of TV; now it's wider, usually way more than 16x9.
posted by filmgeek at 7:04 PM on September 13, 2009 [2 favorites]


Retinal/optic nerve refresh rate is between 40 and 80 fps for most people, so 100 fps gives you a nice buffer; anything higher would be overkill.
posted by jrishel at 7:47 PM on September 13, 2009


I have to deal with this question constantly when I'm planning shoots. I have a film background but work in television, and I have heard most of the arguments for and against it. I haven't had any serious workflow issues with 24p, so for me it all comes down to the look, and that really seems to be a matter of personal taste. I'm convinced that because of their experience with 24 fps in the movies, most people associate the film look with higher-end production. Think of the difference between the look of soap operas as compared to the look of high-end primetime dramas. I'm sure this will change as people's tastes and the technology evolve, but I don't think that will happen fast. For me, the progressive look is the winner.

Someone up above mentioned the TVs that smooth out the frame judder. I see this every time I go to Best Buy, and I'm shocked at how negatively I react to it. Just today, they were playing a Blu-ray copy of Pirates of the Caribbean with the smoothing on, and it just looked like crappy video to me. If I were the director, I would be SO pissed to have my film displayed that way. Just my taste, but, man, it bugs me.
posted by Noon Under the Trees at 8:39 PM on September 13, 2009


noon- I agree. The frame rate could be anything; the key is that it gets displayed at the rate it was shot at. The rest is history and compromises.

I can't wait for the day when a television will be able to just display whatever is sent to it. No pulldowns, no deinterlacing, nothing. If a film is shot at 13.5 fps, the TV will display 27 frames in 2 seconds.
posted by gjc at 9:38 PM on September 13, 2009

