Attack of the Benny Hill music!
May 9, 2007 9:07 AM   Subscribe

Why are old film clips always played at super-speed, in spite of the ease with which this could be fixed?

Whenever you see old film clips, people walk around at super-high speed, as if there were amphetamine in the water supply up until around 1940. I always assumed the technical reason for this was that film used to be captured at a lower frame rate than those in common use today, so playing it back at current frame rates speeds it up.

However, it's now trivial to correct the speed of a clip - I can do it in a few seconds on my home computer. Given that, why are old clips still replayed too fast? Is there some overriding technical reason I'm not aware of? Is it purely a cultural phenomenon, a marker that says "this is an old movie"? It seems somewhat in poor taste when one has documentaries featuring, e.g., emaciated Holocaust victims strutting around camps in fast motion.
posted by dmd to Media & Arts (15 answers total) 6 users marked this as a favorite
I've never actually tried slowing down one of those old clips, but I just wonder whether they'd look good, slowed down to normal speed. Would you lose the "persistence of vision" effect that keeps it from being choppy?

The reason I've always understood they're shown like that is that the cameras weren't fast enough to expose the film at 24 fps or so (whatever the prevailing projection rate was at the time), and the film was then projected at a speed sufficient to give persistence of vision (so it didn't look choppy or like a flip-book). Hence people move quickly, but this was considered better than choppy video.

But that was all 1900-1920s technology; there's no reason why film clips from WWII should be like that. Portable movie cameras in the 40s were more than capable of shooting decent framerates, unless a particular clip was just captured with some very, very old gear. But in general, only a very small minority of stuff from WWII ought to look like that. (Think of all the newsreels and other propaganda stuff -- none of it is undercranked like that.)
posted by Kadin2048 at 9:15 AM on May 9, 2007

Some information can be found here, and better yet a big article on the subject. A good Google search too. Apparently a lot of silent films were actually designed to be shown at a slightly fast speed. The 24 fps rate for both filming and playback is just what you're used to.
posted by zek at 9:22 AM on May 9, 2007

I'd guess that it's partly for reasons of artistic integrity on the fictional side, and the ethics of altering documentary journalism. It's just a frame rate change, I agree, but any change to a work constitutes a breach in both regards, even if it's just an attempted restoration.

Also, I'd imagine there's some value in preserving things the way people are now used to seeing them, rather than what may seem more 'realistic'.

The third would be technical. I'd think there'd be problems with the lack of data between frames, which would cause strange visual effects. I've slowed a few CG videos down myself in the past, and even with advanced video editing software you still get ghosting in between frames. This might be enough to deter people who are more concerned about values, integrity, etc.
posted by jimmythefish at 9:23 AM on May 9, 2007

Given that, why are old clips still replayed too fast?

Because a) people are lazy and b) when you're playing an old clip, part of showing it in the first place is the nostalgia factor, which means you show the old clip, warts and all, to give the audience the flavor of what it was like to view the old clip at the time when it was new.
posted by frogan at 9:24 AM on May 9, 2007

i might not be right, but i bet it has something to do with the number of frames per second (fps) used for the original shot. i seem to recall that old movie cameras mainly took pictures at 18 fps, although the standard today is 24 fps.

running a clip that was originally 18 fps at 24 fps for viewing means a second of screen time that was meant to have only 18 frames now has 24--so a 'second' for an 18 fps camera would be .75 seconds on a 24 fps playback system. the projector compensates by playing ahead--it takes the first six frames from the next second of film. when viewed, things appear to speed up, because the film is being shown in a shorter amount of time than it took to actually shoot.
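That arithmetic can be sketched in a few lines of Python. This is just an illustration of the ratios described above, not anything from the thread; the function names are made up for the example.

```python
def playback_speedup(shot_fps: float, projector_fps: float) -> float:
    """Apparent speed multiplier when film shot at shot_fps
    is projected at projector_fps."""
    return projector_fps / shot_fps

def screen_seconds(real_seconds: float, shot_fps: float, projector_fps: float) -> float:
    """How long a stretch of real time occupies on screen:
    the frames captured divided by the projection rate."""
    return real_seconds * shot_fps / projector_fps

# One second of action shot at 18 fps yields 18 frames; a 24 fps
# projector burns through them in 18/24 = 0.75 s, so the action
# runs 24/18 = 1.33x faster than life.
print(playback_speedup(18, 24))    # ~1.33
print(screen_seconds(1.0, 18, 24)) # 0.75
```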

the opposite is also true. to slow down action, you add more fps by speeding up the camera.

i would guess that with digital advancements, this factor could be more and more easily correctable for old film and video.
posted by lester at 9:26 AM on May 9, 2007

This is actually the subject of a lot of controversy among film-aficionados.

You can read some more about 'right' projection speeds here:

Wikipedia isn't up to snuff on this subject; it's actually often wrong.

This doesn't actually answer your question yet, and I suspect there may be many answers. I can think of two reasons:

1) Money - dumping a movie on a dvd or other medium instead of applying some TLC is cheaper. This applies even more so when it comes to hand-cranked films, where there is no perfectly constant framerate.

2) The frame rate of older movies was significantly lower than 24fps, roughly the lowest speed at which humans can't distinguish separate frames. Playing films at 14-18fps will introduce a very obvious flicker in the image (again, mostly fixable, but at a cost), so the choice might be made to speed things up rather than have this strobe effect.
posted by Grensgeval at 9:30 AM on May 9, 2007

Cameras of those eras shot at 12-16 fps. Seconding the 'reduce flicker' explanation.
posted by damn dirty ape at 9:41 AM on May 9, 2007

Interesting article on early film frame rates.

Basically, the issue is that early film cameras were hand cranked, so the frame rate captured depends on who is behind the camera, and there were various "standards" for the fps on projectors, which are also often slow enough to look choppy.
posted by teg at 9:52 AM on May 9, 2007

Best answer: there are several reasons for this:

1. the old clips used to be run in projectors at that speed. what you're seeing is what they used to see back in the day.

2. it's not really that film used to be projected at a different speed (although it was, that's not why it's fast.) it's more that film used to be shot at a different speed than it was projected at. i know, that sounds weird. why do that? the answer is that, in the old days (the days that look the way you're talking about), the projectors were automatic, and the cameras, for the sake of cameraman control, were manually cranked. that's right, every frame shot was some dude manually rotating a lever to move the film past the lens. why? because sometimes the director wanted to slow something down or speed it up for the sake of stunt work or the like. you can see this in the original Nosferatu with Max Schreck. There are moments where they shot faster than projection speed to make certain movements look more sinister/ethereal during projection. The technology at the time was limited enough that the only way to shoot at different speeds was to crank 'em by hand. the downside is that the cameramen had no way to be sure they were cranking at projection speed. like i said, the way you see it is how an audience saw it back in the day.

3. So, film isn't quite like video. When you watch old film on your tv, you're watching an entirely different way to display moving pictures trying to accommodate the way film displays moving pictures. Here's a wiki article on telecine, which is the process by which film is transferred to video. Read up on that to see the complexities involved in converting frame rates and non-interlaced media to interlaced media. pay special attention to the graphic that shows the 4 film frames "ABCD" becoming the five video frames "AB(BC)(CD)D." Now, when you're editing this video version of your film footage, there is no longer a well-defined C frame: it's split across two other frames to keep the speed consistent. so, retiming that clip is a very sketchy proposition. if you slow it down, what happens to those blended BC and CD frames? they probably shouldn't be split further; they should be left alone and have the other frames stretched around them, since they're already technically 2 times slower than the A, B and D frames. But your editing application doesn't necessarily know where the C frames are. It just reads frames as frames and doesn't (except in certain circumstances) differentiate between A, B, C and D frames. it has no way, in other words, to know to retime those C frames differently. you'd wind up with C frames stretched across 4 other frames, blending with A frames, B frames, and multiple D frames on either end of the set. It's a mess to do it in a video editor, so it's actually not that easy.
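The "AB(BC)(CD)D" pattern described above can be sketched as a toy function, using letters in place of actual frames. This is only an illustration of the mapping, under the assumption of the plain 4-to-5 blended pulldown the post describes:

```python
def pulldown(film_frames):
    """Map every 4 film frames to 5 video frames the way the
    diagram shows: A, B, B+C, C+D, D. "Blending" here is just
    string concatenation, since frames are labels."""
    video = []
    for i in range(0, len(film_frames) - len(film_frames) % 4, 4):
        a, b, c, d = film_frames[i:i + 4]
        video += [a, b, b + c, c + d, d]
    return video

print(pulldown(["A", "B", "C", "D"]))
# ['A', 'B', 'BC', 'CD', 'D'] -- the C frame only ever exists
# smeared across two blended video frames, which is why naive
# retiming of the video version gets messy.
```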

4. The alternative is to get a film slowdown processed by the lab that telecined your film to video in the first place, and then have them telecine that slowed-down version and give you a tape of that directly. this is a complicated, pain-in-the-ass and very expensive process that is almost certainly not worth it just to show an old film at proper walking speed. why bother faking what no one ever saw in the first place at that kind of expense?

5. the last reason is something you have to see to understand. if you have any video you can do this with, try it at home. put your video into your editor, and speed it up to double speed. make sure that your editing software is set not to use anything like "frame blending" or interlacing interpolation of any kind, because that stuff doesn't work with film, which isn't interlaced. Now export that as a new video. Now bring that newly exported video (do not skip the export step, it makes a difference) into its own sequence and slow it down to half speed. That's what your old videos will look like: walking at the right pace, but all jittery and skippy. All that pain, expense and difficulty would result in a video making the old films look kind of crappy instead of just quirky.

hope this makes sense and answers your question.
posted by shmegegge at 10:19 AM on May 9, 2007 [4 favorites]

There is a technique for converting 24 frames/sec film to the 29.97 frames/sec. of television called pulldown.

Something similar can be used on old film. I have a DVD of Nosferatu (1922) which has been processed in this way, and for the most part, it is very good.

However, some parts of such corrected films are still a bit wrong. Pull-down can only compensate for so much under- or over-cranking before the eye starts to perceive it as flicker or shuddering. Some movements start to "throb" or "pulse" at extreme levels of correction.

Additionally, once corrected, a film that was deliberately under- or over-cranked (as a special effect) has lost some of the original director's vision.

People processing films have to decide whether the pull-down artifacts are less objectionable than running the film too fast.

I recall some documentaries on WWI which made use of many Pathe newsreels that had been re-timed in this way, and they were much more dramatic than they had been when I had first seen them at Keystone Kops speed.
posted by Crosius at 10:22 AM on May 9, 2007

Best answer: Actually, for many years now, it's been fairly trivial to create smooth "in between" frames that didn't originally exist, using optical flow motion estimation algorithms. There are a myriad of off-the-shelf products now that can achieve this, including Furnace by The Foundry, Adobe After Effects 7 (which uses Furnace's Kronos optical flow engine), Shake 4 and Realviz ReTimer.

"Old school" frame blending the way shmegegge describes it (which really isn't true interpolation; it's simply blending existing frames/fields together to fool your eye into perceiving smoother motion, sort of like the motion equivalent of anti-aliasing) is quickly becoming passé even among consumer/prosumer apps like Final Cut Pro and Motion, which will get optical flow features in the Studio 2 update this month. It's already been passé in professional apps for several years now.
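The frame-blending technique contrasted with optical flow above can be illustrated with a toy sketch. This is plain Python with no real video I/O; "frames" are just lists of pixel brightness values, and the function names are invented for the example:

```python
def blend(frame_a, frame_b, t=0.5):
    """Cross-fade two frames; t is the mix toward frame_b.
    This is all frame blending does -- a weighted average."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """Insert one blended frame between each pair of originals."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out += [a, blend(a, b)]
    out.append(frames[-1])
    return out

# A one-pixel "video" of a brightening dot. Blending just averages
# values, so a moving object shows up as a ghostly double exposure
# rather than at a motion-tracked in-between position, which is
# exactly what optical flow retiming improves on.
print(double_frame_rate([[0], [100], [200]]))
# [[0], [50.0], [100], [150.0], [200]]
```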

It's interesting this was posted right now, because I was just working on a few shots for a commercial that required exactly what I described. In this case, the idiot director shot a rotating product shot on a turntable, but failed to shoot the rotation at different speeds. So all I got was a crazy-fast rotation that didn't work, timing-wise, with the actual edit in post.

Thankfully, when I brought the shot into Shake, I slowed it down to one-fifth its normal speed, and Shake managed to create all the "missing" in-between frames using optical flow, which saved the shot from the "jitter" artifacts shmegegge described and from being completely unusable. And that was just using the default retiming settings. You can tweak the parameters to get some truly miraculous results.

So I guess this is a roundabout way of saying that everyone here is mostly right. Old film footage is almost always presented with the Benny Hill effect because slowing it down makes the footage look jittery, and applying frame interpolation to that slowed-down footage would remove a lot of the psychovisual cues that tell most people they're watching a really old film. It's the same reason 24p is such an important frame rate for most film directors making the transition to HD. 30p, and any interlaced format like 30i or 60i, just looks wrong, even though it's technically superior to a low 24fps frame rate. You lose a lot of the "magic" with higher frame rates, and I would hazard a guess that smoothing out old film footage with optical flow would just seem way too creepy to our modern eyes.

There's also a technical reason why more old footage isn't retimed with optical flow. The algorithms don't do well with footage where there are a lot of changes from frame to frame. In the case of old film, it's the film scratches, hair, gate weave, etc. that tend to confuse optical flow algorithms. It ends up not being worth the trouble to get all this old footage looking smooth and normal.
posted by melorama at 11:18 AM on May 9, 2007 [1 favorite]

I've seen old footage projected at "correct" speed (so that the motion of the subjects is lifelike), and it looked much better and more realistic to me, even if there was a slight flicker. I felt as if I were "seeing through" the visual artifacts and catching a real glimpse of the past. Not sure if everyone's psychological experience would match mine, however.

As melorama says, it's increasingly easy to do this kind of speed correction. I don't know why it isn't done more -- I assume it's a combination of laziness and our ingrained expectation that old films = sped-up motion.
posted by Artifice_Eternity at 11:46 AM on May 9, 2007

Old clips are played too fast, as:

They often 'were': they're usually between 12 and 18 fps, and when you play them back on a conventional projector at 24fps, they're sped up.

The transfer was done this way and is, in a sense, "burned in" - the videotape you're playing back was telecine'd incorrectly.

It's now a 'look' - the Keystone Kops movies were even more comical when shown incorrectly.

So, it's a technical f-up, combined with a 'look' you're familiar with.
posted by filmgeek at 12:09 PM on May 9, 2007

Response by poster: Wow. I didn't expect this level of detail. Thanks so much, everyone!
posted by dmd at 12:33 PM on May 9, 2007

Sorta related, but Herbert Morrison's famous play-by-play of the Hindenburg disaster was recorded on an out-of-whack machine that recorded too slowly, causing the playback to give his voice too high a pitch. It's interesting to hear it played back correctly and to hear what he really sounded like.
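The audio version of the same speed mismatch follows the usual tape relationship: pitch scales with the ratio of playback speed to recording speed. A small sketch, with entirely hypothetical speed numbers (the thread doesn't say how slow Morrison's machine actually ran):

```python
import math

def pitch_shift_semitones(record_speed: float, playback_speed: float) -> float:
    """Semitones a recording's pitch rises when tape recorded at
    record_speed is played back at playback_speed. Doubling the
    speed ratio raises pitch by one octave (12 semitones)."""
    return 12 * math.log2(playback_speed / record_speed)

# Hypothetical example: a machine running 15% slow makes normal-speed
# playback almost 3 semitones sharp -- enough to change a voice noticeably.
print(round(pitch_shift_semitones(0.85, 1.0), 2))
```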
posted by DieHipsterDie at 2:02 PM on May 9, 2007
