Discriminating video from film
June 24, 2010 9:40 AM   Subscribe

Why do movies recorded digitally look different from those on film?

I was watching two of my favorite movies (Atanarjuat [The Fast Runner] and The Gleaners and I [Les Glaneurs et la Glaneuse]) and I realized that two of my favorite movies were recorded on video. I can certainly tell the difference, but I don't know why. It doesn't seem like digital photography has the same limitations. If there's very little movement in a scene, it can be hard to tell the difference between video and film. I'm interested in the properties of the recording processes and how these affect the final product. Is it physically possible for video to mimic film?

Bonus points: Are there other kick-ass movies recorded on video?
posted by stinker to Technology (15 answers total) 4 users marked this as a favorite
From what I understand, digital is simply much clearer. Film is not as "true" as digital; however, that grainy quality of film is an aesthetic all its own.
posted by molecicco at 9:45 AM on June 24, 2010

Framerate alone will create a lot of that look. 24fps film has a very distinct feel to it versus higher framerate video; ever notice how low-budget daytime TV looks weird, even discounting the bad lighting?
posted by Inspector.Gadget at 9:46 AM on June 24, 2010

Run Lola Run is mostly shot on film but switches to video for present-tense scenes that Lola isn't in.
posted by Jaltcoh at 9:47 AM on June 24, 2010

"Recorded digitally" and "recorded on video" are not synonymous.
posted by rokusan at 9:52 AM on June 24, 2010 [2 favorites]

Film typically is shot at 24 frames a second. NTSC video is 60 half-frames (fields) a second (so: 30ish full frames) and HD video is often 60 FPS. This means film will show more blur during movement, which can create a sense of motion that some people like, but it does lose out on detail - even when panning and zooming. In fact, for things with lots of movement, it doesn't matter what resolution you shoot at if the framerate is too low - high-resolution blur isn't much different from low-res blur.
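To put very rough numbers on the blur difference (a sketch with illustrative values, not figures from the thread), you can estimate how far a moving object smears across the frame during one exposure:

```python
# Rough motion-blur estimate: how far does a moving object smear
# across the frame during a single exposure? Illustrative numbers only.

def blur_pixels(fps, shutter_fraction, speed_px_per_sec):
    """Blur length in pixels for one frame.

    shutter_fraction: portion of the frame interval the shutter is open
    (a traditional 180-degree film shutter is 0.5).
    """
    exposure_time = shutter_fraction / fps
    return speed_px_per_sec * exposure_time

# An object crossing a 1920px-wide frame in 2 seconds (960 px/s):
film = blur_pixels(24, 0.5, 960)   # 24fps, 180-degree shutter: ~20 px of smear
video = blur_pixels(60, 0.5, 960)  # 60fps, same shutter angle:  ~8 px of smear
print(round(film, 3), round(video, 3))
```

Same subject, same shutter angle, but the higher frame rate yields much shorter, sharper smears per frame - which is a big part of why fast motion "feels" different on video.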

The other difference is grain vs noise, which is also the biggest difference in still photography. With enough light, though, neither is all that noticeable.
posted by aubilenon at 9:55 AM on June 24, 2010

Is it physically possible for video to mimic film?

posted by griphus at 10:03 AM on June 24, 2010

Response by poster: "Recorded digitally" and "recorded on video" are not synonymous.

If you care to elaborate, what is the difference?
posted by stinker at 10:17 AM on June 24, 2010

Best answer: There are a few effects going on here between "film" and "video" (I will use these terms informally, back off pedants) that are most easily noticeable:

1. Framerate: Film is generally 24fps and video is generally 60fps or 30fps depending on how you count it. This has obvious effects during motion. A video sensor can be made to record at 24fps, so you can completely fake this part.
2. Grain: Film has grain effects whereas video has noise effects. A high-quality video sensor plus software can fake this.
3. Depth of field: Film frames are generally physically larger than video sensors; without getting into the optics of it, this means a shallower plane of focus for video than for film. This is why pictures taken with SLRs can more easily have nice, defocused backgrounds compared to point-and-shoots. This can be matched by a video sensor of appropriate size.
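To put rough numbers on the depth-of-field point (a sketch with illustrative lens values, not figures from the thread - note the larger format comes out with the *shallower* focus), the standard thin-lens approximation works like this:

```python
# Approximate total depth of field (near-field formula, valid when the
# subject distance is well inside the hyperfocal distance):
#     DoF ~= 2 * N * c * d^2 / f^2
# N: f-number, c: circle of confusion (scales with format size),
# d: subject distance, f: focal length. All values below are illustrative.

def depth_of_field_m(f_number, coc_mm, distance_m, focal_mm):
    d_mm = distance_m * 1000
    dof_mm = 2 * f_number * coc_mm * d_mm**2 / focal_mm**2
    return dof_mm / 1000

# Same framing of a subject 3m away at f/2.8:
# 35mm cine frame: ~50mm lens, CoC ~0.025mm
# small 1/3" camcorder chip: ~7mm lens, CoC ~0.0035mm (roughly 1/7 scale)
film_dof = depth_of_field_m(2.8, 0.025, 3.0, 50)   # ~0.5m in focus
video_dof = depth_of_field_m(2.8, 0.0035, 3.0, 7)  # ~3.6m in focus
print(round(film_dof, 3), round(video_dof, 3))
```

With the same framing and f-number, the small chip keeps roughly seven times more of the scene in focus - which is why defocused backgrounds are so much harder to get from a camcorder.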

There's a lot of stuff being shot on video that's made to look like film, and it does so quite successfully. Generally, if you can tell, it's either because they're going for that look intentionally or because they're just not very good at it.

"Recorded on video" isn't the same as "recorded digitally" - there are plenty of analog video formats (VHS, for example). Obviously a digital recording isn't going to be on physical film, but if a digital recording is made at 24fps with a 35mm format sensor, to the untrained eye it will appear as "film" rather than "video". Conversely, you could, in principle, make a 60fps film camera with a small cell size that would look like "video" to most people.
posted by 0xFCAF at 10:26 AM on June 24, 2010 [5 favorites]

this means a shallower plane of focus for video than for film

FYI this sentence is backwards, it should be "for film than for video".
posted by 0xFCAF at 12:51 PM on June 24, 2010

Best answer: [The shallower plane of focus of a larger sensor] is why pictures taken with SLRs can more easily have nice, defocused backgrounds compared to point-and-shoots. This can be matched by a video sensor of appropriate size.

This is funny in light of how hard some filmmakers have labored to make deep-focus shots work, relying on tiny apertures and sometimes faking it with split diopters. Orson Welles was most famous for this, but John Frankenheimer and Brian De Palma were also fans of the deep-focus shot. Sometimes the effect you want is the one most difficult to achieve with your equipment.

Another quirk of video is that it (generally) has less dynamic range than film. Highlights are more likely to be blown out in video than in film.

As for films shot on video: there are so, so, so many Hollywood movies shot digitally nowadays, and you would hardly be able to tell the difference much of the time. It's going to be harder and harder to enjoy the aesthetic of video-as-film, as credibly filmlike 24p HD video becomes more and more commonplace.

That said, you should check out the following films if you enjoy the video look:
  • 28 Days Later (excellent use of the limitations of DV around 2001)
  • Pieces of April (decent film, but the ugly, muddy visuals are exemplary of DV's weaknesses)
  • Bamboozled (very good movie, but once again, the visuals are emblematic of cheap DV)
  • Julien Donkey-Boy (beautifully shot DV; one of the few attractive Dogme 95 films; same DP as 28 Days Later; the movie itself is fairly typical Harmony Korine "trying too hard to do too little", but it's not bad)
  • The Celebration (great movie; shot on analogue Hi8 video; more emblematic of the whole Dogme 95 "look," except without being digital; once again, same DP as Julien Donkey-Boy and 28 Days Later)
  • 200 Motels (shot entirely on analogue video in the early 70s; definitely a relic from a lot of heavy drug use; incidentally, my friend's father once found the LP of this movie's soundtrack embedded inside the drywall of a house he had just moved into)
    posted by Sticherbeast at 2:08 PM on June 24, 2010 [1 favorite]

    Sticherbeast is pretty much right on.

    The reason the difference is less noticeable in digital still photography is that digital still cameras, at least of the professional variety, have overcome the most important differences, such as sensor size (professional dSLRs have full-frame sensors, and the semi-pro ones are pretty close) and dynamic range.

    The difference in movies is going to disappear over the next few years. Sensor size is already a solved problem, and dynamic range is getting there (I just did a day of shooting with a Mysterium-X sensor upgraded Red One camera, and the two extra stops of range compared to the original sensor are really noticeable). Grain is something you can add at your leisure in post; given 4k+ resolution, it isn't really an issue.

    Actually, you should probably get used to seeing less grain in movies in general. Video masters of movies shot on film are made from the digital scans now, not from a telecine as was common earlier, and with cinema projection going digital, the only grain you'll see is the grain from the original camera negative, which is less than you might think. With digital cameras, of course, there'll be no grain, only slight sensor noise.
    posted by Joakim Ziegler at 3:17 PM on June 24, 2010

    Best answer: There are many factors that can make a difference in look.

    First we have to define the terms because a word like "video" gets used ubiquitously to describe many different things.

    For moving images we have 3 camera types to talk about.

    Film - An analog, light-sensitive strip of emulsion-coated stock... we all know what film is.

    Video - Uses a CCD to capture the image, and stores it in an analog format like magnetic tape. This includes VHS, 8mm, Hi8, etc. Until digital came around, all television was recorded on video. Converting a moving image to an analog signal allowed it to be broadcast live, and easily stored and copied.

    Digital - Uses a CCD to capture the image, and stores it on a digital format like magnetic tape, HDD, memory card, etc.

    It's important to note that video and digital both use a CCD to capture the image; they just store it in different ways. A CCD sensor, even in a digital camera or camcorder, is actually an analog device. Each pixel captures light and turns it into an electrical signal, and the pixels are read off the sensor one after another as an analog wave.
    The difference in a digital camera or camcorder is that the signal next goes to an analog-to-digital converter, which turns it into a digital file.

    There are 4 primary characteristics that are going to change the "look" of a movie, based on the format it was recorded in:

    Frame Rate

    Resolution

    Dynamic Range

    CCD/Film size

    Aside from frame rate, all of these are correlated and affect each other.

    Frame Rate:
    This was mentioned previously and is usually the culprit for that "film look". At some point the industry agreed on 24 frames per second as the standard. (I've heard the story goes that the illusion of motion begins at 12fps so they just doubled it and called it a day. I'm sure that's wrong though.)
    When television came around they had to come up with standards as well. So in North America the NTSC standard became 30fps (actually 29.97) and in Europe the PAL standard became 25fps.

    Once you start paying attention to it and watching for it, the 24fps "look" becomes easy to spot. The 30fps of television and video is slightly smoother and so looks different.
    Culturally we're used to movies being big, expensive, grand affairs, and television being their cheap, everyday counterpart.
    24fps looks fancy, 30fps looks cheap.

    Old super8 footage looks "vintagey" because, to save money on film, they'd usually record at lower frame rates like 16fps. Everything looks a little choppier, and when played back at 24fps the action speeds up, so people run around quickly and look silly.
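Two of the frame-rate quirks above reduce to quick arithmetic (an illustrative sketch, not from the thread):

```python
# NTSC's "30fps" is really 30000/1001 fps - the fractional rate that
# was introduced for color broadcast compatibility:
ntsc_fps = 30000 / 1001
print(round(ntsc_fps, 3))  # 29.97

# Footage shot at 16fps but played back at 24fps runs fast - that's
# the comic "sped-up" look of old home movies:
speedup = 24 / 16
print(speedup)  # 1.5x faster than real life
```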

    A side effect of this is that in the consumer video or digital camcorder market you have to go to a fairly expensive level before you can find a camcorder that will record at 24fps. If you're recording something that you plan to have shown in a movie theatre you'll want to record it at 24fps so that it can easily be transferred to film.

    It's expensive to have a camcorder that can record at multiple frame rates. In a digital camcorder you need a more complex A-D converter to render out different frame rates.
    This trend is changing as the computing power in cameras gets cheaper and cheaper, but you still have to pay extra for the 24p model.

    Resolution:
    On film, Resolution and Film Size are correlated. With the same grain density, the bigger the film size, the more detail it will capture. IMAX 70mm looks better than 35mm, which looks better than 16mm, which looks better than 8mm.

    Everyone argues about the "megapixel" equivalent of 35mm film. I think a good comparison is around 12 megapixels, give or take. In the digital video world that's full 4K (4096 pixels of horizontal resolution). It actually varies depending on the aspect ratio.

    The big movies have been recorded on 35mm for decades now, so we're all used to seeing wonderful 4K resolution in all our favourite films. The difference is less noticeable on a VHS or DVD of course, but having a great starting product still shines through in the end.

    Video and Digital have always had to conform to strict standards; the resolution of the recorded product has to match the resolution of the television set for everything to play nicely together. So in the standard definition days that's usually 480 lines, and in HD it's 720 or 1080.

    So the resolution of Film is far greater than anything we're used to seeing from Video or Digital.
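The pixel counts of the common delivery formats make the gap concrete (quick illustrative arithmetic; the film figure is the ballpark quoted above, not a measurement):

```python
# Pixel counts of common delivery formats, vs the ~12 megapixel
# ballpark often quoted for 35mm film (illustrative comparison):

formats = {
    "SD (NTSC)": (720, 480),
    "HD 720p":   (1280, 720),
    "HD 1080p":  (1920, 1080),
    "4K (DCI)":  (4096, 2160),
}

for name, (w, h) in formats.items():
    mp = w * h / 1e6
    print(f"{name}: {mp:.1f} megapixels")
# A 4K DCI frame works out to ~8.8 MP; full-aperture film scans push the
# 35mm figure higher, which is one reason estimates for film vary so much.
```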

    Dynamic Range:
    The Dynamic Range of an image is the degree of difference from dark to light that can be captured in a single frame. A camera with better Dynamic Range can deal better with high-contrast scenes, retaining detail in the brightest and darkest areas.

    Typically a camera with poor Dynamic Range will "blow out" bright areas like the sky, or your bald uncle's forehead. Once a pixel goes completely white information is lost and can't be recovered.

    Film is inherently good at capturing a high dynamic range. It's one of the last areas that Film is generally superior to digital images.

    Basically CCD sensors and A-D converters are getting better and better, but 35mm film will generally have smoother, more pleasant transitions from light to dark. This looks better and contributes to that "film look".

    Video never really got close, and so is easy to pick out from film. The contrast and detail is just different.

    Dynamic range is also related to CCD size and resolution in Video and Digital cameras and camcorders. Generally, the larger each light-capturing pixel is, the higher the dynamic range will be. So for dynamic range you want a large CCD and a low resolution, so each pixel can be as large as possible.
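Dynamic range is usually quoted in "stops," where each stop doubles the amount of light. Converting stops to a contrast ratio is simple (the stop counts below are illustrative ballparks, not measured figures; real numbers vary by stock and sensor):

```python
# Each stop of dynamic range doubles the usable brightness range,
# so n stops correspond to a 2^n : 1 contrast ratio.

def stops_to_ratio(stops):
    return 2 ** stops

print(stops_to_ratio(11))  # 2048:1 - illustrative ballpark for negative film
print(stops_to_ratio(8))   # 256:1  - illustrative ballpark for older video
# So a sensor upgrade that adds "two extra stops" quadruples the
# brightness range the camera can hold before clipping.
```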

    CCD/Film Size:

    All other things being equal, the larger the Film Plane or CCD, the shallower the depth of field you'll get.
    As far as Film and CCDs are concerned, 35mm is pretty big, so with 35mm film you can achieve a shallow depth of field more easily. This is also why footage from large-sensor DSLR video cameras looks so "cinematic".

    The larger a CCD is, the more expensive it is to manufacture - the cost climbs steeply with size. Big sensors are very expensive, and so most Video and Digital cameras used for television and movies in the past have had CCDs smaller than 35mm film.
    That gives less flexibility in depth of field. Shallow depth of field looks "cinematic" because it's more often used in movies, and most movies are/were shot on 35mm film.

    So. In general, movies shot on Film have a slightly slower Frame Rate of 24fps and a crisp, high-resolution, pleasantly high-dynamic-range image, with more instances of shallow Depth of Field.

    These factors generally combine to give Film a different look from Video or Digital.
    posted by breakfast! at 3:19 PM on June 24, 2010 [3 favorites]

    stinker: " "Recorded digitally" and "recorded on video" are not synonymous.

    If you care to elaborate, what is the difference?"

    There's a weird differentiation going on here, which has its historical roots in film snobbery (as do many things). Basically, when shooting movies on video started becoming feasible (first with the advent of cheap DV, and then HD cameras like the Panasonic VariCam, the various Sony F900-series cameras, and so on), there was a lot of scoffing from the people who thought the only proper way to shoot movies was on film. They had some good points, but the level of disdain was hardly justified.

    So, when video started catching up to film, with the higher-end Sony cameras, the Red One, the Arri D21, and so on, people figured they'd call these cameras "Digital Cinema cameras" instead of video cameras, to escape the stigma. At the same time, there was a move from recording digital images on video tape to recording on flash memory, hard drives, and so on (except for Sony, which will always love their expensive tape formats), so "Digital Cinema" came to mean this too, as well as higher resolution (full HD and especially even higher, like 2k and 4k).

    So today, some people call everything electronic "video", while others reserve "video" for lower resolutions and/or tape based capture, while they use "Digital Cinema" for higher resolution, higher dynamic range cameras that capture on something other than tape, and generally have an image quality more resembling that of 35mm.

    It's all very confusing, and you shouldn't really care about it. If you want to piss the "Digital Cinema" people off royally, you could probably call everything digital "the betamaxes" or something.
    posted by Joakim Ziegler at 3:44 PM on June 24, 2010

    breakfast!: "The big movies have been recorded on 35mm for decades now, so we're all used to seeing wonderful 4K resolution in all our favourite films. The difference is less noticeable on a VHS or DVD of course, but having a great starting product still shines through in the end."

    Just to be nitpicky, there's a pretty good argument to be made that 35mm is at least close to 4k optical resolution, but that's the camera negative. Once you get the thing scanned and processed and printed and have had a generation or two of interpositives and internegatives made, and you sit down in your comfortable multiplex seat, it's a lot less than that.

    There are fairly convincing studies indicating that the resolution of a typical best-case projection print, even one made directly from a camera negative, is lower than that of HD video. But this is an ongoing discussion that will only die when 35mm dies. In the meantime, I can assure you that no projection copy will give you even close to 4k resolution.
    posted by Joakim Ziegler at 3:52 PM on June 24, 2010 [2 favorites]

    It takes a skilled videographer/cinematographer to get any medium to look good.

    One big difference is shutter speed (simulated or otherwise) versus aperture versus lighting. One of the nice things about film is that it took a (relatively) long time for the film to expose properly, so even though there are only 24 frames a second, more of the action is captured in sort of a blur. Which is more realistic to our eyes; when something races by us at a fast speed and our eyes aren't tracking it exactly, it looks blurry.

    Digital "film" might not capture the motion as well. Instead of being a fluid capture of motion, it is 60 sharp photographs a second. If there isn't a little bit of blur in the motion, it looks wrong. That is why video games without motion blur can look jerky even at 60fps, while film at 24fps looks smooth and natural.

    Another video-versus-film issue is how the image was captured versus how it is being distributed. NTSC is 60 fields a second, and each field captures a temporally different moment. This is different from 30 frames per second, where there are only 30 time slices. If something was shot in NTSC interlaced but edited in a frame-wise system, it's going to look bad. And cheap NTSC cameras used to do only 30fps and then just double each frame for 60 fields per second.
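The temporal offset between the two fields of an interlaced frame can be sketched with simple timing arithmetic (an illustrative model, not a broadcast-spec implementation):

```python
# Interlaced NTSC: 60 fields/second, each field holding alternate scan
# lines sampled at a *different* moment. Weaving two fields into one
# frame therefore mixes two points in time.

def field_times(frame_index, frame_rate=29.97):
    """Capture times (seconds) of the two fields making up one frame."""
    frame_duration = 1 / frame_rate
    t_field1 = frame_index * frame_duration           # e.g. odd lines
    t_field2 = t_field1 + frame_duration / 2          # e.g. even lines
    return t_field1, t_field2

t_odd, t_even = field_times(0)
print(round(t_even - t_odd, 4))  # the two halves are ~1/60s apart
# A cheap camera that shoots true 30fps and repeats each frame's lines
# as two fields has no such offset - one reason its motion looks different.
```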

    (And then there are the issues of what the thing was shot on, how it was edited, how it was distributed, then broadcast, and then how your display displays it. This muddies the waters terribly.)
    posted by gjc at 4:21 PM on June 24, 2010 [1 favorite]
