Why does TV video look so different from video produced with a normal camcorder?
July 27, 2008 10:14 PM

I am wondering why videos produced for TV not only appear smoother, but also have a very "distant" or unreal sense to them, whereas videos recorded by camcorder appear very raw and real, the way things happen in reality. I am not referring to the jitter caused by holding a camera, of course, but to the different "quality" and properties that TV-produced videos have. This is especially visible when a low-budget commercial for cars or furniture airs, then the picture cuts back to regular TV quality. Is it the frame rate, is it HD resolution, is it the photosensors, is it the lighting, is it interlacing, is it tweening?
posted by torpark to Media & Arts (18 answers total) 6 users marked this as a favorite
 
A lot of stuff on TV is actually shot on film these days.

But basically, it's all the things you mentioned and a lot more: HD cameras, better CCDs, 24 fps instead of 30. What comes from a $300,000 video camera simply looks a lot better than what comes from a $300 one.

Then there's professional lighting, color correction and other post production.

"distant" or unreal - this is traditionally the look associated with film, but expensive 24 fps ("film speed") video can pull it off to some extent these days.
posted by drjimmy11 at 10:23 PM on July 27, 2008 [1 favorite]


It's the different frame-rate in NTSC versus film.
posted by philomathoholic at 10:24 PM on July 27, 2008


A lot of it is just light and non-shakiness, though. Take your cheap camcorder, put it on a tripod, set it to 24fps if you can, and film at the "magic hour" (that cool light just before sunset), and you won't be too far off from that pro look.
posted by drjimmy11 at 10:25 PM on July 27, 2008


I suspect it has a lot to do with depth of field and contrast. Typically the video straight off a consumer video camera is very low contrast and has a huge depth of field (everything is reasonably sharp, there's not really one point that's in focus).

Here's a photographic example (self-link) of the contrast issue. The top half is the image & histogram straight off the camera, and the bottom half is the image after setting a better black point and white point for the image. Also, in film & TV the color saturation is usually really exaggerated. A classic example of this is CSI: Miami (which apparently makes heavy use of the "magic hour").
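To make that black point/white point adjustment concrete, here's a minimal sketch in Python (assuming an 8-bit grayscale frame loaded as a NumPy array; the cutoff values 30 and 225 are made up for illustration):

    import numpy as np

    def apply_levels(img, black=30, white=225):
        # Map `black` to 0 and `white` to 255, clipping values outside
        # the range -- the same "stretch the histogram" move described above.
        scaled = (img.astype(np.float32) - black) / float(white - black)
        return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

Flat camcorder footage often occupies only the middle of the 0-255 range; stretching it out is a big part of the graded, "produced" look.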

Finally, don't overlook the power of composition to control your relationship to a scene. Home videos tend not to put much thought into it, but professional video is always meticulously composed.
posted by knave at 10:43 PM on July 27, 2008


The framerate is actually a big part of it. I was at Best Buy the other day and they were showing clips from "Hancock" on their HD TVs, which were using some frame interpolation software to 'smooth out' the frame rate (to 120 fps, according to the sales guy).

The difference was really obvious, and it definitely looked more like camcorder video than film.

You need about 30 FPS for your mind to see the video as motion, rather than as a sequence of pictures, but movies are shown at 24 FPS, just below that threshold. The higher you go above 30, though, the 'smoother' it looks, up to a certain point.

The other thing to think about is lighting. That's why a lot of 'professional' photos look a lot better than regular snapshots, even when people use the same cameras. For example, this photo, this one and this one are pretty 'everyday' looking, while this one, this one, this one and this one are more unreal-looking. But they were all taken with the same camera, in this case a Canon EOS Digital Rebel XTi.

How the scenes are lit (and the skill of the photographer) has a huge impact on how the final images turn out. Good photographers could probably make 'unreal-looking' videos with today's ordinary camcorders, maybe passing the result through a frame rate filter.
posted by delmoi at 10:53 PM on July 27, 2008


Also, some shows use this service, or something similar.
posted by Fuzzy Skinner at 12:28 AM on July 28, 2008


Fuzzy Skinner: "The framerate is actually a big part of it. I was at Best Buy the other day and they were showing clips from "Hancock" on their HD TVs, which were using some frame interpolation software to 'smooth out' the frame rate (to 120 fps, according to the sales guy)."

I believe you're confusing frame rate with refresh rate. 120Hz LCD panels are the hot feature in TVs right now. They refresh the screen faster to reduce motion blur introduced by the relatively slow pixel response of the technology. Feature films are shown at 24 fps but don't have this problem because they don't have to deal with slow-refreshing pixels. In other words, even with higher refresh rates, only the same distinct frames are displayed every second; they may just need to be displayed multiple times, depending on the refresh rate.
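To make that last point concrete, here's a toy Python sketch counting how many refresh ticks each source frame occupies (assuming, simplistically, that every tick just shows the most recent frame):

    def repeats_per_frame(fps, refresh_hz, n_frames=8):
        # How many consecutive refresh ticks display each source frame.
        counts = [0] * n_frames
        for tick in range(refresh_hz * n_frames // fps):
            counts[tick * fps // refresh_hz] += 1
        return counts

    print(repeats_per_frame(24, 60))   # [3, 2, 3, 2, 3, 2, 3, 2] -> 3:2 cadence
    print(repeats_per_frame(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even

Either way, only 24 distinct frames reach the screen per second; the refresh rate just changes how often each one is repainted.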

The picture on 120Hz sets can look smoother, but it can also introduce artifacts into the frame which some people dislike.

More info here.
posted by sharkfu at 1:29 AM on July 28, 2008


Remember that movies are shown at 24fps, but each frame is shown 3 times IIRC (the projector shutter flashes each frame multiple times to reduce flicker).

And yeah, the biggest differences between pro and amateur stuff are a) lighting and b) sensor size, which coincides with selective-focus depth of field.
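To see why sensor size matters so much here, a rough thin-lens depth-of-field calculation (the focal lengths and circle-of-confusion values below are assumed for illustration, not taken from any particular camera):

    def dof_limits_m(focal_mm, f_number, subject_m, coc_mm):
        # Standard hyperfocal approximation: near/far limits of
        # acceptable sharpness around the focused distance.
        f, c = focal_mm / 1000.0, coc_mm / 1000.0
        hyperfocal = f * f / (f_number * c) + f
        near = subject_m * (hyperfocal - f) / (hyperfocal + subject_m - 2 * f)
        far = (subject_m * (hyperfocal - f) / (hyperfocal - subject_m)
               if subject_m < hyperfocal else float("inf"))
        return near, far

    # Same framing at f/2.8, subject 2 m away:
    print(dof_limits_m(50, 2.8, 2, 0.030))  # big sensor: ~1.88 to 2.14 m (shallow)
    print(dof_limits_m(8, 2.8, 2, 0.005))   # small camcorder chip: ~1.39 to 3.55 m (deep)

The small chip needs a much shorter lens for the same framing, and depth of field balloons as a result.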
posted by jedrek at 2:53 AM on July 28, 2008


Quite a bit of it has to do with lighting -- look at the set of ANY movie scene and you will see extreme amounts of lighting, which contributes to better contrast as well. A prosumer video camera could likely fare much better against a $110,000 camera if the lighting were correct.
posted by SirStan at 3:18 AM on July 28, 2008


These answers are really spot-on. To give you a real-life example, a big part of my job is compressing a documentary series for the web. The series is often composed of tightly produced studio shots with big, expensive video cameras and meticulous lighting, interspersed with field footage from hand-held video cameras ranging from high-end to typical consumer stuff.

My compression drops the frame rate below both NTSC and film, and that is a big equalizer-- the quality differences of the footage are much less apparent, suggesting that the frame rate is a very large factor. However, there are still differences in quality between the two footage types, which seem reasonable to attribute to the depth of field and deep contrast of the better, bulkier cameras along with the professional lighting.
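As a sketch of that equalizing step, suppose the compressor simply keeps the source frame nearest each output timestamp (real encoders are more sophisticated, but the effect is the same: both streams land at the same low rate):

    def kept_frames(src_fps, dst_fps, n_src):
        # Index of the source frame nearest each output frame's timestamp.
        n_out = int(n_src * dst_fps / src_fps)
        return [min(n_src - 1, int(i * src_fps / dst_fps + 0.5))
                for i in range(n_out)]

    print(kept_frames(24, 12, 12))  # [0, 2, 4, 6, 8, 10]  -> even stride
    print(kept_frames(30, 12, 15))  # [0, 3, 5, 8, 10, 13] -> uneven stride

Once both the 24fps and 30fps material are decimated to the same output rate, the cadence difference between them largely disappears.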
posted by Mayor Curley at 5:24 AM on July 28, 2008


Typical TVs refresh at 60Hz, and 24 (fps) does not divide evenly into 60, so some frames are held on screen for 3/60ths of a second while others are held for 2/60ths. 120Hz refresh rates absolutely help with this, because 24 divides evenly into 120, so each frame is shown for 5/120ths of a second.

The TV itself must be smart enough to "pull down" the signal and re-assemble it properly (with 0 artifacts, mind you), as most devices (DVD players, VCRs, cable boxes) natively send 24fps content already expanded to 60Hz with the 3:2 cadence. Many 120Hz TVs are not smart enough, so there is some definite deception going on out there: your 120Hz TV uses interpolation and just gives an eerie look to everything, giving people a thrill because it simulates the so-called soap opera effect.

Most TV and almost all film are shot at 24fps, which in conjunction with "incompatible" 60Hz TVs has accustomed us to judder and unrealistic motion. Soap operas are often shot at 30fps, which looks more lifelike on a 60Hz TV, hence the term "soap opera effect."
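The judder is easy to see if you work out how long each frame actually stays on screen (the same toy model as above: every refresh tick shows the most recent frame):

    def on_screen_ms(fps, refresh_hz):
        # Milliseconds each source frame is held during one second of video.
        held = [0] * fps
        for tick in range(refresh_hz):
            held[tick * fps // refresh_hz] += 1
        return [round(n * 1000.0 / refresh_hz, 1) for n in held]

    print(on_screen_ms(24, 60)[:4])   # [50.0, 33.3, 50.0, 33.3] -> uneven hold = judder
    print(on_screen_ms(24, 120)[:4])  # [41.7, 41.7, 41.7, 41.7] -> even
    print(on_screen_ms(30, 60)[:4])   # [33.3, 33.3, 33.3, 33.3] -> "soap opera" smooth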

So frame rate does play a role here in terms of reproducing realistic motion (it would seem there's a lingering preference that film actually look pretty fake!), but I'm sure composition, focus, lighting, and camera quality play an enormous role too.
posted by aydeejones at 6:03 AM on July 28, 2008


I'm going to buck the trend here and say it's mostly the lighting. Here's an example I dug up. When you say something is produced with a 'normal video camera', you're probably seeing something from the upper row in that example. The pros bring lights, and so they get something more like the middle rows.

Using a tripod and composing a shot carefully are the next biggest things. Actually, using a tripod is probably the first. Up to this point, we're talking about things which are well within the budget of an enthusiast. All it takes is the time and care to set things up right.

Depth of field is the next big marker, and you can't fake that. It takes a camera with a big sensor or a funky adaptor.
posted by echo target at 7:26 AM on July 28, 2008


Fuzzy Skinner: The framerate is actually a big part of it...
posted by sharkfu


Just because my nickname in high school was Mr. Super Picky Pants, I'll point out that the framerate quote should have been attributed to delmoi.
posted by Fuzzy Skinner at 8:40 AM on July 28, 2008


Fuzzy Skinner: "Fuzzy Skinner: The framerate is actually a big part of it...
posted by sharkfu


Just because my nickname in high school was Mr. Super Picky Pants, I'll point out that the framerate quote should have been attributed to delmoi.
"

Oops-- I hit the wrong quote link. I apologize.
posted by sharkfu at 8:53 AM on July 28, 2008


I believe you're confusing frame rate with refresh rate. 120Hz LCD panels are the hot feature in TVs right now. They refresh the screen faster to reduce motion blur introduced by the relatively slow pixel response of the technology.

Actually, I'm pretty sure you're confused. The frame rate is the number of distinct images shown on the screen each second. The refresh rate is the number of times the screen is redrawn by the electron beam in a CRT, in other words how often it "flickers". But since LCDs don't flicker, the term "refresh rate" would just be how often new data is uploaded to the screen, which would cap the actual frame rate anyway. Here is what Wikipedia says on the page I linked to:
Much of the discussion of refresh rate does not apply to the liquid crystal portion of an LCD monitor. This is because while a CRT monitor uses the same mechanism for both illumination and imaging, LCDs employ a separate backlight to illuminate the image being portrayed by the LCD's liquid crystal shutters. The shutters themselves do not have a "refresh rate" as such due to the fact that they always stay at whatever opacity they were last instructed to continuously, and do not become more or less transparent until instructed to produce a different opacity.

The closest thing liquid crystal shutters have to a refresh rate is their response time, while nearly all LCD backlights (most notably fluorescent cathodes, which commonly operate at ~200Hz) have a separate figure known as flicker, which describes how many times a second the backlight pulses on and off.
But anyway, refresh rate has nothing to do with the 'smoothness' of a video. It's the frame rate that counts, and frame rates can be artificially increased using interpolation software, which is exactly what I said in my earlier comment.
posted by delmoi at 9:18 AM on July 28, 2008


Oh, hmm. The Wikipedia article does talk about 120 Hz "refresh rates" on LCDs in order to time frames correctly. But that's not what I was talking about when I said the frame rate was being interpolated. Showing a 24FPS film on a 120Hz screen would mean just showing the same frame 5 times. But what these screens were doing was showing one frame, then a blend of that frame and the next one, and so on, to create smooth motion.
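As a naive sketch of where those in-between frames come from, assuming frames is a list of same-shaped NumPy uint8 arrays (a plain cross-fade; real sets do motion-compensated interpolation, which is far more involved):

    import numpy as np

    def interpolate(frames, factor=5):
        # Insert (factor - 1) blended frames between each original pair.
        # This is a cross-fade, not motion estimation.
        out = []
        for a, b in zip(frames, frames[1:]):
            for k in range(factor):
                t = k / factor
                out.append(((1 - t) * a + t * b).astype(np.uint8))
        out.append(frames[-1])
        return out  # 24 fps in, roughly 120 fps out when factor=5

A real motion-estimating set tracks where objects move between frames and shifts pixels along those paths, which is why it can look both smoother and subtly wrong.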
posted by delmoi at 9:24 AM on July 28, 2008


Video Frame Rate vs Screen Refresh Rate:

What Refresh Rate Means

With the introduction of television display technologies, such as LCD, Plasma, and DLP, and also Blu-ray Disc and HD-DVD, another factor has entered into play that affects how frames of video content are displayed on a screen: Refresh Rate. Refresh rate represents how many times the actual television screen image is completely reconstructed every second. The idea is that the more times the screen is "refreshed" every second, the smoother the image is in terms of motion rendering and flicker reduction.


In terms of interpolated frames, I guess it comes down to whether you consider them "real" frames or a display technique. Coming from a film background, I do not. Since the director didn't put them down on film that way, and the effect would vary from TV to TV (Sony's version would differ from Samsung's, etc.), they don't feel real to me. Maybe I'm taking too strict a view on that, though. And I haven't seen it in person, so they might make the video look good.
posted by sharkfu at 10:04 AM on July 28, 2008


I thought they looked ridiculous, FWIW.
posted by delmoi at 11:10 PM on July 28, 2008


This thread is closed to new comments.