Why do some movies look worse in HD?
August 30, 2010 10:34 AM   Subscribe

Why does Avatar look so much WORSE on nice HD TVs?

I was walking in a mall today, where they had a large display of very nice hi-definition LCD, OLED(?), Dilithium Crystal, Flux Capacitor, Dark Matter, Plasma, and various other flat screens, and even some 3D displays.

All the TVs were showing the movie Avatar.

I couldn't believe how crisp some of the more expensive TVs were, but more noticeably, I couldn't believe how terrible the movie looked.

All the scenes looked incredibly fake. The CGI stood out against the live actors, and many of the previously beautiful scenes were now obviously shot in a studio and not outside.

Most glaringly, the pacing seemed off. The action appeared to be happening too fast, like everything was subtly sped up.

Can someone explain this sped-up effect to me? Something a bit more technical than "In HD you notice the little details more," because this pacing effect was more extreme than just seeing the makeup or acne on an actor. It was actually quite jarring.

If it helps, these were all Philips TVs being displayed.
posted by Telf to Technology (14 answers total) 10 users marked this as a favorite
I'd guess this is because those TVs use motion interpolation, which makes 24 fps film look like video - giving it that goofy "sped-up" soap opera look. My friend's TV does this and it drives me nuts.
posted by theodolite at 10:38 AM on August 30, 2010

Response by poster: Soap Opera-like is how I'd describe it. You might be on to something.
posted by Telf at 10:40 AM on August 30, 2010

The more expensive ones were probably upscaling, or similar. That tends to create the pacing issue you describe. It can look wonderful if the movie was intended for it, and kind of completely bizarre if it wasn't.

Avatar would probably look fantastic on the same TV if the settings were adjusted correctly.
posted by FAMOUS MONSTER at 10:41 AM on August 30, 2010

Yeah, it's motion interpolation. Ordinary film is 24 fps; today's HD sets interpolate to 120 or even 240 fps. This means that 4 of every 5, or 9 of every 10, frames are not part of the original program material.

Every set that has such a feature allows you to turn it off if you don't like it.
posted by kindall at 10:44 AM on August 30, 2010
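kindall's ratios are easy to verify. As a minimal sketch in Python (my own illustration, not anything from the thread or anything a real TV runs; the function name is made up):

```python
def interpolated_fraction(source_fps: float, panel_hz: float) -> float:
    """Fraction of displayed frames the TV must synthesize when it
    interpolates a low-frame-rate source up to its panel refresh rate."""
    return 1.0 - source_fps / panel_hz

# 24 fps film on a 120 Hz set: 4 of every 5 frames are invented by the TV
print(interpolated_fraction(24, 120))  # 0.8

# 24 fps film on a 240 Hz set: 9 of every 10 frames are invented
print(interpolated_fraction(24, 240))  # 0.9
```

The higher the panel's refresh rate, the larger the share of what you see that was never part of the original program material.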

Some HDTVs, especially the ones with 120Hz frame rates, have a sort of motion enhancement mode that in effect interpolates and adds frames in between the frames actually present in the video.

Example: a Sully arm motion was shot at 24 frames per second and takes 1 second to complete. On the Blu-ray disc, there are 24 frames of Sully moving his arm. The TV actually processes the video before it displays it, interpolating 24 frames into 48 frames and displaying them at twice the rate. The worst of these systems only apply this to high-motion areas of the frame, so background images continue on at the same 24 frames per second.

Now all the Avatar CGI was done with very expensive computers with a lot of human oversight very slowly. Your TV is doing this with a mass-produced relatively simple chip with no real-time human oversight very quickly. In effect, your TV is re-animating the film. Of course it's going to look fake.

On footage with no or subtle CGI, it tends to look to me like the background was shot on film and the foreground or subject was shot on video. Very jarring to me, but I guess most people don't mind?
posted by infinitewindow at 10:46 AM on August 30, 2010 [1 favorite]
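infinitewindow's 24-into-48 example can be sketched in toy form. This is my own illustration, not how any real TV chip works: real sets do motion-compensated interpolation, while this crude version just averages neighboring frames. But it shows concretely where the synthesized in-between frames come from:

```python
def interpolate_midframe(frame_a, frame_b):
    """Fake a frame 'between' two real frames by averaging each pixel."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """24 fps in, ~48 fps out: original frames with blends slotted between."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)                           # the real frame
        out.append(interpolate_midframe(a, b))  # the invented one
    out.append(frames[-1])
    return out

# Three "frames" of a two-pixel clip, brightening over time
clip = [[0, 0], [10, 10], [20, 20]]
print(double_frame_rate(clip))
# [[0, 0], [5.0, 5.0], [10, 10], [15.0, 15.0], [20, 20]]
```

Every odd-indexed frame in the output is something the filmmakers never shot, which is exactly why interpolated CGI can read as "re-animated."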

Also: Torch Mode (plenty more on that term in the Google).
posted by rhizome at 10:52 AM on August 30, 2010

I would first place doubt and blame on the video distribution system being used to send the source signal to the displays on the floor. Stores cheap out on these systems frequently, use goofy cables, and rely on people with often only a passing knowledge to connect the system together.

I've seen whole rows of decent HDTVs hooked up via analog cables with HD15 connectors, with the video source a returned laptop running Windows Media Player. On many HDTVs, using an HD15 connector will display the source without any video processing, and the analog signal gets beat to hell and back scooting along through cheap, daisy-chained splitters.

On top of that, who knows how many people have been monkeying with the display settings.

I would cast aside nearly everything you see in a standard store demo.

Additionally, some of the FASTER! BIGGER NUMBERS!!1 120-480 Hz TVs will display that weird soap-opera video framerate when fed certain types of video.
posted by terpia at 10:56 AM on August 30, 2010 [1 favorite]

A link on the Soap Opera Effect.
posted by terpia at 10:59 AM on August 30, 2010 [1 favorite]

Because, like all movies, it was shot to be displayed on the big screen, with the audience at least 15' away. When you take this stuff and display it on a nice HD screen up close and watch it for the second or third time, you start noticing the limitations of film-making. Some things don't look quite right, you spot errors, your eyes try to figure out where the CGI ends and reality begins, etc.

As a rule, I try not to watch a movie I especially liked more than once a year. Once you start noticing imperfections, it's hard to un-notice them.
posted by damn dirty ape at 11:12 AM on August 30, 2010

I think it's worth mentioning here that this is why old films and TV generally look so bad: we're viewing them on equipment they weren't designed for. Yes, they were made during a time when effects were cheaper and not as good, but people were also viewing them on the equipment they were designed for.
posted by devnull at 1:16 PM on August 30, 2010

FWIW, mall TVs are often badly adjusted: they crank the chroma and contrast up way higher than anything you'd do at home. When you bring your TV home, you often have to spend some time working on the picture to get it rational. (They do this because they need to compete with the other TVs, and people gravitate toward rich color and high contrast when comparing one TV to another.)

Avatar, with its electric blue cast of characters, seems as though it would suffer from this effect more than most movies.
posted by jenkinsEar at 1:34 PM on August 30, 2010

nthing motion interpolation. Like theodolite, my friend's and my father's televisions both do this. You can turn it off, but my friend and my father both claim that they can't see the difference (they are insane).
posted by King Bee at 2:05 PM on August 30, 2010

I'd guess this is because those TVs use motion interpolation, which makes 24 fps film look like video - giving it that goofy "sped-up" soap opera look. My friend's TV does this and it drives me nuts.

This. I was *just* in a store that had Alice In Wonderland playing this way, and while it made the TV look amazing, it made the movie look like crap. So presumably it's like the "Vivid" setting on these televisions -- it's going to be turned on in the store to make the TV look better, but when you get it home you want to turn the feature off.
posted by davejay at 5:12 PM on August 30, 2010

As far back as Attack of the Clones, I noticed that wholly digital films have looked kind of crappy on TV as opposed to the theater. In the theater, the final arena battle looked amazing and realistic; on DVD, sterile and clearly CGI, bad enough that I won't watch it willingly (quality of the story aside). I think it's pretty common.
posted by Ghidorah at 5:47 PM on August 30, 2010
