How does auto adjust work on an LCD monitor?
October 15, 2008 4:53 PM   Subscribe

How does the auto-adjust feature on LCD monitors work?

Specifically, in what ways does it take the current image on the screen into account? Does the source matter? Are there test patterns that would produce better results than a random image of whatever happens to be on the screen?

posted by wastelands to Technology (11 answers total) 2 users marked this as a favorite
What really needs to be adjusted, besides the brightness and contrast? I thought they used a phototransistor to measure ambient light, and adjusted brightness/contrast according to a table to match the room lighting conditions.
posted by Class Goat at 5:17 PM on October 15, 2008

The ambient light sensor on my MacBook Pro is under the left hand speaker grille -- I can make the screen and keyboard change brightness by covering/uncovering it with my hand.

I don't use the auto-brightness because it always comes out too bright for me in a pitch-black room. I manually turn both screen and keyboard down to only one notch above "off." But I did a test for you...

My quick experiment shows that it seems to "take into account" the screen's image, but only in the sense that that image, if bright white, shines ambient light into the sensor the same as any other source might. It doesn't seem to get special treatment.
posted by rokusan at 5:27 PM on October 15, 2008

A clarification: One of the things I'm wondering about in particular is how it aligns the image to the screen. You know how when you change to a new resolution or install video drivers and the whole screen is off to the left, or whatever?
posted by wastelands at 5:33 PM on October 15, 2008

Well, unlike CRTs, LCDs have an exact number of pixels in known positions. Educated guess: it intercepts the video cable signal, looks for lines of 0's (dark, unused pixels), and moves the image over accordingly.
posted by Mach5 at 6:08 PM on October 15, 2008
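Mach5's guess can be sketched in a few lines of Python. Everything here (the function name, treating the capture as a grid of 0-255 intensity samples, the threshold) is an illustrative assumption, not actual monitor firmware:

```python
def horizontal_offset(frame, threshold=8):
    """Return the number of leading columns that are entirely 'black'.

    frame: list of rows, each a list of sampled intensities (0-255).
    A column counts as black if every sample in it is below threshold.
    """
    width = len(frame[0])
    for x in range(width):
        if any(row[x] >= threshold for row in frame):
            return x          # first column with visible content
    return width              # whole frame is black: nothing to align with

# A 6-sample-wide frame whose content starts 2 columns in:
frame = [
    [0, 0, 200, 200, 200, 0],
    [0, 0, 180, 255, 180, 0],
]
print(horizontal_offset(frame))  # → 2
```

Note the all-black case: the metric gives the monitor nothing to work with, which matches the experiments further down the thread.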

I'm not sure whether @Mach5 is right about finding black pixel signals, but he's on the right track.

A CRT is an analog device. It always has its little beam shooting the phosphors, at a known rate. Then, in a fantastically analog way, it uses whatever the current video signal is to light the current pixel phosphor at the current time. If your monitor is pretty good, and your video card is pretty good, you can usually get all of that synched up pretty close.

The sync portion of the signal tends to be VBLANK, which is the "vertical refresh period", or the time that it takes for the gun to return from the bottom right to the top left of the screen when it's finished a frame. We can deduce that a pixel arriving before VBLANK must be from a previous frame, and so the first pixel after a VBLANK must be the first pixel of the frame--we're synched. That doesn't mean, however, that the pixel the computer sends at time t is actually the pixel that should appear at the gun's coordinates at time t. We can get the timing, but not anything spatial (other than overall resolution), from the signal.

Of course, you can play with those knobs under the monitor and point the electron gun at a different place. You can tweak its scan pattern extensively. All of those knobs vary the relative strength of the deflection coils at various points in the cycle. If you slide the thing off the side, you can get the electron gun to shoot portions of the glass that haven't been treated with phosphor.

An LCD is a digital device. Unlike the Rube Goldberg contraption of shooting electrons of various strengths* from a gun on an uncontrollable rail at a specially treated and masked piece of glass, we just tell each pixel how opaque it should be. There's nothing to slide around, because how would the monitor go about setting the opacity of a pixel that doesn't exist?

So, with an analog (VGA) connection, your LCD is going to do the same sort of signal synchronization as the CRT would. Except that the LCD knows where its pixels are already--there's no option for sliding the picture around.

Also, the whole process is made far easier by a digital connection (HDMI or DVI).

So, long story short: your LCD isn't really auto-adjusting. It's just doing the only thing it can with the given signal and its pixel array. You should really be asking why your CRT can't auto-adjust, where it might actually be useful.

*Oversimplification. All electrons are indistinguishable.
posted by Netzapper at 7:43 PM on October 15, 2008
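The timing deduction Netzapper describes (the first sample after VBLANK is pixel (0, 0), so every later sample maps to coordinates by counting) can be sketched like this. The fixed line width and the omission of horizontal blanking are simplifying assumptions:

```python
def sample_to_coords(sample_index, width):
    """Map a post-VBLANK sample index to (x, y) pixel coordinates.

    Assumes samples arrive left-to-right, top-to-bottom, with no
    horizontal blanking interval, which real signals do have.
    """
    return sample_index % width, sample_index // width

# With 640 pixels per line, the 1300th sample after VBLANK
# lands 20 pixels into the third line:
print(sample_to_coords(1300, 640))  # → (20, 2)
```

This is all the spatial information the monitor can pull from the sync alone: pure counting, nothing about where the image "should" sit.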

how it aligns the image to the screen

They only do that with analog signals, because they only need to do that with analog signals. Basically, yeah, it's finding the edges of the VGA signal and aligning them with the screen. Pretty much any image will do for that.
posted by kindall at 7:46 PM on October 15, 2008

Netzapper, I'm sure you've used LCD monitors where the phase and pitch aren't correctly set. That happens precisely when the analog VGA signal is applied to the fixed-pixel screen. Digital connections don't have the adjust feature; by their very nature they map onto exactly the visible area. But if you are hooked up with a VGA cable, that's not a digital signal.
posted by odinsdream at 8:08 PM on October 15, 2008

Although if you're talking about the all-digital DVI to HDMI connection, it's not exactly a 1:1 map, for some reason. I'd love to learn more about that, since it's giving me a serious headache getting my computer to display a full image on an HDTV.
posted by odinsdream at 8:10 PM on October 15, 2008

If I had to guess, I would say the auto-adjust just advances or retards the time when it samples the analog RGB values to maximise the difference between two adjacent pixels.

I've just done a quick non-scientific experiment on a Dell 2407WFP over VGA (native res 1920x1200), by running the auto-adjust with an all-black image displayed, then switching to a checkerboard pattern (alternating black and white pixels) and seeing how good the adjustment was. Turns out an all-black image doesn't auto-adjust very well, but both my normal desktop and the checkerboard pattern seem to produce the same good results.

odinsdream, the problem I have with using HDMI out from a PC to an LCD TV is that the LCD has a native resolution of 1360x768, whereas the PC will only output a stock 720p signal over HDMI (1280x720). So to fill the screen the TV scales the picture, which looks fine on a games console or a video signal but absolutely awful with the PC output. I don't know enough about HDMI, but if my understanding is right that its video signals are the same as DVI's, the PC should be able to output any valid resolution--however, TVs may only accept the stock resolutions.
posted by samj at 2:13 AM on October 16, 2008
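samj's guess and experiment can be simulated together in a hypothetical sketch: sweep the sampling phase and keep the one that maximises the difference between adjacent samples. The "analog" waveform below is a crude linear blur between pixel values; every name and number here is an illustration, not real monitor firmware:

```python
import math

def analog(pattern, t):
    """Toy analog scanline: linear blur between adjacent pixel centres."""
    u = t - 0.5
    k = math.floor(u)
    f = u - k
    a = pattern[k % len(pattern)]
    b = pattern[(k + 1) % len(pattern)]
    return (1 - f) * a + f * b

def contrast(pattern, phase, n=16):
    """Sum of differences between adjacent samples at a given phase."""
    samples = [analog(pattern, i + phase) for i in range(n)]
    return sum(abs(a - b) for a, b in zip(samples, samples[1:]))

def best_phase(pattern, steps=8):
    """Pick the sampling phase that maximises adjacent-pixel contrast."""
    phases = [i / steps for i in range(steps)]
    return max(phases, key=lambda p: contrast(pattern, p))

checker = [0, 255] * 8       # alternating black/white pixels
black = [0] * 16

print(best_phase(checker))   # → 0.5 (sampling at pixel centres is sharpest)
print(contrast(black, 0.0))  # → 0.0 (no edges, so nothing to lock onto)
```

The checkerboard gives the metric a strong, unambiguous peak, while the all-black image scores zero at every phase, which would explain why samj's black screen adjusted poorly and the checkerboard (and a normal desktop) adjusted well.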

The auto adjust adjusts the clock-pitch and phase of the signal to accurately map the incoming signal from the video cable to the separate, discrete pixels on your monitor.

A detailed explanation is here.

To see the process at work, visit this page, go full screen, and hit auto adjust.
posted by Pastabagel at 10:46 AM on October 16, 2008 [2 favorites]
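Pastabagel's clock-pitch-and-phase description can be illustrated with made-up numbers: the monitor has to take exactly one sample per source pixel (pitch), at the pixel centres (phase), or the recovered image drifts along the line. This is a toy model, not how any real scaler chip works:

```python
def recover(analog_line, pitch, phase):
    """Sample every `pitch` entries of a densely captured scanline,
    starting at offset `phase`."""
    out = []
    i = phase
    while i < len(analog_line):
        out.append(analog_line[int(i)])
        i += pitch
    return out

pixels = [10, 200, 30, 240, 50, 220, 70, 250]
oversample = 4                       # 4 "analog" samples per source pixel
analog_line = [v for v in pixels for _ in range(oversample)]

print(recover(analog_line, 4, 2))    # right pitch and phase: original pixels
print(recover(analog_line, 5, 2))    # pitch too long: samples drift, pixels lost
```

With the correct pitch every sample lands inside a distinct source pixel; with the wrong pitch the sample points slide across pixel boundaries, which shows up on a real screen as the smeared, shimmery columns auto-adjust exists to fix.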

Pastabagel, that link is awesome. I can actually see it happening. Thank you.
posted by wastelands at 6:47 PM on October 16, 2008
