wants more knob
December 12, 2006 4:44 PM   Subscribe

why did monitors switch to buttons with menus for controlling brightness and contrast, rather than knobs? knobs are so much faster and easier - it was like a dimmer switch for your desk!
posted by wumpus to Computers & Internet (18 answers total)
Because there's a lot more to control than just brightness and contrast, so an interface for dealing with a large number of settings made more sense?
posted by TonyRobots at 4:51 PM on December 12, 2006

Response by poster: yeah, but there were always two knobs just for those two; then it became a button you pressed, with the setting then adjusted using those same knobs. Remember?
posted by wumpus at 4:57 PM on December 12, 2006

buttons are cheaper. A lot of interfaces went to buttons over the years (think of tuning a car stereo), and only recently, for cases like that where a knob is a much better interface, are they coming back. Presumably monitor settings are almost never adjusted, so the cost of the knob is not worth it. Plus, knobs take up additional space on newfangled flat LCDs.
posted by defcom1 at 5:00 PM on December 12, 2006

I think it's a valid question, still. I do calibrations for professional-grade CRTs where I work (animation/sfx house) and find the process immensely frustrating. Calibrations could be done much more efficiently if the gain, bias, brightness and contrast settings could be tweaked with a rotary encoder rather than buttons.

On monitors that cost $1000, why the hell not?
posted by Evstar at 5:10 PM on December 12, 2006

In the very old days, the adjustments were made with actual analog potentiometers. These had a number of drawbacks, especially on a multisync monitor, where settings for multiple scan frequencies had to be remembered. So at some point in the early 90s everything switched over to being digitally controlled, with a microprocessor that could remember the geometry, alignment, convergence, etc. independently for each resolution. It also saved a lot of board space as things became much more tightly integrated. So naturally, the controls turned to up/down buttons, since interfacing a rotary control to a digital value is much more expensive than two simple binary up/down buttons. And since CRTs used to be extremely expensive (especially large ones), there was a lot of pressure to reduce cost, so I can see a seldom-used control being a prime candidate for savings.
posted by Rhomboid at 5:31 PM on December 12, 2006
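Rhomboid's scheme, with per-resolution settings held in the monitor's memory and adjusted only by up/down buttons, can be sketched roughly like this. This is a hypothetical Python simulation for illustration, not any real monitor's firmware; the class, setting names, and 0-100 range are all assumptions:

```python
# Hypothetical sketch of digitally controlled monitor settings: the
# microcontroller keeps an independent settings record per video mode,
# and the only physical inputs are up/down buttons.

class MonitorSettings:
    def __init__(self):
        self.per_mode = {}  # one settings dict per detected video mode
        self.defaults = {"brightness": 50, "contrast": 50}

    def settings_for(self, mode):
        # remember independent values for each mode, like a multisync CRT
        return self.per_mode.setdefault(mode, dict(self.defaults))

    def button_press(self, mode, setting, direction):
        # up/down buttons step the stored value; range limits are enforced
        # in firmware rather than by a potentiometer's physical end stops
        s = self.settings_for(mode)
        step = 1 if direction == "up" else -1
        s[setting] = max(0, min(100, s[setting] + step))
        return s[setting]

m = MonitorSettings()
print(m.button_press("1024x768@85", "brightness", "up"))    # → 51
print(m.button_press("640x480@60", "brightness", "down"))   # → 49
```

Each mode gets its own copy of the defaults on first use, which is what lets the monitor recall different geometry or brightness per resolution without any extra hardware.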

Because digital is better?

OK, you can stop laughing now. The fact is, monitors are completely microprocessor-controlled nowadays. Contrary to defcom1, analogue knobs are still cheaper than buttons + microprocessor + D-A converters. The extra value of buttons over knobs comes from things like:
  • re-usability of design - no need to carry different value parts for different models, just change the firmware.
  • automated calibration - the calibration station can check the screen and update calibration values in ROM without having to involve an expensive human.
  • price differentiation - keep the same hardware, give it a prettier new UI, and bingo!, you can charge more for essentially the same product.
That, and the fact that people have been so convinced that digital automatically equals better that sometimes we demand digital when it's actually the inferior solution to a problem.

As for the $1000 pro level monitors? Dunno... I suspect it's because manufacturers know most purchase decisions are made by non-techies, and buttons are just so much more high-tech and obviously better than old-fashioned knobs.
posted by Pinback at 5:34 PM on December 12, 2006

It's not that digital is better, it's that it's cheaper, more consistent, and more reliable.
posted by Steven C. Den Beste at 5:36 PM on December 12, 2006

By the way, digital is also more flexible. A monitor I used to own had a USB port, and you could adjust it using a program on your computer instead of having to muck with the buttons.

It had about 30 settings, all told. Trying to do all of that with pots would have been ridiculous.
posted by Steven C. Den Beste at 5:38 PM on December 12, 2006

The fact is, monitors are completely microprocessor-controlled nowadays. Contrary to defcom1, analogue knobs are still cheaper than buttons + microprocessor + D-A converters.

I think you're missing part of the equation. Buttons + microprocessor + D-A converters are cheaper than analogue knobs + A-D converters + microprocessor + D-A converters.

Modern monitors are going to be microprocessor-based regardless of whether they use knobs or buttons. Knobs take up far more space and are more expensive. Ergonomics is relegated to third place.
posted by -harlequin- at 5:40 PM on December 12, 2006

There are fewer physical components when you integrate all the controls, which means lower cost. I'm also fairly sure the performance of the integrated solution is much better, which might translate to better refresh rates, etc.
posted by robofunk at 5:44 PM on December 12, 2006

I think you're missing part of the equation. Buttons + microprocessor + D-A converters are cheaper than analogue knobs + A-D converters + microprocessor + D-A converters.

Thanks, that is what I meant. As others have already stated, monitor controls went all digital, which is when the knobs disappeared.
posted by defcom1 at 6:39 PM on December 12, 2006

Menus let you have fewer controls, which is cheaper. They also let the device change settings for you, so that you can have different profiles for different inputs or different sync modes or whatever.
posted by aubilenon at 7:18 PM on December 12, 2006

buttons are cheap, knobs are expensive
posted by caddis at 7:30 PM on December 12, 2006

Also, buttons are less likely to fail than knobs.
posted by Doohickie at 7:44 PM on December 12, 2006

The digital/analogue thing doesn't matter, folks. Just because it's a knob doesn't mean it directly controls the setting -- it can still report to a processor which can change the settings in software (think of the iPod scrollwheel).

The only reason I can think of for not using them is cost; beyond that, most appliances just have pretty damn woeful design. The knob is making a comeback, however, as it has in kitchen appliances and cars. Give it time...
posted by bonaldi at 8:03 PM on December 12, 2006

Here's one reason -- it's easy to accidentally move a knob, even by an amount too small to notice, which could be a significant issue if you're relying on a color-calibrated display (calibrated around the now-accidentally-changed settings) to produce accurate, repeatable results.
posted by BaxterG4 at 9:20 PM on December 12, 2006

Optical encoder dials allow easy integration with microcontrollers, and they are pretty cheap, although still a lot more expensive than switches. I think I've seen high-end monitors with encoder dials.

Unfortunately, a lot of the benefits of dedicated potentiometer adjustments are lost with encoder dials. In particular, old style pots have hard end stops indicating the limits of the range of adjustment, a line to display the current setting, and sometimes detents to help in locating important settings. It is pretty easy to read a cheap potentiometer with a microcontroller, and this is done sometimes, but it is very hard for a microcontroller to set a dial*. If the user requires any alternate ways to access a setting, like a remote control, old style controls are just not practical.

*Bookshelf stereos and stereo receivers in the late 80s and early 90s had motor driven pots. I collect these out of old/broken units whenever I can because they are awesome and hard to find.
posted by Chuckles at 8:49 AM on December 13, 2006
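Chuckles' point about encoders losing the pot's end stops shows up directly in how they're read: a quadrature encoder reports only relative steps on its two phase lines, so absolute position and range limits have to be recreated in firmware. A rough Python sketch of the decoding, hypothetical and purely for illustration:

```python
# Hypothetical sketch: decoding a quadrature rotary encoder in firmware.
# The encoder emits only relative transitions on two phase lines (A, B);
# absolute position, end stops, and detents must all be recreated in
# software, which is what distinguishes it from a potentiometer.

# Gray-code transition table: (previous AB state, current AB state) -> step
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(samples, value=50, lo=0, hi=100):
    """Fold a stream of sampled AB states into a clamped setting value."""
    prev = samples[0]
    for cur in samples[1:]:
        step = TRANSITIONS.get((prev, cur), 0)  # 0 = no move or bounce
        value = max(lo, min(hi, value + step))  # software "end stops"
        prev = cur
    return value

# Two full clockwise cycles from a starting value of 50
print(decode([0b00, 0b01, 0b11, 0b10, 0b00, 0b01, 0b11, 0b10, 0b00]))  # → 58
```

Nothing here tells the user where in the range they are, which is exactly why the hard stops, indicator line, and detents of an old-style pot are lost.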

I would settle for a single knob and an array of buttons to select the function assigned to the knob. I have a bedside radio that more or less uses that concept. I for one HATE the poke-poke-poke-poke required to make adjustments on most button interfaces.
posted by cairnish at 9:01 AM on December 13, 2006
