Filmscanner input resolution?
January 8, 2005 12:16 PM   Subscribe

Filmscanner 101. Is there any advantage to setting the *input* resolution for scans intended for monitor display to anything greater than 72dpi? The default input resolutions on my scanner vary with the dimensions selected for the output file. Bigger output files require higher input resolutions (from 704dpi to 1759dpi). As everything will be displayed at 72dpi, why not *scan* everything at 72dpi? It all looks the same to me on my screen.
posted by carter to Technology (8 answers total)
 
My guess is zooming and printing.
posted by hendrixson at 12:25 PM on January 8, 2005


Monitors are not 72 dpi in any useful way. Ignore the input resolution and concentrate on the overall size in pixels (if the software allows you to work that way). Either scan at the size you require, or, if you're applying any effects or filters later, scan at 2 or 4 times that size and scale down afterwards, which will help hide the filtering artifacts.
posted by cillit bang at 12:37 PM on January 8, 2005
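[A minimal sketch of the scan-large-then-downsample approach described above; it assumes Python 3 with the Pillow library, and the file names and pixel sizes are made up for illustration.]

```python
# Scan at 2-4x the pixel size you actually need, then downsample once for the screen.
# Assumes Pillow is installed; the file names and sizes here are hypothetical.
from PIL import Image

scan = Image.open("slide_scan_2x.tif")        # e.g. a 2400 x 1600 px scan
target = (1200, 800)                          # the size you actually want to display

# LANCZOS is a high-quality resampling filter; doing any filtering before this
# single downsample helps hide the resulting artifacts.
web_copy = scan.resize(target, Image.LANCZOS)
web_copy.save("slide_for_web.jpg", quality=90)
```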


It depends on what you plan to do with the image. If there's a chance you may need to enlarge it later, zoom in on something, or print it, you'll most likely want to go with a higher dpi. Scan tips actually explains it better than I can.
posted by Slack-a-gogo at 1:35 PM on January 8, 2005


More from scan tips about the myth of 72 dpi.
posted by loquax at 1:37 PM on January 8, 2005


It depends on what you're scanning.

If your images are mostly of landscapes, for example, or other images with continuous tone, you could probably get away with scanning at 72 dpi. But if there's fine detail in the slide, then I'd recommend scanning at the larger size and then downsampling in Photoshop for optimum results. Photoshop's interpolation, on the whole, seems better than that of bundled scanner software.

It's the same principle that David Siegel recommends for 72dpi graphics in his book on web design. If you've got more detail to work with initially, you have more control over the final detail in the 72dpi version. For example, if you use Photoshop to sharpen an image at a larger size, the results are much better than sharpening the 72dpi version.

The other reason is "just in case." You might want to change the display size at some point in the future. You shouldn't interpolate up, and you shouldn't interpolate down more than once.
posted by Jeff Howard at 2:12 PM on January 8, 2005
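[To illustrate the "just in case" workflow above: a sketch, again assuming Python 3 with Pillow and hypothetical file names, of keeping one full-resolution master scan and deriving every display size directly from it, so nothing is interpolated up or downsampled more than once.]

```python
# Keep one full-resolution master; derive each display size from it directly,
# so no image is ever interpolated up or downsampled twice.
# Assumes Pillow is installed; file names and sizes are hypothetical.
from PIL import Image, ImageFilter

MASTER = "slide_master_full_res.tif"

def make_display_copy(size, out_name):
    img = Image.open(MASTER)
    # Sharpen at full resolution first; it holds up better than sharpening
    # the already-shrunk, screen-sized version.
    img = img.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
    img = img.resize(size, Image.LANCZOS)   # one downsample, straight from the master
    img.save(out_name, quality=90)

make_display_copy((800, 533), "slide_800px.jpg")
make_display_copy((400, 267), "slide_400px.jpg")
```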


I think you'll find that most software has better interpolation than most hardware. Hardware, especially that in low-processing-power devices like scanners and cameras, needs to consider speed and code size. On a computer, where you can have gigabytes of memory (RAM and disk), a wicked-fast processor, and a user who isn't going to panic if it takes a second or two to complete, you can do things that are much more computationally intensive.
posted by five fresh fish at 2:28 PM on January 8, 2005


Response by poster: [Thanks, everyone; this is all very useful.]
posted by carter at 2:39 PM on January 8, 2005


Is there any advantage to setting the *input* resolution for scans intended for monitor display to anything greater than 72dpi?

Well, yes. You want to scan at as high a resolution as possible, because your monitor only uses about 1/4th the dpi of print media (high-gloss fancy-pants magazines use even greater print resolutions). Dot-for-dot, that means if you scan something at 72dpi, then print it out on a 600 dpi printer, your image is going to be about 1/8th as large as the original. That's no good.
posted by Civil_Disobedient at 4:48 PM on January 8, 2005
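[A quick worked example of the dot-for-dot arithmetic above; the original width is an assumed number, not from the thread.]

```python
# A 6-inch-wide original scanned at 72 dpi gives 6 * 72 = 432 pixels across.
# Printed dot-for-dot on a 600 dpi printer, those 432 pixels cover only
# 432 / 600 = 0.72 inches, i.e. 72/600 of the original width (about 1/8).
original_width_in = 6      # assumed original width, in inches
scan_dpi = 72
printer_dpi = 600

pixels_across = original_width_in * scan_dpi
printed_width_in = pixels_across / printer_dpi

print(pixels_across)                          # 432
print(printed_width_in)                       # 0.72
print(printed_width_in / original_width_in)   # 0.12, roughly 1/8
```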

