How to relate Pixels to inches
June 7, 2007 8:44 AM

I cannot relate picture size in pixels to actual dimensions.

I am looking for a source to educate me in how to relate a picture on my HD from Pixel size (say 800 x 600) to dimensions in inches. Do such things relate? Why is it so confusing? Any help is appreciated.
posted by JayRwv to Computers & Internet (16 answers total) 4 users marked this as a favorite
 
Could you be at all more specific? Are you talking about the size, in inches, of the thing you took an image of? How big of a print you can make? How big the physical sensor is? How big a pixel on your screen is?
posted by 0xFCAF at 8:55 AM on June 7, 2007


It depends on the DPI of your output device. If you're printing on a 300 DPI printer, then an 800x600 image will be (800/300) x (600/300) inches, or roughly 2.67 x 2 inches.

Of course, this is without any kind of scaling. You can print at any size via scaling, but the above calculations will give the most accurate reproduction of the image.
posted by demiurge at 8:58 AM on June 7, 2007
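That arithmetic can be sketched in a few lines of Python (a minimal illustration, not from the thread; the function name is mine, and it assumes one image pixel maps to one printer dot, which later answers qualify):

```python
def print_size_inches(width_px, height_px, dpi):
    """Physical print size, assuming one image pixel per printer dot."""
    return width_px / dpi, height_px / dpi

w, h = print_size_inches(800, 600, 300)
print(f"{w:.2f} x {h:.2f} inches")  # 2.67 x 2.00 inches
```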


The number you need is called "DPI", short for "dots per inch". 75 DPI and 300 DPI are common. Different displays will have different DPI ratings, and that will also depend on the display mode.

It isn't necessarily square, either; in some cases the vertical DPI differs from the horizontal DPI.
posted by Steven C. Den Beste at 9:00 AM on June 7, 2007


For screen display, the "standard" is 72 pixels per inch, but it's sort of a fiction, as best as I can tell.
posted by Richard Daly at 9:04 AM on June 7, 2007


Do such things relate?

No, they don't. Conceptually your question comes down to "how big is one pixel?" And the answer is "I dunno, how big is one pixel?" How big's your output device? What is your output device? Is it a screen? A printer? Both?

Let's try it from another tack: how big is 800x600? Well, I have a 21" CRT screen. I can set its resolution to 800x600 and 800x600 will be just about 20 inches across the diagonal. But wait! I also have an ancient 14" VGA monitor hooked up as a second display next to it. It's actually already set to 800x600 resolution, so that must mean it's not 20" diagonal, but 14ish! GOOD GOD, IT'S ALL GONE WRONG!

Why is it so confusing?

Because a pixel is a dot. It's not a measurement. It's just a dot. It turns out a 600 pixel line is about an inch long on my cell phone, but several inches long on my crappy nearly-dying secondary screen. Why? Because a pixel isn't a measurement. It's like saying a smurf is three apples high. How big's an apple?
posted by majick at 9:05 AM on June 7, 2007 [2 favorites]


This is a pretty good explanation, just replace "bitmap" with whatever file type you're using, and the same principles apply:

a) The only physically fixed dimensions that a bitmap image has are the number of pixels wide and high in the image (eg 900 x 700 pixels)

b) You can choose to reproduce that bitmap at any physical size, simply according to how large each pixel is made at the time of printing. For example, if each pixel is printed 1/100th of an inch wide, the printed 900 x 700 image would be 9" x 7".

c) Resolution is a measure of how accurately defined an image is, in terms of how many pixels appear in each unit of width or height. The effective resolution in dpi of the final printed image is therefore just the actual width of the printed image divided by the number of pixels shown across that width (ie 100 dpi in the example above).

d) Software may allow you to attach a dpi value to a particular file, or may attach one by default (typically 72 or 96 dpi for a screen-oriented application, or 300 dpi for a professional print-oriented application). But this number is entirely arbitrary and serves only one purpose: it allows the software to represent dimensions in 'real world' units such as inches, rather than pixels. It does not alter or control the definition of the image itself, merely what dimensions will be shown on the display and, usually, what default printing width will be used. (via)
posted by desjardins at 9:07 AM on June 7, 2007
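Points (b) and (c) above can be sketched as a quick calculation (plain Python, names mine; not part of the quoted explanation):

```python
def effective_dpi(pixels_across, printed_width_inches):
    """Point (c): effective resolution = pixels across / printed width."""
    return pixels_across / printed_width_inches

# Point (b)'s example: 700 pixels printed at 1/100th of an inch each -> 7 inches wide
print(effective_dpi(700, 7.0))  # 100.0
```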


Oh and Yes, the 72 dpi "standard" is a fiction.
posted by majick at 9:08 AM on June 7, 2007


One tool that might help to understand the relationship is Photoshop. If you open your 800x600 image and access the Image Size dialog box, it'll show you the relationship between three measurements:

1) The pixel dimensions of your image
2) The resolution of your image in dots per inch (DPI)
3) The resulting physical dimensions

If you adjust the physical size, Photoshop automatically adjusts the resolution to compensate, and vice versa. For instance, if you decrease the physical size it'll increase the resolution.

What you'll notice is that the pixel dimensions don't actually change while you're doing all these manipulations. Physical size can be whatever you want, as long as the DPI changes in tandem. The number of pixels doesn't change--only their size.
posted by Jeff Howard at 9:29 AM on June 7, 2007
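That inverse relationship between physical size and resolution, with the pixel count held fixed, is easy to demonstrate (a sketch in plain Python, not Photoshop's actual code; the function name is mine):

```python
width_px = 800  # pixel dimensions stay fixed throughout

def dpi_for_physical_width(width_px, width_inches):
    """Pin the physical size, and the resolution must compensate."""
    return width_px / width_inches

print(dpi_for_physical_width(width_px, 8.0))  # 100.0 dpi
print(dpi_for_physical_width(width_px, 4.0))  # 200.0 dpi -- halve the size, double the resolution
```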


It depends on the DPI of your output device. If you're printing on a 300 DPI printer, then an 800x600 image will be (800/300) x (600/300) inches, or roughly 2.67 x 2 inches.

This is wrong. DPI is dots per inch, not pixels per inch (PPI). Dots are ink. It typically takes many dots to create a pixel, because a pixel has colour and a dot does not, so some dots and spaces between them are needed to create a gradient (ie a colour) in order to depict one pixel.
(This is also why the DPI of modern printers is usually more in the ballpark of 1200 DPI than 300. 300 DPI isn't very good).

The short answer is that you can print an 800x600 image at any size you like. The only caveats are visual: the bigger you print it, the larger the pixels will be, which can be an ugly look if they're too big. Likewise, the smaller you print it, the more detail may be lost as you reach the point where the picture contains more detail than the printer can reproduce at that size.

If it helps to have a ballpark, my own preference is often to print at 10 pixels per mm, because it's very easy to calculate print size from resolution (800x600 becomes 80mm by 60mm), and this is a level of detail that is beyond almost all printers, but not greatly so. The result being that some detail is lost, but pixels cannot be seen. (It also makes your pixel-level grids and editing match up exactly with your print dimensions, but that's probably not important to you :-)
posted by -harlequin- at 9:33 AM on June 7, 2007 [1 favorite]
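The 10 pixels-per-mm rule of thumb above (which works out to about 254 ppi) can be written down as a one-liner (an illustrative sketch, function name mine):

```python
def print_size_mm(width_px, height_px, px_per_mm=10):
    """Print size at the commenter's preferred 10 pixels per mm (~254 ppi)."""
    return width_px / px_per_mm, height_px / px_per_mm

print(print_size_mm(800, 600))  # (80.0, 60.0) -> 80mm x 60mm
```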


You might want to skip this, as it might confuse you more, but if you're less interested in what size you want your image printed, and more interested in what size gives the maximum print quality with a printer, then assuming a halftone process, the number you want is the printer's lpi (lines per inch).

LPI sort of sits between DPI and PPI. Your PPI should be 1.5 to 2 times the printer's LPI to get maximum print quality, and the LPI will be a fraction of the DPI.
posted by -harlequin- at 9:46 AM on June 7, 2007 [1 favorite]


The thread that majick linked to had a useful discussion on this.

I think it is useful to draw a distinction between logical resolution and physical resolution.

Macs have always used a logical resolution of 72 ppi; Windows has always used 96 ppi. What this means is that if you are in a drawing program that shows on-screen rulers, and create a line that appears to be one inch tall (as measured by the on-screen ruler), the line will be drawn with 72 pixels on a Mac screen, 96 on a Windows screen.

The fact is, however, that these days the physical resolution of many screens is much higher. With some of the high-res laptops these days, the actual number of pixels that fits into an actual inch may be more like 150--so that line that supposedly measures one inch on screen may be only about half an inch long if you hold a physical ruler up to it.

If you display an 800x600 pixel image on a screen with a physical resolution of 150 dpi, it's going to be 4" tall if you hold a ruler up to it.

What if you print it? That depends. With a bitmap image like a photo, it may assign one (logical) pixel to one dot. In which case, demiurge's answer is correct. But software may assign a logical dpi along the way, and force it to print one logical pixel so that it covers 1/72", or 1/300", or whatever, even though the printer may be capable of printing dots that are 1/1200". I've noticed that Photoshop is print-centric: it seems to give priority to intended print size, and makes the total number of pixels and dpi fit that, which is a little disorienting.
posted by adamrice at 10:07 AM on June 7, 2007
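The logical-vs-physical distinction above can be sketched numerically (plain Python; the function names are mine, and the 150 ppi panel is the answer's own example):

```python
LOGICAL_PPI = {"mac": 72, "windows": 96}

def on_screen_pixels(inches, platform):
    """How many pixels the OS draws for a 'one inch' line on its on-screen ruler."""
    return inches * LOGICAL_PPI[platform]

def physical_inches(pixels, physical_ppi):
    """How long those pixels actually are on the panel, per a real ruler."""
    return pixels / physical_ppi

px = on_screen_pixels(1, "mac")  # 72 pixels for a "one inch" line
print(physical_inches(px, 150))  # 0.48 -- about half an inch on a 150 ppi laptop screen
```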


This is wrong. It typically takes many dots to create a pixel, because a pixel has colour and a dot does not, so some dots and spaces between them are needed to create a gradient (ie a colour) in order to depict one pixel.

That's right. In my calculations I was assuming a continuous tone printer. Your common ink jet printer will have to use dithering to reproduce colors and thus needs a higher DPI.
posted by demiurge at 10:09 AM on June 7, 2007


Here's a blog entry I wrote last year about this very topic.
posted by plinth at 10:30 AM on June 7, 2007


Response by poster: There is enough information you have provided to keep me busy for a considerable time. I will digest it and check the links given. Thanks for all the great responses.
The question was prompted by "picture-resizer" software. I have a picture that is 12 inches tall by 20 inches wide. Of course I have no need for an image that big. So when I went to resize the image I was given options "in pixels", ie 800x600 for example. It would make more sense to me if the option was in inches, ie 4" x 6", for example. I relate to 4 x 6, 5 x 7, etc. for printing. Thanks again.
posted by JayRwv at 10:55 AM on June 7, 2007


When you say "I have a picture that is 12 inches tall by 20 inches wide," what you mean is "I have a picture that is 12 inches tall by 20 inches wide when printed/displayed by a specific program or device."

If you used a different program to print/display the same image, then it could come out at a different physical size.

If you are printing the picture, you do not want to resize the image to a different number of pixels. That would almost certainly decrease the resolution and level of detail in the image. Instead, you should keep the image file the same, but tell your printing software to print it at a specific physical size. Any good photo software will let you do this.

If you just want to look at the picture on your own computer screen, you don't need to resize the file. Instead, just use the "zoom" buttons in any image-viewing program to display it at a larger or smaller size.

If you want to put the picture on a web page or otherwise send it to other people to view on screen, then you should resize the file to fit on a typical computer monitor. Common monitors are around 800 to 1600 pixels wide, so an 800- to 1600-pixel-wide image will take up the whole screen when zoomed in to show maximum detail. The size in inches will be different on different monitors, but usually around 72 to 100 pixels per inch.
posted by mbrubeck at 11:13 AM on June 7, 2007


What I have noticed in Photoshop, InDesign, etc. is that even if the measurements are displayed in pixels, you can type the measurement you want in inches and the software will convert it.
posted by elle.jeezy at 1:27 PM on June 7, 2007

