Don't you know we're talkin' 'bout a resolution?
April 8, 2005 1:52 PM

Just curious: why is 72dpi the standard resolution for the internets?

A question that came up while we sit here willing Friday's workday to end. My meagre Google-fu and Wiki-fu found nothing.
posted by papercake to Computers & Internet (31 answers total)
 
Best answer: It's actually from the first Macs: their screens were very close to 72 dpi, which also means that 1 pixel on those screens equals 1 pt. More info here.
posted by nathan_teske at 1:57 PM on April 8, 2005


Because there isn't a "standard" resolution, and most people in the "industry" work on Macs, which, as pointed out above, were originally 72 dpi.

The dpi of most monitors varies wildly, from less than 72 dpi to well over 120 dpi.

So 72 dpi is a standard in the same way the Edison plug socket is a world standard. It isn't. :)

Someone has a hate site on the internets you might want to dig up, all about why people who believe everything on a screen is 72 dpi are morons. [heh]
posted by shepd at 2:14 PM on April 8, 2005


What's 72 dpi equal in the metric system?
A nice round number?
posted by Rash at 2:39 PM on April 8, 2005


You can visit here for a quick summary as well, if you can get past the author's underlying manic implication that screen resolution/display should be directly related to printer standards. We produce stuff for print on PCs and Macs all the time (and even both for the same job) and never have problems. Likewise for the Internet.

I suspect this current standard might change with higher-resolution displays in the future. Should be interesting. Add to that colour management, the emergence of vector graphics, and the abandonment of point sizes by any good web developer, and interesting times lie ahead.

Despite increasing resolutions, however, 72dpi still works and looks great on most monitors, and saves bandwidth too.
posted by juiceCake at 2:47 PM on April 8, 2005


2.83464567 dots/mm

As alluded to by nathan, there are exactly 72 points in one inch.
posted by cillit bang at 2:47 PM on April 8, 2005
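As a quick check on the arithmetic above, here is a minimal Python sketch of the unit conversions (72 points per inch, 25.4 mm per inch); nothing in it is specific to any particular program.

```python
# Unit facts used above: 72 points per inch, 25.4 mm per inch.
POINTS_PER_INCH = 72
MM_PER_INCH = 25.4

dpi = 72
print(dpi / MM_PER_INCH)      # 2.8346... dots per millimetre, the figure quoted above
print(dpi / POINTS_PER_INCH)  # 1.0 -> at 72 dpi, one pixel covers exactly one point
```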


postscript's default coordinate system is 72 dpi, too (or in points, if you prefer).
posted by andrew cooke at 3:02 PM on April 8, 2005


sorry, that's misleading. postscript's default coord system is in 1/72 of an inch (or points). the minimum size of a "dot" or pixel depends on the individual printer (and the ink, really).
posted by andrew cooke at 3:04 PM on April 8, 2005


Best answer: "why is 72dpi the standard resolution for the internets?"

It's not. In fact, DPI is a meaningless value when the physical size of the output (the "I" in DPI) isn't and can't be specified, and it's certainly not a standard.

However, if we take the underlying meaning of your question rather than the literal question you asked, there's more of an answer:

"Why are commonly available graphics on the internet, such as those used on web pages, so damn miniscule? Whose stupid idea was that?"

At one time, it was somewhat rare and costly to have a display subsystem capable of resolutions higher than the IBM PS/2's 640x480 VGA. The average output device was a CRT measuring 14" diagonally or less! Throughout the 90s, this remained a common configuration for many business and home PCs, although CRT sizes slowly but steadily increased. Even today, it's not unheard of to find a large 21" display being driven at 640x480 (I could throw a rock at one from here), though it's rare.

Because this 640x480 resolution was prevalent for such a great many years and more recently has represented a least common denominator, many individuals creating images for display on screen continue to treat the display as an extremely low resolution output device.

To further complicate matters, a very popular image manipulation program used by many people who create images for use on the display -- Adobe Photoshop -- continues to attempt to apply the concept of a mapping between inches and pixels to electronic images. Worse still, it uses the term "DPI" to do so throughout much of its user interface. While this is an inordinately valuable facility in an image manipulation program, and is doubly so for those visual artists who have a tenuous comprehension of what pixels are, it causes some confusion that persists even among those who you might think ought to know this stuff well.

Then of course, there's the short answer to your question: If, when rescaling an image in Photoshop, you type "72" into the dialog box's field marked DPI -- despite DPI being meaningless in this particular case -- the image will probably be reduced in resolution such that it will fit comfortably on the screen.

Why the arbitrary number 72? Because that's what many people who use the Macintosh, up until recently a tremendous majority of Photoshop users, think equates to the resolution of a screen. This is chiefly because at one time, very long ago, Macintosh screens were carefully made such that an inch of screen contained 72 pixels. As nathan mentions above, there was a very clever reason for this; since it no longer applies and never applied to any other display subsystem, anyone who thinks screens are 72 DPI is an idiot.
posted by majick at 3:16 PM on April 8, 2005 [1 favorite]
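To illustrate majick's point that the DPI number lives apart from the pixel data, here is a rough sketch using the Pillow imaging library rather than Photoshop; the file names are hypothetical. Rewriting the dpi tag changes only metadata, while an actual resample changes the pixel count.

```python
from PIL import Image  # Pillow

img = Image.open("photo.jpg")          # hypothetical source file
print(img.size)                        # pixel dimensions, e.g. (3000, 2000)
print(img.info.get("dpi"))             # resolution metadata, if the file carries any

# Changing the DPI tag alone rewrites metadata; the pixels are untouched.
img.save("photo_300dpi_tagged_72.jpg", dpi=(72, 72))

# Actually shrinking the image for the screen means resampling the pixels.
small = img.resize((img.width // 4, img.height // 4))
small.save("photo_web.jpg", dpi=(72, 72))
```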


72 DPI is, in fact, not the standard DPI of the Internets. 96 DPI is. When you specify a font size in points, it is converted to pixels by assuming 96 DPI. The CSS2 spec says to use 72 DPI, but nearly all browsers, including Mac ones, use the Windows standard of 96 DPI. (This is why 8 point fonts on Windows are readable but 8 point fonts on Macs are tiny.)

Internet Explorer for Mac had a sweet little function that let you hold a ruler up to your screen and it would figure out your real DPI for you. This way a font specified as 12 pt would really be 12 pt tall on your screen. Of course this defeats much of the point of running a monitor at a higher resolution.

DPI has nothing to do with images on the Web, as others have pointed out.
posted by kindall at 3:37 PM on April 8, 2005 [1 favorite]
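To make kindall's point-to-pixel arithmetic concrete, here is a small sketch of the conversion browsers effectively perform; 96 is the Windows/browser convention described above and 72 the CSS2/Mac figure, with 72 points to the inch in both cases.

```python
def points_to_pixels(points, assumed_dpi):
    """Font size in points converted to pixels: pixels = points * dpi / 72."""
    return points * assumed_dpi / 72

print(points_to_pixels(8, 96))  # ~10.7 px -> readable under the 96 dpi assumption
print(points_to_pixels(8, 72))  # 8 px -> tiny under the 72 dpi assumption
```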


majick speaks the truth. I remember being incorrectly lambasted by a web designer because the JPEGs I had on a site were set at a meaningless 300dpi.

As an aside, as stated in nathan_teske's link, the original Apple ImageWriter dot-matrix printers also printed at a mere 72 DPI, resulting in nice pixelly WYSIWYG.
posted by zsazsa at 3:39 PM on April 8, 2005


(This is why 8 point fonts on Windows are readable but 8 point fonts on Macs are tiny.)

That is to say, this is why this is true in programs other than Web browsers. An 8 point font in Microsoft Word for Windows at 100% zoom looks much bigger on-screen than an 8 point font in Microsoft Word for Macintosh at 100% zoom, which is why Macintosh users tend to leave their zoom set around 125% in Word. (The font size is the same when printed on both platforms, of course.) Windows UI fonts are 8 points while Mac UI fonts are 12 points. Etc.

Web browsers, as I said, assume 96 DPI on all platforms.
posted by kindall at 3:41 PM on April 8, 2005


kindall, Windows has a DPI ruler, kind of like IE for Mac. It's really not perfect, as it seems to just switch your fonts between tiny and huge. As higher-res screens become the norm (my laptop has a 130DPI screen, and it wasn't even the highest option), being able to change the DPI your operating system thinks it's at is going to matter more and more.
posted by zsazsa at 3:47 PM on April 8, 2005


Actually, if you want to get really persnickety, a point is not 1/72nd of an inch--it is 1/72.27. The difference is because the point measure was rounded off to only 4 (?) significant digits.

The original Macs used 1/72 because it was close enough for the typography they could (then) produce, and it made for a familiar unit of measure. But real typography requires that the difference be respected.
posted by adamrice at 4:59 PM on April 8, 2005


I don't understand why this is handled so poorly. Any vaguely recent screen can report its physical dimensions. The screen resolution is known. Calculating the actual dpi value from this is trivial. So I don't understand why so much software blindly assumes 72 or 96 dpi.
posted by reynaert at 5:42 PM on April 8, 2005
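reynaert's "trivial" calculation really is just geometry. Here is a sketch, assuming you already know the monitor's pixel dimensions and physical diagonal; the code does not query anything from the hardware, and the example sizes are only illustrative.

```python
import math

def screen_ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from the pixel dimensions and the physical diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

print(round(screen_ppi(640, 480, 14), 1))      # ~57 ppi on a nominal 14" CRT (viewable area is smaller)
print(round(screen_ppi(1400, 1050, 13.3), 1))  # ~131 ppi, in the range of zsazsa's laptop
```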


PostScript points are also 1/72 of an inch instead of 1/72.27. This also means that 1 inch = 6 picas and 1 pica = 12 points. This simplification was a good thing.
posted by D.C. at 5:57 PM on April 8, 2005


These discussions always end up drowning me in the historical minutiae of typography and screen development, but they never produce any helpful instructions.

Given that Photoshop handles this in a confusing way, what are some guidelines for using it to reprocess images for the web vs. for print?
posted by Tubes at 10:36 PM on April 8, 2005


Response by poster: Well, this is much more interesting and confusing than I thought it would be. Whenever I've been asked for an image to post online, the specifications have been 72 dpi. (I am working in a Mac environment.)

So, let's say you have an image -- a 300 dpi image -- and you want to post it on the web, in your blog or whatever... at what resolution would you save it?

For instance, I notice the AskMeFilter logo gif in the upper left hand corner is at 72 dpi when opened in PS (not 96).
posted by papercake at 5:36 AM on April 9, 2005


it doesn't matter. images are stored as pixels. a 72x72 pixel image will take up 1" on a 72 dpi screen. it will take up 0.5" on a 144dpi display. what the image metadata itself says is irrelevant (except that it might be used by printing software as a hint for the expected default dimensions for printing when the image is printed alone).

i suggest you read the comments above again. this is what majick is referring to when they said "DPI is a meaningless value when the physical size of the output (the 'I' in DPI) isn't and can't be specified" - the dpi is a property of the display, not of the image itself.
posted by andrew cooke at 5:42 AM on April 9, 2005
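A tiny sketch of andrew cooke's point in code: the same pixel data occupies different physical sizes on displays of different pixel density, and nothing in the image itself changes.

```python
def displayed_inches(pixels, display_ppi):
    """Physical extent of a run of pixels on a display with the given pixels per inch."""
    return pixels / display_ppi

print(displayed_inches(72, 72))    # 1.0 inch on a 72 ppi screen
print(displayed_inches(72, 144))   # 0.5 inch on a 144 ppi screen
```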


Tubes - just stop thinking about dpi as anything other than a knob you can twiddle in some software to change sizes inside that program. that's all it is. otherwise, you need to think of pixels (for raster formats like gif, jpeg etc; for vector formats - postscript/svg - the whole issue is pretty much irrelevant).
posted by andrew cooke at 5:46 AM on April 9, 2005


Response by poster: andrew, I think I understand the above comments, and the "meaninglessness of DPI" (which sounds like a koan). But, then again, perhaps I don't... when a web designer creates a page, how do they determine the pixel size of images, taking into account the various displays? Is it just a matter of checking out the page at the most common resolutions and making sure the page looks good in all of them? And, if that's so, why am I always asked for 72 dpi versions of print work for display? Shouldn't they get a higher resolution so they can have more options?

I'm sorry if I'm just being thick-headed. For some reason I'm having a hard time wrapping my mind around this.
posted by papercake at 8:22 AM on April 9, 2005


Is it just a matter of checking out the page at the most common resolutions and making sure the page looks good in all of them?

yes. note that the image doesn't stay the same physical size. the number of pixels is fixed, so if the display has more pixels per inch, the picture appears smaller.

And, if that's so, why am I always asked for 72 dpi versions of print work for display?

in practice, that's just shorthand for "image for display on a screen". really they are saying "an image that will look good when displayed at 72dpi" which is about what a screen is (96 is about 72, compared to 300, say). in contrast, if they ask for a 300 dpi image for printing, then what they're saying is that the printer will print at 300 dots per inch. so to print at the same size it appears on the screen, an image needs about three to four times as many pixels in each direction.

so it's the combination of
  • the number of pixels in the image and
  • the resolution (dpi) of the display/printer
that is important. setting the dpi on the image doesn't do anything (outside of whatever photoshop does when you change the value) because that number describes the display/printer, and not the image itself.

is that clearer? disclaimer - i am not a graphics designer. just someone frustrated with the muddle when trying to talk to such people.
posted by andrew cooke at 9:14 AM on April 9, 2005
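Putting andrew cooke's two factors together, here is a sketch of the calculation that matters in practice: how many pixels are needed to fill a target physical size at a given device resolution. The 300 and 96 figures are just the example values used in the thread.

```python
def pixels_needed(inches, device_ppi):
    """Pixel count required to fill a physical length at a given device resolution."""
    return round(inches * device_ppi)

# A 4 x 5 inch print on a 300 dpi printer vs. the same physical size on a ~96 ppi screen.
print(pixels_needed(4, 300), pixels_needed(5, 300))  # 1200 x 1500 pixels
print(pixels_needed(4, 96), pixels_needed(5, 96))    # 384 x 480 pixels
```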


When I'm designing a graphic for a web page, I size it in terms of pixels. I don't say to myself "I want this to be 2 inches square", I say "I want this to be 200 pixels square." The browser just shows 1 pixel as 1 pixel (unless you've resized it), regardless of the putative image dpi, screen resolution, default browser resolution, etc.

As you know, the web is a fluid medium. You don't know how text will look next to a graphic because you don't know the type size or typeface that the UA will render the page in--you can just take your best guess as to the range of possibilities, and try to nudge it in the right direction. It is possible to resize an image in ems or points using CSS, but I don't think it's a good idea.

Why you would be asked for 72dpi versions, I don't know. The web doesn't care. The only scenario I can think of is this: if you've produced an image to print as, say, 4" x 5" @ 300 dpi, you definitely don't want to use that image as-is on a web page because it will be much bigger than most screens. So you need to downsample it so that it will show at a reasonable size in most browsers. I don't use PS myself, but I believe that if you tell it to change the image from 300 dpi to 72 dpi, it will downsample it while preserving the image's print dimensions. Mind you, this doesn't mean that it will appear as a 4" x 5" image on anyone's monitor (except for that rare person who is really running a monitor that displays 72 pixels per inch).

Sometimes you'll notice people distinguish between print and screen image resolution by calling the former dpi, and the latter ppi.
posted by adamrice at 9:19 AM on April 9, 2005
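A sketch of the arithmetic adamrice describes: resampling from 300 dpi to 72 dpi while keeping the nominal print size means scaling the pixel dimensions by 72/300 (this shows only the numbers, not what Photoshop itself does).

```python
def resample_for_dpi(width_px, height_px, old_dpi, new_dpi):
    """New pixel dimensions when the nominal print size is kept but the dpi value changes."""
    scale = new_dpi / old_dpi
    return round(width_px * scale), round(height_px * scale)

# A 4" x 5" image at 300 dpi is 1200 x 1500 px; "changed to 72 dpi" it becomes 288 x 360 px.
print(resample_for_dpi(1200, 1500, 300, 72))   # (288, 360)
```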


if you tell it to change the image from 300 dpi to 72 dpi, it will downsample it while preserving the image's print dimensions

the printer has software that will scale images to whatever size you want. most image formats include something that says what the size of a pixel should be. if that doesn't match the size of the printer's "pixels" then the printer software can scale the image. alternatively, the operator can scale the image before printing.

so if you take a 4x5" 300dpi image and "change it to 72dpi", you reduce the number of pixels in the image. you also change the metadata in the image to say that it was generated for a 72 dpi display. this (ppi) metadata is ignored by web browsers, but printer software may notice it and, if the printer is 300dpi/ppi, it will automatically scale the image by a factor of (300/72) so that it prints the same size. so you still get a 4x5" print, but it is "blocky".

that is just the printer software being clever. if you erased all the metadata, or used a stupid printer, it would print it with one pixel in the image to one printed dot on the paper, and the image would print smaller than 4x5" (it would be about an inch square if the printer was 300dpi).

so there are two separate things: number of pixels and image metadata. number of pixels is what is most important. it controls how things appear in browsers and controls the quality of the image. metadata is (optionally) used by printers to print at the "right" size, but this can be over-ridden by the operator (eg you, in the print dialog).

again, disclaimer - this is from second hand knowledge; i don't do this for a living.
posted by andrew cooke at 9:41 AM on April 9, 2005
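Here is a sketch of the two outcomes andrew cooke describes for the same pixel data on a 300 dpi printer: honour the resolution metadata (scale to the nominal size) or ignore it (map one image pixel to one printer dot). The numbers simply follow his 4 x 5 inch example.

```python
def print_size_inches(width_px, height_px, metadata_ppi=None, printer_dpi=300):
    """Printed size: use the image's ppi metadata if present, else one pixel per printer dot."""
    ppi = metadata_ppi if metadata_ppi else printer_dpi
    return width_px / ppi, height_px / ppi

pixels = (288, 360)                                   # the 72 dpi downsample from above
print(print_size_inches(*pixels, metadata_ppi=72))    # (4.0, 5.0) -> scaled back up, "blocky"
print(print_size_inches(*pixels, metadata_ppi=None))  # (0.96, 1.2) -> "about an inch square"
```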


sorry, i realise that my post just above contradicts slightly what i said earlier. i was trying to simplify things by ignoring resolution metadata, but maybe i just made things worse.
posted by andrew cooke at 9:43 AM on April 9, 2005


It's becoming clearer... I feel a fog beginning to lift...
posted by Tubes at 10:49 AM on April 9, 2005


reynaert: I don't understand why this is handled so poorly. Any vaguely recent screen can report its physical dimensions. The screen resolution is known. Calculating the actual dpi value from this is trivial. So I don't understand why so many software blindly assumes 72 or 96 dpi.

You'd think so! But Windows is not designed to compensate for your current/actual resolution. I think (imagine (guess)) that OS X can do that and does, but MS doesn't really seem to care. As sort of mentioned earlier, you can adjust your screen DPI if you know you're going to be sticking with a large resolution, but so incredibly few people use this feature of Windows that it is buggy and messes up a lot of software.

It's in Display Properties: Settings tab: Advanced button: General tab: DPI setting. "Custom setting" allows you to pick whatever you like.
posted by blacklite at 11:43 AM on April 9, 2005


In fact, CSS2.1 assumes 96 dpi.
posted by joeclark at 12:45 PM on April 9, 2005


i would appreciate it if someone who does this for real could confirm what i said. i'm worried i'm wrong somewhere. thanks.

also, CSS2.1 is more complicated than joeclark reports. it recommends 96 dpi when the display is at "arm's length" - that doesn't mean it assumes anything. the dpi of the display device is used internally so that the absolute lengths (points, inches, etc) come out correctly, but this only works if the dpi is known. so they are quite careful to say that things only work correctly if the dpi is somehow available, and recommend a certain value only for certain uses.
posted by andrew cooke at 1:44 PM on April 9, 2005


Also bear in mind that when people say DPI in reference to a screen, they actually mean PPI (pixels per inch). DPI refers to dots on print, and it takes many dots to create a colour pixel (since you need sufficient area for spaces as well as dots in each pixel in order to produce lighter tones of ink and thus colour).

Therefore, a 72 DPI printer cannot print 72 pixels per inch.

Which just totally adds to the general confusion. DPI is a print specification, and is worse than useless when you try to apply it to video screens. It's like beating a round peg into a square shape so that you can fit it into a round hole - the peg already fit the hole before, and now it doesn't.
posted by -harlequin- at 9:01 AM on April 10, 2005
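One common rule of thumb behind -harlequin-'s point that many printer dots go into a single tonal "pixel" (a sketch, not a model of any particular printer): the number of grey levels a halftone screen can reproduce is roughly the square of printer dpi divided by halftone screen lpi, plus one.

```python
def grey_levels(printer_dpi, screen_lpi):
    """Rough halftoning rule of thumb: levels ~= (dpi / lpi)^2 + 1."""
    return int((printer_dpi / screen_lpi) ** 2) + 1

print(grey_levels(300, 53))    # ~33 levels: a 300 dpi device needs a coarse screen to render tone
print(grey_levels(1200, 150))  # ~65 levels at a finer 150 lpi screen
```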


Actually, I'd like to word that a little more strongly. It's not just people who think screens are 72 DPI that don't get it, it's people who think screens have DPI at all. Screens are PPI - their pixels are not created by dot density, but by the brightness of the sub-pixel elements, so DPI is simply not a relevant concept. Breaking the meaning of DPI to incorrectly say "DPI" instead of "PPI" when talking about a screen causes real confusion when people go to print a (raster) image and discover that DPI does not mean what they thought it meant (PPI), sometimes with catastrophic results.

I've worked in various print, web, and video companies over the years. I try to avoid dealing with clients directly, but it's usually a safe bet that if someone says "DPI" for any medium but print (or perhaps the process of scanning a print), then they don't really understand what DPI is, and your job is to ensure you know what they meant, not what they've said. The specification is incoherent - a warning bell that it might not say what the client thinks it says. (You should do that anyway, but some clients clearly know what they're doing, some don't and are aware of that, and some don't and don't realise it.)
posted by -harlequin- at 9:39 AM on April 10, 2005


Response by poster: Thanks for the clarification. I should mark pretty much the entire thing "Best Answer." I work entirely in print, and so the web's use of images is somewhat foreign to me. Thanks for taking the time.
posted by papercake at 1:52 PM on April 12, 2005

