Why is digital color matching so difficult?
November 7, 2023 7:06 AM Subscribe
If I take two digital photos of exactly the same object but with different lighting, matching the colors of the two photos with Photoshop (or in this case Pixelmator) is a difficult, nay impossible, task. I'm sure there are a lot of reasons but I'd like to collect a detailed enumeration of them.
One thing that will help solve your problem is to use a white balance card in each photo -- i.e., a piece of paper with white, black, and 50% gray squares on it. Some white balance cards have a full array of colored squares too.
Color is relative, so the reference points determine the tint or color cast of the resulting image. Another thing that can help is to use Lab color space, which holds subjective Lightness as fixed but allows adjustment of blue/yellow and green/magenta tints separately.
If the difference between the lighting setups of the two photos is limited to the brightness/intensity and directionality of the light, then you will probably be able to adjust to make the two similar. However if the color/temperature of the lighting conditions is radically different, the whole color universe can become irreconcilable. Like, for example, warm incandescent light versus the blue ambient light of outdoor shadows lit by the sky.
posted by MetaFilter World Peace at 7:27 AM on November 7, 2023 [4 favorites]
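The gray-card approach above can be sketched in a few lines of Python. This is a minimal illustration, not production code: it assumes you have already sampled the card's average RGB from each photo, and the function name and pixel values here are hypothetical.

```python
import numpy as np

def white_balance_from_gray(image, gray_patch_rgb):
    """Scale each channel so the sampled gray patch becomes neutral.

    image: float array of shape (H, W, 3), values in [0, 1]
    gray_patch_rgb: average RGB measured on the gray card in this photo
    """
    patch = np.asarray(gray_patch_rgb, dtype=float)
    # Per-channel gains that pull the patch to its own mean (neutral gray)
    gains = patch.mean() / patch
    return np.clip(image * gains, 0.0, 1.0)

# Hypothetical warm cast: the gray card reads reddish at (0.6, 0.5, 0.4)
photo = np.ones((2, 2, 3)) * [0.6, 0.5, 0.4]
corrected = white_balance_from_gray(photo, gray_patch_rgb=[0.6, 0.5, 0.4])
# After correction the card area reads neutral, about (0.5, 0.5, 0.5)
```

Applying the same normalization to both photos, each against its own card, brings their neutrals into agreement even though the raw casts differ.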
Short answer? Because when an object is exposed to two different lighting sources, the camera sensor will record two different colors, and it can be very difficult to adjust. It may be possible if you have the raw sensor data and a very good idea of exactly which frequencies of light are involved and in what ratio. The white balance tool in an app like Lightroom can probably get you close. Sometimes.
Medium Answer: Put a green object under a blue light and take a photo. Now do it under a yellow light. Now try to make it "green" again. White light is not really white; our brains just do an on-the-fly adjustment for the prevailing color temperature. Camera images short-circuit this adjustment and expose (pun intended) the reality of the situation.
Long Answer: Flip through Vision and Art: The Biology of Seeing by Margaret Livingstone. I highly recommend that you snag a physical copy from your local library if possible.
posted by SegFaultCoreDump at 7:28 AM on November 7, 2023 [2 favorites]
Some file formats use a reduced color palette. As a result, pictures taken when the human eye can't detect a change will end up with slightly different sets of colors; if the lighting is noticeably different to a human viewer, then the whole range of available colors will be, too.
I worked in desktop publishing in the 1990s and color-matching is hard. Additive versus subtractive color, CRT displays versus LCDs, CMYK ink versus hexachrome inks or spot inks... It's just hard. And that's with instrumentation: subjective human experience makes it much squishier!
posted by wenestvedt at 8:19 AM on November 7, 2023 [1 favorite]
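The reduced-palette point can be demonstrated with a toy quantizer. A sketch, assuming 8-bit input channels; the specific green values are invented to straddle a quantization boundary:

```python
def quantize(value, bits=5):
    """Reduce an 8-bit channel value to the given bit depth and scale it
    back, roughly as a reduced-palette format does."""
    levels = (1 << bits) - 1
    return round(value / 255 * levels) * 255 // levels

# Two greens a human viewer would struggle to tell apart...
c1, c2 = (0, 126, 0), (0, 128, 0)
q1 = tuple(quantize(v) for v in c1)
q2 = tuple(quantize(v) for v in c2)
# ...land on different palette entries after 5-bit quantization, so
# re-encoding two nearly identical photos can yield different color sets.
```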
If you're interested, read up on the physics of light and color (it's pretty interesting). But in short, there is no inherent color to any object, or at least not exactly - how much light an object is absorbing and how much it's reflecting can change its color pretty dramatically, which changes in different light conditions and what objects are around it.
I studied and then practiced photography in an earlier stage of life, and I got pretty good at color matching, but it was something that took time. I don't know about Pixelmator, but Photoshop allows you to adjust cyan, yellow, magenta, red, green, and blue - over time, you'll get good at looking at an image and thinking "hmm, it's a bit too blue - needs some yellow - and perhaps it's also slightly green, needs a bit of magenta."
posted by coffeecat at 8:28 AM on November 7, 2023 [1 favorite]
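That "needs some yellow, needs a bit of magenta" workflow can be imitated crudely in code. This is a simplified opponent-axis tweak in RGB, not Photoshop's actual color-balance algorithm; the function name and offsets are hypothetical:

```python
import numpy as np

def color_balance(image, yellow_blue=0.0, magenta_green=0.0):
    """Nudge an RGB image along two opponent axes.

    Positive yellow_blue adds yellow by removing blue; positive
    magenta_green adds magenta by removing green.  Pixel values are
    floats in [0, 1] and offsets are small, e.g. 0.05.
    """
    out = image.copy()
    out[..., 2] -= yellow_blue    # less blue = warmer / yellower
    out[..., 1] -= magenta_green  # less green = more magenta
    return np.clip(out, 0.0, 1.0)

# A one-pixel image with a bluish-green cast: "too blue, slightly green"
cast = np.array([[[0.5, 0.55, 0.6]]])
fixed = color_balance(cast, yellow_blue=0.1, magenta_green=0.05)
# The pixel lands on neutral gray (0.5, 0.5, 0.5)
```

The skill the comment describes is judging those two offsets by eye rather than computing them.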
This is a very complicated question involving psychology, human anatomy, biology, the environment, display technology, image processing, motion processing, and a bunch of software topics too. I once took a whole graduate class on this topic. If you really want "a detailed enumeration" be prepared to do a lot of research. Maybe you just want the high level list of factors, not details on each one.
The course was at the MIT Media Lab in 1999, "Digital Image Processing for Hard Copy" (MAS 814), taught by Mike Bove. Nothing of it remains online and that's long enough ago I doubt much of the syllabus would be useful now.
posted by Nelson at 9:59 AM on November 7, 2023 [2 favorites]
Because they are two different things, I'd argue. Lighting has a powerful effect on our perceptions, so trying to match the same color with different lighting is like trying to match apples and oranges.
posted by Brandon Blatcher at 10:42 AM on November 7, 2023
Colour is not an objective property of things in the real world. If you expect to be able to derive something like an absolute, objective colour from a photograph as a basis for simulating how it might have looked with a different illumination spectrum, you're just setting yourself up for disappointment.
A photograph is a record of a point-in-time interaction between an illuminating electromagnetic energy field with a specific frequency distribution, and objects within the camera's field of view, and the transmission filters between those objects and the camera's pixel sensors, and the response characteristics of those sensors. What's actually in the photo depends on all four, and that's before even thinking about what the colours are going to look like to a human.
The difficulty inherent in colour correction is most easily grasped using an extreme example: say you've taken a photo using a single-frequency monochromatic blue light source to illuminate a scene, and taken another of the same scene using a single-frequency monochromatic red light source, and you're trying to colour match them.
The information in the first photo is all about how the objects in your scene, and the sensors in your camera, respond to narrow-band blue light; the information in the second is all about how they respond to narrow-band red. For some objects, those responses will be very close to completely uncorrelated, which is to say there's actually nothing there you can match. How such an object responds to the blue illumination tells the photo processing software nothing at all about how it responds to the red and vice versa, which means that if the software is trying to simulate e.g. reddening up the blue photo's illumination a little then it's really got nothing to go on and all it can do is guess how your objects might behave.
Obviously most photos are not taken under narrow-band monochromatic lighting and will therefore correlate better, but the difference in difficulty is of degree and not of kind. With some lighting sources, especially if their spectra are as peaky as is typical of cheap LED, fluorescent or discharge lamps, fixing up the colour in post is just going to be somewhere between hard and infeasible.
The easiest colour matching is going to result from using illumination whose spectrum closely resembles black-body radiation such as sunlight or incandescent lighting, because that's the basic assumption underlying the maths inside the tools. It's an assumption that needs to be made, for example, if the idea of "colour temperature" is actually going to mean something.
posted by flabdablet at 11:10 AM on November 7, 2023 [5 favorites]
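The monochromatic-light argument above can be made concrete with toy numbers. The reflectance values below are invented for illustration: two surfaces that look identical under narrow-band blue light but behave completely differently under narrow-band red light, so nothing in the blue photo predicts the red one.

```python
# Toy reflectance spectra sampled at three bands (values invented)
surface_a = {"blue": 0.8, "green": 0.2, "red": 0.9}
surface_b = {"blue": 0.8, "green": 0.6, "red": 0.1}

blue_light = {"blue": 1.0, "green": 0.0, "red": 0.0}
red_light = {"blue": 0.0, "green": 0.0, "red": 1.0}

def sensor_reading(surface, light):
    # What the camera records per band: reflectance times illumination
    return {band: surface[band] * light[band] for band in surface}

# Under blue light the two surfaces are indistinguishable...
a_blue = sensor_reading(surface_a, blue_light)
b_blue = sensor_reading(surface_b, blue_light)

# ...so no correction derived from the blue photo can recover the fact
# that under red light surface_a reads 0.9 while surface_b reads 0.1.
a_red = sensor_reading(surface_a, red_light)
b_red = sensor_reading(surface_b, red_light)
```

Broadband, black-body-like illumination softens this by correlating the bands, but as the comment says, the difference is one of degree, not of kind.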
Response by poster: If you really want "a detailed enumeration" be prepared to do a lot of research. Maybe you just want the high level list of factors, not details on each one.
I'm getting a comfortable level of detail here and I'll dive deeper on the more interesting bits. It's pretty clear there are distinct problems in physics, physiology and psychology in play as well as a jumbled mess of issues that derive from two or three of them at once.
In retrospect kind of obvious, but elsewhere someone pointed me to some of the better scientific articles written about The Dress.
posted by Tell Me No Lies at 8:08 PM on November 7, 2023
As an even more extreme version of my red/blue light example: consider replacing the red light with an infrared laser, and trying to "colour-correct" the infrared photo to match the blue one.
If any of the objects in the scene you're photographing are underwater then you're going to be completely out of luck, because water is opaque to infrared. So not only will the objects in the photos have irreconcilable colours, the two photos will potentially depict different collections of objects.
More extreme still, but along the same lines: ponder trying to colour-correct an X-ray image of your hand.
Colour is not an objective property of the things you might photograph in and of themselves. The closest it gets is serving as a biologically convenient way to categorize the ways in which those things interact with a relatively narrow spectral band of EM radiation.
posted by flabdablet at 8:32 PM on November 7, 2023
You are asking why color matching a digital photo is nearly impossible. That carries the assumption that non-digital color matching is easier, so maybe understanding why non-digital is easier might help. I do color-matched repairs of wood and other materials. The thing is, there are a number of techniques that make it work even without an exact match.
One is that you incrementally mix a color in the same light as the original, usually sunlight. This seems trivial, but when you get it right you really know that you have gotten it right. Once you put that color on the object, real-world lighting handles all the color-temperature and angle effects that you would otherwise need to fiddle with to make it look right in a digital image. I wonder if altering the color of an object in a digital photo makes it stand out because it is no longer consistent with the lighting in the rest of the photo?
posted by bdc34 at 6:05 AM on November 8, 2023
Best answer: Just floated across my socials: Fundamentals and Applications of Colour Engineering edited by Phil Green. I haven't read it but the table of contents looks promising and I know and respect the author of at least one chapter.
It's stupid expensive, so hopefully you can find it at a library, academic or otherwise. If nothing else the Table of Contents (on the website) seems like a good answer to your question.
posted by Nelson at 10:58 AM on November 9, 2023
Response by poster: Dang. Everyone's answers have been very helpful but I'm marking Nelson's as the best answer as that table of contents really is a detailed enumeration of the entire topic.
There's much to follow up on in this thread, and if I'm really serious there's a 370 page book to buy. Thank you everyone.
posted by Tell Me No Lies at 2:07 PM on November 9, 2023
Glad that was helpful! Just for posterity, here are the chapter titles.
1 Instruments and Methods for the Colour Measurements Required in Colour Engineering
2 Colorimetry and Colour Difference
3 Fundamentals of Device Characterization
4 Characterization of Input Devices
5 Color Processing for Digital Cameras
6 Display Calibration
7 Characterizing Hard Copy Printers
8 Colour Encodings
9 Colour Gamut Communication
10 The ICC Colour Management Architecture
11 iccMAX Color Management -- Philosophy, Overview, and Basics
12 Sensor Adjustment
13 Evaluating Colour Transforms
14 Appearance Beyond Colour: Gloss and Translucency Perception
15 Colour Management of Material Appearance
16 Color on the Web
17 High Dynamic Range Imaging
18 HDR and Wide Color Gamut Display Technologies and Considerations
19 Colour in AR and VR
20 Colour Engineering Toolbox and Other Open Source Tools
posted by Nelson at 3:18 PM on November 9, 2023 [2 favorites]
The two lights are most likely different intensities and different color temperatures, and they may be different types of lights in different positions. The object and the scene you are photographing are going to reflect the light differently, and the various materials may each reflect differently from each other. So getting an exact match is often impossible. You've got to make adjustments for the color temperature and brightness, which are relatively easy, but since there are many other variables, figuring out the exact color range that needs to be adjusted (without affecting the whole picture) may just be impossible.
posted by jonathanhughes at 7:26 AM on November 7, 2023 [4 favorites]
This thread is closed to new comments.