'How much radiation'
July 23, 2012 9:34 AM

Help me reassure someone about the safety of near infrared light.

I need to reassure someone about the safety of a small array of class 1 near infrared LEDs in a display they viewed for approximately 5 minutes. This is not about whether or not this is a reasonable concern, it's a question about communicating the safety of this device in a respectful and comprehensive manner.

I am not an optics whiz. My googling about infrared safety leads me either to information about high-powered lasers, which are not relevant to this device, or to dubious claims about cellphone radiation.

The LEDs in my device are comparable in strength to a TV remote's. I would love some calculations/comparisons to share, such as '30 minutes of exposure to your remote is like 1 minute of being outside on a sunny day.' Comparisons to other common household devices would be useful as well.

Links to published studies are especially appreciated since the person has asked for this specifically.
posted by heyforfour to Science & Nature (9 answers total)
 
Check out the second paragraph of the Wikipedia article on infrared. On a sunny day, you are exposed to more infrared radiation than visible light: about 527 watts per square meter. Compare this to the power your LEDs require; the amount of infrared they emit will be somewhat less than that. I feel certain the sunny day will win by a considerable margin.
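If you want to put numbers on it, here's a rough sketch; the LED output power and viewing distance are just placeholder assumptions to swap for your real values:

```python
# Rough radiometric comparison: solar IR vs. a small NIR LED array.
# The LED power and viewing distance are assumptions, not measured values.
import math

SOLAR_IR = 527.0             # W/m^2, IR portion of bright sunlight (Wikipedia)

led_power_w = 0.010          # W, assumed total NIR output of the array (~10 mW)
distance_m = 0.5             # m, assumed viewing distance

# Crude model: a point source radiating evenly into a hemisphere.
led_irradiance = led_power_w / (2 * math.pi * distance_m ** 2)

print(f"LED irradiance at the viewer: {led_irradiance:.4f} W/m^2")
print(f"Sunlight's IR is about {SOLAR_IR / led_irradiance:,.0f}x stronger")
```

With those placeholder numbers the sun wins by a factor of around 80,000, and the real margin is probably larger, since class 1 devices are power-limited.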
posted by ubiquity at 9:45 AM on July 23, 2012


I think the appropriate thing to note here is that the Sun is a massively more powerful emitter of near-infrared light than the device you're referring to. This useful picture shows the spectrum of the sun, and you will notice there is definitely not a sharp drop-off at the edges of the visible band. Intuitively, this makes sense, as the sun is quite hot.

Per Wikipedia as well, the sun provides about 100,000 lumens per square meter. For comparison, a 60W incandescent light bulb is about 800 lumens. I am having a hard time finding figures for the infrared LEDs used in remotes, but based on how long remotes last on a single battery, it will be a very small number (i.e., much smaller than 100 lumens). In other words, going outside will expose this person to more infrared energy by several orders of magnitude.
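One wrinkle: lux and lumens aren't directly comparable, so here's a quick conversion sketch; the 100-lumen LED figure is a deliberately generous assumption, and strictly speaking lumens weight light by human visual response, so treat this as a loose bound:

```python
# Convert the LED's lumens to lux (lm/m^2) at the viewer, so the number
# can sit next to the sun's ~100,000 lux figure.
import math

SUN_LUX = 100_000        # lux, bright sunlight (Wikipedia)

led_lumens = 100         # lm, deliberately generous upper bound (assumption)
distance_m = 0.5         # m, assumed viewing distance

led_lux = led_lumens / (2 * math.pi * distance_m ** 2)  # hemisphere spread

print(f"LED: ~{led_lux:.0f} lux at {distance_m} m; sun: {SUN_LUX} lux")
print(f"Ratio: ~{SUN_LUX / led_lux:.0f}x")
```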

Out of curiosity, what is the device?
posted by saeculorum at 9:47 AM on July 23, 2012


Here's a possible concern: "I looked directly at the LEDs, and focused my eyes on them. Wouldn't that focus the IR on my retina?"

Answer: no. Different frequencies refract differently; that's why a prism casts a spectrum. If the lens in your eye is focused for visible light, then the infrared will be out of focus and spread widely over the retina, drastically reducing the effective intensity.
posted by Chocolate Pickle at 10:25 AM on July 23, 2012 [1 favorite]


Here is an idea. Find something with a red LED; if you can find one close to the same wattage, so much the better. Ask your friend to look at it. Then tell him, "OK, any damage that would be done has already happened." Or say, "This is just like what you were looking at, except the IR light is LOWER energy."
posted by d4nj450n at 10:52 AM on July 23, 2012


A bazillion years ago when I worked at Ye Olde Electronics Hut we had lots of remotes for the demo models of TVs and we also had a demo model of a security camera system set up. You could point a remote at the security camera and press a button and see the LED flashing in the display monitor.

So, maybe you could use a similar camera that renders near-infrared light as part of the visible image and demonstrate the relative brightness of your device in comparison to a TV remote or some other common source of near-infrared, to show that it's not an unusually high exposure.
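If you have Python and OpenCV around, a rough sketch like this could even put numbers on the comparison; the camera index, frame count, and 0-255 brightness scale are assumptions about a typical setup:

```python
# Sketch for comparing apparent IR brightness with an IR-sensitive camera
# (IR-cut filter removed). Camera index and frame count are assumptions.
import cv2
import numpy as np

def mean_brightness(frames=30, camera_index=0):
    """Average grayscale brightness over a short capture (0-255 scale)."""
    cap = cv2.VideoCapture(camera_index)
    values = []
    for _ in range(frames):
        ok, frame = cap.read()
        if not ok:
            break
        values.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean())
    cap.release()
    return float(np.mean(values)) if values else float("nan")

input("Point the camera at the LED display, then press Enter...")
display_level = mean_brightness()
input("Now hold down a button on a TV remote in view, then press Enter...")
remote_level = mean_brightness()
print(f"Display: {display_level:.1f}   Remote: {remote_level:.1f}")
```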
posted by XMLicious at 11:05 AM on July 23, 2012


Here's the LED used in the TV-B-Gone remote:

Super-bright 5mm IR LED

From the specs, I think it consumes 1.6 watts when pulsed. High-efficiency LEDs (which these probably aren't) apparently produce under 50 lumens/watt, but we'll go with 50 for easier math. So, that LED will do about 80 lumens. Compare this with saeculorum's answer about the Sun's output, or a 60W bulb's output. Note that this is the maximum pulsed figure, not the much lower continuously lit mode, and it assumes a very, very efficient LED. If it were turned on continuously, and with more realistic efficiencies, we're probably talking about 5 lumens.
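Spelled out as a quick calculation (the continuous-mode power and efficiency are my own guesses, picked to land near that ballpark):

```python
# The arithmetic above, spelled out. The continuous-mode power and the
# "realistic" efficiency are guesses chosen to land near ~5 lumens.
pulse_power_w = 1.6            # W, max pulsed consumption from the datasheet
optimistic_lm_per_w = 50       # lm/W, very generous for an IR LED
print(pulse_power_w * optimistic_lm_per_w)      # 80.0 lm peak, vs ~800 lm for a 60W bulb

continuous_power_w = 0.3       # W, assumed steady-state draw
realistic_lm_per_w = 16        # lm/W, assumed
print(continuous_power_w * realistic_lm_per_w)  # ~4.8 lm, i.e. "about 5 lumens"
```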
posted by chengjih at 11:08 AM on July 23, 2012


Chocolate Pickle: Near IR can be closer in wavelength to red than red is to blue, so that isn't a very compelling defense. Plus, if we're talking about an array of IR LEDs, the fact that it's blurry doesn't matter - a blurry rectangle will still be fully IR-colored in the middle.

But yeah, near IR is all around us. I've got a camera converted to see it (self-link) - exposure times are roughly comparable to visible light.

Near infrared basically behaves like normal light. It comes out of light bulbs and the sun. It is blocked by clothing and other stuff. People are a bit more transparent to it - when you shine a flashlight through your hand, you can see that some red light makes it through. IR is like that, but more so.

Here's a good hand-wavey sciency explanation:

1) It's less IR than you get going outside.
2) The health risks depend directly on the wavelength: stuff off the red end is safe, and stuff off the blue end is dangerous. When they don't just pass right through stuff, gamma rays, X-rays, and UV can all break atomic bonds, which can damage cells, fuck with DNA, etc. When visible light, IR, microwaves (WiFi, Bluetooth, and cordless phones all use microwaves), and radio waves interact with matter at all, they just heat stuff up, so as long as you don't cook you'll be fine.
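To put numbers on point 2, here's a quick photon-energy calculation; the 940 nm wavelength is a typical remote-control LED figure, not something from the question:

```python
# Photon energy E = h*c/lambda: a single near-IR photon carries far less
# energy than the ~3-5 eV needed to break typical chemical bonds.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

for name, nm in [("near-IR (remote, 940 nm)", 940),
                 ("red (650 nm)", 650),
                 ("violet (400 nm)", 400),
                 ("UV-C (250 nm)", 250)]:
    ev = H * C / (nm * 1e-9) / EV
    print(f"{name:26} {ev:.2f} eV per photon")
# Near-IR comes out around 1.3 eV: enough to warm things, not to break bonds.
```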

posted by aubilenon at 2:51 PM on July 23, 2012


IR light isn't ionizing, and ionizing radiation is what you have to worry about. Now, I don't know about eyes, but it sounds like they did the equivalent of staring at a candle, which is unlikely to harm anything.

If you have a webcam, its sensor picks up IR light already; you just need to remove the IR-blocking filter. You should be able to adapt this guide to your purposes pretty easily. Here is a direct link to the PDF from the National Institute of Standards and Technology.
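Here's a minimal viewer sketch, assuming opencv-python is installed, the camera is at index 0, and the filter has been dealt with per the guide:

```python
# Minimal live view to check what your camera sees in IR (assumes the
# IR-blocking filter has been removed and opencv-python is installed).
import cv2

cap = cv2.VideoCapture(0)      # 0 = default camera; change if needed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("IR check - point a remote at the camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```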

Then compare the IR emitter to a candle or something similar.
posted by Canageek at 7:40 PM on July 23, 2012


Response by poster: Thanks so much, everyone! Ironically, all these answers will be much more useful for future explanations than for the person in question, and for that these answers/comparisons are great.

The person in question found the actual safety documentation for the specific product much more helpful/comforting than infrared/visible comparisons - go figure!
posted by heyforfour at 4:01 PM on July 30, 2012

