Are satellites built from old metals to avoid post-A-bomb radiation levels?
August 4, 2009 9:49 AM   Subscribe

Are satellites (or satellite parts) ever made from metal salvaged from pre-1945 shipwrecks? Possibly to ensure that those metals were refined before atmospheric A-bomb explosions increased the background radiation levels?

A friend of mine recently visited the ESA site (Europe's answer to NASA) in the Netherlands. While there, he was told by one of the guides that some satellite components are made from metals salvaged from old (WW2 or earlier) shipwrecks. The reasoning ran:

i) The planet's atmospheric background radiation is measurably higher now than it was before the A-bomb detonations.
ii) Therefore, some materials manufactured before the war (presumably referring to steel here) have a lower background radiation than the equivalents manufactured today.
iii) For some scientific applications, having slightly less radioactive materials is worth the extra expense of hauling old shipwrecks up from the seabed and re-processing the materials. Examples given were very sensitive radiation sensors and building radiation shields around delicate equipment.

I've heard this before from another source, although I can't remember what that source was. I can just about imagine a mechanism (an increased proportion of Carbon-14 in the carbon used to make steel from iron?), and Europe certainly has enough WW2-era shipwrecks available for this to be logistically possible.
However, a physicist friend of ours reckons that this is probably nonsense. She argues that radiation isn't contagious; there's no way a slightly increased atmospheric radiation level could affect the radioactivity of a newly refined slab of steel.
So my questions are:

1) Have you heard that satellites or their components are sometimes made with metal from shipwrecks? What's your source for this?
2) If satellite parts are made from shipwrecks, could it be due to different radiation levels? If so, why is old metal less radioactive?
Bonus question:
3) Is my hypothesis about increased levels of Carbon-14 in modern steel completely insane? If not, could an archaeologist in the future apply Carbon-14 dating techniques to chunks of steel in the same way that we do to our archaeological finds?
posted by metaBugs to Science & Nature (17 answers total) 7 users marked this as a favorite
I doubt it. The difference in quality between newly manufactured metal and metal of questionable provenance is night and day. Satellites need to be built to very strict specifications, and using recycled materials seems inherently risky.
posted by JJ86 at 10:06 AM on August 4, 2009

Best answer: It is my understanding that carbon-14 dating is only used in dating organic material. It relies on the fact that organisms fix carbon into their structures. Know the approximate atmospheric carbon-14 concentration and the rate at which carbon-14 decays into other isotopes and you have an approximate date.

The carbon in steel is probably left over from the carbon in the ores before they were refined, and as such is likely much older than any organic carbon used in carbon dating. So old that its carbon-14 levels should have declined to negligible levels, leaving it undatable using radiocarbon methods.
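For the curious, the decay arithmetic behind this is simple exponential decay. A rough back-of-the-envelope sketch in Python (the 5,730-year figure is the standard C-14 half-life; the sample fractions are illustrative):

```python
import math

C14_HALF_LIFE = 5730.0  # years, standard half-life of carbon-14

def radiocarbon_age(remaining_fraction):
    """Estimate age in years from the fraction of the original C-14
    remaining, inverting N(t) = N0 * (1/2)^(t / half-life)."""
    return -C14_HALF_LIFE * math.log(remaining_fraction) / math.log(2)

# A sample with half its original C-14 left is one half-life old:
print(round(radiocarbon_age(0.5)))   # 5730

# Fossil carbon is many thousands of half-lives old, so essentially
# no C-14 survives -- which is why steel made with coal-derived carbon
# can't be radiocarbon dated:
print(0.5 ** 50)  # fraction left after just 50 half-lives, ~1e-15
```

The practical cutoff for radiocarbon dating is around 50,000 years (under ten half-lives); carbon in coal or iron ore is millions of years old, far past that.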

Also radiocarbon dating is thrown off by the Industrial age because we've been releasing carbon into the atmosphere from fossil fuels that don't contain carbon-14. So future archaeologists will have to find a different way to calibrate their dating systems.

I know this doesn't directly address your main question, but it's all I know about it.
posted by reegmo at 10:08 AM on August 4, 2009

Best answer: The Wikipedia article on the scuttling of the German High Seas Fleet at Scapa Flow after World War I says that "Minor salvage is still carried out to recover small pieces of steel that can be used in radiation sensitive devices, such as Geiger counters, as the ships sank before nuclear weapons and tests irradiated the world's supply of steel", citing Daniel Alan Butler's book Distant Victory: The Battle of Jutland and the Allied Triumph in the First World War as its source. (Take with a grain of salt, as with all WP info.)
posted by Zonker at 10:11 AM on August 4, 2009

Best answer: I work in human spaceflight, not satellite construction, but I can tell you that pretty much everything we build is made from aluminum, not steel (the International Space Station spaceframes are all aluminum, for instance). This is to save on weight, because it's expensive to launch stuff into orbit. I would imagine most satellite bus frames operate on the same principle. Ships weren't built from aluminum in WWII, as far as I know.

Also, "radioactive" isn't a catch-all term. I can't speak to your question #3, but when NASA was launching the Cassini probe, for instance, there were protests over the nuclear material onboard, despite the fact that the particles it emitted weren't dangerous to humans. Not all radioactivity is created equal.
posted by zap rowsdower at 10:12 AM on August 4, 2009

Best answer: 1. I haven't heard about their use in satellites, but the high purity germanium radiation detector I was using 5 minutes ago is surrounded by a shield made with WW1 steel. This steel is often used in radiation detection applications where low background levels are important, and also in medical radiotherapy devices for the same reason.

Conceivably it could be called for in some satellite applications, though I can't really think of one. Usually there isn't tons of shielding on a satellite because lifting large amounts of steel into orbit is hard. Also the radiation environment in orbit is much higher than on earth without the protection of the atmosphere and our magnetic field. So: maybe, but kind of doubt it.

2. Radioactive fallout gets everywhere. Some of the places it gets to are steel foundries and the ore that is going into them. So old steel does have measurably less radiation than new steel in basically all cases.

3. Not insane, though I don't know about C-14 in particular. The relative amounts of fallout in steel have probably been declining since the atmospheric testing ban. Dating might be difficult because you'd need baselines, and you'd need to know how much C-14 was in steel made at the peak of atmospheric testing, which might vary by latitude.
posted by pseudonick at 10:13 AM on August 4, 2009 [5 favorites]

I've heard about it for underground neutrino observatories (see this metafilter post) amongst other applications. I'd be mildly surprised if it was used for satellites, since you'd think you'd get a bunch of radiation in space anyway.

This Wikipedia article has a reference to it. Adding 'scapa' to your googling may help you track things down.
posted by edd at 10:14 AM on August 4, 2009

I have heard that a certain type of lead salvaged from old shipwrecks is highly sought after by the electronics industry.
posted by Gungho at 10:17 AM on August 4, 2009


Have a look at these:

It seems certain instruments require pre-45 metal.
posted by Mr. Yuck at 10:20 AM on August 4, 2009

If you gave me hunks of steel made in 1920, 1965, and 2009, I could rank them in order of age using the germanium detector I mentioned above, just by the relative levels of fallout material.

Actually dating a sample of unknown age would be very tricky though.
posted by pseudonick at 10:20 AM on August 4, 2009

Best answer: I don't know about satellites, but this is commonly done for low-level radiation measurement equipment, as most current steel contains trace levels of fallout nuclides, principally Cs-137 and Co-60. The amount of radioactivity is too low to be of any practical concern, but, modern detectors being pretty sensitive, it will interfere with measurement.

I have also heard of lead from Roman piping being used for the same reason, but I don't know if it's true.

On preview, what pseudonick said.
posted by Dr Dracator at 10:30 AM on August 4, 2009

I would imagine the main reason is the sea-water itself-- it's the best and cheapest radiation shield we could hope for.

Since those ships sank, they've been protected from cosmic rays, some natural sources, and most man-made radiation by the massive weight of water above them. So their internal radioactivity is essentially the same as the day they went down.

So it's both a manufacturing issue (they were made by less tainted stock) and also the preservation of them from that point on which makes it such an ideal material for these scientists and their super-sensitive tools.
posted by Static Vagabond at 10:37 AM on August 4, 2009

Best answer: Here are some abstracts, though no satellites are mentioned:

Use of low-background germanium detectors to preselect high-radiopurity materials intended for constructing advanced ultralow-level detectors

A Historically Significant Shield for In Vivo Measurements

The “Discovery” of alpha activity in lead and solder

The last one concerns another angle, that of Pb-210. Lead 210 is a natural radioactive isotope, with a half life of 22.3 years. Old lead will have less Pb-210 and is therefore desirable for low-background applications, though it doesn't have anything to do with the start of the atomic age.
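The 22.3-year half-life makes the appeal of old lead easy to quantify. A quick sketch (the half-life is from the abstract above; the sample ages are illustrative, and note that trace uranium/radium impurities can slowly regenerate Pb-210 in real lead):

```python
def fraction_remaining(age_years, half_life_years=22.3):
    """Fraction of a radioisotope left after age_years:
    N/N0 = (1/2)^(t / half-life)."""
    return 0.5 ** (age_years / half_life_years)

# Lead from a WW1-era wreck, roughly 90 years old as of 2009:
print(fraction_remaining(90))     # ~0.06, i.e. over 90% of the Pb-210 gone

# Roman-era lead, roughly 2000 years old: the Pb-210 from the original
# smelting is effectively zero
print(fraction_remaining(2000))
```

So even a century buys you a factor of ~16 reduction in Pb-210 activity, and ancient lead is better still, which is why it turns up in low-background experiments.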
posted by Dr Dracator at 10:45 AM on August 4, 2009

Best answer: For an additional data point, here is another article mentioning the use of pre-WWII steel for ground based high-sensitivity radiation measurements, and this article from the same facility specifically mentions battleship armor. It sounds like these facilities have been used to examine material brought back from space; I have been unable to find anything about the use of this steel in satellites.
posted by TedW at 11:11 AM on August 4, 2009

However, a physicist friend of ours reckons that this is probably nonsense. She argues that radiation isn't contagious; there's no way a slightly increased atmospheric radiation level could affect the radioactivity of a newly refined slab of steel.

Increased atmospheric radiation is a symptom of radioactive particles floating in the atmosphere from nuclear tests. It is not the atmospheric radiation that contaminates the steel (radiation is not contagious unless it's neutron radiation); it is the radioactive particles in the air that contaminate the steel.

So you could argue that you're both right: the increased atmospheric radiation itself is not the problem (radioactive pollutants settling out of the atmosphere are), but pre-atomic-age materials can have measurably different properties from modern materials, and are sometimes sought out for this reason.
posted by -harlequin- at 2:14 PM on August 4, 2009

Some satellites are nuclear powered, so using low-rad metal seems like too little, too late. (Many aren't nuclear powered, though, so that doesn't rule them out.)
posted by DU at 5:52 PM on August 4, 2009

Trying to remember a talk from an experimentalist a few months ago, so some of the details may be slightly off, but my understanding is that ancient lead is often used as a shield in dark matter detection experiments. See CDMS, for example. (These are Earth based experiments I'm discussing, I haven't heard about it being used for satellites... I find that idea a little puzzling.) The issue, if I remember correctly, has more to do with when the metal was originally formed from ore, at which point impurities are introduced. In ancient metals, the radioactive impurities have had plenty of time to decay. As far as I know, it had little to do with the atomic testing of the 50's.

The talk I'm remembering was by Richard Schnee from Syracuse. You might try emailing him. I'm sorry I can't be more helpful at the moment. I'm about to head to the airport.
posted by dsword at 12:09 PM on August 7, 2009

Best answer: Some of this has been mentioned in pieces above.

Now that I'm back in the country, I can clarify a bit. The issue, AFAIK, has nothing to do with fallout. The issue is that when you first extract a metal from its ore, you get all of that element's isotopes, and some of these are unstable. These isotopes are the decay products of heavier unstable elements present in the ore, and so are continually replaced as they decay.

After the metal is formed, you've removed all the heavier elements chemically. Depending on the half-life and decay chains of the isotopes that remain, you can just let it sit for a while (say 100 years), and it will become less radioactive--the unstable isotopes are no longer being replaced by decays of heavier particles.
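The "let it sit for a while" reasoning above is just exponential decay once the parent is chemically removed. A rough sketch of the idea (the 22-year half-life is an illustrative, Pb-210-like number, not a figure from the thread):

```python
import math

def daughter_activity(t_years, daughter_half_life, initial_activity=1.0):
    """Relative activity of an unstable daughter isotope after chemical
    separation from its long-lived parent. With the parent removed,
    nothing replenishes the daughter, so its activity simply decays
    exponentially with its own half-life."""
    decay_constant = math.log(2) / daughter_half_life
    return initial_activity * math.exp(-decay_constant * t_years)

# Hypothetical daughter with a 22-year half-life:
for t in (0, 22, 100):
    print(t, daughter_activity(t, 22.0))
# After ~100 years (between 4 and 5 half-lives), only a few percent of
# the original activity remains -- matching the "say 100 years" rule of
# thumb above.
```

While the metal is still in the ore, by contrast, the daughter is replaced as fast as it decays (secular equilibrium), so the ore never "cools down" on its own.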

Like I said, I know this is important for dark matter searches, where background can kill your experiment.
posted by dsword at 7:44 AM on August 8, 2009

This thread is closed to new comments.