(Or "How I Learned to Stop Worrying and Love the Modern Bomb Curve"
June 12, 2008 9:48 AM

How many above-ground nuclear bomb detonations (for testing purposes or otherwise) could human life on Earth realistically hope to withstand? At what point is devastating environmental collapse just an inevitable result of increased carbon-14 levels, or is there such a point?

It's well-established (though maybe not widely known outside of specialist circles) that the above-ground nuclear weapons tests conducted by the US and USSR in the latter part of the 20th century had immediate and far-reaching environmental consequences.

According to research published by the FBI, "Atmospheric testing of thermonuclear devices between 1950 and 1963 significantly increased the level of 14C in the atmosphere and food chain. The testing events nearly doubled levels of 14C in terrestrial organisms." These tests were such dramatic events in Earth's geologic history that the math behind radiometric dating techniques literally has to be adjusted to take what's known as the "modern bomb curve" into account.

So, as a hypothetical exercise, how much more above-ground nuclear testing or other such activities could the Earth withstand before reaching a tipping point after which global carbon-14 levels would likely render the Earth uninhabitable for humans? What's the maximum number of such above-ground nuclear detonations compatible with the possibility of further life on Earth, in other words?
posted by saulgoodman to Science & Nature (19 answers total) 2 users marked this as a favorite
 
Response by poster: d'oh. left off the trailing parenthesis in the title. i'm so embarrassed.)
posted by saulgoodman at 10:03 AM on June 12, 2008


by what mechanism are you supposing carbon 14 would render the earth uninhabitable?
posted by sergeant sandwich at 10:38 AM on June 12, 2008


This obviously requires a ton of assumptions so I'll try to state all of mine.
(1) Let's say that human life would start having serious problems at 25 rem/yr (see some dose information here). Current overall background is 0.3 rem/yr, with carbon-14 contributing only 0.001 rem/yr [wiki].
(2) For carbon-14 to raise background radiation to 25 rem/yr, it would have to be approximately 25,000 times pre-testing numbers.
(3) From another wikipedia article, I'll estimate all 1950-1963 atmospheric testing at around 500 explosions give-or-take. From that same article, you can see that the yields on those explosions varied a lot, but let's say that on average, explosions now would be 10 times the size. (Maybe this is way off, I dunno).
(4) From the wikipedia article on carbon-14, you can see a graph showing atmospheric carbon-14 doubled during the 60s due to testing. So 500 explosions doubled it (or 50 modern-day explosions with my estimate of 10-fold size).
(5) Since we need to get to 25,000 times the current C-14 contribution, that's 25,000 x 50 explosions, or roughly 1,250,000 modern-day explosions in the atmosphere.

I'm guessing we'd have other problems at that stage.
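Here's a minimal Python sketch of that arithmetic, using only the round numbers assumed in (1)-(4) above (the 25 rem/yr threshold, the 500-test count, and the 10x yield multiplier are all guesses, not measured values):

```python
# Back-of-envelope sketch of the estimate above. All inputs are the
# assumptions stated in points (1)-(4), not measured values.
danger_dose = 25.0        # rem/yr at which we assume serious problems begin
c14_background = 0.001    # rem/yr currently contributed by carbon-14
historic_tests = 500      # rough count of 1950-1963 atmospheric tests
yield_multiplier = 10     # assume modern blasts average 10x the old yield

# The tests-era explosions roughly doubled atmospheric C-14, i.e. they added
# one extra 0.001 rem/yr "unit" per 500 historic (or 50 modern) explosions.
modern_tests_per_unit = historic_tests / yield_multiplier     # 50

units_needed = danger_dose / c14_background                   # 25,000
total_modern_tests = units_needed * modern_tests_per_unit     # 1,250,000

print(f"~{total_modern_tests:,.0f} modern atmospheric explosions")
```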
posted by Durin's Bane at 10:40 AM on June 12, 2008


Here is some safety data on carbon-14.

It's quite a leap to go from "screwing up a test that measures infinitesimal amounts of a radioactive substance" to "killing all humans".
posted by smackfu at 10:41 AM on June 12, 2008


Wikipedia suggests there were about 735 atmospheric atomic tests, assuming Russia did the same number as the US. Naturally occurring carbon-14 contributes about 0.01 mSv/year to humans. From your source, those 735 tests doubled the level of carbon-14 exposure to (presumably) 0.02 mSv/year. The NRC limits yearly exposure to 50 mSv for nuclear workers, which suggests that amounts above that are probably harmful. So, at 0.01 mSv/year per 735 tests, we'd need around 3.7 million atmospheric tests.

I don't think carbon-14 would be the problem at that point. I think the better question is "how many atmospheric tests before nuclear fallout in general started to cause significant deaths worldwide?"
posted by jedicus at 10:50 AM on June 12, 2008


Best answer: We'd probably have issues from non-C14 radioactive isotopes, ozone damage, and pure thermal energy well before any effects were seen from the C14 itself. FAQ on the effects of nuclear explosions.
posted by j.edwards at 10:50 AM on June 12, 2008 [1 favorite]


What I've read is that the biggest danger is increasing atmospheric dust. That depends on the rate at which they're detonated. If, during a nuclear war, a lot of bombs (on the order of a hundred) are set off all at once, the amount of smoke and dust added to the atmosphere would increase the planet's albedo enough to set off the next ice age.

The presumption is that this might not outright bring the human race to extinction, but that it would reduce the world population to maybe a few tens of thousands, all living at a stone-age level in the few remaining forests near the tropics.

If the number set off was a lot higher, on the order of half the world's arsenal, the ecological consequences would be far worse. That's the "nuclear winter" scenario, and it could be as bad as the ecological consequences of the Chicxulub meteor strike.

But the negative consequences of those disaster scenarios are not due to radiation release.
posted by Class Goat at 10:54 AM on June 12, 2008


Best answer: C-14 has a very weak decay compared to other, more dangerous radio-isotopes like K40 and I131. There's also an enormous reservoir of non-biologic carbon which C14 freely exchanges with; in contrast, K and I are concentrated by mammals. If you go to the wiki on C14, you'll see that C14 currently makes up about 1/360th (~1 mrem) of the average radiation exposure, so you could double that about eight more times before you got on the same scale as our other sources of exposure. The current federal limit on annual worker exposure is 5 rem, so you'd need 5,000-10,000 times the number of explosions to start seeing noticeable effects just from the C14.

Like I said, the other isotopes generated in testing are much more hazardous. The amount of those that you generate depends strongly on what kind of explosions you have, with underground and very high altitude blasts generating comparatively little fallout compared to surface and water-surface detonations. You also get different fallout depending on the weapons; efficient, mostly-fusion, unboosted weapons will leave relatively less junk, if I recall correctly. Note that those isotopes are much more hazardous for large, long-lived creatures like us. There is plenty of wildlife around Chernobyl. If you don't live that long and produce lots of offspring, your lifecycle has much more tolerance for radiation.
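A rough sketch of that scaling, using the same round figures quoted above (1 mrem from C-14, ~360 mrem total background, a 5 rem worker limit), none of which should be read as authoritative dose data:

```python
import math

# Rough sketch of the doubling argument above. The dose figures are the
# same round numbers quoted in the comment, not authoritative data.
c14_dose = 0.001          # rem/yr from natural carbon-14 (~1 mrem)
total_background = 0.360  # rem/yr total average exposure (C14 is ~1/360th)
worker_limit = 5.0        # rem/yr current federal occupational limit

# Doublings of the C-14 contribution before it rivals everything else:
doublings = math.log2(total_background / c14_dose)
print(f"~{doublings:.1f} doublings to match total background")           # ~8.5

# Scaling linearly (the tests-era explosions added roughly one c14_dose
# unit), reaching the worker limit takes this many times as many blasts:
print(f"~{worker_limit / c14_dose:,.0f}x the tests-era explosion count")  # ~5,000
```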
posted by a robot made out of meat at 10:58 AM on June 12, 2008


Response by poster: Like I said, the other isotopes generated in testing are much more hazardous.

Just want to clarify something real quick: Part of the question is meant to be, apart from the relatively negligible risks of radioactive contamination, are there any other foreseeable ecological consequences to increasing global C-14 levels by several orders of magnitude within a relatively short span of time? Thanks for all the great answers so far!
posted by saulgoodman at 11:12 AM on June 12, 2008


apart from the relatively negligible risks of radioactive contamination, are there any other foreseeable ecological consequences to increasing global C-14 levels by several orders of magnitude within a relatively short span of time?

A factor of two is not an order of magnitude. Aside from its radioactive nature, C-14 is chemically identical to normal carbon. There are risks associated with non-carbon isotopes being produced; is that what you want?
posted by a robot made out of meat at 11:20 AM on June 12, 2008


Best answer: are there any other foreseeable ecological consequences to increasing global C-14 levels by several orders of magnitude within a relatively short span of time?

Very likely not. In general, biological systems are unaffected by different isotope levels. The big exception, such as it is, is heavy water (that is, water composed primarily of oxygen and deuterium). In that case, the single extra neutron effectively doubles the mass of the hydrogen atoms. Even so, it takes significant amounts of heavy water to pose a threat to living organisms. Carbon-14 would have a much, much smaller effect. Whereas deuterium is twice as heavy as ordinary hydrogen, carbon-14 is only about 17% heavier than carbon-12.
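The numbers behind that comparison, using nominal mass numbers only (a quick check rather than anything rigorous):

```python
# Fractional mass differences using nominal mass numbers.
deuterium_vs_protium = (2 - 1) / 1   # deuterium is ~100% heavier than ordinary hydrogen
c14_vs_c12 = (14 - 12) / 12          # carbon-14 is ~17% heavier than carbon-12
print(f"D vs H: +{deuterium_vs_protium:.0%}, 14C vs 12C: +{c14_vs_c12:.0%}")
```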
posted by jedicus at 11:25 AM on June 12, 2008


Response by poster: A factor of two is not an order of magnitude

no, but in a hypothetical larger-scale scenario (all-out sustained nuclear war, unchecked above-ground testing for several decades using increasingly powerful weapons, etc.), there'd be a much larger increase, i'd think.

C-14 is chemically identical to normal carbon.

What are the consequences (if any) of having massive extra amounts of normal carbon in the atmosphere and how long before those consequences (if any) became untenable?

There are risks associated with non-carbon isotopes being produced, is that what you want?

actually a little more detail on those risks would be good, too.
posted by saulgoodman at 11:28 AM on June 12, 2008


It would also depend on how far above ground they were. You can set off lots of bombs at 20-30 km altitude without killing much of anything except electronics.

It would also depend on the design of the bomb. Tsar Bomba was very clean, with very low fallout for its yield. On the other hand, put a cobalt case around a basic big thermonuclear warhead instead of DU, and you can maybe wipe out all mammalian life with a few bombs from the Co-60.
posted by ROU_Xenophobe at 11:46 AM on June 12, 2008



What are the consequences (if any) of having massive extra amounts of normal carbon in the atmosphere and how long before those consequences (if any) became untenable?


C-14 occurs naturally at about 1 part in a trillion C-12 atoms. So, doubling the concentration of C-14 means you've increased the total concentration of carbon by about 1 part in a trillion.

That being said, the consequence of dumping large amounts of normal carbon into the atmosphere through some other means (nuclear weapons won't do it, but burning fossil fuels in principle could) might be the start of a runaway greenhouse effect, turning the Earth into something like Venus.
posted by dsword at 12:38 PM on June 12, 2008


There are risks associated with non-carbon isotopes being produced, is that what you want?

actually a little more detail on those risks would be good, too.


There's actually a lot about that in j.edwards' excellent link, near the bottom of the article. I can't believe I read the whole thing, but it just got more and more fascinating as I went. The risks to life seem to be especially profound for elements that are chemically similar to elements normally incorporated into a living body. Iodine-131 is bad because, like any other iodine, your body concentrates it in your thyroid gland, which allows it to cause serious radiation damage to that local area. Strontium-90 and -89 are similar enough to calcium that the body deposits them in bones, which means their radiation has serious effects on the nearby bone marrow (which in turn messes up your immune system, because the bone marrow is the source of your white blood cells).
posted by vytae at 12:44 PM on June 12, 2008


C14 decay radiation

C14 beta decays, and the electron that pops out doesn't have very high energy. You can quickly estimate how far it goes in human tissue by calculating the range:

Range/Stopping power lookup for electrons

Select your material and type in the energy of the electron in MeV (not keV, which is what the first link reports the electron energy in), then divide the range by the density and you'll get how far it travels in the material. For C14 the beta travels about 0.02 cm in human tissue, which I believe means it will end up only in your dead skin. So any C14 in the atmosphere will only affect your dead skin and you don't need to worry about it. As for how many bombs are needed for the environment to collapse, you would need to work out how much C14 you would ingest and calculate your dose from there.
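If you don't want to use the lookup tables, the empirical Katz-Penfold range formula gives roughly the same answer. Here's a quick sketch (an approximation, not the tabulated ESTAR data the link above uses, and the beta energies are just the standard C-14 values):

```python
import math

def katz_penfold_range(e_mev: float) -> float:
    """Empirical Katz-Penfold CSDA range (g/cm^2) for electrons of 0.01-2.5 MeV."""
    return 0.412 * e_mev ** (1.265 - 0.0954 * math.log(e_mev))

tissue_density = 1.0            # g/cm^3; soft tissue is roughly water
e_max, e_mean = 0.156, 0.049    # MeV; maximum and mean C-14 beta energies

for label, e in [("max-energy", e_max), ("mean-energy", e_mean)]:
    depth_cm = katz_penfold_range(e) / tissue_density
    print(f"{label} beta: ~{depth_cm:.3f} cm in tissue")

# Prints ~0.028 cm for the max-energy betas and ~0.004 cm for the mean,
# i.e. the electrons stop within the outer skin layer, as noted above.
```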

BTW ROU_Xenophobe, a Co-laced nuclear bomb will kill fewer people, because the cobalt absorbs neutrons that would otherwise go into producing fission fragments; that, coupled with Co-60's long half-life compared to that of fission fragments, makes it a terrible idea if you're trying to maximize deaths. What a Co nuclear bomb would do, however, is make the area uninhabitable for longer, due to that long half-life. It's more of a plow-salt-into-the-land approach than a kill-off-everyone method of war.
posted by metex at 1:03 PM on June 12, 2008


Best answer: If above-ground nuclear explosions make the earth uninhabitable for people, it won't be due to 14C activity. I'll assume your question is about the welfare of people who don't die when the bombs knock their houses over.

People in the region around the Chernobyl reactor meltdown had a substantially increased risk of thyroid cancer. The thyroid contains lots of iodine, and one common fission product is 131I, which has an eight day half life. Apparently a few weeks of elevated exposure was enough to trigger cancers years later. I saw an article (in Physics Today? a couple of years ago?) showing that children born in the Chernobyl region after the meltdown did not have a raised thyroid cancer rate, while children who were alive then did, which is consistent with a risk due to a brief exposure.

The other concern I remember after Chernobyl was strontium 90, which has a thirty year half life and shares a lot of chemical properties with calcium. I think the worry was that milk from cows in the fallout zone would carry lots of strontium, which would get incorporated into your bone structure and, like iodine in the thyroid, give you an internal dose for a long time. But I don't think that concern lasted very long. Maybe the cattle got their strontium dose from fallout on the ground, which washed away after a couple of growing seasons. I would have to look this one up.

The other long-lived, copiously-produced fission product that comes to mind is cesium 137, which also lives for about thirty years. Cesium doesn't have a biological function that would enhance its uptake like the other two examples. But a lot of it comes out of fissions --- I have heard it said, but haven't read it or worked it out myself, that most of the long-lived activity from the atmospheric testing era came from cesium.

A nuclear explosion releases about a kilogram of neutrons. I think most of the neutrons that stop in air make 15N; I don't know what fraction make 14C. Let's wildly guess it's 1%, so from one explosion 10 grams of neutrons make about 140 grams of 14C. If there were 1500 atmospheric tests before 1963, they made about 210 kilograms of 14C. Since this is an error-in-the-exponent calculation, call it "less than ten tons."

I guess a mole of neutrons weighs a gram, so our 1500 atmospheric tests made 15,000 moles of 14CO2, which occupies a volume of roughly 370 cubic meters at room temperature, call it ten or so storage sheds' worth. With our fudge factor, less than a few hundred storage sheds. Interesting that this is apparently on the same order as the volume of naturally occurring atmospheric 14CO2.
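A quick Python sketch of the same order-of-magnitude estimate (the kilogram of neutrons per explosion and the 1% capture fraction are the same wild guesses as above):

```python
# Redoing the order-of-magnitude estimate above with the same wild guesses.
N_TESTS = 1500                 # rough count of pre-1963 atmospheric tests
NEUTRON_KG_PER_TEST = 1.0      # guessed neutron release per explosion
FRACTION_TO_C14 = 0.01         # wild guess at neutrons that end up as 14C

# A mole of neutrons weighs about a gram, so each gram of captured
# neutrons corresponds to roughly a mole of 14C.
moles_c14_per_test = NEUTRON_KG_PER_TEST * 1000 * FRACTION_TO_C14   # ~10 mol
mass_c14_kg = N_TESTS * moles_c14_per_test * 14 / 1000              # ~210 kg

# Ideal-gas volume of that much 14CO2 at room temperature and 1 atm:
R, T, P = 0.08206, 298.0, 1.0   # L*atm/(mol*K), K, atm
total_moles = N_TESTS * moles_c14_per_test
volume_m3 = total_moles * R * T / P / 1000

print(f"~{mass_c14_kg:.0f} kg of 14C as ~{volume_m3:.0f} m^3 of 14CO2")
```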
posted by fantabulous timewaster at 1:08 PM on June 12, 2008 [1 favorite]


I'd just like to say: good work HTML-encoding that!
posted by Class Goat at 2:53 PM on June 12, 2008


Response by poster: I'd just like to say: good work HTML-encoding that!
Agreed. Marked as a best answer on that basis alone.


Thanks all! Some great information to dig into here. I'd always wondered whether there were any practical consequences to all that extra carbon-14 being introduced into the environment by nuke testing (apart from the obvious consequences for carbon-dating methods, which I only learned about incidentally while researching for a writing project many years ago). Seems the consensus answer is: Meh.
posted by saulgoodman at 5:56 PM on June 12, 2008

