Example of famous organizational decision based on faulty data?
May 21, 2013 5:46 PM Subscribe
I'm looking for examples of high impact organizational decisions based on faulty data that had long term negative impacts. (My use of the word "organizational" is meant to imply a "group of persons organized towards some end." Could be a corporation, or a not-for-profit, or a govt entity, etc.)
I have to give a public testimony tomorrow re: the quality of data being used to defend a pretty massive and impactful decision that will be close to impossible to reverse.
This has been a very long project and I'm mentally exhausted. When I was pulling together some of my thoughts for what to say, someone else asked me for examples of other large or notable organizational decisions based on either faulty data or a faulty decision-making process.
I'm blank. Mentally exhausted and blank.
I've looked for examples of "garbage in, garbage out", "Sunk Cost Fallacy," "Willful Ignorance", "Tragedy of the Commons", "Problem of Many Hands" type decisions in any industry or context...I'm not interested in just examples from the tech world though. I'm looking for an example of a decision where, after the fact, some independent entity acknowledged, "Yeah, we should have considered that." Or a plan that was put forth was pushed through with disastrous results, even when data showed that it should have been halted or changed.
I'm hoping it will help me to work out how to present my testimony tomorrow.
New Coke?
posted by Daily Alice at 5:53 PM on May 21, 2013
There's this. A choice inflammatory quote: "All I can hope is that future historians note that one of the core empirical points providing the intellectual foundation for the global move to austerity in the early 2010s was based on someone accidentally not updating a row formula in Excel."
posted by cairdeas at 5:54 PM on May 21, 2013 [1 favorite]
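(A minimal sketch of the kind of error described in that quote, using made-up numbers rather than the actual Reinhart-Rogoff spreadsheet: a summary formula whose range was never extended to cover newly added rows quietly reports a different answer.)

    # Toy illustration: an average whose range was frozen before new rows were
    # added, the way a stale Excel formula like =AVERAGE(B2:B6) would behave.
    growth_by_country = {
        "A": 2.1, "B": 1.8, "C": 2.4, "D": 1.9, "E": 2.2,
        "F": -0.3, "G": 0.5,  # rows added later, never picked up by the formula
    }
    values = list(growth_by_country.values())
    full_average = sum(values) / len(values)            # what should be reported
    stale_average = sum(values[:5]) / len(values[:5])   # range frozen at the first 5 rows
    print(f"average over all rows:    {full_average:.2f}")
    print(f"average over stale range: {stale_average:.2f}")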
Jinx, headnsouth
posted by cairdeas at 5:54 PM on May 21, 2013 [1 favorite]
The German defense of France in WWII, misled by the Allies' Operation Bodyguard deception
Bomber gap
Missile gap
posted by djb at 5:56 PM on May 21, 2013
George Bush says that the intelligence failure leading to the war in Iraq is the biggest regret of his presidency.
posted by jacalata at 6:01 PM on May 21, 2013
"Mitt Romney says he is a numbers guy, but in the end he got the numbers wrong. His campaign was adamant that public polls in the swing states were mistaken."
posted by qxntpqbbbqxl at 6:06 PM on May 21, 2013 [1 favorite]
The famous "Dewey Defeats Truman" headline printed during the 1948 US presidential election, when the polls behind the prediction had a badly flawed methodology.
posted by GenjiandProust at 6:08 PM on May 21, 2013 [1 favorite]
Consumer groups pushed for regulations requiring fire-resistant children's pajamas. The fire retardant used, Tris, turned out to cause cancer in children.
posted by theora55 at 6:29 PM on May 21, 2013 [1 favorite]
Best answer: For years women entering menopause were almost automatically given estrogen replacement on the basis that observational studies showed that women taking estrogen had fewer cardiovascular events (strokes and heart attacks) than women who did not. But observational studies (which just looked at outcomes in women who happened to be taking estrogen vs. women who happened not to be) couldn't account for differences that might affect both--i.e., maybe the kind of women who would choose to go on estrogen therapy just tend to have healthier lifestyles. So they did some large randomized controlled trials--and the HERS study, which came out in 1998, and the WHI study, which was published in 2002, showed more cardiovascular disease in women taking hormone replacement therapy.
Those two studies basically brought hormone replacement therapy to a screeching halt and totally reversed the standard of care for postmenopausal women. Wyeth had sponsored the HERS trial in the hopes of developing a giant market for estrogen and instead presided over its demolition.
(Now, of course, more data is coming in suggesting that for younger women without pre-existing heart disease estrogen might actually be beneficial after all, so the pendulum is swinging again, but it was a giant bomb at the time.)
posted by The Elusive Architeuthis at 6:30 PM on May 21, 2013 [2 favorites]
Allied support for the Dutch resistance continued, unaware of Das Englandspiel, in which the entire resistance network had been infiltrated and turned.
posted by scruss at 6:36 PM on May 21, 2013
Best answer: The decision to launch the Challenger shuttle in 1986. As Tufte describes in Visual Explanations, Thiokol engineers knew there was a problem with the o-rings, but they didn't present it in a convincing way.
posted by Killick at 6:38 PM on May 21, 2013 [6 favorites]
Thalidomide, a drug that was supposed to help with morning sickness in pregnant women but caused many forms of birth defects. The drug was not tested thoroughly because, apparently, scientists believed that drugs taken by pregnant women could not pass through the placental barrier. Some countries, like the U.S., refused to approve it; others, like Canada, left it on the market despite warnings from doctors.
posted by Bokmakierie at 6:49 PM on May 21, 2013
Best answer: Inspired by Killick: a failure by NASA to appreciate the danger to the shuttle from foam strikes. There was a sense within the organization that because it hadn't caused a fatal problem before, it wouldn't do so in the future.
"The CAIB report found that NASA had accepted deviations from design criteria as normal when they happened on several flights and did not lead to mission-compromising consequences."
posted by 1367 at 7:26 PM on May 21, 2013
"The CAIB report found that NASA had accepted deviations from design criteria as normal when they happened on several flights and did not lead to mission-compromising consequences."
posted by 1367 at 7:26 PM on May 21, 2013
Best answer: A Mars rover mission crashed because one part was designed in inches and another was in metric.
posted by KRS at 7:28 PM on May 21, 2013
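(A minimal sketch of how a unit mismatch like that plays out, with made-up numbers; the real incident was the Mars Climate Orbiter, where thruster impulse data in pound-force seconds was read by another system as newton-seconds.)

    # Toy illustration of an imperial/metric interface mismatch.
    LBF_TO_NEWTON = 4.44822                         # 1 pound-force in newtons
    impulse_reported_lbf_s = 250.0                  # one team reports in lbf*s
    impulse_assumed_n_s = impulse_reported_lbf_s    # the other reads it as N*s (the bug)
    impulse_actual_n_s = impulse_reported_lbf_s * LBF_TO_NEWTON
    error_factor = impulse_actual_n_s / impulse_assumed_n_s
    print(f"each burn is underestimated by a factor of {error_factor:.2f}")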
I first thought of Tufte's illustration of the Challenger data, like Killick. In looking for a good link, I came across this paper that found problems with Tufte's analysis. Maybe it wasn't as obvious an error as he made it look. Things are rarely so simple as the stories we tell about them.
As an example of "...a plan that was put forth was pushed through with disastrous results, even when data showed that it should have been halted or changed" - The Swedish ship Vasa sank not because of faulty or incomplete data, but because the data was ignored. At least, that's the way the story is told.
posted by Snerd at 7:30 PM on May 21, 2013 [2 favorites]
Team B. During the Ford Administration, which overlapped with Bush 41's tenure as head of the CIA, the agency commissioned an alternative analysis of the threat capabilities of the USSR. So they assembled a team, including Richard Pipes and Paul Wolfowitz, and this team came back with an overstated and inaccurate view of the USSR as a threat. Team B's findings wound up becoming the basis for how the Carter and Reagan Administrations dealt with the USSR, which in turn led to Carter spending more money on military toys and less on human intelligence, and of course it also led to Reagan spending money like it was nothing.
posted by Sticherbeast at 7:33 PM on May 21, 2013
The Titanic.
Possibly the Hindenburg, which may have gotten inadequate info regarding weather, thus burning catastrophically.
posted by Michele in California at 7:49 PM on May 21, 2013
New Jersey Transit not being well prepared for Sandy, especially in contrast to the MTA.
posted by Sophont at 7:51 PM on May 21, 2013
Perhaps the decisions made before the Battle of the Little Bighorn, aka "Custer's Last Stand," meet your criteria.
At least two different scouting reports warned that the Native American gathering in June, 1876, at the mouth of the Little Bighorn River, was huge, larger than any ever seen before. Instead of recognizing the threat of an enemy force that greatly outnumbered his own, General Custer interpreted the reports as an opportunity -- he would strike the gathering quickly, in broad daylight, and devastate the Native American numbers before they could disperse in fear at word of his approach. He even refused an offer of Gatling guns, suggesting that transporting artillery might slow his advance too much.
We all know how that plan worked out.
posted by peakcomm at 8:42 PM on May 21, 2013
Best answer: I'm currently reading Nate Silver's The Signal and the Noise, which has some examples that answer your question. I'm only part way through, but here are some examples that you may find helpful:
-2008 financial crisis
*"too big to fail"
*security ratings
*housing bubble
-weather/natural disasters forecasting... you want people to be prepared, but if you "cry wolf" every time then people won't listen
*Hurricane Katrina
*Japan's 2011 earthquake/ tsunami/ nuclear reactor
-annual flu predictions
*should the government spend money on flu vaccines (for the 2009 swine flu)?
-climate change
*concerns about global cooling in the 1970s
posted by oceano at 8:43 PM on May 21, 2013 [1 favorite]
Best answer: For your purposes I would stay away from anything political since "testimony," to me, means trying to persuade a diverse audience.
Exposure of scientists and workers to radiation in the 1930s-1950s seems like a poor decision, but it was based more on a lack of data than on poor data.
The U.S. bombing of the Chinese embassy in Belgrade is a good example of a simple factual error (the CIA identified the wrong coordinates on the map) that was fairly apolitical, but military decisions don't always make the best analogies.
The Wikipedia article on the sunk cost fallacy has an interesting example: the joint British-French development of the Concorde, which many people felt was a bad idea as the data on the economic reality of operating the plane changed and was refined, but development went on anyway because the powers-that-be felt it was too late to abandon the project.
posted by midmarch snowman at 8:47 PM on May 21, 2013
All the bonds that were purchased by various entities relying on misguided financial data during the 2007/2008 meltdown, which wiped out many personal retirement plans.
posted by bkeene12 at 8:51 PM on May 21, 2013
Robert McNamara on the Vietnam War: 'In 1995, he took a stand against his own conduct of the war, confessing in a memoir that it was “wrong, terribly wrong.”'
McNamara was SECDEF from 1961 to 1968 and introduced "systems analysis" to the DoD.
posted by the man of twists and turns at 9:23 PM on May 21, 2013
Best answer: In The Wisdom of Crowds, James Surowiecki analyzes the Bay of Pigs invasion as an example of groupthink--Kennedy and his advisors all shared similar information and a similar outlook, so they only confirmed each other's incorrect assumptions. You could search for "groupthink" to find more examples of similar events.
This isn't exactly what you're looking for, but Karl Weick has written about the Mann Gulch fire in 1949, in which firefighters didn't follow protocol, and many were killed. Their decisions seem paradoxical in retrospect, and Weick has analyzed how their "sensemaking" ability in the midst of confusion failed.
If you take a look at Surowiecki's book, he discusses many of the ways groups or organizations can fail to make good decisions, so he may have other leads in there for you. I read it 5+ years ago, so I don't remember much detail.
posted by pompelmo at 9:56 PM on May 21, 2013
The Hyatt Regency walkway collapse killed over 100 people and was caused by poor initial design and a last-minute design change that was not properly investigated by the engineers before the hotel was built.
posted by TrialByMedia at 11:55 PM on May 21, 2013
The Titanic is a classic example of data driven organisational incompetence. Plenty of academic sources online too.
posted by BenPens at 5:02 AM on May 22, 2013
A Mars rover mission crashed because one part was designed in inches and another was in metric.
Didn't it just miss its target? That's an even better metaphor than crashing.
posted by cjorgensen at 6:41 AM on May 22, 2013
Not quite a data issue, but the Maginot Line in France was a strategic disaster. It was an excellent defense against a direct attack using WWI-era technology. However, not only was the Belgian border not protected by the Line at all, but even where the Line did protect the border, it was easily foiled by the Germans' WWII-era technology.
posted by Sticherbeast at 6:49 AM on May 22, 2013
The Hubble telescope was notoriously flawed when it first launched. It was even the butt of jokes in movies and TV produced at the time. There was some data that the mirror was misshapen, but other data said it wasn't. NASA chose to believe the data saying it was correct.
posted by ImproviseOrDie at 9:47 AM on May 22, 2013
This thread is closed to new comments.