Best explanation of the Dunning-Kruger Effect
July 11, 2018 9:24 PM   Subscribe

I want to be able to define and explain the Dunning-Kruger effect, convince people of it, and make it memorable. I think what I want is a concise but evocative definition, and also one or two really good concise non-statistical examples of it in action. I know about McArthur Wheeler already.

I find myself trying to explain the Dunning-Kruger effect frequently. The studies around it (that I found quickly searching -- maybe there are others) seem based in the idea that more people think they are experts than actually are; the argument for the effect seems statistical.

I'm looking for an approach that will lead people to consider themselves as fallible in this way, rather than 'That's why those other people do such dumb things.'

I'd love to find first person examples/analogies that are memorable and likely to convince a person that they _themselves_ are just as likely to experience the effect. So, pointing to other people who are skeptical of science because they are not as educated is NOT what I'm going for.

More like, hey, remember how Dad went to the store and got the wrong kind of detergent because he didn't even _know_ there were multiple kinds, or putting a battery in backwards because you didn't realize there were two ways to do it, or something like that. Maybe? Maybe there's a better way of thinking about it?
posted by amtho to Grab Bag (20 answers total) 9 users marked this as a favorite
Something more like, "it takes an expert to know which things they don't know in a field". Known unknowns etc
posted by JonB at 10:01 PM on July 11

Maybe "When you hear about the DK effect, and just know it doesn't apply to you, that's the DK effect".
posted by Calvin and the Duplicators at 10:04 PM on July 11 [19 favorites]

This article by Errol Morris has a number of examples. It also features this summary of the effect, from Professor Dunning himself:
Even if you are just the most honest, impartial person that you could be, you would still have a problem — namely, when your knowledge or expertise is imperfect, you really don’t know it. Left to your own devices, you just don’t know it. We’re not very good at knowing what we don’t know.
A lot of Morris's examples are probably more complicated than you're looking for, but I thought this one (again, in a quote from Dunning) was pretty clear:
People often come up with answers to problems that are o.k., but are not the best solutions. The reason they don’t come up with those solutions is that they are simply not aware of them. Stefan Fatsis, in his book “Word Freak,” talks about this when comparing everyday Scrabble players to professional ones. As he says: “In a way, the living-room player is lucky . . . He has no idea how miserably he fails with almost every turn, how many possible words or optimal plays slip by unnoticed. The idea of Scrabble greatness doesn’t exist for him.”
This article by William Poundstone is shorter but contains some specific examples of the things Dunning and Kruger studied, including firearm safety and judging how funny jokes are.
posted by yankeefog at 1:19 AM on July 12 [4 favorites]

"By definition, you don't know what you don't know" is something I hear a lot.
posted by M. at 3:09 AM on July 12

I’m reading The Hate U Give by Angie Thomas with my older kids, and a more complicated example of the DK concept would be many white people in the United States and racial awareness/literacy (one of the reasons we are reading the book as a family).
posted by childofTethys at 3:23 AM on July 12

For people who see themselves as smart and educated: show them this relevant XKCD, and then explain that all of us are like that physicist, and this especially applies to people with deep knowledge of a certain field.
posted by SaltySalticid at 5:35 AM on July 12 [3 favorites]

Part of my job is tracking schedules for big construction projects (rebuilding powerplant generators, repairing dam spillways, etc.). The team members are all very accomplished and expert engineers, contracting agents, and project managers. Although I'm not in any authority position, my experience in tracking the projects provides opportunities to give feedback on the process.

You can guess that nearly every project runs behind schedule, and some team members never cease to be amazed when this happens. A few even get angry about it, with the implication that the other team members are either incompetent, lazy, or both. The reality is that it's basically an established fact that people underestimate the time needed to accomplish any given task. And when you're talking about a multi-year project that relies on dozens of people both inside our organization and contractors, slippage is guaranteed.

One of the ways I try to get others to understand that schedule slippage is human nature is to give my own example of telling my wife I need to mow the yard, then put some stuff away from the patio, and vacuum out the car, so I'll be done in an hour and a half. Two hours later, I'm still not done, even though I tried to take into account my knowledge that everything takes longer than you think. Then I ask them to think about a time they ran an errand or did some task around the house, and how their estimate compared with the actual time spent. The most common response is they get a big involuntary smile on their face and nod.

We all think we are "experts" on estimating schedules, but we don't know how bad we are at it, even when we think we do!
posted by The Deej at 5:43 AM on July 12 [5 favorites]

I think D-K is different from what Metafilter refers to as engineer's disease which is where you have someone who is legitimately an expert in one field not realizing that their expertise does not extend to all fields. I deal with this a lot as someone who provides technical support to highly accomplished researchers and professors. At least with engineer's disease there is a legitimate (logically flawed as it may be) reason for the person to feel like they have some expertise. Being the expert is sort of muscle memory for them and it's hard to turn off. D-K is more someone who has no earthly reason to think they're an expert on anything, but they act like they are anyway because they are so inexpert globally that they can't figure out what being an expert in anything would even entail. (See: most of the Trump administration and Trump's frequent "No one knew that X was so complicated!" [Narrator: Everyone knew but you, dumbass.] Though I think Tillerson was a case of engineer's disease, not D-K--he knew how to business and CEO, and thought that would translate to knowing how to bureaucrat and diplomat and it did not. DeVos is the walking manifestation of D-K.)
posted by soren_lorensen at 6:25 AM on July 12 [2 favorites]

If you haven't read the original article, do. Although it's a journal article, it's written clearly enough so that most MeFi types won't have any trouble understanding it, and it has three separate studies. The easiest to understand is humor: They asked professional comedians to rate jokes, then asked a sample of regular people to rate the jokes and their own ability to recognize humor.
posted by Mr.Know-it-some at 6:31 AM on July 12 [5 favorites]

I'm looking for an approach that will lead people to consider themselves as fallible in this way, rather than 'That's why those other people do such dumb things.'

For a longish conversation with an acquaintance, I would be tempted to ask them to name three fields or topics that they know a lot about and then try to lead them to times that people with no knowledge tried to mansplain to them and were wrong. But there's a step in the middle that I can't seem to figure out about extracting good examples without a specific area of knowledge to look at. It would be intriguing if we had an app or AI driven software that could pre-test people (at parties?) with 20 questions before field-specific conversations started.
posted by puddledork at 7:35 AM on July 12

"A little knowledge is a dangerous thing."
posted by porpoise at 9:30 AM on July 12

There are always plenty of good examples in the news because plenty of people sound off on topics about which they are not qualified. I remember one example clearly because it seemed to me so egregious.

It was a few years ago when there was an argument going around that the human eye was so marvelous that it could not have been the result of evolution but could only be the result of divine creation. I heard a pundit on the radio begin talking about it saying "I just happen to believe..." Yeah, this guy thought his off-the-cuff notion was more reliable than the consensus of hundreds of thousands of biologists studying the subject for a hundred years.

However, the essence of D-K is not that people offer opinions that they are not qualified to give. Rather it's the observation that if you are not an expert, you don't have the tools to know you are not an expert. I've had the experience many times that some solution to a problem seemed like it would work, but when I asked an expert, it was explained to me that of course they thought of that, but it wouldn't work for reasons A, B and C. Once I had that additional knowledge, I was just that tiny bit more expert and knew that my original idea was infeasible.
posted by SemiSalt at 10:13 AM on July 12

I've had the experience many times that some solution to a problem seemed like it would work, but when I asked an expert, it was explained to me that of course they thought of that, but it wouldn't work for reasons A, B and C. Once I had that additional knowledge, I was just that tiny bit more expert and knew that my original idea was infeasible.

Yes! SemiSalt, those specific examples might be exactly what I'm looking for. Could you give details of those problems, solutions, and the things the experts knew that you didn't?
posted by amtho at 10:56 AM on July 12

I'm having a little trouble remembering an example that happened to me, but here is one.

Back in 1983, the Mianus River Bridge in Greenwich collapsed. It was a big problem for trucking companies, some of whom demanded to be allowed on the Merritt Parkway which runs somewhat parallel for a distance. State officials said "Can't do that. The bridges aren't high enough." I believe they had to take some trucking execs over to the Merritt and show them the bridges before they got the point across. This may be on my mind since a North American Van Lines truck was destroyed by the King Street bridge just this week.

I'll keep thinking.
posted by SemiSalt at 11:55 AM on July 12

Check out You Are Not So Smart
posted by Gin and Broadband at 12:38 PM on July 12 [1 favorite]

Think of a marathon which is open to the public: there's going to be elite runners that can estimate their finish time to within a couple of minutes, there's going to be good club runners that estimate their time a little higher or lower than actual, erring on higher, and there's going to be rank amateurs that guess an unrealistically good time but end up barely finishing because they didn't take the prep seriously but they play football sometimes and are pretty fit, so. And of course some other people who guess realistically low and achieve it.

Compare the order people finish the race and the order their initial estimates would have put them in. The people finishing last aren't in any general sense "dumber" and the first to cross the line isn't "smarter", (or even "fitter": the late finishers could be elite athletes in a different sport) but the relative level of running expertise means that the fastest 10 will also be more accurate guessers than the last 10.
posted by Wrinkled Stumpskin at 12:52 PM on July 12

Here is an example I remember from many years ago. I told a friend that you absolutely had to use only brake fluid in a car's braking system. He argued that brakes are just a hydraulic system, and hydraulics will operate with any fluid, and that therefore there's no good reason not to use motor oil in an emergency. Turns out that most automotive brake systems use a type of rubber in the seals that swells with motor oil, quickly causing catastrophic failure.
posted by 445supermag at 5:10 PM on July 12 [1 favorite]

Another example:

"Why don't we just throw the nuclear waste into the sun?"

Seems easy, but leaving aside the possibility of the rocket blowing up on the pad and spreading the deadly stuff around, throwing something into the sun is very difficult. The earth doesn't fall into the sun because it's moving at 67,000 mph. That's more than three times the velocity it takes to put a satellite in earth orbit. To "throw something into the sun", it would take a rocket big enough to escape earth's gravity, and then also to slow it from 67,000 mph to something much slower. In other words, a really, really big rocket for even a ton or so of nuclear waste.
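The arithmetic behind that claim can be sketched with the vis-viva equation. This is a rough back-of-the-envelope check, not from the thread: the constants are standard rounded values, and the choice of a transfer orbit whose perihelion grazes the solar surface is my own assumption about what "hitting the sun" requires.

```python
import math

GM_SUN = 1.327e20   # sun's gravitational parameter, m^3/s^2 (rounded)
AU     = 1.496e11   # Earth's orbital radius, m
R_SUN  = 6.96e8     # solar radius, m

# Earth's circular orbital speed around the sun (~67,000 mph).
v_earth = math.sqrt(GM_SUN / AU)

# Speed at aphelion (1 AU) of a transfer ellipse whose perihelion
# grazes the sun's surface: vis-viva, v^2 = GM * (2/r - 1/a).
a_transfer = (AU + R_SUN) / 2
v_transfer = math.sqrt(GM_SUN * (2 / AU - 1 / a_transfer))

# To fall into the sun you must shed nearly all of Earth's orbital speed.
dv_to_sun = v_earth - v_transfer
print(f"Earth's orbital speed:   {v_earth / 1000:.1f} km/s")
print(f"Delta-v to hit the sun:  {dv_to_sun / 1000:.1f} km/s")
```

This gives a delta-v of roughly 27 km/s on top of escaping Earth, versus the roughly 9-10 km/s it takes just to reach low Earth orbit, which is why "throw it into the sun" is so much harder than it sounds.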

Yet another (this one may be too wonky), but it really happened:

Suppose you have a proof (perhaps a partial proof) of Fermat's Last Theorem which relies on unique factorization of the left-hand side of the equation:

a^n + b^n = c^n

(The expression a^n means "a to the nth power".)

But someone comes along and points out that even though the expression factors uniquely over the ordinary integers, it doesn't in the ring of complex numbers the proof uses. So maybe you can do some more math and handle that case.

But then someone else comes along and points out that it may not have unique factors in some other kind of numbers that you don't know about and that maybe haven't been invented yet. So somehow, you have to prove that can't be done.
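To make the "unique factorization can fail" point concrete, here is a small sketch using Z[sqrt(-5)], the textbook ring where 6 = 2*3 = (1+sqrt(-5))*(1-sqrt(-5)). This specific ring is my own illustrative choice, not one named in the thread; the check relies on the norm N(a + b*sqrt(-5)) = a^2 + 5b^2 being multiplicative.

```python
def norm(a, b):
    """Norm of a + b*sqrt(-5); multiplicative, and equals 1 only for units."""
    return a * a + 5 * b * b

def norm_is_achievable(n, limit=10):
    """Is n = a^2 + 5*b^2 for some integers a, b (searched in a small box)?"""
    return any(norm(a, b) == n
               for a in range(-limit, limit + 1)
               for b in range(-limit, limit + 1))

# Both factorizations of 6 check out via norms: N(6) = 36 either way.
assert norm(2, 0) * norm(3, 0) == 36      # 6 = 2 * 3
assert norm(1, 1) * norm(1, -1) == 36     # 6 = (1+sqrt(-5)) * (1-sqrt(-5))

# Each factor is irreducible: splitting any of them would require an
# element of norm 2 or 3, and a^2 + 5b^2 never takes those values.
assert not norm_is_achievable(2)
assert not norm_is_achievable(3)
print("6 factors two genuinely different ways in Z[sqrt(-5)]")
```

So a proof that silently assumed "factorizations are unique, like in the integers" would quietly break in a ring like this, which is exactly the trap the comment describes.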
posted by SemiSalt at 9:36 AM on July 13

"The first rule of Dunning-Kruger Club is you don't know you're in Dunning-Kruger Club" is a new favorite of mine.
posted by Jacqueline at 10:28 AM on July 13 [2 favorites]

My eighth-grade teacher had this on a poster in her classroom, and it's stuck with me ever since: "A big part of being smart is knowing what you're dumb at."
posted by Rykey at 8:16 PM on July 16

