What cognitive biases should everyone know?
August 1, 2012 10:06 AM   Subscribe

What are the five cognitive biases that everyone should know?

I am teaching an introductory logic class in the fall. Along with the usual formal developments, I would like to set aside five or ten minutes of lecture time each week to introduce students to important and/or interesting cognitive biases, like the fundamental attribution error, the planning fallacy, and so on.

But which ones? Lots of cognitive biases have been studied, and most of them are fascinating. Help me narrow the list by telling me what you think is the most important one, or what you think are the most important five (or ten or fifteen). One semester gives me the opportunity to make about 15 such presentations, plus or minus two, so giving me more than 15 is probably not worth doing.

Even better if you can give me a short, sweet argument that the bias(es) you suggest is (are) the most important.
posted by Jonathan Livengood to Education (32 answers total) 297 users marked this as a favorite
 
We would all do well to learn to recognize subjective validation (which ties in with the Forer effect.)
posted by griphus at 10:08 AM on August 1, 2012 [16 favorites]


Best answer: As a layperson I think Dunning-Kruger effect is FASCINATING, and I think it's worth teaching about because it applies to everyone; it's something I notice a lot in my own life and with colleagues and students; it doesn't just apply to logic or argument, it applies to how we think about ourselves. People can really relate to it (thinking you're great at something because you don't know better or don't have information; for example, my second-grade students thought they knew ALL the math because they could add and then they learned about multiplication. They didn't believe me when I told them there was something called calculus).

It's also good because it helps you be more thoughtful about your own positions and abilities in areas where you don't have much experience AND it can help you be more confident in areas where you do have experience because you realize that even though there's a ton you don't know, you're recognizing that on a higher level. I talk about it all the time including in job interviews and stuff. Of course, I don't currently HAVE a job so perhaps that's not the best argument in favor of it. Still, it's really interesting and gives you a good perspective for life as well as logic.
posted by Mrs. Pterodactyl at 10:14 AM on August 1, 2012 [15 favorites]


I don't know if it's "cognitive" enough to fit your course, but I would put in a pitch for talking about presentism/recency bias as a useful thing to do in nearly any early-undergrad class. I see it as one of the fundamental goals of almost all college education in the humanities to impart a richer historical consciousness, an awareness (and reflexive skepticism) of our bias to think of things in the terms of our own day though past people may have thought quite differently about them. So one argument for talking about it is that it'll connect to things your students are studying across the humanities, in other historically-informed classes or disciplines, and with luck provide some more abstract theoretical reinforcement and vocabulary for the habits they're being asked to develop elsewhere in place- and period-specific courses.
posted by RogerB at 10:22 AM on August 1, 2012 [2 favorites]


Best answer: I can't believe that the Baader-Meinhof phenomenon (a.k.a. frequency illusion) doesn't have its own Wikipedia page. I hear about it everywhere.
posted by Etrigan at 10:34 AM on August 1, 2012 [19 favorites]


Expectation bias is important because it shapes what knowledge is shared with the world. We tend to believe scientific results that fit with what we already "know" to be true and may have a hard time recognizing when something else is actually at work.
posted by goggie at 10:37 AM on August 1, 2012


Best answer: Sunk Cost Fallacy?

This has kept more of my girlfriends in marginal relationships than money and a big penis.
posted by Ruthless Bunny at 10:39 AM on August 1, 2012 [25 favorites]


There used to be a WP entry on the Baader-Meinhof phenomenon, but the deletionists killed it. (Most of the reasons for doing so boiled down to "I've never heard of it," despite ample sources.) It's too bad because the article was very good at one point.
posted by Kadin2048 at 10:40 AM on August 1, 2012 [6 favorites]


Best answer: Confirmation bias.

Availability bias (e.g. overestimating the likelihood of a terrorist attack because vivid memories of specific attacks easily come to mind). Related: normalcy bias ("this kind of terrorist attack has never happened before, so there's no need to plan for it").

Sunk costs, or escalation of commitment.

Anchoring and focusing.

Base rate fallacy.

Illusion of control. (This could have huge implications for public policy.)

Neglect of probability. ("This will either happen or it won't, so it's a 50/50 chance!")

Underestimating the likelihood of clustering.

Outcome bias (e.g. the winning candidate must have run a brilliant campaign, and the losing candidate must have run a terrible campaign).

Experimenter's bias.
posted by John Cohen at 10:44 AM on August 1, 2012 [17 favorites]
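
[Editorial aside: the base rate fallacy in the list above is the one that benefits most from a worked example. Here is a minimal sketch using Bayes' rule; the disease-test numbers are invented for illustration, not drawn from the thread.]

```python
# Base rate fallacy, worked numerically (all numbers invented):
# a test is 99% accurate for a condition affecting 1 in 1,000 people.
# Intuition says a positive result means ~99% chance of having it;
# Bayes' rule says otherwise, because the base rate is so low.

base_rate = 0.001          # P(condition)
sensitivity = 0.99         # P(positive | condition)
false_positive = 0.01      # P(positive | no condition)

# Total probability of testing positive, across both groups.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' rule: P(condition | positive).
p_condition_given_positive = sensitivity * base_rate / p_positive

print(round(p_condition_given_positive, 3))  # prints 0.09, i.e. about 9%
```

The false positives from the huge healthy population swamp the true positives from the tiny affected population, which is exactly what neglecting the base rate hides.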


Prospect Theory
posted by JPD at 10:44 AM on August 1, 2012 [1 favorite]


This overlaps with RogerB's suggestion, but the Availability Heuristic. It applies to almost any case when a question with an open-ended answer is asked, and it has a huge effect on the choices we make in everyday life. Why does McDonalds advertise so much? When I asked my doctor why he was prescribing a specific antibiotic rather than one of the half-dozen other options, he said, "Well, this one has been working on my patients."
posted by Mr.Know-it-some at 10:46 AM on August 1, 2012 [1 favorite]


Self-Serving Bias
posted by chndrcks at 10:51 AM on August 1, 2012


The Focus illusion. Awareness of this is really helpful on a personal mental health level as well as contributing to general rational thinking.
posted by weston at 10:55 AM on August 1, 2012


The Availability Cascade (great for leaping off into topics related to politics and public discourse, the role of mass media/Twitter trends/etc. in forming your judgments about the world, etc.)

The Backfire Effect (especially good in the context of how small social/political/religious interests tend to respond to negative revelations about either leaders or the existing body of arguments they rely upon.)

The Clustering Illusion (I really must insist on at least one directly applicable to the reasons you should not ever put money into a lotto drawing of any size, but this is one that can be really easily shown in about thirty-five seconds using a demonstration on a number line as seen in this Khan Academy video, so.)

The Curse of Knowledge (especially good for anyone in your class thinking about teaching or becoming a TA in grad school, or wondering why their math/science/etc. teachers are so often lousy at actually explaining things to them.)

Pareidolia (mostly just because it's important for understanding a lot of why so many people come up with such different arguments using the same information, and why people are so freaking insistent about finding/defending patterns when there's no reason to suppose a pattern is present; I stopped thinking "Jesus in a piece of toast" illusions were a likely sign of psychosis when I finally understood/accepted this one.)

I also think the sunk cost fallacy and confirmation bias are vital, but they were mentioned above, and your students are highly likely to hear about them somewhere else.
posted by SMPA at 11:12 AM on August 1, 2012 [8 favorites]
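
[Editorial aside: the clustering illusion SMPA mentions can be demonstrated in a few lines of code rather than on a number line. The sketch below simulates fair coin flips; the specific numbers are just one way to set up the demo.]

```python
# Clustering illusion demo: in 100 fair coin flips, how long is the
# longest run of identical outcomes? People asked to fake a "random"
# sequence rarely write runs longer than 3 or 4; genuine randomness
# produces much longer streaks, which then look like meaningful patterns.
import random

random.seed(42)  # fixed seed so the demo is reproducible

def longest_run(flips):
    """Length of the longest streak of identical consecutive values."""
    best = current = 1
    for prev, cur in zip(flips, flips[1:]):
        current = current + 1 if cur == prev else 1
        best = max(best, current)
    return best

trials = [longest_run([random.randint(0, 1) for _ in range(100)])
          for _ in range(1000)]
print(sum(trials) / len(trials))  # averages around 7 flips in a row
```

A streak of seven heads in a row feels like a hot hand or a rigged coin, but it is just what unbiased randomness looks like.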


Another vote for the Dunning-Kruger Effect.

Also, The Introspection Illusion is awesome and kind of terrifying.
posted by alphanerd at 11:27 AM on August 1, 2012 [1 favorite]


The danger is always that students will learn the formal properties of a bias or fallacy but fail to recognize it in their own reasoning in everyday life. Base rate neglect is one of the most hugely important biases, but I'm not sure that many students will internalize it well enough for it to affect their lives. On the other hand, students are very good at understanding sunk costs, and there's evidence to show that a class on it affects students' future decisions. It would be nice to know exactly which ones are readily internalized, but in the absence of such information, I think it makes sense to pick "easy", less abstract ones that it seems like students would immediately see as applicable to them, like the fundamental attribution error or the planning fallacy.
posted by painquale at 12:01 PM on August 1, 2012


Post Hoc ergo Propter Hoc

No True Scotsman

Misleading Vividness is my nominee for the most important one, because it is abused so heavily by zealots for purposes of propaganda.
posted by Chocolate Pickle at 12:29 PM on August 1, 2012 [1 favorite]


Just World Fallacy
posted by Hactar at 12:36 PM on August 1, 2012 [2 favorites]


Seconding Dunning-Kruger and Anchoring.
posted by Hollywood Upstairs Medical College at 12:40 PM on August 1, 2012


Seconding sunk cost. Learning that made a major change in my life. Explaining it to others has helped them out as well.
posted by Hactar at 12:49 PM on August 1, 2012


Survivorship Bias
Bias bias: treating the presence of a bias or fallacy in an argument as proof that its conclusion is false.
posted by jade east at 12:52 PM on August 1, 2012 [1 favorite]


Here's a list of 24, with short descriptions (I love this site).
posted by mon-ma-tron at 12:59 PM on August 1, 2012 [1 favorite]


Second neglect of probability. Often turns up in metaphysical arguments.

No clue what the name for it is, or even if it's a fallacy, but the fact that a 1-in-a-million chance happens 300 times a day in the USA (pop. ~300m).
posted by katrielalex at 2:21 PM on August 1, 2012 [1 favorite]
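
[Editorial aside: the closest named version of katrielalex's point may be the "law of truly large numbers" — with enough opportunities, rare events become routine. The arithmetic is one line; the population figure is the rough one given in the comment.]

```python
# "One in a million" events are everyday events at population scale.
p = 1 / 1_000_000          # daily chance of the rare event, per person
population = 300_000_000   # rough US population, as in the comment above

expected_per_day = p * population
print(expected_per_day)  # prints 300.0
```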


And people get converses and contrapositives mixed up all too frequently: (A implies B) and (not-B implies not-A) are equivalent; (A implies B) and (B implies A) are not.
posted by katrielalex at 2:27 PM on August 1, 2012 [1 favorite]
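
[Editorial aside: katrielalex's claim about contrapositives can be verified exhaustively, since there are only four truth assignments to check. A minimal sketch:]

```python
# (A implies B) is equivalent to its contrapositive (not B implies not A),
# but NOT to its converse (B implies A). Check all four A/B combinations.

def implies(p, q):
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

for A in (True, False):
    for B in (True, False):
        # Contrapositive: always agrees with the original implication.
        assert implies(A, B) == implies(not B, not A)

# Converse: A=True, B=False is a counterexample to equivalence.
assert implies(True, False) != implies(False, True)
print("contrapositive equivalent; converse not")
```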


Another vote for anchoring in the top 5. Very practical for preparing students for careers in politics, business, law, marketing, etc.
posted by likeatoaster at 3:39 PM on August 1, 2012


I can't believe that the Baader-Meinhof phenomenon doesn't have its own Wikipedia page. I hear about it everywhere.

One of the comments on your link mentions "Plate of Shrimp" from "Repo Man," which is how people I know refer to this phenomenon. Unlike many, I'm quite familiar with the Baader-Meinhof gang, from writing a paper about terrorism for a modern German History class back in the day; your link doesn't give a very convincing argument for naming this effect after them.
posted by Rash at 7:28 PM on August 1, 2012


Wikipedia has a list of cognitive biases: http://en.wikipedia.org/wiki/List_of_cognitive_biases

There's a very practical use of the Planning Fallacy. Always ask others (e.g. colleagues, contractors) to come up with their own time estimates for finishing their work. Without fail, they come up with an estimate of about half the time I would have suggested.
posted by xammerboy at 7:41 PM on August 1, 2012


PLEASE teach every student you cross paths with the true meaning of "begging the question". Maybe then people will stop using it to mean "raises the question" and I will once again be able to sleep peacefully at night.

Circular reasoning is bad mostly because it's not very good.
posted by triggerfinger at 8:10 PM on August 1, 2012 [1 favorite]


I don't know the name of this bias so I'll call it "Ownership Bias" - the tendency to defend a belief, process, design, or system based on emotional attachment rather than logic. I.e. because you created it, or otherwise invested heavily in it, you'll strongly defend it.

For example, it's a known negotiation tactic to try to convince your adversary that they came up with the idea you wish to convince them of, rather than try to convince using pure logic.
posted by Terheyden at 8:44 PM on August 1, 2012 [1 favorite]


Seconding Just World Theory. It's soooo prevalent, and a belief in it leads to such a lack of compassion--for others and for oneself.
posted by parrot_person at 2:03 AM on August 2, 2012


Anchoring Bias is *enormously* useful for things like negotiating money.
posted by talldean at 9:17 AM on August 2, 2012


Response by poster: Thanks everybody! I would love to mark you all as best, but I will be a little more selective -- no offense meant if your answer didn't get picked.
posted by Jonathan Livengood at 10:11 AM on August 2, 2012


So--are logical fallacies (e.g. post hoc ergo propter hoc) really cognitive biases?
posted by snuffleupagus at 4:02 AM on August 8, 2012

