How do I find out which of the things I believe in are not real?
March 1, 2020 1:30 AM   Subscribe

I've been reading about flat earthers, anti-vaxxers, anti-taxers and similar folks. One thing they seem to have in common is the ability to reject obvious facts. It kind of spooks me. I'd like to find a test that determines which less-than-mainstream thoughts are in my head, and how to remove them.
posted by Marky to Science & Nature (12 answers total) 32 users marked this as a favorite
 
Here are 32 suggestions to get you started. Some of this may be familiar - but useful if you were one of those simple people who believed the earth orbits the sun or that humans evolved from apes. It looks like they stole from Wikipedia's master list of popular misconceptions.
posted by rongorongo at 2:41 AM on March 1, 2020 [4 favorites]


The BBC's conspiracy theory quiz

How Stuff Works' conspiracy theory quiz

The Independent's common misconception quiz

clearerthinking.org common misconception quiz

For non-quiz resources, try Straight Dope, Snopes and Cracked.

I'd suggest challenging yourself regularly by noticing the "facts" you take for granted and asking yourself "hmm, I've always heard that but how do I know that's true?" and doing a little research.
posted by bunderful at 4:44 AM on March 1, 2020 [3 favorites]


If I might push back gently against the premise of the question, the degree to which a belief is mainstream is not much of a guide to its reliability. There are many completely mainstream "common sense" ideas that are actually just broken.

The best way I know to get rid of beliefs that are either untrue or unhelpful or both is just learning to observe how emotionally I react when any given belief is challenged. If I find myself feeling personally affronted by somebody else's expression of a belief that's at odds with one of mine, that's a pretty good indicator that I need to go away and make an honest search for counterexamples to my own cherished truths. That's a far far more reliable reality check than the more common procedure of looking for confirming instances.

Beliefs, it seems to me, should properly always be provisional. Having beliefs is generally a good thing because it makes reasoning practicable: trying to reason without some framework of workable assumptions just takes too long to be useful. But axioms - self-evident beliefs - are risky things and it pays not to cling tightly to many of those. The point is not to allow any particular belief to be so strongly required as to make one's entire worldview collapse if it's proved to be deficient.

The only axiom I've never found a counterexample to is that something is going on. Everything more detailed than that is provisional.
posted by flabdablet at 4:54 AM on March 1, 2020 [21 favorites]


I would second the idea that "less than mainstream" does not equal invalid, and that "mainstream" does not equal true.
Obvious example: Economic growth can continue indefinitely within a finite ecosystem. This belief is not only mainstream, it runs the world. It's completely wrong.
I'm one of those people who questions everything and believes very little. The only significant thing I've been unable to disprove was pointed out by the Buddha 25 centuries ago: everything is impermanent.
posted by Joan Rivers of Babylon at 6:06 AM on March 1, 2020 [7 favorites]


The general question of how we know what we know and how we know what is true is very deep. However, in real life, it's pretty clear that we "know" most of what we think we know because someone told us. It might have been a parent, a teacher, a classmate, a journalist, a filmmaker, or just a guy in an airport waiting area. Like Snopes, we can probably run down the original source for a lot of things, but our experience is that we are often dependent on someone else being right.

A second way to know something is through experience. By now, you are probably pretty sure flames are hot and ice is cold. But experience can also be misleading, and we are all susceptible to confirmation bias. I had some chicken soup, and I felt better....

Just restricting ourselves to medical subjects, we are constantly reminded that even the "experts" are often wrong. Butter was good until it was bad, then bad until it was good again. Exercise was good for losing weight until it wasn't. Etc, etc, every day. Think about how many decades, maybe centuries, sick people were bled without anyone noticing that it wasn't doing any good.

A couple of ways to examine a proposition to see if it holds up:

1) Does it actually make sense? Is it really likely that jets are spewing special substances (chemtrails) over the land and thousands of people are keeping the secret? Is it really likely that the Apollo project, which employed tens of thousands of people, was a hoax?

2) Does it stand the test of time? It was plausible to doubt Darwin's theory of evolution in 1860, but is it really plausible now given the generations of biologists that have studied it and the confirmation of it with modern genetics?
posted by SemiSalt at 7:38 AM on March 1, 2020 [1 favorite]


However, in real life, it's pretty clear that we "know" most of what we think we know because someone told us.

Yep. The real difference between most flat earthers or anti-vaxxers and most people who are sure those ideas are wrong is simply who they choose to listen to. If you "know" the earth is not flat, it most likely is not because you have personally examined any of the evidence for a round earth. You most likely don't even know what supporting scientific evidence exists. You just know that mainstream scientists agree that the earth is round. You don't really know the earth is round any more than a flat earther knows it isn't.

It sounds like that's the kind of "knowing" you're looking for. You want to make sure none of your ideas are out of the mainstream. Fortunately, that's an easier task than making sure your ideas are true. You can simply check to see what kinds of sources are presenting an idea. If it's coming from well-respected newspapers, government agencies, scientific organizations, etc. it's probably mainstream. If it's coming from sources that claim the truth is being covered up or denied because of some conspiracy, it's almost certainly not.

But I suspect what you really want to know is not how you can check out the validity of a new idea someone puts forth on Facebook, but how you can go through the ideas you already accept as true, some of which you may have held without question for years, and identify those that may not be supported by mainstream thought. Probably the best thing you can do is read widely, including the kind of reading you've been doing lately about fringe ideas, the common misconceptions lists people are suggesting above, browsing Snopes, etc. Eventually you'll begin to get a feel for the type of thinking that's generally considered wrong so you'll be more likely to recognize it in your own beliefs.

It's also good to develop a willingness to question anything and everything. Of course, doing a good job of that means being willing to consider whether some mainstream ideas might be wrong. If you want to move towards feeling confident that your ideas really are true, as opposed to generally recognized as true, you'll need to do that. And probably what will actually happen is that you will become less confident that you really know the truth. That's not a bad thing, though.
posted by Redstart at 8:34 AM on March 1, 2020


I can't remember where I learned the phrase, but, the question, "what would convince you that you are wrong," works for me. If there is no sincere answer, you've crossed over into beliefs that have no connection to reality. If you can clearly articulate the answer, and it hasn't been met, then you're just a weirdo who questions things. That's not a bad thing.

As someone whom 95% of the world would disagree with (being on the far-left, atheist, materialist side), consensus, in general, seems like a poor guide. But, obviously, most would disagree. (And consensus among experts who share your broad world view may deserve a special category.)
posted by eotvos at 9:39 AM on March 1, 2020 [3 favorites]


I've been listening to a podcast called "You're Wrong About" which is basically these two millennial journalists researching hot button news stories from their childhoods and exploring the nuance that everybody missed the first time around.

One of their favorite questions is "What do we not need evidence to believe?" For example, the satanic panic of the '80s - it happened right around the time that awareness of childhood sexual abuse was starting to hit the mainstream, but denial of that was very strong. However, the idea that daycare workers were abusing toddlers in satanic rituals was somehow accepted by huge numbers of people. Like, really? How would that even work? Could you actually get a room full of toddlers naked for a ritual? Do you know how hard it is to get a room full of toddlers to put their coats on to go outside?

Now I always ask myself what level of evidence I'm requiring for a belief. I remember reading somewhere that we apply different standards of evidence to things that confirm our preconceptions vs. things that challenge them. It's CAN I believe that vs MUST I believe that.

So, if you tell me that deficits always are reduced during Democratic administrations and rise during Republican administrations - yeah, sure I can believe that. I accept it and don't look further.

If you tell me Nixon supported civil rights for African Americans - I'm gonna have to google that one. I won't believe it until I must.

It's hard to catch erroneous beliefs in yourself but I think one good start is to interrogate your evidence. Just imagine a little mefi demon sitting on your shoulder saying "cite, please" constantly.
posted by selfmedicating at 9:42 AM on March 1, 2020 [3 favorites]


Oh just thought of another thing. There's a branch of psychology called Behavioral Economics that explores common cognitive distortions that people are prone to. There's a great book called Predictably Irrational that you would probably like. The podcast Hidden Brain is also a great resource for exploring some of the common ways that we humans fool ourselves. If you know some of the most common traps I think you'll be less likely to fall into them.
posted by selfmedicating at 9:50 AM on March 1, 2020 [4 favorites]


Stand in front of a bathroom mirror and look at your reflected face. Familiar? How big is that reflected face compared to your real-life face? Check your assumption by measuring both from top of head to chin.
posted by rongorongo at 11:51 AM on March 1, 2020


I realize the question is which less-than-mainstream ideas are in the OP's head and how to remove them.

But it's (to me) amazing at how some concepts just hang in there despite being easy to debunk.

#1 - mirrors reverse things left and right... no. If you take a piece of thin paper, write your name on it with a black marker, and hold it up in the bathroom so you can read it, you can also read it in the mirror. It won't "reverse" unless you turn the paper around to "show" it to the mirror - and you could just as easily flip it over top to bottom, in which case it will appear upside down rather than left-right reversed!

#2 - the use of schizophrenia to mean split personalities (as opposed to psychotic split from reality) is very easy to check in Google or a dictionary, but people continue to use it incorrectly.

So the above two examples aren't like the round earth: they can be easily checked in just a few minutes. It reminds me of the Carl Sagan quote "Every kid starts out as a natural-born scientist, and then we beat it out of them."
posted by forthright at 7:42 PM on March 1, 2020


I apologize in advance if this example doesn't work for you, but there is a 70% chance it will.

Reading the following passage will very possibly contradict something you "know".
New York Times: "[...] About 90 percent of scientists believe G.M.O.s are safe — a view endorsed by the American Medical Association, the National Academy of Sciences, the American Association for the Advancement of Science and the World Health Organization [...]."
That feeling of knowing you're right despite other people's input is sometimes called intuition. Whenever you decide to trust it over the facts, you're in dodgy territory.
posted by Tell Me No Lies at 7:52 PM on March 1, 2020


This thread is closed to new comments.