Adjusting the mental rear view mirrors.
December 13, 2011 10:59 AM   Subscribe

How do I spot and account for my philosophical blind spots?

When talking to an incredibly smart friend (she's completing her PhD in physics at the moment), she brought up global warming and how she didn't believe in it. I think a large part of this is due to her dad, who worked as a statistician for many years and, as far as I can tell (I've never met him), is incredibly right wing.

She also said that she didn't have time to read the literature and was relying on his information more than anything else. I shifted topics, not wanting to get into a long debate during a transoceanic chat.

But this got me thinking: this is essentially a blind spot in her philosophy/worldview, for someone who otherwise is (from my perspective at least) very rational and thoughtful. She generally doesn't suffer from the blindness, which I've read about in other scientists, of trying to apply her expertise everywhere.

If she has blind spots like that in her worldview, I probably do as well. Confirmation bias and the like.

So what sort of examination of my beliefs and philosophy can I do to watch for things like this, where my belief structure has me ignoring reality? It's something that I try to watch myself for, but I can't shake the idea that I'm probably doing the same thing as my friend.
posted by Hactar to Religion & Philosophy (30 answers total) 21 users marked this as a favorite
 
Your friend admitted that she "didn't have time" to read the literature about global warming and climate change. And thus, she has what you have termed a "blind spot" on the issue.

It strikes me that actually reading about the principles you hold, and all the arguments for AND against, would therefore be a good way to ensure that you have at least entertained the arguments from all sides.

(I actually made a point of doing that for the Creationism/Bunk debate, even though I was 99.9% sure that I was still going to conclude it was bunk. And yes, sure enough, it's bunk; but I understand its bunkitude a lot better, and am confident it's not just a blind spot of my own.)
posted by EmpressCallipygos at 11:10 AM on December 13, 2011


There are lots of epistemological developments over the past 100 years (post-Kantian) or so that focus on the finitude and opacity of self-consciousness. One common point among many is that bias and prejudice, while not exactly desirable, are unavoidable. Even a systematic effort at dragging our prejudices into the light of day and exposing them just obscures all the others. When we pay attention to attitude X, we unconsciously neglect Y.

Some approaches see this as cause to give up the task of pursuing "objectivity" altogether, while others reformulate critical thinking as an ongoing, never-completed task of examining beliefs and attitudes.

If the latter approach strikes you as worthwhile, I suggest starting with a critical thinking text.
posted by reverend cuttle at 11:13 AM on December 13, 2011 [2 favorites]


Keep in mind that even the smartest people only know what they are told or read about. (Helen Keller was criticized because, as a deafblind person, how could she know anything except what people told her? Mark Twain replied, that's true of all of us.) Unless we are actually in Iraq or friends with Putin or experiencing intense sunburns in Antarctica due to the ozone layer thinning, we don't know squat. Sure, we have informed opinions, but they're only due to secondhand information. I remember I would have arguments with conservatives about various topics, and they would spout what they read in the Economist about events far away from their everyday lives, and I would either shrug because it was boring or just quote from things that I had read. But when the topic came to something that I knew about (disabilities, which I was involved with for 17 years), I could have a genuine conversation because I had experienced it firsthand.

Unfortunately, we can't experience everything firsthand. But then, neither can the people you're dealing with.
posted by Melismata at 11:15 AM on December 13, 2011 [4 favorites]


Sadly, you're not going to be able to correct for more than a tiny portion of your blind spots. Think of the knowledge you think you have now: what percentage of it have you discovered or derived firsthand? 0.001%, maybe? We all depend on second- or third-hand information, and our knowledge can't be any more certain than the trust we place in our personal webs of knowledge.

I think the important thing is to pick your battles. If there's a topic you care about, try to get as close to doing first-hand work on it as you can.
posted by facetious at 11:15 AM on December 13, 2011


or, like melismata said :-)
posted by facetious at 11:16 AM on December 13, 2011


For me the thing is just recognizing that it's okay to live with some uncertainty. If you value a reasoned position, supported by legitimate evidence, then it's also reasonable to think that there will be many, many instances in your life where you will have to admit that for whatever reason you will be unable to invest in gathering that evidence... that you will have a 'blind spot.' And the same applies to everyone else.

In terms of trying to discover where your 'belief structure' doesn't gel with 'reality,' well that's a pretty open question. Obviously, reading up on an issue, accounting for your perspective and investigating others, will help. But 'reality' is a heavily (and socially) manipulated thing, right? It relies on our experiences, and on our interpretations of those (shared and individual) experiences. Especially in the days of filtered searching, etc., we're all kind of living in specialized realities.

Being open to some degree of uncertainty can, I think, mitigate some of the problems associated with philosophical blind spots.
posted by emilycardigan at 11:25 AM on December 13, 2011 [2 favorites]


Just ask yourself, "How do I know this?"

Right: "I believe the earth goes around the sun. How do I know this? Because of scientific reason A, B and C. And I know A because of ..."

Wrong: "I believe XYZ. How do I know this? Because my friend Dave told me he knew this guy who had this brother that one time said ... OK, wait a minute here ..."
posted by Cool Papa Bell at 11:27 AM on December 13, 2011 [1 favorite]


Identify a belief that you hold strongly, and then seek its strongest contradiction.
posted by No Robots at 11:29 AM on December 13, 2011 [2 favorites]


Two essays/books that expanded how I thought about philosophical blind spots:

1. Derrick Jensen in his book Endgame (and in some of his talks) discusses how people go to great lengths to hide their presumptions.

I think that in order to discover some of our philosophical blind spots, we need to look at the "hidden" presumptions behind our beliefs. Here is his example of the hidden premises behind a fairly common sentence:
For example, you hear some talking head on television ask, “How are we going to best make the U.S. economy grow?” Premise one: We want the U.S. economy to grow. Premise two: We want the U.S. economy to exist. Premise three: Who the hell is we?
I think this can get a little tiring, but it is a helpful exercise to figure out what premises lie behind your own beliefs. You can evaluate those premises on their own.

2. George Orwell's "Politics and the English Language" also illuminated how people use language to obfuscate their beliefs. If you can become clearer in how you explain your beliefs, then you can find or eliminate your philosophical blind spots.
posted by baniak at 11:31 AM on December 13, 2011 [8 favorites]


But this got me thinking: this is essentially a blind spot in her philosophy/worldview

Uh, that's a big leap on your part. You do know that this field of study is very much in flux, right?
posted by blargerz at 11:41 AM on December 13, 2011


Reading up on the various cognitive biases is a useful exercise.
posted by dfriedman at 11:42 AM on December 13, 2011


You Are Not So Smart is chapter after chapter of cognitive biases explained to the lay person. It's a humbling read, in a good way.
posted by quarterframer at 11:50 AM on December 13, 2011


Best answer: For the second time today, I'm answering a question with the perspective of someone who once asked the same thing, but who's now older.

I have good news and bad news.

Bad: there's no way to systematically do this.

Good: your outlook makes it happen on its own. You describe a sincere, honest openness which most people lack. Most people not only don't seek to fill in gaps and question their assumptions; they unconsciously (or even consciously) recoil from all that.

If you can hang on to this earnest uncertainty, this honesty, this open curiosity, you'll never stop learning (most people crystallize in their mid-20s). The gaps will fill in by virtue of your open-eyed, curious thirst for truth. No need to force it! No need to cultivate some academic notion of completionism.

Just let the waters of life erode your false assumptions and fill in your gaps by virtue of evidence which will never stop presenting itself if you're merely awake and receptive to it!

Stay curious and open-eyed. Never think you've "arrived". Stay fluid, and keep thirstily chasing down info and insights when things intrigue or surprise you. This is a major key to life itself.
posted by Quisp Lover at 11:58 AM on December 13, 2011 [5 favorites]


There are many books devoted to this important topic. Try browsing the philosophy section of a nearby library or bookstore. For instance, I haven't read Attacking Faulty Reasoning, but it looks good (if you can find it at a reasonable price).

The Meaning of It All, by the famous physicist Richard Feynman, is a very short, engaging, enlightening book that touches on these themes in the context of science, religion, and so on.

Here's a good, well-sourced piece in Cracked listing 5 ways we persistently go wrong in our thinking. (The headline is misleading — it's not so much about "logical fallacies" as it is about why we commit the fallacies at all.)

I have lost track of how many times I've re-read this article by Jonah Goldberg about the "tyranny of cliches."

Here's a profound and relevant passage by John Stuart Mill. (That's a self-link to my own Posterous site. I added paragraph breaks to make it more readable online. If you want to go straight to the source, here's the whole chapter from On Liberty.)

One very simple idea I try to keep in mind, which you can apply to any controversial issue: ask yourself why an intelligent, reasonable person would disagree with you. If your answer (on a certain issue) is that there's no way for someone to reasonably disagree with your view, you probably haven't thought about it hard enough. Notice how this is based on that passage by Mill. People tend to characterize their opponents' views so that they sound unreasonable. That's a terrible mistake. You might as well give up on thinking altogether. It is crucial to understand your opponent's view framed as reasonably as possible. Mill says that if you haven't done that, you don't really understand your own view.

Finally, if you like to read blogs, there are many good ones that focus on themes of skepticism, epistemology, etc. Here are a few I like: Overcoming Bias, Meteuphoric, Rationally Speaking (which also has a podcast), Less Wrong.
posted by John Cohen at 11:58 AM on December 13, 2011 [10 favorites]


I think the biggest key is being willing and ready to admit that you don't know something or that you could be wrong. IMO informative non-competitive conversations are more likely to be productive than ego-driven-I'm-right-you're-wrong debates.

Ask yourself, about a couple of different hot topics, "What if I was wrong about this?" What would it mean if you realized you were wrong about Colorful Martians? Would you lose friends? Lose face? Could you be frank with the people you care about? If you brought it up over Thanksgiving would there be mild interest or an outraged uproar?

Or "What could convince me that I am wrong?" If the answer is "nothing," hmm, well. That's interesting.

I think these questions are valuable because they alert you to issues that you have attachments to that have nothing to do with facts. And most likely, no matter what, most of us will keep holding beliefs because they're what our parents thought, or NOT what our parents thought, or because we have so much of ourselves invested in those beliefs that to say "whoops, I think Colorful Martians are in fact X and not Y!" would mean redefining your whole life.
posted by bunderful at 12:02 PM on December 13, 2011 [1 favorite]


"But this got me thinking, this is essentially a blind spot in her philosophy/world view, for someone who otherwise is (from my perspective at least) very rational and thoughtful. She generally doesn't suffer from the blindness of trying to apply her expertise everywhere that I've read about other scientists doing. "

This looks like a blind spot to me: she could be right. Climate science is complicated, and the number of people with an informed scientific opinion is many orders of magnitude smaller than the number of people with an opinion they are certain is right.

Be wary of certainty in general.
posted by pseudonick at 12:21 PM on December 13, 2011 [4 favorites]


I think OP is saying that his friend has a blind spot not because she disagrees with him on climate change but because she stated a firm opinion and also said she had not done her own reading on it, and this is apparently out of character for her, and yet she does not appear to realize this.

OP, I'm giving you the benefit of the doubt here and assuming that if your friend had said "I've read X and Y and have been following Z discussion, and that information has shaped my current thinking on climate change," you would not refer to this as her blind spot but as a point on which the two of you disagree.

If not, then yup - she's holding up a mirror to one of your own blind spots.
posted by bunderful at 12:46 PM on December 13, 2011


I try to be very mindful of a certain squirrely, squirmy feeling that I get when I'm being confronted with information that seems to contradict a firmly-held belief that I have. It means that the firmly-held belief may not be on as strong a foundation as I'd like to think.

The squirrely feeling is often followed by me getting defensive and going on the attack against the person / ideas that are challenging my belief.

I try to rechannel the squirrely feeling into a signal that I should take a beat to relax, and say, "Tell me more. Why do you believe that? What evidence is in favor of that? How can I learn more?"

I also have become wary of the feeling of relief when I encounter the "answer" that "disproves" a challenging idea. It is a signal that I should look carefully at this evidence, and make sure it is really true and relevant, instead of just conveniently confirming my bias.

(This is all very touchy-feely, but as a scientist, I use this same bullshit-meter for scientific ideas, philosophical ideas, and political ideas.)
posted by BrashTech at 12:57 PM on December 13, 2011 [1 favorite]


I read once that it is easier to read information that you disagree with than to listen to it or to watch it. It's something to do with how reading is a learned skill rather than an instinctive one. And since radio and television are also a type of technology, but are both types of technology that rely on old, deep-seated skills like listening to someone's voice and reading someone's facial expressions, apparently those mediums arouse stronger reactions in most people. Anecdotally, I've found this to be true. I can't watch Fox News for more than a minute without changing the channel, but I can finish an entire article by Bill O'Reilly even if I am arguing with him in my head the entire time. This realization has helped me widen my perspective quite a bit.

Also, since it's very hard to figure out what you don't know, I think it's a better strategy to start consolidating and clarifying what you do know, because if you pursue a line of inquiry deeply enough you'll start to uncover some of your unseen assumptions. So pick a subject that you care about and figure out if you can explain your major opinions on that topic. How far back does your chain of reasoning go? Are you drawing your own conclusions or relying on someone else's? Can you start getting your hands on primary sources rather than secondary ones?

Finally, I think it's really helpful to try and get some kind of international perspective on the world. You don't even have to speak another language to do this. I have some British ancestors, so I like reading British newspapers, but there are a lot of English-language newspapers out there from all different parts of the world.
posted by colfax at 12:58 PM on December 13, 2011 [1 favorite]


this sounds kind of weird, but zen flesh zen bones and zen mind beginners mind as jumping-off points into reading zen thought helped me identify certain thought patterns that were unproductive or obfuscatory and set me on the path to remove them.

that being said, i have a friend a lot like the one that you are talking about who is very intelligent and very very strident about their views despite seemingly not taking into account data that might change their mind about stuff. i am frustrated with this pal sometimes because they are really smart yet hold what to my mind seem like weird opinions. in the interest of not being a dick i don't often engage with her about her opinions if i disagree unless i have some data i can toss off at her regarding the issue at hand. in a way this has caused me to both become more well-informed regarding issues i truly care about and much more willing to not hold a big-ass opinion about things i don't feel truly informed about, which i think has been good for me as a person.

hope that helps?
posted by beefetish at 1:04 PM on December 13, 2011


One more link: a thought-provoking essay by Paul Graham about how social acceptability can be the enemy of intellectual honesty. Excerpt:
Let's start with a test: Do you have any opinions that you would be reluctant to express in front of a group of your peers?

If the answer is no, you might want to stop and think about that. If everything you believe is something you're supposed to believe, could that possibly be a coincidence? Odds are it isn't. Odds are you just think whatever you're told.

The other alternative would be that you independently considered every question and came up with the exact same answers that are now considered acceptable. That seems unlikely, because you'd also have to make the same mistakes. Mapmakers deliberately put slight mistakes in their maps so they can tell when someone copies them. If another map has the same mistake, that's very convincing evidence.
posted by John Cohen at 1:25 PM on December 13, 2011 [5 favorites]


I like to look for what I call the diagonal opinion. For example, if I think MetaFilter is entertaining and a waste of time, I'd posit that it's not entertaining and not a waste of time and look into that.
posted by michaelh at 1:56 PM on December 13, 2011 [1 favorite]


Question your assumptions. Don't conflate what an authority tells you with truth. Maintain humility.
posted by benbenson at 2:21 PM on December 13, 2011


I think one valid approach is to simply not be certain about most things.

Works for me!

So far.
posted by jsturgill at 2:25 PM on December 13, 2011 [1 favorite]


It's something that I try to watch myself for, but I can't shake the idea that I'm probably doing the same thing as my friend.

Eh, I don't know that you can. You got a monkey brain, I got a monkey brain. It's a pretty amazing monkey brain. But it has limits. Watch for squirrelly feelings, know you can be wrong, give people credit for having sincere beliefs. That's about as much as you can do.
posted by Diablevert at 2:55 PM on December 13, 2011 [1 favorite]


Make friends with people who are smart and nice, but who disagree with you about a lot of things. When you find some people who you get along with and whose intelligence you can really respect, but who hold different opinions from yours, it really makes you start to question things you take for granted. Plus, you can have awesome intellectual debates.

(If you think it's impossible for someone to be smart and nice, but disagree with you about a lot of things, because people who disagree with you about important stuff are stupid and mean, that is, in itself, a cognitive bias you should work to correct.)
posted by decathecting at 2:58 PM on December 13, 2011 [2 favorites]


A Guide to Critical Thinking could help you with that.
posted by leigh1 at 3:15 PM on December 13, 2011


Here's a good, well-sourced piece in Cracked

That right there is the nub of the thing. Who would ever have thought such a thing was possible?

Quisp Lover has it exactly right. There is not now and never will be any way for a human being to live without blind spots, simply because we are so tiny compared to all there is to know.

Working assiduously to attend to your own blind spots as you notice them is certainly a good and worthwhile thing to do, and will probably make your life richer and more interesting than it would have been otherwise. But if you fall into the trap of thinking of people who do that as superior to people who generally don't, that's a whole big blind spot of its own. In fact everybody is an expert on their own experience, and everybody is somebody you can learn something from.

I think the best that can be done is to follow the old advice about keeping an open mind, but not so open your brains fall out. It's a balancing act. You will stumble frequently. That's OK.
posted by flabdablet at 7:12 PM on December 13, 2011


Freeman Dyson has an essay in the current NYRB, 'How to Dispel Your Illusions' (a review of Daniel Kahneman's Thinking, Fast and Slow), which you might find relevant.
posted by verstegan at 2:44 AM on December 14, 2011


I haven't read Attacking Faulty Reasoning, but it looks good (if you can find it at a reasonable price).

I've used this text in courses, and can attest that it is good for a course in symbolic logic, but may not be ideal for informal study.
posted by reverend cuttle at 6:35 AM on December 14, 2011

