"Rules of Thumb" and "Instincts" of the Human Mind
July 21, 2009 11:34 AM

What "rules of thumb" or "instincts" of the human mind are commonly applicable?

A few months ago, I was reading Influence: The Psychology of Persuasion by Robert Cialdini. Cialdini seems to suggest that we have hard-wired instinctual responses to many types of situations. For example, he cites a religious organisation giving out flowers in the hopes of triggering a reciprocity response, where the receiver feels like they need to give something back. The message seems to be "if {someone gives you an item} then {you feel a social obligation to give something back, even if you don't know them}."

I am curious about two closely related questions:
  1. What instinctual rules, in as close to an "if A then B" form as possible, govern human-human interactions?
  2. What rules, again in as close to "if A then B" form, govern our own minds?
I am curious about pretty much any instinctual interaction that might occur in (my) daily life, from negotiation to introspection.

I have found that popular non-fiction covers these themes, but it often takes a really long time to get to the point and sometimes does not explicitly cite scientific results. (Nonetheless, book recommendations would be OK.) Arguably, books like How to Win Friends and Influence People, The Game, Getting to Yes, and Never Eat Alone all fall into the mould of "if A then B" human-human interactions (though I don't have any particular emphasis on the self-help genre). In terms of our own minds, I suppose I mean things like Dan Gilbert's work, where he says things like "if {you are prevented from selecting among several alternatives} then {you will be happier than if you were given a choice}." Some rules might be easier stated in a direct form rather than "if A then B"; for example, Gilbert suggests that we vastly overestimate the effect of any particular event on our personal happiness.

I would also be curious about particular psychological experiments along these lines, but I still only want the brief takeaway point (possibly with some background/setup). For example, the rough takeaway of the Milgram experiment was that "if {an authority figure tells someone to do something} then {many people will comply with almost anything, up to and including administering what they believe are lethal electric shocks}."
posted by pbh to Science & Nature (23 answers total) 17 users marked this as a favorite
 
The book Flow could be boiled down into one statement. I don't know if I'm including all the elements, but my off-hand recollection of the book's message is: you will be happiest if you do an activity that poses appropriately increasing challenges, allowing you to continually engage with the activity and apply your increasing level of skill to effectively tackle the challenges.
posted by Jaltcoh at 11:52 AM on July 21, 2009 [4 favorites]


Best answer: It turns out that we're wired to commit the post hoc fallacy. The heuristic as bred into us is "correlation often implies causation" -- and that is, in fact, true often enough to make it worth following.

It means you can learn to manipulate the world to your own benefit without necessarily understanding why what you do consistently leads to the desirable result you seek.

Of course, it can mislead you a lot, too, and lead to useless activities. But if you don't have science, it's better than nothing.

Most early human engineering was developed this way. For example, no one knew why it was that if you heated up certain kinds of pretty blue and green rocks on a fire, you got metallic copper afterwards. But you did, and copper was not only nice to look at, but it could be turned into tools and weapons. No one knew why firing a clay pot made it hard and stable, but it did, and pottery was very useful.
posted by Chocolate Pickle at 12:09 PM on July 21, 2009


Actually, from the perspective of a historian, I can't imagine simple "if A then B" scenarios for anything but the most basic functions: if sleepy then sleep, if horny then masturbate. Even in those seemingly simple cases, there are all sorts of factors that could inhibit the A-to-B flow: religious beliefs, in/appropriate venue, and so forth.

There is an incalculable number of variables that make up an individual's reaction to a need or desire, and individuals do not always act rationally. I don't believe you'd find anything but a pop-sciencey, Freakonomics-style answer to this question.

It'll be interesting to see what others have to say.
posted by vincele at 12:12 PM on July 21, 2009 [1 favorite]


Well, there's this one that's pretty deep: if a child is screaming in terror, run towards that child. But the value of that one is pretty obvious.
posted by Chocolate Pickle at 12:20 PM on July 21, 2009


The reciprocity theory that you've come up with was developed in detail by the social scientist Marcel Mauss in the early 20th century. The book is called The Gift (and it is actually probably a bit easier to understand than the Wiki entry on it), and it is one of the few texts of its type that still has an active influence on social theory today. I highly recommend it.

However, despite the feeling we can get from a book like The Gift that there is an *instinct* involved, cross-cultural research does not support the notion that we have "hardwired instinctual responses" that translate directly into social customs. In very broad ways, we have many common responses to our needs and to social organization that make different social customs recognizable to people who do not practice them themselves. But, when you start to investigate the details, you tend to find that there are a lot of distinctions. You end up getting statements that are so overqualified that they begin to be ridiculous. I went to a talk where a person said that every society gossips. Gossip was defined as talking about people, animals, weather, and events that were familiar to both/all speakers. So really, all the person has managed to establish is that people everywhere talk about mutually known things. Given what language is, that doesn't seem much more insightful than saying that all humans sleep when sleepy enough.

There are correlations between certain things, but these are loose and, as I've said, broad. So societies that are horticultural have more organizational commonalities with other horticultural societies and fewer with hunter-gatherer, irrigated-agricultural, or industrial societies. But that isn't a hard and fast rule: exceptions, and differences/similarities where you don't expect them, can almost always be found.

If you're interested in delving into some cross-cultural research and looking at "what makes us the same and what makes us different"*, then I would recommend taking an anthropology course. If that's not possible, but you'd really like some more detail, I'll look into some solid recommendations for you. Let me know if you want that (I really shouldn't spend too much time on that since the thesis is calling.... but I'll happily do it if you're really interested).

*this is how a prof of mine once defined the central questions of anthropology
posted by carmen at 12:40 PM on July 21, 2009 [1 favorite]


Adding to Chocolate Pickle's point that we're wired to do what works: as social creatures, we are hard-wired to recognize our fellow minds in the world, and this also results in the anthropomorphizing of, well, everything.
In some cases, like your dog, this is not a huge stretch. But it goes much further. Having a bad day? The universe has it in for you! (We are so wired to see minds that we easily make the jump to thinking of the world itself as something with a mind -- and a deliberately malicious one!) By default, pantheons of gods lie behind every facet of the world that we don't understand. Life must have a purpose because the actions of minds have purposes. Nature (itself an anthropomorphism) seems likewise to have a purpose. Even less character-based concepts like karma still suggest some kind of external awareness.

We are wired to see intention everywhere.
The world is causal, and the causes are minds.
posted by -harlequin- at 12:44 PM on July 21, 2009


Response by poster: vincele: "Actually, from the perspective of a historian, I can't imagine simple "if A then B" scenarios for anything but the most basic functions: if sleepy then sleep, if horny then masturbate. Even in those seemingly simple cases, there are all sorts of factors that could inhibit the A-to-B flow: religious beliefs, in/appropriate venue, and so forth."

This is a totally reasonable concern that I think you've phrased more clearly than I would have. I should clarify.

I went with "instinct" and "rule of thumb" because I couldn't quite figure out how to phrase this question, but I think the Jaltcoh and Chocolate Pickle responses above are more or less what I'm looking for. Ultimately, I think you're right that these things are both situated and personal, so I guess if it's relevant I'm most interested in rules of thumb/instincts for "Western" culture/development, and I would be happy with things that just apply more often than expected or in the majority of cases rather than 100% of the time. Also, I suspect that many of these rules are more likely to apply to strangers rather than people who know one another well, which might remove a little bit of the psychological back and forth.

"There is an incalculable number of variables that make up an individual's reaction to a need or desire, and individuals do not always act rationally."

I think I am actually most curious about situations where these sorts of instincts or societal norms lead to irrational behaviour. (As long as the instincts lead to a consistent irrational behaviour, or consistently irrational behaviour.) For example, why do people not play the Centipede Game rationally, or at least in the way game theory would suggest? How should one modify an agreement or negotiation to account not just for the rational self-interest of the actors, but also for their emotions or instincts? I imagine (perhaps wrongly) that if I had a general catalogue of rules for how people act irrationally, it would be easier for me personally to act rationally.
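(For what it's worth, here is a minimal sketch -- my own toy illustration, not something from the thread's sources -- of what "rational" play means in the Centipede Game under backward induction. The payoff rules are assumptions chosen only to make the arithmetic concrete: the pot starts at 2 and doubles each round, the mover keeps 80% if they take, and a pass at the final node hands the large share to the other player.)

```python
def backward_induction(rounds=6, start_pot=2.0, take_share=0.8):
    # cont = (payoff to whoever moves at the next node, payoff to the other
    # player) if play reaches that node and both keep playing optimally.
    final_pot = start_pot * 2 ** rounds
    # Assumed ending: if the last mover passes, the pot doubles once more
    # and the *other* player walks away with the large share.
    cont = (take_share * final_pot, (1 - take_share) * final_pot)
    plan = {}
    for r in range(rounds, 0, -1):            # walk the game tree backwards
        pot = start_pot * 2 ** (r - 1)
        take_now = take_share * pot
        pass_value = cont[1]                  # roles swap after a pass
        if take_now >= pass_value:
            plan[r] = "take"
            cont = (take_now, (1 - take_share) * pot)
        else:
            plan[r] = "pass"
            cont = (cont[1], cont[0])
    return plan

print(backward_induction())
# {6: 'take', 5: 'take', ..., 1: 'take'} -- the game-theoretic prediction is
# an immediate "take", which is exactly what real players rarely do.
```

The point of the sketch is just that "rational" here has a precise meaning (take at the very first move), so the gap between that prediction and observed play is the kind of consistent irrationality I'm after.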
posted by pbh at 12:48 PM on July 21, 2009


Response by poster: carmen: "In very broad ways, we have many common responses to our needs and to social organization that make different social customs recognizable to people who do not practice them themselves. But, when you start to investigate the details, you tend to find that there are a lot of distinctions. You end up getting statements that are so overqualified that they begin to be ridiculous."

I stated this a little in my response to vincele, but I think that what I'm looking for is a lay person's human-mind-cookbook -- an escape from all of this over-generalisation and nuance that seems to happen in the academic literature. (...and an escape from the lengthy and often un-sourced popular non-fiction literature.) In the same way that people have books about how to cook in a particular style, and then there are cookbooks which just have recipes, I just want the recipes! (Possibly in the Western, US-centric style.) I am (in this question anyway) not looking to understand humanity so much as construct a list of shorthand rules to which I can add nuances later.
posted by pbh at 1:04 PM on July 21, 2009


Best answer: "Predictably Irrational" is the book you're looking for. It's a whole book on this topic.

Also, you might enjoy "Games People Play" by Eric Berne. In my opinion, Berne's model is a toy psychology. It models something person-like, but simpler. Still, it gives you a fun framework to use when thinking about human interactions.
posted by grumblebee at 1:41 PM on July 21, 2009 [4 favorites]


I don't claim to know the psychology behind this, but someone mentioned reciprocity above -- and I've always found it quite touching that some variation on the Law of Reciprocity (i.e., "treat others the way you'd want to be treated yourself") seems to be a universal philosophy in all but maybe two of the world's religions.

(I define "religions" quite loosely here; there are similar statements in various secular creeds as well. The only two such philosophies that I'm aware of that do NOT value any kind of law of reciprocity are Satanism, and the World Church of the Creator, which is a very small new faith that takes various right-wing militia teachings and makes a religion out of them.)
posted by EmpressCallipygos at 1:52 PM on July 21, 2009


Here are some anecdotal bits of social-engineering / observations from my life:

- People in charge feel as if they're not doing their job unless they make decisions.

As a worker, this means that if I just hand my boss or client a project that I consider finished, he will ask me to change something about it. The requested change is often arbitrary. ("In my considered opinion, the background should be a lighter shade of pink!") This is what leads me to believe that the manager cares less about the specifics of his critique than about the fact that he's making a critique at all.

But his random change can really screw me up, especially if it's a change that's hard to make. So I've learned to always present (at least) three choices. "Here's my design. I can't decide between the red background, the brown one, or the green one." Secretly, I like all the backgrounds. When I present my work this way, the manager almost always confines his "choice" to the options I give him, and we both walk away happy.

I've also used this technique with overly critical people. I KNOW George is going to criticize my party, so I ask him what he thought of the floral arrangement. As it happens, I don't really care about it, so that allows him to go on a rant without my feelings getting hurt.

- People are confused by multiple levels of nesting. In nature, we often find visible objects inside other visible objects (e.g. a bird in a nest), but we rarely find visible objects inside visible objects inside visible objects... When I teach basic computer classes, I find many people get confused by complex paths to files or loops within loops within loops.

- It's REALLY, REALLY, REALLY hard for most people to apologize. And it's also hard for many people to accept an apology.

- Most people flock towards other people. I notice this because I don't (Asperger's...). If there are two exits out of a train station, one of them tends to have way more pedestrian traffic than the other. I always wonder why hardly anyone is going out the (say) left door. And, shaking my head, I exit through it, wondering why everyone else wants to put up with a crowd. My guess is that, in general, people will flock to where other people are heading.

- People trust their memories. This is a simple one, but it has profound implications, especially when you read about how faulty human memory is. I think most people have read about how unreliable eyewitness testimony is, but they think of memory problems as things that affect OTHER people.

I have come to accept that my memory is flawed. This weakens me in arguments. In the past, I'd have arguments like this:

Other person: I know you drank all the wine, because I remember you doing it!

Me: Yeah, well, I remember leaving some for you!

Now, I wimp out like this (but I think my wimping out is based in reality)...

Me: I have a really strong memory of leaving some for you. I could swear I did, but of course, I may be wrong.
posted by grumblebee at 2:01 PM on July 21, 2009 [2 favorites]


I found this article interesting.

It claims there are certain universal (i.e., cross-cultural, pan-historical) human values. For instance, we all value fairness.

However, some people value "purity" more than others. This value doesn't manifest itself the same way in every person who has it. For some, "purity" means religious purity. For others, it means law-abiding. But the people who have a strong leaning towards it tend to vote Republican.

People who value fairness above purity tend to vote Democrat. It's not that Republicans don't value fairness. They do. They just value purity equally or more than fairness.

These values are probably hard-wired. So if you value purity, it's a profound feeling for you.

This is why -- or one of the reasons why -- Democrats and Republicans tend to view each other as if the other side is a space alien. It's pretty much impossible for the two sides to understand or relate to each other.

(During the Monica Lewinsky thing, I had a friend who absolutely came to LOATHE Clinton. She couldn't understand why anyone would tolerate a guy who cheated on his wife being in charge of the country. I absolutely couldn't fathom why this was such a big deal to her.)

I think it also explains (not excuses) some of the hypocrisy we sometimes see in Conservatives. Because "the flesh is weak," a person might "sin." However, he might still have a really profound feeling that sinning is bad.

I think I have the purity "gene." For me, it manifests itself as an obsession with aesthetic purity. (In other words, I get incensed by errors and clunkiness in art.)

Growing up in a Liberal community, I lucked out. I was able to have a "conservative" value without becoming a social pariah. But my strong urge towards purity makes me sometimes feel for Republicans. I seem to be able to get in their shoes better than some of my fellow liberal friends.
posted by grumblebee at 2:14 PM on July 21, 2009


Best answer: I can't call up any specifics right now, but Edward O. Wilson's On Human Nature is full of what you're looking for.
posted by newmoistness at 2:19 PM on July 21, 2009


This is pretty close to what I study. I'm curious as to whether or not there is some sort of moral "deep structure," but most social or interpersonal interactions don't seem to be hard-wired. They're enculturated. "Instincts" and "rules of thumb" are necessarily contradictory: "rules of thumb" are generated by an internal statistical calculator, which requires input in the form of experience.
posted by solipsophistocracy at 2:32 PM on July 21, 2009


You might find some pointers to the kinds of thing you're looking for by skimming a Psychology 101 textbook or two.

My favorite example is the Asch conformity studies, which showed that if you're in a group and everybody else makes a statement ("line A is shorter than line B"), most people will go along with the group, even when the statement is clearly objectively wrong.
posted by kristi at 6:47 PM on July 21, 2009


Best answer: Behavioral economics is the field you're looking for. Another way to describe it is the psychology of decision making. Nobel laureate Daniel Kahneman is probably the most famous and best thinker in this field. Some people think he's the greatest living psychologist.

A few theories from the field:

1. Losing something hurts a lot more than gaining something feels good. Sometimes ridiculously more. Here's an example from Ariely's book, lifted from Wikipedia: Duke University has a very small basketball stadium and the number of available tickets is much smaller than the number of people who want them, so the university has developed a complicated selection process for these tickets that is now a tradition. Roughly one week before a game, fans begin pitching tents in the grass in front of the stadium. At random intervals a university official sounds an air horn, which requires that the fans check in with the basketball authority. Anyone who doesn't check in within five minutes is cut from the waiting list. At certain more important games, even those who remain on the list until the bitter end aren't guaranteed a ticket, only an entry in a raffle in which they may or may not receive a ticket. After a Final Four game, Carmon and Ariely called all the students on the list who had been in the raffle. Posing as ticket scalpers, they probed those who had not won a ticket for the highest amount they would pay to buy one and received an average answer of $170. When they probed the students who had won a ticket for the lowest amount they would sell for, they received an average of about $2,400.

2. Everything is relative. If someone asks you to add 300 to the last four digits of your phone number and to write that new number down, and then asks you when you think Attila the Hun died, your guess will be pulled toward that arbitrary number -- estimates can differ by centuries depending on the phone-number calculation you did before.

3. People are really bad at probability. If you are given a description of a hypothetical person like this: "Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations." -- you will think she is more likely to be a feminist bank teller than just a bank teller, even though a conjunction can never be more probable than either of its parts (see the sketch after this list).

4. Your unconscious is much more powerful than you think. If you are asked to rearrange some scrambled sentences that contain words like "wrinkled", "Florida", "gray", etc., you will walk out of the room and down the hall more slowly than you came in. You will have been primed into a state of acting old. Another example: Asian-American women primed to think about their race will do better on a math test than Asian-American women primed to think about their gender.

Some more researchers to check out: Richard Thaler, Paul Slovic, Gary Marcus, Amos Tversky.
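(On #3 above, here is a tiny sketch -- the numbers are entirely made up for illustration -- of the conjunction rule that keeps "feminist bank teller" from ever beating plain "bank teller": P(A and B) cannot exceed P(A).)

```python
# Hypothetical numbers, chosen only to illustrate the conjunction rule.
p_bank_teller = 0.02                  # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.95        # even if nearly every such teller is a feminist
p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

# The conjunction is the bank-teller probability scaled by a factor <= 1,
# so it can never come out larger than the bank-teller probability itself.
print(round(p_feminist_bank_teller, 3), "<=", p_bank_teller)   # 0.019 <= 0.02
```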
posted by AceRock at 6:57 PM on July 21, 2009 [4 favorites]


Another one. Your "rational" or "deliberate" brain is pretty weak. Suppose someone gives you a seven-digit number to memorize and tell to someone else at the end of a hallway (psychology experiments often take place in hallways), and along the way you are interrupted and offered a choice of snacks: a piece of fruit or a piece of chocolate cake. You are much more likely to take the chocolate cake than someone who was only given a three-digit number to memorize.

This is because your deliberate system, which is responsible for good, responsible, rational decision-making as well as short-term memory, is occupied by having to memorize a longer number, and so it is less able to override your "automatic" or unconscious system that wants you to have the cake.
posted by AceRock at 7:06 PM on July 21, 2009 [1 favorite]


"you will be happiest if you do an activity that poses appropriately increasing challenges, allowing you to continually engage with the activity and apply your increasing level of skill to effectively tackle the challenges."

Basically, try to find activities in life that are like good video games.
posted by AceRock at 10:45 PM on July 21, 2009


Best answer: "I think I am actually most curious about situations where these sorts of instincts or societal norms lead to irrational behaviour."

http://en.wikipedia.org/wiki/List_of_cognitive_biases
posted by martinrebas at 1:43 AM on July 22, 2009


Best answer: I'll second grumblebee's suggestion of "Predictably Irrational". There's also "Irrationality". These books are both fairly comprehensive treatments of the predictable ways in which we tend to make irrational decisions: fearing losses more than we value gains, being better at noticing and remembering evidence that confirms our biases, the characteristic mistakes everyone tends to make when thinking about probabilities, and so on. They're very illuminating and give advice on how to avoid falling into those mental traps.

I haven't finished "Predictably Irrational" yet. My impression so far is that "Irrationality" is somewhat more thorough and provides more references to original research papers, but I prefer the writing style in "Predictably Irrational".
posted by metaBugs at 2:47 AM on July 22, 2009 [1 favorite]


Best answer: Dan Ariely, the author of "Predictably Irrational," has a couple of good talks on TED.org. He also has a blog.
posted by grumblebee at 6:23 AM on July 22, 2009 [1 favorite]


I think the cognitive biases are probably the best example. One difficulty in this exercise is that they will be hard to use predictively -- you never know which instinct will be the overriding factor in an individual's decisions.
posted by gjc at 6:42 AM on July 22, 2009


There's a bit of irrational behavior that may be specific to me, but I doubt it: if I make a promise to keep a secret, I don't feel I am bound by that promise when it comes to sharing it with someone really close to me.

For instance, if Fred confides in me that he's having an affair and asks me not to tell anyone, I may assure him that I won't and then tell my wife.

When I confess this here, I feel ashamed of myself, but while I'm living my life, I don't feel as if I'm doing anything wrong.

Two values tend to collide: my belief that keeping someone's secret is honorable and my belief in sharing intimate things with those who are closest to me. Though I don't tell Fred that I'm going to spill his beans to my wife, I don't feel like I'm being dishonest, because I feel like my wife and I are sort of one entity -- a team -- and that Fred should know that. I realize that's a rationalization. I'm not excusing my behavior; I'm just explaining it.

What if my wife feels the same way? I say to her, "I'm going to tell you a secret about Fred, but I promised I wouldn't tell anyone, so please don't share it." She agrees, but later tells her best friend, Jane, about Fred's dalliance. She doesn't feel like she's violating my trust, because she's known Jane since they were both children. She and Jane are like sisters. And my wife assumes that I know that, and that I don't mean "don't tell Jane" unless I specifically say so.

Jane is really close with her mother, Betty, and so she tells her mom Fred's secret. As it turns out, Betty is friends with Fred's wife's aunt....

One day, Fred comes home and, to his dismay, finds all his possessions on the lawn.

It's always funny to me when people have affairs and think there's no way their spouses will ever find out. (Maybe this is another form of irrational behavior -- something about people having a hard time imagining complex causal chains.) If I have an affair, I am likely to confess it to at least one person and the person I am having an affair with is likewise likely to do so. So that adds two weak links to the chain. That plus the fact that I may be married for the next 40 years ups the chance that at some point, maybe fifteen years from now, the chain will break.

People also seem to have a hard time understanding that low odds, when repeated over and over, are likely to eventually pay off.
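(A quick back-of-the-envelope sketch of that last point; the per-month leak probability is a number I made up purely for illustration.)

```python
# Illustrative only: a small chance of a secret leaking in any given month,
# compounded over a long marriage, adds up to near-certainty.
p_leak_per_month = 0.01          # assumed 1% chance of a leak each month
months = 15 * 12                 # fifteen years of independent monthly chances
p_ever_leaks = 1 - (1 - p_leak_per_month) ** months
print(round(p_ever_leaks, 2))    # ~0.84 -- low odds, repeated, eventually pay off
```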
posted by grumblebee at 6:48 AM on July 22, 2009 [1 favorite]

