Mental Action
July 9, 2006 3:42 PM Subscribe
Can we believe at will in the same way that we can raise our arms at will?
I can raise my right arm pretty much whenever I want to, simply by trying to make it go up. I can do it without trying to do anything else as a means to that end. For instance, I do not try to act upon my brain in a certain way, nor do I know which muscles I'd have to flex. I just want it to go up, and, somehow, I execute the action.
Are there any cases where a person can acquire a belief in the same way?
By "belief" I shall mean a high probability of being true that one ascribes to a proposition. E.g., I take there to be a 97% chance that it will rain in Argentina in the next three days.
Some things that might help in thinking about the question:
1. Could I go from 97% to 96% at will? In this case I'd be modifying what I take to be the likelihood by only one percentage point. If the modification is small, should this make the task easier? (See the sketch just after this list.)
2. Be careful to distinguish between really taking there to be a so-and-so likelihood versus merely supposing so for the sake of argument. We can merely suppose anything; really believing is a different matter.
3. If believing at will is possible, should we expect that smart people are more likely to be able to do it than dumb people? If so, this suggests that believing at will is a kind of talent. Or would dumb people be more likely to be able to pull it off? If so, this suggests that believing at will, though it may be an ability, is not a talent, and is more likely a result of intellectual vice: stupidity or irrationality of some kind.
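The definition above is compact enough to restate in code. Here is a minimal sketch of that credence picture -- belief as a probability ascribed to a proposition -- with every name invented for illustration; it encodes the 97%-to-96% move from point 1, and the open question is whether any mental act amounts to performing the update directly.

    # A credence, per the question's definition: a probability one
    # ascribes to a proposition. All names here are illustrative.
    credence = {"rain in Argentina within three days": 0.97}

    def nudge(proposition, delta):
        """Shift a credence by delta, clamped to [0, 1]."""
        p = credence[proposition] + delta
        credence[proposition] = min(1.0, max(0.0, p))

    # Point 1's case: a one-percentage-point move, 97% -> 96%.
    nudge("rain in Argentina within three days", -0.01)

    # The question is whether any mental act corresponds to calling
    # nudge() directly, the way trying raises an arm directly.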
Belief is not a single stimulus event, like raising your arm. Beliefs are complex information-based patterns configured into a long-term substrate of neurons.
Because of this, they are not "movable" in the sense that your arm is movable. They are like photographs, or software. They are a particular arrangement of things. It takes a lot of effort to "change" software because many small relations between things must change just so in order for the desired result to occur.
Same thing with beliefs. Beliefs are patterns, not things, so in order to change them, you must make many small changes to a complex system, and hope the final result is in fact the desired change in the overall system.
Thinking of beliefs as individual things will only lead you away from the truth.
posted by clord at 4:08 PM on July 9, 2006 [2 favorites]
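clord's software analogy can be made concrete. The toy below is my illustration, not clord's (the network and all numbers are invented): a "belief" is the output of many small weights, there is no single switch to set, and moving the output takes many tiny coordinated adjustments -- the "many small changes to a complex system" described above.

    import math
    import random

    random.seed(0)
    weights = [random.uniform(-1, 1) for _ in range(100)]
    inputs = [random.uniform(-1, 1) for _ in range(100)]

    def confidence():
        """The 'belief' emerges from the whole arrangement of weights."""
        s = sum(w * x for w, x in zip(weights, inputs))
        return 1 / (1 + math.exp(-s))

    # There is no weights["rain"] = 0.80. To move the belief, nudge many
    # parameters a little and hope the overall pattern lands where you
    # want it.
    target = 0.80
    for _ in range(1000):
        err = confidence() - target
        for i, x in enumerate(inputs):
            weights[i] -= 0.01 * err * x

    print(round(confidence(), 2))  # close to 0.80 after many tiny edits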
If you want to believe in something, chances are you'll be more willing to accept anything that supports that belief.
posted by popcassady at 4:08 PM on July 9, 2006
It depends on whether you believe that free will exists. Questions 1 and 2 aim more at justification than at belief. The real question is where the belief originates. Like the raised arm, a belief is a causally based process within the brain. Are our beliefs determined by a causal process? If they are, we wouldn't have free will regarding the percentage we believe; it would just be a fact within our brain. The percentage could differ, but not by choice -- rather as another reaction within the causal process. The IQ of the person forming the belief wouldn't really matter: if we are all causally determined beings, all our brains function the same way (mental defects being the exception). You could deny that our beliefs are determined, but that leads to other problems.
posted by vionnett at 4:09 PM on July 9, 2006
Are you saying that people can choose to believe things at will, and can then modify those choices at will? And that some people might be better than others at making choices and continuing to choose to believe in them?
If so, then my answer is "yes." Happens every hour of every day.
SMITH
Why, Mr. Anderson? Why? Why, why do you do it? Why, why get up? Why keep fighting? Do you believe you're fighting for something, for more than your survival? Can you tell me what it is? Do you even know? Is it freedom? Or truth? Perhaps, peace? Could it be for love? Illusions, Mr. Anderson. Vagaries of perception. Temporary constructs of a feeble human intellect trying desperately to justify an existence that is without meaning or purpose! And all of them as artificial as the Matrix itself. Although, only a human mind could invent something as insipid as love. You must be able to see it, Mr. Anderson. You must know it by now. You can't win! It's pointless to keep fighting! Why, Mr. Anderson, why? Why do you persist?
NEO
Because I choose to.
posted by frogan at 4:15 PM on July 9, 2006
Lots of people think that their cats love them. So it is possible.
posted by LarryC at 4:17 PM on July 9, 2006
Many people choose to love someone. It's a belief system like anything else. You choose to believe in this person. Even when they let you down, you may have a twinge of doubt but ultimately decide to go on believing in this person. You decide to believe, you decide to stay that course.
posted by vacapinta at 4:33 PM on July 9, 2006
I'm with criticalbill. People work the selection bias all the time, sure, but that's not nearly as simple as raising your arm.
posted by equalpants at 4:33 PM on July 9, 2006
Response by poster: I should clarify that I am inquiring about whether we can believe at will without acquiring any new EVIDENCE regarding the proposition in question.
Also, I have tried to ask the question in such a way that whether determinism is true is irrelevant. Even if determinism is true, there is such a thing as wanting to raise one's arm and proceeding to raise it without trying to do something else as a means to that end. My question is whether we can EVER do the same with regard to a belief: acquire one in response to a practical (as opposed to evidential) reason straightaway, without aiming at a means to that end.
Furthermore:
1. critical bill says: "Beliefs are complex information-based patterns configured into a long-term substrate of neurons... like photographs, or software. They are a particular arrangement of things..."
Reply: notice how I define "belief" in my question. I am not using the term in your way; I am using it in my way. I am talking about a different kind of phenomenon, and my question is about what I am thinking about.
2. Frogan: your existential dialogue seems to be about overt action.
posted by Eiwalker at 4:34 PM on July 9, 2006
clord: Same thing with beliefs. Beliefs are patterns, not things, so in order to change them, you must make many small changes to a complex system, and hope the final result is in fact the desired change in the overall system.
That sounds about right. I would further add that by the time one becomes adept at consciously doing this sort of thing, or at being aware of the subconscious processes that routinely make it happen, the concept of "belief" has changed quite a bit.
There is also the more indirect method of just putting yourself into situations where you know those unconscious reactions to stimuli will probably tend to encourage a particular belief, without it requiring any conscious understanding of the process by which that happens. Like going to church every day in the hopes you'll eventually believe whatever they're preaching.
posted by sfenders at 4:36 PM on July 9, 2006
Response by poster: I'm sorry! I attributed a quote to criticalbill that actually came from clord! Oops!
posted by Eiwalker at 4:37 PM on July 9, 2006
At issue is the definition of "belief". A hard core empiricist-skeptic will attempt to have no "beliefs" at all; rather she will just attempt to understand the information as it is available through the senses, but always refraining from drawing positivist conclusions. Many people will claim that the only beliefs they have are those that essentially they have learned from their experience in the world, which is to say, they could just choose to "believe" some random projection, but that would be a waste. They want to comprehend what actually is, they want the truth. And by most accounts, the truth is 'out there'. Beliefs are what you understand about what you perceive and experience.
Free will is theorized to allow you to control you, but not the world or what you perceive. The idea of a "belief" is an affirmation of your perceptions, or an interpretation of your perceptions. If you believe it's going to rain, presumably that is based on some evidence or experience; if it's not, then you're just asking if a person can project their fantasies and personal dreams, and ignore reality. Sure: we usually call it mental illness.
posted by mdn at 5:00 PM on July 9, 2006
notice how I define "belief" in my question.
You use it in a very odd way, actually. I don't think anyone's usual idea of belief distinguishes between a 96 and a 97% chance of rain. I can estimate the probability of rain without believing anything particular about it. Anything that can be numerically quantified like that is going to depend on a systematic and conscious process to get the result, so changing one's estimate is just a matter of revising the system or the input data. No great trouble, but somewhat specific to the particular thing you're estimating. In the case of rain you could construct moving averages of moving averages of weather data, repeatedly build climate models, and just generally try things until you get arbitrarily close to the number you want. Might take a while depending on how "really" you want to believe in the accuracy of the outcome...
But you already discarded the usual meaning of "belief" and then apparently realized you still need it for this to be an interesting question, and add it back in pretending it's not central: "really taking there to be a so-and-so likelihood", which is the belief that your estimate is numerically accurate, is the actual "belief" in question, not the probability that it will rain.
Now, one can have varying degrees of belief, but normally they are quite simplistic; often just "it will probably rain" is good enough, with the "probably" thrown in with no really exact meaning, just a generic qualifier among several that are habitually added. There can be a variety of these belief qualification standards that people develop, and the more intellectually sophisticated someone is, the more I'd say they're likely to have. So, you can see how it gets complicated.
And if you don't have that kind of thing going on at all, then I don't think you're talking about anything like "belief". It seems entirely possible to get by without them, after all. You'd just say (and think) "This method of estimating the probability of rain says there's a 97% chance. How interesting." No complicated concept of belief required.
posted by sfenders at 5:02 PM on July 9, 2006
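sfenders' point that you change such a number by revising the procedure, not by willing, can be caricatured in a few lines. The rain history and the "method" below are invented; the only point is that when a probability is the output of a procedure, you move the number by shopping among equally defensible procedures.

    # Invented rain history (1 = it rained); the method is deliberately
    # arbitrary, which is the point.
    rained = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]

    def estimate(window):
        """Probability of rain = frequency over the last `window` days."""
        recent = rained[-window:]
        return sum(recent) / len(recent)

    # Shop among equally defensible windows until the output lands near
    # the number you wanted to believe all along.
    wanted = 0.80
    best = min(range(1, len(rained) + 1),
               key=lambda w: abs(estimate(w) - wanted))
    print(best, estimate(best))  # a window of 5 gives exactly 0.8 here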
mdn : "A hard core empiricist-skeptic will attempt to have no 'beliefs' at all"
How would she come to this "stance"?
posted by Gyan at 5:33 PM on July 9, 2006
Yes, we can; although most people don't believe in it, and will look at you funny if you do it in public.
posted by flabdablet at 5:33 PM on July 9, 2006
"You can't always think what you want"
I like to keep up with various cognitive science blogs, and I found out that "mental control" is a research area. You might want to check it out. I followed links from BPS Research Digest: Don't suppress negative thoughts about yourself to a bibliography for mental control and ironic processes compiled by Daniel M. Wegner at the Mental Control Lab. Fun, huh? There's an online experiment page. I'm usually too shy to write up FPPs, but am tempted by this page. Hope it helps answer your question. White elephants.
posted by bleary at 5:44 PM on July 9, 2006
Waiting for an argument...
You're going to get one - over what the definition of a "belief" is - and it's not going to answer your question. Why don't you define it and spare us the trouble?
posted by ikkyu2 at 5:50 PM on July 9, 2006
Response by poster: sfenders makes three points that I'd like to respond to.
1. "I don't think anyone's usual idea of belief distinguishes between a 96 and a 97% chance of rain."
I agree. Let's change the question, then, to be whether one can go from, say, roughly 95% to roughly 80%. Surely we can think in terms of rough spreads.
2. "you already discarded the usual meaning of "belief" and then apparently realized you still need it for this to be an interesting question, and add it back in pretending it's not central: "really taking there to be a so-and-so likelihood", which is the belief that your estimate is numerically accurate, is the actual "belief" in question, not the probability that it will rain."
Clever, but wrong. You're thinking that by "really taking there to be a so-and-so likelihood" as opposed to merely supposing it for the sake of argument, I have set myself up for an infinite regress.
Thus if I believe that there is a so-and-so chance, then I must believe that I believe this, in which case I must believe that I believe that I believe this, in which case I must believe that I believe that I believe that I believe this, etc., ad infinitum.
But this problem does not arise, since my belief is not about myself: it's about the chance of rain. So, you're wrong in saying that "really taking there to be a so-and-so likelihood" is equivalent to "the belief that your estimate is numerically accurate." My belief is about the chance of rain, not about myself.
3. "It seems entirely possible to get by without [beliefs]... just say (and think) 'This method of estimating the probability of rain says there's a 97% chance. How interesting.'"
I think that some people really do sometimes think that some things are more likely than others: they do not always simply think, "Such and such method suggests P."
posted by Eiwalker at 6:06 PM on July 9, 2006
1. "I don't think anyone's usual idea of belief distinguishes between a 96 and a 97% chance of rain."
I agree. Let's change the question, then, to be whether one can go from, say, roughly 95% to roughly 80%. Surely we can think in terms of rough spreads.
2. "you already discarded the usual meaning of "belief" and then apparently realized you still need it for this to be an interesting question, and add it back in pretending it's not central: "really taking there to be a so-and-so likelihood", which is the belief that your estimate is numerically accurate, is the actual "belief" in question, not the probability that it will rain."
Clever, but wrong. You're thinking that by "really taking there to be a so-and-so likelihood" as opposed to merely supposing it for the sake of argument, I have set myself up for an infinite regress.
Thus if I believe that there is a so-and-so chance, then I must believe that I believe this, in which case I must believe that I believe that I believe this, in which case I must believe that I believe that I believe that I believe this, etc., ad infinitum.
But this problem does not arise, since my belief is not about myself: it's about the chance of rain. So, you're wrong in saying that "really taking there to be a so-and-so likelihood" is equivalent to "the belief that your estimate is numerically accurate." My belief is about the chance of rain, not about myself.
3. "It seems entirely possible to get by without [beliefs]... just say (and think) 'This method of estimating the probability of rain says there's a 97% chance. How interesting.'"
I think that some people really do sometimes think that some things are more likely than others: they do not always simply think, "Such and such method suggests P."
posted by Eiwalker at 6:06 PM on July 9, 2006
Response by poster: You're going to get one [an argument] - over what the definition of a "belief" is - and it's not going to answer your question. Why don't you define it and spare us the trouble?
Hopefully you'll find what you're looking for in my original explanation to my question.
posted by Eiwalker at 6:15 PM on July 9, 2006
As others have pointed out, that's an odd definition of belief. One way to approach the question is to do as Immanuel Kant did in his first critique (the Critique of Pure Reason): that is, to say that "I believe that X is true" is simply drawing attention to the transcendental ego involved in any proposition that "X is true." In other words, there's no difference between my affirming, "The moon is made of green cheese," and my saying, "I believe that the moon is made of green cheese." (I don't believe that, by the way.)
Another way to approach it is the scholastic distinction (which goes back to Plato) between knowledge and belief: you know something to be true when you can deduce it from universal principles, whereas you believe it to be true when you can't make such a deduction. According to this line, there's a subtle distinction between knowledge and true belief: a historian who has studied the evidence knows that Jack Ruby shot Lee Harvey Oswald, while a layperson who has read a popular book might simply have the true belief that Ruby shot Oswald. If it's knowledge, it's no longer a matter of belief.
The problem with this approach, as Richard Rorty and other pragmatists have pointed out, is that we have no direct line to the nature of the universe; we have to figure it out by agreed-upon methods. The Kantian approach has the virtue of putting an emphasis on affirmation. What seems to be significant about belief is that you claim that it is true.
There are different degrees of belief, of course, which seems to be what your question is getting at. But there's a world of difference between saying, as you did, that belief is a matter of the chance that you affirm a proposition, and saying that you believe that there is a high chance that a proposition is true. I can't fathom what it might mean to say that "there's a 70% chance that I believe it will rain tomorrow." But there's no problem with my saying, "I believe that there's a 70% chance it will rain tomorrow." The grounds for my belief are an entirely different matter.
In other words, I think your question would be better framed in terms of degrees of certainty, not probability of belief. Interestingly enough, modern probability mathematics involved a significant element of subjectivity (questions about what the odds are, given a certain state of evidence, that a proposition is true). But that's distinct from degrees of belief. But there's another measure: your willingness to act on your beliefs. That has a significant affective component.
Which leads me to a possible answer to your question: Pascal's Wager. In a nutshell, the seventeenth-century philosopher Blaise Pascal posed the question: is it reasonable to believe in God? He gave birth to game theory in his response: it's a good bet to believe in God. And, Pascal continued, if you go through the motions, eventually you will come to truly believe. Modern sociological studies of religious conversion, like Rodney Stark's research on the Unification Church, show that converts who are initially skeptical of a religion's claims come to believe them after they are socially integrated into the church/sect/cult/whatever. No new evidence has been presented; rather, social bonds have helped make formerly ridiculous claims seem plausible. For a more general treatment of the issue, you can take a look at Steven Shapin's book A Social History of Truth, as well as the classic by Peter Berger and Thomas Luckmann, The Social Construction of Reality.
posted by brianogilvie at 6:24 PM on July 9, 2006 [1 favorite]
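For readers who want the wager spelled out, here is a toy expected-value version of Pascal's argument. The payoffs and the 0.001 credence are stylized stand-ins; Pascal argued with an infinite reward, which makes the dominance starker still.

    # Stylized payoffs, not Pascal's actual numbers.
    def expected_value(p_god, if_god, if_not):
        return p_god * if_god + (1 - p_god) * if_not

    p = 0.001                                    # even a tiny credence
    ev_believe = expected_value(p, 10**9, -1)    # vast reward, small cost
    ev_disbelieve = expected_value(p, -10**9, 1)
    print(ev_believe > ev_disbelieve)            # True: belief dominates

    # Pascal's twist, the part relevant to this thread: since you cannot
    # simply will the belief, he prescribes going through the motions and
    # letting belief follow.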
P.S. Ian Hacking's book The Emergence of Probability is essential for understanding the dual objective-subjective nature of modern probability theory.
posted by brianogilvie at 6:26 PM on July 9, 2006
I think that most responders are correct in saying that your question hinges on the definition of belief, and if one defines belief as most understand it -- an internal conviction about a certain reality or truth of the world -- then no, one can't really change the belief at will, as one would just be lying to oneself, or asking oneself to believe a contradiction or impossibility. As you put it yourself in point 2: "We can merely suppose anything; really believing is a different matter."
It's like asking if one can willfully believe that 2+2 = 5.
One can be open to the possibility that there may be conditions where 2 + 2 does equal 5, but until one hears this additional evidence, it is a contradiction our mind automatically rejects.
Raising one's arm at will holds no such contradictions.
However, everyone has plenty of uncertainty about the world, and so in the real world, it is not at all uncommon to make willful choices about belief.
Anytime you have faced a decision, you have ultimately made a willful decision about what to believe. Should I marry this girl? Should I go to this college?
There are reasons to answer affirmatively or negatively to those questions, and the reasons may be quite balanced. By answering yes, or no, you are willfully choosing a belief; that this girl is the right one for you, or that that college is the wrong one for you.
The more strongly you believe something, the more you are convinced by the truth of a proposition, the less easily you will be able to amend the belief.
In this sense, to compare it to raising one's arm, the degree of conviction in your belief is like a weight that holds the belief in place, preventing it from being willfully altered.
The lighter the weight, the more unsure you are of a belief, the easier it is to alter the belief.
As I'm quite unconvinced of everything I just wrote, I think it would be quite easy to will myself to believe that everything I just wrote is all bullshit.
There, I just did it.
The answer to your question is yes.
posted by extrabox at 6:52 PM on July 9, 2006
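extrabox's weight metaphor has a standard formal cousin in Bayesian updating, sketched below (my gloss, not extrabox's math): the same piece of evidence barely moves a confident credence but swings an unsure one.

    def update(prior, likelihood_ratio):
        """Posterior odds = prior odds * likelihood ratio."""
        odds = prior / (1 - prior) * likelihood_ratio
        return odds / (1 + odds)

    lr = 0.5  # one piece of mildly disconfirming evidence
    print(round(update(0.95, lr), 3))  # 0.95 -> 0.905: barely budges
    print(round(update(0.55, lr), 3))  # 0.55 -> 0.379: swings freely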
Response by poster: Some people are mentioning Pascalian measures for acquiring a belief, such as going to church on a regular basis in the hopes that Christianity will somehow rub off on one. I agree that that kind of thing is possible, and that some people do it. But, again, I'm wondering whether a person can acquire a belief--any belief--without trying to do something else as a way of obtaining it.
posted by Eiwalker at 6:57 PM on July 9, 2006
Let's change the question, then, to be whether one can go from, say, roughly 95% to roughly 80%.
Okay, but still you were using an implicit all-or-nothing concept of belief in saying one either "really" takes something to be true or just supposes it as a possibility. If there were a simple linear scale on which certainty of belief could be measured, the difference between the two would be a matter of degree. That entertaining a possibility feels like a qualitatively different thing than really believing it is one indication that things are not quite so simple. There are various kinds of doubt one might have about the truth of any proposition, and each can have some kind of relative degree of strength which may be difficult to comprehend. I see no reason to think that it's possible to distill them all down to one number that means anything.
posted by sfenders at 7:06 PM on July 9, 2006
Best answer: I don't know how the brain works, but suppose that an increase or decrease in the degree of one's confidence in a proposition can be caused by a mere increase or decrease in intensity of electric current from a certain neuron (no pun intended) to certain other neurons.
The amount of electric current could be analogous to the degree of muscle flexing required in an arm-raising.
Perhaps someone could increase or decrease the electric current without specifically having one's mind on doing so (or thinking about the brain at all), in the same way that one could flex the right muscles to raise one's arm without having one's mind on doing so (or thinking about muscles at all).
If anything is bullshit, this probably is. But it's at least a stab in the dark, rather than a quibble over word usage. (But for an example of defining "belief" as I have, see Jonathan Bennett's paper, "Why is Belief Involuntary?" in the journal Analysis 50 (1990).)
posted by Eiwalker at 7:11 PM on July 9, 2006
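Taking the poster's admitted stab in the dark at face value, it can be rendered as a toy model. Everything here is an assumption layered on that speculation: confidence as a smooth, monotonic function of some scalar "drive," so that a small change in drive is a small change in credence, just as a small change in muscle tension is a small change in arm height.

    import math

    def confidence(drive):
        """Assumed: credence as a smooth function of a scalar 'drive'."""
        return 1 / (1 + math.exp(-drive))

    # Willing a belief change would then be like willing an arm up:
    # alter the drive without knowing or aiming at the mechanism.
    print(round(confidence(3.5), 2))  # ~0.97
    print(round(confidence(3.2), 2))  # ~0.96: the one-point nudge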
You can't flex the right muscles to raise your arm without your mind holding the "intent to raise your arm." That willful intent triggers the subsequent actions, which have been learned since you were a baby. So you are thinking about muscles, and about the muscles needed to raise your arm; it's just that consciousness of these specific thoughts has receded into the background.
posted by extrabox at 7:29 PM on July 9, 2006
If you're asking what I think you are, then cognitive dissonance and doublethink are two relevant mental processes. I wouldn't characterize them as being terribly analogous to raising your arm, but they require effort, and probably are easier to pull off for more intelligent thinkers. I personally regard Pascal's Wager as the result of cognitive dissonance.
I also don't know if you could claim that people engage in these processes at will (i.e. they aren't thinking "I want to believe X but it's crazy...time to use doublethink!"). They're probably better described as processes that result from extreme emotional motivation.
posted by Humanzee at 8:23 PM on July 9, 2006
On further introspection, I guess it is likely that some kind of "degree of confidence" in an isolated concept is at least some of the time maintained mostly independently from whatever inputs in the network of the mind led to its evaluation. Probably not all of the time; I imagine whatever other ideas went into forming a belief disconnect from the belief itself only gradually, and not normally under any conscious control. Perhaps the nature of skepticism is in training the mind to automatically make a little effort to re-evaluate the belief every time it gets involved in thinking and isn't already freshly connected to things. The practice of faith could then be the opposite.
It seems relatively easy (though not exactly normal practice) to temporarily stop believing something by just clearing out that belief and then not bothering to update it by thinking any further around it. To the extent that other thoughts rely on it, it would tend to come back when it's needed; such conflicts would have to be resolved. The concept itself wouldn't go away, either; just the notion that there's some truth in it.
Choosing to believe something in any similarly simple way seems more difficult; but of course selectively allowing only thoughts that are likely to increase the belief happens all the time, and I see no reason why it couldn't happen consciously. That would be the slow and difficult way. Maybe there is also some easy trick to temporarily set up a belief without going to that trouble, but the more nonsensical it is, the harder it is to suppress the thought processes that would normally eliminate it. One might manage that by not thinking about it at all; but it's hard to act on or in any way use such a belief without thinking about it, so it would be much less strong in practice than beliefs formed in more normal ways.
So, yeah, on trying it, it seems like I can conjure up a very strong and intense certainty of the truth of there being a 97% chance of rain in Argentina today, but it won't last long at all unless I'm also good at self-deception, to create the mental structures that would sustain it in use, and at blind faith, to ignore all those that don't -- most immediately, the memory of how this belief was created, which I think pretty much nobody not seriously mentally ill can get away from quickly. The first bit is easy once you get the hang of it, but doing much more would take serious work.
posted by sfenders at 8:23 PM on July 9, 2006
Best answer: "I believe I will go to work tomorrow."
What does this mean? It means that I have a cartoon version of the world in my brain, and in that version -- in which I can fast-forward into the future -- I can see that I DO go to work tomorrow. (By "cartoon version", I'm talking about a mental map of the world, which is generally narrative in form.)
Most of the time, I am unaware of the distinction between the cartoon world and the real world. So I think (or believe) I'm also going to work tomorrow in the real world.
Every once in a while, there's an error in the cartoon, and I become aware that it's not mapping correctly to the real world -- at which point I become surprised. (I think that's what surprise is. It's when the real world doesn't match the cartoon world, and we become aware of the mismatch.)
In spite of the fact that the cartoon world is in my head, I have very little conscious control over it. If I DID, I would feel like I had magic powers. (Since, as I've said, I'm generally unaware that the cartoon world isn't the real world, if I could will changes in the cartoon, I'd think I was willing changes in the real world. And since I can't seem to make a pile of money appear, I clearly can't control either world.)
The cartoon world affects/produces emotions. If I think my cartoon wife (the cartoon map of my real-world wife) is going to die, I feel sad and scared. If I think she's going to throw me a party, I feel happy and excited. I have very little control over these feelings.
We haven't discussed feelings much in this thread, but to me they are key. I'd say that a "belief" is 90% emotion and 10% reason (this may differ from person-to-person and from belief-to-belief). If I FEEL like something is going to happen (based on cartoon-world logic), then I believe it's going to happen.
Usually.
Every once in a while, there's one of those breaks between the real and the cartoon (the surprises that I mentioned, above). When this happens, I may FEEL that something will happen but KNOW that it won't (or vice versa).
For instance, though I'm now sure that God doesn't exist, I used to feel that He existed but think He didn't. In other words, it sure FELT like there was a God, but I knew (or thought I knew) that there wasn't. Was I an atheist or a theist? Did I believe or not?
I don't think that's answerable. Not because it's a deep question, but because our language fails us at this point. "Belief" is a fuzzy term. The best we could say is that emotionally I believed and intellectually I didn't.
Though I can work to affect my emotions (and when I do, I'm maybe 5% successful), I can't really control them. Feelings just HAPPEN to me. So if we tie any component of "belief" to emotion (and how can we not?), then the answer is NO, we can't control them (much).
This is why I disagree with Vacapinta (though I think we may be arguing semantics): "Many people choose to love someone. It's a belief system like anything else. You choose to believe in this person. Even when they let you down, you may have a twinge of doubt but ultimately decide to go on believing in this person. You decide to believe, you decide to stay that course."
No. I may decide to STAY with a person who is hurting me. Or I may "do the healthy thing" and move on, but that will just change my actions, not my FEELINGS. I will still be in love. (Feelings aren't divorced from all other mental/physical processes, so of course moving on WILL change my feelings, but not right away, and not in a way that I can easily predict.)
I think that when we say things like, "I'm choosing not to love you any more" or "I'm going to go on loving you, despite what you've done," we mean that we HOPE we'll be able to continue/not-continue our feelings. Or we're just announcing what our feelings are -- we already have stopped loving (or we already ARE continuing to love).
If we banish feelings from this discussion, then we're left with pure reasoning (if it's really possible to separate thoughts from feelings). I think intellectual beliefs are weak, unless they have emotional backing, but for the sake of argument, I'll assume that they exist and that they are important. The question is: are they mutable?
Well, I DO seem to have a bit more control over my thoughts -- my unemotional thoughts -- than my feelings. My unemotional thoughts involve pushing around tokens in my brain -- using a mental blackboard. And yes, I can cross out and erase. But we're talking about the coldest sort of thoughts. I think most of what we call intellectual is simply MORE about reasoning than feeling. But feeling is still part of it. And if emotion is even a SMALL part, then that small part won't be under our control.
And we also don't have total control over the non-emotional part -- just a bit more.
Bottom-line: beliefs are largely (and usually) immutable in the short-term. In the long-term, one MIGHT be able to budge certain beliefs. But doing so might take work (partaking in a ritual, like frequent churchgoing) and the results are unpredictable.
As I'm sure is clear, I came to all these conclusions by navel gazing. To the best of my reckoning, this is an accurate description of how I work. I don't have access to anyone else's psyche, and I'm not using a scientific method.
posted by grumblebee at 9:43 PM on July 9, 2006 [1 favorite]
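grumblebee's "cartoon world" reads like what cognitive scientists would call a predictive model, with surprise as prediction error. Here is a bare sketch of that reading -- my gloss with invented entries, not grumblebee's own scheme.

    # "cartoon" is grumblebee's word for the mental map of the world.
    cartoon = {"I go to work tomorrow": True,
               "my wife throws me a party": False}

    def observe(event, actually_happened):
        """Surprise is the mismatch between the map and the world."""
        predicted = cartoon.get(event)
        surprised = predicted is not None and predicted != actually_happened
        if surprised:
            cartoon[event] = actually_happened  # the map gets patched
        return surprised

    # Note what is absent: no operation corresponds to simply willing an
    # entry to change, which is grumblebee's point about control.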
When I first read the question, I took it as meaning belief in the sense of religious faith. With the rest of the thread, I'm not so sure, but with that thought in mind (and hopefully relevant to the question): The processes are different. Changing beliefs requires a retraining of the mind, not just the completion of a simple action. Someone could change what they believe--what they know to be true, whether that knowledge is reasoned or not--as easily as a baby or a stroke victim who has to learn or relearn motor control can raise their arm, but not in the way that you or I raise an arm, thoughtlessly.
My first thought was, "no, but one can believe at will as one can stop smoking [or insert other addictive or habitual behavior here] at will." It can be done. It can even be done cold-turkey, but it takes some work, or at least real motivation to keep the will over the habit (whether a physical addiction or a previous belief).
posted by Cricket at 9:44 PM on July 9, 2006
Sanity hangs ever by the thinnest of imaginable threads.
My schizophrenic brother can believe in the damnedest things, when his medications are a little off. And he can believe in them even when he simultaneously knows his beliefs are impossible. It causes visible strain in his face when this happens, and that's how I usually know, early on, that something is out of kilter with him. For then, he is hearing a babble of terrible voices, as if he were attending a cocktail party in hell, which he can't leave, where the conversations all around him are of eternal torment and black degrading damnation. I can talk to him sometimes in these states, and he knows me, knows my voice is real, knows that the other terrible babble which I never hear is not, is only in his head, and fights it, and yet, he cannot choose not to believe it, too. Until the medications are adjusted, all he can do is fight terrible suggestions from his inner voices, and hoard butter knives, which we find together later, and put back in the dishwasher.
So, Eiwalker, you're asking if we can believe at will, and I'm giving you an anecdote of illness, and your first reaction will be to say, "That's not what I meant. Your brother is sick, and can't help what he believes. I'm talking about truly volitional belief."
Yeah, but here's the thing: My brother goes back and forth, from sick to not sick, in terms of how volitional his belief structure is, thanks to an anti-psychotic medication which acts by a mechanism that is truly understood by no one. When he's well, he can distinguish as well as you or I what someone else believes or knows, and what he himself believes or knows. He doesn't confuse assumption with fact any more than you or I; indeed, being able to choose what to believe is a measure of how well he is, on any given day. Generally, the theory of how his medication works relates to how it mediates serotonin levels in his brain, but that's mostly crude observational data; truth is, nobody knows how the stuff works. It just does, and then, maybe, some years later it doesn't, and maybe, hopefully, something else, by then, will. But it's always a question of 10,000 neurons and a few million serotonin molecules, here and there, whether he can choose, entirely, to believe, or not.
So there's that, in terms of what I can offer from our personal family history of abnormal psychology, and then, there's this.
At the end of his life, my father's lifelong religious faith blossomed into a gentle certainty of salvation and redemption that helped him meet death with equanimity. I think, like many people, my Dad professed his religious faith all his life, hoping, somehow, that doing so would bring him comfort in his darkest times, or at least ritual and the solace of structure and familiarity. But whether he ever actually, deeply, could come to believe in it all, was something we never discussed, and never could, until about 2 weeks before he died, the evening of the day before he was scheduled to go to the hospital for what would become his final admission there.
The details of that evening's conversation are personal, but the gist of it was that he had finally realized he had no reason to be afraid of dying, and that he was "going home," if it turned out, as it did, that treatment could not arrest his advanced lung cancer. For him, it became not a choice or an effort to believe, but finally, as if his professed beliefs gracefully enveloped him, when most he needed them, in a tide of warm acceptance of the inevitable. He didn't give up, didn't quit fighting for his life until his last breath, but he didn't waste a second of that last couple weeks worrying about dying, either.
What I took from that sad time, skeptic and agnostic that I am, was that there is a cost both for belief and for disbelief. When we can, we choose our beliefs and our skeptical areas according to what we can afford for our sanity, in light of our education, culture and experience. But sometimes, beyond all that, belief overcomes us, or "the fit takes us" as the ancients used to say, and we become, inexplicably, more than what we were, for what we believe without proof, than ever we could choose to be.
posted by paulsc at 10:08 PM on July 9, 2006 [1 favorite]
Bleary, you should do that post.
As far as the poster's question is concerned, as I see it the answer is yes. Take two situations: in one you see a mosquito on your arm and swat it off; in the second you read about torture in Iraq and your belief that the war was a mistake is strengthened. In terms of will, how are these two situations different?
posted by afu at 10:11 PM on July 9, 2006
Best answer: That's an amazingly good answer, grumblebee. I don't know if it will make Eiwalker too happy. I gather from his reference to that Bennett paper that he's approaching this question from a standpoint in epistemology which takes beliefs to be largely determinate (for example, one might say that they're tokened in a language of thought or something like that). Suggesting that we should ascribe beliefs based on the way we interact with the world, and that there's plenty of room for indeterminacy, pulls the rug out from under a lot of contemporary epistemology (and rightly so).
I want to point out that this question is further clouded by the fact that grumblebee's point about the indeterminacy of belief applies just as much to the indeterminacy of will. This is just as much a question about will as a question about belief. (Bleary's suggestion to read up on Wegner is good; Wegner does excellent work in both areas.) Some responders have mentioned that people go to church in order to find or maintain a belief in god. Is that an example of willing a belief? Maybe not. It might be best thought of as an action that is willed in order to bring about a wanted effect (much like we can't will our hearts to stop beating, but we can will ourselves to cut out our own hearts to stop them from beating). But how about the person who simply thinks about god a lot in order to maintain a belief in god? Is that direct enough to count as a willed belief? Maybe... but trying to draw a definitive line is a mug's game. Just as there is indeterminacy in what counts as a belief, there is indeterminacy in what counts as willed. (So, I think grumblebee's right when he says that he suspects that he's just arguing semantics about choosing to fall in love with someone.)
Formally define what's meant by 'belief' and 'will', and the answer to this question will fall into your lap. If you insist on using folk terminology (or even philosopher's terminology that leans on folk terms), then there is no determinate answer.
posted by painquale at 1:43 AM on July 10, 2006
In order to raise your arms above your head, you must first let go of the five ton weight you're holding onto with both hands.
Analogously, in order to embrace a particular belief, you must first let go of any other beliefs which contradict it.
The exact method you use to adopt a given belief will depend sensitively on why you wish to do so. It will also depend on how many related beliefs you already have.
I believe that people suffering from the widespread delusion commonly known as religious faith are more likely than skeptics to believe all kinds of weird crap.
I also believe that skeptics are just as likely as anybody else to suffer from confirmation bias, especially with regard to beliefs about non-skeptics.
Of course, believing anything nonsensical for any length of time is as tiring as spending your whole life with your arms above your head.
posted by flabdablet at 4:48 AM on July 10, 2006
The exact method you use to adopt a given belief will depend sensitively on why you wish to do so.
I may be taking you too literally, but...
It sounds like you're claiming that wishes can cause beliefs. I know that's not your main claim, but it IS implied by the sentence I quoted: belief[s] depend ... on why you wish...
I can't affect my beliefs by wishing. I WISH I believed in God, but I don't. (This is a deeply held, sincere wish. I wish I believed in God and an afterlife and the possibility that I could see my loved ones after they die. But strong as this wish is, I don't believe in any of that stuff.) I WISH I believed I was more attractive. I DO believe that 80% of attractiveness is how one feels about oneself, so if I BELIEVED I was more attractive, I would BE more attractive. But I don't believe, and wishing doesn't make me believe.
Perhaps, if I rewrote your sentence as follows, it would ring true:
The exact method you use to adopt a given belief will depend sensitively on why you NEED to do so.
Needs can be unconscious drives. Wishes are conscious processes, and (for me at least) conscious processes are bad at manipulating beliefs.
Our language disagrees with me. People are constantly saying things like, "You're a fool to believe in that", as if one could respond, "Oh, you're right. Okay. I'll stop believing." But I think such conversations are more emotional than rational. They may translate to something like,
"It makes me angry that you believe in that, and I wish you didn't."
"Okay. I'll try not to." (Not that I've ever heard such a response. A more likely one -- and a more accurate one -- would be something like, "Well, I'm sorry, but that's just what I believe!")
posted by grumblebee at 5:19 AM on July 10, 2006
I meant "wish to" in the sense of "want to, desire to, need to".
We are two peoples separated by a common language :-)
Examples of what I meant: Fred has a personal habit which I find terribly annoying. I find myself believing that he's doing it specifically to give me the shits. This makes me become angry every time Fred walks into the room.
I would rather not be enraged by the mere sight of Fred; so I suggest to myself that perhaps Fred's habit is just one of those things that Fred is not particularly aware of. Three days and a dollop of confirmation bias later, I now believe that Fred's habit has nothing to do with his relationship with me, and I am no longer enraged by his presence.
I have successfully adopted a belief, basically just by wanting to. I wished to do this because I perceived that the contrary belief was doing nothing for me but giving me grief. I was able to do it because I had very few related beliefs about Fred.
On the other hand: Grumblebee has a strong desire to experience the comforting feelings commonly reported by those afflicted by the whole God/afterlife complex, and therefore wishes to believe in God. But he apparently doesn't value those comforting feelings as highly as he values his sanity; so he will not let go of the five ton weights that keep him firmly anchored in reality, and is therefore unable to lift his arms above his head.
It seems likely to me, given what I know of grumblebee's life experiences from his other writings, that it's probably confirmation bias operating to keep his belief in his own unattractiveness firmly in place. By way of contrast I, having mainly experienced grumblebee via his consistently well-thought-through and insightful postings on Metafilter, find him a very attractive personality; he strikes me as somebody I would very much enjoy sharing a few glasses of decent red with.
Grumblebee, if you're ever in East Gippsland and need a place to crash for a few nights, drop me a line!
posted by flabdablet at 6:21 AM on July 10, 2006
Forgive my ignorance of geography, but where's Gippsland? (And backatcha, flabdablet, if you're ever in NYC.)
Grumblebee has a strong desire to ... to believe in God. But he apparently doesn't value ... those feelings as highly as he values his sanity...
This sounds funny, but I actually value those feelings more than I value my sanity, and I'm not even sure that dichotomy (feelings vs. sanity) makes sense.
Yes (according to my cosmology), if I believed in God, I'd be believing in a lie, but I don't think believing in a lie equals insanity. If I did, I'd have to consider all theists insane (and I don't). I suspect that sometimes it's MORE sane (or more healthy) to believe a lie than the truth.
I know some people will balk at that statement, but why SHOULD it necessarily be healthy to believe the truth? We evolved to survive and pass on our genes. If believing a lie helps us to do that, then believing a lie is the natural and right thing to do. Surely SOMETIMES it's beneficial to believe the truth and SOMETIMES it's beneficial to believe a lie. God, in my view, is a lie that's good for you. (And sometimes bad for you. Nothing is perfect.)
Since I value logic (and since my atheism is partly grounded in logic), there is a reason for me to NOT believe. But I value the comfort of God WAY more than I value logic. And the older I get, the more God seems important and logic seems trivial. I am starting to lose more and more people that I care about. This breaks my heart. I WANT to see them again. I NEED to see them again.
So why DON'T I believe in God?
I don't know. You could say, "Well, clearly it's not important enough to you." I can't prove that it is. All I can say is that it FEELS very important to me. Still, I can't do it. I can't control my beliefs.
As many have suggested here, a belief is a complex structure in the brain -- maybe something like a piece of software. Somehow the "God does not exist" program started running in my brain and it's been running for years. It's hooked into the operating system and can't easily be removed without massive amounts of tinkering. Maybe it can't be removed at all. I'm sure not ALL beliefs are so immutable, but many are.
What if someone PROVED to me that God existed? What if someone actually showed me God? I've often thought about this. My guess is that it wouldn't help much. Intellectually, I would believe in God. I would "believe" in the sense that if you asked me, "Does God exist?" I'd say, "Apparently he does, 'cause I saw him yesterday." But I wouldn't FEEL God.
Think of someplace you believe exists but have never been to -- someplace that isn't emotionally compelling to you. Yes, I believe that Quebec exists, but I don't FEEL Quebec. My belief in it is weak, because whether or not it exists doesn't affect me much. If you proved to me that Quebec didn't exist, I'd say, "Okay... whatever."
Which is why I think feelings are SO important. It doesn't matter much -- outside of academic debates -- whether or not I believe in God (on an intellectual level). I happen not to, but if you can prove Him to me, I'll change my mind. But I won't be fundamentally different until I can FEEL God.
posted by grumblebee at 6:57 AM on July 10, 2006
This sounds funny, but I actually value those feelings more than I value my sanity...
I don't know if you've actually been mad. If not, I'm filing this under Bold Claims :-)
And the older I get, the more God seems important and logic seems trivial. I am starting to lose more and more people that I care about. This breaks my heart. I WANT to see them again.
Fair enough.
I NEED to see them again.
There are ways.
Smoke a heap of hash, breathe a heap of nitrous, eat a lot of shrooms, and don't sleep at all for three or four weeks. That should get you to a place where there are only about a dozen distinct people left in your world, some of whom you will have no trouble recognizing as repeat instances of those dearly departed. They won't be, of course, but you won't notice the differences, and you'll be happy to hang out with them.
They, of course, will all rapidly become unhappy to hang out with you; but that won't matter, because there are backup copies everywhere.
Before you consider embarking upon such enterprise, make sure you can in all conscience impose the incredibly heavy load that your newfound insanity will put on your nearest and dearest.
I'm betting that if you give this modest proposal serious consideration, you'll find your need for personal integrity trumps your (undoubtedly genuine, and undoubtedly huge) sense of grief and loss.
Life goes on. There's comfort in that. Personally, I plan to be freeze-dried and reincarnate as apricots :-)
On theism and insanity: a person needs to believe a whole bunch of complete crap to qualify as properly insane, but to my way of thinking the difference between a mad person and a believer in a person-like-in-some-sense God is one of degree, not of kind; believing in arbitrary bullshit is where it all starts.
Lies that are good for us: I can't really see it. Seems to me that given a ridiculously complex and frequently dangerous territory, an inaccurate map pretty much has to be less useful than an accurate one. If the aim is to avoid the kind of surprises that maim and kill, I think the more accurate the map, the better.
Genes and survival: I can't see how we did actually evolve to survive much past the point of passing on our genes. Personally, as a 44-year-old voluntarily-sterilized male part-responsible for the care of somebody else's child, my attitude is: "Fuck you, genes! You're going nowhere, you random bunch of unconscious insensate bastards, but *I* am gonna live as long as I possibly can!"
Also, I don't see much justification for using what apparently has happened (people being put together a certain way as a result of millions of years of evolution) in arguments about what we should do.
FEELing God: I've done enough interior space flights to convince me, beyond reasonable doubt, that the gut-feeling of utter certainty - the emotional rightness thing - is not 100% reliable. I agree with you that it's a really important part of most beliefs, but I would strongly advise anybody who believes any particular thing based solely (or even mainly, given the effectiveness of confirmation bias) on that feeling of certainty to be suspicious of it and do additional real-world checks. Holding emotional rightness to be a fundamentally necessary component of belief is, in my considered opinion, bad mapping practice.
Speaking of mapping: my little patch of East Gippsland is here.
Cheers!
posted by flabdablet at 8:21 AM on July 10, 2006
an inaccurate map pretty much has to be less useful than an accurate one.
For me to get from the sofa to the bathroom, my map needs to include the major obstacles: the chair I need to push aside, the table I need to circumvent, etc. It does NOT need to include the color of the table or the fact that there's a placemat on it. And generally it doesn't include these minor details. My map is expandable and changeable, depending on the needs of circumstance, but it generally oversimplifies. I'd argue that this simplification -- a lie -- is important for survival. One mustn't miss the forest for the trees.
You could argue that this lie is of a different kind than believing in God, but I hope you'll at least agree that our mental map doesn't (and shouldn't) exactly mimic the real world. If we agree about that, we can then explore details. How MUCH should it stray from the real world; WHEN should it stray from the real world?
Should a prisoner in a concentration camp dwell on reality? Should a child who is continually abused dwell on reality? There seem to be some EVOLVED mental processes to block reality in these cases. These blocks cause problems, but they also save lives.
Despite your wishes re: surviving and kids (you and I share pretty much the same wishes in that regard), we are machines "built" to pass on genes. Such machines, if they are well made, shouldn't care about reality. They should just care about passing on genes. If disregarding (or believing lies about) parts of reality helps pass on genes, then a good gene-passing machine should do just that.
posted by grumblebee at 10:38 AM on July 10, 2006
Just wanted to add: my position is the substrate for grumblebee's. His "cartoon world" is generated by my "patterns in matter." Together, the two analogies give me a strong sense of the way the brain must work in a bottom-up manner.
posted by clord at 4:02 PM on July 10, 2006
There is a good deal written about this in epistemology and in the literature on Pascal's wager. Why not check it out?
posted by ontic at 4:28 PM on July 10, 2006
Response by poster: If you believe it's going to rain, presumably that is based on some evidence or experience; if it's not, then you're just asking if a person can project their fantasies and personal dreams, and ignore reality. Sure: we usually call it mental illness.
I would be delighted if there were EVEN ONE mentally ill person who could do it. I’m not talking about having delusions; I’m talking about being able to acquire particular beliefs immediately upon command, without acquiring new evidence, and without trying to do something else as a means to that end (such as taking or not taking meds).
Even dwelling on the old evidence that one already has is too much: (1) it would function as a means; (2) it would take too long (I’m looking for cases where a person can acquire a belief as quickly as I can raise my arm).
A lot of mentally ill people believe they are God, Jesus or the Devil, for instance, but I’ve not heard of cases where an insane fellow can believe particular things as quickly as I can raise my arm.
The belief doesn’t have to be fantastic, or have anything to do with religion: ANY belief would count. So, the belief doesn’t have to have much or any emotional component.
Nor does the proposition have to be inconsistent with any of one’s other beliefs. (Take the belief that there is a man in China at such and such address with a red hat in his closet.) So, I’m not specifically looking for cases of doublethink or cognitive dissonance.
I used to feel that He existed but think He didn't. In other words, it sure FELT like there was a God, but I knew (or thought I knew) that there wasn't. Was I an atheist or a theist? Did I believe or not? I don't think that's answerable. Not because it's a deep question, but because our language fails us at this point. "Belief" is a fuzzy term. The best we could say is that emotionally I believed and intellectually I didn't.
Good point. There are cases like this in the philosophy literature. For instance, take a case where you are flying in an airplane and feel sure you are going to crash and die, though rationally you judge that you are safer in the airplane than you are driving a couple of blocks to the grocery store and back.
So, then what? Does language really break down, like you say, such that there is really no such thing as belief, or no singular thing, but rather there is a dichotomy of emotional belief and intellectual belief? If so, which kind of belief am I asking about?
I defined belief as involving what the person takes to be the percent likelihood that…, but both the emotional and intellectual kinds of belief seem to involve that. Take the fear in the plane case: emotionally, I feel sure that… intellectually, I feel sure that… Both are near 100%, though they affirm contradictory propositions.
My first inclination is to answer as follows. Take your pick! Can a person acquire an emotional belief at will? What about an intellectual belief? Argue yes or no for one or the other.
grumblebee, you suggest that there might not be a strict dichotomy between emotional beliefs and intellectual beliefs, but rather all beliefs have both emotional and intellectual components. I’d like to say two things about that.
First, if so, then many of our beliefs are self-contradictory. That seems counterintuitive, though maybe it is true.
Second, even if this model is correct, presumably (but maybe I am wrong about this) there is still such a thing as the overall range of likelihood that one ascribes to the proposition.
If the emotional % is 80 and the intellectual % is 20, then the overall % might be the midway point, 50--that’s assuming each kind of belief is weighted equally.
If there is an overall % likelihood that one ascribes to the proposition, then changing that overall % at will would count as believing at will.
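A toy sketch of that weighting scheme in Python (illustrative only: the linear averaging rule, the function name, and the equal default weights are assumptions, not anything proposed above):

def overall_credence(emotional, intellectual, w_emotional=0.5):
    # Combine two component credences (each a probability in [0, 1])
    # into one overall likelihood, assuming they trade off as a
    # simple weighted average.
    return w_emotional * emotional + (1 - w_emotional) * intellectual

print(overall_credence(0.80, 0.20))        # equal weights -> 0.50, the midway point
print(overall_credence(0.80, 0.20, 0.75))  # emotion weighted 3:1 -> 0.65

On this picture, "believing at will" would amount to being able to nudge the output of something like overall_credence by sheer intention, however the components are actually weighted.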
we also don't have total control over the non-emotional part -- just a bit more… Bottom-line: beliefs are largely (and usually) immutable in the short-term.
Even a tiny bit counts, even some of the time. Examples?
posted by Eiwalker at 6:01 PM on July 10, 2006
I'd argue that this simplification -- a lie -- is important for survival. One mustn't miss the forest for the trees.
It's that danged common language again - we clearly mean different things by "lie" :-)
I agree that a map which is too complex to use is a useless map; ironically, if we were in fact capable of mapping the entirety of reality with perfect accuracy, there would be no point at all in doing so - we might as well have no map at all, and just refer directly to reality.
In fact there has been some robotics research in this direction - given the extreme paucity of the reality map available to anything running on a present-day embedded microcontroller, simply reacting to perceivable aspects of the world can actually work better than modelling it.
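A minimal sketch of what such a reactive controller might look like (a made-up toy, not drawn from any particular paper): sensor readings map straight to motor commands, with no world model in between.

def reactive_step(left_ir, right_ir):
    # Map proximity readings (0 = clear, 1 = obstacle) directly to
    # wheel speeds: the robot veers away from the nearer obstacle
    # without building or consulting any map of its surroundings.
    return (1.0 - right_ir, 1.0 - left_ir)  # (left wheel, right wheel)

print(reactive_step(0.1, 0.9))  # obstacle close on the right -> (0.1, 0.9), veer left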
That doesn't mean I'm comfortable with the idea of mapping large slabs of reality with "here be dragons". "Here be unknown territory", sure. "Here be dangers of an indeterminate nature", sure. Dragons: no. It seems to me to be pointless and stupid for a map to say that the High Street runs north-south when it actually runs east-west.
To my way of thinking, the word "lie" - in this context - is better reserved for something that actively misleads. I would much rather deal with an incomplete map than an inconsistent or flat-out wrong one.
If disregarding (or believing lies about) parts of reality helps pass on genes, then a good gene-passing machine should do just that
If you mean "should" in the sense of "could reasonably be expected to" then of course I agree.
What I'm questioning, though, is the necessity and/or appropriateness of behaving like an obedient little gene carrier. Seems to me that for better or worse, we do have very powerful and general-purpose inbuilt mappers, and we do have a pretty good handle on where many of the more useful controls are, and we can make choices on how best to use them that are to a large degree independent of the process by which we acquired them. Were this not so, there would be no value at all in e.g. cognitive behavioural therapy, and nobody would have bothered to invent stoicism.
It seems to me that the best mapping policy for people in horrible situations (like prisoners in concentration camps or continually-abused children) depends on what they're mainly trying to achieve. If the aim is to maximize the chances of actually living through such an ordeal and coming out the other side, then their map of those parts of reality they are able to perceive, and especially those parts they are able to control, needs to be as accurate as possible; this should give them the best available chance of anticipating and avoiding the excesses of those holding them prisoner. If, on the other hand, the aim is to minimize intolerable discomfort even at the possible cost of staying alive, disconnecting from reality is a fine strategy.
It's perhaps not terribly useful, though, to consider the example of people in an extremely choice-poor situation when attempting to explore possible consequences of choice and/or will.
Are we off topic yet?
posted by flabdablet at 8:44 PM on July 10, 2006
Eiwalker: now that you've clarified the speed and immediacy that you're after, I think the answer is "No, unless there's something wrong with you".
I have personally experienced a psychotic state where new-belief acquisition was pretty much as easy as you specify. I don't believe that such a state is a sustainable way to live. Not since I got better, at any rate :-)
It seems to me that when our brains are in good working order, our inbuilt world-mapper (belief generator) is fairly well protected against capricious alteration, and for good reason. It generally takes at least a couple days to bed in a new belief.
posted by flabdablet at 8:54 PM on July 10, 2006
What I'm questioning, though, is the necessity and/or appropriateness of behaving like an obedient little gene carrier.
This is an ethical issue (or, at least, a practical issue). I have no strong opinions about it. It seems perfectly reasonable to me for one to thwart one's evolutionary "programming" if that helps achieve some specific goal.
As long as we remember that this programming exists. It's had a REALLY long time to run and perfect itself.
If you're a Windows user, you've probably seen that annoying window that pops up after your system has been updated. It asks you if you want to reboot now or later. If you click later, and try to get back to your work, it pops up again -- after about five minutes -- and asks you when you want to reboot. You have to either give in to it (reboot) or keep clicking later, later, later...
Fighting one's instincts is like that. It's not wrong (or at least I'm not interested here in whether it's right or wrong). It's just difficult. Which is why we DO need Cognitive Therapy and the like to fight it.
Those of us who think we're fighting it are probably co-opting it more than anything else. Maybe you don't want to have kids, but do you want to fall in love, have sex, eat food...? If so, you're "giving in to" or "using" your programming. The goal of the programming is for us to reproduce. You're just using the same program for a different purpose. (On the other hand, a celibate monk or someone who is fasting is genuinely thwarting the program, and both will probably fight an uphill battle. The programming is strong.)
posted by grumblebee at 4:43 AM on July 11, 2006
Response by poster: I have personally experienced a psychotic state where new-belief acquisition was pretty much as easy as you specify.
Pretty much (but not quite)?
It seems to me that when our brains are in good working order, our inbuilt world-mapper (belief generator) is fairly well protected against capricious alteration, and for good reason.
Fairly well (but there are exceptions)?
It generally takes at least a couple days to bed in a new belief.
You mean (1) to work out the implications to make sure it is consistent with everything else, and (2) to get used to using it in sentences?
1. What if it clearly was consistent with everything else?
2. Does a person really have to do a thorough job of synthesizing new information in order to count as believing it? I have old beliefs that I understand better than I did originally: should we really say that I didn't believe them until I was able to synthesize them well? If I come to believe that a man at such and such address in China has a red hat, synthesizing this with my other beliefs seems pretty simple. At least, I shouldn't have to work out implications such as, "If someone asks me whether I believe that there are hats in China, this proposition implies that the answer is yes."
With that said, the more irrelevant the proposition is to the rest of one's beliefs, the easier it should be to synthesize it. And perhaps some cases are so easy that one could synthesize it (well enough to meet the belief prerequisite) as immediately as raising one's arm.
posted by Eiwalker at 4:46 AM on July 11, 2006
grumblebee, you suggest that there might not be a strict dichotomy between emotional beliefs and intellectual beliefs, but rather all beliefs have both emotional and intellectual components.
No, I do think there are intellectual-only beliefs, though they're not very interesting. (How can anything be interesting to us if we've no emotional stake in it?) For instance, I believe that if I press the period key on my keyboard, I'll see a period appear on the screen.
But I don't have any feelings connected to this belief. If you try to convince me that I won't see a period, I won't go into denial or defend my stance or anything. At best, I will try it and see.
Boring as such beliefs are, I DO think they give you the best hope of answering your question scientifically. You need to find REALLY non-emotional beliefs. I guess I might have SOME emotion connected with the period key. If it's not making dots, my keyboard's broken (or my OS is), and that sucks. But if you work at it, you can probably come up with something better.
Maybe you could do an experiment in which you get someone to believe something new (not something they're previously invested in) -- something emotionally neutral. Then you can see if they can change this belief.
For instance, you could convince them that you have three pennies in your pocket. I doubt they would care one way or another, but they certainly might BELIEVE that you have three pennies in your pocket. Then you could try to get them to believe that you have two pennies in your pocket.
How? Beats me. In "1984", it takes drugs and torture, and I think you might need similar tactics (which, I realize, is not what you want). I suspect that even these non-emotional beliefs are pretty immutable.
presumably (but maybe I am wrong about this) there is still such a thing as the overall range of likelihood that one ascribes to the proposition.
I think you ARE wrong (though I'm not entirely sure). I don't think we process probability very well -- unless we're consciously trying to do so. (Look up "The Monty Hall Problem" for a window into how bad humans are at thinking about probability.) Our beliefs tend more towards being binary.
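For the curious, a quick Monte Carlo sketch of the Monty Hall problem (illustrative only; the function name and trial count are arbitrary) shows just how far gut probability drifts from the real numbers: switching doors wins about two-thirds of the time, even though intuition insists the two remaining doors are 50/50.

import random

def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # the prize door
        pick = random.randrange(3)   # the contestant's first choice
        # The host opens a door that hides a goat and isn't the pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", monty_hall(switch=False))  # ~0.33
print("switch:", monty_hall(switch=True))   # ~0.67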
For instance, when I say I don't believe in God, I mean something like "I think there's a very low chance that He exists. Maybe .00001%. That's so unlikely that it's not worth thinking about."
That's what I SAY I mean, and that IS my philosophical position. But it's not my day-to-day, working position. It's not what's in my gut. My real position/belief is that God doesn't exist. Again, if I'm being logical/philosophical, I might explain that "does not exist" means "the chance He does exist is minuscule," but that's a way of explaining my belief AFTER the fact. I don't carry percents around in my brain.
There must be some mental mechanism that converts a large likelihood to a binary TRUTH. This makes sense. It's easier to process TRUE than 90%. When making survival decisions, how would it help you to remember 90% rather than TRUE?
(I also suspect, though I'm less sure, that we can't have a 50% or 40% belief in something. If I say, "there's a 50% chance I'll go to work tomorrow," that just means, "I don't know if I'm going to work tomorrow or not.")
This is even true with something more mundane. When the weather report says 90% chance of rain, my brain quickly decides "it's going to rain today." I can't hold 90% in my brain without constant effort. And sometimes I'm surprised when it DOESN'T rain. Which means I've lost the 10% chance that it won't. My gut never digested 90%. It can't. It's binary by nature.
I think this is even more the case with emotions. I've never been 10% happy, 67% scared or 40% angry. I guess I have been a little angry and a lot angry, but my emotions don't have a fine scale. Basically, I'm angry or I'm not. So with any belief that's emotion-based, things become even MORE binary.
PS. I'm surprised that this thread didn't get deleted, and my guess is that many here have no time for it. But I want to thank you for it. It's one of my favorite threads ever. It helped me clarify several of my ideas.
posted by grumblebee at 5:11 AM on July 11, 2006
Here's another complicating factor: STANCES.
Sometimes I take a stance, and that's not exactly the same thing as a belief -- but it's related, and it's easy to confuse the two (and sometimes the two SHOULD be confused, because they are intertwined).
I associate stances with young people. And in fact, I rarely take them nowadays. (I'm 40.) I have a friend who taught at a college for a while. He complained that he couldn't relate to the students, because "people at that age just 'have' opinions. They'll say, 'I HATE Star Wars movies!' You'll ask them why, and they'll look at you defiantly and say, 'I don't know, I just DO!'"
In a thread riddled with armchair psychology, I'll throw in a footrest and suggest that "kids" do this as part of the growing-up ritual. It's less about what they believe and more about defining their personalities -- defining themselves as independent people with opinions. So it's more important to have ANY opinions than to have specific opinions. It's important to have a stance.
Like I say, I don't do this any more, but I remember what it felt like and how fuzzy the border was between stance and belief.
One example that sticks in my mind is the Loch Ness Monster. As a die-hard skeptic, I don't believe in Nessie, Bigfoot, UFOs or psychic powers. But there was a time when my skeptical views weren't as "advanced."
When I was sixteen or so, I remember saying, "I don't believe in Bigfoot or any of that nonsense, but I DO believe in The Loch Ness Monster -- I don't know WHY, but I DO." I think I said that last part (defensively) without any prompting.
I remember that I WANTED there to be a Loch Ness Monster (part of a childhood romance with dinosaurs), which is why I chose it for my stance. And I even remember the feeling of "I don't exactly believe in it but I'm going to SAY I believe in it from now on, DAMMIT!" And after taking that stance, I sort of DID believe it.
Sort of. I never quite forgot that I didn't have any evidence. But my feelings were powerful, so my stance was very belief-like. As an intellectual person, I am able to separate my ideas from my feelings a little more than the average bear. It's possible that, for a more intuitive person, the stance/belief membrane would be even thinner.
Something I continually wonder about is people who choose their own religion. As someone who's never been religious, I'm baffled by this. I have a friend -- I'm sure we all know people like this (or are people like this) -- who actively searches for religions and sometimes chooses one, follows it for a while, and then moves on to another one.
I don't get what's going on in her head, and she's never been able to explain it to me. Because actually changing one's belief in a whole cosmology is alien to me, it's tempting for me to think that she's just into religion for community and ritual -- so switching religions is like leaving Metafilter and hanging out at Slashdot. But I fear I'm oversimplifying.
Still, how can one (say) believe on Monday that Jesus arose from the dead and then on Tuesday believe that he didn't?
posted by grumblebee at 5:38 AM on July 11, 2006
If you click later, and try to get back to your work, it pops up again -- after about five minutes -- and asks you when you want to reboot. You have to either give in to it (reboot) or keep clicking later, later, later......or get pissed off enough about it that you go looking for a way to get it under control - does firing up Regedit constitute cognitive behavioural therapy for Windows?
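(For anyone who actually wants to administer the therapy: the sketch below is my own guess at the treatment, assuming the XP-era Automatic Updates policy key and its documented NoAutoRebootWithLoggedOnUsers value. It needs admin rights, and all the usual warnings about editing the registry apply:)

    import winreg  # Python's standard registry module on Windows

    # tell Automatic Updates not to reboot while someone is logged on
    key = winreg.CreateKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU")
    winreg.SetValueEx(key, "NoAutoRebootWithLoggedOnUsers", 0,
                      winreg.REG_DWORD, 1)
    winreg.CloseKey(key)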
It's a shame our interior states don't come with googlable message texts :-)
Pretty much (but not quite)?
Pretty much, as in I found myself fully believing one bizarre thing after another with great rapidity. For example: my grandmother is dying right now, and sending me 95 years' worth of gleeful secret confessions in the clouds; the letters C7 on the glass door of this Changi Airport boarding gate are a heavily compressed coded message from my dead uncle; the Internet has achieved sentience, and its shooting down of TWA Flight 800 was a fit of childish pique; the red blinking light on the end of the travelator is the Internet's way of trying to communicate with me; the security camera on the wall up there is one of the Internet's eyes; it's a good idea to show that eye a piece of improvised nude performance art symbolizing and protesting against humanity's destructive rampage across our planet; this polite Singapore police sergeant pretending to arrest me is an enlightened being who Understands what's Really Going On, and so is that guy in the cleaner's overalls, and so is anybody wearing purple, and so are all women; there's a coded message to be found in every taxi licence plate; eating that cigarette butt somebody has left on the floor of the interrogation room will cure the throat cancer I've just realized I have; wearing these handcuffs is causing complete paralysis of my entire body; and so on and so on and so on.
These were very pleasing beliefs, and they all felt terribly profound and gave me a deep and peaceful sense of understanding exactly where I was and why; but to say that I adopted any of them because I wanted to would perhaps require attributing more integrity to the term "I" than my mental processes at the time could have justified.
Just thinking about the belief-acquisition process some more, it's occurred to me that the process you're assuming (first wanting to believe something, then doing so) doesn't usually happen. Unlike gross motor movements, beliefs normally seem to turn up before the desire to have them does (either because they just occur to us, or because somebody else suggests them) and there is no exercise of choice involved until we're deciding whether or not to keep them. It's a generate-and-test thing rather than a design-and-implement thing.
I guess you could argue that a given proposition must go through a stage of being a mere hypothesis before graduating as a belief. Personally, I don't think it works that way. It seems to me that hypotheses are usually more "suppose" than "perhaps"; they operate just like beliefs for as long as we're prepared to leave them undiscarded.
Fairly well (but there are exceptions)?
Yes. It's quite possible to put a healthily functioning brain into a state where it adopts new beliefs almost immediately; that's what hypnosis is all about. I don't know any accomplished self-hypnotists, so I can't tell you how fast it's possible to enter a trance state, auto-suggest a new belief, and resume normal consciousness; but in any case, this would seem to break your requirement about not needing "steps".
On the other hand, perhaps the perceivability of these steps reduces as a practitioner gets better at doing them. You may well be able to raise one arm, but can you raise one eyebrow? What about making little circles with your left middle toe? Speed may well come with practice.
You mean (1) to work out the implications to make sure it is consistent with everything else, and (2) to get used to using it in sentences?
No, I mean integrating it into my worldview to the extent that if I find myself in a situation where the belief is applicable, I act on the basis of it without having to become aware of its presence as a new belief. And I'm talking here about beliefs about hard-to-test things like the true motivations of Fred.
And perhaps some cases are so easy that one could synthesize it (well enough to meet the belief prerequisite) as immediately as raising one's arm.
Seems to me that going from general beliefs to particular ones will often happen at comparable speeds to the arm-raising process. For example, if I already have a general belief that there are frequently traffic cops with speed cameras along my route to work, and I see a car parked by the side of the road ahead, it doesn't take very long at all to believe that "that's probably a camera cop so it's worth a quick glance down at the speedo". But, once again, there doesn't seem to be a pre-belief desire-to-have-a-belief step there.
But enough of my blather. I'm kind of curious about where you're coming from, now.
The way you posed your original question, and the way you've responded to other people's suggestions all the way through this thread, strike me as more than a little Socratic. What's your motivation for raising this issue? Do you have a well-conceived personal position that you're trying to lead this bunch of obstinately non-drinking horses to? If so, what is it? Why do you care what we all think?
Lay your cards down, pardner, and let's see what fell into that sleeve of yours as your right arm went up :-)
posted by flabdablet at 6:49 AM on July 11, 2006
I'm not reading this whole thing, but..
The answer, for some moderate definition of 'believe', is yes.
You are driving down the street and your speedometer reads 60km/h; you don't start thinking about all the issues that might affect the measurement. Are my tires properly inflated? I wonder if the temperature of the gauge is within specifications for optimal performance..
When you are doing design/engineering work this is a real issue you face every day. You have to make instant critical decisions, "this I will believe" or "this I will doubt". Not at all unlike raising your arm, it is instinct developed from extensive training.
posted by Chuckles at 10:02 AM on July 11, 2006
Hmm.. I'm not sure I ever got to the point I intended when raising the issue of design work.. Instead, I will just point out one of my MetaFilter posts, Minds of Our Own.
In particular, cytherea's comment:
One of the parts that I found most illuminating was when they were discussing the children's understandings of how vision and light worked. Many of the children believed that the light rays emanated from their eyes because they had seen cats' and dogs' eyes shine in the dark. After the science was explained to them, they were asked if they would be able to see an apple in a perfectly dark room, and almost all said that they would, after enough time. But the funny thing was, after they personally performed the experiment, and found that they couldn't see the apple no matter how long they sat in the dark, they still maintained that if they had just waited longer, eventually they'd have been able to see it.
And my own, which is very similar:
What impressed me is the way the children would latch on to an idea (we could call it belief, but that has its own baggage) based on the information they were given in class. Even though they could reason fairly well, they would stall when they realized that their reasoning was in conflict with the idea that they had interpolated from the lesson. Many believed that the bulb holder was an integral part of getting the bulb to light up, for example. Once they reached that kind of impasse they didn't have a strong instinct to get down to experimenting to see what would actually work.
posted by Chuckles at 10:28 AM on July 11, 2006
Great post, Chuckles.
When you are doing design/engineering work this is a real issue you face every day. You have to make instant critical decisions, "this I will believe" or "this I will doubt".
This is what I call a "stance", which is related to a belief, but it's not quite the same thing. You can say, "this I WILL believe," but that doesn't magically make you believe it.
Perhaps it's more accurate to say, "I'm going to assume that..." Now, if you keep making that assumption for weeks, months and years, it may become a belief.
I wonder how many mathematicians BELIEVE that points, lines and planes exist. I'm sure that most don't believe it on an intellectual level -- they know that these are abstractions. But if they've spent years working with them, they must FEEL very real.
Same with novelists and their characters. Think of someone like Patrick O'Brian, who spent DECADES writing about the same (very nuanced) characters. Did he believe they existed? Again, I'm not suggesting he was insane, but surely his characters FELT very like real people.
I love Chekhov's "Uncle Vanya." I've read it about fifty times, I've seen maybe a hundred stage and film adaptations, I've directed it twice and acted in it once. To me, the characters are real people. I know some of them better than I know acquaintances. Do I BELIEVE they're real? I don't know. Obviously, I know they don't exist in the real world, but they FEEL real.
Also, can I believe that Sonya thinks she's beautiful? I don't think I can. In the story, she thinks she's plain. That's so much a part of her. She doesn't really exist outside of my mind, so I SHOULD be able to manipulate her any way I want. But I can't. And if you think about it, no one exists (for you) outside of your mind. So what's the difference between believing in your best friend and believing in Sonya?
posted by grumblebee at 10:41 AM on July 11, 2006
I believe I should read this thread more thoroughly before going any further, but..
Looking at your response, grumblebee, I immediately thought of myth, which doesn't imply falsity, even though it is usually taken to.
In trying to avoid the semantics of belief, let's look at Eiwalker's original analogy.. I can't actually raise my arm at will. I can move it at will, of course, but unless it is lowered it can only be raised once, and then it stays raised forever. Well, not exactly forever, but I think that just extends the point. And for the record, I take issue with "I do not try to act upon my brain in a certain way, nor do I know which muscles I'd have to flex." This doesn't seem like an accurate description of the processes involved, but that might be too much of a digression.
I guess my point is, there is an underlying mental process there. I'm not sure about the established philosophy of belief, and your point about subjective reality - fictional characters are real, all stories are true, etc. - is well taken. There is certainly room in that direction which my answer doesn't address, but I have no clue how to bring the philosophical question and the mental process together.
Once you have constructed your reality, like accepting the existence of Sonya, I think you are back to the mental processes again. Making her change her personality is like lifting a raised arm further..
posted by Chuckles at 11:44 AM on July 11, 2006
We can come to deeply hold a belief nearly instantly, without much effort, under the right conditions, and it is a common human experience. On the positive side, connected powerfully with emotion, are the experiences of "falling in love at first sight" and "being scared to death" of a person we've barely met. Many people who have fallen in love at first sight talk about it as being one of the strongest physiological experiences of their life, complete with being breathless, having palpable heart arrhythmia, becoming flushed and warm, etc. Many recall the experience so vividly decades later that they re-experience some of the same sensations in recalling the moment. The same kinds of vivid details are reported by people who meet remarkable criminals, and believe instantly that they are in the presence of tangible evil.
But we can also come to strong beliefs very quickly under conditions of fatigue, boredom, or other stress, and such responses are so predictably effective that they are standard parts of military and sports training. Pushed to exhaustion, the mind can unconditionally accept an idea it would never otherwise adopt, and hold on to that idea as a new personal belief long after the training experience is past.
These kinds of instant beliefs seem to be almost "pre-rational," in that the conversion of the idea to a belief happens nearly without cognitive effort, and certainly without the operation of conventional mental processes such as deductive or inductive reasoning. Yet once such an idea is truly seated as a belief, it is as powerfully resistant to alteration or replacement as any other belief the individual holds.
posted by paulsc at 12:07 PM on July 11, 2006
Response by poster: PS. I'm surprised that this thread didn't get deleted, and my guess is that many here have no time for it. But I want to thank you for it. It's one of my favorite threads ever. It helped me clarify several of my ideas.
LOL
This comment made my day.
The first sentence kind of took me by surprise: deleted?
1. Why would a thread popular enough to draw almost sixty comments in two days get deleted -- or have been deleted already?
More importantly,
2. It's bad enough that the philosophy category is COMBINED with the religion category: there's no reason it should be, other than that the B. Dalton bookstore does that; it would be even more egregious if a question that actually arises in academic philosophy were disallowed.
Maybe stuff like this tends to get deleted, I don't know. I've only been coming to Metafilter for two or three weeks.
I'm glad you clarified several of your ideas! In philosophy, that is a nice kind of success.
Please: nobody should respond to this post.
(I shall get a cup of coffee, and come back and answer a couple of questions: what I believe, or what my motivation is; and give a response to grumblebee on "overall ascriptions of probability.")
posted by Eiwalker at 2:08 PM on July 11, 2006
Nobody should respond to this post??? Well, I want to address my "deleted" comment and your remark about it. But since you don't want replies, I won't. (I just wanted to explain to you -- since you're a newcomer -- why I'm surprised it wasn't deleted.) Unless you'd really rather not do so, please add an email address to your profile so stuff like this can be discussed outside of MeFi. Thanks.
posted by grumblebee at 3:15 PM on July 11, 2006
Response by poster: I want to address my "deleted" comment and your remark about it.
I'm sorry, go ahead. I was just worried that someone might get mad and say, "There is a time and a place for Metatalk!" (I've seen that in other threads.) I'm actually intensely curious and would love to hear it.
posted by Eiwalker at 3:31 PM on July 11, 2006
Response by poster: Suppose, as grumblebee suggests, SOME beliefs have both emotional and intellectual components. Then, as I said before: presumably (but maybe I am wrong about this) there is still such a thing as the overall range of likelihood that one ascribes to the proposition.
grumblebee says: I think you ARE wrong (though I'm not entirely sure). I don't think we process probability very well -- unless we're consciously trying to do so. (Look up "The Monty Hall Problem" for a window into how bad humans are at thinking about probability.)
I have never meant to suggest that humans tend to be GOOD at figuring probabilities: I merely claim that they DO it.
In what sense or senses do we do it? Emotionally or intellectually, or both at the same time—if both at the same time, maybe the two forces average out to be a person’s overall % ascription.
Notice I say “ascription”: it may be misleading. Recall that painquale pointed out that, given my reference to that Jonathan Bennett paper, I am probably approaching this question from a standpoint in epistemology which takes beliefs to be largely determinate (for example, one might say that they're tokened in a language of thought or something like that).
I’ve been thinking throughout this discussion that ascriptions of probability are feelings of probability—private, subjective phenomena. Thus, I suppose it’s fair to say that I’ve been thinking that beliefs are tokened in a language of thought.
Yet Bennett takes beliefs -- defined as subjective probabilities -- to be DISPOSITIONS. These dispositions are functions of inputs/desires/outputs. (Let's not discuss the fact that he also takes desires to be dispositions -- functions of inputs/beliefs/outputs -- or the debate about how vicious that circle is.) Just note that, on this view, beliefs must be able to issue in overt action, at least in principle (in counterfactual scenarios).
E.g., one difference between, say, an 89% belief and a 90% belief, is that a person should tend to be willing to bet slightly more money on the one rather than the other (assuming the person desires to win); or, if a person had to bet on one or the other but not both, would bet on the one slightly more often, other things being equal.
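To make that concrete, here is a toy expected-value sketch (mine, not Bennett's); the $100 ticket is an assumption for illustration:

    # toy sketch: how a one-point difference in subjective probability
    # could surface as a betting disposition
    def max_stake(credence, payout=100.0):
        # the most a believer should pay for a ticket worth `payout`
        # if the proposition turns out true
        return credence * payout

    print(max_stake(0.89))  # 89.0 -- walks away from a ticket priced at $90
    print(max_stake(0.90))  # 90.0 -- just barely willing to pay $90

The one-point difference in credence surfaces as a one-dollar difference in the maximum acceptable stake -- a behavioral trace of the 89%/90% distinction.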
(The precise dispositions, including precise percentages, that a person has could not be scientifically verified unless all of the unrealized possibilities were realized, which is of course impossible. Yet that's not fatal to the position, since not all truths are scientifically verifiable.)
Now for the climax.
The point of all this is that if beliefs are ascriptions of probability towards particular propositions, and if such ascriptions are dispositions to act in various ways given certain inputs and desires, then there can be a fact of the matter regarding the overall % probability of each belief relative to its corresponding proposition, even if one does not calculate (or claim) percentages at the conscious level.
Thus if a person were to change a belief, she may not even know it!
posted by Eiwalker at 8:49 PM on July 11, 2006
Best answer: It's good that Bennett takes beliefs to be dispositions, but the point about indeterminacy still stands. How are you supposed to know which behavioral dispositions are the ones that compose a belief? Which complex of dispositions is an 89% belief that the roulette ball will fall on red, and which is a 90% belief that it will fall on red? Saying it's whichever one would bet more often, all other things being equal, is a bit handwavy (ceteris paribus clauses always are).
Say that you encounter a smart, educated guy with a gambling problem. If you talk to him outside of a casino, he'll wholeheartedly agree that the gambler's fallacy is a fallacy. But in front of the roulette table, he can't help but keep riding once he's on a streak. Emotionally, he feels lucky, and he can't help but let that sentiment take over and dictate his behavior -- when you tell him to listen to reason, he pushes you away. Does this man actually believe in hot streaks? There's no strict answer. Now compare him to the guy who's never heard of the gambler's fallacy and plays roulette only very occasionally. He'll say that he thinks people get lucky from time to time, but doesn't bet as much or as often as the guy with the gambling problem, simply because he's not that invested in the game. Does this guy believe in hot streaks less than the other guy? How many percentage points less? Invoking ceteris paribus laws won't help because there are too many conditions that are not equal.
If we stipulate which functional states are associated with particular beliefs, then we might be able to assign percentages and do other things like that. I haven't read the Bennett paper, but it's common for epistemologists to think that this stipulation has already been made, or that it's trivial, or that we can get around it by calling upon a language of thought hypothesis. But the stipulations are never actually explicitly made (obviously, for the functional states would be inscrutably complex), and there's enough indeterminacy in our use of the term 'belief' to make the issue far, far from trivial. This is a good example of what Wittgenstein would have seen as a puzzle that can be dissolved through an analysis of language. I maintain that under certain common uses of 'belief', we could say that beliefs can be willed, and under other common uses, they can't be.
posted by painquale at 11:38 PM on July 11, 2006
Eiwalker emailed me and I explained why I was surprised this thread is still standing: I'm surprised Matt didn't consider it chatfilter. I'm glad he didn't.
posted by grumblebee at 4:49 AM on July 12, 2006
Response by poster: It's good that Bennett takes beliefs to be dispositions, but the point about indeterminacy still stands. How are you supposed to know which behavioral dispositions are the ones that compose a belief?
Never mind that. A bigger problem is that the list would be infinite--hence probabilities would be undefined.
So, if beliefs are dispositions, then it cannot be that one of their necessary conditions is that they ascribe specific probabilities. So, if beliefs are dispositions, then the answer to my question is, No: believing at will in the way we raise our arms at will would be impossible.
Yet if we say that beliefs are not dispositions but rather are private, subjective phenomena, there seems no way for specific probabilities to be established, since, as many have pointed out, we often have beliefs without thinking of specific probabilities. So, beliefs can't be private, subjective phenomena if one of their necessary conditions is that they ascribe specific probabilities. So, if beliefs are private, subjective phenomena, then the answer to my question is, No: believing at will in the way we raise our arms at will would be impossible.
So, either way, whether beliefs are dispositions or private, subjective phenomena, the answer to my question is, No.
posted by Eiwalker at 6:48 AM on July 12, 2006
I think Vionnett (http://ask.metafilter.com/mefi/41795#642859) nailed it. It depends on whether you believe in Free Will or Determinism. If you believe in free will at all, then beliefs are a choice. Many are not aware that belief is a choice. This is unfortunate, because awareness affords one the opportunity to be proactive and choose the belief that is most beneficial.
In general I use the scientific method and faith in objective reality to inform my choices of belief, but when these tools do not result in a clear decision, I choose that which is most useful to me, e.g., "Never attribute to malice that which can be adequately explained by stupidity."
posted by Manjusri at 5:16 PM on November 1, 2006
This thread is closed to new comments.