Like any self-respecting aspiring rationalist, I try to think about the answers to questions in terms of subjective probability rather than a binary ‘yes/no’. So if you ask me who is going to be the next president of the USA, if I’m trying my best to be honest about it I’d say “X with 40% probability, Y with 25% and someone else with 35%” (or something). Does that mean I “believe X is going to be the next president”? Formally I hold with 60% probability that X will not be the next president, but I think X has the best chance of any particular candidate. (There’s a Most Of vs The Most shift to be wary of here.)
I think if you ask most people what “I believe S is true” means, they would say something like “it means I think S is more likely than not”, i.e. “>50% Bayesian probability.”1 But if this were true, then for many (most?) ideas or facts, we would not hold any belief. In the example above, you wouldn’t be able to say “I believe X will be president” because you only give this statement 40% probability. People do, in fact, seem to believe things for which they could not have >50% credence.
So maybe people are confused2, and what they really mean is “S is more likely than any other specific option”, i.e. the most probability rather than most of the probability. In this case you could believe “X will be the next president” even though you only give this option 40% credence. But this doesn’t seem to match the common parlance either. Imagine a race with hundreds of racers, each roughly equal in skill (as far as you know), so you would give even odds to every racer (and hold no belief about who will win, if belief means what we have assumed). But you are then told that racer R had an exceptionally good practice yesterday. With no other information, you should update from even odds in favor of R being the winner, and therefore (by the definition above) you now believe that R will win.
But… do you really believe R will win? Yes, if you were forced to bet you’d (rationally) bet on R, but does the use of that word feel right to you?
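To make the oddness concrete, here is a minimal sketch of the racer update in code. The numbers are invented for illustration: 100 racers, a uniform prior, and an assumed likelihood ratio of 5 for the evidence “R had an exceptionally good practice.”

```python
# 100 racers, uniform prior; R's good practice is modeled with an
# assumed likelihood ratio of 5 (a made-up number for illustration).
n = 100
prior = [1 / n] * n          # even odds for every racer
likelihood = [1.0] * n
likelihood[0] = 5.0          # racer R is index 0

# Bayes: posterior ∝ prior × likelihood, then normalize.
unnorm = [p * l for p, l in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]

print(round(posterior[0], 3))  # → 0.048
```

R now has the most probability of any single racer, yet only about 5%, which is exactly the gap between “I’d bet on R” and “I believe R will win.”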
The disconnect here is, I think, that when people use the word “believe” they don’t actually mean anything related to probabilities. What they really mean is:
“when I think about whether S is true, there’s a kind of warm feeling in my heart.”
I maintain that this warm feeling is really what is referred to by the word “belief.” Sometimes the belief (warm feeling) corresponds to a high subjective probability, and through effort you can increase the correlation, but they are distinct, and much confusion is downstream of a failure to distinguish the two.
This is obvious for the median human. Most people haven’t thought about subjective probability and how to update in light of new information, so how could they think about belief in those terms? But I think this is true more broadly. If you ask a well-calibrated Bayesian, they will still say “I believe S” even before doing any Bayesian calculation; I maintain that they, too, are referring to a warm feeling in their heart. (What, after all, do you think a prior is?)
The difference is that, if you ask the Bayesian for more information, they know how to do the calculation and give you a probability. And once you have the probability in hand, does it make sense anymore to talk about the belief?
Can you derive the probability (or even a likely range) from “I believe S”?
Does “I believe S” add any information at all once you already say “I hold S to be true with 40% probability”?
The answer to all of these questions is “no.” The probability is the whole game; the belief adds nothing and, in many cases, is inadvertently directing you to make mistakes.
Beliefs are useless. That warm feeling in your heart is all it means. Stick with probabilities.
\\ Some other thoughts \\
I’m not a philosopher, but I did a minor in college and was struck in epistemology classes by how long we studied the idea of ‘knowledge’ but how little we studied ‘belief.’ It was just understood what a belief was, or at least that’s how it seemed to me; belief was an axiom, not an object of analysis.3 It was always assumed that everyone kind of knew what we meant by “Agent X believes proposition P.”
An objection I anticipate: yes, perhaps belief is in essence just a warm feeling in your heart, but that warm feeling is not useless—that is your intuition! Intuition is how your unconscious mind bubbles up truths of which you are not consciously aware! To this I say: maybe, but again I ask you, how confident are you that this warm feeling correlates well with the truth? Have you tested this? I’m open to the idea of intuitions as useful inputs, but I remain skeptical that they have some magic power that can’t be ascertained by Bayesian analysis. (See here for a good summary of what makes more sense to me regarding intuitions.)
Finally, there are other ways people seem to use the word “belief”, including as an identity marker (“my belief in God is a central part of my life” or “believe all women”), or an affirmation (“I believe in you” or “I believe in Obama’s message”). These uses make much more sense once you realize that we are just referring to a warm feeling in your heart. “I believe in you” means “I hold you in high regard” or “I trust you to do your best”, not “there is a high probability that you will succeed in this particular task.”
1. Most won’t actually say that last part; I’m translating from the common tongue.
2. Always a safe bet!
3. I am confident there are philosophy texts on ‘belief’ as an object of analysis; you don’t have to yell at me. I’m just saying that there is a huge discrepancy between effort exerted toward the study of ‘belief’ vs the study of ‘knowledge.’ I would, however, be very interested to be pointed toward such sources discussing ‘belief’!