How are beliefs different from actions?


In our daily lives, it is common for us to think and talk about actions in normative terms: I should do this, I'd better not do that, for different reasons depending on the situation. Actions are amenable to such regulating, normative attitudes because we can choose how to act; we have control over our actions. In other words, our actions are subject to our exercise of agency. Legal systems, for instance, are built on the assumption that we can be held responsible for our actions (normativity) because we can determine our actions (agency). This reflects a fundamental commitment: 'ought' implies 'can', and normativity cannot apply without agency.

However, we use normative terms when referring to beliefs as well. Are my beliefs justified? Should I believe this information or dismiss it as unwarranted? Just think of all the fake news and conspiracy theories. While we regularly evaluate beliefs against norms, just as we do with actions, we do not have the same sort of control over our beliefs as we do over our actions. In this respect, beliefs are not readily understandable as actions. Yet we evaluate them nonetheless. So, if we follow our intuition that 'ought' implies 'can', the question becomes: how is our control over our beliefs different from the control we have over our actions? A much bigger question (something for another article) concerns the nature of that control itself: what is it like, where is it 'located', and how do our normative expectations attach to it? For now, let's see how beliefs differ from actions.

Difference in control over beliefs and actions

To warrant normative evaluation of our beliefs, we must have enough control over them. This seems analogous to actions. On closer inspection, however, beliefs differ from actions in a significant way: I can will myself to act, given the right incentive, but I cannot will myself to believe something, no matter the incentive. Imagine that I offer you 1,000 EUR if you go and switch on the lights. Easy, right? You just get up and do it. Now imagine I offer you 10,000 EUR if you truly come to believe that the Moon is made of cheese. Not so easy anymore. No matter how large the external incentive is, you will find it close to impossible to will yourself to believe something the way you can will yourself to do something. Moreover, I can force myself to act without holding a corresponding belief, which indicates that my beliefs are not equally responsive to the force of will.

Self-determined action is in itself an exercise of our agency. Belief is not like that. Intuitively, belief seems distinct from the processes resulting in it or influencing it. This raises the question of epistemic agency. Can I choose to believe something? To what extent can I determine my beliefs? 

If we want to maintain our commitment to having enough control over our beliefs (and to the related normative evaluation of them), the action analogy does not help. For example, Ernest Sosa's performance view treats beliefs as performances that can be normatively evaluated much like an archer's or hunter's shot, based on how well they achieve their aims, where the "aim of belief is said to be truth" (2009, 6). It is not necessarily the case that beliefs aim at truth, though (more on this later). However, the analogy with an archer's or hunter's shooting performance is not properly applicable to beliefs: I cannot decide whether or what to believe in the same way I can decide whether, when and in which direction to shoot.

Examples from empirical research

One of the most common views about when and how we control our beliefs can be called the process view. In short, the idea is that we exercise our agency in the process of forming a belief when we engage in deliberating, judging, assessing, and so on. No doubt, sometimes we perform all of these cognitive activities. But not as often as we might wish. Conscious, focused reflection is a resource-intensive activity. It would not have been a very adaptive strategy if we always had to engage in that sort of critical thought before making up our minds and deciding how to act. Empirical research suggests that we rely on a whole set of mental shortcuts in our daily lives. Shortcuts save resources and work well enough, most of the time.

In their 2021 article 'The science of belief: A progress report', Nicolas Porot and Eric Mandelbaum summarise the evidence from the empirical study of beliefs. One of the findings is that our belief-forming processes are susceptible to the effects of cognitive load. For example, I was recently watching a comedy show in which two participants tell each other stories from their lives, and each has to guess whether the opponent's story is true or a lie; afterwards, the speaker reveals the truth. I remember one story that particularly surprised me. It was a very interesting story, well told, with a lot of details, twists, suspense and jokes. It grabbed my attention and held it throughout. I could retell the story with little effort. However, if you asked me whether it was true or a lie, I couldn't tell you. I can now easily misremember it as being true; after all, I recall it so well and it sounded so convincing.

That, however, is how cognitive load works (along with the influence of various cognitive biases). Paying attention to all the details, following all the twists, being surprised at unexpected turns in the plot and staying focused in this cognitively absorbing experience engaged a great many cognitive resources: a considerable cognitive load. That left fewer resources available for the effortful process of deliberately assessing and rejecting the story as a lie. In the absence of cognitive load, I would have been more likely to remember correctly whether the story was true or not. So, besides being subject to various cognitive biases, our belief-forming processes can be short-circuited by the presence of cognitive load.

Another interesting empirical finding concerns how we change or update our beliefs. It appears we use different methods for this. For many of our everyday beliefs, the norm may be to maintain a certain level of coherence in the belief system when new evidence comes up (here, rational updating can be involved). But the beliefs that form the core of our identities are subject to different norms, dictated by our psychological immune system: “The psychological immune system functions to protect our most core beliefs, the ones that make up our sense of who we are (such as the beliefs that one is a good person, a smart person, and a dependable person). Believing conclusions that challenge one’s core beliefs puts one in a state of psychological distress. The psychological immune system remedies this by post-hoc rationalizing those conclusions away” (2021, 7).

So, when a belief I self-identify with is challenged by new information that contradicts it, I am more likely to treat that information as a psychological threat to be avoided (i.e., rationalised away) rather than as new evidence that warrants reassessing my belief. Therefore, as far as our core beliefs are concerned, maintaining a state of psychological balance around an individual baseline may matter more than attaining the truth.
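As a point of contrast, it may help to spell out what the 'rational updating' mentioned above could look like in its most idealised form. One standard (and admittedly simplified) model, which is my illustration here rather than anything Porot and Mandelbaum commit to, is Bayesian conditionalisation: my confidence in a hypothesis H is revised in the light of new evidence E according to

P(H | E) = P(E | H) × P(H) / P(E)

Suppose, for instance, that I start out fairly confident the story from the comedy show was a lie, say P(lie) = 0.7, and I then notice how convincingly it is told, where a convincing delivery is twice as likely for a true story as for a lie. The rule tells me to lower my confidence that it was a lie to roughly 0.54, rather than to explain the convincing delivery away. The findings above suggest that, for identity-defining beliefs in particular, we often do something closer to the opposite.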

Conclusion

Where does this leave us? For one thing, it is clear that the sort of control we have over our beliefs is not the same as the sort of control we have over our actions. It is far easier to force someone to do something than to believe something. And yet we treat beliefs as something we can evaluate, something we can judge as poorly or well justified, reasonable or complete nonsense. All this normativity must attach itself to something. If there is a 'should', there needs to be a 'can'. This brings us back to the bigger question about the nature of the control we have over our beliefs.

keep exploring! 

Resources used:

Porot, N. and Mandelbaum, E. (2021) 'The science of belief: A progress report', WIREs Cognitive Science, 12: e1539.

Sosa, E. (2009) 'Knowing Full Well: The Normativity of Beliefs as Performances', Philosophical Studies, 142(1), pp. 5–15.
