Cognitive biases prevent science from working properly.
by Sabine Hossenfelder
Today I want to talk about a topic that is much, much more important than anything I have previously talked about. And that’s how cognitive biases prevent science from working properly.
Cognitive biases have received some attention in recent years, thanks to books like “Thinking Fast and Slow,” “You Are Not So Smart,” or “Blind Spot.” Unfortunately, this knowledge has not been put into action in scientific research. Scientists do correct for biases in statistical analysis of data and they do correct for biases in their measurement devices, but they still do not correct for biases in the most important apparatus that they use: Their own brain.
Before I tell you what problems this creates, a brief reminder of what a cognitive bias is. A cognitive bias is a thinking shortcut that the human brain uses to make faster decisions.
Cognitive biases work much like optical illusions. Take this example of an optical illusion. If your brain works normally, then the square labelled A looks much darker than the square labelled B.
[Example of an optical illusion. Image: Wikipedia]
But if you compare the actual color of the pixels, you see that these squares have exactly the same color.
[Example of an optical illusion. Image: Wikipedia]
The reason that we intuitively misjudge the color of these squares is that the image suggests it is really showing a three-dimensional scene where part of the floor is covered by a shadow. Your brain factors in the shadow and calculates back to the original color, correctly telling you that the actual color of square B must have been lighter than that of square A.
So, if someone asked you to judge the color in a natural scene, your answer would be correct. But if your task was to evaluate the color of pixels on the screen, you would give a wrong answer – unless you know of your bias and therefore do not rely on your intuition.
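You do not have to take the pixel claim on faith, by the way. A minimal sketch along the following lines lets you check it yourself, assuming you have downloaded the illusion image; the filename and the coordinates of points inside squares A and B are placeholders that you would need to adjust for your copy of the image.

```python
# Minimal sketch (not part of the original post): check that a pixel inside
# square A and a pixel inside square B of the checker-shadow illusion have
# the same RGB value. Filename and coordinates are assumptions.
from PIL import Image

img = Image.open("checker_shadow_illusion.png").convert("RGB")

a_xy = (130, 230)  # hypothetical point inside square A
b_xy = (180, 150)  # hypothetical point inside square B

color_a = img.getpixel(a_xy)
color_b = img.getpixel(b_xy)

print("Square A pixel:", color_a)
print("Square B pixel:", color_b)
print("Same color?", color_a == color_b)
```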
Cognitive biases work the same way, and they can be prevented the same way: by not relying on intuition. Cognitive biases are corrections that your brain applies to input to make your life easier. We all have them, and in everyday life they are usually beneficial.
Maybe the best-known cognitive bias is attentional bias. It means that the more often you hear about something, the more important you think it is. This normally makes a lot of sense. Say, if many people you meet are talking about the flu, chances are the flu’s making the rounds and you are well-advised to pay attention to what they’re saying and get a flu shot.
But attentional bias can draw your attention to false or irrelevant information, for example if the prevalence of a message is artificially amplified by social media, causing you to misjudge its relevance for your own life. A case where this frequently happens is terrorism. It receives a lot of media coverage and has people hugely worried, but if you look at the numbers, for most of us terrorism is very unlikely to directly affect our lives.
And this attentional bias also affects scientific judgement. If a research topic receives a lot of media coverage, or scientists hear a lot about it from their colleagues, those researchers who do not correct for attentional bias are likely to overrate the scientific relevance of the topic.
There are many other biases that affect scientific research. Take for example the sunk cost fallacy, more commonly known as “throwing good money after bad”. It means that once we have invested time or money into something, we are reluctant to let go of it and continue to invest in it even if it no longer makes sense, because getting out would mean admitting to ourselves that we made a mistake. The sunk cost fallacy is one of the reasons scientists continue to work on research agendas that have long stopped being promising.
But the most problematic cognitive bias in science is social reinforcement, also known as groupthink. This is what happens in almost closed, like-minded communities where people reassure each other that they are doing the right thing. They will develop a common narrative that is overly optimistic about their own research, and they will dismiss opinions from people outside their own community. Groupthink makes it basically impossible for researchers to identify their own mistakes and therefore stands in the way of the self-correction that is so essential for science.
A bias closely linked to social reinforcement is the shared information bias. It means that we are more likely to pay attention to information that is shared by many people we know than to information held by only a few people. You can see right away why this is problematic for science: how many people know of a certain fact tells you nothing about whether that fact is correct. Whether some information is widely shared should not be a factor in evaluating its correctness.
Now, there are lots of studies showing that we all have these cognitive biases and that intelligence does not make it less likely to have them. It should be obvious, then, that we should organize scientific research so that scientists can avoid, or at least alleviate, their biases. Unfortunately, the way that research is currently organized has exactly the opposite effect: It makes cognitive biases worse.
For example, it is presently very difficult for a scientist to change their research topic, because getting a research grant requires that you document expertise. Likewise, no one will hire you to work on a topic you do not already have experience with.
Superficially this seems like a good strategy for investing money into science, because you reward people for bringing expertise. But if you think about the long-term consequences, it is a bad investment strategy: now researchers not only face a psychological hurdle to leaving behind a topic they have invested time in, they would also cause themselves financial trouble by doing so. As a consequence, researchers are basically forced to continue claiming that their research direction is promising and to continue working on topics that lead nowhere.
Another problem with the current organization of research is that it rewards scientists for exaggerating how exciting their research is and for working on popular topics, which makes social reinforcement worse and adds to the shared information bias.
I know this all sounds very negative, but there is good news too: Once you are aware that these cognitive biases exist and you know the problems that they can cause, it is easy to think of ways to work against them.
For example, researchers should be encouraged to change topics rather than basically being forced to continue what they’re already doing. Also, researchers should always list the shortcomings of their research, in lectures and papers, so that the shortcomings stay in the collective consciousness. Similarly, conferences should always have speakers from competing programs, and scientists should be encouraged to offer criticism of their community and not be shunned for it. These are all little improvements that every scientist can make individually, and once you start thinking about it, it’s not hard to come up with further ideas.
And always keep in mind: Cognitive biases, like seeing optical illusions, are a sign of a normally functioning brain. We all have them; it’s nothing to be ashamed of, but it is something that affects our objective evaluation of reality.
The reason this is so, so important to me is that science drives innovation, and if science does not work properly, progress in our societies will slow down. But cognitive bias in science is a problem we can solve, and that we should solve. Now you know how.