Myside bias: why is it so hard to change other people’s opinions?

Sahar Raheem
7 min read · Jun 29, 2020

Have you ever wondered why our debates are becoming more and more polarised? Why is it nearly impossible for two people with different opinions to actually end a discussion with agreement?

One of the reasons for this is something called “myside bias”, or confirmation bias: the tendency to search for, interpret, favour and recall information that confirms or supports one’s prior beliefs or values. Basically, it is an inherent shortcoming in the way our brain processes information that leads us to make biased decisions or adopt biased views and opinions, even when there is strong evidence against them.

You can imagine how this bias harms our society. Think of the number of times you have spoken to someone about an issue, pointing them towards all sorts of facts and research results, only to end up driving them further away. Confirmation bias has been researched in psychology for years, mostly in the field of decision-making science, but here I want to talk about its social and political implications.

Why is confirmation bias dangerous?

To understand the effect of this bias, we will explore some of the ways it works.

1- Biased search for information: researchers have observed that people tend to search for information that further proves their hypothesis, rather than actually looking at all the evidence out there. An example of this is what is known as positive testing. Researchers have found that when people were asked the question “how happy are you with your social life”, they reported feeling more happiness than when they were asked “how unhappy are you with your social life”. In the real world, this may mean that if you are pro-life, for example, you are more likely to search for information that makes you even more pro-life and confirms your initial hypothesis, rather than to seek out and critically evaluate the evidence from both sides. What’s interesting is that this process interacts with personality traits. For example, it has been found that more confident people readily seek out information that contradicts their personal beliefs, while less confident people are less inclined to do so.

2- Biased interpretation of information: this can be observed when two individuals have the same information about a topic but end up with opposing views, showing that they can interpret that information very differently. A good example is an experiment conducted at Stanford University, in which researchers recruited participants who had strong opinions regarding capital punishment, half being for and the other half against it. The participants were given short descriptions of two studies to read: one comparing US states with and without capital punishment, and another comparing murder rates in a state before and after the introduction of the death penalty. They were subsequently given more detailed information on the two studies and asked to evaluate their research methodologies. Both groups, for and against, reported shifting their attitudes towards the first study they read, but when they read the detailed accounts, almost all returned to their original beliefs regardless of the evidence, seizing on details that supported their views while disregarding those to the contrary. The study showed that people hold evidence for hypotheses that go against their views to a higher standard. This is also known as disconfirmation bias. Another interesting study, carried out during the 2004 US elections, involved participants who felt strongly about a presidential candidate. They were shown contradictory statements from George W. Bush, John Kerry or a politically neutral third figure, and were asked to comment on the consistency of each candidate’s statements. The evaluations differed sharply, with participants far more likely to label statements from the candidate they opposed as contradictory.
If anything, this shows how this inherent bias in our ability to judge information objectively can skew our perception and interfere with our decision making, even when so much is at stake. In the second example, participants made their judgments while in an fMRI scanner, which showed that the emotional parts of the brain became active when participants looked at statements from the candidate they supported. This did not happen when they were evaluating statements from opposing candidates.

Being smart won’t save you from this bias either. In another study, participants took an SAT test to assess their intelligence and were then shown information about vehicle safety concerns, with the origin of the cars manipulated. The participants, who were American, were asked whether dangerous German cars should be allowed on American streets, and vice versa for American cars on German streets. Participants believed the dangerous German cars should be banned more quickly than their American counterparts in Germany, and no differences in responses could be linked to participants’ levels of intelligence.

3- Biased memory: even if people gather and interpret evidence neutrally, their memory of it may still be biased, in that they more readily recall evidence that reinforces their beliefs, also known as selective recall. Psychologists have theorised that information matching existing expectations is more easily stored and recalled than information that does not match them.

The implications of these kinds of biases are enormous. ‘Myside’ or confirmation bias can make us unable to effectively and logically evaluate the other side of the argument.

Confirmation bias has serious real-life implications. The most obvious of these is in how social media algorithms work. These “filter bubbles” (algorithmic editing) show individuals only information they are likely to agree with, while excluding contradictory information. Many people have argued that such algorithms undermine democracy: they hinder access to diverse information and points of view, and unless these filter bubbles are removed, voters will be unable to make informed political decisions. Another dangerous outcome is the radicalisation that many social media platforms, including Facebook and YouTube, have been accused of facilitating. Scholar Zeynep Tufekci describes the phenomenon as follows: “Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons. It seems as if you are never ‘hard core’ enough for YouTube’s recommendation algorithm.” The algorithm exploits confirmation bias, recommending content that further reinforces an individual’s initial viewpoints or interests.
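The feedback loop behind a filter bubble can be sketched in a few lines of code. The following is a purely illustrative toy model, not any real platform’s algorithm: the item “stances”, the ranking rule and the user’s behaviour are all my own assumptions, chosen only to show how ranking by past agreement narrows what a user ever sees.

```python
import random

random.seed(42)

# Toy model: each item carries a political "stance" in [-1, 1].
items = [random.uniform(-1, 1) for _ in range(500)]

def feed(history, k=10):
    """Rank items by closeness to the user's average past engagement."""
    centre = sum(history) / len(history)
    return sorted(items, key=lambda s: abs(s - centre))[:k]

history = [0.3]  # the user starts out mildly on one side of the issue
for _ in range(10):
    history.append(random.choice(feed(history)))  # engage with the feed

# Even a generous 50-item feed now contains no opposing-stance items.
opposing = [s for s in feed(history, k=50) if s < 0]
print(f"opposing items shown: {len(opposing)}")
```

The ranking never pushes the user anywhere; it simply shows more of what they already engaged with, yet the opposing half of the spectrum disappears from view entirely, which is exactly the exclusion of contradictory information described above.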

Confirmation bias can have detrimental effects on our political and judicial systems as well. Take juries, for example, who often make judgments based on very complex evidence; it is plausible to assume that these judgments may be biased. Jurors often come to a conclusion early on and, due to confirmation bias, any further evidence presented, whether for or against, may only make their opinions more extreme, a phenomenon known as “attitude polarisation”. Furthermore, confirmation bias can create and aggravate conflicts, especially in emotionally charged debates such as the one over abortion: each side, when presented with evidence, is likely to grow more confident and more radical in its original opinion.

In a criminal investigation, for example, a police officer may identify a suspect early on, then seek only the evidence that confirms that hypothesis and incriminates the suspect.

Can we overcome “myside bias”?

We can definitely try!

Intelligence level has not been found to make a difference, but other individual differences might, including deductive reasoning ability and the ability to overcome belief bias. To illustrate, one study investigated whether participants’ views of what makes a good argument can be a source of confirmation bias. Participants were randomly assigned to one side of an argument and asked to write an essay defending it, with explicit instructions to make their arguments balanced, giving proper consideration to both sides and including pros and cons. The study found that these balanced-research instructions significantly increased the incidence of opposing information in the essays. In other words, people’s beliefs about what makes good thinking can influence how they generate arguments.

As the results of that study suggest, giving people explicit instructions to balance the evidence and consider opposing views helps them reach less biased conclusions. This is like writing down the pros and cons of something before making a decision about it. Another study, by Vydiswaran et al. (2015), concluded that showing contrasting views of a topic reduced bias, as did providing people with credibility ratings for the sources of those views.

But what’s most important is educating people about the extent to which their ability to make judgments may be undermined. Awareness of such biases in our thinking may, in and of itself, help us actively identify and avoid them. It also points to the necessity of giving a voice and a platform to everyone, even those we disagree with, because we should constantly try to expose ourselves to the other side of the conversation, even if it proves difficult or uncomfortable.


Sahar Raheem

A curious cat and a critical eye. MD, MSc Reproductive and Sexual Health Research.