Myside bias: what it is and how it distorts our perception of things
Have you ever wondered why debates are becoming more and more polarized? Why, when two people argue, it is almost impossible to reach an agreement? How is it possible that, even when faced with strong evidence to the contrary, people defend their opinions so aggressively?
No matter how rational we consider ourselves, human beings seem to have a natural tendency to seek, interpret, favor and remember information that supports our prior beliefs and values, regardless of whether there are facts that contradict them.
This natural tendency has a name: myside bias. Below we delve into this widespread and potentially harmful psychological phenomenon, and into the research that has shed some light on how it occurs.
- Related article: "Cognitive biases: discovering an interesting psychological effect"
What is myside bias?
Often, when we talk to someone about a topic, we explain what we think and what the "facts" are. We lay out all the evidence we have found in all kinds of "reliable" sources. We know the other person holds an opinion contrary to ours, and we trust that, after presenting them with this evidence, they will change their mind, but that simply does not happen. They are not deaf, nor have they ignored us; what has happened is that, because what we told them contradicts what they think, they have dismissed our "facts", concluding that we are the ones who are misinformed.
Myside bias is a psychological phenomenon that gives us a tendency to seek, interpret, favor and remember information that supports or confirms our prior beliefs and values, while ignoring or downplaying evidence that contradicts what we believe in. Essentially, this bias is an inherent flaw in the way our brain processes information, one that leads us to make biased decisions and adopt mistaken points of view and opinions.
Although all human beings fall victim to this bias, it is considered potentially dangerous, in the sense that it makes us practically blind to any information that, no matter how true it may be, we will dismiss as false or unrigorous if it runs contrary to what we think. In fact, some theorists who study this pattern of thought, such as Keith E. Stanovich, consider it largely responsible for the idea of post-truth: we only see what we want to see.
Implications of this cognitive bias
Over the last few decades, Stanovich, together with other cognitive researchers such as Richard F. West and Maggie E. Toplak, has studied this bias experimentally. One of its main implications is that human beings tend to seek information that strengthens our opinions, omitting or discarding any data that, no matter how true and demonstrable it may be, we consider less rigorous. We look for information that supports our hypotheses instead of looking for all the evidence, both confirming and refuting.
In fact, this is fairly easy to see in how people behave on almost any subject they want to inform themselves about. For example, a person who is pro-life, that is, against abortion, will be more prone to seek information that proves them right and, what is more, may even become more strongly opposed to abortion. They will rarely seek out information explaining why abortion should be a universal right, or whether the fetus does or does not feel anything after a few weeks, and if they do, they will read that content from a very skeptical and superficial perspective.
Curiously, the tendency to look for information on both sides of a debate, that is, to seek out data both favorable and unfavorable to the opinion one already holds, seems to be related to personality traits rather than intelligence. In fact, some research suggests that more confident people tend to look for data that could confirm or refute both sides of the debate, while more insecure people seek only what reinforces their beliefs.
Another clear implication of this bias is that the same information is interpreted differently depending on our underlying beliefs. If two individuals are given exactly the same information on a subject, they will most likely end up with totally or partially opposed points of view, since even though the message is identical, the interpretation each makes of it will not be, and their way of seeing it will be biased in a personal way.
- You may be interested in: "Are we rational or emotional beings?"
The death penalty experiment
We have a good example of this in an experiment conducted at Stanford University, in which researchers recruited participants who already held strongly divided opinions on the same issue: being for or against the death penalty. Each participant was given brief descriptions of two studies, one comparing US states with and without capital punishment, and the other comparing the murder rate in a state before and after it introduced the death penalty.
After this description, they were given more detailed information on both studies and asked to rate how reliable they believed the research methods in each one were. In both groups, those in favor of the death penalty and those against it, participants reported that their attitudes had shifted somewhat at the beginning of the study, when they were given the brief descriptions; but once they were given more details, most reverted to their previous beliefs, despite having evidence that lent solidity to both studies. They were more critical of the sources that contradicted their opinion.
German cars and American cars
Another study showed that intelligence does not protect us from myside bias. In this case, participants' intelligence was measured before they were given information about an issue on which they had to express their opinion. The issue in question concerned cars that might pose safety problems. The participants, all of them American, were asked whether they would allow German cars with safety problems to drive on the streets of the United States. They were also asked the reverse question: whether they thought defective American cars should be allowed to drive in Germany.
Participants told about German cars with safety issues said they should be banned in the United States for posing a danger to the country's road safety. In contrast, those told about their American counterparts said those cars should be allowed to circulate in Germany. In other words, they were more critical of the safety of German cars because they were German and drove in their own country, and more lax with American cars because they were American and drove abroad. Intelligence did not reduce the likelihood of myside bias occurring.
Memory and myside bias
Although people try to interpret data as neutrally as possible, our memory, which is biased by our own beliefs, acts to favor recall of whatever supports our point of view; in other words, we have selective memory. Psychologists have theorized that information that fits our existing expectations is more easily stored and remembered than information that contradicts them. That is to say, we memorize and remember better what proves us right and we forget more easily what goes against us.
How does this relate to social media?
Having seen all this, we can grasp the seriousness of the implications of myside bias when receiving and interpreting any information. This bias makes us unable to evaluate effectively and logically the arguments and evidence given to us, no matter how solid they may be. We can believe something doubtful more strongly simply because it is on "our side", and be very critical of something that, despite being very well demonstrated, we do not see as rigorous and reliable because it is "against us".
But of all the implications this entails, there is one that relates directly to social networks, and especially to their algorithms. These platforms, through cookies and our search history, present us with content related to things we have already viewed. For example, if we search for images of kittens on Instagram, more photos of these animals will begin to appear in the Explore (magnifying glass) section.
What do these algorithms have to do with myside bias? A lot, since on social networks we do not only look for images of animals or food, but also opinions and "facts" that confirm our pre-established opinions. So, if we search for a vegetarianism blog, many related resources will appear in the search section, from politically neutral content such as vegetarian recipes to blog posts, images and other material that denounces animal cruelty and criticizes meat-eaters.
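To make this feedback loop concrete, here is a minimal sketch in Python of a hypothetical history-based recommender. It is not the algorithm of any real platform; every name, item and tag in it is invented for illustration. It simply ranks unseen items by how many topic tags they share with what has already been viewed, which is enough to show why content from "the other side" stops appearing.

```python
# Purely illustrative, history-based recommender (hypothetical names and data;
# real platform algorithms are far more complex). It ranks items by how many
# topic tags they share with what the user has already viewed, so content with
# no overlap is never surfaced: a filter bubble in miniature.

from collections import Counter

CATALOG = {
    "vegetarian_recipes":      {"vegetarianism", "cooking"},
    "animal_rights_essay":     {"vegetarianism", "activism"},
    "vegan_activism_group":    {"veganism", "activism"},
    "steakhouse_review":       {"meat", "restaurants"},
    "omnivore_nutrition_post": {"meat", "science"},
}

def recommend(history, catalog, top_n=3):
    """Return unseen items ranked by tag overlap with the viewing history."""
    interest = Counter(tag for item in history for tag in catalog[item])
    scores = {
        name: sum(interest[tag] for tag in tags)
        for name, tags in catalog.items()
        if name not in history
    }
    # Items sharing no tags with the history are filtered out entirely.
    return sorted((name for name, score in scores.items() if score > 0),
                  key=scores.get, reverse=True)[:top_n]

history = ["vegetarian_recipes"]
print(recommend(history, CATALOG))      # ['animal_rights_essay']
history.append("animal_rights_essay")   # the user clicks the suggestion
print(recommend(history, CATALOG))      # ['vegan_activism_group']
# "steakhouse_review" and "omnivore_nutrition_post" never appear: each click
# reinforces the tags already in the history, narrowing what gets shown next.
```

Even this toy version shows the dynamic described above: the more we interact with one side of a topic, the less likely the opposing side is ever put in front of us.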
Bearing in mind that we are hardly going to seek out information contrary to our point of view, it is only a matter of time before our opinions become more radical. As the networks show us resources in favor of our point of view, we will progressively go deeper into the subject and, taking the example of vegetarianism, it is even likely that we will end up in vegan circles that advocate more forceful action against the meat industry.
Based on this, and especially as applied to political ideologies, many people argue that these algorithms are killing democracy. The reason is that the algorithm does not present us with all the available points of view on a topic, but rather with whatever favors our opinion, making us less inclined to compare options. Since we are not confronted with different "truths" and social networks keep us trapped in the comfort of our own point of view, we are, in effect, being manipulated.
For this reason, as an attempt to escape the trap of our own mind and the way social networks lock us ever more tightly into what we already think, it never hurts to seek out opinions contrary to our own. Yes, it is true that myside bias will make us tend to view them more critically and superficially, but at least the attempt can give us a bit of freedom of thought and opinion. Or, at the very least, we can delete our search history and not give the social network of the moment the chance to trap us in our own beliefs.
Bibliographic references:
- Macpherson, R. & Stanovich, K. E. (2007). Cognitive ability, thinking dispositions, and instructional set as predictors of critical thinking. Learning and Individual Differences, 17, 115–127.
- Stanovich, K. E. & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking & Reasoning, 13(3), 225–247.
- Stanovich, K. E. & West, R. F. (2008). On the failure of cognitive ability to predict myside and one-sided thinking biases. Thinking & Reasoning, 14(2), 129–167.
- Sternberg, R. J. (2001). Why schools should teach for wisdom: The balance theory of wisdom in educational settings. Educational Psychologist, 36, 227–245.
- Stanovich, K. E., West, R. F. & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22(4), 259–264. doi: 10.1177/0963721413480174
- Toplak, M. E. & Stanovich, K. E. (2003). Associations between myside bias on an informal reasoning task and amount of post-secondary education. Applied Cognitive Psychology, 17, 851–860.
- Lord, C. G., Ross, L. & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109. doi: 10.1037/0022-3514.37.11.2098