MUSE Magazine


THE BUILT-IN BIAS

With the click of a mouse, it is easy to feel informed: endless amounts of information are literally at our fingertips. The accessibility of that information is not up for debate; the issue is what we choose to search for and read. The way we frame our questions and decide what to read or ignore is based heavily on our biases. If we believe strongly in something, we are less likely to want to read about how we could be mistaken. I highly doubt many Trump supporters have ever searched “all the things Trump has done wrong” on Google, just as most of his critics have probably never looked up “what has Trump done right?”

People like being right and will therefore seek out media that makes them feel they are. If a headline coincides with someone’s existing views, they are more likely to select that media to consume, believe, and build further arguments on, even if they never read past the title. This confirmation bias is all too real, and it shows every time we ignore information because it contradicts our thinking. Worse still is opting for sources that agree with us simply because they agree with us. This absence of fact-checking is where the spread of misinformation can begin. By consuming media selectively, we actively choose to blind ourselves to opposing views. I think we can all agree that this can be detrimental; what is worse is that we are exposed to a built-in version of this confirmation bias every day and may never have realized it.

Many of us spend a lot of time on our phones, and most of that time on social media. Facebook, Twitter, TikTok, Instagram, and Snapchat keep us entertained and up to date on what is going on in the world. We use these platforms so often and so absentmindedly that we tend to forget they are not just for our benefit: they are businesses that profit from our continued use.
Algorithms are a large part of why people become addicted to their technology: they are the machinery behind how content is presented to us. Social media algorithms sort the posts on our feeds based on our behaviour, giving us what we want to see, all the time, to keep us online and connected as much as possible. We know this: it shows in our perfectly tailored For You Pages on TikTok that have us scrolling for hours, and in the running discussion of “what side of TikTok are you on?” When a post we don’t like pops up, we know a mistake has been made; the algorithm predicted wrong, and we ask, “how did I end up on this side of TikTok?” The problem is that the algorithms are usually right about what content will make us stay. That is fine for the big companies that benefit from our online presence by selling paid advertisements. For audiences continuously pushed favoured content, however, there are consequences. Aside from the increased screen time that our eyes and mental health do not appreciate, we can find ourselves hardly exposed to opposing views at all.

The Netflix documentary The Social Dilemma explains that the problems with being fed continuously personalized, biased information come down to the fact that we are all “simply operating on different sets of facts” (The Social Dilemma, 2020). When the same information is presented to people over and over again, it becomes hard to understand how anyone could believe anything different. Yet other people do, because they receive their own tailored media, with biased information meant for them and their beliefs (The Social Dilemma, 2020). We all have personal echo chambers on social media: we are always surrounded by the facts we want to see and by opinions that reflect and reinforce our own. This is the built-in confirmation bias, and it leaves us with less and less information about opposing stances or views.
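To make the mechanism concrete, here is a deliberately simplified toy sketch, not any platform’s real algorithm, of how an engagement-driven feed ranker produces an echo chamber: posts overlapping with topics the user has already engaged with are scored higher, so contrary material sinks out of view. All names here (`rank_feed`, the topic labels) are illustrative inventions.

```python
def rank_feed(posts, liked_topics):
    """Sort posts so topics the user already engages with appear first."""
    def score(post):
        # Reward overlap with the user's past likes; posts with no
        # overlap get pushed down, narrowing what the user actually sees.
        return sum(1 for topic in post["topics"] if topic in liked_topics)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": {"politics_left"}},
    {"id": 2, "topics": {"politics_right"}},
    {"id": 3, "topics": {"politics_left", "memes"}},
]
feed = rank_feed(posts, liked_topics={"politics_left", "memes"})
print([p["id"] for p in feed])  # prints [3, 1, 2]
```

Even in this three-line scoring rule, the post representing the opposing view (id 2) lands last; real recommender systems are vastly more sophisticated, but the feedback loop the documentary describes works on the same principle.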
When all you see is information reinforcing your beliefs, it seems evident that anything from outside that contradicts you must be wrong. The current political polarization in the US and the drastic differences in some people’s responses to COVID-19 are, unfortunately, prime examples. We need to be aware of confirmation bias, the problems it causes, and how algorithms come into play. To push back against the algorithms competing for our attention, we need to diversify our media selections and stay open to material that does not match our existing beliefs. Actively imposing our own meanings through bias and prior knowledge can make it difficult to take new opinions seriously; still, exposure to other perspectives is vital to understanding society fully. One way to expose ourselves to other opinions is to follow people with different political beliefs on social media. Doing so may open our minds, educate us on other views, and prompt us to reflect on our own. It can help us solidify our stances by making us examine why we believe what we do, or it may even sway us as we realize we are not as set in our beliefs as we thought. Either way, it will throw off the algorithms and help us avoid falling prey to confirmation bias.

References

The Social Dilemma. Directed by Jeff Orlowski, Exposure Labs, 2020. Film.
