Extreme Left Leaning Social Media Algorithms Skewing The Bias

Every social media platform today has a unique set of machine learning and technical elements, computer logic, and data analytics that make up the algorithms that deliver the most relevant content to its users.

These social media platforms spend millions of dollars building, testing, and refining these algorithms in the hope of providing users with the most relevant experience they can, ensuring that users see only the most engaging and interesting content based on a number of factors. In essence, they balance personal relevance with post timeliness.

These algorithms are managed by engineers, data scientists, analysts, content strategists, and more, so that nothing you see is an accident. Facebook’s algorithm, for instance, puts posts that spark conversation and meaningful interactions between people at the top of the feed. This means that machine learning determines which posts are most likely to get users to engage, comment, and react, and delivers those to users with the highest priority.

Instagram, on the other hand, once ranked posts by which ones it predicted you would care about most and delivered those first. But in March of 2018, Instagram moved back toward the time-based feed it originally used. This isn’t to say that Instagram stopped trying to deliver the content it deems most relevant; rather, it now puts more emphasis on timeliness so that users see the posts with the most direct relevance: things happening today.

Twitter’s algorithm weighs the interaction between users and the accounts they follow, placing the most importance on direct engagement. Across all the major social networks, the machine learning is dedicated to delivering what the platform judges relevant to its users.
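The trade-off described above, predicted engagement versus timeliness, can be pictured as a simple weighted score. This is a minimal illustrative sketch, not any platform’s actual formula: the weights, the half-life, and the `predicted_engagement` input (which in practice would come from a trained model) are all assumptions made up for the example.

```python
import time

def feed_score(predicted_engagement, posted_at, now=None,
               engagement_weight=0.5, recency_weight=0.5,
               half_life_hours=6.0):
    """Blend predicted engagement with recency to rank a post.

    predicted_engagement: a 0-1 estimate of how likely the user
        is to interact with the post (hypothetical model output)
    posted_at: Unix timestamp of when the post was published
    half_life_hours: how quickly a post's recency value halves
    """
    if now is None:
        now = time.time()
    age_hours = max(0.0, (now - posted_at) / 3600.0)
    # Exponential decay: a post loses half its recency value
    # every half_life_hours.
    recency = 0.5 ** (age_hours / half_life_hours)
    return engagement_weight * predicted_engagement + recency_weight * recency

# With these (assumed) weights, a fresh, moderately engaging post
# can outrank an older post the model scored much higher.
now = time.time()
fresh = feed_score(0.5, now - 1 * 3600, now=now)    # 1 hour old
stale = feed_score(0.9, now - 48 * 3600, now=now)   # 2 days old
```

Shifting the weights toward `recency_weight` mimics Instagram’s 2018 move toward timeliness, while shifting them toward `engagement_weight` mimics Facebook’s interaction-first ranking.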

The problem this presents is that social media platforms are now in control of said “relevance.” Rather than users deciding what content they want to see, for instance by choosing which accounts to follow or which friends to add, the software decides what we see. This allows the machine learning to essentially control our reality, or at least our perception of it.

For example, perhaps you are undecided in your opinion of Donald Trump. Maybe you feel too uninformed to be comfortable having an opinion on the current leader of the free world. These social media algorithms now have the ability to help shape your opinion of him. At first you may not believe this, but imagine that, as an undecided user, you are delivered only negative content about the POTUS. It would be almost impossible to remain neutral and form your own rational judgment of a man when all the content you are delivered puts a negative spin on him.

In a day and age where the majority of people get their news and knowledge from social media, it’s dangerous to allow these algorithms to determine the content we put in front of our eyes and at the front of our minds. Not that this is much different from mainstream and traditional media, but social media were designed to be personalized experiences. These algorithms make those experiences less personal, even though they were designed to do the opposite, and we allow computer processes that we do not fully understand to determine our experience.