We live in a strange time. Polarisation in societies has been on the rise, and it is getting to a point that really concerns me.
Technology, the internet, and social media were supposed to serve us well and improve our lives. All the innovation and achievements in this space are absolutely impressive and mind-boggling, especially considering that digital computing was not even invented until the 1930s.
Having said that, I do think that the two features below have significantly contributed to creating tunnel vision and a great divide in the United States, one that is now spreading across the globe. Social media has:
- Brought people with similar interests and way of thinking closer together;
- Provided users with personalised content for maximum efficiency and the best user experience.
For almost two decades, social media and recommender systems have solved quality-of-life, business, and social problems. I certainly enjoy the feed YouTube puts together for me based on the content I enjoy. I love the algorithm-curated computer science, electronics, security, hardware hacking, mathematics, true crime, gliding, and Hearthstone videos that are fresh out of the oven and ready to be watched with a single click; or when eBay shows me all the vintage computers being sold near me.
But much like early internet protocols such as HTTP, when these systems were being designed, some key secondary effects were not taken into consideration. To be fair, these would have been difficult to foresee given the maturity and popularity of the internet back then. In the case of HTTP, the main issue turned out to be the security and privacy of its users.
You see, HTTP is vulnerable to many kinds of security attacks. Luckily, SSL (and eventually TLS) came around to save the day, and now we can use the internet for sensitive activities with peace of mind. That was a natural evolution that took decades to mature.
Recommender systems are going through a similar phase, with the exception that we are at the very beginning of the journey: Acknowledgement!
These algorithms were designed with good intentions in mind, but their secondary effects are proving to be devastating. For example, see YouTube under fire for recommending videos of kids with inappropriate comments.
This is not a YouTube-specific issue, though. Facebook, Twitter, Instagram, TikTok, you name it: they all personalise content for their users with recommender systems. Google, too, tailors search results to what it knows about the person searching, even when they are not logged in.
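The mechanic behind this is simple enough to sketch. Here is a toy content-based recommender in Python; all item names and topic tags are made up for illustration, and real systems are vastly more sophisticated, but the feedback loop is the same: score unseen items by their overlap with what the user already clicked, so the feed converges on the user's existing lean.

```python
from collections import Counter

# Hypothetical catalogue: article slug -> set of topic tags.
# (Entirely invented data, just to show the mechanism.)
CATALOGUE = {
    "tax-cuts-explained":   {"right", "economy"},
    "tough-on-borders":     {"right", "immigration"},
    "deregulation-wins":    {"right", "policy"},
    "wage-hikes-explained": {"left", "economy"},
    "open-borders-case":    {"left", "immigration"},
    "green-new-rules":      {"left", "policy"},
}

def recommend(click_history, k=1):
    """Rank unseen items by tag overlap with the user's click history."""
    # Build a profile: how often each tag appears in what the user clicked.
    profile = Counter(tag for item in click_history for tag in CATALOGUE[item])
    # Score every item the user hasn't seen by summing its tags' weights.
    scores = {
        item: sum(profile[tag] for tag in tags)
        for item, tags in CATALOGUE.items()
        if item not in click_history
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Two users with opposite histories get opposite "most relevant" items:
print(recommend(["tax-cuts-explained", "tough-on-borders"]))    # ['deregulation-wins']
print(recommend(["wage-hikes-explained", "open-borders-case"])) # ['green-new-rules']
```

Neither user ever sees the other side's best-scoring content, because the scoring function rewards similarity to past behaviour by construction. That is the whole problem in six lines of logic.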
So now imagine a world where a left-winger and a right-winger both decide to fact-check the same matter. What kind of results do you think they are going to get? Each will get confirmation of what Google thinks they want to hear, based on their previous activity. And if they do not fact-check for themselves, which is usually the case, well, then the truth is at the mercy of the recommender systems.
Long story short, recommender systems are becoming confirmation-bias reinforcer systems. They have solved quality-of-life problems but introduced major social problems, and improvements are long overdue.