When we open social media, we trust the feed algorithm to show us things we will like, and most of the time it does. However, there are negative consequences, such as perpetuating bad habits or even nudging someone toward them. One of my friends was watching videos of people doing really cool and interesting tricks with vapes (like the one pictured above), and, unsurprisingly, they were met with ads for vapes. They never wanted to try vaping, so they never bought into the ads. Fair enough; however, if someone started in the same position as my friend but had less resolve against vaping, those ads could have started a nicotine addiction. The ads seem harmless until someone buys a vape, gets addicted, keeps watching the videos and searching for vapes on their own, and the ads keep showing up: here is our first feedback loop.
Drug addiction is not the only place where the targeted-ad algorithm plays a role. Around election season, everyone is drowning in a sea of information about candidates and their platforms. Jill Lepore discusses a similar concern in her article "All the King's Data," which examines the use of personal data in political campaigns. At the time, the Simulmatics Corporation's approach was unheard of and was deemed unfair to use in a campaign to gain a leg up on an opponent. The first time I read the article, I thought it was completely fine to use the "voting-behavior machine" to help predict voter preferences, because it is just using all the tools available to you; relating it to present-day prediction algorithms made me reconsider. While the voting-behavior machine was a much simpler version of the targeted propaganda circulating on social media today, the underlying principle stays the same: use records of people's preferences to show them more of the things they will agree with, in order to win them to your side. The machine advised Kennedy to take a firm stance on civil rights and to confront religious intolerance, and that advice helped him win his election.
Today, on social media, we interact with the propaganda we agree with or find interesting, and we scroll past what we don't care about. We are then fed more targeted ads based on what we interact with, reinforcing existing biases and creating an echo chamber of information that limits our understanding of the politics actually at play. Once the algorithm fixates on our pattern of Democratic- or Republican-leaning views, we are shown candidates and policies that play to those views and never get the whole picture: our second feedback loop. While it is reasonable to say that social media and targeted ads should not be anyone's primary source of information, it is naive to think that everyone will bother doing independent research when there is already a wealth of information on social media.
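The echo-chamber loop described above can be sketched as a toy simulation. Everything here is hypothetical, not any real platform's algorithm: a recommender shows more of whatever the user has clicked, the user clicks whatever matches even a mild leaning, and those two rules feed each other until the feed drifts toward one side.

```python
import random

def simulate_feed(rounds=1000, true_pref=0.55, seed=0):
    """Toy two-topic feed. Returns the share of topic-A posts shown
    each round. `true_pref` is the user's mild leaning toward topic A."""
    rng = random.Random(seed)
    clicks_a, clicks_b = 1, 1  # smoothed click counts the recommender tracks
    shares = []
    for _ in range(rounds):
        # Recommender shows topic A in proportion to past clicks on A
        p_show_a = clicks_a / (clicks_a + clicks_b)
        shares.append(p_show_a)
        shown_a = rng.random() < p_show_a
        # User is only slightly more likely to click what matches their leaning
        p_click = true_pref if shown_a else 1 - true_pref
        if rng.random() < p_click:
            if shown_a:
                clicks_a += 1
            else:
                clicks_b += 1
    return shares

shares = simulate_feed()
print(f"feed share of topic A, first 10 rounds: {sum(shares[:10]) / 10:.2f}")
print(f"feed share of topic A, last 10 rounds:  {sum(shares[-10:]) / 10:.2f}")
```

The point of the sketch is that a 55/45 preference is enough: because each click tilts what is shown next, and what is shown tilts the next click, the only stable outcomes of this reinforcement process are a feed that is almost entirely one topic or the other, never the balanced mix the user started with.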
The targeted-advertising feedback loop may not hurt people in exactly the way O'Neil describes when she says that WMDs often hurt the poor and help the rich, but it can still be considered a WMD: it harms diversity of thought, erodes privacy, and leads people to give away their personal information without realizing it. These adverse consequences, I would argue, are enough to deem targeted ads a Weapon of Math Destruction, whether by narrowing the range of political ideologies people encounter or by the more threatening issue of pushing people toward addiction, to drugs or to social media itself.