UO Student Series: Combatting Facebook's Anti-Democracy Algorithms
We all need to pressure social media platforms to take more responsibility for their role in this crisis and to immediately implement solutions to remedy it.
**This post is from a member of Professor David Frank’s class at the University of Oregon. In the coming weeks, we’ll share several more of these posts from young Oregonians. Go Ducks!**
Tamir Eisenbach-Budner is studying Planning, Public Policy, and Management at the University of Oregon.
Living in Two Separate Worlds
Last year, I was shocked to see the smiling face of someone I consider to be a very thoughtful, kind person featured in a picture of a Proud Boys rally — MAGA hat and all. It turns out he is a devout Christian, pro-life, and a Trump supporter — all things I don’t blame him for hiding at the left-leaning University of Oregon.
Like any good civic participant who’s been raised to appreciate the healing power of dialogue, I started talking to him about politics, partly to understand where he was coming from, but admittedly mostly to see if I could persuade him to adopt some views from my side of the political spectrum.
What I was surprised to discover, however, was that we rarely disagreed over issues. Instead, we were each concerned with completely different issues that the other had never heard of, or at least never given any thought to. For instance, this conversation was the first time I heard about the First Step Act, the excellent criminal justice reform bill that Trump signed into law, and it was also the first time he heard about the Trump administration’s heinous child separation policies.
It was like we were coming from two separate worlds. And we were — online worlds.
Currently, 67 percent of American adults get at least some of their news on social media. Of the various social media platforms, Facebook is by far the most popular, with 45 percent of Americans using it as a source of news. It has facilitated the rise of the Black Lives Matter movement, as well as the Blue Lives Matter movement; cyberbullying, as well as intimate connection; and information, as well as disinformation.
Weighing these counterbalancing consequences, it is difficult to judge whether, on the whole, social media, and specifically Facebook, empowers or disempowers our democracy. For most of my life, I believed it was simply a neutral service, tipped toward good or bad only by the will of the person using it. However, this perspective overlooks the fact that Facebook’s revenue model is inherently linked to the polarization of its users.
Facebook’s Revenue Model
It’s natural to surround ourselves with people we feel comfortable with and similar to, and that tendency alone will always produce some polarization. However, it is also “the structure of social media itself [that] may be part of the problem.” According to Investopedia, “Facebook generates substantially all of its revenue from selling advertising to marketers... [and] marketers pay for ads based on the number of impressions delivered or the number of actions, such as clicks, undertaken by users.”
What this means is that Facebook’s ultimate goal is to get its users to spend as much time on the site as possible — clicking, viewing, commenting, liking, sharing, and posting — because all of that activity translates directly into advertising revenue.
Unfortunately, humans are naturally drawn to, and most reactive to, incendiary, divisive, and outrageous posts, which means it is in Facebook’s best interest to steer users toward these kinds of posts in order to generate the most engagement. Adjusting its algorithm to curb the dominance of such polarizing posts, on the other hand, is, in Facebook’s own words, “antigrowth.” Thus, Facebook not only allows these weeds to grow in its garden; it intentionally waters them.
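To make that incentive concrete, here is a toy sketch, in Python, of a feed ranked purely by predicted engagement. None of this is Facebook’s actual code; the example posts, the weights, and the prediction numbers are all invented. It only shows the structural point: when the objective is a weighted sum of expected clicks, comments, and shares, with no penalty for divisiveness, the most inflammatory post rises to the top.

```python
# Toy sketch of an engagement-ranked feed. Hypothetical throughout:
# Facebook's real ranking models are proprietary. The point is only that
# ordering by predicted engagement, with no cost for divisiveness,
# reliably surfaces the most inflammatory post first.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's estimate of clicks per view
    predicted_comments: float  # comments per view (outrage drives these up)
    predicted_shares: float    # shares per view

def engagement_score(post: Post) -> float:
    # Revenue tracks engagement, so the objective is a weighted sum of
    # predicted interactions, with no term for accuracy or civility.
    return (1.0 * post.predicted_clicks
            + 2.0 * post.predicted_comments
            + 3.0 * post.predicted_shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local library extends weekend hours", 0.02, 0.01, 0.005),
    Post("You won't BELIEVE what the other party just did", 0.09, 0.12, 0.08),
])
for post in feed:
    print(f"{engagement_score(post):.3f}  {post.text}")
```

On this made-up data, the outrage post scores roughly ten times higher than the civic announcement, so it leads the feed every time.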
How This Drives Polarization
In the aftermath of the Cambridge Analytica scandal, in which harvested Facebook user data was used to aid Donald Trump’s 2016 campaign, Facebook set up several “Integrity Teams” to analyze and improve the company’s ethics.
The “Common Ground” team found that “[Facebook’s] algorithms exploit the human brain’s attraction to divisiveness.” What’s more, the team discovered that, “[i]f left unchecked,” Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.” Similar studies have found that not only are “conservative blogs... more likely to link to other conservative blogs” and “left-leaning news media articles... more likely to link to other left-leaning sites,” but each subsequent click also tends to lead in a more radical direction.
For example, if a new mother joins a Facebook group about making organic baby food, Facebook will likely suggest that she also join an anti-vaccination group, and from there it will likely recommend a QAnon or chemtrails group. This is because each subsequent group is the most “engaging” one whose membership overlaps with that of the group before it. An internal study conducted by Facebook in 2016 found that “64% of all extremist group joins are due to [Facebook’s] recommendation tools,” specifically its “Groups You Should Join” and “Discover” algorithms. This is no surprise, considering the way in which Facebook’s revenue model encourages and profits from outrage and extremism.
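To see how “correlated members” can become a pipeline, here is a second toy sketch. Everything in it is invented: the groups, the member IDs, the engagement scores, and the scoring rule (shared members weighted by engagement) are assumptions about how such a recommender might work, not a description of Facebook’s actual system.

```python
# Toy model of engagement-weighted group recommendation. All data and the
# scoring rule are hypothetical; this is NOT Facebook's actual system.
# It only illustrates the drift described above: each hop recommends the
# most "engaging" group whose members overlap with the current one.
groups = {
    "Organic Baby Food": {"members": {1, 2, 3, 4},   "engagement": 0.3},
    "Vaccine Questions": {"members": {3, 4, 5, 6},   "engagement": 0.7},
    "QAnon Research":    {"members": {5, 6, 7, 8},   "engagement": 0.9},
    "Local Gardening":   {"members": {1, 9, 10, 11}, "engagement": 0.2},
}

def recommend(current: str) -> str:
    """Suggest the group with the highest engagement-weighted member overlap."""
    members = groups[current]["members"]
    def score(name: str) -> float:
        shared = len(members & groups[name]["members"])
        return shared * groups[name]["engagement"]
    return max((g for g in groups if g != current), key=score)

first_hop = recommend("Organic Baby Food")  # -> "Vaccine Questions"
second_hop = recommend(first_hop)           # -> "QAnon Research"
print("Organic Baby Food ->", first_hop, "->", second_hop)
```

On this made-up data, a benign interest sits two recommendations away from a conspiracy community, purely because overlap and engagement are the only signals the score rewards.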
To its credit, Facebook encouraged its Integrity Teams to develop ideas to remedy these issues. They suggested building a classification system for hyperpolarized content, suppressing clickbait, reducing the spread of content disproportionately favored by hyperactive users, and penalizing publishers who repeatedly share fake news. However, unsurprisingly, management decided to shift “away from societal good to individual value,” claiming that they “shouldn’t police people’s opinions, stop conflict on the platform, or prevent people from forming communities.” All of the Integrity Teams’ projects were either rejected or weakened, and with them, so was our democracy.
How can we ever hope to compromise and collaborate with one another when each of us lives in an entirely different world, believes in entirely different events and facts, and is alienated from and dehumanized by the other?
The Consequences
After learning these things, it is no longer possible to consider Facebook a neutral platform that simply reflects the will of its users. It exerts a will of its own, driving polarization in the service of its advertising-based revenue model. Given that nearly one in two Americans get news from Facebook, not to mention other social media sites, it is no surprise that “Republicans and Democrats are more divided along ideological lines – and partisan antipathy is deeper and more extensive – than at any point in the last two decades.”
We all need to pressure social media platforms to take more responsibility for their role in this crisis and to immediately implement solutions to remedy it. Otherwise, all Americans, like my conservative counterpart and me, will continue to struggle to find common ground.