Facebook’s Algorithmic Priorities: Why Zuckerberg Denied Facebook’s Addictive Nature

By Nathaniel Hylton

“Time well spent.” That’s the phrase Mark Zuckerberg has used to describe Facebook, always focusing on how the platform is “bringing people closer together.” The co-opting of this phrase — first used by Facebook critics — seems carefully designed to hide Facebook’s highly addictive nature. “It’s as if they’re taking behavioral cocaine and just sprinkling it all over your interface and that’s the thing that keeps you like coming back and back,” said Aza Raskin, a leading technology engineer responsible for designing the infinite scroll that has become a prominent feature across social media apps.

Former Facebook employee Sandy Parakilas tried to stop using Facebook after he left the company in 2012: “It literally felt like I was quitting cigarettes.” The Facebook addiction is real.

On March 25, 2021, when House members questioned the chief executives of Twitter, Google, and Facebook, lawmakers specifically highlighted Facebook’s spread of misinformation, its role in cyberbullying, and its addictiveness. In response to questions about Facebook’s addictive nature, Zuckerberg stated that “from what I’ve seen so far, it’s inconclusive and most of the research suggests that the vast majority of people do not perceive or experience these services as addictive or have issues.” Yet Sean Parker, Facebook’s first president and an early investor, has publicly admitted the platform was designed to be addictive, explaining in a 2017 interview that the guiding question was, “How do we consume as much of your time and conscious attention as possible?” Parker explained that Facebook’s design is based on a social validation feedback loop. To keep users on the platform, Facebook gives “you a little dopamine hit every once in a while because someone liked or commented on a photo or a post.” This exploits a vulnerability in human psychology that makes us yearn for external validation. Parker admitted that he, Zuckerberg, and Kevin Systrom (Instagram’s co-founder) “understood this consciously.”

In an interview about his book Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, Siva Vaidhyanathan, a professor of media studies at the University of Virginia, critiqued Facebook’s design, explaining that Facebook’s objective is to maximize happiness among its users. For Facebook, maximizing happiness is synonymous with high engagement — ever-increasing clicks, comments, likes, and shares on a post. “If a post or a person generates a lot of these measurable actions, that post or person will be more visible in others’ News Feeds.” That is where Facebook’s highly addictive nature comes in: users strive to receive as many measurable actions (i.e., likes) as possible. Facebook then pushes these “popular” posts to the top of suggested feeds to get more engagement, continuing an endless feedback loop. This system that prioritizes “popular” posts is not only addictive but also dangerous, as posts that advocate conspiracy theories, advance hatred and bigotry, and spread health misinformation quickly rise to the top of feeds as they acquire likes.
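The dynamic Vaidhyanathan describes can be made concrete with a toy simulation. This is not Facebook’s actual code or ranking formula — the post names, the like counts, and the visibility weights below are all illustrative assumptions — but it shows how a simple “rank by engagement, then show the top of the ranking to more people” rule lets an early leader pull further and further ahead:

```python
# Toy sketch (NOT Facebook's real algorithm) of an engagement-driven
# ranking feedback loop: posts with more likes rank higher, and
# higher-ranked posts are seen more, so they collect still more likes.

def rank_feed(posts):
    """Order posts by like count, most-engaged first."""
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

def simulate_round(posts, new_likes_budget=100):
    """Hand out a fixed budget of new likes in proportion to
    visibility: the top-ranked post absorbs the largest share."""
    ranked = rank_feed(posts)
    weights = [len(ranked) - i for i in range(len(ranked))]  # 3, 2, 1, ...
    total = sum(weights)
    for post, weight in zip(ranked, weights):
        post["likes"] += new_likes_budget * weight // total

# Hypothetical starting feed: one post has a small early lead.
posts = [
    {"id": "cat photo", "likes": 12},
    {"id": "conspiracy post", "likes": 15},
    {"id": "news article", "likes": 10},
]

for _ in range(5):
    simulate_round(posts)

print(rank_feed(posts))
```

After a few rounds, the post that started with the small lead dominates the feed — regardless of whether it is a cat photo or a conspiracy theory, which is exactly the danger the paragraph above points to.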

And the reason Facebook wants to make sure users stick around as long as possible? Money. The more users interact with the site, the more information about them Facebook can suck up and monetize. Targeted advertising is the center of Facebook’s business model; advertisers spend top dollar to make sure ads are directed at those most likely to buy their goods or services. That’s why Campaign for Accountability (CfA), through its research initiative the Tech Transparency Project, added its voice to a coalition of more than 40 other organizations demanding an end to these types of targeted ads, with an eye toward removing Facebook’s incentive to make its platform so addictive. Unsurprisingly, Facebook did not respond.

Of course, Facebook has the right to monetize its product, but as the dangers of the platform become increasingly apparent, one of the issues lawmakers will need to take on is its addictive nature. To understand how users become addicted, and to limit the negative societal impact, lawmakers need honest answers about Facebook’s algorithms and design.

Campaign for Accountability (CfA) uses research, litigation and aggressive communications to expose misconduct & malfeasance in public life.