A former senior executive at Facebook has warned the social media giant is “ripping apart the social fabric of how society works.”
Chamath Palihapitiya was vice president for user growth at Facebook until 2011.
“The short-term, dopamine-driven feedback loops that we have created are destroying how society works,” he said, as reported by The Verge. “No civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.”
Palihapitiya was referring to the fact that humans get a rush from the chemical dopamine when they experience positive social feedback, something social media provides in the form of “likes” and other functions.
Governments have long worried about the role of technology in driving radicalization. Online communities of Islamist supporters have been able to create echo chambers that allow them to reinforce each other’s views and push one another further into radical beliefs. The same goes for supporters of neo-Nazi and white supremacist organizations.
Recently, tech companies have come under increased pressure to crack down on extremists who use their platforms to recruit and communicate. In September, Facebook hired 3,000 moderators to take down hate speech from the site. Yet it is unclear whether this intervention is motivated by humanitarian impulses. Consider that in July, Facebook partnered with the government of Pakistan to remove “blasphemous” content — i.e., content critical of the Muslim prophet Mohammed.
But it is rare for a senior executive who helped build one of these platforms to directly call out the psychological triggers embedded within them that fuel division.
Palihapitiya’s comments echoed those of former Facebook president Sean Parker, who in November accused Facebook of “exploiting a vulnerability in human psychology.”
Those working in counter-extremism will need to do more to understand how humans interact with social media and what impact these platforms are having on group dynamics. One thing is certain: when Facebook’s own former top executives warn that something is wrong, it is time to investigate further.