
The Facebook Files: Tech's Big Tobacco Moment


Last week, the Wall Street Journal launched a bombshell investigative journalism series called The Facebook Files. So far it includes five articles and four podcast episodes on topics ranging from teen mental health and algorithm-driven outrage to human trafficking and vaccine misinformation. 

What’s unique about the investigation is its evidence: internal Facebook documents, “including research reports, online employee discussions, and drafts of presentations to senior management,” show that Facebook knew of alarming, far-reaching harms and failed to act—often against the direct recommendations of its own researchers. 

 

SOME OF THE (MANY) REVELATIONS:

  • 32% of teen girls said that when they felt bad about their bodies, Instagram (which is owned by Facebook) made them feel worse 
  • 13% of British and 6% of American teens who reported suicidal thoughts traced the desire to kill themselves to Instagram 
  • 5.8 million VIPs are shielded from normal enforcement through a system called "XCheck," exempting their policy-violating content from removal
  • Only 13% of the time moderators spend labeling or removing false or misleading information goes to content from outside the US, even though 90% of users live outside the US and Canada
  • One political party's team shifted its content from 50% negative to 80% negative after 2018 algorithm changes rewarded outrage


SYSTEMIC HARMS

While the multitude of harms evidenced by the reports is shocking, it is also predictable: we're seeing the same effects of extractive capitalism described in The Social Dilemma. When faced with trade-offs, platforms like Facebook, Instagram, YouTube, Google, and TikTok are incentivized to prioritize profits at the expense of user well-being.

What the WSJ reporting illuminates is the lengths organizations like Facebook will go to protect their businesses — unless outside forces like widespread regulation, public pressure, or investor demands change the equation.

And still, even when platforms do act in users' best interests, we see runaway mechanisms generating problems faster than humans or artificial intelligence can solve them. 
 
 * * * 

A CLARION CALL


This is the Big Tobacco moment for today’s dominant social media platforms. It's clear that platforms like Facebook cannot regulate themselves without devastating global consequences. 

Importantly, while the Facebook Files are limited to Facebook and Instagram, there is ample evidence that similar harms and causal mechanisms are at play in the other major social media companies—it’s just that their internal documents haven’t been reported on publicly. 

What happened to Big Tobacco must happen to these platforms. Externalized harms have to be paid for and future harms need to be mitigated. Tweaks to products won’t solve the problem; we need transformative changes in:

  • How the industry is regulated
  • How capital is distributed
  • How technology is built
Last Friday, Daniel Schmachtenberger, Tristan Harris, and Frank Luntz discussed the significance of the Facebook Files and the need for sweeping change.
LISTEN NOW
 * * * 

HOW YOU CAN HELP

  1. Get informed. Listen to WSJ's Facebook Files podcast episodes (free) or read the article series (paywall). More info below.
  2. Dig deeper. Watch or listen to this candid conversation with Tristan Harris, Daniel Schmachtenberger, and Frank Luntz on The Facebook Files, its business model, our regulatory structure, and human nature itself.
  3. Connect with others. Where you can, try to help people see that these harms share a root cause: giant user-generated content systems that profit from monetizing attention, social comparison, peer pressure, tribalism, and disinformation.

The Facebook Files Series (so far)

4-Part Podcast Series (FREE) on Whitelisting, Teen Body Image, Human Trafficking, and the Outrage Algorithm
Article 2: Instagram and teen mental health
Article 3: Promoting outrage for profit
Article 4: Harms to society's most vulnerable
Article 5: Inability to control anti-vaccine misinformation

Thank you


If you made it this far, thank you for the gift of your attention. And thank you for your support and commitment to the movement for humane technology. 

This is an important opportunity to shift the technology landscape to be more humane, treating attention and intention as sacred, protecting well-being, and building our collective capacity to address humanity’s most urgent challenges.

Let's respond accordingly. 

Best, 
The CHT Team

The Center for Humane Technology is a registered 501(c)(3) nonprofit dedicated to catalyzing a more humane future. We are grateful to our generous lead supporters for making our resources freely available. 

Twitter
LinkedIn
Facebook
YouTube
Copyright © 2021 Center for Humane Technology, All rights reserved.
You are receiving this email because you opted in to stay updated about news, actions and community events related to the Center for Humane Technology.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.