Hello friends,
Last week, a group of 300 incredible technologists, press, partners, and other thought leaders joined us as we unveiled a new agenda for the technology industry. You can watch the 44-minute presentation video here.
We aimed to build a common understanding of the tremendous harms posed by the attention economy and of how we can create change together.

Introducing “Human Downgrading”

Our primary goal was to move public discourse in Silicon Valley from a cacophony of disconnected grievances and scandals ("they took our data!") to a meaningful humane agenda of actions that address the vast surface area of problems arising from technology’s race for attention.

In last week’s presentation, we explained how seemingly separate problems – tech addiction, teen depression, shortening attention spans, political polarization, the breakdown of truth, outrage-ification of culture, and the rise of vanity/micro-celebrity culture – are actually not separate issues. They are all symptoms of one underlying problem: the race between tech giants to capture human attention, which becomes a race to overwhelm human weaknesses. Put together, that race creates “human downgrading.”

Giving a name to the connected system of human downgrading is crucial, because without it, solution creators end up working in silos and attempt to solve the problem by playing an infinite “whack-a-mole” game. It’s like working on coral reefs, ocean acidification, or hurricanes before there was a recognition of systemic climate change. Shared language creates the opportunity and leverage to develop systemic solutions and unite the voices of concerned technologists, investors, researchers, media, policymakers, and parents.

Human downgrading is a problem that reduces our capacity to solve all other problems. It suppresses critical thinking and nuance, makes us more lonely, and reduces our capacity to find common ground and shared values. 

The Solution?

While the problem is immense, the good news is that the solution involves one thing: better protecting the vulnerabilities of human nature. Technologists must approach innovation and design with an awareness of the ways we’re manipulable as human beings. Instead of more artificial intelligence or more advanced tech, we actually just need more sophistication about what protects and heals human nature and social systems. To that end, we are developing a new model that technologists can use to explore and assess how technology affects us at the individual, relational, and societal levels.

What’s Next?

This is the beginning of a long journey. Instead of a race to the bottom of “how can we most effectively manipulate you,” we must create a new race to the top to completely reverse human downgrading. To start facilitating this transition, we are announcing four initiatives:

  • Opportunities for key stakeholders to plug into working groups to take action.
  • “Your Undivided Attention” — a new podcast launching in June where Tristan and Aza gather insights about the invisible limits and ergonomics of human nature from a wide range of experts to address human downgrading.
  • Design guides to facilitate assessment across human sensitivities and social spaces to help guide designers in redesigning their products.
  • A Humane Technology conference in the next year to bring together people working on many different aspects of human downgrading.

Growing Our Team to Meet the Challenge

We’re hiring to take on the scope and urgency of this work. The best candidates often come through referrals, so please point outstanding candidates to our Jobs page.

Thank you so much for your support! We look forward to sharing more with you soon.


Check out the press to learn more about our new agenda for tech: Fast Company, Reuters, Wall Street Journal, and Wired among others.


Copyright © 2019 Center for Humane Technology, All rights reserved.
