Tristan Harris Testifies to Congress About the Dangers of Persuasive Technology

Tristan Harris discusses the persuasive power of technology: how it creates an asymmetry of power that masquerades as a contractual relationship, the race for attention among technology companies, the mass narcissism and mental health problems associated with likes and followers, and the use of AI to build predictive models of behavior. He also highlights concerns about algorithmic extremism, the impact of persuasive technology on society, and the need to regulate social media platforms, citing specific examples such as YouTube’s algorithm favoring extreme content and the loss of protections in campaign ads ahead of the 2020 election.

  • 💡 Persuasion is an invisible and growing asymmetry of power that masquerades as an equal or contractual relationship, shifting the responsibility onto us.
  • 💡 The race for attention in technology is a race to the bottom of the brain stem: keeping people engaged through increasingly aggressive techniques like pull-to-refresh and infinitely scrolling feeds that remove stopping cues.
  • 💡 The introduction of likes and followers has created mass narcissism and cultural problems, particularly with young people.
  • 💡 Mental health problems in young girls have increased by 170% in the last eight years, and social media has been identified as a cause.
  • 💡 AI is being used to build predictive models of behavior and to simulate more and more possibilities to maximize watch time, leading to algorithmic extremism, with recommendations driven by machines rather than human choice.
  • 💡 Facebook and Google use the abstract metaphor of a “voodoo doll” to describe their predictive models, and are in a race to predict users’ behavior ever more accurately.
  • 💡 Persuasive technology creates a fiduciary, or duty-of-care, relationship; the same standard applied to doctors, priests, and lawyers should also be applied to technology companies.
  • 💡 The impact of persuasive technology is widespread, affecting even people who don’t use these products, through the spread of misinformation and conspiracy theories.
  • [💻] YouTube’s algorithms tilt the playing field toward extreme content.
  • [🗳️] The rise of social media has removed protections like equal pricing in campaign ads, raising concerns for the 2020 election.
  • [🔬] YouTube’s algorithm is designed to keep users engaged by recommending content that is likely to keep them on the platform.
  • [🤝] There needs to be a new class of regulation for social media platforms to address their role in spreading extreme content.