Redrawing the Legal Landscape for Social Media Companies
Recent jury verdicts in the United States carry significant implications for how social media companies such as Meta Platforms and YouTube are held accountable for harm caused to users, particularly young ones.
California Verdict: Product Liability Focus
- A jury in California found Meta Platforms and YouTube liable for harm to a young user.
- The case focused not on specific user-generated posts but on the design of the platforms themselves.
- Previous cases had been halted by Section 230 of the Communications Decency Act, which shields platforms from liability for user-generated content; this case deviated from that pattern.
- Plaintiffs reframed the legal question: did the harm stem from third-party content, or from the companies' own design choices?
- By casting the issue as one of product liability rather than content moderation, this reframing led to a landmark decision.
New Mexico Verdict: Consumer Protection Law
- A separate case in New Mexico led to a verdict against Meta Platforms under consumer protection law.
- The case centered on allegations of child sexual exploitation and misleading public safety assurances.
- Internal communications showed Meta had been warned that its 2019 expansion of end-to-end encryption on Facebook Messenger would reduce its ability to detect child sexual abuse material.
- A significant drop in such reports after the change bore out that warning.
Implications and Policy Response
- The verdicts shift focus from user posts to how platforms are built and governed.
- The verdicts spotlight internal company knowledge that engagement-maximizing strategies fostered compulsive use and harmed younger users.
- Design features such as infinite scroll and algorithmic amplification are critiqued for encouraging that compulsive use.
- Policy responses increasingly contemplate restrictions, especially for minors, but blanket restrictions may treat symptoms rather than root causes.
- Regulation should instead target platform architecture, safer defaults for minors, and transparency in recommendation systems.
The overarching conclusion is that effective regulation must address the design and governance of social media platforms, not just user behavior.