the social media problem – 230 bloody hands

So, the recent congressional hearing “Big Tech and the Online Child Sexual Exploitation Crisis” reminded me of social media’s ongoing saga of good, bad & ugly – both its direct impact and parallel political drama. Perhaps there’s a path forward with collaborative bills, updates to Section 230 of the Communications Decency Act, …

Steven Levy captures the mood: real tragedies, evasive moguls, political grandstanding, allusions to a path forward, distracting subplots, … a tepid mea culpa … a future of fixing the present.

• Wired > email Newsletter > Steven Levy > Plaintext > The Plain View > “After 20 years, legislators are still trying to fix Facebook” (February 2, 2024) – the hearing was less about listening to the executives than flogging them for their sins.

… what better way to celebrate [20 years since Harvard sophomore Mark Zuckerberg released a program called Thefacebook to his college community] than raising your hand in a congressional hearing like a mafia boss or tobacco executive? “You have blood on your hands,” Lindsey Graham, ranking member of the Senate Judiciary Committee, told Zuckerberg this week.

4 comments on “the social media problem – 230 bloody hands”

  1. Kids online safety

    Here’s how Josh Golin, Executive Director, Fairplay, recapped the social media CEO hearing:

    • Fairplay > Email > “2 minutes of your time to save children’s lives” by Josh Golin, Executive Director, Fairplay (February 8, 2024)

    Last week, I had the privilege of attending the social media CEO hearing in Washington with a dozen members of our new initiative ParentsSOS, families who have lost their children to social media harms and who advocate for the Kids Online Safety Act [KOSA]. They were at the hearing holding pictures of their kids to remind CEOs, Senators, and the assembled press of the human cost of unregulated social media.

    You probably saw them on the front page of your local newspaper, staring down Mark Zuckerberg as he mumbled his incoherent apology, or in dozens of interviews they gave, including Morning Joe, Jake Tapper, USA Today, and PBS Newshour.

    These parents understand all too well that platforms do everything and anything to maximize engagement from young people. That includes sending our kids down dangerous and deadly rabbit holes of pro-suicide and eating disorder content, enticing them to attempt dangerous challenges, and implementing design features that make children more vulnerable to predation, drug dealers, and cyberbullying.

    These parents support KOSA because they know these platforms will never change their toxic business models and deadly design unless Congress forces them to.

    Will you stand with these courageous parents and call your Senators today to urge them to support the Kids Online Safety Act?

    These parents are turning their pain into action because they refuse to let any other family suffer their devastation. Are you with them? Please call now, forward this email to friends and family, and share it on social media.

    P.S. Check out this amazing ad in today’s Wall Street Journal from 250 grieving families. To continue supporting KOSA advocacy efforts like this, please consider giving to the incredible work of Parents SOS.

  2. Red alert

    Media literacy for tweens and teens: “How can adults protect kids from every worst-case scenario lurking in direct messages and algorithmic feeds?”

    • Washington Post > “How to keep your kids safe online — without taking away their phone” by Heather Kelly (Feb 7, 2024) – From drug dealers in their DMs to posts encouraging disordered eating, social media’s dangers mirror the real world.

    During a nearly four-hour hearing on kids’ safety online last week, senators sparred with tech CEOs from Meta, TikTok, Snapchat, Discord and X about the harm their apps pose to tweens and teens. There was talk of child sexual abuse material (also known as CSAM), suicide, bullying, drugs, lethal viral trends, extortion, disordered eating and mental health issues — all linked back to the use of social media.

    Key points

    • Government help could be a long way off.
    • Delay the introduction of social media for as long as possible (like to age 14, …).
    • Keep devices out of the bedroom for healthy sleep cycles.
    • There’s an alternative to just cutting off online access.
    • Apps can amplify dangers predating social media services.
    • Talk to kids about the broader issues, not just the pieces that are specific to social media, and avoid reaching for the worst-case scenario.
    • Have conversations, but don’t be weird about it – talk factually about the apps and sites, pointing out that people (in direct messages) and things (ads) aren’t always what they appear to be online.
    • Talk about scam basics and age-specific scams.
    • Talk about how algorithms work.
    • Talk about the risks of seductive silly challenges.
    • Talk about the risks of online mental health advice.
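    One of the tips above is to talk with kids about how algorithms work. As a purely illustrative aid for that conversation, here is a toy sketch (invented topic names and weights, not any platform’s actual code) of the feedback loop behind “rabbit holes”: whatever a user lingers on gets boosted, so the feed quickly narrows toward it.

```python
class ToyFeed:
    """Toy illustration of an engagement-driven ranker (not a real platform's code)."""

    def __init__(self, topics):
        # Every topic starts with the same weight.
        self.weights = {t: 1.0 for t in topics}

    def record_engagement(self, topic, boost=0.5):
        # Engagement (a view, like, or long watch) raises that topic's weight.
        self.weights[topic] += boost

    def ranked(self):
        # Higher-weight topics are shown first.
        return sorted(self.weights, key=self.weights.get, reverse=True)

feed = ToyFeed(["sports", "music", "dieting"])
for _ in range(3):                 # a few lingering views on one topic...
    feed.record_engagement("dieting")
print(feed.ranked())               # "dieting" now ranks first in the feed
```

    The point for kids: the feed is not neutral. A few curious clicks are enough to tilt what gets shown next, which is how a passing interest can snowball into a stream of, say, disordered-eating content.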
  3. Amplified speech

    Section 230 RIP?

    Perhaps you (as an individual speaker) cannot fool all the people all the time, “but perhaps the internet [via amplification] can.”

    Imagine a post-230 world:

    • “A world of speech in which viral harassment is tamped down but ideas are not” (akin to the fairness doctrine, abolished in 1987).
    • Content moderation wherein companies need only uphold the First Amendment.
    • “Open and honest dialog that fosters … collaboration rather than polarization … rather than a race to the bottom of the brain stem.”

    Delete 26 words from the Communications Decency Act? Are voluntary corporate moves enough?

    • Wired > “The One Internet Hack That Could Save Everything” by Jaron Lanier and Allison Stanger (Feb 13, 2024) – The definition of speech itself has changed.

    US lawmakers on both sides of the aisle are questioning Section 230, the liability shield that enshrined the ad-driven internet. The self-reinforcing ramifications of a mere 26 words – “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” – have produced a social media ecosystem that is widely held to have had deleterious effects on both democracy and mental health.

    Key points

    • “Section 230 was created during a period when policy was being designed to unleash internet innovation,” with (double-edged) provisions for “Good Samaritan” protection (at scale).
    • “Section 230 forced a certain sort of business plan into prominence, one based … on the paid arbitration of access and influence” [the “advertising” business model].
    • “Section 230 perpetuates an illusion that today’s social media companies are common carriers …”
    • New speech is governed by the allocation of virality – algorithms mediate content and optimize for engagement.
    • “Perverse incentives promote cranky [incendiary] speech, which effectively suppresses thoughtful speech” [via stochastic harassment].
    • The difference between opt-in and opt-out produces substantial harms (despite stated corporate terms of service).
    • Generative AI, unlike social media, cannot be a business success if its content is nonsense (e.g., poor-quality data, distorted speech, fake speech).
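    The “allocation of virality” point above can be made concrete with a toy comparison (invented posts and engagement numbers, not a real ranking system): a chronological feed treats posts neutrally, while an engagement-optimized sort systematically surfaces the incendiary post.

```python
# Three hypothetical posts with made-up predicted engagement rates.
posts = [
    {"text": "nuanced policy thread", "engagement_rate": 0.02, "ts": 1},
    {"text": "incendiary hot take",   "engagement_rate": 0.30, "ts": 2},
    {"text": "local news update",     "engagement_rate": 0.05, "ts": 3},
]

# A chronological feed simply shows the newest post first.
chronological = sorted(posts, key=lambda p: p["ts"], reverse=True)

# An engagement-optimized feed allocates virality to whatever is
# predicted to generate the most clicks and replies.
by_engagement = sorted(posts, key=lambda p: p["engagement_rate"], reverse=True)

print(chronological[0]["text"])   # "local news update" (newest first)
print(by_engagement[0]["text"])   # "incendiary hot take" (virality allocated)
```

    The incendiary post wins not because anyone endorsed it, but because the objective function rewards engagement – the “race to the bottom of the brain stem” in miniature.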
  4. An experiment on our kids

    Social media at scale, 24/7, what could go wrong, eh?

    This article discusses social media as an insidious tech platform that addicts developing brains to “low quality” experiences and a false sense of belonging. Follow the math (the daily screen time numbers).

    • Psychology Today > “We’re Running a Risky Cognitive Experiment on Our Own Kids” by Kim Samuel [1], reviewed by Monica Vilhauer (February 21, 2024) – Our society needs a fundamental rethink about the role of screen time and social media in our children’s lives.

    Children’s screen time isn’t just a questionable impact of the digital revolution – it’s one of the most consequential long-term issues facing humankind.

    We’re running an impossibly risky, real-time experiment on the mental, emotional, and physical health of humanity’s future.

    If a new drug were having these kinds of impacts on our kids, I have no doubt that millions of people would be out in the streets demanding an outright ban. The fact is that social media is now so ubiquitous and so central to young people’s lives that it’s hard to imagine substantial change.

    Key points

    • Anxiety, depression and loneliness are up in youth.
    • Social media usage is contributing to a decline in trust and empathy.
    • While parents can cultivate skills before letting their kids “swim” in social media’s “ocean,” the systemic issues require collective action.
    • Big Tech companies may need to fundamentally change their business model.

    [1] Kim Samuel is the founder and chief belonging officer of the Samuel Centre for Social Connectedness and the author of On Belonging: Finding Connection in an Age of Isolation.
