A bipartisan group of U.S. senators has demanded explanations from Meta Platforms’ CEO Mark Zuckerberg over the company’s teen safety policies, adding to a widening wave of political, legal, and regulatory pressure on social media giants over youth harm, addiction, and content moderation.
In a letter sent this week, lawmakers including Brian Schatz, Katie Britt, Amy Klobuchar, James Lankford, and Christopher Coons called on Zuckerberg to clarify why Meta delayed implementing stronger privacy protections for teenage users despite internal research suggesting serious risks to youth well before 2024. The senators said this delay raises concerns that safety may have been deprioritized in favor of user engagement and growth.
“Following recent unsealed evidence regarding Meta’s online safety practices towards children, we write to urge Meta’s commitment to prioritizing user safety over engagement. To that end, we request additional information about the company’s online safety practices, including expectations for public transparency and clarification of its trust and safety protocols,” the letter stated, pressing for detailed information on how the company designs products for users under 18, reviews reports of child sexual abuse material (CSAM), and handles sex trafficking content.
Meta’s situation is far from isolated. Across the U.S., social media platforms are facing landmark jury trials and massive multidistrict litigation alleging that companies like Meta, TikTok, YouTube, and Snapchat have engineered their products to be addictive to young users while remaining negligent about harmful impacts. In Los Angeles, a major trial is underway in which Meta and Google’s YouTube face accusations that their platforms were designed to “hook” children and exacerbate mental health issues, in litigation sometimes compared to tobacco lawsuits in scale and societal impact.
In New Mexico, a state trial opened this month accusing Meta of enabling child exploitation through Facebook and Instagram features, claiming the company prioritized engagement features like infinite scroll over robust safety protections.
Academic research further reinforces lawmakers’ concerns. Studies examining social media algorithms found that minors (assigned age 13 in experiments) encountered harmful or inappropriate videos far faster and more often than older users on platforms like Instagram and YouTube, highlighting algorithmic weaknesses in youth protection.
Meta maintains that it has taken teen safety seriously, pointing to features like default private accounts for teens, family controls, and other protections, while also noting that mental health is influenced by many factors beyond social media use. The company has denied allegations that it suppressed research or prioritized growth over safety.
The senators are urging Zuckerberg to provide more details about how Meta weighs the balance between user engagement and the safety and wellbeing of its younger users in product design. They also want to know more about the trust and safety measures in place that affect users under 18.
Should Meta fail to comply, the scrutiny could extend to its sprawling AI projects and virtual reality hardware, with the company coming under pressure to build stronger protections into these products before they become a problem.