


Meta Under Fire: Researchers Allege Prioritizing Profit Over Child Safety, Lawmakers Push for Accountability
Former Meta researchers testified, alleging the company prioritized profit over child safety, suppressed risk evidence, and exposed children to explicit VR content, prompting Senate scrutiny and legislative action.
Overview
- Former Meta researchers and employees testified, alleging the company prioritized engagement and profit over child safety, suppressing evidence of risks and user safety research.
- Whistleblowers claim Meta intentionally suppressed vital child safety information and directed sensitive studies under attorney-client privilege, aiming to mitigate potential legal risks.
- Internal research and former employees revealed Meta's virtual reality products exposed children to adult and sexually explicit content, raising significant child safety concerns.
- The Senate Judiciary Committee is scheduled to hold a hearing to investigate these serious allegations against Meta concerning its child safety practices and data handling.
- Senators and lawmakers are advocating for new child safety legislation in response to growing concerns about tech companies' accountability for children's online well-being.
Analysis
Center-leaning sources frame this story by emphasizing severe allegations against Meta regarding virtual reality harms and alleged cover-ups. They prioritize whistleblower testimony and senatorial condemnations, detailing specific instances of harm and Meta's purported negligence. Meta's brief denial is included but is significantly outweighed by the extensive focus on the accusations and the push for legislative accountability.
FAQ
What did the former Meta researchers allege?
Former Meta researchers alleged that the company prioritized engagement and profit over child safety, suppressed evidence of risks, directed sensitive studies under attorney-client privilege to shield them from legal discovery, and exposed children to adult and sexually explicit content in its virtual reality products.
What is Meta accused of doing with its research data?
Meta reportedly required researchers to delete data showing harm to children on its platforms and suppressed vital child safety information.
How are lawmakers responding?
The Senate Judiciary Committee scheduled a hearing to investigate the allegations, and senators are advocating for new child safety legislation to hold tech companies accountable for children's online wellbeing.
How has Meta responded?
Meta's spokesperson stated that the company has approved 180 VR-related studies, including on youth safety, since 2022, and described the whistleblowers' account as a predetermined and false narrative.