Two former Meta safety researchers have told US lawmakers that the company concealed evidence of potential harm to children on its virtual reality (VR) platforms, alleging executives pressured staff to downplay or erase findings that could highlight risks.
Testifying before a Senate committee on Tuesday, Jason Sattizahn and Cayce Savage claimed Meta “ignored the problems they created and buried evidence” of unsafe experiences on its VR products. Both said company lawyers intervened to steer internal research away from issues that could expose the scale of harm.
The allegations, first reported by the Washington Post, include claims that Meta demanded researchers erase data on sexual abuse risks. Savage told senators she warned Meta against hosting Roblox on its VR headsets after uncovering evidence of paedophile rings exploiting the platform. “They set up strip clubs and pay children to strip with Robux,” she said, noting that Roblox remains available on Meta’s app store.
Roblox strongly rejected the claims, calling them “ill-informed and outdated.” The company said safety was its top priority, pointing to 24/7 moderation, swift enforcement against abusers, and cooperation with law enforcement.
Meta denied the whistleblowers’ accounts, dismissing them as based on “selectively leaked” documents and insisting it had approved nearly 180 youth safety studies in recent years. “There were no bans or limits on research,” the company said in a statement.
Sattizahn, who worked at Meta from 2018 to 2024, dismissed that defense as “a lie by avoidance,” accusing the company of “pruning and manipulating” its own research.
The hearing also exposed weaknesses in Meta’s parental controls. Senator Ashley Moody of Florida, who as the state’s attorney general was among the first to sue Meta over alleged child harms, told the committee she struggled to locate the safety settings herself. Asked whether her difficulties surprised them, the whistleblowers replied: “Not at all.”
The testimony adds to a growing list of ex-employees who say Meta prioritized growth and profits over user safety. In 2021, former product manager Frances Haugen leaked documents showing Instagram’s harmful effects on teenage mental health. At the time, Meta CEO Mark Zuckerberg rejected claims the company put profit before safety, though he later apologized to families who said they had been harmed by its products.