Internal documents revealed: How Meta ignored child safety warnings

In the past few years, Meta (formerly Facebook) has come under scrutiny over concerns about the safety of young users on its platforms, particularly Instagram and Facebook.

by Faruk Imamovic
© Getty Images/Anna Moneymaker

In the past few years, Meta (formerly Facebook) has come under scrutiny over concerns about the safety of young users on its platforms, particularly Instagram and Facebook. Recently released internal documents and court proceedings shed new light on the company's approach to these issues.

Missed opportunities to improve security

In April 2019, David Ginsberg, a vice president at Meta, proposed to Mark Zuckerberg that the company investigate and reduce problematic use of Instagram and Facebook. The project was not funded because of staffing constraints, and Adam Mosseri, the head of Instagram, also declined to fund it.

This email exchange is just one of the pieces of evidence cited in more than a dozen lawsuits filed by attorneys general in 45 US states and the District of Columbia. The states accuse Meta of unfairly attracting teenagers and children to its platforms while misleading the public about the dangers.

An analysis of court filings shows how Zuckerberg and other Meta leaders repeatedly promoted the safety of the company's platforms, downplaying the risks to young people, even as they rejected pleas from employees to strengthen safeguards.

Growing problems and public concerns

The lawsuits against Meta reflect growing concerns that young people on social media can be exposed to harassment, abuse and compulsive online behavior. Dr. Vivek H. Murthy, the US surgeon general, has called for warning labels on social media, highlighting the public health risk to young people.

In May, authorities in New Mexico arrested three men accused of targeting children for exploitation on Instagram and Facebook. Attorney General Raúl Torrez said Meta's algorithms allowed predators to identify children they otherwise would not have been able to find on their own.

“A lot of these decisions ultimately landed on Mr. Zuckerberg’s desk,” Torrez said. “He needs to be asked explicitly, and held to account explicitly, for the decisions that he’s made.”

© Getty Images/Leon Neal

Meta's reaction and objections

Meta disputed the states' claims and filed motions to dismiss the lawsuits. Spokeswoman Liza Crenshaw said the company is committed to the well-being of young people and has developed over 50 tools and features for youth safety.

“We want to reassure every parent that we have their interests at heart in the work we’re doing to help provide teens with safe experiences online,” Ms. Crenshaw said, according to The New York Times. The states’ legal complaints, she added, “mischaracterize our work using selective quotes and cherry-picked documents.”

However, parents whose children have died as a result of online dangers dispute Meta's assurances. Mary Rodee, whose 15-year-old son was the victim of blackmail on Facebook, criticized the company for failing to respond to her complaints. “They preach that they have safety protections, but not the right ones,” she said. “It’s pretty unfathomable.”

Pressure to attract teenagers

Meta has long struggled to attract and retain teenagers, who are a key part of the company's growth strategy. Internal documents show that back in 2016, Zuckerberg ordered a focus on increasing the time teenagers spend on the company's platforms.

The “overall company goal is total teen time spent,” wrote one employee, whose name is redacted, in an email to executives in November 2016.

In April 2017, Kevin Systrom, then the head of Instagram, asked for more staff to work on mitigating harms to users. Zuckerberg responded that he would include Instagram in the hiring plan, but stressed that Facebook faced "more extreme problems."

Problems with underage users

In January 2018, Zuckerberg received a report estimating that four million children under the age of 13 were on Instagram, according to a 33-state lawsuit. 

"Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed, and zealously protected from disclosure to the public," the state attorneys general wrote in their complaint.

Although Instagram's terms of service prohibit users under the age of 13, the sign-up process allowed children to easily lie about their age.

Beauty filter debates

An internal debate about beauty filters on Instagram illustrated the tensions surrounding teenagers' mental health. After the introduction of filters that mimicked the effects of plastic surgery, some mental health experts warned that such effects could normalize unrealistic beauty standards for young women.

Although some executives proposed permanently banning such filters, Zuckerberg supported keeping them, citing a lack of data showing harm.

Explicit content involving children

Last fall, Match Group, owner of dating apps like Tinder, revealed that its ads on Meta's platforms were being shown alongside "highly disturbing" violent content, some involving children. Meta removed some of the flagged posts, but Match Group considered the company's response inadequate.

The lawsuits against Meta and the published internal documents reveal a complex picture of how the company balanced attracting young users against ensuring their safety. While Meta says it is committed to the welfare of young people, critics, including parents and state attorneys general, question the effectiveness of its safeguards.
