Dec. 22, 2023 5:30 am ET
Meta Insiders Sound the Alarm About Its Encryption Push
BY JEFF HORWITZ AND KATHERINE BLUNT
When Meta Platforms began rolling out encryption for Facebook direct messages this month, it was advancing a project that members of its safety staff have long warned would end in disaster.
The encryption feature, which Meta has said will also be implemented on its Instagram app, is the culmination of work first announced in 2019 as part of a push to enhance users’ privacy. Even before that, some Meta employees had warned internally that such encryption would limit the ability to detect and report child sexual abuse on Meta’s platforms.
In 2018, David Erb was an engineering director at Facebook running a “community integrity” team focused on detecting harmful user behavior. When the team began studying inappropriate interactions between adults and minors on Facebook, it determined that the most frequent way adults found children to prey upon was Facebook’s “People You May Know” algorithm.
“It was a hundred times worse than any of us expected,” Erb said in a recent interview. “There were millions of pedophiles targeting tens of millions of children.”
Soon after, Erb learned that top executives were discussing encrypting Facebook messages. Worried that would prevent Meta from combating the child-safety problems his group had uncovered, Erb told his manager that he would resign if encryption were implemented on Facebook messages, he says. Within days, he was removed from his role and placed on leave. He ultimately resigned.
Meta has called privacy a fundamental human right, and privacy advocates have long advocated that tech companies embrace such “end-to-end” encryption, which blocks anyone other than senders and receivers from viewing messages. When Chief Executive Mark Zuckerberg announced the encryption plan in March 2019, he said that “the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure.”
He also acknowledged the dangers, saying the move would require working with law enforcement to help prevent the foreseeable abuse. “When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion,” Zuckerberg wrote.
Spokesman Andy Stone said Meta has since worked to address those risks, saying the company has “spent years developing robust safety measures on Facebook and Instagram to prevent and combat abuse or unlawful activity.”
While other companies, including Apple, have implemented such encryption on their messaging platforms, and Meta’s own WhatsApp platform is already the world’s largest encrypted messaging app, those services don’t generally connect users with strangers.
Facebook and Instagram both have expansive search and recommendation functions that let users find and message people they don’t know. Adding encryption to such services cloaks communications among strangers—including predators contacting children.
Other social-media networks with these discovery features don’t offer encryption. TikTok fully disabled messaging for users younger than 16 in 2020, and requires older teens to opt in to send and receive messages. It says it doesn’t offer encryption because “we place a premium on ensuring that our younger users have a safe experience.” YouTube eliminated private messaging in 2019, saying it wanted to “focus on improving public conversations.”
Meta said that it has taken measures to make messaging safer for underage users. It restricts users who tell the platform they are older than 19 from initiating conversations with teens who don’t follow them, and works to prevent adults who behave suspiciously from interacting with teenagers’ accounts.
Privacy advocates say encryption is essential in the digital age to protect communications from the prying eyes of governments, hackers and the businesses that own the platforms. Law-enforcement agencies and child-safety advocates, meanwhile, have long worried that encryption will make it harder to combat child predators.
‘Too old for you’
As Meta staffers prepared to implement encryption for Facebook messages, a Chicago-area man named Karl Quilter was using Facebook and two other messaging platforms to solicit sexual photos from young girls in the Philippines, according to court filings. On Facebook, his alias was “Mathew Jones.”
In May 2020, Quilter used Facebook to begin messaging a girl shortly before her 16th birthday. He told her he was 55, “too old for you,” but, days later, asked for photos of her breasts and vagina. In exchange, he wrote, he would send money for her to buy medication for herself and food for her family.
The girl uploaded three nude photos, saying: “I did this to help my brothers and sister and to eat more for a day.”
Quilter continued to message her on Facebook, telling her he would visit the Philippines that December, and asking her to promise that she would give him her virginity.
Facebook investigators discovered the messages and turned them over to authorities. The company, until recently, had protocols allowing certain employees to screen messages for potential child sexual abuse and imminent risks to human life. Federal investigators discovered that Quilter used the same Facebook account to solicit sex content from at least eight other girls in the Philippines.
Quilter was arrested in November 2020, a month before his planned trip. He was sentenced this year to 30 years in federal prison after pleading guilty to sexually exploiting children.
“Their trust and safety team’s ability to access messages was instrumental,” said Brian Fitzgerald, head of the Department of Homeland Security Chicago office that investigated Quilter.
‘Recipe for disaster’
Before rolling out encryption on Facebook, Meta considered measures to limit circumstances under which encryption would occur and pondered allowing some employees to collect additional data or access messages under certain circumstances. Vaishnavi J, Meta’s former head of youth policy, said there was internal discussion about exempting child accounts on Facebook and Instagram from encryption.
The measures were rejected, she and other former employees told The Wall Street Journal, both to avoid potential liability for what happened to teenagers on its platforms and because it would require staff to continue to spend time and resources dealing with problematic behavior. J lost her job earlier this year amid layoffs that were part of what Zuckerberg has branded Meta’s “year of efficiency.”
“Meta will consistently say this is an industrywide issue. But no other company is melding recommendations with encrypted messaging,” she said. “That’s a uniquely Meta recipe for disaster.”
Meta’s Stone called it “patently absurd” to claim that cost or labor savings factored into the decision to not exempt underage users.
Guy Rosen, now Meta’s chief information security officer, told employees in May 2019 that encrypting messages would require taking a different approach to detecting bad actors with the unencrypted data the company could still access.
“We need to be laser focused on doing everything we can to keep users safe, within the limits of what’s possible in a private, encrypted messaging surface,” he wrote in an internal Facebook post reviewed by the Journal.
Some employees voiced support for the move. Others responded with concern. One wrote that introducing encryption at such a scale would create “unknown unknowns”— a range of potential abuses that the company couldn’t entirely anticipate.
“I personally agree with these concerns and voiced them loudly,” Rosen replied. “The absolute best thing we can do is execute the crap out of this.” Rosen, through Meta, declined to comment.
Meta has built an internal tool to better make use of unencrypted data, in hopes of making up for the lack of access to messages. Macsa, short for “Malicious Child Safety Actor,” flags accounts with suspicious interactions such as adult users repeatedly blocked by young users, or ones with posts removed for containing child exploitation content. After a Journal article in June showed how Instagram’s algorithms were connecting and promoting accounts openly devoted to underage-sex content, Meta took measures to enhance Macsa. It expanded the algorithm to incorporate 60 different behavioral cues. If users trigger too many of them, Meta will restrict their ability to find and contact children, or potentially disable their accounts.
Larry Magid, CEO of ConnectSafely, a nonprofit that promotes online safety, said that while encrypted messaging could complicate investigations, it also could make children safer by shielding their communications from sexual predators or others who could use their information to target them.
“Some minors will use messaging apps to share information, location, medical concerns, mental health issues, sexuality information and other highly personal or confidential information with a trusted person,” said Magid, who is a member of Meta’s safety advisory council and whose group gets financial support from tech companies including Meta.
Meta is the largest source of tips to the National Center for Missing and Exploited Children that can then be passed on to law enforcement. In 2022, Meta reported more than 26 million incidents of child exploitation to the center. More than 20 million relied on the contents of Instagram and Facebook messages detected by the company without user assistance, work that won’t be possible with encrypted messaging.
“Unless the minors are actually reporting it to the platform, there will be no knowledge that it’s happening,” said John Shehan, a senior vice president at the center.
Jennifer Buta woke up one morning last year to find that her 17-year-old son, Jordan DeMay, had taken his life.
The prior evening, alleged scammers in Nigeria using a hacked account pretended to be a girl who flirted with him on Instagram to entice him to send a nude photograph, according to court filings. The alleged scammers then threatened to send it to everyone he knew unless he paid them $1,000. DeMay didn’t have enough money to meet their demands, and said he would kill himself if they went through with the threat.
“Good. Do that fast,” the scammers replied.
DeMay killed himself less than six hours after first contact with the scammers. Before he took his life, he deleted the conversation from his phone. Police recovered the messages from Instagram, along with the alleged scammers’ communications with more than 100 other victims, material that paved the way for the indictment and extradition of two men to the U.S. They have pleaded not guilty and are awaiting trial.
“If police hadn’t been able to get access to those messages, I might still be wondering why Jordan killed himself,” Buta said. “Encrypting messages on Instagram will create a breeding ground for people who can have access to our children thinking that ‘I’m never going to get caught.’”