Commentary on Political Economy

Wednesday 31 January 2024

 

‘You Have Blood on Your Hands’: Senators Say Tech Platforms Hurt Children

Chief executives from tech companies, including Meta CEO Mark Zuckerberg, faced lawmakers Wednesday in a hearing highlighting risks that social-media platforms pose to children. Photo: Andrew Caballero-Reynolds/AFP/Getty Images

Mark Zuckerberg, TikTok’s Shou Zi Chew and other tech CEOs faced withering bipartisan criticism on Wednesday from senators who said social-media platforms must bear more legal liability when children are harmed online.

“You have blood on your hands,” Sen. Lindsey Graham (R., S.C.) told the executives during a hearing of the Senate Judiciary Committee, eliciting applause from a packed audience that included many holding pictures of children. 

The presence of grieving families lent the roughly four-hour session an emotional charge, as lawmakers repeated stories of sexual exploitation, suicide and other suffering blamed on social media.

At the same time, it wasn’t clear it would lead to a different result than previous congressional tongue-lashings of the tech industry. Several senators acknowledged the futility of their legislative response to date, despite a bipartisan consensus that the current laws don’t adequately address harms to children on the platforms.

“We have an annual flogging every year,” said Sen. Thom Tillis (R., N.C.). “And what materially has occurred?”

The Wall Street Journal has highlighted persistent dangers to children on social-media platforms, including how Instagram’s algorithms connect a vast network of pedophiles. Several lawmakers cited the Journal’s reporting in their criticism, and they pointed to a wave of lawsuits filed by parents and state attorneys general seeking to hold platforms accountable. Senators noted that many had been dismissed under laws designed to protect online speech.

Senate Judiciary Chairman Dick Durbin (D., Ill.) blasted the executives for a “crisis” in child sexual exploitation online.

Senate Judiciary Committee Chairman Sen. Dick Durbin (D., Ill.), right, listens as ranking member Sen. Lindsey Graham (R., S.C.) speaks during the hearing with the heads of social-media platforms on Wednesday. Photo: Susan Walsh/Associated Press

“Their design choices, their failures to adequately invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk,” he said.

Zuckerberg, who got many of the most pointed questions, told lawmakers there are positive aspects of children’s interactions on Meta platforms. He also praised Facebook’s investment in child-safety work, saying the company has gone beyond legal requirements in seeking to remove abusive material.


“I’m proud of the work that our teams do to improve online child safety on our services and across the entire internet,” Zuckerberg said, pointing to technology that detects inappropriate or abusive content and tools that he said help parents get more involved in children’s decisions.

Meta reported about 27.2 million instances of suspected child sexual-abuse material on its main platforms to the National Center for Missing and Exploited Children in 2022, far more than any other company, according to the nonprofit group’s data. The U.S. total for all platforms was about 32 million in 2022.

But Meta has announced plans to encrypt messaging on its platforms, a step that will block the automated detection systems responsible for the majority of its reports.

Zuckerberg under scrutiny

At one point, Sen. Josh Hawley (R., Mo.) asked Zuckerberg to apologize to parents in the audience. The Facebook founder stood, turned, and said: “I’m sorry for everything that you have all gone through. It’s terrible. No one should have to go through the things that your families have.”

“This is why we invest so much and are going to continue doing industry leading efforts…to make sure that no one has to go through the types of things that your families have had to suffer,” he said.

Zuckerberg was asked about internal documents, released Wednesday by two lawmakers, that show top company officials asking him to invest in additional protections for children on the company's platforms.

Those requests for resources weren’t granted, according to state attorneys general who previously referenced some of the same material.

Meta CEO Mark Zuckerberg at the Senate Judiciary Committee hearing on Wednesday. Photo: Evelyn Hockstein/Reuters

A Meta spokesman said the documents “do not provide the full context of how the company operates or what decisions were made” and noted Zuckerberg’s written testimony for the hearing said the company has spent $5 billion on safety and security in the past year.

Sen. Ted Cruz (R., Texas) cited a Journal report last year that Instagram in many cases has permitted users to search for terms that its algorithms know may be associated with illegal material. In such cases, a pop-up screen warned users the results “may contain images of child sexual abuse” and offered two options for users: “Get resources” and “See results anyway,” the Journal reported. 

“Mr. Zuckerberg, what the hell were you thinking?” Cruz asked. “In what sane universe is there a link to ‘See results anyway?’”

“Well, we might be wrong” about the material being inappropriate, Zuckerberg responded. He also noted that the company reports more suspected child-exploitation material to authorities than any other social-media company.

In response to questions from the Journal, Instagram removed the option for users to view search results for terms likely to produce illegal images.

TikTok data

Chew’s prepared remarks touted TikTok’s growing U.S. user base—now at 170 million, up from 150 million in 2023—and its average age of 30. He said the platform takes steps to minimize the exploitation of children, such as prohibitions on direct messaging for users under 16 and on recommending their videos to strangers.

But Chew ran into critical questions from Sen. John Cornyn (R., Texas) over how well TikTok protects its U.S. users’ data from Chinese authorities. TikTok is owned by Beijing-based ByteDance, and U.S. officials worry that it could be forced to provide sensitive U.S. user data to Chinese authorities. 

Cornyn pointed to a Wall Street Journal report this week that TikTok is struggling to wall off U.S. users’ data from ByteDance as it has promised to do. Chew disputed the article’s accuracy but said that “no system that any one of us can build is perfect.”

TikTok CEO Shou Zi Chew ahead of the Senate hearing. Photo: Mark Schiefelbein/Associated Press

Legal liability

Groups representing young victims of online harms including social-media addiction and sexual exploitation held media events this week to help drum up support for lawmakers’ efforts to rein in social-media platforms. 

John DeMay, the father of a Michigan teenager who died by suicide after falling victim to an online extortion scheme, said he hopes the Senate hearing will bring awareness “that social media is not a safe place, especially for children.” He and his family are considering legal options.

But suing the companies for harm to children can be legally difficult.

Currently, the platforms often can avoid liability when someone is harmed as a result of social-media use because of special legal protections that Congress created for the platforms in the 1990s when the internet was in its infancy. Those protections generally immunize the platforms from liability for harm from content generated by other users. 

Snap CEO Evan Spiegel at Wednesday's hearing, where many in the packed audience held pictures of children. Photo: Evelyn Hockstein/Reuters

Durbin and other lawmakers have proposed removing those special legal protections in cases where children are sexually exploited.  

Industry representatives say that those bills could harm users’ privacy, mainly by discouraging the platforms’ use of encryption. Tech industry allies argue that could affect a range of groups from LGBTQ youth to people seeking reproductive health services.

