Senate grills Zuckerberg, other tech CEOs, on kids' safety failures
The packed hearing is focusing on a push for kids' safety and a package of stalled-out bills.
The Senate began assailing the CEOs of Meta, X, TikTok, Snap and Discord on Wednesday morning, pressing them over their failure to keep kids safe from sexual exploitation and drug sales on their sites, as well as the mental health impact of their immensely popular platforms.
The chair of the Judiciary Committee hopes to jump-start a new national debate about harms facing children online — fueled by mounting national frustration among advocates, whistleblowers and victims, who were emotionally represented in the room Wednesday.
The committee has advanced multiple bills aiming to hold tech companies liable for hosting child sexual abuse material, but those bills have stalled in the full Senate.
“Their design choices, their failures to adequately invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk,” Chair Dick Durbin (D-Ill.) said in his opening remarks.
As in previous CEO appearances, the Senate chamber was packed with media, as well as safety advocates and tech industry lobbyists piling in to see the top executives make the case that their platforms are safe.
The audience was full of victims of child exploitation, as well as families and parents whose children died due to bullying and drug sales on the social media platforms — 400 of whom sent a letter to pressure Congress to act urgently to better protect kids on the sites. A number of families held framed photos as well as buttons and pins of their deceased children.
Despite the fresh energy, the hearing also came with a strong whiff of déjà vu, with Meta CEO Mark Zuckerberg making his eighth appearance on Capitol Hill and TikTok CEO Shou Zi Chew his second.
The three CEOs of smaller platforms have not previously testified on the Hill: Linda Yaccarino of X, Evan Spiegel of Snap and Jason Citron of Discord.
Despite years of rhetoric, mounting court cases against tech companies and repeated public hearings, Congress has not passed any significant law protecting kids online in recent years.
Even Durbin acknowledged in his opening remarks Congress’ failure to act. “The tech industry alone is not to blame for the situation we are in,” he said. “Those of us in Congress need to look in the mirror.”
He said it was time for Congress to change Section 230 of the 1996 Communications Decency Act, which lets platforms largely avoid liability for content that users post online, to “finally hold tech companies accountable for child sexual exploitation on their platforms.”
Several bills from his committee, including his STOP CSAM Act, would eliminate that protection to specifically allow victims of child exploitation to sue platforms.
Lawmakers are also expected to score points on their own pet issues, including China hawks like Sen. Josh Hawley (R-Mo.), who’s likely to attack TikTok’s Chew over national security concerns tied to the company's Beijing-based owner ByteDance.
All the platforms will be asked about the growing amount of child sexual abuse material on their sites. The National Center for Missing and Exploited Children said it received 36.2 million reports of child sexual and exploitative material from platforms in 2023, a major jump from the 21 million reports it received in 2020. The executives will also be asked about the growth of so-called financial "sextortion," in which a predator uses a fake social media account to trick a minor into sending explicit photos or videos, then threatens to release them unless the victim sends money.
In the hearing, lawmakers will push for action on their legislation. Sen. Richard Blumenthal (D-Conn.) told reporters Tuesday that he plans to ask platforms if they support the Kids Online Safety Act (KOSA), which he cosponsors with Sen. Marsha Blackburn (R-Tenn.). The bill aims to stop platforms from recommending harmful material — like suicide and eating disorder content.
Snap was the first social media platform to support KOSA, as POLITICO first reported last week. But none of the other social media platforms testifying have backed the bill.
Separately, Microsoft President Brad Smith announced Tuesday that the company backs KOSA as well. Several of Microsoft's products, including its Xbox gaming systems (for users 13 and older) and networking platform LinkedIn (for users 16 and older), would be covered by the bill.
Blumenthal also plans to interrogate Zuckerberg about newly disclosed emails in which the CEO rejected an August 2021 request from Nick Clegg, currently Meta's president of global affairs, for increased resources and staffing to address kids' safety.
In recent years, legislators have been bombarded with evidence that social media is negatively impacting young Americans.
Two whistleblowers, former Meta employees Frances Haugen and Arturo Béjar, have come forward to detail the ways in which the company repeatedly ignored reports that young people on its platform are bullied, harassed, and exposed to content that causes them to compare themselves to others.
And due to inaction from Capitol Hill, state legislators are stepping in. States passed 23 kids’ online safety laws in 2023. And more state legislation is coming this year. Florida’s conservative House passed a bill to ban teens under 16 from “addictive” apps (though it includes carve-outs for messaging apps). Maryland, Minnesota, New Mexico, Vermont and Illinois are all introducing bills on “age appropriate design” modeled after the U.K.’s Children’s Code.
The companies have rolled out a number of voluntary product announcements in recent weeks, which CEOs highlighted in their opening remarks — with Meta being the most aggressive in its pre-hearing safety push.
In his opening statement, Zuckerberg emphasized the company's plans, previewed earlier this month, to block suicide and eating disorder content for users, mimicking proposals in KOSA.
He also touted the company's own legislative proposal calling for Congress to pass a law putting the onus on app stores, not platforms, to obtain parental consent before kids under 16 download social media apps.
Meta, TikTok and X have also launched a series of online ads focused on children and teens’ safety in the weeks leading up to the hearing — with Meta and TikTok running print ads in D.C. as well.