SOCIAL MEDIA

Social Media CEOs Face Tough Questions Over Child Protection in US Senate

The CEOs of Snap, Meta, X, and TikTok appeared before the U.S. Senate today to discuss their respective efforts to combat child exploitation content in their apps, and to answer questions about their ongoing development of new initiatives to better protect young users.

The Senate Judiciary Committee’s “Big Tech and the Online Child Sexual Exploitation Crisis” hearing was originally scheduled for late last year, but had to be pushed back to ensure that all of the CEOs could attend. The hearing is an extension of an earlier session, in which the Senate heard from child safety experts about the harm caused by social media apps.

So today, the company chiefs themselves had the opportunity to present their side of the story, and detail what each is doing to combat CSAM content.

And some Senators did not hold back in their criticism of the platforms.

First off, each of the CEOs shared a prepared statement providing an overview of their efforts and plans.

Meta CEO Mark Zuckerberg outlined Meta’s protective systems, which include 40,000 dedicated staff working on safety and security, and noted that Meta has invested over $20 billion in this area since 2016.

Though Zuckerberg also pushed back on criticisms made in the previous session regarding the harms caused by social media apps:

“A recent report from the National Academies of Sciences evaluated results from more than 300 studies and determined that the research “did not support the conclusion that social media causes changes in adolescent mental health at the population level.” It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore, and connect with others.”

Zuckerberg also re-emphasized Meta’s recently outlined proposal that app stores be held responsible for underage downloads:

“For example, 3 out of 4 parents favor introducing app store age verification, and 4 out of 5 parents want legislation requiring app stores to get parental approval whenever teens download apps.”

So while Zuckerberg is willing to take his share of the heat, he also set the tone early that he believes there are counterpoints to the claims put forward by child safety experts.

X CEO Linda Yaccarino emphasized her own perspective as a mother, and outlined X’s efforts to implement broader protections for young users:

“In the last 14 months X has made material changes to protect minors. Our policy is clear – X has zero tolerance towards any material that features or promotes child sexual exploitation.”

Yaccarino also explained that in 2023, X suspended more than 12 million accounts for violating its CSE policies, and sent 850,000 reports to the National Center for Missing & Exploited Children (NCMEC) via a new automated reporting system designed to streamline the process.

Yaccarino outlined the same in a recent post on X. The automated reporting element, in particular, could lead to further issues with incorrect reports, but at the same time, it reduces the labor load on X, and with 80% fewer staff than the previous Twitter team, the company needs to utilize automated solutions where it can.
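
For a sense of how such a pipeline might be structured, here’s a minimal sketch: exact matches against a known-content hash list are reported automatically, while uncertain machine judgments are routed to human moderators instead. To be clear, this is illustrative only – the hash list, the classifier, the thresholds, and the `submit_report()` stand-in are all assumptions, not X’s or NCMEC’s actual systems.

```python
import hashlib
from dataclasses import dataclass

# Illustrative hash list; in practice this would be an industry-shared
# database of known CSAM signatures (hypothetical here).
KNOWN_HASHES: set[str] = set()

@dataclass
class MediaItem:
    item_id: str
    content: bytes

def media_hash(item: MediaItem) -> str:
    # Production systems typically use perceptual hashes (PhotoDNA-style)
    # so near-duplicates still match; SHA-256 here is just a stand-in.
    return hashlib.sha256(item.content).hexdigest()

def classifier_score(item: MediaItem) -> float:
    # Placeholder for a real content classifier.
    return 0.0

def submit_report(item: MediaItem) -> None:
    # Placeholder: a real pipeline would file a CyberTipline report with NCMEC.
    print(f"Report filed for {item.item_id}")

def triage(item: MediaItem) -> str:
    """Route an uploaded item: auto-report, human review, or allow."""
    if media_hash(item) in KNOWN_HASHES:
        # Exact match against known material: report automatically.
        # This is where the labor savings come from, and also where a bad
        # hash entry would produce an incorrect report.
        submit_report(item)
        return "auto_reported"
    if classifier_score(item) > 0.9:
        # Uncertain machine judgments go to human moderators rather than
        # being auto-reported, limiting false positives.
        return "human_review"
    return "allowed"
```

The trade-off noted above lives in that first branch: the more reporting is automated, the less moderator labor is needed, but the quality of the hash list then directly determines the rate of incorrect reports.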

Yaccarino also noted that X is building a new 100-person moderation team, to be based in Texas, which will be specifically focused on CSAM content.

Snapchat CEO Evan Spiegel, meanwhile, emphasized the platform’s foundational approach to privacy in his statement:

“Snapchat is private by default, meaning that people need to opt-in to add friends and choose who can contact them. When we built Snapchat, we chose to have the images and videos sent through our service delete by default. Like prior generations who have enjoyed the privacy afforded by phone calls, which aren’t recorded, our generation has benefitted from the ability to share moments through Snapchat that may not be picture perfect but instead convey emotion without permanence.”
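
That “delete by default” design can be pictured as media written with an expiry stamp at upload time, so deletion is the norm rather than an action the user has to take. Here’s a minimal sketch under that assumption; Snapchat’s actual architecture isn’t public, so the structure and the 24-hour TTL below are illustrative only.

```python
import time

TTL_SECONDS = 24 * 60 * 60  # assumed expiry window; illustrative only

class EphemeralStore:
    """Toy media store where every item expires by default."""

    def __init__(self) -> None:
        self._items: dict[str, tuple[float, bytes]] = {}

    def put(self, snap_id: str, media: bytes) -> None:
        # Every item gets an expiry stamp on write: deletion is the
        # default, not something the sender has to request.
        self._items[snap_id] = (time.time() + TTL_SECONDS, media)

    def get(self, snap_id: str) -> bytes | None:
        entry = self._items.get(snap_id)
        if entry is None:
            return None
        expires_at, media = entry
        if time.time() >= expires_at:
            del self._items[snap_id]  # expired: purge on access
            return None
        return media
```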

Spiegel also cited the platform’s NCMEC reporting figures, stating that Snap submitted 690,000 NCMEC reports last year.

TikTok chief Shou Zi Chew, meanwhile, outlined TikTok’s evolving CSAM detection efforts, which will include significant investment in new initiatives:

“We currently have more than 40,000 trust and safety professionals working to protect our community, and we expect to invest more than two billion dollars in trust and safety efforts this year alone – with a significant part of that investment in our US operations.”

TikTok is arguably in a tougher position, given that many senators are already seeking to ban the app over concerns about its connection to the Chinese government. But Chew argued that the platform is leading the way on many aspects of CSAM detection, and is looking to build on them where it can.

The session included a range of pointed questions from the Senate floor, including this remark from Senator Lindsey Graham:

“Mr. Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands. You have a product that’s killing people.”

Zuckerberg was the main focus of much of the angst, which makes sense, given that he’s in charge of the most used social media platforms in the world.

Senator Josh Hawley also pushed Zuckerberg to apologize to families that have been harmed by his company’s apps, which, somewhat unexpectedly, Zuckerberg did, turning to the gallery to address a group of parents in attendance:

“I’m sorry for everything you have all been through. No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry wide efforts to make sure no one has to go through the things your families have had to suffer.”

Yet, at the same time, a new report indicates that Zuckerberg rejected calls to bolster Meta’s protective resources in 2021, despite requests from staff.

As reported by The New York Times:

“In 90 pages of internal emails from fall 2021, top officials at Meta, which owns Instagram and Facebook, debated the addition of dozens of engineers and other employees to focus on children’s well-being and safety. One proposal to Mr. Zuckerberg for 45 new staff members was declined.”

Zuckerberg maintained his composure under pressure, but clearly, many concerns remain about Meta’s initiatives on this front.

Several senators also used today’s session to call for changes to the law, in particular Section 230, in order to reduce the protections afforded to social platforms over harmful content. Thus far, attempts to repeal Section 230, which shields social apps from lawsuits over the content that users share, have been rebuffed, and it’ll be interesting to see if this angle moves the discussion forward.

In terms of platform specifics, Yaccarino was questioned about X’s reduced staffing, and how it’s impacted its CSAM detection programs, while Spiegel was pressed on the role that Snap has played in facilitating drug deals, and fentanyl trading in particular. Both provided sanitized assurances that more is being done.

It was a tense session, with senators looking to press their case that social platforms need to do more to protect young users. I’m not sure that any of the proposed law changes will advance as a result of today’s grilling, but it is interesting to note the various elements at play, and how the major platforms are looking to implement solutions to address these concerns.



