Meta CEO Mark Zuckerberg apologized to families who say their children have been harmed by social media during a heated hearing in the US Senate.
“I am sorry for everything you have all gone through. It is terrible. No one should have to go through the things that your families have suffered,” said Mr. Zuckerberg, whose company runs Facebook and Instagram.
He and the heads of TikTok, Snap, X, and Discord were questioned for nearly four hours by senators from both parties.
Lawmakers wanted to know what they were doing to protect children online.
A bill currently moving through Congress aims to hold social media companies accountable for the content posted on their platforms.
Wednesday’s hearing was a rare opportunity for U.S. senators to question the heads of technology companies.
Mr. Zuckerberg and TikTok CEO Shou Zi Chew agreed to testify voluntarily, but executives at Snap initially refused.
Seated behind the five executives were families who say their children harmed themselves, or died by suicide, because of content they encountered on social media. They made their feelings known throughout, hissing as the CEOs entered and applauding when lawmakers asked tough questions.
Although the hearing focused primarily on protecting children from online sexual exploitation, senators took advantage of having five powerful executives under oath to ask wide-ranging questions.
Mr. Chew, whose company TikTok is owned by the Chinese firm ByteDance, was asked whether it had shared US user data with the Chinese government; he denied that it had.
US Senator Tom Cotton asked Mr. Chew, who is Singaporean, whether he had ever belonged to the Chinese Communist Party.
Parents rally for urgent legislation:
After the hearing, parents of victims held a rally outside, calling on lawmakers to quickly pass the Kids Online Safety Act. Many shared personal stories and stressed the urgency of legislative action.
Former Meta executive Arturo Bejar criticized the company’s approach, saying Meta must take responsibility for creating a safe environment for young people.
During the hearing, the tech giants revealed how many content moderators they employ on their platforms. Meta and TikTok, which have the largest user bases, each reported about 40,000 moderators, while Snap, X, and Discord gave far smaller figures. Discord, which has previously done research into child abuse prevention, said it has “hundreds” of moderators.
Social media industry analyst Matt Navarra commented on the hearing and pointed to a familiar pattern of political self-promotion. Despite bipartisan agreement on the need for regulation, Mr. Navarra was skeptical that the hearing would lead to significant regulatory change, predicting that 2024 would bring no major regulation of the US social media landscape.