In testimony before a U.S. Senate panel on Tuesday, a former Meta (META.O) employee said that the parent company of Facebook and Instagram knew about the harassment and other harms children were experiencing on its platforms but failed to act on them.
The employee, Arturo Bejar, said he served as a director of engineering on Facebook’s Protect and Care team from 2009 to 2015 and worked on well-being at Instagram from 2019 to 2021.
Bejar testified about social media’s effects on adolescent mental health before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law.
In prepared remarks released before the hearing, he said, “It’s time that young users have the tools to report and suppress online abuse, and it’s time that the public and parents understand the true level of harm posed by these ‘products.’”
Bejar’s testimony coincides with a bipartisan push in Congress to pass legislation requiring social media companies to provide parents with tools to keep children safe online.
Bejar told the panel that his work at Meta aimed to shape the design of Facebook and Instagram in ways that would encourage more positive behavior and give young people tools to handle negative experiences.
In response, Meta reaffirmed its commitment to keeping young people safe online, pointing to its support for the user surveys Bejar cited in his testimony and to the tools it has developed, such as anonymous ways to report potentially harmful content.
“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” the company said in a statement. “All of this work continues.”
Bejar told the senators that he met regularly with senior company leaders, including Chief Executive Mark Zuckerberg, and that at the time he believed they supported the work. He later concluded, however, that the executives had decided “time and time again to not tackle this issue,” he said in his testimony.
In a 2021 email, Bejar sent internal data to Zuckerberg and other senior executives showing that 24.4% of children aged 13 to 15 had reported receiving unwanted sexual advances and that 51% of Instagram users had reported having a bad or harmful experience on the platform in the previous seven days.
He also told them that his own 16-year-old daughter had received offensive images and sexist remarks but had no adequate way to report the incidents to the company. The Wall Street Journal first reported the email’s existence.
In his testimony, Bejar said that in one discussion, Meta Chief Product Officer Chris Cox was able to recall specific figures about risks to teenagers.
“I found it heartbreaking because it meant that they knew and that they were not acting on it,” Bejar added.