Linda Yaccarino, CEO of X, told US lawmakers that she is expanding the company's trust and safety staff. She did not mention that when Elon Musk bought the site in 2022, he fired the majority of its content moderators.
When Elon Musk took over Twitter, since rebranded as X, his favorite letter of the alphabet, he went on a firing spree. Chief among those ejected were people working on trust and safety, the work of keeping bad content, from hate speech to child exploitation, off the platform.
Testifying before a US Senate committee today, X CEO Linda Yaccarino seemed to imply that Musk went too far in dismantling the platform's safeguards, suggesting that the company was now walking that back somewhat. According to her, X has hired 100 additional moderators in Austin who will focus on child sexual exploitation, and it has expanded its number of trust and safety professionals by 10 percent in the last 14 months.
Yaccarino and the CEOs of Meta, TikTok, Snap, and Discord appeared at a Senate hearing on the failure of social networks to stop child sexual exploitation. She also repeatedly stated that "less than 1 percent" of X users are younger than 18. That claim—and her announcement that after 14 months of Musk's ownership and deep cuts to trust and safety, the company was now hiring new moderators—raised the eyebrows of social platform experts and former Twitter employees.
Theodora Skeadas, a former member of Twitter's trust and safety team who was let go by Musk in November 2022, says that despite the hires Yaccarino touted, X remains dreadfully understaffed for a large social media company. "Unless their technical systems for flagging and removing content have really improved, 100 is not enough," says Skeadas. "And that seems unlikely because they've fired so many engineers." X did not immediately respond to a request for comment.
The Mods’ Bonfire
Musk gutted the trust and safety teams when he fired almost half of Twitter's workforce shortly after he acquired the social media platform in October 2022. Researchers and civil society organizations that had built relationships with the site's trust and safety teams to report objectionable or hateful content soon found themselves without a point of contact at the company.
In the lead-up to Brazil's 2022 presidential runoff, the nation's electoral court grew so concerned that Musk might enable the spread of election-related misinformation that it came close to banning the platform. A group of academic researchers found that hate speech increased after Musk became the company's CEO. In September of last year, X fired five of the trust and safety employees in charge of battling misinformation. The firings came ahead of a historic election year.
Before Musk took control, approximately 400 Twitter employees worked on trust and safety, while an additional 5,000 contractors assisted with platform content reviews, according to Skeadas. The majority of those employees, as well as over 4,000 contractors, were let go.
Yaccarino claims to have increased trust and safety staff by over 10 percent, although the platform likely still employs significantly fewer people to keep users safe. According to Skeadas, there is "no way" that the business has more trust and safety personnel than it did before Musk. "It's a 10 percent increase if they hired two people out of the remaining 20 employees, but it's still nothing compared to before," she says.
Even if they were focused solely on child sexual abuse, adding back 100 moderators—as Yaccarino asserted before the Senate today—would not be nearly enough to adequately monitor content relating to underage users and child exploitation, according to Skeadas.
Matt Motyl, a former Meta employee who is currently a research and policy fellow at the Integrity Institute, a think tank focused on trust and safety, concurs. He also finds it hard to believe Yaccarino's assertion that fewer than 1 percent of X's users are under the age of 18, a claim she used to argue that X is less affected by many of the issues raised by the Senate committee than the platforms of her fellow CEOs who were testifying.
According to Motyl, "X doesn't have much by way of age verification," making it relatively simple for a young user to lie about their age to access the platform. A December 2023 Pew Research study found that 20 percent of American teenagers between the ages of 13 and 17 report that they use the platform. There may be many teenagers on the site who are simply hiding their age and are not counted in Yaccarino's figures.
However many teenagers are using X in 2024, Motyl and Skeadas assert that neither they nor the site's other users are as well protected as they ought to be.
According to Motyl, Yaccarino, who oversees a platform with between 300 and 500 million users globally, is more focused on show than substance. "One hundred moderators is nothing," he asserts. "It's a trust and safety play."
Conclusion
Linda Yaccarino's push for more moderators on X underscores ongoing challenges in maintaining trust and safety. Despite efforts to increase staffing, concerns linger over the platform's ability to safeguard its users, and the aftermath of Elon Musk's staffing decisions continues to raise questions about content moderation on social media.