Meta shared the data on violent content removal as part of its quarterly community standards enforcement report
Meta said its algorithms removed over 98% of the violating content without human intervention. Content removal was also up on Instagram, albeit marginally: the platform took down 2.7 million posts in Q1 2022, up from 2.6 million in Q4 2021. The company shared the data on content removal as part of its quarterly transparency report, also known as the community standards enforcement report (via Engadget). Meta attributed the sharp increase in the removal of violent content on Facebook to an expansion of its “proactive detection technology.”

Meta’s transparency report comes days after Facebook was criticized for acting slowly in removing content related to the racist mass shooting at a supermarket in Buffalo, NY. Several copies of the shooting video remained on Facebook for hours, and one post was shared over 46,000 times before removal, The Washington Post reports. Social media sites have a massive responsibility to restrict the spread of violent content before it reaches too many people, but recent events indicate that Meta’s platforms have a lot of work to do in this regard.
Meta acknowledged some of the challenges in removing violent content before it’s too late
Meta’s VP of Integrity, Guy Rosen, acknowledged some of the company’s limitations in a call with reporters. “One of the challenges we see through events like this is people create new content, new versions, new external links to try to evade our policies [and] evade our enforcement,” Rosen said. “As in any incident, we’re going to continue to learn to refine our processes, refine our systems to ensure that we can detect [and] take down violating content more quickly in the future.” Meta’s report also provides information on posts taken down by mistake. The company said it reversed the removal of 756,000 Facebook posts it had initially marked as violent. Meta further said it is “working on developing robust measurements around mistakes,” although it didn’t provide any details.