Facebook Inc. removed tens of millions of user posts in the past six months for violating its terms of service, which prohibit content such as child pornography, drug sales and terrorism. Millions more posts were removed from Instagram.
That’s according to a report released Wednesday by Facebook detailing how the social media company enforces its own content policies. The report, which is published every six months and for the first time includes data from Instagram, said that Facebook automatically identifies most of the content it removes using its own software algorithms.
The numbers are a reminder of the scale at which Facebook operates. Following the report’s release, Chief Executive Officer Mark Zuckerberg said that the company — the world’s biggest social network — gets unfairly criticized for reporting large takedown numbers, but that those figures actually show Facebook is taking these problems more seriously than its competitors are.
Some people look at the amount of content Facebook takes down “and come to the conclusion that because we’re reporting big numbers that that must mean that so much more harmful content is happening on our services than others,” he said. “What it says, if anything, is that we’re working harder to identify this and take action on it and be transparent about that than what any others are.”
Some highlights from the report: