Meta attributed the problem to an error the company says has been addressed.
“We apologize for the mistake,” the company said.
Meta made its own content-moderation changes more recently and has dismantled its fact-checking department in favor of community-driven moderation.
Amnesty International warned earlier this month that Meta’s changes could raise the risk of fueling violence.
Some content, Meta says, is also filtered for those younger than 18.
The videos included shootings, beheadings, people being struck by vehicles, and other violent acts.