Facebook announced Wednesday that moderators have used a new technology to remove millions of inappropriate photos of children.
The company's Global Head of Safety, Antigone Davis, wrote in a blog post that 8.7 million sexual images of children have been taken down in the last three months alone.
"One of our most important responsibilities is keeping children safe on Facebook," Davis said. "Today we are sharing some of the work we've been doing over the past year to develop new technology in the fight against child exploitation."
The company says its new machine learning tool can detect child nudity and previously unknown child exploitative content as it's uploaded.
That way, the company can more quickly report violations to the National Center for Missing and Exploited Children (NCMEC).
Before the new software, Facebook relied on users or its adult nudity filters to catch child images.
The new technology even identifies specific accounts that engage in potentially inappropriate interactions with children.
Facebook's Community Standards also cover nonsexual content, such as otherwise innocent family photos of children in the bathtub.
"We'd rather err on the side of caution with children," Davis told Reuters in an interview.
The company says it makes exceptions for art and history.
In its blog post, Facebook also announced it will join Microsoft and other industry partners next month to "begin building tools for smaller companies to prevent the grooming of children online."