CBNNews.com

Facebook Launches New Tech to Fight Child Exploitation, Removes 8.7M 'Sexual' Photos

10-26-2018
Facebook announced Wednesday that its moderators have used new technology to remove millions of inappropriate photos of children.
 
The company's Global Head of Safety Antigone Davis wrote in a blog post that 8.7 million 'sexual' child images have been taken down in the last three months alone.
 
"One of our most important responsibilities is keeping children safe on Facebook," Davis said. "Today we are sharing some of the work we've been doing over the past year to develop new technology in the fight against child exploitation."
 
The company says its new machine learning tool can detect child nudity and previously unknown child exploitative content when it is uploaded.
 
That way, the company can more quickly report violations to the National Center for Missing and Exploited Children (NCMEC).
 
Before the new software, Facebook relied on users or its adult nudity filters to catch child images.
 
The new technology can even identify specific accounts that engage in potentially inappropriate interactions with children.
 
Under its Community Standards, Facebook also takes action against nonsexual content, such as innocent family photos of children in the bathtub.
 
"We'd rather err on the side of caution with children," Davis told Reuters in an interview.
 
The company says it makes exceptions for art and history.
 
In the blog post, Facebook also announced it will join Microsoft and other industry partners next month to "begin building tools for smaller companies to prevent the grooming of children online."
