Machine learning tools help Facebook identify images that contain both children and nudity. Facebook bans photographs depicting minors in a sexual context, as well as images of naked children in non-sexual contexts, such as "seemingly benign photos of children in the bath."
"We're using artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it's uploaded," the company said in a blog post.
"We use photo-matching technologies, so if someone tries to share an image that's known, we can prevent that image from being shared on our platform," Antigone Davis, Facebook's head of global safety, said in a video statement. "Recently, our engineers have been focused on classifiers to actually prevent unknown images, new images. Using a nudity filter as well as signals to indicate that it's a minor, we're actually able to get in front of those images and to remove them."
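The photo-matching approach Davis describes, blocking known images at upload while routing new ones to classifiers, can be sketched roughly as follows. This is a minimal illustration, not Facebook's implementation: all names here are hypothetical, and a cryptographic hash stands in for the perceptual hashes (such as Microsoft's PhotoDNA) that real systems use to match near-duplicate images rather than only exact copies.

```python
import hashlib

# Hypothetical database of hashes of known violating images.
# Real systems use perceptual hashes, which also catch resized or
# slightly altered copies; SHA-256 matches exact bytes only.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the upload matches a hash of a known image."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

def handle_upload(image_bytes: bytes) -> str:
    # Known images are blocked before they can be shared; previously
    # unseen images would instead go to a classifier pipeline (not shown)
    # that combines a nudity signal with signals that the subject is a minor.
    return "blocked" if is_known_image(image_bytes) else "send_to_classifiers"
```

The two-stage design mirrors the statement above: cheap hash lookups stop re-uploads of already-identified material, while the classifier stage handles "unknown images, new images."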
In the past three months, Facebook "removed 8.7 million pieces of content on Facebook that violated our child nudity or sexual exploitation of children policies, 99 percent of which was removed before anyone reported it," the company said.
"We also remove accounts that promote this type of content," it added.
The company said that it reports violations of its policy to the National Center for Missing and Exploited Children (NCMEC).
NCMEC Chief Operating Officer Michelle DeLaune said that "when Facebook detects child sexual abuse material, they remove it, and then they provide it to us. Our job is then to make that report available to the appropriate law enforcement agency."