Facebook launches AI to find and remove 'revenge porn'

Facebook is rolling out technology to make it easier to find and remove intimate pictures and videos posted without the subject's consent, often called "revenge porn."

Currently, victims of revenge porn or other Facebook users must report the offending pictures before content moderators will review them. The company has also suggested that users send their own intimate images to Facebook so that the service can identify any unauthorized uploads. Many users, however, balked at the notion of sharing revealing photos or videos with the social-media giant, particularly given its history of privacy failures.

The company's new machine-learning tool is designed to find and flag the pictures automatically, then route them to human moderators for review.

Facebook and other social media sites have struggled to monitor and contain the objectionable posts that users upload, from violent threats to conspiracy theories to nonconsensual intimate images.

Dawn Kamber