Having laid bare over half a billion usernames and passwords through meager funding and witless indifference, Yahoo! is putting its faith in artificial intelligence to protect people from bare skin.
In a blog post on Friday, Yahoo! engineers Jay Mahadeokar and Gerry Pesavento said the company has released an open-source deep-learning model for detecting images deemed "not safe for work" (NSFW).
"To the best of our knowledge, there is no open source model or algorithm for identifying NSFW images," the pair wrote. "In the spirit of collaboration and with the hope of advancing this endeavor, we are releasing our deep learning model that will allow developers to experiment with a classifier for NSFW detection, and provide feedback to us on ways to improve the classifier."
READ MORE: http://www.theregister.co.uk/2016/09/30/yahoo_nsfw_detector/