Microsoft's PhotoDNA project will allow law enforcement and image hosts to identify offending images by matching them against known illegal content
Crafted as a joint effort between Microsoft and NetClean, in conjunction with the National Center for Missing and Exploited Children, PhotoDNA will allow both police and private companies to trawl through millions of images for child pornography by comparing them against known images. The software is being given to law enforcement agencies for free and has been licensed to Facebook. Microsoft has also turned the service on its own properties, using it to scan Bing, Hotmail, and SkyDrive, where it found 2,500 images among more than two billion scanned.
The real power of this software is that it works even if an image has been edited significantly. Each image is converted to black and white and split into a grid, and an intensity-gradient histogram is generated for each grid square. The resulting "DNA" signature doesn't contain enough information to reconstruct the image, but it remains stable enough that matches can be spotted even after editing. This is a different approach from the one facial recognition and similar technologies use to identify people.
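To make the idea concrete, here is a minimal sketch of that kind of robust signature. This is not Microsoft's actual PhotoDNA algorithm (which is proprietary); all function names, the grid size, and the histogram binning are illustrative assumptions. It follows the steps described above: grayscale conversion, a grid split, and a per-cell histogram of gradient orientations, compared by a simple distance.

```python
# Illustrative sketch of a PhotoDNA-style robust image signature.
# NOT the real PhotoDNA algorithm; parameters and names are assumptions.
from math import atan2, pi, sqrt

def grayscale(pixels):
    """pixels: 2D list of (r, g, b) tuples -> 2D list of luminance values."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in pixels]

def cell_histogram(gray, y0, y1, x0, x1, bins=4):
    """Histogram of gradient orientations in one grid cell, magnitude-weighted."""
    hist = [0.0] * bins
    for y in range(max(y0, 1), min(y1, len(gray) - 1)):
        for x in range(max(x0, 1), min(x1, len(gray[0]) - 1)):
            dx = gray[y][x + 1] - gray[y][x - 1]   # horizontal gradient
            dy = gray[y + 1][x] - gray[y - 1][x]   # vertical gradient
            mag = sqrt(dx * dx + dy * dy)
            angle = atan2(dy, dx) % pi             # orientation in [0, pi)
            hist[min(int(angle / pi * bins), bins - 1)] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]               # normalize per cell

def signature(pixels, grid=4):
    """Concatenate the per-cell histograms into one signature vector."""
    gray = grayscale(pixels)
    h, w = len(gray), len(gray[0])
    sig = []
    for gy in range(grid):
        for gx in range(grid):
            sig.extend(cell_histogram(gray,
                                      gy * h // grid, (gy + 1) * h // grid,
                                      gx * w // grid, (gx + 1) * w // grid))
    return sig

def distance(a, b):
    """L1 distance: a small value suggests the same image, even after edits."""
    return sum(abs(x - y) for x, y in zip(a, b))
```

Because the signature is built from normalized gradient histograms rather than raw pixels, uniform edits such as brightening barely move it, while a genuinely different image lands far away, which is the property that lets matching survive editing.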
As of right now, the tech boasts "a rate of zero false positives," which is astonishing if true. The approach also means that law enforcement can search for the images without having to view the pictures directly, which can be psychologically damaging. While the service can't identify new images, it will catch those already in circulation.
[via Ars Technica]