Facebook uses image-matching to remove child porn

Facebook is using image-matching software called PhotoDNA to try to detect child pornography online.

The software was developed two years ago by Dartmouth College and Microsoft, and is based on a database of child pornography images from the US National Center for Missing and Exploited Children (NCMEC). It automatically compares uploaded images with the database, and flags any that appear to be the same.

“Traditionally if people resized a photo or edited it in any way at all, the digital blueprint or signature of that photo would be different, and it would be difficult or sometimes even impossible to find the nearly identical images,” says Brad Smith, Microsoft’s general counsel and senior vice president of legal and corporate affairs.

But PhotoDNA, say the companies, can detect these images even when they’ve been altered in this way.
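PhotoDNA’s actual algorithm is proprietary, but the general idea behind such robust matching can be illustrated with a much simpler perceptual hash, the so-called “difference hash”. Unlike a cryptographic hash, which changes completely when a photo is resized, a perceptual hash summarizes the image’s coarse structure, so a near-duplicate produces a near-identical hash. The sketch below (not PhotoDNA itself, and using synthetic pixel data rather than real images) shows a resized copy still matching its original:

```python
# Illustrative "difference hash" (dHash) -- a simplified stand-in for
# robust image matching like PhotoDNA, which is proprietary. The image
# here is just a 2-D list of grayscale values, 0-255.

def resize_gray(img, w, h):
    """Nearest-neighbour resize of a 2-D list of grayscale pixels."""
    src_h, src_w = len(img), len(img[0])
    return [[img[y * src_h // h][x * src_w // w] for x in range(w)]
            for y in range(h)]

def dhash(img, size=8):
    """Shrink to a tiny thumbnail, then record whether each pixel is
    brighter than its right-hand neighbour. Fine detail is discarded,
    so resizing or mild editing barely changes the resulting bits."""
    small = resize_gray(img, size + 1, size)
    bits = 0
    for row in small:
        for x in range(size):
            bits = (bits << 1) | (1 if row[x] > row[x + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A synthetic 64x64 test image, a 100x100 resized copy, and an
# unrelated image for contrast:
original = [[(x * x * 7) % 256 for x in range(64)] for _ in range(64)]
resized = resize_gray(original, 100, 100)
unrelated = [[(y * y * 7) % 256 for _ in range(64)] for y in range(64)]

print(hamming(dhash(original), dhash(resized)))    # 0  -> flagged as a match
print(hamming(dhash(original), dhash(unrelated)))  # 16 -> clearly different
```

A real matcher would compare every upload’s hash against a database of known hashes and flag anything within a small Hamming-distance threshold, which is why edited copies can still be found.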

Facebook now plans to run the software on every photo uploaded to the site, and will report offending images both to the NCMEC and the police. Dartmouth professor of computer science Hany Farid says it can scan an image in just four milliseconds, meaning users won’t notice any delay.

He says that in tests it was able to detect 99.7 percent of matches, and has a false positive rate of between one in two billion and one in ten billion.

“The child pornography problem worldwide – and particularly in the United States – has absolutely exploded with the advent of the internet,” says Ernie Allen, president and CEO of the NCMEC. “Twenty years ago we thought this problem was virtually gone.”

He says that beta tests of the software in New Zealand have already led to arrests. Microsoft has also successfully used the technology to remove 1,000 images from its SkyDrive cloud storage service since February.