How tech companies are catching wrongdoers
Google’s controversial email scanning practices netted a child abuser last month, but the internet giant is not the only technology company proactively combating the sharing of child abuse images.
While it is not known exactly what tipped the Google algorithms off to the existence of three pornographic child exploitation images in John Skillern’s Gmail, the registered sex offender is now facing new charges in the US.
Both Facebook and Twitter use PhotoDNA, a photo analysis service from Microsoft. The software, launched in 2009, is also used on Microsoft's email service Hotmail (now Outlook), search engine Bing, and data storage and sharing service SkyDrive.
The image-matching technology assigns a unique signature (a hash) to images identified by peak child abuse prevention groups and recognises those images when they are uploaded or shared by users, even if they have been modified or resized.
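The general idea of matching images by a resize-tolerant signature can be sketched as follows. This is an illustrative "average hash", not Microsoft's PhotoDNA, whose actual algorithm is proprietary; the image sizes and threshold here are arbitrary assumptions for the example.

```python
# Illustrative sketch of robust image hashing (NOT PhotoDNA itself).
# An average hash assigns a short bit signature that survives resizing
# and mild edits; candidate matches are found by Hamming distance.

def average_hash(pixels, size=8):
    """Downscale a grayscale image (2D list of 0-255 values) to size x size
    by block averaging, then emit one bit per cell: above/below the mean."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Average the block of source pixels covered by this cell.
            r0 = r * h // size
            r1 = max((r + 1) * h // size, r0 + 1)
            c0 = c * w // size
            c1 = max((c + 1) * w // size, c0 + 1)
            block = [pixels[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(a, b):
    """Number of differing bits; a small distance signals a likely match."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 16x16 "image": bright left half, dark right half.
img = [[200] * 8 + [30] * 8 for _ in range(16)]
# The same picture resized to 32x32 hashes to (nearly) the same bits.
resized = [[200] * 16 + [30] * 16 for _ in range(32)]

distance = hamming(average_hash(img), average_hash(resized))
```

Because the hash captures coarse brightness structure rather than exact pixel values, resizing leaves the signature essentially unchanged, which is the property that lets a flagged image be recognised after modification.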
Groups such as the UK’s Internet Watch Foundation and the US’s National Center for Missing and Exploited Children have flagged almost 100 million images and videos.
Samantha Doerr, a former senior manager for child sexual exploitation prevention in Microsoft's digital crimes unit, previously told Fairfax Media the National Center picks the "worst of the worst pictures on the internet and shares them with internet service providers so they can identify when they are shared again online".
"Technology has enabled a number of amazing things in this world and for people to build communities and share with each other, but also for people to build these [illegal] communities," she said.
"Children are getting younger and younger, about 10 per cent [of images] now is infants and toddlers because they can't tell anyone what is happening to them. And it's getting more and more violent."
Doerr, now a director of public affairs at Microsoft, said all technology companies reported found images as part of a global effort to end child exploitation.
She said the technology had not resulted in lower numbers of images being shared, but did improve authorities' ability to detect images and produced better reporting.
Robert Gellman, a US-based privacy and information consultant, told website Mashable that Google’s actions had opened a “whole different can of worms” beyond the usual privacy concerns that batter the tech giant.
"Drawing a line about email scanning is not simple — no one seems to object if email is scanned for malware, but once you move beyond that, it's much more difficult," Gellman said.
"No one defends child porn, but the principle that an email provider will read mail looking for criminal activities is problematic. It raises concerns over what standards apply and which crimes are included."