The Justice Department says there's been an "explosion" in the trafficking of child pornography, and it's pledged to step up investigations of people who download the illegal images.
But the government has hardly been neglecting the problem: Since 2000, the number of federal prosecutions has almost tripled. That record has prompted some people to ask whether more could be done to keep people from downloading the illegal images in the first place.
Microsoft is voluntarily developing technology that makes it harder to stumble upon child pornography in Internet searches, but officials question whether it will actually curb abuse of the children used in the photos.
In September 2004, a family in Northern California became the subject of a child pornography investigation, prompting a visit from the FBI.
"I get a call from my wife, she's crying and upset, and so I went home, and there were several agents going through my house, who stated that there was ... child porn on one of the computers," says Dave, who asked to have his last name withheld to prevent jeopardizing his job.
In this case, the images were on a computer used by his 20-year-old son. His son is now in prison, serving a three-year sentence for one count of possessing child pornography. Dave still can't get over how much damage was done by some computer files.
"You have a young kid that has no criminal history ... is not a pedophile, is not a child molester, and he's going to be labeled as a sex offender and have to register as such every year, on his birthday, for the rest of his life," he says.
Of course, these aren't just any computer files. Jeff Fischbach, a forensic analyst who works with this kind of material, says he's seen what happens when these images are shown to juries.
"You can watch the expressions change. I saw one gentleman in the front row of the jury, I think it was number three, and I saw him lurch forward toward the defendant," Fischbach says.
Defendants almost never risk going to trial; lawyers usually recommend they take a plea bargain. It's Fischbach's job to analyze the hard drives in these cases, and when he does, he follows strict protocols — as if the hard drives contained plutonium.
"It is handled like it's radioactive," Fischbach says.
Filtering The Photos
He keeps the drives in special locked boxes, and never connects them to the Internet or other computers, for fear of cross-contamination. He knows the legal jeopardy that can be caused by even one stray thumbnail image. And yet, he says, the very same files can be found on the open Internet quite easily.
"Right now, anybody is just one search term and a click on Google away from most of the same files that I have seen as part of my work," he says.
Fischbach believes the easy-to-find images are a kind of public hazard.
He worked for one defendant who went to prison because of one night of ill-advised Web surfing. The easy-to-find images are also tempting weapons in messy custody battles and divorces — he's convinced that in some of the cases he's worked on, one spouse has been framed by another. All of this makes Fischbach wonder why more isn't done to block some of the more obvious sources of these "radioactive" files.
"It's the same thing as any other public nuisance. Part of the government's job is not just to go out there and stop people from doing bad things, but to stop good people from having to fall victim to that," he says.
It's probably not constitutional for the government to block offending Web sites outright, but Fischbach says Internet service providers and search engines could volunteer to filter the images that reach their customers, just as e-mail providers filter out known viruses.
He's been suggesting this idea for years, and now somebody is trying it.
John Scarrow is general manager for Safety Services at Microsoft, which is developing technology called PhotoDNA.
In a recent demonstration at the Microsoft campus in Redmond, Wash., Scarrow uploaded a dummy child porn photo to Microsoft's cloud computing service called SkyDrive. The numeric value of that photo — something called a "hash value" — is checked against a master list of known child porn photos maintained by the National Center for Missing and Exploited Children. If the photo matches one of those, the account is disabled and an error message appears.
Microsoft is using a more flexible version of traditional hash values, so it can ID an image even if it's been cropped or resized. This flexibility takes some heavy math — and that means it can't be used to filter the whole Internet because it would take too much computing power.
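Microsoft has not published PhotoDNA's internals, but the difference between a traditional hash and the more flexible kind described above can be sketched with a toy "average hash." The gradient "image," the 8x8 grid size, and the 64-bit hash length below are illustrative assumptions, not PhotoDNA itself; the point is only that a cryptographic hash breaks under any change to the file, while a perceptual hash built from coarse brightness patterns can survive resizing.

```python
import hashlib

def exact_hash(data: bytes) -> str:
    # Traditional cryptographic hash: any change to the bytes changes the value.
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels, size=8):
    # Toy perceptual hash: downsample a grayscale image (list of rows) to a
    # size x size grid by block averaging, then set one bit per cell: is the
    # cell brighter than the image's overall mean? Similar-looking images
    # yield similar bit patterns even at different resolutions.
    h, w = len(pixels), len(pixels[0])
    cells = []
    for i in range(size):
        for j in range(size):
            block = [pixels[y][x]
                     for y in range(i * h // size, (i + 1) * h // size)
                     for x in range(j * w // size, (j + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if c > mean else 0 for c in cells]

def hamming(a, b):
    # Number of differing bits; small distance means "probably the same image."
    return sum(x != y for x, y in zip(a, b))

# A synthetic 64x64 grayscale "image" (a diagonal gradient) ...
original = [[x + y for x in range(64)] for y in range(64)]
# ... and the same scene saved at half the resolution.
resized = [[2 * (x + y) for x in range(32)] for y in range(32)]

orig_bytes = bytes(v % 256 for row in original for v in row)
small_bytes = bytes(v % 256 for row in resized for v in row)

# The cryptographic hashes no longer match after resizing ...
print(exact_hash(orig_bytes) == exact_hash(small_bytes))       # False
# ... but the perceptual hashes come out identical here.
print(hamming(average_hash(original), average_hash(resized)))  # 0
```

A matching system like the one demonstrated on SkyDrive would compare such a fingerprint against the National Center for Missing and Exploited Children's list and flag anything within a small Hamming distance. The "heavy math" caveat in the article follows from this design: unlike a single cheap exact-hash lookup, a tolerant comparison has to measure distance against many known fingerprints.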
But Scarrow says filtering everything was never the point.
"Somebody that's smart enough that really wants to traffic in the material will figure out how to circumvent these. But ... the person that is getting a hold of it because it's pretty easy to get a hold of, you know, is just looking around, whatever ... we believe, this will have a pretty substantial impact on them," he says.
The filtering system is already running on the Bing search engine — if someone's search finds a site with known child pornography, those links are suppressed.
'Part Of An Answer'
Microsoft is planning to expand the filter to its other online services, and it's offering to share PhotoDNA with other online portals for free. And the project has the tentative support of the Justice Department.
Drew Oosterbaan, who runs the Criminal Division's Child Exploitation and Obscenity Section, says the filtering isn't the final solution.
"I think it's fair to say that it may be part of an answer," he says.
Oosterbaan says file-blocking could become, in his words, "another front in the battle," but he says there could also be unforeseen consequences. One possibility, he says, is that the filtering could work too well.
If child pornography became too hard to find, he says, it might create demand for new images and cause more children to be abused.
"That's why this is a very, very challenging area of law enforcement, where, in the past 10 years, I've never found any real simple solutions," Oosterbaan says.
For now, filtering images with PhotoDNA is a pilot project — and it's purely voluntary. One thing its developers insist on is that they don't want to see it evolve into a filtering system that's mandated by the government.