You don’t need permission to put a website online, and, once it’s there, anyone in the world can see it. Much of what makes the web incredible can also make it a harsh, unkind place: Just as photographers and musicians can find a global audience, child abusers can, too.
Today, Google and Microsoft announced they are taking technological steps to make the Internet less hospitable to child abusers. The news comes after more than 300 people were arrested worldwide last week—and 400 children rescued—in one of the largest-ever crackdowns on child pornographers. It also comes months after U.K. Prime Minister David Cameron called on the search engines to do more to obstruct child pornographers.
Google has long had “zero tolerance” for child pornography. But in practice, its algorithms cannot perfectly detect every piece of such material; doing so requires tradeoffs. Today’s announcements make it highly likely that, under government pressure, the search giant has opted to accept many more “false positives”—non-pornographic content that will be blocked nonetheless—in order to make child pornography harder to find.
What will be done
In an op-ed in the Daily Mail, Google’s chairman, Eric Schmidt, announced the company’s specific actions. The search engine has moderated search results worldwide to make it even harder to find child pornography. The company has “cleaned up” the results of more than 100,000 searches in 150 languages, he says. (Details about Microsoft’s plans remain less certain.)
Beyond cleaning up its search results, Schmidt says the company will now display “warnings–from both Google and charities–at the top of our search results for more than 13,000 queries.”
“These alerts make clear that child sexual abuse is illegal and offer advice on where to get help,” he writes. The search results themselves now direct users to news reports about the phenomenon, for instance, and ways to get help, rather than pornographic material itself.
The company also announced a new technology to detect pornographic material in YouTube videos. Though YouTube prohibits pornography of any kind on its site, the new algorithm is said to make such material easier for the company to detect.
While these measures were advocated by Cameron, Google has long worked to stop the distribution of child pornography on the Internet. In 2006, it joined a financial and technical coalition to fight the material, and in 2008 it began using “hashing” to detect images resembling child pornography. This June, Google’s chief legal officer, David Drummond, described the technique in the British newspaper The Telegraph:
Since 2008, we have used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique fingerprint that our computers can recognize without humans having to view them again. Recently, we have started working to incorporate these fingerprints into a cross-industry database.
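The mechanics Drummond describes can be sketched in a few lines of Python. This is a simplified illustration, not Google’s actual system: it uses a plain SHA-256 digest, which matches only exact byte-for-byte copies, whereas the fingerprints Drummond mentions are reportedly robust to transformations like resizing and re-encoding. The function names and the sample digest below are invented for illustration.

```python
import hashlib
from pathlib import Path

# Illustrative only: a set of fingerprints of previously identified images.
# In practice this would be the shared, cross-industry database Drummond
# describes, built on robust perceptual fingerprints, not plain SHA-256 digests.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint for an image file.

    A cryptographic hash matches only exact byte-for-byte duplicates;
    production systems use fingerprints designed to survive re-encoding,
    resizing, and other alterations.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def is_known_image(image_path: Path) -> bool:
    """Return True if the file's fingerprint is already in the database."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```

The appeal of the approach is exactly what Drummond notes: once an image has been identified and fingerprinted, duplicates can be flagged automatically, without a human reviewer having to look at them again.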
The risk of measures like the new YouTube algorithm is that the software will file “safe” results as porn. As search engine expert Danny Sullivan wrote back in June:
The difficulty is that this approach will produce false positives. There will absolutely be images that are not child porn that will be blocked, because understanding what images really are is a tough search challenge. It’s even harder when you get into judgment calls of “art” versus “porn.”
Don’t get me wrong. I’m not trying to argue for anything that grants a loophole for actual child porn. I’m just saying that a politician thinking there’s some magic wand that can be waved is a politician doing what politicians do best, making grand statements that aren’t always easily backed up.
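Sullivan’s point can be made concrete with a toy example. The Python sketch below uses entirely made-up scores, file names, and thresholds; it shows why any automated detector forces a tradeoff: the lower the threshold for blocking, the more abusive material is caught, and the more innocent images are swept up with it.

```python
# Made-up scores and file names, purely to illustrate the tradeoff:
# an automated detector that scores images must pick a decision threshold,
# and lowering it to catch more abusive material also blocks more
# innocent content (false positives).

def flag_images(scored_images, threshold):
    """Return the names of images whose scores meet or exceed the threshold."""
    return [name for name, score in scored_images if score >= threshold]

scored_images = [
    ("family-beach-photo.jpg", 0.35),     # innocent, but ambiguous to software
    ("medical-textbook-scan.jpg", 0.55),  # the "art versus porn" judgment call
    ("known-abusive-image.jpg", 0.97),
]

# A strict threshold misses borderline material; a loose one blocks
# legitimate content along with the rest.
print(flag_images(scored_images, threshold=0.9))  # ['known-abusive-image.jpg']
print(flag_images(scored_images, threshold=0.5))  # the textbook scan becomes a false positive
```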
A shadow workforce
Indeed, while Google can initially detect or identify content via algorithm, only a human being can separate benign family pictures from abusive content. According to some reports, Google employs hundreds of people to do this work, sorting through images all day and separating the lawful from the unlawful.
Those workers do the hardest work in the process—and they’re also the hardest to find.
“They're precluded from speaking to the media, and it is difficult to reach out and find them,” Sarah Roberts, a professor at Western University in Canada, told NPR’s Rebecca Hersher in an excellent report on the workers published yesterday.
“I think there's an aspect of trauma that can often go along with this work and many workers would rather go home and tune out, not talk about it,” Roberts said.
Pressure from the U.K. government
The British prime minister, David Cameron, first pressured the two search giants to suppress “child abuse content” this summer. In that speech, he called for broader censorship of Internet pornography, specifically announcing plans to block pornography by default in U.K. homes. (He later backtracked on that specific plan.)
According to Cameron, Google and Microsoft initially resisted these suggested policies. They’ve now, obviously, accepted them.
It’s hard, on the one hand, to see how the public benefits of these specific policies—which make it harder to find images created by child abusers—could be outweighed by any public harm. Indeed, child abuse obviously does not contribute to a democratic conversation, lessen the pain of the world, or give voice to the suffering: It constitutes, in fact, a document of suffering. It is the very sort of thing a government should afflict.
But, as Danny Sullivan showed this summer, googling “child porn” doesn’t exactly turn up child abuse—it turns up news reports and charity reports. Google’s new algorithms seem to make it easier to find commentary around child abuse while making it harder to find the thing itself, but legality—and morality—have no algorithm. Work that examines and illuminates the pain of child abuse is the very sort of thing a government might promote, yet some of it might now be blocked, algorithmically labeled as pornographic.
I’m not sure anyone goes to Google to find work like that. But Google also operates all sorts of other tools, including academic indices, and code—like law—spins off consequences its authors never intended. Today’s news should be welcomed. But the policies and technologies put in effect today—and those still promoted by Cameron—should be inspected.