LOS ANGELES—In an article today on PandoDaily.com, MetaCert founder and CEO Paul Walsh wrote that Tumblr has far more porn on more blogs than it realizes (or is willing to admit), that only a tiny percentage of that content is tagged properly, and that “Tumblr [is] failing to protect brands and minors inside its own site [by] making every inch of its content accessible to every search engine without providing any tagging to make it easy for parental controls to spot the porn.” But that’s just a part of Tumblr’s porn problem.
Walsh, it must be noted, is in the business of “protecting children from adult content.” MetaCert products are designed to filter out any content that is intended to cause sexual arousal, including not only “live filmed action” but also “cartoons, audio descriptions, textual descriptions or other media.” In addition to filters designed for Firefox and Chrome, the company offers an iPad browser and a product called MetaCert DNS, which “is the world’s most sophisticated method of blocking pornography without blocking access to innocent websites,” per MetaCert.
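MetaCert doesn’t disclose how MetaCert DNS actually works, but DNS-level blocking in general comes down to checking each lookup against a blocklist and answering with a harmless “sinkhole” address instead of the site’s real one. The Python sketch below is purely illustrative and is my own toy version, not MetaCert’s implementation; the blocklist entry and sinkhole address are invented.

```python
# Toy illustration of DNS-level blocking in general, NOT MetaCert's actual
# product. The blocklist entry and sinkhole address are invented.
import socket

BLOCKLIST = {"porn.example"}  # hypothetical blocked domain
SINKHOLE = "0.0.0.0"          # non-routable answer returned for blocked names

def resolve(domain: str) -> str:
    """Answer a lookup, sinkholing the domain if it or any parent
    domain appears on the blocklist."""
    labels = domain.lower().rstrip(".").split(".")
    for i in range(len(labels)):
        if ".".join(labels[i:]) in BLOCKLIST:
            return SINKHOLE
    return socket.gethostbyname(domain)  # fall through to a real lookup

print(resolve("porn.example"))      # -> 0.0.0.0 (blocked)
print(resolve("sub.porn.example"))  # -> 0.0.0.0 (subdomain of a blocked name)
```

The appeal of the DNS approach, and presumably MetaCert’s pitch, is that it blocks a domain for every app on the network at once rather than relying on a filter installed in each browser.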
Walsh also frequently talks about how much he abhors censorship, and yet he opposes free Wi-Fi in public spaces, a position he laid out on the MetaCert blog today. He thinks porn needs to be hidden away “behind closed doors,” that allowing free Wi-Fi is like building “swings for kids on top of concrete,” and that a “city could be liable should an angry parent decide to sue the city on account of their child being exposed to harmful content.” As much as he hates to say it, he also thinks that allowing free Wi-Fi provides “a perfect opportunity for pedophiles to download images and videos of children being abused online, without being detected.”
I’m different. I do not equate free Wi-Fi solely with the accessibility of porn, but instead recognize a vast array of potential benefits beyond that rather narrow slice of life. I also think Walsh is exaggerating the problem and harbors a decidedly censorship-oriented bent, strains of which can also be found in his Tumblr article.
Despite any reservations I may have about Walsh’s true intentions, however, I am in no position to question his data about porn on Tumblr, of which he writes, “We maintain two data buckets specifically for porn. One contains 7.2 million unique domains and a second holds millions of URLs across Tumblr, Facebook, Twitter and other sites fueled by user-generated content. I believe that this massive data set provides me with unique insights as to how much porn exists on sites like Tumblr.”
He adds, “Less than 2 percent of Tumblr porn blogs that my company classified were self-tagged as NSFW. Yet over 50 percent used the keyword NSFW in their metadata for their blogs to be indexed by Google and other search engines. Meanwhile 100 percent of them had porn-related keywords in either the domain or metadata.”
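To make the mechanics of that claim concrete: spotting “NSFW” in a blog’s metadata is a matter of fetching the page and scanning the domain and meta tags for keywords. The sketch below is my own rough illustration of such a check, not MetaCert’s classifier; the keyword list and the parsing regex are assumptions.

```python
# Rough illustration of the metadata check Walsh describes: look for
# "NSFW" or porn-related keywords in a blog's domain and <meta> tags.
# The keyword list is illustrative only; MetaCert's lists are proprietary.
import re
import urllib.request

KEYWORDS = re.compile(r"\b(nsfw|porn|xxx|adult)\b", re.IGNORECASE)

def keywords_in_metadata(url: str) -> bool:
    """True if the URL itself, or any description/keywords <meta> tag
    on the page, contains one of the illustrative keywords."""
    if KEYWORDS.search(url):
        return True
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    # Naive meta-tag scrape; assumes name= appears before content=.
    metas = re.findall(
        r'<meta[^>]+name=["\'](?:description|keywords)["\'][^>]+content=["\']([^"\']*)',
        html,
        re.IGNORECASE,
    )
    return any(KEYWORDS.search(content) for content in metas)
```

The point of Walsh’s statistic is that bloggers fill in exactly this kind of metadata to court Google traffic while skipping Tumblr’s own NSFW flag, so the signal exists, just not where parental controls look for it.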
And concludes, “Either those behind Tumblr porn blogs purposely avoid using this tagging system to avoid being filtered by people who would rather not see their content, or they don’t know it exists. Either way, Tumblr’s self-tagging system doesn’t work.”
More problematically, he says, Tumblr has only two settings for such content, “NSFW” and “Adult,” the latter of which will no longer appear in Tumblr public tag searches, and will only be visible, per the Daily Dot, “to your own followers and the followers of people who reblog your content.” Walsh says it doesn’t matter because “every porn blog is indexed by Google and all other search engines anyway. Not only is Tumblr failing to protect brands and minors inside its own site, it’s making every inch of its content accessible to every search engine without providing any tagging to make it easy for parental controls to spot the porn. It also means that brands that end up with ads on porn blogs will also be exposed in search engines.”
Walsh says proof of that can be easily found by Googling “site:tumblr.com porn,” which he says returns “85 million results.” We did the same thing and got 12.6 million results, but who knows, maybe results are different in San Francisco.
Walsh also claims, “There’s some extreme material. On a scale of 1 to 10, where youporn.com is a 5, some of the content on Tumblr would reach 9 and possibly 10. We’ve actually reported some Tumblr domains to the National Center for Missing & Exploited Children.” His company, he adds, “has amassed a list of hundreds of Tumblr domains that are classified as pornography and contain the keyword ‘rape.’”
It all adds up to a big mess, the solution for which, he says, is for Tumblr to “break down ‘NSFW’ into sub-categories so that innocent sites aren’t caught in porn nets,” and to “build dedicated crawlers for Tumblr blogs” that are “intelligent enough to know the difference between a site that talks about adult content and one that actually contains adult content.”
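Walsh doesn’t describe an algorithm for that, but one crude way a crawler could separate a site that talks about adult content from one that actually hosts it is to weigh keyword mentions in the prose against the number of embedded media elements on the page. The heuristic below is entirely my own guess at the idea; the keyword list and the media threshold are invented.

```python
# My own guess at the kind of heuristic Walsh's proposed crawler might use:
# pages that merely DISCUSS adult content mention keywords in prose, while
# pages that HOST it tend to be heavy with image/video embeds. The keyword
# list and media threshold are invented for illustration.
from html.parser import HTMLParser

class MediaAndTextCounter(HTMLParser):
    """Count embedded media elements and collect visible text."""
    def __init__(self):
        super().__init__()
        self.media_count = 0
        self.text_chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("img", "video", "iframe", "embed"):
            self.media_count += 1

    def handle_data(self, data):
        self.text_chunks.append(data)

def likely_hosts_adult_content(html, keywords=("porn", "xxx", "nsfw"),
                               media_threshold=20):
    """Keyword mentions plus a media-heavy page suggest hosting;
    keywords on a mostly-text page suggest discussion."""
    parser = MediaAndTextCounter()
    parser.feed(html)
    text = " ".join(parser.text_chunks).lower()
    mentions = sum(text.count(kw) for kw in keywords)
    return mentions > 0 and parser.media_count >= media_threshold
```

A real classifier would obviously need image analysis and far more signals than this, but the underlying division, text signals versus hosted media, is the distinction Walsh is asking Tumblr’s crawlers to make.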
The entire discussion is interesting because it implicates such a large amount of adult content, and however Tumblr proceeds, the result will likely have an impact beyond the boundaries of Tumblr itself, if only in terms of external search engine results.
But Tumblr has another problem that it will have to deal with if no appeal is made of the recent 2257 ruling entered by District Court Judge Michael M. Baylson in Free Speech Coalition v. Holder, in which the judge reaffirmed the constitutionality of the federal labeling and record-keeping statutes (18 U.S.C. §§ 2257 and 2257A) and, for the most part, their attendant regulations.
Specifically, much if not all of the content that Walsh has identified as either NSFW or Adult probably triggers 2257, and the statute may encompass far more Tumblr content than that. What the government made abundantly clear in court documents filed during the case is that the 2257 and 2257A statutes apply to “the universe of publicly available sexually explicit images.” Users of Tumblr, which itself no doubt claims a CDA Section 230 exemption, are still responsible for complying with the requirements of 2257. They are responsible right now, but they can do nothing about compliance even if they know about it, because Tumblr/Yahoo does not recognize 2257.
If anything, the 2257 trial and ruling highlight the extent to which both the government and this court believe that 2257 is an essential tool in the fight against child pornography. It is a load of bull, to say the least, but no American corporation, and certainly not one the size of Yahoo, is going to take that position in the face of a claim by Judge Baylson that, “The governmental interest informing the [2257 and 2257A] regulatory scheme—combating child pornography—is substantial.”
In other words, if 2257 is not struck down or fixed on appeal, Tumblr will have to force its users to supply 2257 notifications on their blogs if they want to post up 2257-triggering content. It's just that simple.
But here’s the rub. As was also made abundantly clear during trial testimony, when regular people, as opposed to adult industry professionals, are asked to provide personal documentation in order to post up adult content, they mostly balk, especially if the content is of them or someone they know. (I’m not talking about “revenge porn” here, but much of that triggers 2257, too.) What that means is that the moment Tumblr starts making people provide a link to the place where they are keeping their 2257 records, most of those blogs will probably disappear overnight, gone, poof, bye-bye.
For the Paul Walshes of the world, that may not be a bad thing (except that less porn means less porn to filter), but for supporters of free sexual speech, it’s going to be a big problem. For Tumblr, which was built on support for unfettered speech, it will also be a problem, unless it decides that sexual speech will simply have to be jettisoned for the greater good. Tumblr and Yahoo could of course continue to ignore 2257 and 2257A, but I believe that will become harder for Yahoo to do, especially if inspections are reinstated and the international debate over what to do about online porn continues unabated.
2257 and 2257A issues are not limited to Tumblr, of course. The regulatory scheme that the government says sweeps in “the universe of publicly available sexually explicit images” is like a vast black hole pulling in everyone who dares to publish sexually explicit content for any level of public viewing, including all social networks and tube sites within reach of federal authorities.
At the end of the day, I don’t know if it’s going to be a problem for Yahoo or Tumblr—I have my doubts that it will be—but I do believe it’s going to be a big problem for thousands of their loyal users, and maybe millions of other people posting up to similar platforms, when the day comes that they are forced to prove that they are not child pornographers.