See No Evil: The Uncertain Future of Net Filtering

People in the adult Internet industry probably never think about content filters unless they have kids, in which case they no doubt have the official "Evil-Sifter" Christian Coalition filtering software ensconced on every computer in the house.
That's because they, of all people, know best about the glut of hardcore sexual material that's not only universally obtainable on the Net, but is also often directly targeted at children.

As such, the loving parent in them is forced to keep close tabs on their kids' surfing habits while the capitalist in them bankrolls their college education with the proceeds from that very same smut.

It's a neat irony, illustrating the fact that, like a gun, a filter is nothing more than a neutral tool.

While hate sites, chat rooms, and adult newsgroups are also widely blocked or filtered, and in some countries outlawed altogether, hardcore sex sites get the most attention in the United States. Presently, the only realistic options people have to keep porn from inundating their families are to remain constantly vigilant, to forgo the Internet entirely, or to filter.

If the first two are infeasible, the last is hardly foolproof. One of the newest trends in aggressive porn site "marketing" involves snapping up mainstream domains that their owners have either intentionally or unwittingly let lapse. According to a December 2001 Wired article, "...Local governments, church groups and nonprofit organizations... have recently seen their homepages turned into smut dens." The article lists sites for The National Lutheran Women's Missionary League, The Nebraska Department of Education, The Ohio State Senate, and the Ballet Theatre of Annapolis, MD, among others, as having been affected. It's a trend that presents a two-sided problem for those who don't appreciate graphic sexual content. With up to 1 million registered domains expiring each month, the avenues for delivery will only multiply. And it confounds filtering programs that block sites based on URL or META tag descriptions, since in these cases both would presumably be innocuous.
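Why lapsed domains defeat this kind of filter is easy to see in miniature. Here is a minimal, hypothetical Python sketch of a URL/META-description filter of the sort described above; the blocklist terms and example domains are invented for illustration, not drawn from any real filtering product:

```python
# A toy URL/META-keyword filter. Blocklist and domains are illustrative only.
BLOCKED_TERMS = {"xxx", "porn", "adult", "hardcore"}

def looks_objectionable(url: str, meta_description: str) -> bool:
    """Flag a page if its URL or META description contains a blocked term."""
    haystack = (url + " " + meta_description).lower()
    return any(term in haystack for term in BLOCKED_TERMS)

# A site that advertises what it is gets caught...
print(looks_objectionable("http://xxx-hardcore.example/", "Hot adult action"))  # True

# ...but a lapsed mainstream domain re-pointed at porn sails through, because
# both its URL and its stale, innocuous metadata still look perfectly clean.
print(looks_objectionable("http://annapolis-ballet.example/", "Community dance programs"))  # False
```

Because the filter never inspects what the server actually returns, a wholesome-sounding domain can serve anything at all.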

One of the fundamental complaints about filters is that they block too many "innocent" sites. While accurate, it's a claim that is tangential to any serious discussion of the misuse of filters. As First Amendment attorney Robert Corn-Revere recently told AVN Online, the effectiveness of filtering software is a consumer issue, not a censorship issue. "The questions about filters don't necessarily raise any legal issues," says Corn-Revere, "because they're really questions of consumer choice. However, they do present legal issues when they begin to be used as instruments of government policy, when, for instance, a library says that they're going to choose your family values, not just for your kids but for you as well; then it becomes a matter of law."

According to Corn-Revere, a fundamental if understandable public confusion regarding filters has been generated in the last several years. "It's hard for people to understand the positions on filters because people on both sides of the issues have argued, in some ways, what seems like both sides. Filters first became really significant in people's consciousness during the debate over the CDA (Communications Decency Act), because those who opposed the law argued that you didn't need to have a federal criminal law because there were the less restrictive alternatives of using filters. Those who supported the law scoffed at that, and said, 'Don't be crazy, filters are flawed and won't work.'

"Well, after the Supreme Court struck down the CDA, it seemed overnight as if the positions had reversed and the world had somehow shifted 180 degrees. Those who had supported the CDA said, 'No, you can't force us to use filters, that would be unconstitutional,' and those who had previously opposed filters as being flawed said, 'Well, they're getting much better.' And so, suddenly the debate shifted, and even before COPA (Children's Online Protection Act) there were efforts to impose filters."

The Supreme Court gutted the CDA in 1997, striking down the central provisions of the act, which sought to shield children from "indecent" or "patently offensive" Internet content by making it a crime to transmit such material or to fail to effectively block kids' access to it. But the central issue of protecting children from online smut remained, and COPA, sometimes referred to as CDA II, was Congress' solution to the problem, though it too has been challenged. (The high court will be ruling on that case at just about the time this article is printed.) Neither statute directly involved filtering, which lurked in the background as a sort of passive alternative to those more punitive measures.

Congress' latest effort, The Children's Internet Protection Act (CIPA), does deal directly with Internet filtering by requiring schools and libraries to screen out smut by 2003 or forfeit federal funding. The ACLU and the American Library Association filed suit in October challenging its legality, with both cases slated for trial this month in a federal court in Pennsylvania.

According to Corn-Revere, the case has little to do with filters themselves. "Filters are here to stay," he says. "They're bundled in with ISP service, are widely used by institutions, and are even used by some government institutions, primarily schools more than libraries. So filters will be with us, as will rating systems. The question will be how much overlap there will be between public policy and the use of filters. And that's where the litigation over CIPA will have a lot to say."

Even if this law doesn't eventually pass muster, there will inevitably be more to follow. "The way in which Washington tends to approach these issues is that there's always a lot of political traction for doing something," says Corn-Revere. "It's kind of like dealing with The Terminator, or in a kinder, gentler way of viewing it, The Energizer Bunny. They'll try one thing, it'll be flawed, it'll get knocked down, and they'll simply keep coming back until they [draft] rules that do survive."

But if filtering and free speech seem to be at insoluble loggerheads, there are alternative solutions. In Loudoun County, Virginia, where a group of library patrons went to federal court in 1998 to challenge the constitutionality of mandatory filters on the library's computers and won - a case in which Corn-Revere was lead counsel for the patrons - an interesting compromise was reached. "After we won the case," says Corn-Revere, "they adopted the policy that we had proposed in the first place, which was to have both filtered and unfiltered access, so that people could choose which they wanted to use. For kids, the parents chose. And it's worked that way ever since the lawsuit, and as far as I've been able to determine since then, there's been one complaint in over two years about access to porn."

For Corn-Revere, rapprochement is the key to an issue that exists beyond good and evil. "In the Loudoun County case," he says, "I argued that filters were unconstitutional, and was successful in that; but it's not a simple question of whether filters are good or bad, or constitutional or not constitutional. It is entirely a function of how they are used. If they are used as a tool to improve individuals' choice over what they can get in their households, and to impose their own family values, they are a tool of choice and it's not a constitutional issue." Likewise, he says, technological advances offer no automatic solution, "because no matter how good technology gets, there's no way that filters can, even in the theoretical sense, apply a legal standard. You can't design a filter that's going to determine what's obscene or harmful to minors."

Another solution, referred to above, is rating systems, or more accurately, self-rating systems. One such is ICRA, the Internet Content Rating Association, a global nonprofit that provides content ratings free of charge to Webmasters who complete a form at the organization's Website (www.icra.org), entering into a contractual obligation that the declarations they make regarding their content are true and correct. Supported by annual dues from heavyweight members such as AOL, Microsoft, Yahoo!, Verizon, and Verisign, ICRA is based in England, with offices in the U.S. and Brussels, and plans to move into the pan-Asian area next year, according to Mary Lou Kenny, ICRA's director of North American operations.

"ICRA is both a labeling and a filtering system," says Kenny, "but it's a filter, not a block. It's a substantive difference. On the labeling side, what's unique about ICRA is that it's the content provider, the person who owns the content, who labels the site. On the filtering side, what makes it different from most of the other ones out there is that it is the consumer who is making choices about what is and is not to come into their system. When we talk about filters versus blocks, we're talking about companies that offer a blocking service. They have a database of a million or 10 million URLs, that they have decided, based on their criteria, are inappropriate for children. In our system, we don't do that. The parent makes the choice about what is and is not appropriate. ICRA does not label, content providers do. ICRA does not filter, parents do. Those are the two major differences between filtering and blocking."

According to Kenny, Capitol Hill is enamored of ICRA. "Government entities are very attracted to this," she says, adding that late last year, resolutions were introduced in the House and Senate "saying that self-regulation of content is good, self-labeling is a good idea, and basically recognizing the industry for taking the lead in addressing child online safety, and encouraging members of Congress to label their sites." The Technology Office at the Department of Commerce and The National Center for Missing and Exploited Children ICRA-label their sites, she says, as will other government Websites in the future.

ICRA does have its detractors. An October 2001 Wired article by Declan McCullagh, entitled "Filter Plan Leaks Like a Sieve," took ICRA to task for wording its content questionnaire too broadly, thus casting too large a net. "News articles about sexually transmitted diseases or former President Clinton's affair with an intern would be 'sex' rated," it claimed, adding that MSNBC.com, CNN.com, and Time.com do not use the ratings, even though their parent companies endorse the system.

"People do say that we ought to have [a rating] for news, to distinguish the news organizations," counters Kenny, "but the question always becomes, who decides what's news, like who decides what's XXX. Our take is, the content provider does, rather than some organization or some governing bodies. But I think that we have a very diverse population around the world and there will always be people who want different solutions to this problem, which is why all of us out there are providing a service to parents. We're all in the same space trying to provide the same public service."

In the end, will filters be mandated, either through government regulation or ubiquitous corporate use? Though any move in that direction would seem wholly unconstitutional, the government certainly still believes that it can and must achieve a modicum of control over offensive Internet content. As for corporate America, it's all about control, and one doesn't have to be an inveterate cynic to believe that, from the boardroom's vantage, a filtered Net is far preferable to an unfiltered one, especially if the filter comes wrapped in the sheep's clothing of choice. That, in fact, is the direction in which we are headed, and one can only hope that consumers will continue to demand full control over filtering software that they program themselves, and, just as important, accountability from the companies that do the blocking for them.