Friday, November 18, 2005

On a Filtered Internet, Things Are Not As They Seem

Internet filtering is hardly new - indeed, recent volumes of this very publication have chronicled the dramatic rise of filtering over the past half-decade. Using technologies like router-based IP blocking and, more recently, DNS redirection, countries have found they can block the web content they dislike, while still obtaining what they consider the benefits of the Internet. As users gain experience on this increasingly filtered ’net, it’s easy for them to become complacent, thinking they have a sense of how censors operate and of who to blame for limitations on Internet use. But recent events show the situation remains in flux, and confusion may be increasing rather than coming to an end.

Internet filtering has long failed typical notions of regulatory transparency. Go to, say, Thailand and request a banned site about politics, gambling, or pornography. Thanks to blocking technologies like IP filtering, you probably won't get the web page you asked for. Neither will you get a warning saying "This content is blocked under order of the Information and Communications Technology Ministry." Instead, your web browser is likely to say "host not found" or "connection timeout." These messages mistakenly suggest that the server is broken or the network malfunctioning. But in fact things are just as the censors intended: the site is working fine, but you can't see it.
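The mechanics of this deception are simple. A minimal sketch, with invented addresses and an invented blocklist, shows why a silently dropped packet is indistinguishable from a dead server on the user's end:

```python
# Hypothetical sketch of router-based IP blocking. The addresses and
# blocklist below are illustrative, not real censored sites.

BLOCKED_IPS = {"203.0.113.7"}  # hypothetical censored server

def route_packet(dest_ip):
    """A filtering router silently drops packets to blocked IPs."""
    if dest_ip in BLOCKED_IPS:
        return None  # dropped: no error message is sent back
    return f"HTTP response from {dest_ip}"

def fetch(dest_ip):
    """The client cannot distinguish a drop from a broken server."""
    reply = route_packet(dest_ip)
    if reply is None:
        return "connection timeout"  # what the browser reports
    return reply

print(fetch("203.0.113.7"))   # -> connection timeout
print(fetch("198.51.100.1"))  # -> HTTP response from 198.51.100.1
```

Because the router returns nothing at all, rather than an explicit refusal, the browser can only report a generic network failure, which is exactly the ambiguity the censor exploits.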

Sophisticated users have become accustomed to these sorts of tricks, and have adjusted their expectations accordingly. In a country like Thailand, "host not found" is to be taken with a grain of salt. As a result, filters’ pretextual error messages become somewhat less deceptive over time.

But the past year has brought a rise of new filtering methods that, intentionally or by happenstance, are considerably more confusing. Try using Google in China: most searches work fine, in a much-appreciated improvement over the week in September 2002 when China blocked Google in its entirety. But run a search on a controversial policy area, and Google will stop working for perhaps half an hour. What to make of these facts? Some western analysts have wondered whether Google is conspiring with China - after all, such precise and subtle filtering interventions would seem to require Google's cooperation. But as it turns out, all indications are that Google is innocent; China has simply implemented a method of filtering more narrowly targeted than any before.
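The observed behavior - one sensitive query suspending all access for a fixed window - can be sketched as a keyword trigger with a temporary per-user block. The keyword, the half-hour window, and the "connection reset" response here are illustrative assumptions, not measurements:

```python
import time

# Hypothetical sketch of keyword-triggered filtering as seen from outside:
# a query matching a banned term blocks that user for a fixed window,
# during which even innocuous queries fail with a network-style error.

BANNED_TERMS = {"banned-topic"}  # placeholder keyword
BLOCK_WINDOW = 30 * 60           # seconds; roughly "half an hour"

blocked_until = {}               # client IP -> timestamp when block ends

def search(client_ip, query, now=None):
    now = time.time() if now is None else now
    if now < blocked_until.get(client_ip, 0):
        return "connection reset"  # looks like a network fault
    if any(term in query for term in BANNED_TERMS):
        blocked_until[client_ip] = now + BLOCK_WINDOW
        return "connection reset"
    return f"results for {query!r}"
```

Note that during the window the user sees failures even for harmless searches, which makes the mechanism easy to mistake for an outage on Google's side - precisely the confusion the article describes.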

Still more subtle are the "modified mirrors" sometimes used in Uzbekistan. Rather than simply blocking access to sites of political dissenters, Uzbek authorities make copies of the controversial sites - then change the copies to undermine or weaken the unsanctioned positions. The key step: when Uzbek users request the controversial sites, they automatically receive the altered copies in place of the authentic originals. Experts might realize something is wrong, but the tampering is exceptionally difficult for ordinary users to detect.
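One plausible way to serve an altered copy transparently is DNS redirection: the filtered network's resolver answers with the mirror's address instead of the real one, and the browser displays whatever that server returns. The hostnames, addresses, and page text in this sketch are all invented:

```python
# Hypothetical sketch of a "modified mirror" via DNS redirection.
# All hostnames, IP addresses, and page contents are illustrative.

REAL_DNS   = {"dissident.example": "198.51.100.10"}
CENSOR_DNS = {"dissident.example": "203.0.113.50"}  # the mirror's address

PAGES = {
    "198.51.100.10": "Original statement by the dissidents.",
    "203.0.113.50":  "Altered statement undermining the dissidents.",
}

def resolve(hostname, inside_filtered_network):
    """The filtered network's resolver answers for targeted hostnames."""
    table = CENSOR_DNS if inside_filtered_network else REAL_DNS
    return table.get(hostname, REAL_DNS.get(hostname))

def browse(hostname, inside_filtered_network):
    ip = resolve(hostname, inside_filtered_network)
    return PAGES[ip]  # the browser shows whatever that server serves
```

Since the address bar still shows the requested hostname, nothing visible distinguishes the altered copy from the original - which is why only users who compare the page against an outside copy are likely to notice.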

Clearly not all countries share the notions of free speech and freedom of the press that many of us hold so dear. Some countries explicitly disavow such notions. For example, Saudi Arabia and the United Arab Emirates openly admit to Internet filtering, even appearing proud to filter. But the United Nations' Universal Declaration of Human Rights makes access to information an undeniable entitlement - something China, Thailand, Uzbekistan, and others seem obliged to recognize. Indeed, by hiding their current efforts at filtering, these countries implicitly admit that they ought not block their citizens' access to information. At least the Middle Eastern countries proceed openly and seemingly under claim of legal right - whereas secret filtering gives an implicit admission of impropriety, for if the filtering were legitimate, there would be no need for secrecy. In the coming years, we ought to look for greater transparency in governments' Internet interventions. We ought to demand that governments admit what they do, and accept public responsibility for the consequences.

Ben Edelman, researcher studying Internet filtering

Ben is a Ph.D. candidate in the Department of Economics at Harvard University and a student at Harvard Law School. His research includes empirical analysis of Internet policy and regulation, including domain names, filtering, and spyware.
