An Australian Press Council Seminar
5th September 1996
Speech by Danny Yee, Electronic Frontiers Australia
Our feelings about Internet censorship are quite simple. We -- and I speak on behalf of a large part of the on-line community, not just for Electronic Frontiers Australia -- we don't want a bar of it. Not the ignorant, draconian, and purblind criminal sanctions of the United States Communications Decency Act or the Australian Attorneys-General; nor a bureaucratic morass of codes of conduct and complaints tribunals administered by the Australian Broadcasting Authority. We oppose even self-regulation, if that is to mean the enforcement by Internet users and administrators of rules imposed on us by governments. The global society that is the Internet has its own rules and its own methods of enforcing them -- rules and sanctions which have been developed over the decades of the Net's existence and which reflect its technological and social realities in a way alternatives imposed from outside cannot.
Two completely different ideas are often conflated in the term "censorship". The first, which I will call "filtering", enables individuals to protect themselves (and their children) from material they find offensive; the second -- censorship proper -- is the attempt to ban certain kinds of information completely. On the Internet these are separable, and the first is achievable while the second is not. I hope most of you agree with me that this is fortunate, since filtering is desirable but censorship is not.
In its support of user-enabled blocking [using the PICS scheme], the ABA has gone half-way to recognising that protection of individuals is possible without censorship. What they have not acknowledged is that it is possible without any government intervention at all. There is simply no role for codes of conduct and complaints tribunals when it comes to Internet content: if blocking software fails to perform as advertised, that is a trade practices issue; if a company is employed to protect school networks from unsuitable material and fails, that is a contractual issue. And if you choose not to filter what you see and find something that offends you, that is your problem and you have no grounds for complaint to anyone. The only useful recommendations of the ABA report in this area are the suggestions for user education and community awareness programs.
The key point is that decisions about what an individual wishes to avoid can only be made by that individual. The idea of an "Australian" ratings service is therefore silly, because variation in "community standards" is greater within countries than between them. Some Australians will prefer ratings services provided by churches; others will prefer to filter on purely intellectual grounds, relying on universities for their ratings. Most will manage without any filtering, simply avoiding things that offend them. No centralised system can provide this sort of flexibility -- the history of the Internet has, time and again, shown that only genuinely distributed systems work with distributed problems.
Protecting people from material they find offensive is part of the broader problem of avoiding unwanted information. Electronic junk mail is an example which illustrates how the Internet community can enforce its own rules without government involvement or legal sanctions. Despite the clear commercial benefit small businesses can obtain from junk email, there is remarkably little of it. Not because there are laws against it, but because it violates one of the uncodified rules of the Internet. There is no central body which enforces this system of rules -- it is maintained by a consensus of Internet users, administrators and organisations, wielding a variety of sanctions.
In short, we feel that the existing Internet works quite acceptably. The question we are asking is: "What exactly are the problems for which government regulation is supposed to be the solution?"
But what of censorship proper, the idea that certain kinds of information are intrinsically evil and should be banned? As the extensive list of publications refused classification by the Office of Film and Literature Classification demonstrates, censorship of this kind is all too common. Fortunately, the advent of electronic communications has pulled the rug from under those who think they have a right to control what others read and see. It is hard enough controlling the movement of books and other print publications in and out of a country: my parents' generation read Lady Chatterley's Lover even though -- or even largely because -- it was banned. With the ease of duplication and transmission of information on the Internet, bans of this kind are now completely futile.
Take, for example, the Rabelais shoplifting article, which is available on several Australian Web sites. If our government were so foolish as to force its removal from those sites, there would be copies across the United States within hours and links to them across the world. Or the book E is for Ecstasy. This is banned here, but it is legal in Europe and freely available on the Web. There is no way for our government to prevent Australian Internet users accessing this sort of material. Even complete disconnection of Australia from the international telephone system might not work.
Another example is provided by the German government's attempt to block access to the Web sites of Holocaust revisionists. This can only be described as a dismal failure -- the material in question is more widely available now than ever and received most unfortunate publicity as a result of the affair. The most effective response to this sort of material is either to ignore it or to refute it. The latter is the approach taken by the Nizkor project, which is devoted to combating Holocaust revisionism on-line.
The reasons why this sort of censorship can't work are both technological and sociological. The technical reasons include the sheer volume of data traffic, the difficulty of analysing end-to-end traffic in the middle of a datagram network, and the existence of freely available military-grade encryption software. The social reasons include the ability of individuals to publish material without going through publishers, the need to protect the privacy of individuals, and the free-speech ethos of the Net.
Another problem is that the Internet transcends the boundaries of nation-states, raising the question of exactly whose censorship laws should apply. The response of governments is, of course, to seek international agreements to control the Internet, but the kind of extraterritoriality required to make this work is inconceivable. Will the Australian government bow to pressure from the Indonesians and ban the Web pages of East Timorese activists? Will Saudi Arabia get to impose its concept of decency on the rest of the world? Will the United States rescind its First Amendment because it allows Australians freedoms our laws would deny us? It seems unlikely.
But if censorship of the Internet can have only very limited effectiveness, attempts at its enforcement have potentially appalling consequences for individuals. At the moment the privacy legislation which covers the postal and telephone systems does not extend to computer networks (which is one reason the police are not among those clamouring for new laws). Add to this the fact that it can be impossible to know what one is downloading until after the event, and that one has no control at all over what one receives by electronic mail, and the possibilities for selective enforcement and abuse of laws against possession or retrieval of information are obvious.
So what is the difference between suppression of junk email and suppression of Holocaust revisionism or information about drugs? The difference is that the first is both an attempt to force information on people and a threat to the smooth operation of the Internet. The millions who wield power on the Internet -- those who control the Web servers, the routers, the newsfeeds, and above all the information -- realise this and act accordingly. While Web sites containing information on drugs, explicit pictures, or neo-Nazi propaganda may offend the sensibilities of some, they do not force themselves on people or hinder the Internet's operation. There is, therefore, nothing remotely like a consensus about preventing them. Without the consensus of Internet users and administrators, external controls are ineffective; with it they are unnecessary.
It is our conviction, then, that there is no place for any form of censorship on the Internet. There is an old saying that the Internet interprets censorship as damage and routes around it. That is to take a technical perspective on the issue. If we take a social perspective, then, since the raison d'être of the Internet is the sharing of information, censorship is nothing less than a crime -- or perhaps, when carried out by governments, an act of war.
Do we conclude from this that there is no place at all for governments online? Certainly not -- just that the place of politicians and government departments on the Internet is as peers to other individuals and organisations, not as their masters. Governments can, and should, use the Net to provide information; they may also choose to provide ratings servers, caches, and other such services. But they will simply be one group of Internet users among many, with no more privileges and rights on-line than other users -- and with the same duties and responsibilities. Governments didn't build the Internet, they don't own it, and they can't control it; they will have to learn to live with this.
Electronic Rights and Ethics
John Perry Barlow's Declaration of the Independence of Cyberspace
E is for Ecstasy
http://sites.inka.de/sites/bigred/misc/e4x/ (from Germany)
http://www.damicon.fi/drugs/e4x/ (from Finland, linkrot)
The Rabelais article
The OFLC Database
(how do they justify charging for full access to this?)
The Nizkor Project