An International Communication Policy:

The Internet, international regulation & new policy structures


John R. Mathiason and Charles C. Kuhlman

New York University*




The Internet today has reached a level of political importance where some form of governance policy is needed. The problem is to determine which policies should govern which aspects of the Internet.

From one perspective, the Internet does not exist: it is merely a conglomeration of linked individual networks which has no formal corporeal existence. From this perspective, there is no need for any policy save laissez-faire. What may need to be governed are specific pieces, since there is no whole.

From another perspective, the elements of the Internet constitute a conceptual whole, the ultimate commons, where no part can function well without all other parts operating well. From this perspective, some form of overall governance is essential, since without a well-functioning whole there will be no functioning parts.

This paper examines the issue of what kind of communications policy should govern the Internet, if any. It looks at the interface between telecommunications technology, content, economics and law that constitutes the policy problematique. It focuses on the policy issue that has placed the problematique in bold relief: the issue of how to assign domain names. It looks at the implications of this for different approaches to governance.

It uses the "R word": regulation of the Internet. It starts from the premise that, because the elements of the Internet are of public interest, there will be regulation. The issue is who will regulate what, and on what basis.



At the heart of the matter are the technological dimensions, the characteristics of the Internet that define the need and the prospects for governance. In telecommunications terms, these technological dimensions constitute a new paradigm for looking at policy.

A New Network Paradigm

Although telephone switching systems have matured to the point that hundreds of millions of connections a day can be set up and maintained by the network, their essential characteristic remains a continuous full-time pathway between two points at any one time.

Packet switching represents a significantly different communications model. The content, once put into digital form, is divided into small, well-defined groups of coded data, each with an identifying numerical address. Anything that can be digitized can be sent as a packet. To the Internet, a packet is a packet is a packet, whether it carries numbers, words, digitized sounds or digitized pictures. It became possible, and the U.S. Defense Department's Arpanet actualized the possibility, to send an unlimited number of packets over the same circuit with different addresses. Routers, rather than switches, became the key to delivering the packets to the intended destination. Controlled by software and microprocessors, the router inspects the address of a packet and sends it on its way on a full-time circuit to another router and toward an eventual end point. As the Arpanet evolved into the TCP/IP network that undergirds the Internet, the designers reserved 32 bits for the packet address (to be superseded by 128 bits in Internet Protocol version 6 [IPv6]), represented in decimal notation in an xxx.xxx.xxx.xxx format, where each group of x's can range from 0 to 255.
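The dotted-decimal notation described above can be sketched in a few lines of Python. This is a modern illustration of the arithmetic, not part of the original protocol work; real network stacks use library routines (e.g. inet_ntoa) for this conversion.

```python
# Sketch: how a single 32-bit packet address maps to and from the
# familiar dotted-decimal form, where each group ranges from 0 to 255.

def to_dotted_decimal(addr32: int) -> str:
    """Render a 32-bit address as four decimal groups, each 0-255."""
    octets = [(addr32 >> shift) & 0xFF for shift in (24, 16, 8, 0)]
    return ".".join(str(o) for o in octets)

def from_dotted_decimal(text: str) -> int:
    """Pack four decimal groups back into a single 32-bit address."""
    value = 0
    for part in text.split("."):
        octet = int(part)
        assert 0 <= octet <= 255, "each group must fit in one byte"
        value = (value << 8) | octet
    return value

print(to_dotted_decimal(0xC0A80001))  # 192.168.0.1
```

The 128-bit IPv6 addresses mentioned above follow the same packing idea, though they are conventionally written in hexadecimal groups rather than decimal.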

The innovation of the Domain Name System (DNS) in 1984, prior to the creation of the Web in 1989-90, provided synonyms for the somewhat inscrutable digit strings of the actual addresses. The actual addresses of the packets remained the digit strings, but they were replaceable by more or less scrutable alphabetic equivalents stored in a DNS server file, which permitted the lookup of the alphabetic name from a numerical address and vice versa. Thus was born the web site name, a new entity and a new property right in a new and legally ambiguous sphere.

The telephone system's addressing scheme started in the simplest possible fashion, with Sally asking an operator to connect her to Harry. Under the direction of the Bell System-affiliated companies and by agreement with the non-Bell operating companies, the present U.S. ten-digit area code-exchange-line number system evolved over decades into a national standard. By contrast, the Internet addressing scheme was designed from its day of creation by engineers and scientists as a logical and comprehensive construct to meet their needs for low-cost data communication. The integrity of the scheme was guaranteed by its sponsorship by the US Department of Defense and by later private sector successors.

The alphabetic names associated with numeric addresses were divided into domains, a limited typology of alphabetic addresses that enabled lookups to be done efficiently. To find the numeric twin of the Internet address "UN.ORG," for instance, a name server need not search through every entry in its address table, just the entries ending in ".ORG." These suffixes, the first level of searching and selecting, are known as top-level domains: the current set includes .COM, .MIL, .ORG and .NET, corresponding to net addresses for entities in commercial, military, non-profit and network administration endeavors.
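The efficiency gain from suffix-based lookup can be illustrated with a toy resolver in Python. The table entries and numeric addresses below are invented for illustration (drawn from documentation-reserved ranges), not real registrations, and the real DNS is a distributed hierarchy rather than a single in-memory table.

```python
# Toy illustration of top-level-domain lookup: the name table is
# partitioned by suffix, so resolving "UN.ORG" searches only the
# ".ORG" bucket, never the ".COM" one. All entries are hypothetical.

name_table = {
    "ORG": {"UN.ORG": "198.51.100.7"},        # hypothetical address
    "COM": {"EXAMPLE.COM": "203.0.113.25"},   # hypothetical address
}

def resolve(name: str) -> str:
    """Look up a name by first selecting on its top-level domain."""
    tld = name.rsplit(".", 1)[-1].upper()   # first level of selection
    bucket = name_table.get(tld, {})        # only this bucket is searched
    return bucket[name.upper()]

print(resolve("un.org"))  # 198.51.100.7
```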

Internet addresses are conceptually very different from telephone numbers. In the U.S., Canada and the Caribbean, most area codes (technically NPAs, or Numbering Plan Areas) denote a geographic place with boundaries identifiable with governmental jurisdictions: nations, states, cities. The exchange part of the phone number is traditionally associated with a specific place with a street address, the central office, from which the wires emerge to connect the telephone user over the "last mile" to the network. The place-centered nature of the phone system numbering plan is beginning to break down with the rise of wireless cellular systems and the widespread use of ghostly 800 and 888 numbers, which may be answered here one minute and there the next. Nonetheless, jurisdiction can be established in all cases. Internationally, the country code-city code numbering system links phone number to place to jurisdiction.

Internet communications

Internet addresses have no fixed location. They are purely conceptual. There is no central office. The routers, which direct packets to the packet address at rates between 100,000 and 500,000 a second, know only the next logical point in a routing table and which outbound circuit is available to carry the packet. Packets are free to traverse the globe on countless circuits to geographically indeterminate end points. The technology provides assurance that the packets are reassembled in the right order and are very likely not corrupted by data errors.
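The reassembly guarantee mentioned above can be sketched as follows. This is a simplified model of sequence-numbered reassembly, not the actual TCP implementation, which also handles retransmission, duplicates and checksums.

```python
# Minimal sketch of the reassembly guarantee: each packet carries a
# sequence number, so the receiver can restore the original order no
# matter which routes the packets took or when they arrived.

from dataclasses import dataclass

@dataclass
class Packet:
    seq: int        # position in the original stream
    payload: bytes  # the fragment of content carried

def reassemble(packets):
    """Restore the sender's byte stream from out-of-order packets."""
    ordered = sorted(packets, key=lambda p: p.seq)
    return b"".join(p.payload for p in ordered)

# Packets arrive out of order after traversing different circuits.
arrived = [Packet(2, b" a packet"), Packet(0, b"a packet"), Packet(1, b" is")]
print(reassemble(arrived))  # b'a packet is a packet'
```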

A further distinguishing characteristic of Internet addresses is that neither the sender nor the receiver of a packet is a paying customer for the packet. Telephony requires two paying customers to complete a call, each of whom is paying for the privilege and each of whom has at a minimum a billing address and usually a street address in a city, a state/province and country. Internet senders and receivers are tied to place neither by the billing process nor by the technology.

We have identified the technical underpinnings of novel realities which have led to major policy debates which are far from resolved. From inside the Internet, names for addresses are structured but purely arbitrary, the technology is indifferent to content, and the sender/receiver dyad is unlocatable in a conventional spatial sense.

Technology and Intellectual Property: Names, Sounds, Pictures, Words, and Ideas

Names have value; legal ownership and the right of exclusivity for patents, trademarks, service marks and copyrights for sound/picture/literary content are well-established in Western law. The treaties underlying the World Intellectual Property Organization (WIPO) have extended the principles, if not the practices, to a wide spectrum of countries. A substantial body of commercial law and practice guarantees that recourse is available to a party which believes its property rights have been infringed upon. A variety of adjudicatory mechanisms are available to resolve disputes and provide redress at a national level.

The Internet poses a challenge due to its indeterminate ubiquity. Infringement becomes possible from any corner of the globe, that is, from any address on the Internet. If content can be digitized, it can be not only pirated but also disseminated globally with no impediment. No court, mediation board or arbitrator can be presumed to exist with authoritative jurisdiction even if the infringer can be definitively identified. The efforts of the United States to halt the active commerce in pirated compact disks and software in China are illustrative of the difficulty of maintaining ownership of intellectual content; absent any overarching jurisdiction, diplomatic pressure had to be brought to bear in a bilateral context of numerous other foreign policy issues. The Internet raises the very real specter (for the owner of content) of massive evaporation of assets.

The Internet is not an inherently broadcast medium, although it shares the ability of radio and television to reach many people simultaneously through Web sites either sought deliberately on each occasion by the receiver (a "surfer") or "pushed" by a Web application such as Pointcast. With the partial exceptions of short-wave radio and the direct broadcast satellites, the content of radio and television (including cable TV) broadcasting since Marconi has been firmly under the thumb of governmental authority, under the rationales of orderly spectrum allocation (U.S.), revenue generation twinned with cultural uplift (U.K.) or outright thought control.

The potentially universal accessibility of content via the Web upsets the traditional regulatory model. In its blithe way, the Web does not care what it carries: hate, love, pornography, fraud, lies, truth, scholarship and charlatanry are all the same in the stream of bits and all equally accessible. The efforts of the German government to shut down Nazi-leaning, anti-Semitic Web sites by penalizing the Internet service provider were a kind of desperate grasping for any available handle, since the real purveyors were too difficult to reach. The ease of establishing a Web site provides assurance that the extinction of one offender will not prevent recrudescence. On the horizon are technical developments which will make the Internet much more like a broadcast medium and even more subversive of government control: vastly increased circuit capacity through a technology known as "wavelength division multiplexing" and a redesign of the underlying Internet protocols to permit simultaneous transmission of the large amounts of data required for images (one such effort is known as MBONE).

Problems resulting from the technological characteristics of the Internet


1. The creation and destruction of property rights.


The ownership of domain names is just one new form of an old issue that was satisfactorily, if not perfectly, settled. A broader question is how to retain ownership of content once it has been digitized and made available on the Internet. The widespread availability of inexpensive copying machines a decade ago created a culture of book and article copying, to the great distress of conventional publishers. Lawsuits and clarifications of the copyright law resulted in a "fair use" doctrine which limits copying to personal use. "Fair use" on the Internet is much more difficult, if not impossible, to define. For one class of content creators, the issue is irrelevant: those for whom dissemination is more important than immediate financial gain. Scientists, scholars in the humanities and creative writers are increasingly bypassing paper-based media and skipping directly to Internet publication. The effect on the lucrative business of scholarly publishing will be devastating as publishers watch their feedstock evaporate.

2. Trade in services is an increasingly large portion of world trade.


The Internet promises to expand this invisible trade exponentially and uncontrollably from the standpoint of sovereign authorities. Insurance is not the only thing that can be moved beyond existing regulatory measures. Software itself, as well as music and visual content, is becoming a major element in world trade, all moved silently through the Internet. Relying upon its semi-sovereign status as well as the global nature of the Internet, the Coeur d'Alene tribe of Native Americans in Idaho has established an international lottery on the Internet, doubly beyond the control of the U.S. and Idaho State governments (NYT, 8 March 98, p. 24).

The conventional telecommunications services which have constituted a growing share of world trade in invisibles have been a well-regulated, though recently contentious, affair in both governmental and commercial terms. Traffic is measured in minutes by the sending and recipient countries and companies, and compensation is paid under international agreements much as banks settle funds flows. Internet telephony, on the other hand, upsets these arrangements by bypassing the telephone companies' call-accounting systems almost entirely through the use of dedicated full-time circuits. Once the quality-of-service deficiencies are addressed and the last-mile problem is solved, a very large piece of international trade will disappear from view.
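The minute-based settlement described above can be illustrated with a toy calculation. Under the traditional accounting-rate system, each carrier conventionally keeps half the agreed rate and only the net traffic imbalance is settled; the rate and minute figures below are invented for illustration.

```python
# Toy sketch of international telephone settlement: the two carriers
# compare minutes sent in each direction and settle the imbalance at
# half the agreed accounting rate. All figures here are invented.

def net_settlement(minutes_a_to_b: int, minutes_b_to_a: int,
                   accounting_rate: float) -> float:
    """Amount owed by country A's carrier to country B's carrier
    (negative if B's carrier owes A's)."""
    settlement_rate = accounting_rate / 2   # each carrier keeps half
    return (minutes_a_to_b - minutes_b_to_a) * settlement_rate

# A sends 1.2m minutes, B sends 0.8m, at a $1.00/minute accounting rate:
print(net_settlement(1_200_000, 800_000, 1.00))  # 200000.0
```

Internet telephony carried over dedicated full-time circuits generates no per-minute records at all, which is why it bypasses this accounting entirely.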

Finally, some of the trade is in goods and services that are usually regulated in national commerce but where the nature of the Internet permits easy avoidance. Pornography and gambling are two examples where the fluidity of the Internet makes national control extremely difficult. The revenues generated in these two trades have not been estimated, but are clearly large.


3. The authenticity of communications becomes suspect on the Internet.


Considerable effort is being devoted to security and encryption due to the anonymity of the Internet and the indeterminacy of the packet technology. Law enforcement and national security interests have butted up against the anxiety engendered by the lack of clear knowledge of where the Internet address is located and who is there. Authenticity anxiety is most powerfully felt when money is exchanged, but is also present in other types of communication as mundane as scholarly texts; is the paper on the Web really authored by Mathiason or Kuhlman?


4. Privacy of communication is a reasonable assumption in circuit switched networks.


Indeed, the almost total digitization of the telephone network and its reliance on extremely high speed multiplexing techniques drove the U.S. government to successfully press for the adoption of the Communications Assistance to Law Enforcement Act (CALEA) which obliges telephone companies and their suppliers to modify their equipment to support digital wiretaps in ways which have yet to be resolved. Privacy is not a reasonable assumption on the Internet.


5. The problem of preserving national, regional and local culture is exacerbated by the Internet.


Before the Internet achieved its current prominence, the dominance of U.S. mass media products in the world market was a major bargaining issue during the Uruguay round of trade negotiations with France and Canada holding out for significant restrictions. Substantively unrelated agricultural negotiations led to compromises which apparently satisfied the negotiators (Gauntt, 1997). The Internet players are not so easily identified and mollified (or restricted) as the Disney Corporation and Rupert Murdoch. The relatively low cost of entry and exit means that backyard moguls can become significant originators of cultural content available everywhere. A clash with local mores is inevitable in open societies and even more so in closed ones. The dominance of the English language on the Web is a thorn to cultural preservationists outside the English-speaking countries (and even inside, vide Quebec). Efforts are underway to make the Web multilingual but the most common ways of representing text digitally were designed for the Roman alphabet, especially as used in English.


6. Crime and terrorism take on new guises with the availability of inexpensive and widely available instantaneous global communication.


As evidenced by the mostly benign (so far) attacks by self-styled "hackers," the network and the computers connected to it themselves are potential targets for criminal enterprises that normally fall under the rubrics of theft, larceny and property destruction. Internet aside, as the normal business of daily life such as power distribution, financial systems, air traffic control become pervasively intertwined with automated systems, the potential for criminal and terrorist attack becomes acute.


In each of these areas, the historically workable body of commercial and international law either has not been brought to bear or is inherently incapable of being applied.



Had the Internet remained a vehicle for communication among highly specialized publics, such as the scientists and engineers who first used it, there would probably be no need to talk about governance. The "club-like" approach to maintaining system coherence and allowing for technological change worked well when there were only a limited number of actors and very few evident economic or social issues connected to the technologies. In fact, the technical management of the Internet could still be accomplished by the consensual approach were it not for the non-technical issues.

The telephone industry is a case in point. In the United States, non-governmental structures such as the American National Standards Institute (ANSI), the Institute of Electrical and Electronic Engineers (IEEE), the National Exchange Carriers Association (NECA) as well as legacies of the monopoly era such as Bellcore have effectively set and maintained interoperability, accounting and settlement standards with little authoritative government input. Standards-setting in these bodies is bottom up: companies with a compelling commercial need convene working groups to develop compromise standards which will ensure interoperability in the interest of establishing a wider market.

Some commercial enterprises are able to set standards without the assistance of either voluntary groups or governmental authorities. Market dominance is the standard-setting mechanism. We do not need to cite the all-too-obvious examples.

ISDN is an interesting case of an elegant, logically coherent and comprehensive construct built from first principles, top down. Although it is very important for the carriers, it is less so for consumers in the US; data networking largely ignores ISDN. Likewise, the OSI data communications model, while serving as a point of reference, has been bypassed by the semi-conformant TCP/IP model.

Growth causes problems because it increases the number of players. Many of these do not share the values of the technicians and academics who set up and were the first users of the Internet. The perspective of short-term profit is different from, and often contradictory to, the perspective of long-term free and unregulated exchange of information.

Moreover, as the Net becomes more lucrative, there is an urge on the part of governments to tax it as a means of raising public revenue. Here the decentralized and borderless nature of the Internet makes it difficult to collect taxes fairly since it is easy to avoid them. While some of the major governments in the world talk of policies to ensure that the Internet is a tax-free zone, it is questionable how long that commitment will last.




Ultimately there are three approaches to regulation of the Internet: a self-regulating market, national regulation or an international regime. Each has its own distinct features and limitations.


1. Self-regulating market.


One version of this approach is Peter Huber's Law and Disorder in Cyberspace (NY: Oxford, 1997), where he argues that the advance of technology is so rapid that no regulatory regime has had, or can have, any effect which is not detrimental. Order in the "telecosm" (Huber's neologism) should be maintained by private actors and private litigants, common law courts and the market. Against Huber, Stewart Baker has argued that judges are clumsy and retrograde makers of social policy (WSJ, 3 Nov 97). More to the point, in the international arena, the Anglo-Saxon common law tradition is non-existent and "the market" is a synonym for the global behemoths of Canada, the United States, the European Union and Japan.


2. A market guided by national authorities.


Just as the Federal Communications Commission has regulated the cable TV, telephone and broadcast industries in the US for almost 100 years, so also could it and its foreign counterparts regulate the Internet. Two obstacles are likely to foreclose this option. First, the technology is mutating so quickly that even the FCC cannot keep up with the demands for new regulations in its traditional sphere. Second, the Internet lacks the choke points of easily identifiable parties to regulate. A comparison of the efficacy of customs collection from large ocean vessels with that from Caribbean drug runners in small fast boats which can land anywhere points up the difficulty of dealing with widely dispersed mobile targets.


3. An international regime.


The basis for an international umbrella to set policy and resolve disputes has been laid with the World Trade Organization, which is responsible for administering trade agreements such as the General Agreement on Tariffs and Trade (GATT), the General Agreement on Trade in Services (GATS) and the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS). The emerging services of the Internet and the dilemmas they pose have not yet been addressed by the WTO, but the model has been established. We argue that an international convention with representation of an unprecedented kind can set the framework for international regulation of the Internet. The precedent has been established with both the Rio de Janeiro conference on the environment and the Beijing conference on women; both conferences drew their representation and their strength from a broad range of interests that went well beyond nation-states.


The three contending approaches are not mutually exclusive: they could co-exist by assigning one approach to one aspect of Internet policy and others to other aspects. For this to work, however, there would need to be an overarching understanding of which approach would be applied to which aspect and some means to ensure that the result of the approach in one aspect did not have a negative impact on another.



One way to look at communications policy is to break down communications itself into its component parts. In an analytical sense, communication includes (a) a sender, (b) a message, (c) a channel or medium, (d) a receiver and (e) feedback. Each component has its own regulatory dimension and its own constituency. The model illustrates the complexity of the international regulatory problematique by suggesting the many agencies involved.


Table 1. Public policy issues in communications, national and international

Sender
  National issues: Universal service; tax policy
  International efforts: Developing country access (UNDP/World Bank/ITU); non-discrimination (UNHCHR); competition in services (WTO); authentication (UNCITRAL); domain name registration (ITU/WIPO); trademarks (WIPO)
  Constituencies: E-commerce companies; software producers; electronic media; universities and libraries

Message
  National issues: Content regulation
  International efforts: Norms for content regulation (UNHCHR); copyright (WIPO)
  Constituencies: Government censors; press, film and music industries; civil libertarians

Medium
  National issues: Telecom regulation; encryption policy; public investment
  International efforts: Transmission standards and protocols (ITU); satellite orbit slots (ITU); competition (WTO); payments policy (ITU); infrastructure investment (World Bank)
  Constituencies: Telecommunications companies; banks and financial service companies; Internet service providers

Receiver
  National issues: Universal access
  International efforts: Universal access
  Constituencies: User groups; students and researchers

Feedback
  National issues: Access to governance
  International efforts: Access to governance
  Constituencies: Non-governmental organizations; industry groups; developing country governments

It is no coincidence that the first two universal international organizations were concerned with communications: the Universal Postal Union and the then International Telegraph Union. Without some regulatory mechanism, communication flows over national borders could not be assured. Both institutions have endured (even though the ITU became the International Telecommunication Union) as what are now termed technical agencies of the United Nations system. The ITU has, through the domain-names controversy, become involved in Internet regulation, together with WIPO.

The questions, however, are whether the Internet will change the parameters of international regulation, whether the existing agencies can perform the task, and whether there is now a need for a conscious international communications policy.


The sender

The key issue for senders is to ensure access. In order to be a sender, one needs a method of identification. The need for order in identification is reflected in the discussions about how best to register domain names. In that sense, it is probably not surprising that the first large controversy in international regulation is in that area. Here the market mechanism has largely failed. The controversy also lays out in stark relief that there are many more significant parties to the question than just governments: individual and corporate senders, non-governmental organizations, channel providers and receivers as well.

A second issue for senders is the freedom to send messages. This is partly a matter of content (as will be seen below) but is also a matter of having a network on which any sender can expect to reach receivers. Here the question turns on whether reasonable access can be obtained to Internet service providers, with minimal regulation and reasonable cost, and whether receivers can be expected to be able to get the messages. This latter, again, is a matter of provision of the appropriate technology at a reasonable cost. The case in which Internet intermediaries in Austria were obliged to disconnect Internet sites in Iraq is an example of this issue.

Cost is related to the degree of competition among senders, and here the work of the World Trade Organization in terms of increasing trade in services is relevant.

Organizations of senders are beginning to develop. These can be nationally-based, as many are, but they are increasingly transnational. This in itself is a new phenomenon. It creates the prospect of international non-governmental organizations of senders which need not be comprised, as are traditional NGOs, of federations of national affiliates.


The message

Issues of message relate to regulation of content: what messages should be allowed, who will determine whether they will be allowed and who will regulate this. The Internet developed as an almost free market for ideas, but many individuals and a few national authorities have found messages objectionable. One example is the German government's attempt to censor adult material through prosecution of CompuServe managers for alleged pornographic content. A similar issue can be said to have been raised by efforts to regulate the use of encryption technology.

Communications policy includes regulatory policy, and there are already efforts to develop national regulatory norms that would be applicable to the Internet, most recently the German Information and Communication Services Bill. These are highly controversial, since restriction of one kind of content may set precedents for wider regulation. They also pose national constitutional questions, as was found in the invalidation of the Communications Decency Act in the United States. They are also likely to be ineffective, since the borderless nature of the Internet makes the applicability of national regulations highly problematic.


The medium

The medium or channel over which the message is sent has historically been the focus of communications regulation. Partly this was because the channel, which was a physical entity, could be regulated. Partly it was because communications channels, like frequencies or lines that passed over public lands, were inherently public goods. At the international level this aspect of regulation has been reflected in international standards for bandwidths, frequency allocations and exchange protocols. It has also been reflected in agreements on the allocation of geostationary orbit slots.

Ensuring competition among providers of media has been a concern. Even as many governments privatize their national telecommunications systems, mergers among the main Internet pipe providers have raised the specter of monopoly and produced some efforts at regulation, as in the European Commission's scrutiny of the proposed MCI/WorldCom merger.

A newer approach seeks to regulate content by regulating channel providers. Examples include the government of Austria shutting down an Internet service provider by confiscating its physical servers, the libel action against America Online, and the previously mentioned German case. The futility of this type of regulation has also been demonstrated.

Perhaps more importantly, telecommunications technology is increasingly intersecting with the recognized global commons, as satellite-based transmission technologies designed to increase bandwidth and ensure coverage begin to compete for scarce orbital slots. The fact that two private corporations, Teledesic and a consortium of Motorola and others, intend to place large numbers of communications satellites in orbit will inevitably require some effort at regulation. Similarly, the increase of wireless transmission is already leading to regulatory efforts on a national basis and can be expected increasingly to enter international "space".


The receiver

As in the case of senders, the key issue for receivers of messages is access. One should be able to receive the messages that one wishes, and the Internet, with its packet delivery technology, facilitates this. Access, however, can be controlled by cost, by regulation or by technology (e.g. v-chips, surf watchers). The largest obstacle, however, is access to service providers, particularly in the developing countries. Partly this is a matter of technology transfer, partly of cost. However, as events such as the World Bank/UNDP/Canada Global Knowledge '97 Conference showed, many of the technical solutions are available, if they can be disseminated.

The real question is cost, including the cost of national telecommunications. Here it is partly a matter of the local assignment of costs, which is related to telecommunications monopolies and the issue of competition, and partly a matter of how international prices are calculated. In the matter of postal rates, which may have an analogy with Internet rates, mechanisms have been put in place to adjust rates to both national and international reality in the public interest.



Feedback

The final aspect of communications is feedback. Here the interactive nature of the Internet makes it perhaps the most complete communications system yet devised. How and whether feedback can be regulated is as yet an open question, although the well-publicized libel case against Matt Drudge and America Online is beginning to indicate that there is an issue here. Similarly, the role of gatekeepers, including non-governmental organizations, in channeling feedback has yet to be explored.

Each of the elements also describes a constituency or constituencies (used in preference to the new jargon term "stakeholders") whose interests may conflict.



Internet governance might have remained a dormant issue were it not for the domain name dispute. This issue stands at the interface between the technological issues of Internet management and the economic and social issues that have emerged. It affects most of the aspects of communications. Its resolution has placed competing models in stark relief. It has illuminated the problems of national versus international regulation and it has mobilized the distinctly different constituencies within the Internet.

Origins of the problem

The assignment of names to domains in preference to numbers was a convenience for the early users of the Internet. The responsibility for assigning numbers was delegated by the United States government to an independent entity called the Internet Assigned Numbers Authority (IANA), headed by Prof. Jon Postel of the University of Southern California. The IANA was primarily concerned with ensuring that duplicate numbers were not assigned and that assigned numbers were entered into a central directory (the root directory). IANA assigned the country codes for domain names, including that for the United States (.us).

The attachment of names to numbers that were not country-specific, under the generic top-level domains (gTLDs) .com, .org and .edu, was the responsibility of another entity working under a subcontract from the National Science Foundation, awarded through competitive bidding. The entity was a private company, Network Solutions, Incorporated (NSI), functioning as InterNIC. It is a commentary on the size of the Internet when the contract was issued that the NSF provided a subvention to NSI to cover the costs of what was in effect a free registration service.
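The division of the name space described above, with country codes delegated by IANA and generic top-level domains registered through NSI/InterNIC, can be sketched as a simple classifier. This is a minimal illustration only; the domain sets below are small samples, not the full registries of the period.

```python
# Illustrative samples of the mid-1990s top-level domain split; not complete lists.
GENERIC_TLDS = {"com", "org", "edu", "net", "gov", "mil", "int"}
COUNTRY_TLDS = {"us", "uk", "fr", "de", "jp"}  # two-letter ISO 3166 country codes

def classify_domain(name: str) -> str:
    """Split a domain name into dot-separated labels and classify its top-level domain."""
    labels = name.lower().rstrip(".").split(".")
    tld = labels[-1]  # the rightmost label is the top-level domain
    if tld in GENERIC_TLDS:
        return "generic (registered via NSI/InterNIC)"
    if tld in COUNTRY_TLDS:
        return "country-code (delegated by IANA)"
    return "unknown"

print(classify_domain("www.nyu.edu"))    # generic (registered via NSI/InterNIC)
print(classify_domain("example.co.uk"))  # country-code (delegated by IANA)
```

The hierarchy matters for governance because each top-level domain implies a different registration authority, which is precisely the split the domain name dispute contested.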

As the Internet increased in size, the number of registrations increased dramatically. NSI made a decision to charge for site registration, a decision that was extremely unpopular among those who believed that the Internet should be, in effect, a free good. The cost factor, plus some management problems in NSI caused by the sudden increase in volume, led many Netizens to see NSI as a potential monopoly. NSI's policy of registering any name that was not duplicative also produced problems when trademarked names were appropriated by other persons. A number of entrepreneurs, sensing the potential growth of the Internet, registered blocks of names and, in some cases, resold them to the trademark holders in what was perceived as a form of extortion. Court cases based on trademark infringement began to emerge. As the number of domains registered under .com increased into the millions, a shortage of good names was perceived.

Among major constituency groups like the Internet Society, some of the major telecommunications companies and a growing group of Internet service providers (ISPs), the problems with domain registration affected both the order and the procedures of the Internet. The NSF subcontract with Network Solutions was due to expire in 1998, and these groups decided to develop an alternative. In the tradition of informal governance, they formed what was called the International Ad-Hoc Committee (IAHC) to develop an alternative method of domain registry.

The resulting proposal, evolved under the "rough consensus" model that had traditionally governed Internet standards, included the creation of seven new top-level domains, the establishment of a large number of registrars, the creation of a central registry of names and numbers, the creation of a dispute resolution machinery and the establishment of an Internet policy institution. Significantly, it included an involvement of two international organizations, the ITU to register the registrars and the WIPO to manage the dispute resolution mechanism.

The proposal was embodied in a Memorandum of Understanding (MoU) that was signed by a large number of parties at ITU headquarters in May 1997. Memoranda of Understanding were a common method within the ITU for establishing standards without a formal intergovernmental agreement, although their legitimacy was reinforced by their status within the international telecommunications regime. The MoU created a mechanism that included a Policy Advisory Board, made up of representatives of the various Internet constituencies, a Policy Oversight Committee (POC), made up of elected representatives of constituency groups, and a Council of Registrars (CORE), composed of those entities selected to register domain names, to oversee the domain name registration process.

The net effect of the MoU would be to internationalize Internet governance, at least in terms of one of its central functions.

While for the "Internet establishment", the MoU solved what was becoming a major problem, for others, including NSI, it constituted both a threat and an affront. A number felt that it gave too much power to international organizations. Others felt that it bypassed national regulatory mechanisms, especially in the United States. Still others felt that any regulation whatsoever threatened the open character of the Internet. Pressure was put on governments not to accept the MoU.

The United States government was put under particular pressure. As the government which had funded much, but not all, of the development of the Internet and which, through IANA and the NSI contract, maintained the root directory, it felt a particular responsibility. At the same time, the United States Federal Government was in the midst of an effort to deregulate industries. Cross-pressured, the United States delegation to the ITU did not sign the MoU and sent conflicting signals about its position.

Clearly unsure of its position, the United States government reverted to procedures used nationally when regulations are contemplated: a period of public comment. The national entity concerned with telecommunications regulation, the National Telecommunications and Information Administration (NTIA), issued a request for comments on June 16, 1997, based on a series of specific questions about Internet governance in general and about detailed aspects of the domain name registration question.

Some 282 distinct comments were received over the comment period in July and August 1997. Based on these, an advisor to the United States president began to prepare a proposal for management of the Internet.

The proposal was finally issued in what was called a "green paper", a draft policy statement. The Green Paper sought to resolve the dilemma by creating a new structure for domain name assignment through the devolution of the IANA function to a non-profit public corporation located in the United States, creating five new generic top-level domain names, undertaking a study of Internet governance and, as a transitional matter, extending the NSI registration contract until the details of the new system were worked out.

The Green Paper was also submitted for public comment on February 20, 1998. By the end of March 1998, over 500 distinct comments had been received. They ranged from short comments favoring one or another model through detailed, well-elaborated comments on specific issues raised in the Green Paper as well as on the Green Paper itself. They included comments from individuals and from the European Union.

An empirical analysis

The two sets of public comments on US government policy proposals for the Internet constitute a unique picture of the various issues in Internet governance and the major actors involved. The changes in perspective reflected in the six months between the first and second comment periods are also instructive.

In order to observe these changes, we coded all of the distinct responses in terms of their perspective on the three contending approaches to regulation: acceptance of an international role, acceptance of the concept of self-regulation and attitude towards national regulation. Our analysis of the first comments has already been reported in detail.1/ We included only one comment by each individual2/ and, in the case of a number of identical communications sent by different persons, included the comment only once. As a result, the number of comments in our database is smaller than that reported by the NTIA.
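The coding scheme just described, one record per comment with a stance on each of the three regulatory dimensions, lends itself to a simple tally. The records below are invented for illustration; the actual data are the coded NTIA comments.

```python
from collections import Counter

# Each coded comment records a stance ("favor", "oppose", "no mention") on the
# three contending approaches. These four records are hypothetical examples.
coded_comments = [
    {"international": "favor",      "self_regulation": "favor",      "national": "no mention"},
    {"international": "favor",      "self_regulation": "no mention", "national": "oppose"},
    {"international": "no mention", "self_regulation": "favor",      "national": "oppose"},
    {"international": "oppose",     "self_regulation": "no mention", "national": "no mention"},
]

def tally(dimension: str) -> Counter:
    """Count stances on one coding dimension across all coded comments."""
    return Counter(c[dimension] for c in coded_comments)

print(tally("international"))  # Counter({'favor': 2, 'no mention': 1, 'oppose': 1})
```

Deduplication by commenter, as described above, would simply filter the record list before tallying.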

Commenters and constituencies

The pattern of commenters showed some change over the two periods, as can be seen from Table 2. The number of individuals commenting in their own right increased, while the comments from companies involved in web management decreased. The individuals were different, however. While in the 1997 comments a large proportion made non-substantive comments or vented frustration about Network Solutions, in the 1998 comments the focus was on the content of the Green Paper.


Table 2. Types of commenters on NTIA proposals

[Table cells lost in extraction. Columns: type of respondent, July-August 1997, February-March 1998; row labels included individuals, institutions and web management.]

*Not coded separately in 1997, included under individuals.


The most significant difference, however, was the proportion of comments coming from outside the United States. Only seven percent of the commenters in the July-August request for comments were from outside the United States; in the March 1998 round, they constituted at least 20 percent.3/ Perhaps more importantly, they were more heavily represented among the commenters from institutions, a category which includes a diverse group of entities ranging from industry associations through non-governmental organizations to user groups like the Internet Society. This included some of the quasi-official bodies like the Policy Advisory Board, the Policy Oversight Committee and the Council of Registrars, all of which are mechanisms set up under the MoU.


Table 3. Type of commenters by origin, March 1998

[Table cells lost in extraction. Row labels included web management; counts by origin are not recoverable.]









A major criticism made of the Green Paper, in many of the comments, was its "US-centric" orientation. Many commenters noted that the Paper largely ignored the work of CORE in Geneva, proposed a United States location for the central registry corporation and had little role for international organizations. One of the dimensions that we coded in the responses was whether they saw an international regulatory dimension for the Internet.

As can be seen from Table 4, most of those who mentioned the international dimension were favorable to it, but commenters from outside the United States were much more likely both to mention and to favor an international dimension. This of course includes a large number of comments which explicitly favored the CORE model. In terms of types of commenters, individuals (who were overwhelmingly from the United States) were less likely (44 percent) to mention the international dimension, while institutions (78 percent) and web managers (65 percent) were much more likely to do so. This was a much larger proportion for all categories than was found in the July-August 1997 comments, which focused on a more US-centric set of questions and which we coded differently. In that round only about a quarter of the commenters mentioned the international dimension, although a group that was identified as more actively involved4/ showed proportions similar to those of institutions in the later round (see Table 5).
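The origin-by-stance comparison reported above is a cross-tabulation of two coded attributes. A minimal sketch, again with invented records rather than the actual NTIA data:

```python
from collections import Counter

# Hypothetical (origin, mentions_international) pairs for illustration only.
records = [
    ("US", True), ("US", False), ("US", False),
    ("non-US", True), ("non-US", True), ("US", True),
]

# Cross-tabulate origin against whether the international dimension is mentioned.
crosstab = Counter(records)

def pct_mentioning(origin: str) -> float:
    """Share of comments from one origin that mention the international dimension."""
    total = sum(n for (o, _), n in crosstab.items() if o == origin)
    return 100.0 * crosstab[(origin, True)] / total

print(f"{pct_mentioning('non-US'):.0f}%")  # 100%
print(f"{pct_mentioning('US'):.0f}%")      # 50%
```

The same two-way count underlies each of the percentage comparisons in Tables 4 through 6.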


Table 4. Orientation to international governance of the Internet, March 1998 comments

[Table cells lost in extraction. Rows included a "no mention" category; counts are not recoverable.]


















Table 5. "Eminent" commenters in July-August 1997 who mentioned international nature of the Internet, by type

[Table cells lost in extraction. Types of respondent: business association, international organization, large corporation, small business, web management, grand total. Counts are not recoverable.]




While a lower proportion of commenters mentioned the concept of Internet self-governance as a model, almost all who did so favored this approach. This was particularly true of commenters from outside the United States (41 percent), in contrast to those from inside the United States (32 percent). Institutions were particularly likely to mention and favor this approach (55 percent), as had also been the case in the July-August 1997 comments. The proportion mentioning self-government in February-March 1998 (34 percent) was slightly higher than that in July-August 1997 (27 percent).

There was little support for a model based on national regulation of the Internet. Only about half of the comments mentioned national aspects of regulation at all, but of those who did so, opposition was clear, as can be seen from Table 6. In fact, there was little difference in this between commenters from outside and inside the United States.


Table 6. Mentions of national regulation issues, March 1998

[Table cells lost in extraction. Rows included a "no mention" category; counts are not recoverable.]
















Only the few comments from government sources mentioned government regulation favorably.

The comments elicited a number of different positions. Many commenters clearly favored the CORE proposals. They included some of the CORE principals, registrars-in-waiting and persons wanting to register domains using the new generic top-level domain names, as well as persons looking for the combination of international self-government represented by CORE. Included were comments provoked by one of the CORE registrars.

A few comments favored the status quo of a U.S. government involvement and ancillary institutions.

A large number of comments, particularly from the United States, accepted the Green Paper as a reasonable compromise and made comments on its structural aspects. Some of the commenters from outside the United States referred to mechanisms to ensure international representation on the board of the new company to replace IANA. Many of the U.S. commenters were concerned that their particular constituency should be represented.

To draw a picture of the comments, we sought to code the main policy focus of each comment. Some were merely statements for or against the main contending institutions, Network Solutions and CORE; others referred to structural aspects of the Green Paper proposals; while still others focused on the specific issue of trademarks and the resolution of disputes about them, which many felt were not well treated in the proposals. Table 7 shows this.


Table 7. Main policy focus of comment, March 1998 (513 cases)

[Table counts lost in extraction. Policy focus categories: opposed to or critical of Network Solutions; supportive of Network Solutions; opposed to or critical of CORE; supportive of CORE; like status quo, no change is desirable; trademark issues; structural issues relating to the Green Paper; business or commerce issues; technical issues in domain names.]


The analysis of comments shows a wide diversity of views. Of importance, however, is the increased international interest in Internet governance, all the more remarkable because comments on proposed rules in the United States usually elicit little attention. Clear also is the hostility felt outside the United States to any governance model that is US-centric.

If there had been comments on issues other than domain names, a similar diversity of views could probably be found.

It is evident that while there is the beginning of a rough consensus on some general principles, there is no consensus on details. There is a general understanding that governance should be international and self-regulating, but no agreement on the mechanisms. There is a great deal of diversity of views even on the facts. Some of the defenders of the status quo referred to the fact that the Internet had been created as a result of US government efforts, while some critics of the Green Paper pointed out that many aspects of the Internet, including some of the early basic research, had been developed outside the United States.

There were differences of views on whether a single registry (or directory) was needed, whether registrars should run registries, whether any of these would enhance or restrict competition.

It should be noted that the matter of domain names has only been widely discussed in the context of the NTIA request for comments. The CORE group undertook a wide-ranging and thorough consultation of those whom they considered to be the constituencies of the Internet, but it is obvious that they missed some key groups. The issue has not been discussed in any detail at the intergovernmental level and, apart from the links between domain names and intellectual property, little attention has been given to the other communications aspects of the Internet.



While there is a general consensus favoring self-regulation with very limited public intervention, there is no consensus yet on most of the issues of how to bring this about. There has also been little examination of the larger picture, beyond domain names. The percolating issues of Internet commerce, content regulation and extension of the Internet to widening publics, especially in the developing countries, are moving an increasing number of governments to consider the possibility of defining more clearly a comprehensive Internet regime.

In the parlance of international relations theory, a regime is "a set of implicit or explicit principles, norms, rules and decision-making procedures around which expectations of actors (States) converge in order to coordinate actors' behavior with respect to a concern to them all."5/ Development of a regime usually follows a sequence of agreements, either tacit or formal. The first stage is to agree on what are termed principles, which are statements of fact. In terms of the Internet, they would include statements about the nature of the technology involved and about the essential mechanisms for making the Net work. The second stage has to do with norms, statements about rectitude. In terms of the Internet they might include such norms as free and open access, avoidance of content control, the responsibilities of senders, media and receivers, and issues of payment and taxation. A third stage is to convert these norms and principles into operating rules and to set up whatever machinery might be necessary to enforce the rules and modify them as may be required.

Regimes take time to create. It can be argued that much has already been done to set the bases for an Internet regime and that, in some respects, its technological operation suggests that in that sense a regime is already present. But in terms of the non-technological issues of Internet governance, there is little that can be said to have been agreed.

There are some who would argue that a regime agreed by governments is not even necessary. Some argue that the Internet is essentially a private entity, not affected with the public interest, and therefore not in need of government agreement. This is not a widely held view, however, and most would agree that the Internet is a public good or a public trust. The fact that individual governments are already trying, without much success, to regulate elements of the Internet suggests that any agreements, to be legitimate, will have to be made by governments.

The issue, then, is what type of agreements, reached where.


The Internet Charter Idea

The idea that there should be an international, intergovernmental agreement on the basis for Internet governance has already been presented by the European Commission. As a press release of February 4, 1998 stated:

Following a proposal of its members in charge of industrial affairs and telecommunication and external relations and trade policy, Martin Bangemann and Sir Leon Brittan, the European Commission today recommended in a communication to launch an international debate regarding global communications policy, to set a framework for international policy cooperation and to start a process which could lead to the adoption of an International Communications Charter.

After a review of various policy issues relating to economic and social dimensions of the Internet, the communication concluded:

Although good progress has been achieved, the understandings and agreements arrived at within these fora consist either of principles, which are not necessarily compatible, or do not cover all elements of a comprehensive framework. Also, the process will now need to continue with as wide a participation of the international community as possible, including the developing countries. As chapter 2 shows, there are a growing number of urgent issues awaiting solutions.

It went on to state that

An International Charter would: [the enumerated points that followed are not preserved in the source text]

An International Charter would not therefore define the key issues to be solved as such, but contain an understanding on how a process of strengthened international coordination should be organized, with as wide as possible a participation of the international community. The Charter could be agreed by or in the course of 1999.

The difficulty with the European Commission proposal is that, by being non-binding, it may not be able to address the basic issues of governance, which require formal decisions of governments, and, while serving as a reference document, it may lack a mechanism for its own modification in the light of technological, economic and social changes relating to the Internet.

The EU proposal suggests either using an existing ministerial meeting or convening a special ministerial meeting to deal with the issue. The question here would be which ministers. As has been noted, the governance issues of the Internet cut across many of the traditional sectoral lines. It could be trade ministers, for example, or those in charge of telecommunications. It could involve law enforcement ministries. Choosing one ministry over another would imply a decision as to which sector is most important in Internet governance. Such a choice would also affect which of the organizations of the United Nations system would be primarily involved. There is clearly no consensus about which this might be at either the national or the international level.


The concept of a framework convention

What the European Union is proposing is the beginnings of a framework convention. Most international regimes are embodied in legally-binding multilateral treaties. However, these are increasingly sequential in nature, with treaties expanding and ramifying in an orderly way in response to further negotiation and new developments. It might be worthwhile to consider extending the EU proposal towards a framework convention rather than a non-binding document.


The example of the UNFCCC

The current best example of a framework convention is the United Nations Framework Convention on Climate Change (UNFCCC). This was a response to the dilemma of how to address the problems of global climate change, about which there was no initial consensus. As described by Paterson,6/ the process of agreeing on the climate change convention involved first reaching agreement on the nature of the problem. This largely took place in technical forums, especially those of the World Meteorological Organization. Then, when there was a consensus about the nature of the problem, it was possible to determine norms (what should be done) and to establish procedures that could be followed subsequently to deal with the issue. The climate change convention was agreed relatively quickly and signed at the United Nations Conference on Environment and Development in 1992.

Subsequently, a series of protocols have elaborated details on how to address the identified issues, most recently at Kyoto in December 1997, and a process is in motion, following agreed and legally-binding procedures, to tie up other loose ends.

The climate change convention is a model of how to establish a regime in an environment where little is agreed beyond the existence of a problem, providing an element of order without which subsequent agreements might be difficult.


What would an Internet Framework Convention do?

An international framework convention on the Internet would have to do three things: it would have to define the nature of the problems by describing the technological elements and their linked economic and social issues; it would have to articulate basic norms about how the Internet is to be governed; and it would have to establish machinery for both monitoring compliance with the norms and determining future changes.

The task of sorting out the roles and responsibilities of the numerous national, international and private entities cannot be left to Darwinian processes at this point in the development of a global enabling technology. A framework convention is needed to allocate to existing authorities regulatory rights and responsibilities and to establish dispute resolution mechanisms at the international level.


Who should be involved?

The Internet is a unique public space. It involves governments but goes beyond them. It involves many different publics but also goes beyond any one of them. It cannot be controlled by coercion and therefore its governance implies legitimacy. Legitimacy is only achieved when all of the concerned parties accept a decision or a set of rules. The rough consensus approach that governed the Internet in its early days is an example.

For this reason, any framework convention needs to be based on agreements by governments but must go far beyond them to bring on board the many constituencies that currently exist or can be forecast. Negotiation of a framework convention would require the kind of government/civil society/international organization participation that any negotiation about the global commons needs, but in this case the participation of the different parties would need to be far more explicit than in previous negotiations.

*John R. Mathiason is Adjunct Professor of Public Administration at the Robert F. Wagner Graduate School of Public Service, New York University and Managing Director of Associates for International Management Services. Charles C. Kuhlman is Director of Telecommunications at New York University.

1/ Who Will Give You Your Domain Name

2/ A number of persons sent a series of different comments. Only their initial comment was counted, since subsequent comments often were responses to comments by other persons.

3/ The coding was done according to address or other information in the content of the comment. Some of the respondents with a .com address that were coded as being from the United States could well be from other countries, since the generic domain name does not locate the sender in geographical terms.

4/ In the earlier analysis, a group of commenters identified by the World Internet Alliance was taken to represent the Internet establishment and were termed "eminents".

5/ Stephen D. Krasner, "Structural causes and regime consequences: regimes as intervening variables" in Stephen D. Krasner (ed), International Regimes, Ithaca: Cornell University Press, 1983, p. 2.

6/ Matthew Paterson, Global warming and global politics, London: Routledge, 1996.