
International Information Structures and Technologies: the social perspective

Compendium of resources from a seminar
sponsored by the United Nations
Division for Social Policy and Development,
Department of Economic and Social Affairs

Associates for International Management Services
Mount Tremper New York, USA
http://www.intlmgt.com
March 1998


Preface

Over the span of five days in late 1997 and early 1998, Associates for International Management Services (AIMS) conducted an introductory overview seminar on information technology and policy for United Nations staff, principally from the Division for Social Policy and Development, Department of Economic and Social Affairs. The goal of the seminar was to give participants a working overview of the new information technologies and of the international policy issues they raise.

The presenters were Dr. John Mathiason, Charles Kuhlman, Timothy O'Connor, and Daniel O'Sullivan. The seminar sessions addressed the current state of information technology, developments in video technology, advanced World Wide Web techniques, privacy and encryption, and major global policy issues.

The Information and Telecommunications Explosion

The new information and communications technologies challenge international organizations:
- as potential tools for more effective management and outreach;
- as a new means for economic, political and social development; and
- as a tangled cluster of global policy issues.

The digital revolution

The starting point for understanding these challenges is an appreciation of the underlying technologies. The early history of mechanical computation is entwined with the demands of war. The calculation of artillery trajectories required massive amounts of computation, performed first by hand, then by analog computers, mechanical calculators, electrical relays, vacuum tubes and finally transistor-based computers.

Once computation moved past the analog stage, it became clear that something quite different lay in the underlying methodology. The decimal numbers used for ballistic computation were transformed into their binary equivalents to take advantage of the two-state, on/off, one/zero capabilities of relays and transistors. If numbers could be manipulated by computers, anything that could be reduced to numbers could also be manipulated. Furthermore, with a little cleverness, binary numbers could be copied endlessly without error. This, then, is the essence of the digital revolution: the reduction of aspects of reality to binary form and the possibility of perfect reproducibility.
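
A minimal sketch of both halves of that essence, reduction to binary form and error-free reproduction (the number 1998 and the helper name to_binary are ours, chosen purely for illustration):

```python
# A sketch of the digital principle just described: a decimal
# number is rewritten in the two-state (0/1) form that relays and
# transistors can represent, and binary data can be copied exactly.

def to_binary(n: int) -> str:
    """Render a non-negative integer as a string of bits."""
    return bin(n)[2:]

original = to_binary(1998)                 # '11111001110'
copies = [original for _ in range(1000)]   # a thousand generations of copies

# Every copy is identical to the source: perfect reproducibility.
assert all(copy == original for copy in copies)
print(original)
```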

The widespread practical application of these insights awaited the invention of the transistor in 1947 and the incorporation of the transistor into the microprocessor in the early 1970s. The miniaturization of the microprocessor continues at a rapid pace; the number of transistors that can be placed on a 2 cm by 2 cm piece of silicon has been doubling every 18 months for a decade. Relatively inexpensive chips now contain millions of transistors.
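
The arithmetic of that doubling rate is worth spelling out; a doubling every 18 months compounds to roughly a hundredfold increase over ten years:

```python
# Doubling every 18 months over a decade: 10 / 1.5 ≈ 6.7 doublings,
# i.e. roughly a hundredfold growth in transistor count.
years = 10
doubling_period_years = 1.5
growth = 2 ** (years / doubling_period_years)
print(f"growth over {years} years: {growth:.0f}x")   # ~102x
```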

Computers have gone through several generations and will doubtless change further in the next decade as miniaturization proceeds unabated. The first generation of computing machines, the mainframes, were large, hot, expensive and costly to operate. Consequently, they were employed as highly centralized organizational tools. The second generation, the minicomputers, being less expensive, led to decentralized, departmental-level computing. Off to the side in this evolution were the supercomputers, capable of performing parallel operations on very large sets of large numbers in support of programmes in weapons design and climate analysis. The now ubiquitous microcomputers began to rival the capabilities of minicomputers in the 1980s and the mainframes in the 1990s.

Computers and communication

In an organizational context the desktop machines created a problem: How can a semblance of organizational coherence be maintained unless the PCs are somehow coordinated?

There are two answers. First, the machines were made to talk to each other through standardized communications protocols; they were networked. Second, in a partial reversion to the mainframe model, the PCs were linked to a centralized machine called a server that held common data upon which many depended.

Circuit switching and packet switching

In parallel with the development of ever smaller computers, the technology of communications developed rapidly, abetted again by digitization and miniaturization. The invention of packet switching initiated a grand paradigm shift in the technology of information. Previous forms of electrical communication were based on a circuit connection: one sender, one message, one link, one receiver. Two tin cans connected with a taut string are an apt (and still workable) analogue. Information can be conveyed over a circuit link by varying the electrical characteristics of the line, such as the voltage or the frequency. As early as 1870, the utility of a single circuit was expanded through multiplexing, the simultaneous transmission of several messages over the same pair of wires. Some of Thomas Edison's earliest and most profitable patents were devoted to multiplexed telegraphy. Still, the messages were delivered from one point to just one other point.

Switching provided the answer to the problem of pairwise connectivity by connecting a pair of wires from point A to point B or C or D.... As the demand for connections grew, manual switching by banks of operators was superseded by complex mechanical switches beginning in the late 1890s, and finally by computerized switches in the 1950s. Although switching systems matured to the point that hundreds of millions of connections a day could be set up and maintained by the telephone network, their essential characteristic remained a continuous full-time pathway between two points at any one time.

Packet switching changed the model. The communication, once put into digital form, was divided into small, well-defined groups of coded data, each with an identifying numerical address. Anything that can be digitized can be sent as a packet. To the Internet, a packet is a packet is a packet, whether it carries numbers, words, digitized sounds or digitized pictures. Now it became possible, and the U.S. Defense Department's Arpanet actualized the possibility, to send an unlimited number of packets with different addresses over the same circuit. Routers, rather than switches, became the key to delivering the packets to the intended destination. Controlled by software and microprocessors, the router inspects the address of a packet and sends it on its way on a full-time circuit to another router and on to an eventual end point. As the Arpanet evolved into the TCP/IP network that undergirds the Internet, the designers reserved 32 bits for the packet address (to be superseded by 128 bits in Internet Protocol version 6 [IPv6]), represented in decimal notation in the format xxx.xxx.xxx.xxx, where each group of x's can range from 0 to 255. The innovation of the Domain Name System (DNS) in 1984, prior to the creation of the Web in 1989-90, provided synonyms for the somewhat inscrutable digit strings of the actual address. The actual addresses of the packets remained the digit strings, but they could be replaced by more or less scrutable alphabetic equivalents stored in DNS server files, which permitted the look-up of the numerical address from the alphabetic name and vice versa. Thus was born the domain name, a new entity and a new property right in a new and legally ambiguous sphere.
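
Both conventions can be sketched in a few lines: packing a 32-bit address into dotted-decimal notation, and a DNS-style table pairing names with numbers. The numeric address and table entry below are invented for illustration, not real assignments:

```python
# Sketch of the 32-bit address format and a toy DNS-style lookup.
# A numeric address is four 8-bit groups, each 0-255, written in
# dotted decimal; DNS pairs such numbers with alphabetic names.

def to_dotted_decimal(addr: int) -> str:
    """Render a 32-bit address as xxx.xxx.xxx.xxx."""
    return ".".join(str((addr >> shift) & 0xFF) for shift in (24, 16, 8, 0))

# Toy name table; the real DNS is a distributed, hierarchical database.
name_table = {"un.org": 0x9D3C0812}        # hypothetical numeric address

print(to_dotted_decimal(name_table["un.org"]))   # 157.60.8.18
```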

The telephone system's addressing scheme started in the simplest possible fashion, with "Sally" asking an operator to connect her to "Harry". Under the direction of the Bell System affiliated companies, and by agreement with the non-Bell operating companies, the present United States ten-digit area code-exchange-line number system evolved over decades into a national standard. By contrast, the Internet addressing scheme was designed from the outset by engineers and scientists as a logical and comprehensive construct to meet their needs for low-cost data communication. The integrity of the scheme was guaranteed by its sponsorship by the US Department of Defense and by later private-sector successors.

The alphabetic names associated with numeric addresses were divided into domains, a limited typology of alphabetic addresses that enabled lookups to be done efficiently. To find the numeric twin of the Internet address "UN.ORG," for instance, a name server need not search through every entry in its address table, just the entries ending in ".ORG." These suffixes, the first level of searching and selecting, are known as top-level domains: the current set includes .COM, .MIL, .ORG and .NET, corresponding to net addresses for entities in commercial, military, non-profit and network administration endeavors.
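
A sketch of why the suffixes matter for efficiency: bucketing entries by top-level domain lets a lookup consult one bucket rather than scan the whole table (names and addresses invented for illustration):

```python
from typing import Optional

# Entries bucketed by top-level domain: a lookup for "un.org"
# touches only the ".org" bucket, never the ".com" or ".mil" entries.
table_by_tld = {
    "org": {"un.org": "157.60.8.18"},            # hypothetical addresses
    "com": {"example.com": "93.184.216.34"},
    "mil": {},
    "net": {},
}

def lookup(name: str) -> Optional[str]:
    tld = name.rsplit(".", 1)[-1]                # last label, e.g. "org"
    return table_by_tld.get(tld, {}).get(name)

print(lookup("un.org"))   # -> 157.60.8.18
```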

Internet addresses are conceptually very different from telephone numbers. In the U.S., Canada and the Caribbean, most area codes (technically NPAs, "Numbering Plan Areas") denote a geographic place with boundaries identifiable with governmental jurisdictions: nations, states, cities. The exchange part of the phone number is traditionally associated with a specific place with a street address, the central office, from which the wires emerge to connect the telephone user over the "last mile" to the network. The place-centered nature of the phone system's numbering plan is beginning to break down with the rise of wireless cellular systems and the widespread use of ghostly 800 and 888 numbers, which may be answered here one minute and there the next. Nonetheless, jurisdiction can be established in all cases. Internationally, the country code and city code numbering system links phone number to place and to jurisdiction.

Internet communications

Internet addresses have no fixed location. They are purely conceptual. There is no central office. The routers that direct packets toward their addresses, at rates between 100,000 and 500,000 packets a second, know only the next logical point in a routing table and which outbound circuit is available to carry the packet. Packets are free to traverse the globe on countless circuits to geographically indeterminate end points. The technology provides assurance that the packets are reassembled in the right order and are very likely not corrupted by data errors.
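
That hop-by-hop logic can be sketched as a next-hop table: the router knows only its neighbours, never a packet's full path (prefixes and neighbour names invented for illustration):

```python
# A toy next-hop table. Real routers perform longest-prefix matching
# over many routes; this sketch checks one illustrative prefix and
# otherwise falls back to the default route.
routing_table = {
    "157.60.0.0/16": "router-b",   # hypothetical prefix and neighbour
    "0.0.0.0/0": "router-c",       # default route for everything else
}

def next_hop(destination: str) -> str:
    if destination.startswith("157.60."):
        return routing_table["157.60.0.0/16"]
    return routing_table["0.0.0.0/0"]

print(next_hop("157.60.8.18"))     # handed to router-b; full path unknown here
```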

A further distinguishing characteristic of Internet addresses is that neither the sender nor the receiver of a packet is a paying customer for that packet. The telephone requires two paying customers to complete a call, each of whom is paying for the privilege and each of whom has, at a minimum, a billing address and usually a street address in a city, a state or province and a country. Internet senders and receivers are tied to place by neither the billing process nor the technology.

We have identified the technical underpinnings of novel realities that have led to major policy debates, debates which are far from resolved. From inside the Internet, names for addresses are structured but purely arbitrary, the technology is indifferent to content, and the sender/receiver dyad is unlocatable in the conventional sense.

International Policy Challenges: Intellectual Property; Names, Sounds, Pictures, Words, and Ideas

Names have value; legal ownership and the right of exclusivity for patents, trademarks, service marks and copyrights in sound, picture and literary content are well established in Western law. The treaties underlying the World Intellectual Property Organization (WIPO) have extended the principles, if not the practices, to a wide spectrum of countries. A substantial body of commercial law and practice guarantees that recourse is available to a party which believes its property rights have been infringed. A variety of adjudicative mechanisms are available to resolve disputes and provide redress at the national level.

The Internet poses a challenge due to its indeterminate ubiquity. Infringement becomes possible from any corner of the globe, that is, from any address on the Internet. If content can be digitized, it can be not only pirated but also disseminated globally with no impediment. No court, mediation board or arbitrator can be presumed to exist with authoritative jurisdiction even if the infringer can be definitively identified. The efforts of the United States to halt the active commerce in pirated compact disks and software in China are illustrative of the difficulty of maintaining ownership of intellectual content. Absent any overarching jurisdiction, diplomatic pressure had to be brought to bear in a bilateral context alongside numerous other foreign policy issues. The Internet raises the very real specter (for the owner of content) of a massive evaporation of assets.

The Internet is not an inherently broadcast medium, although it shares the ability of radio and television to reach many people simultaneously, through Web sites either sought deliberately on each occasion by the receiver (a "surfer") or "pushed" by a Web application such as PointCast. With the partial exceptions of shortwave radio and direct broadcast satellites, the content of radio and television (including cable TV) broadcasting since Marconi has been firmly under the thumb of governmental authority, under the rationales of orderly spectrum allocation (U.S.), revenue generation twinned with cultural uplift (U.K.) or outright thought control. The potentially universal accessibility of content via the Web upsets the traditional regulatory model. In its blithe way, the Web does not care what it carries: hate, love, pornography, fraud, lies, truth, scholarship and charlatanry are all the same in the stream of bits and all equally accessible. The efforts of the German government to shut down Nazi-leaning, anti-Semitic Web sites by penalizing the Internet service provider were a kind of desperate grasping for any available handle, since the real purveyors were too difficult to reach. The ease of establishing a Web site provides assurance that the extinction of one offender will not prevent recrudescence. On the horizon are technical developments which will make the Internet much more like a broadcast medium and even more subversive of government control: vastly increased circuit capacity through a technology known as wavelength division multiplexing, and a redesign of the underlying Internet protocols to permit simultaneous transmission of the large amounts of data required for images (one such effort is known as MBONE).

Problems resulting from the nature of the Internet

1. The creation and destruction of property rights. The ownership of domain names is just one example of a new form of an old issue that was satisfactorily, if not perfectly, settled. A broader question is how to retain ownership of content once it has been digitized and made available on the Internet. The widespread availability of inexpensive copying machines a decade ago created a culture of book and article copying, to the great distress of conventional publishers. Lawsuits and clarifications of copyright law resulted in a "fair use" doctrine which limits copying to personal use. "Fair use" on the Internet is much more difficult, if not impossible, to define. For one class of content creators, the issue is irrelevant: those for whom dissemination is more important than immediate financial gain. Scientists, scholars in the humanities and creative writers are increasingly bypassing paper-based media and skipping directly to Internet publication. The effect on the lucrative business of scholarly publishing will be devastating as publishers watch their feedstock evaporate.

2. Trade in services is an increasingly large portion of world trade. The Internet promises to expand this invisible trade exponentially and, from the standpoint of sovereign authorities, uncontrollably. Insurance is not the only thing that can be moved beyond existing regulatory measures. Relying upon their semi-sovereign status as well as the global nature of the Internet, the Coeur d'Alene tribe of Native Americans in Idaho has established an international lottery on the Internet, doubly beyond the control of the U.S. and Idaho State governments (New York Times, 8 March 1998, p. 24). The telecommunications services which have constituted a growing share of world trade in invisibles have been a well-regulated affair, in both governmental and commercial terms. Traffic is measured in minutes by the sending and receiving countries and companies, and compensation is paid under international agreements much as banks settle funds flows. Internet telephony upsets these arrangements by bypassing the telephone companies almost entirely. Once the quality-of-service deficiencies are addressed and the last-mile problem is solved, a very large piece of international trade will disappear from view.

3. The authenticity of communications becomes suspect on the Internet. Considerable effort is being devoted to security and encryption owing to the anonymity of the Internet and the indeterminacy of the packet technology. Law enforcement and national security interests have butted up against the anxiety engendered by the lack of clear knowledge of where an Internet address is located and who is there. Authenticity anxiety is felt most powerfully when money is exchanged, but it is also present in communications as mundane as scholarly texts.
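
One building block of those security efforts can be sketched with a message digest: the receiver recomputes the digest and compares it with the one the sender supplied, so any alteration in transit is detectable. The message text below is invented; note that a digest alone shows integrity, and real systems add signatures or shared keys to establish identity:

```python
import hashlib

# Integrity check via a message digest: tampering changes the hash.
# (A bare digest proves integrity only; authenticity of the sender
# requires a digital signature or a shared secret on top of this.)
message = b"Transfer 500 to account 12-345"     # invented example text
digest = hashlib.sha256(message).hexdigest()

tampered = b"Transfer 900 to account 12-345"    # altered in transit
assert hashlib.sha256(tampered).hexdigest() != digest
print(digest)
```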

4. Privacy of communication is a reasonable assumption in circuit-switched networks. Indeed, the almost total digitization of the telephone network and its reliance on extremely high-speed multiplexing techniques drove the U.S. Government to press successfully for the adoption of the Communications Assistance for Law Enforcement Act (CALEA), which obliges telephone companies and their suppliers to modify their equipment to support digital wiretaps. Privacy is not a reasonable assumption on the Internet.

5. The problem of preserving national, regional and local culture is exacerbated by the Internet. Before the Internet achieved its current prominence, the dominance of U.S. mass media products in the world market was a major bargaining issue during the Uruguay Round of trade negotiations, with France and Canada holding out for significant restrictions. Substantively unrelated agricultural negotiations led to compromises which apparently satisfied the negotiators (Gauntt, 1997). The Internet players are not so easily identified and mollified (or restricted) as the Disney Corporation and Rupert Murdoch. The relatively low cost of entry and exit means that backyard moguls can become significant originators of cultural content available everywhere. A clash with local mores is inevitable in open societies, and even more so in closed ones. The dominance of the English language on the Web is a thorn in the side of cultural preservationists outside the English-speaking countries. Efforts are under way to make the Web multilingual, but the most common ways of representing text digitally were designed for the Roman alphabet, especially as used in English.

6. Crime and terrorism take on new guises with the availability of inexpensive and widely available instantaneous global communication. As evidenced by the mostly benign (so far) attacks by self-styled "hackers," the network and the computers connected to it are themselves potential targets for criminal enterprises that normally fall under the rubrics of theft, larceny and property destruction. Internet aside, as the normal business of daily life, such as power distribution, financial systems and air traffic control, becomes pervasively intertwined with automated systems, the potential for criminal and terrorist attack becomes acute. The historical workings of commercial and international law have either not been brought to bear on these issues or are inherently incapable of dealing with the new problems brought about by the Internet.

Contending approaches to regulating the Internet

1. Self-regulating market. One version of this approach is Peter Huber's Law and Disorder in Cyberspace (New York, Oxford University Press, 1997), where he argues that the advance of technology is so rapid that no regulatory regime has had, or can have, any effect that is not detrimental. Order in the "telecosm" (Huber's neologism) should be maintained by "private actors and private litigants, common law courts and the market" (Huber, 1997). Against Huber, Stewart Baker has argued that judges are clumsy and retrograde makers of social policy (Wall Street Journal, 3 November 1997). More to the point, in the international arena the Anglo-Saxon common law tradition is non-existent and "the market" is a synonym for the global behemoths of Canada, the United States, the European Union and Japan.

2. A market guided by national authorities. Just as the Federal Communications Commission [of the United States of America] has regulated the telephone, broadcast and cable TV industries for more than six decades, so also could it and its foreign counterparts regulate the Internet. Two obstacles are likely to foreclose this option. First, the technology is mutating so quickly that even the FCC cannot keep up with the demands for new regulations in its traditional sphere. Second, the Internet lacks choke points with easily identifiable parties to regulate. A comparison of the efficacy of customs collection from large ocean vessels with that from Caribbean drug runners in small, fast boats which can land anywhere points up the difficulty of dealing with widely dispersed, mobile targets.

3. An international regime. The basis for an international umbrella to set policy and resolve disputes has been laid with the World Trade Organization, which is responsible for administering trade agreements such as the General Agreement on Tariffs and Trade (GATT), the General Agreement on Trade in Services (GATS) and the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS). The emerging services of the Internet and the dilemmas they pose have not yet been addressed by the WTO, but the model has been established. We argue that an international convention with representation of an unprecedented kind can set the framework for international regulation of the Internet. The precedent has been established with both the Rio de Janeiro Conference on Environment and Development and the Beijing Fourth World Conference on Women. Both conferences drew their representation and their strength from a broad range of interests that went well beyond nation-states.

Note: The views expressed are those of the seminar presenters and do not necessarily reflect the views of the United Nations Secretariat. The seminar sessions were organized and conducted by the following AIMS associates: Mr. Charles Kuhlman, seminar director and presenter on information structures and technologies (Director, Department of Telecommunications, New York University); Dr. John R. Mathiason, presenter on international information policy issues and trends (Adjunct Professor, Robert F. Wagner School of Public Service, New York University); Mr. Timothy O'Connor, presenter on Internet security (Security Manager, New York University); and Mr. Daniel O'Sullivan, presenter on new dimensions in multimedia and the Internet (Tisch School of the Arts, New York University).

