The Centripetal City: Telecommunications, the Internet, and the Shaping of the Modern Urban Environment



On July 19, 2001, a train shipping hydrochloric acid, computer paper, wood-pulp bales, and other items from North Carolina to New Jersey derails in a tunnel under downtown Baltimore. Later estimated to have reached 1,500 degrees, the ensuing fire is hot enough to make the boxcars glow. A toxic cloud forces the evacuation of several city blocks. By its second day, the blaze melts a pipe containing fiber-optic lines laid along the railroad right-of-way, disrupting telecommunications traffic on a critical New York-Miami axis. Cell phones in suburban Maryland fail. The New York-based Hearst Corporation loses its email and the ability to update its web pages. WorldCom, PSINet, and AboveNet report problems. Slowdowns are seen as far away as Atlanta, Seattle, and Los Angeles, and the American embassy in Lusaka, Zambia, loses all contact with Washington.

The explosive growth and diversification of telecommunications in the last three decades have transformed how we exchange information. With old divisions undone, email, telephone, video, sound, and computer data are reduced to their constituent bits and flow over the same networks. Both anarchistic hackers and new-economy boosters proclaim the Internet to be a new kind of space, an electronic parallel universe removed from the physical world. It is tempting, when our telecommunication systems function, to get caught up in the rhetoric of libertarians like George Gilder and Alvin Toffler, who praise cyberspace as a leveler of hierarchies and a natural poison to bureaucracies, or to listen to post-Communist radicals as they declare social, digital, and economic frameworks obsolete and profess their faith in Deleuzean "rhizomatic" networks: multidirectional, highly interconnected meshworks like those created by the roots of plants. It is easy, on a normal day, to believe that the Net exists only as an ether, devoid of corporeal substance. But this vision is at odds with the reality of July 19, 2001. When the physical world intrudes, we confront the fact that modern telecommunications systems are far from rhizomatic, acting instead as centralized products of a long historical evolution. The utopian vision of a network without hierarchies is an illusion, an attractive theory that has never been implemented except as ideology.

If telecommunications disperses individuals, it concentrates structures, reinforcing the fundamental simultaneity of the centripetal and centrifugal forces that drive capital and the modern city. This is nothing new: downtown has always been dependent on both suburbs and rural territories. The remarkable density of the nineteenth-century urban center could only develop when homes and factories were removed from the city core via the spatially dispersive technologies of the commuter railroad and the telephone, while industrial, urban capital required the railroad and steamship to facilitate exploitation of the American continent and connection to global markets. As the Baltimore train wreck demonstrates, this history of structural concentration in the telecommunications industry means that new infrastructures do not so much supersede old ones as ride on top of them, forming physical and organizational palimpsests: telephone lines follow railway lines, and over time these pathways have not been diffused but etched ever more deeply into the urban landscape.

Modern telecommunications emerged in the mid-nineteenth century. The optical telegraph, invented in the 1790s and based on semaphore signaling systems, had by the 1830s formed a network across Europe, allowing messages to be transmitted from Paris to Amsterdam and from Brest to Venice. In 1850, at its peak, the French system alone included at least 534 stations and covered some 3,000 miles. But optical telegraphs were hampered by bad weather, and the expense of manning the closely spaced stations largely limited them to serving as early-warning systems for military invasion. It took American artist Samuel F. B. Morse's invention of the electric telegraph in 1837 to make possible an economical system of telecommunications. Initially, Morse's simple system of dashes, dots, and silences was received with skepticism. Only with the opening of a line between New York and Philadelphia in 1846 did the telegraph take off. By 1850, the United States was home to 12,000 miles of telegraph operated by twenty different companies. By 1861, a transcontinental line was established, anticipating the first transcontinental railroad by eight years and shuttering the nineteen-month-old Pony Express with its ten-day coast-to-coast relay system. In 1866, a transatlantic cable was completed, and Europe's optical telegraph declined swiftly, leaving only a scattering of "telegraph hills" as traces on the landscape.

The electric telegraph's heyday, however, was also short: the invention of the telephone by Alexander Graham Bell in 1876 gave individuals access to a network previously limited to telegraph operators in their offices. By 1880, 30,000 phones were connected nationwide, and by the end of the century there were some 2 million phones worldwide, with one in every ten American homes. Still, the telegraph did not merely fade away: it retained its popularity among businessmen who preferred its written record, and it continued to dominate intercity traffic for decades to come.

At the turn of the century, telephone and telegraph teamed with the railroad to simultaneously densify and disperse the American urban landscape. By making possible requests for the rail delivery of goods over large distances, telecommunications stimulated a burst of sales and productivity nationwide. This produced a corollary growth in paperwork, which, in turn, demanded new infrastructures, both architectural and human. Vertical files and vertical office buildings proliferated as sites of storage and production, and a new, scientifically oriented manager class emerged. In 1860, the US census listed some 750,000 people in various professional service positions; by 1890, the number had ballooned to 2.1 million, and by 1910 it had doubled again, to 4.4 million. In 1919, Upton Sinclair dubbed these people "white collar" workers. Often trained as civil and mechanical engineers, they tracked the burgeoning commerce through numerical information. New machines aided them: the mid-1870s saw the development of the typewriter and, soon after, carbon paper; the cash register, invented in 1882 to prevent theft, could collect sales data by 1884. Modern adding machines and calculators emerged in the latter half of the 1880s and, in the 1890s, the mimeograph made possible the production of copies by the hundred.

The result transformed the city. Commuter rail allowed white-collar workers to live outside the downtown business districts and, as industry came to rely more and more on rail for shipment, production left the increasingly congested core for the periphery, where it was based in buildings that, for fireproofing purposes, were physically separated from each other. The telephone tied building to building and linked the rapidly spreading city to its hub. Understanding that the phone was reshaping the city, phone companies and municipalities worked closely together, the former relying for their network expansion on zoning plans legislated by the latter.

Eventually, Bell's company came to dominate the telephone system while Western Union controlled the telegraph. Initially, however, this relative equilibrium in the industry was far from certain. Between 1877 and 1879, Western Union had begun to diversify from telegraph services by producing telephones based on alternative designs by Thomas Edison and Elisha Gray. Bell filed a lawsuit claiming patent infringement, and an out-of-court settlement left him in possession of a national monopoly. Opportunities for competition arose again when Bell's patents expired in 1893 and 1894, and thousands of independent phone companies sprang up, serving rural hinterlands where Bell did not want to go. But, by refusing to connect his lines to these independents, Bell ensured that long-distance service, a luxury feature, was available to his subscribers only.

This desire to eliminate competitors was not universally appreciated in the Progressive era. In 1910, Bell's company, which had taken the name American Telephone and Telegraph in 1900, purchased the larger and better-known Western Union. This move stimulated anti-corporate sentiment, and the threat of governmental antitrust action loomed, a far from idle threat given the 1911 breakup of the Standard Oil Company. In late 1913, AT&T took pre-emptive measures in the form of a document called the Kingsbury Commitment. The giant agreed to sell off Western Union and to permit the independents access to its lines. Over the next decade, a partnership evolved between AT&T and the government, with an understanding that in exchange for near-monopoly status, the company would deliver universal access to the public by building a network in outlying areas. AT&T thus avoided antitrust legislation to emerge in total control not only of the long-distance lines but, through its twenty-two regional Bell operating companies, of virtually every significant urban area in the country. AT&T owned everything from the interstate infrastructure to the wiring and equipment in subscribers' homes.

This early period established a topology of communications that existed until the Bell System's breakup in 1984. Individual phones were connected to exchanges at the central office (to this day, one's distance from the central office determines the maximum speed of one's DSL connection), and these exchanges connected to a central switching station inevitably located in the city core, where the greatest density of telephones would be found. In 1911, the same year that Bell bought Western Union, General George Owen Squier, then head of the Army's Signal Corps and the future founder of the Muzak Corporation, developed a technology called multiplexing. By modulating the frequency of the signals so that they would not interfere with each other, multiplexing permitted transmission of multiple simultaneous messages over one cable. Multiplexed connections were used on long-distance lines beginning in 1918. After World War II, however, the high cost of the copper wire comprising the multiplexed network, coupled with rising demand for bandwidth and growing fear that nuclear war would wreak havoc on continuous wire connections, led engineers to develop microwave transmission for long distances. In the 1950s and 1960s, adopting the motto "Communications is the foundation of democracy," AT&T touted its microwave "Long Lines" network as a crucial defense in the Cold War. Then, in 1962, AT&T launched Telstar, the world's first commercial communications satellite, which the company hoped would allow it to provide 99.9% connection between any two points on the earth at any time, while further increasing communications survivability after atomic war. Ironically, Telstar operated for only six months instead of a planned two years, succumbing to radiation from Starfish Prime, a high-altitude nuclear test conducted by the United States the day before Telstar's launch.
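
The principle Squier exploited is easy to sketch in modern terms. The Python fragment below (with purely illustrative parameters, nothing like his actual equipment) puts two messages on one wire by shifting each onto its own carrier frequency; a frequency analysis of the shared line shows the two channels occupying separate, non-interfering bands.

    import numpy as np

    # Illustrative sketch of frequency-division multiplexing; all
    # frequencies are invented for the example.
    rate = 8000                              # samples per second
    t = np.arange(0, 1, 1 / rate)            # one second of signal

    msg_a = np.cos(2 * np.pi * 5 * t)        # slow "message" A
    msg_b = np.cos(2 * np.pi * 7 * t)        # slow "message" B

    cable = (msg_a * np.cos(2 * np.pi * 500 * t) +    # channel A near 500 Hz
             msg_b * np.cos(2 * np.pi * 1500 * t))    # channel B near 1500 Hz

    # The spectrum of the shared cable shows two distinct bands, which is
    # why the two messages do not interfere with each other.
    spectrum = np.abs(np.fft.rfft(cable))
    freqs = np.fft.rfftfreq(len(cable), 1 / rate)
    print(freqs[spectrum > spectrum.max() / 2])   # peaks near 500 and 1500 Hz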

Even with the hardening of key buildings against atomic attack and the development of the microwave transmission system, the vulnerability of satellites to enemy destruction remained an open question. American computer scientist Paul Baran, a researcher at the RAND Corporation, a Cold War think tank, felt that continued use of the centralized model of communications left the country vulnerable to extreme disruption during a nuclear first strike. With the loss of the city center and the destruction of the central switching station, Baran realized, all intercity communications would be destroyed.

The popular idea of the Internet as a centerless, distributed system stems from Baran's eleven-volume proposal for a military network that could survive a nuclear first strike and maintain the centralized, top-down chain of command. Such a system was essential, Baran felt, so that the other alternative, giving individual field commanders authority over nuclear weapons, would not be necessary.

Baran proposed a new military network for telephone and data communications to be located entirely outside of strategic targets such as city cores. He identified three forms of networks: centralized, decentralized, and distributed. In the centralized network, with the loss of the center, all communications cease. Decentralized networks, with many nodes, are slightly better, but are still vulnerable to MIRV (multiple independently targetable reentry vehicle) warheads. Baran's network would be distributed and hard to kill: each point would function as a node, and central functions would be dispersed equally.
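
The difference between these topologies is easy to see in a toy model. The sketch below (a simplification, not Baran's actual analysis) destroys the hub of a five-node star and the same node in a five-node mesh, then counts how many machines one survivor can still reach.

    from collections import deque

    # Toy topologies: a centralized star and a distributed mesh.
    star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
    mesh = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}

    def reachable(graph, start, dead):
        """Count nodes reachable from `start` once the `dead` nodes are destroyed."""
        seen, queue = {start}, deque([start])
        while queue:
            for nbr in graph[queue.popleft()]:
                if nbr not in dead and nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        return len(seen)

    # Destroy node 0, the "central switching station," in each topology.
    print(reachable(star, 1, dead={0}))   # 1: every surviving node is isolated
    print(reachable(mesh, 1, dead={0}))   # 4: the mesh routes around the loss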

Designed not for present efficiency but for future survivability even after heavy damage during nuclear war, Baran's system broke messages down into discrete "packets" and routed them along redundant paths to their destinations. Errors were not avoided but expected. This system had the advantage of allowing individual sections of messages to be rerouted or even retransmitted when necessary and, as computers tend to communicate with each other in short bursts, would also take advantage of slowdowns and gaps in communication to optimize load on the lines. Baran's model, however, was never realized; his proposal fell victim to a military bureaucracy unable to see its virtues.
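
In outline, the scheme looks something like the sketch below (the field names and loss model are invented for illustration): a message is cut into numbered packets, the network delivers them out of order and drops some, the missing ones are retransmitted, and the receiver reassembles the message by sequence number.

    import random

    def to_packets(message, size=8):
        """Cut a message into (sequence number, chunk) packets."""
        return list(enumerate(message[i:i + size]
                              for i in range(0, len(message), size)))

    def lossy_network(packets):
        """Deliver packets out of order, dropping roughly one in five."""
        survivors = [p for p in packets if random.random() > 0.2]
        random.shuffle(survivors)            # packets take different routes
        return survivors

    message = "ALL INTERCITY LINES OPERATIONAL"
    packets = to_packets(message)
    received = dict(lossy_network(packets))

    # Errors are expected, not avoided: retransmit whatever went missing.
    while len(received) < len(packets):
        missing = [p for p in packets if p[0] not in received]
        received.update(lossy_network(missing))

    print("".join(received[seq] for seq in sorted(received)))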

Instead, the Internet as we know it is the outgrowth of ARPANET, another military project that produced the first successful intercity data network. Established in 1958 to ensure U.S. scientific superiority after the launch of Sputnik, the Department of Defense's Advanced Research Projects Agency was implanted in universities throughout the country. ARPANET was designed to build community and overcome isolation between these geographically separated offices without undoing the wider range of possibilities created by diversity in location. Initially, the focus was on data-sharing and load-sharing. (The latter was facilitated by the range of continental time zones: as one technician slept, a colleague in another time zone would take advantage of otherwise idle equipment.) Few experts thought that communication could become a significant use of the data network, and when email was introduced, in 1972, it was only as a means of coordinating seemingly more important tasks. ARPANET's internal structure was a hybrid between distributed and decentralized, but because it leased telephone lines from AT&T, its real, physical structure could not overcome the dominance of metropolitan centralization.

With computer networks like ARPANET proliferating, researchers developed an internetworking system to pass information back and forth between them. First tested in 1977 and dubbed the Internet, this single system is the foundation for the global telecommunications system we know today. During the 1980s, the Internet opened to non-military sites through the National Science Foundation's NSFNet, a nationwide network that connected supercomputing sites at major universities through a high-capacity national "backbone." Through the backbone, the college towns that hosted these supercomputing centers, such as Ithaca, New York, and Urbana-Champaign, Illinois, became as wired as any big city. To counter the dominance of these elite computing schools, the NSFNet made these universities, and eventually other non-profit institutions as well, act as centers of regional networks. Again, Baran's distributed network was rejected, replaced by a decentralized model in which, even if there is no national center for the Internet, local topologies are centralized in main command-and-control hubs.

The NSFNet grew in leaps and bounds while the ARPANET swiftly became obsolete, its old dedicated lines running at 56 kilobits per second, as fast as today's modems. In 1988 and 1989, the ARPANET was transferred entirely to the NSFNet, ending the days of military control over the Internet. As a government-run entity, the backbone was still restricted from carrying commercial traffic. In 1991, however, new service providers teamed up to form a Commercial Internet Exchange for carrying traffic over privately owned long-haul networks. With network traffic and technology continuing to grow (the NSFNet backbone in 1991 was a T1 line, which at 1.5 Mbps is the basic connection used by small businesses today and about as fast as a DSL modem or cable broadband connection), in 1995 the government ended the operation of the NSFNet backbone and signed the operation of the Internet over to the private realm.

Amid the exponential growth of the Internet that followed privatization, the tendency of the Internet to be strongly centralized on the local or regional level continues. Driven by profit, the commercial Internet followed the money: Internet corporations turned to metropolitan areas, thereby reinforcing the existing system of networking. The Internet and the telephone system are inextricably tied together today: not only do analog modems and DSL connections run over telephone lines, but faster T1 and T3 lines are simply dedicated phone lines as well. To understand the way today's Internet is built, then, we need to turn back to the telephone system.

AT&T's breakup into the Baby Bells in 1984, together with subsequent legislation further deregulating the industry, triggered competition at every level but did not fundamentally change the centralization of telephone service within cities. As before, the key for long-distance carriers was the interface with the local system at the central office. But the central office was now controlled by whichever Baby Bell provided regional service. By 1990, fiber optics had surpassed satellite technology as a means of intercontinental communication and had even begun to challenge the dominance of microwave towers, forcing AT&T's vast Cold War project into obsolescence and leading the company to auction off the "Long Lines" system to cell-phone carriers seeking sites for towers. Fiber, however, is expensive, and again the paradigm of physical centralization comes to the fore. Cost-effectiveness dictates following existing infrastructural routes and right-of-way easements rather than creating new pathways: hence the fiber-optic cable in the train tunnel in Baltimore.

Within cities, lines concentrate in carrier "hotels," otherwise known as telco or telecom hotels. The history of the carrier hotel at the One Wilshire tower in Los Angeles illustrates the current system. In Los Angeles, the central switching station, now owned by SBC, is at 400 S. Grand, downtown. Although competing carriers are, by law, allowed access to the lines at the central switching station, SBC does not have to provide them with space for their equipment. Over a decade ago, in order to house its competing long-distance lines in close proximity to the 400 S. Grand station, MCI, which had its own nationwide microwave network, mounted a rooftop microwave station on One Wilshire, which is only three thousand feet from the central switching station and was at the time one of the tallest buildings downtown. With One Wilshire providing a competitor-friendly environment, long-distance carriers, ISPs, and other networking companies began to lay fiber to the structure. While the microwave towers on top have dwindled in importance (they are now used by Verizon for connection to its cell-phone network), the vast amount of underground fiber running out of One Wilshire gives companies many possibilities to interconnect. These attractions allow One Wilshire's management to charge the highest per-square-foot rents on the North American continent.

Such centralization defies predictions that the Internet and new technologies will undo cities or initiate a new era of dispersion. The historical role that telecommunications has played in shaping the American city demonstrates that, although new technologies have made possible the increasing sprawl of the city since the late nineteenth century, they have also concentrated urban density. Today, low- and medium-bandwidth connections allow employees to live and work far from their offices and for offices to disperse into cheaper land on the periphery. At the same time, however, telecommunications technology and strategic resources continue to concentrate in urban cores that increasingly take the form of megacities, which act as command points in the world economy. In these sites, uneven development will be the rule, as the invisible city below determines construction above. In telecom terms, a fiber-bereft desert can easily lie just a mile from One Wilshire.

Moreover, the Internet's failure to adopt Paul Baran's model of the truly dispersed system means that it remains vulnerable to events like the Baltimore tunnel fire. If Al Qaeda had targeted telecom hubs in New York at 60 Hudson Street or at the AT&T Long Lines Building at 33 Thomas Street, or had taken down One Wilshire, the toll in life would have been far smaller: carrier hotels have few occupants. But the lasting economic effect, both locally and globally, might have been worse. Losing a major carrier hotel or a central switching station could result in the loss of all copper-wire and most cellular telephone service in a city, as well as the loss of 911 emergency services, Internet access, and most corporate networks. Given that many carrier hotels on the coasts are also key nodes in intercontinental telephone and data traffic, losing these structures could disrupt communications worldwide.

The Net may appear to live up to Arthur C. Clarke’s idea of a technology so advanced that it is indistinguishable from magic. But whenever we see magic we should be on guard, for there is always a precarious reality undergirding the illusion.

http://varnelis.net/articles/centripetal_city


                                      INTERNET

Invention is rarely the isolated product of a lone scientist or engineer. Instead, every significant technology in the modern world is the product of a long history of numerous people and events. One of our most modern inventions, the Internet, is itself the result of decades of work and innovation by thousands of people who may never have dreamed of the possibility or potential of a global network.

One interesting and influential ancestor in the history of the Internet is radar. Hundreds of the best scientists and engineers in Britain and the U.S. worked during World War Two to develop radar systems to help defeat the Axis powers. Electronics technology was pushed to new heights to make the signals stronger, and early computing machines were developed to process the complex radar signals.

In the '50s and '60s, the Cold War spurred further research in radar and computers. The U.S. government feared that the Russians were pushing ahead of the West in science and technology, and the launch of the Russian Sputnik symbolized a new era and a new frontier. A call to arms swelled university campuses with budding scientists and engineers ready to burn their slide rules and retake the lead in technology from the Communists.

The U.S. government poured hundreds of millions of dollars into research, and it was a golden age for R&D around the country. New federal agencies, such as NASA and ARPA, were created to distribute the research money. MIT especially benefited from the Cold War, building new labs and hiring the best nerds in the country to work in them. High-tech companies sprang up around MIT, staffed with MIT graduates. One of these companies, Bolt, Beranek and Newman (BBN), would take the lead in developing the ARPAnet, the forerunner of the Internet.

The ARPAnet began as a government program thought up in the halls of the Pentagon. BBN was paid to build the connecting hardware and software, and several universities funded by ARPA were chosen to test the network. In 1969, only four computers were connected to the ARPAnet, but it grew, and advances in computer technology made it faster and easier to use. Better networking protocols and applications were developed, especially email, and more people became convinced that it was going to be a success.

At the beginning of 1989, over 80,000 host computers were connected to what was now called the Internet. That same year, after some solemn thought, the aging ARPAnet was turned off, signaling a transfer of the Internet from the hands of the Nerds to the Suits.

 

                        1. On the Radar Scope

One of the ironies of history is that war often lifts innovation to a higher level, and many beneficial inventions have roots in warfare. During World War Two, scientists and engineers on both sides of the battle lines advanced technology at a tremendous rate. In particular, the development of radar during and after the war was a catalyst for some of the technologies later incorporated into the Internet. One of the pioneers in radar, Vannevar Bush, was also the originator of an idea that would later evolve into the World Wide Web.

When the U.S. officially entered the war, Bush served as the top advisor to President Roosevelt on matters of wartime technology. He managed the government's research programs, including the Manhattan Project. Even with this great responsibility, he also found time to keep up his own research, including a machine intended to change the way people store and retrieve books, records, and notes. He called this machine a memex, because its purpose was to augment human memory.

Bush theorized that people don't think well in the linear structures of alphabetic or numeric indexes, but instead in associative connections. Therefore, the memex would index everything with associative links, so that pieces of information could be retrieved by following paths of logical connections. He described the memex as a desk and camera that could record anything a user wrote and then link it to other pieces of information indexed in its storage space.
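
In today's terms, Bush's associative trails amount to a graph of records joined by links, with retrieval done by walking the graph. A deliberately anachronistic Python sketch (the records are borrowed loosely from the bow-and-arrow research trail Bush himself described):

    # Memex-style associative trails as a simple linked graph of records.
    links = {
        "turkish bow": ["properties of arrows", "the crusades"],
        "properties of arrows": ["elasticity of materials"],
        "the crusades": ["siege warfare"],
        "elasticity of materials": [],
        "siege warfare": [],
    }

    def trail(start, depth=2):
        """Follow associative links outward from one record."""
        found, frontier = [start], [start]
        for _ in range(depth):
            frontier = [linked for record in frontier for linked in links[record]]
            found.extend(frontier)
        return found

    print(trail("turkish bow"))
    # ['turkish bow', 'properties of arrows', 'the crusades',
    #  'elasticity of materials', 'siege warfare']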

No memex machine was ever built, but Bush described his idea in an article for the Atlantic Monthly titled "As We May Think," published near the end of the war in 1945. It was a revolutionary idea, but few people could grasp its potential impact. In the Philippines, a young Navy radar technician named Doug Engelbart picked up a copy of the Atlantic Monthly at the Red Cross and became an early advocate of Bush's idea.

                        2. The Cold War Heats Up

In 1957, the Russians launched the first artificial satellite, Sputnik. The United States was near hysterics thinking of that little metal ball orbiting the globe overhead. The U.S. didn't want the Russians to own outer space without a fight, so the old soldier, President Eisenhower, called up some new troops, the nation's scientists and engineers, to do battle in the Cold War.

Beyond losing the space race, the biggest fear in the 1950s was the threat of nuclear war. The atomic bombs dropped on Japan had demonstrated the incredibly destructive power of nuclear weapons, and now both sides had the bomb. While some research centers worked on making weapons even more destructive, other researchers studied how to survive an atomic war. Protecting the nation's modes of communication was considered one of the most critical priorities.

Scientists at the RAND Corporation, a think tank devoted to national defense, studied several possibilities. One of their scientists, Paul Baran, theorized that a decentralized network with several possible routes between any two points could keep the channels of communication open. If a few of the routes in the network were destroyed in a nuclear attack, messages would simply be rerouted. To do this, though, he realized that messages would need to be split up and sent as separate blocks: each block could then travel along any route connecting the source to the destination, and at least part of the message would make it through.

In 1965, Baran found funding from the Air Force, but the project was plagued with bureaucratic problems. Afraid that the project was doomed to fail because of the people put in charge of it, and that a visible failure would ruin any future prospects for the idea, Baran withdrew his request. He gave up on the idea, not knowing that other engineers were already working on the same problem.

Earlier, in 1961, Leonard Kleinrock had written his Ph.D. thesis at MIT on a similar block-switching idea. Across the Atlantic in Britain, Donald Watts Davies was working on a block-switching scheme for the British National Physical Laboratory (NPL). Davies, however, had a different name for it: he called the blocks "packets."

                                3. Acceleration

Eisenhower allotted over a billion dollars for U.S. research and development centers, including the new Advanced Research Projects Agency (ARPA), located in the Pentagon. The success of the Manhattan Project also attracted money toward research in particle physics, including projects like the new Stanford Linear Accelerator.

Accelerators were big machines a country could brag about, and the Cold War foes liked to show them off. Not wanting to fall behind the superpowers, several European countries collaborated and built the biggest of them all at the new Conseil Européen pour la Recherche Nucléaire, or CERN, the same research center where the World Wide Web would be invented decades later.

All this building was wonderful for economies devastated by the war, but the scars of international conflict were still fresh. The United Nations was created as a grand experiment in preventing another global war. New York was picked to host the United Nations, and construction began on a UN Building for the hundreds of UN ambassadors and their staffs.

Three acoustic engineers, Bolt, Beranek, and Newman, formed a partnership to work as consultants on the new UN Building. Their new company, BBN, was headquartered in Cambridge, Massachusetts near the engineers' alma mater, MIT. From there they recruited students from MIT and Harvard. BBN was soon known as the "third university in Cambridge."

                                   4. Whirlwind

The Cold War was very profitable for private and university research centers. In particular, MIT was a hotbed for research funded by the military. In 1951, MIT opened Lincoln Lab, devoted to developing technology for air defense, especially radar systems.

One of the biggest radar projects at MIT was Whirlwind, an early application of computers to coordinate and monitor a collection of radars watching for Russian bombers flying over the North Pole. Several staff and graduate students (including Frank Heart) worked on a computer system that would alert a central monitoring station when something showed up on the radar. It was one of the first uses of computer networking, and it pushed the technology to new levels.

                                   5. Lick

In the 1950s, the new field of computer science was introduced on university campuses around the country. MIT was one of the early leaders in the field, and researchers at MIT's Lincoln Lab had access to several mainframe computers. Lincoln Lab also built its own computers, including the TX-0 (one of the first transistorized computers, hence the "T") and the TX-2. Both of the TX computers were built by two MIT engineers, Ken Olsen (the future founder of Digital Equipment Corporation) and Wesley Clark.

An MIT psychoacoustician named J.C.R. Licklider took an immediate and intense interest in computers after Clark demonstrated the TX-0 to him. Licklider, called "Lick" by his friends and fellow researchers, applied his background in psychology to research how people interacted with computers, and he became known as an expert in human-computer interaction. People at ARPA took notice and offered Licklider the job of director of their new Information Processing Techniques Office (IPTO). He accepted the position as the founding director and continued his research in human-computer interaction.

In 1960, Licklider wrote a landmark article titled "Man-Computer Symbiosis." In it, Licklider looked beyond the conventional idea that computers were mere calculators. He saw a relationship with computers in which people "will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluation." The computer would take care of the tedious work and allow us to do the important stuff.

                        6. Intergalactic Network

In 1962, Jack Ruina, the director of ARPA, offered Licklider the chance to start ARPA's new behavioral sciences division and, because Licklider was interested in computer science, the new Information Processing Techniques Office as well. It was an opportunity for Licklider to blend his background in psychology with his interest in computers.

At that time, using one of the world's few computers meant either submitting a "batch job" and waiting up to a day for the results, or logging into a terminal connected to a time-sharing computer. People on time-sharing terminals shared the same computer, but it switched among them quickly enough to make it appear that each had the computer's complete attention.

Licklider observed how the researchers and students at Lincoln Lab communicated with each other on time-sharing computers. He theorized that computers augmented human thinking by increasing people's ability to communicate. If the whole world, he proposed, could connect through an "Intergalactic Network," people could share ideas and collaborate as an integrated unit. However, he had no idea how to create this global network.

                                   7. The First Link

As director of IPTO, Licklider could fund others to find out how to create an Intergalactic Network. One project he funded, in 1963, was the Augmentation Research Center, headed by Doug Engelbart at the Stanford Research Institute.

After World War Two, Engelbart had returned to school and completed his bachelor's degree in electrical engineering at Oregon State University. He worked for NACA (later renamed NASA) at the Ames Laboratory for three years, but his fascination with computers grew, and he went back to graduate school to study the new field of computer science.

After earning a Ph.D., he went to work at the Stanford Research Institute designing and building computer components. Bush's article still influenced Engelbart's work, and Engelbart had some ideas on how to build an equivalent machine using computers. After a few years at SRI, he had enough experience and reputation to attract funding for his own laboratory. Besides Licklider, Engelbart also found funding from a young project engineer at NASA, Robert Taylor.

At the Augmentation Research Center, Engelbart was developing a new way of computing, and he had to invent new tools to make it work. Many of his inventions were ahead of their time, including the mouse pointing device and windows on a computer display. He also created an early hypertext system called NLS, but it wasn't widely used. The world wasn't ready for hypertext.

                                           8. Xanadu

In 1965, the Association for Computing Machinery (ACM) hosted its 20th annual conference. One of the speakers at the event was 28-year-old Theodore Nelson, giving a presentation titled "A File Structure for the Complex, the Changing, and the Indeterminate." This was the first time he described his interconnected "docuverse" to the scientific community, and his audience was among the first to hear the word "hypertext."

While a master's student studying sociology at Harvard, Nelson took a computer science course and discovered an exciting new world. He imagined innovative applications for the computer, including word processors and an interconnected, nonsequential, dynamic collection of documents and multimedia. Nelson's "docuverse" was similar to the future World Wide Web, but on a grander scale. Hyperlinks pulled portions of documents and multimedia components across the network, and copyrights were managed to protect the intellectual property of contributors.

It was a revolutionary idea, and it was given the fantastic name "Xanadu." However, it was never realized. Although several believers poured millions of dollars into the project, including John Walker (the founder of Autodesk), Ted Nelson never produced a complete working model of Xanadu.

In the late 1960s, Nelson continued to work on his ideas and collaborated with Andries van Dam at Brown University to design and build a system they named, descriptively, the Hypertext Editing System, or HES. IBM paid for the project, and the system was programmed on an IBM mainframe and graphic display. When it was finished, IBM sold the system to NASA to produce documentation at the Manned Spacecraft Center in Houston.

                              9. Twenty Minute Pitch

Early in the space race, NASA was paying for a lot of research, and it employed thousands at its growing research centers. Robert Taylor, a young scientist who had studied psychoacoustics and mathematics at the University of Texas, worked for NASA as a research administrator in the early 1960s. In 1965, after a few years at NASA, he was hired by Ivan Sutherland, the second director of IPTO, to work at ARPA. Only one year later, Taylor succeeded Sutherland as director and managed all the computer projects funded by ARPA.

From the terminal room next to his Pentagon office, Bob Taylor had a direct connection to several of the ARPA-funded computers around the country. Each terminal was connected to a single computer and Taylor needed to use a different login sequence and different commands on each mainframe. In 1966, it was the leading edge of computer networking, but Taylor was tired of changing seats and instructions every time he needed to communicate with another computer.

Taylor worked up an idea, walked to the office of his boss, Charles Herzfeld, and gave him the pitch. He explained the problem and described a rough solution for networking different computers together. Herzfeld liked the idea and told Taylor he had one million dollars to make it work. When Taylor looked at his watch, he noted that it had taken only twenty minutes to get the project funded.

One of the sayings at ARPA was "why don't we rely on the computer industry to do that?" rather than have the government do it. So Bob Taylor started writing a Request for Proposals titled "Cooperative Network of Time-Sharing Computers." He described the general idea, but he needed some help figuring out exactly what they would ask contractors to do. The best person he knew for the job was Larry Roberts, who was at MIT's Lincoln Lab networking computers like the TX-2. Roberts had just built and tested the first transcontinental network connection between two computers, so he had as much experience as anyone in long-distance networking.

At first, Roberts had no interest in leaving MIT, but Taylor wouldn't take no for an answer. Since ARPA funded over half of the research at Lincoln Lab, Taylor had some clout there. After over a year of asking, Charles Herzfeld called Roberts's boss and strongly suggested that he help Roberts decide to take the job. The director of Lincoln Lab called Roberts into his office and suggested that the position at ARPA might be a good career choice at that time. Roberts moved to ARPA in 1966 and began drafting the Request for Proposals that ARPA would send out to potential contractors.

At the next annual conference of ARPA-funded university projects, Roberts organized a meeting to talk about the project. Two important parts of the network were decided: the traffic between computers would be broken up into blocks (a packet-switched network), and a separate computer would act as a gateway to the network for each node. This computer, named an Interface Message Processor (IMP), would be connected to the network and to a mainframe at the site. All the nodes would have nearly identical IMPs, creating a standard interface between each site and the network.
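
The arrangement looked something like the sketch below (a rough modern illustration with invented names, not BBN's design): a host hands traffic to its own IMP, and the identical IMPs take care of getting it across the network.

    # A rough sketch of the IMP idea; all names are invented for illustration.
    class IMP:
        network = {}                          # every IMP on the net, by site

        def __init__(self, site):
            self.site = site
            IMP.network[site] = self          # plugging into the network

        def send(self, dest_site, packet):
            # The host only talks to its own IMP; the IMP worries about
            # reaching the far site across the network.
            IMP.network[dest_site].deliver(packet)

        def deliver(self, packet):
            print(f"host at {self.site} received: {packet}")

    ucla, sri = IMP("UCLA"), IMP("SRI")
    ucla.send("SRI", "LOGIN")                 # prints: host at SRI received: LOGIN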

                             10. Request for Proposals

Larry Roberts finished writing the Request for Proposals and sent it to 140 potential contractors in the summer of 1968. After a few months, about a dozen proposals came back to ARPA, including BBN's. Two of the largest computer companies, IBM and Control Data Corporation (CDC), declined to bid, confident that packet-switching was an unworthy endeavor.

Roberts cut the field down to four finalists, including BBN and Raytheon. Raytheon looked like the early leader in the competition for the contract. They had more resources than BBN, and they claimed they could make the network faster than the proposal required. However, BBN made the same claim and supplied the details on how they were going to do it. BBN had spent several months and over $100,000 writing the proposal, and Roberts felt BBN's was the better plan.

BBN received word that it had won the contract, and the company was congratulated via telegram by Massachusetts Senator Edward Kennedy for winning the contract to build an "Interfaith Message Processor."

On the first day of 1969, Frank Heart collected his team and started working on designing and programming the IMP. They chose the Honeywell DDP-516 as the computer they would modify into the IMP. The DDP-516 was one of the most powerful minicomputers on the market, and Heart liked it because it was built to military specifications, with a reinforced body and eye bolts on top.

The team's primary members included programmers Will Crowther and Dave Walden, BBN's star debugger Bernie Cosell, and Severo Ornstein, the geologist turned computer hardware specialist. Bob Kahn volunteered to write a specification to send to the participating centers detailing how to connect their computers to the IMP. They had about eight months to deliver the first IMP to UCLA on Labor Day.

Crowther and Walden spent several months writing code to send and receive packets over the network. They had to write the code in assembly language (one small step above the machine's own instructions), and everything had to fit into the Honeywell's 12K of memory. To make the process more efficient, they wrote an assembler on BBN's PDP computer and transferred each assembled version via paper tape to a prototype Honeywell for testing.

When the first modified IMP (IMP-0) was delivered to BBN, Ben Barker, the young engineer assigned to testing it, discovered that the modifications were all wrong. In the 1960s, changing a computer's configuration meant unwrapping and wrapping hundreds of tight bundles of wires, and Barker spent a few months of long days reconfiguring the Honeywell.

IMP-1 was delivered to BBN just two weeks before Labor Day. BBN had sent Honeywell instructions covering the changes Barker made to the first IMP, and they expected everything to work this time. However, when Barker powered the Honeywell up, nothing worked. He opened it up and discovered the same configuration the first IMP had started with. This time, though, he had detailed instructions on what to do, and he started working right away. He finished in just a few days, but then he found a new problem.

When they tested the IMP it would consistently work for a while and then crash for no apparent reason. Most of the time it would go a day or two between crashes, but the IMP was supposed to work all the time or the ARPAnet wouldn't be practical. After a few days, Barker was convinced it was a synchronizer problem, an occasional mistiming in the CPU. It was one of the worst problems for a computer, and one of the hardest to fix. Heart had already arranged to have it shipped, so Barker and Ornstein raced against the clock to fix the problem.

                             11. Did You Get the "L"?

When planning began for the IMPs, four research centers were chosen as the initial test sites. The decision on which site received an IMP was based on the specialties of each research center. Len Kleinrock, at UCLA, was one of the leading experts on packet-switching networks, so he would receive the first IMP and test the network as it was built and used. The second IMP would go to the Stanford Research Institute (SRI), where Doug Engelbart would manage the Network Information Center (NIC), providing a network home for ARPAnet documentation. Sutherland, the second director of IPTO, was researching computer graphics at the University of Utah, so the third IMP would go there. The fourth IMP would go to the University of California at Santa Barbara, where research was conducted on interactive computer graphics.

Len Kleinrock's graduate students had found out about the problems BBN was having with the IMP, so they guessed that BBN would need to set the date back, giving them more time to finish programming the software interface. However, on August 29th, just before Labor Day, the IMP was delivered to the UCLA shipping dock as planned. Steve Crocker, the graduate student responsible for the host-to-IMP software, had heard the news two days earlier and was a little surprised. He spent all night finishing the interface for the Sigma 7 mainframe.

On Labor Day, the IMP was carted up to Kleinrock's lab and connected to a power source by a BBN engineer. When it was powered up, it started working where it had left off back in Cambridge. Unlike the temporary memory used in today's computers, the IMP used core memory that didn't forget anything when it was powered off. When they connected the Sigma 7 to the IMP, the mainframe and the IMP communicated with each other just as planned.

                                        12. Logging In

A month later, the second IMP was installed at the Stanford Research Institute (SRI). SRI had an SDS-940 mainframe connected to the IMP, so a different interface was written by the team there. When it was working, they were ready to test the first connection on the ARPAnet, so they got on the phone with UCLA and coordinated the login.

"Did you get the L?" Charlie Klein, an undergraduate at UCLA, asked. "Yes," came the answer from Stanford. "Did you get the O?" asked UCLA. "Yes," answered Stanford. When Klein typed 'G' another first occurred - the network crashed.

                                        13. @ Work

After the first IMPs were installed, including one at BBN, everyone was experimenting with doing things on the ARPAnet. Kleinrock and his students kept measuring its usage and occasionally pushed it to the breaking point to check for weaknesses. Researchers and students (including Bob Metcalfe) found several creative uses for the network, but Kleinrock's measurements indicated that the traffic didn't come close to filling the network's capacity. Something considered a frill would change that.

Although initially looked upon as something improper for the ARPAnet, electronic mail (email) was the first big hit on the network. People could already send messages to other people on a single time-sharing computer, but now the researchers wanted to send messages between computers on the ARPAnet. Ray Tomlinson, an engineer at BBN, wrote the first email reader and writer for the ARPAnet. He kept his program a secret and just sent messages to himself to test it. However, his secret soon got out, and email was quickly passing across the ARPAnet. Kleinrock reported that email took up more of the network's capacity than any other application.

When writing his email application, one of the design problems Tomlinson had to tackle was how to address users on different computers. On a single computer, people could just send a message to a particular user name. But on the ARPAnet, there had to be a way to signify which computer hosted the account. Tomlinson decided to combine the account name with the computer name into a single address.

He looked at his keyboard for a single character to delimit the two names. One particular character made a lot of sense to him, because it wasn't used in proper names and the symbol meant "at." The character was the "@" symbol. Unfortunately, his choice of the most important character in Internet email addresses netted him zero profit.
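
Tomlinson's convention survives unchanged: splitting an address on its final "@" still separates the user from the host, as in this small sketch (the sample address is illustrative):

    def parse_address(address):
        """Split user@host on the final '@', per Tomlinson's convention."""
        user, sep, host = address.rpartition("@")
        if not sep or not user or not host:
            raise ValueError(f"not a valid address: {address!r}")
        return user, host

    print(parse_address("tomlinson@bbn-tenexa"))   # ('tomlinson', 'bbn-tenexa')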

                             14. The Big Demo

After several more universities linked their computers to the ARPAnet, Larry Roberts, now director of IPTO, wanted to demonstrate the abilities of the network to other research centers funded by ARPA and to other people in the young field of computer networking. He assigned Bob Kahn at BBN to organize the event and arrange demonstrations showing off the potential of the network. Kahn spent the year getting people and equipment ready for the event, held at the Hilton in Washington, D.C.

Bob Metcalfe had written a short guide to all the exhibits, but everything was so new that many people couldn't understand or believe some of the demonstrations. They watched demonstrators logging into computers across the country, running a program there, and then sending the output back to the Hilton. Almost everyone was very impressed.

AT&T, one of the earliest and strongest skeptics of packet-switching, attended assuming they would witness the failure of a pipe dream. Metcalfe, still a graduate student, volunteered to show them around and demonstrate a working ARPAnet. However, the demonstrator's nightmare occurred: the network crashed. It was the only time during the conference that it failed, but the AT&T businessmen took it as evidence that packet-switching was doomed.

                                      15. Surfing the Net

Norm Abramson, a professor of engineering at Stanford, had a personal reason for his interest in Hawaii: surfing. An avid surfer, after a visit to Hawaii in 1970 he inquired whether the University of Hawaii was interested in hiring a professor of engineering. Within a year he was working at the university and surfing the beaches of Hawaii.

Abramson immediately started working on a radio-based data communications system to connect the Hawaiian islands together, and he got Larry Roberts to fund the project. Abramson's team of engineers and graduate students eventually built the first wireless packet-switched network, and in true Hawaiian style, they named it ALOHAnet.

Abramson then managed to get a Terminal IMP from Larry Roberts in early 1971 and connected the ALOHAnet to the ARPAnet on the mainland. It was the first time another big network was connected to the ARPAnet.

                                         16. TCP/IP

At about the same time, the weaknesses of the Network Control Protocol (NCP) were becoming more apparent. In 1970, Bob Kahn and Vint Cerf ran an experiment from UCLA to see whether heavy but legitimate traffic could overload the network. Kahn had warned the engineers at BBN when they were building the IMPs, but Frank Heart thought the NCP was more than adequate for the amount of traffic they expected. However, the ARPAnet was getting more traffic than predicted.

While waiting for a meeting one day, Vint Cerf started daydreaming about a new protocol that would improve the efficiency of the network and allow different networks to connect into one big network: an internet. He jotted down some notes and then met with Bob Kahn to work out the details. Together they came up with a protocol that would handle error detection, packaging, and routing. They called it the Transmission Control Protocol, but later split off part of it and called that part the Internet Protocol. Together, the acronym became TCP/IP.
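
That split in responsibilities can be sketched in a few lines of Python (a simplification, not the real packet formats): an IP-like layer only moves data between addresses, while a TCP-like layer numbers the data and checks it, forcing a resend when something arrives damaged or out of order.

    from dataclasses import dataclass

    # A simplification of the TCP/IP division of labor, not the real formats.
    @dataclass
    class Segment:
        seq: int            # TCP's concern: ordering and retransmission
        data: bytes
        checksum: int       # TCP's concern: detecting corruption

    def make_segment(seq, data):
        return Segment(seq, data, sum(data) % 65536)

    def ip_forward(segment, hops):
        """The IP-like layer: carry the segment hop by hop, never inspecting it."""
        for hop in hops:
            print(f"forwarded via {hop}")
        return segment

    def tcp_accept(segment, expected_seq):
        """The TCP-like layer: verify and order; None means 'send it again.'"""
        intact = segment.checksum == sum(segment.data) % 65536
        return segment.data if intact and segment.seq == expected_seq else None

    seg = ip_forward(make_segment(0, b"LOGIN"), hops=["UCLA", "SRI"])
    print(tcp_accept(seg, expected_seq=0))    # b'LOGIN'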

TCP/IP was a necessary step in the evolution of the Internet. The earlier protocol, NCP, couldn't handle the tremendous traffic of a global network, and TCP/IP was a common protocol different networks could use to talk to each other. However, it took several years before TCP/IP became the default protocol of the Internet. The International Organization for Standardization (headquartered in Europe) proposed a competing protocol suite named OSI (Open Systems Interconnection). It was a more abstract design created by some of the world's best computer scientists. However, TCP/IP was a protocol that was already proven, and it was gaining momentum.

The United States government proclaimed that OSI was the protocol the Internet was going to use, but the switch never happened. Too many networks were already using TCP/IP, and it was too much trouble to change. Europe mandated the use of OSI, but universities there were switching over to TCP/IP anyway because they didn't want to cut themselves off from the giant Internet in the United States.

With TCP/IP, the "global network" was becoming a reality. Universities and government offices were using the network for communicating with colleagues and exchanging data. Sometimes, however, a less professional application snuck onto the Internet. Personal email became more common, and some hackers wrote network games and other recreations. The Internet was intended strictly for official business, a restriction even written into law, but people started rethinking the purpose of the Internet.

                               17. Personal Computers

In the 1980s, personal computers became a common fixture in homes and offices. Supplying businesses with computers and software grew into one of the biggest industries in less than a decade. Soon, networking became a profitable business for engineers previously restricted to networking mainframes.

Some of the engineers trained on the ARPAnet went out on their own to found some of the fastest-growing high-tech companies in history. Bob Metcalfe, one of the pioneers of the ARPAnet, developed a better way of networking personal computers together and founded 3Com.

Four 27-year-olds from Stanford and Berkeley formed a company named Sun and built networkable workstations that could crunch numbers faster than many mainframes. Taking advantage of Metcalfe's invention, four programmers in Utah wrote a network operating system (NetWare) and resurrected Novell into a multi-billion-dollar software company. A married couple working at Stanford developed an improved way to connect different networks together and ran a multi-million-dollar company, named Cisco, from their house until venture capitalists took over and propelled it into a multi-billion-dollar business.

The Internet opened a gold rush in the 1980s that built huge fortunes and toppled old empires. Passionate engineers and savvy venture capitalists built a new economy that would lay the tracks for the Information Superhighway.

                                       18. Xerox PARC

At the turn of the decade in 1970, big business was taking another look at the impact of computers. During the sixties, computers were mostly restricted to the air-conditioned back rooms of major research centers, but smaller and more affordable computers were starting to show up in offices. Xerox was afraid that its products, mostly copiers and typewriters, would disappear from a paperless office. Xerox realized that it couldn't be surprised by the office of the future if it were the one to build it.

Bob Taylor was hired to help build a new research center for Xerox, where the best minds would do nothing but forge the future of computers and technology. Taylor wanted the new center near a major university where new ideas were already being created, so he broke ground near Stanford University in Palo Alto for Xerox's Palo Alto Research Center (PARC). He recruited some of the researchers he had funded before, including Jerry Elkind and Severo Ornstein from BBN.

The atmosphere at PARC was electric: some of the best technologists and scientists worked on their own dreams of the future. Their research was funded from the growing profits of Xerox, which was quickly approaching the size of IBM, so project budgets were nearly unlimited.

Two of the researchers at PARC, Butler Lampson and Chuck Thacker, worked on a project to put a computer on every person's desk. The computer, called the Alto, was arguably the first small personal computer that could be used in a business environment. They didn't really know how useful the Alto would be, but it was their job to discover the future. Several Altos were built and given to PARC's staff to see what they would do with the computers.

                  19.Computer Communication Compatibility

At about the same time the Alto was appearing on office desks at PARC, Bob Metcalfe finished his Ph.D. thesis at Harvard and was hired by Bob Taylor in 1973. Metcalfe's thesis work was an efficiency study of the ARPAnet and the ALOHAnet, but he had never seen the ALOHAnet work firsthand. So, one of the stipulations of his accepting the job was a three-month visit to Hawaii before he reported to work. In Hawaii, he studied the ALOHAnet in more detail and came back to the mainland with a notebook full of data and ideas for improving the network.

Norm Abramson, the builder of the ALOHAnet, created a protocol looser than the one used on the ARPAnet, something like a telephone party line. Because there were no isolated network lines, everyone had to share the same pathway. Sometimes two or more network packets would be sent at the same time, and none of them would arrive at their target locations uncorrupted. To correct for this problem, the receiving computer would check each packet for errors and send back an acknowledgment only if the packet arrived intact. If the sending computer didn't get an acknowledgment, it would wait a random amount of time and then send the packet again.

Bob Metcalfe had some ideas for a better protocol. Instead of just sending out a packet, the transmitter would first listen for any traffic and send only if no other computers were transmitting. It's similar to the way polite people talk in a group: instead of interrupting, they wait until there's a pause in the conversation. The protocol was called CSMA/CD, which stands for Carrier Sense Multiple Access with Collision Detection. That just means multiple computers could connect to a single cable and would try to have only polite conversations. A rough sketch of both access rules appears below.
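
To make the two access rules concrete, here is a toy sketch in Python -- a hypothetical illustration, not real networking code. The FakeCable class is invented for the example and merely simulates "busy" and "collision" events on the shared wire. The listen-before-talking loop is Metcalfe's carrier-sense addition; the random backoff after a collision is the ALOHAnet trick described above.

    import random

    class FakeCable:
        """Invented stand-in for the shared cable; randomly busy or noisy."""
        def busy(self):
            return random.random() < 0.3          # someone else is talking
        def collision_detected(self):
            return random.random() < 0.1          # two stations talked at once
        def transmit(self, packet):
            pass                                  # pretend to put bits on the wire
        def wait_one_slot(self):
            pass                                  # pretend a time slot passes

    def send_csma_cd(cable, packet, max_slots=16):
        """Listen first, transmit, and back off randomly after a collision."""
        attempts = 0
        while True:
            while cable.busy():                   # carrier sense: wait for a pause
                cable.wait_one_slot()
            cable.transmit(packet)                # multiple access: anyone may send
            attempts += 1
            if not cable.collision_detected():    # collision detection
                return attempts                   # the packet got through
            # Collision: wait a random number of slots (the ALOHAnet trick), retry.
            for _ in range(random.randint(1, max_slots)):
                cable.wait_one_slot()

    print(send_csma_cd(FakeCable(), b"hello"), "attempt(s) needed")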

At PARC, Metcalfe worked with Thacker and Lampson (the two inventors of the Alto) and David Boggs on building a networking card for the Altos. Their two test machines were named Michelson and Morley, after the two physicists whose famous late-19th-century experiment discredited the popular belief that an invisible "ether" filled the universe.

After several months of building and testing, Metcalfe's team successfully networked the two Altos together. They needed a name for the network protocol, so they continued the ether joke and named it "ethernet." Soon more Altos were connected to the ethernet and then different computers were added, creating a local internet inside PARC.

Although Metcalfe had a useful product, PARC at that time didn't have an easy mechanism for moving it into production and to market. He decided that he would produce and market ethernet himself. He was a researcher, not a businessman, so he decided to research the process of building a successful business and then follow it. Using the Directory of Western Venture Capitalists, he scheduled breakfast, lunch, and dinner meetings with anyone willing to talk to him. At the start he wasn't looking for venture capital, but for advice and information.

After nearly a year of meetings, and after perfecting his product, he found backers and opened his new company. For a name he combined three words that described his business: Computer Communication Compatibility, or 3Com for short.

                                        20.Sun Rises

While growing up in India, Vinod Khosla dreamed of starting his own high-tech company and becoming rich like the founders of Hewlett-Packard and Intel. When he came to the United States to study at Stanford Business School, he got a chance to help draft a business plan for a technology company named Daisy Systems. The business succeeded, and Khosla cashed in and walked away rich at the age of 27.

He still wanted to start his own company, and in 1982 he found a product he knew he could sell: a workstation. Andy Bechtolsheim, a graduate student in computer science at Stanford, had already made a name for himself on campus with the workstation he designed. A workstation is a computer more powerful than a personal computer but small enough to sit on a desktop. Bechtolsheim's had built-in networking, because he knew researchers needed it, and it ran Unix, a nonproprietary operating system developed at Bell Laboratories.

Bechtolsheim designed the workstation to fill a void in the computer science department. He was frustrated with the aging time-sharing system used at Stanford, and he thought everyone could get a lot more work done if they had immediate access to a computer and could still exchange data. His idea was to license the new technology to companies that could build it and then get the computers from them. He had already licensed the workstation design to more than seven companies before Khosla approached him, so he assumed Khosla just wanted another license.

However, Khosla told him, "I don't want to do that. I want the goose that lays the golden egg. I don't want the golden egg." He wanted Bechtolsheim to join him in a partnership to build the workstations for sale. It was difficult to convince Bechtolsheim, because he wanted to stay at Stanford and complete his Ph.D., but he eventually agreed.

To help with the business side, Khosla recruited a fellow Stanford Business School graduate, Scott McNealy. McNealy had no experience starting a company, but he was excited about the prospect. Now they had two business people and a hardware expert. All they needed was a software expert to cover all facets of the product. The choice was easy: Bill Joy.

Bill Joy was a young computer scientist across the bay at Berkeley. By the age of 27 -- the same age as the other three -- Joy had gained a reputation as one of the best computer scientists in the country. Having worked on implementing parts of the ARPAnet protocols, Joy had plenty of networking experience. He had also rewritten the Unix operating system to incorporate TCP/IP and released it as Berkeley Unix (BSD).

Khosla found more venture capital for his new company (initially called "VLSI Systems") from John Doerr of Kleiner, Perkins, Caufield and Byers, the Silicon Valley venture capitalists. Doerr had known the initial trio from Stanford, and he understood how big a Unix workstation could be in the research and business markets.

With a few million dollars in the bank, a prototype motherboard, and a networking operating system, the company began building its workstations under the new name of Sun (in homage to the Stanford University Network). By 1988 they had passed $1 billion in sales and were the second-fastest-growing computer company in history. It is a success story rarely matched since.

                             21.Success from Failure

For every successful high-tech company there are hundreds of failures. Novell is a story of both failure and success. In 1982, Novell Data Systems was a small computer company in Orem, Utah, near the Wasatch Mountains. It had found some success building computer systems for local businesses, but the good times quickly took a downturn. The company was soon finding it hard to make payroll and had to auction off office furniture to pay its employees. Some of the company's investors called in Ray Noorda to see if he could turn the company around.

The same year, as the company neared its demise, four contractors were hired to write software to network CP/M computers to a common "disk-server." The disk-server split its hard drive into virtual drives, one for each of the computers networked to it. Drew Major and the other programmers started working on a new idea: a "file-server." Instead of getting a private section of the disk, each networked computer would have access to all the files on the server, so people could share data across the network. There were some problems to work out, like two people simultaneously accessing a file, but Major and the others recognized a big seller. (A sketch of that simultaneous-access problem follows.)
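
The simultaneous-access problem is easy to picture with a short Python sketch. This hypothetical FileServer is invented here purely for illustration -- Netware's actual internals were far more elaborate -- and simply serializes reads and writes with a lock so two clients can't corrupt the same file at once.

    import threading

    class FileServer:
        """Toy file-server: every client sees the same files, one at a time."""
        def __init__(self):
            self.files = {}                  # filename -> contents, shared by all
            self.lock = threading.Lock()     # only one client inside at a time

        def write(self, name, data):
            with self.lock:                  # block until no one else is writing
                self.files[name] = data

        def read(self, name):
            with self.lock:
                return self.files.get(name, b"")

    server = FileServer()
    server.write("report.txt", b"quarterly numbers")
    print(server.read("report.txt"))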

While they were working at Novell, they saw a demo of the first IBM PCs available in Utah and bought one right away. They saw great potential in the IBM computers and decided to try out their new ideas on one. As contract workers at Novell, they weren't too worried about the imminent layoffs - they could just find other contract work while they built their file-server.

When Noorda toured the business to see if he could save it, he visited with Major and the other contractors and liked what he found. He immediately saw the potential of the file-server. Noorda decided there was something to save, but the business would have to change and drop everything except Major's file-server.

During the next year, the four contractors (still not full-time employees) worked on their idea and eventually built a network operating system. They called it Novell Netware and shipped it as soon as it was ready. There were lots of bugs in the code, but it was on the market first.

Netware quickly became the dominant networking software for PCs, competing against a weaker operating system packaged with 3Com's ethernet cards. Noorda came up with a novel marketing plan -- essentially flipping 3Com's strategy. Novell began selling ethernet cards at near cost as long as the customer bought Netware to run on the network. 3Com thought they were being ripped off, but Novell became their biggest customer for ethernet cards. It's a strange business.

 

                                     22.Riding the Bear

Microsoft probably benefited the most from the growing IBM personal computer market. Because IBM didn't have the in-house expertise to write an operating system for its desktop computers, it hired Microsoft, then still a small Seattle-area company, to do the work. After the first MS-DOS was released, IBM decided to keep the business relationship and let Microsoft maintain the operating system. Microsoft was pleased to serve, because they knew IBM ruled the computer business and they would go wherever IBM went. Microsoft called it "riding the bear."

In the mid 1980s, IBM was working on a new personal computer, and they wanted a revamped and more powerful operating system to match the capabilities of the new machine. They contracted Microsoft to work with them to design and write the second-generation operating system, OS/2. The new operating system added several new features, such as a graphical interface with windows and a mouse. However, IBM didn't want to add any networking to OS/2. IBM was still in the mainframe and minicomputer business, and some of its management didn't want the personal computer division competing with the rest of IBM.

Bill Gates and the rest of Microsoft didn't see it that way, but no matter how much they argued and pleaded IBM would not add networking to OS/2. Microsoft just watched as Novell took over the networking market:

"Around '83 and '84 and certainly ... by '85 Netware was reaching critical mass. And so Microsoft felt really like there was a huge missed opportunity. In fact, I remember some memos Bill wrote, in sort of '84, '85, '86, where he said, you know one of the biggest disasters for the company is that ... is that we have no assets in networking or very weak assets in networking."

                                                                       Rob Glaser, Microsoft

In late 1989, Bill Gates made his first attempt at buying the competition, Novell. He called David Bradford of Novell and offered to band together and compete against IBM, but Bradford refused the offer. So Microsoft went to 3Com instead and offered to band with them against Novell. Metcalfe was already frustrated with Novell, so he agreed to the arrangement. He believed that with Microsoft's relationship with IBM, they had an opening to the largest market, IBM PCs.

"What Microsoft failed to tell us was that their relationship with IBM was falling apart at that moment. Which came as a big surprise about three days after we signed the deal."

                                                      Bob Metcalfe, founder of 3Com

Later in 1989, Microsoft announced OS/2 LAN Manager, and the headlines proclaimed that it would control most of the networking market by 1991. The forecast fell short: Novell still controlled the majority of the market in 1991.

At 3Com, because their half of the partnership was falling short of its goals, the board of directors decided a change of management was in order. Metcalfe suggested in 1990 that he should be CEO, but the board chose Eric Benhamou instead. Metcalfe quickly resigned from 3Com. He's now a content gentleman farmer in Maine who writes a weekly column in InfoWorld.

After inventing Ethernet and founding 3Com he does have some advice to prospective entrepreneurs:

"It helps to have good parents, and then it helps to work really hard for a long period of tie and go to school forever, and then it works to drop quite by accident into the middle of Silicon Valley, where you're swept up into an inexorable process of entrepreneurship and wealth generation, and you pop out the other side with a farm in Maine. I hate to oversimplify."

                                                     Bob Metcalfe, founder of 3Com

 

                                    23.Adult Supervision

Only one year after Sun was formed, another company destined to make a big splash in the networking world grew out of work done at Stanford. Cisco's founders, husband and wife Len Bosack and Sandy Lerner, started by experimenting with connecting their two separate networks, located in two different buildings on campus. With the help of two other Stanford staff members, Bosack and Lerner ran network cables between the buildings and connected them first with bridges and then with routers.

Bosack's router was really an updated IMP, transmitting only the traffic that was meant to get out and accepting only the traffic that was meant to get in. Bosack and Lerner designed and built routers in their house and experimented on Stanford's network. When word got out about the routers, other universities and research centers asked to buy them. Bosack and Lerner went to Stanford with a proposition to start building and selling the routers, but the school refused. Knowing they had something worthwhile, they founded their own company and named it "Cisco," after the last syllables of San Francisco, the city to the north.

Their house became company headquarters, and every room was used for building, testing, manufacturing, or shipping. They had no capital, so they charged all the startup costs to their credit cards. Yet even without a real sales staff, they started to make a profit very quickly. When they were pulling in hundreds of thousands of dollars every month, they decided it was time to act like a real business, and they needed help recreating their company.

They went to several venture capitalists and made their pitch, but all refused until the 77th:

"At that point I think we were -- Cisco was doing, I think, a quarter million, maybe 350,000 a month without a professional sales staff and without an official conventionally recognized marketing campaign. So it wasn't a bad business just right then. And so I think just for the novelty of it, the folks at Sequoia listened to us."

                                                     Len Bosack, cofounder of Cisco

Don Valentine was an experienced venture capitalist with companies such as Apple and 3Com in his portfolio. He saw great potential in Cisco, and he was impressed with Bosack's and Lerner's enthusiasm. However, he wanted to hire experienced management to run the business - something venture capitalists call "adult supervision."

"The commitment we jointly made to each other is that we at Sequoia would do a number of things. We would provide the financing, we would find and recruit management, and we would help create a management process. None of which existed in the company when we arrived."

                                                      Don Valentine, Sequoia Capital

Valentine recruited John P. Morgridge as president and CEO, to oversee the company and be Bosack's and Lerner's boss. There was tension among the three from the start; Lerner claims that the first words out of Morgridge's mouth when they met were "I hear that you're everything that's wrong with Cisco." Morgridge denies ever saying it, but Lerner never got along with him after that.

As Cisco grew into a billion-dollar business, more management was added, and Bosack and Lerner felt pushed out. Lerner was always concerned about customer relations, but she didn't get along with the upper management. In 1990, seven of the vice presidents came as a group to Valentine (with Morgridge's knowledge) and delivered an ultimatum: either Lerner left or they would. On August 28th Lerner was asked to leave the company, and Bosack left when he heard. They both immediately sold their two-thirds stake in Cisco, taking about $170 million.

Sandy Lerner now owns a cosmetics line. To get away from California, she bought Ayrshire Farm, a 40-room house in Virginia where she raises draft horses. She is still bitter about her experiences at Cisco. Although she and Len have separated, they remain close friends and manage a trust that funds several charities. After leaving Cisco, Bosack started his own company, XKL, in Redmond, Washington, and also funds charities, including SETI (the Search for Extra-Terrestrial Intelligence).

Cisco has grown even more since 1990; it is now valued at over $6 billion and controls over three-quarters of the router business. Morgridge is still president and CEO, and he has directed Cisco's growth through aggressive marketing and acquisitions, including new ventures into ATM and gigabit Ethernet.


                               24.Wiring the World

The rise of personal computers from Apple and IBM introduced the rest of the world to computing. At first, computers were the tools of technically inclined nerds, but new applications drew other people to the keyboard. With an affordable modem, people could connect to other computer enthusiasts and to commercial online services. People were using the computer as Bush and Licklider had prophesied: as a medium for interacting with other people.

 

A venerable institution of international collaboration was the setting for the major development in the history of the Internet. It began when Tim Berners-Lee, a computer programmer at CERN in Switzerland, got to play with a new NeXT workstation. Its object-oriented operating system was an inspiration for a problem he was working on: how to distribute information across a diverse network of different computers and operating systems. He started working on a protocol very similar to the "docuverse" described by Ted Nelson, but reduced to a minimal, working model.

Berners-Lee eventually named his project the World Wide Web, because he visualized it as a web of interconnected documents stretching across the Internet and the world. It sounded grandiose, but his predictions later proved too modest.

In 1992, Marc Andreessen, an undergraduate computer science major at the University of Illinois, was working at the National Center for Supercomputing Applications when he discovered Berners-Lee's World Wide Web. There were a few sites scattered around the world, including the first U.S. site at the Stanford Linear Accelerator Center, but the Web was hard to use. Andreessen and some of the other programmers knew they were looking at a great idea with a bad presentation. They wanted to put a more "human face" on the Web, so they wrote the first graphical browser - which would later become Netscape.

Microsoft saw no profit in the Internet, because commercial traffic on the Net was still forbidden by law. Then Rep. Frederick Boucher of Virginia drafted a bill in the U.S. Congress that changed all that. Soon the Internet was open for business.

Microsoft finally jumped onto the Internet in 1995, offering a browser to compete with Netscape - a browser they called Internet Explorer. It looked like the giant from Redmond, Washington would take over the Internet just as it had taken over the OS market - but the competition cried foul.

After 1995, every business was at least thinking about getting a Web site. Online services exploded when they offered connections to the Internet and the World Wide Web. Traffic on the Internet was increasing at an exponential rate, and the old network backbones were showing the strain. Some of the ARPAnet pioneers predicted a collapse if something wasn't done soon.

 

                              25.Electronic Meeting Places

As the early computer visionaries (such as Licklider and Nelson) realized in the 1960s, people weren't satisfied with just interacting with their computers. They wanted to use their computers to interact with other people. Computer hobbyists soon came up with their own method of connecting computers over the telephone lines. Falling modem prices made it possible for almost everyone to buy one and call other computers.

However, when the other computer answered, there really wasn't much to do. Computer hackers wrote programs to answer the phone and interact with the caller, automatically and without human supervision. Features like message boards, online games, and file exchange were added, and a new business was born. The average computer enthusiast could buy (or download) the complete software needed to operate a bulletin board system and set it up themselves.

Although bulletin boards existed earlier, they reached their height of popularity soon after 1980. Bulletin board systems were found in every town, and some were advertised in the local newspapers. For many operators it was a labor of love, and if they charged any money it was to cover the costs of running the system. Some, however, saw a business opportunity and added value to their systems to attract paying customers.

                                  26.Online Services

Founded several years before 1980, the Compuserve online service really took off in the early 1980s. It didn't yet supply access to the Internet, but it did give subscribers the opportunity to send email and exchange files with other people across the country and, eventually, around the world. Compuserve operated a computer center in Columbus, Ohio, but it set up local modem pools in large population centers, making it easy for subscribers to dial in from anywhere. Subscribers could also use access numbers provided by Telenet or Tymnet if Compuserve didn't provide a local phone number for their town.

When people connected to Compuserve in the early 1980s, they didn't have the graphical interface taken for granted on the Web today. Instead, subscribers needed to use commands and key sequences to perform the simplest functions. Even with its difficult interface, people joined by the tens of thousands. Other entrepreneurs took notice and joined the business, including Delphi, GEnie (from General Electric), BIX (from BYTE magazine), and later America Online.

Prodigy was the first large commercial service to add a graphical interface to the bulletin board system. Using a Macintosh or Windows computer, subscribers could click on icons instead of typing archaic commands.

                                    27.The WELL

Some of the community bulletin board systems lived on and prospered. One of the most famous is the WELL, created by Stewart Brand in 1985 using a personal computer on his houseboat in Sausalito, California. He claims that he created the bulletin board system as a virtual commune, so he could experience the lifestyle without having to move into a real commune.

On the WELL, members could create and host their own topical discussion boards, and the most popular was the one devoted to the Grateful Dead, the Deadhead conference. One of the sysops for the WELL was John Perry Barlow, who was also a lyricist for the Dead and had been a friend of band member Bob Weir since they met at a boarding school in Colorado.

People from all across the country called in to join the online community. It was a successful experiment, and it was copied by other bulletin board systems. Through the WELL and the Internet, Brand and Barlow became cyberspace activists; Barlow went on to cofound the Electronic Frontier Foundation.

                                   28.Spinning the Web

In 1990, Tim Berners-Lee, a computer programmer at CERN in Switzerland, began working on a way to give everyone on the network access to research materials. CERN had an international mix of researchers and a diverse collection of computers and operating systems. Reformatting documents for each computer platform every time their content changed would require too much time and money.

While Berners-Lee was grappling with this problem, Mike Sendall, a fellow computer programmer at CERN, purchased a new NeXT workstation for evaluation. When he decided not to use it, he offered it to Berners-Lee. Berners-Lee was impressed with the NeXT cube's object-oriented operating system, and it gave him an idea for solving the distribution problem.

During the next year Berners-Lee worked on a system comprising a server to store documents and a client to request them. He finished the first working "browser" and server in 1991, but they were very primitive and displayed only text. At the time, all Berners-Lee wanted was a way for researchers to access text-based documents - nothing more. Others wanted more.


                               29.A Human Face

The University of Illinois at Urbana-Champaign had always been at the forefront of computer science. It was one of the first dozen nodes on the ARPAnet, and it was chosen to manage the National Center for Supercomputing Applications (NCSA). Much of the world's high-powered computing happens through UIUC, but not necessarily on campus.

One of the strategies of the NCSA was to provide access to the supercomputers for the country's and the world's researchers. Around the clock, scientists would submit jobs to the Cray supercomputers over the NSFnet (the successor to the ARPAnet) and download the output. An army of staff and students maintained the campus computers to keep the connection stable and open.

In 1992, a few of the students, led by Marc Andreessen, came across the World Wide Web protocol released by CERN. They thought it was a great idea, but it was clumsy for anyone with only minimal computer skills. They decided it would be a fun and potentially worthwhile project to write a friendlier, graphical interface for the browser. When they had one finished, people asked them to write versions for the PC and the Macintosh. Eventually they released it on the Internet, and the downloads increased steadily. In 1993, Andreessen's Mosaic browser was used by over one million people around the world. He had a hit, but he didn't realize it - and it was the school's property.

After graduation, Andreessen found a job in Silicon Valley and moved away from Illinois. High-tech firms were on shaky ground, uncertain about the future of the computer business. Jim Clark, the founder of SGI, decided to retire and look for something new to work on. Clark found out that Andreessen was working nearby, so he arranged a meeting to talk about Mosaic.

Clark was impressed with Andreessen and his enthusiasm for the browser. He decided to invest in a new software company, but he wanted Andreessen to recruit everyone that was involved in writing the Mosaic browser at the University of Illinois. Both of them flew back to Illinois and offered six of the original seven a job in the new company - the seventh, Chris Wilson, had already been hired at Microsoft.

Jim Clark, who had started SGI directly from Stanford, assumed that the University of Illinois would act the same way as Stanford and be pleased to permit a commercial spin-off of a student-created product. After all, grateful graduates were generous donors. However, UIUC didn't see it that way and refused to relinquish the Mosaic name.

Andreessen and the Gang of Six eventually rewrote the browser and changed its name to Netscape Navigator (though among themselves they have always called it Mozilla). Clark used a unique marketing plan, one born for the Internet: anyone could download the browser, but those who used it for business had to pay for a license.

"In about a year and a half's time, we had 65 million users - the most rapidly assimilated product in history. No one had ever achieve an installed base of 65 million anything, except perhaps Microsoft."

                                                                           Jim Clark, Netscape

                               30.Open for Business

During its first twenty years of operation, the Internet was a restricted club of scientists, engineers, and administrators. Official policy forbade anyone from using the network for personal gain or for anything without a job-related function. That didn't stop the real hackers, though. Online recreations were a common menace, filling up capacity with starship combat and menacing wizards. Traffic for the Web, gopherspace, WAIS, and hytelnet was taking up the majority of the Internet's backbone capacity, and little of it was essential to government research.

Commerce was still taboo for everybody on the Internet. Hackers and bureaucrats were in complete agreement that any commercialization of the Internet would only lead to its demise. However, the power of the Net was no longer a secret, and the business world was beginning to wonder if there was a profit in the Internet.

However, U.S. law stood in the way. Like the Sooners crossing the borders of the Oklahoma territory, some business was sneaking across the network, but it was still illegal. U.S. Rep. Frederick Boucher, from the 9th district of Virginia, proposed dropping the restrictions and giving the Internet over to the citizens. In 1992, Boucher proposed an amendment to the National Science Foundation Act of 1950 that "authorizes NSF to support the development and use of computer networks which may carry a substantial volume of traffic that does not conform to the current acceptable use policy."

Nobody had a real plan for making money on the Internet, but business dunked a toe to test the digital water and then dove in. Soon everyone either had a Web site or was thinking of getting one. Internet Service Providers (ISPs) and online services (like Compuserve and AOL) were outgrowing their capacity with new members. Providing a connection to the Internet was a very profitable business, but online commerce wasn't as profitable as predicted.

It took a few years before business on the Web began to show a profit. Online companies like Amazon.com and Excite.com were created just for the Internet, based on revolutionary business models. Entrepreneurs had to look at business in a new way, and sometimes only the younger minds could grasp the new concepts. The Internet has forced us to rethink more than how we conduct business.

 

                             31.The Future of the Internet

The networked world imagined by Bush, Licklider, Nelson, and others is finally becoming a reality after three decades and countless hours of late-night hacking and field testing. Many of the predicted benefits of the "Intergalactic Network" are being realized, but new paradigms are constantly created and either thrive or disappear. The "fast as light" pace of the Internet can kill or establish an idea quicker than a marketing department can come up with an ad campaign. One of the products that has thrived is the cross-platform language called Java.

James Gosling, a senior programmer at Sun Microsystems, was working at the forefront of new ideas. He had already established himself as one of the world's best programmers, and his job at Sun was to push the limits of computers. By 1991, however, he felt he was in a rut. Scott McNealy sensed something was wrong and asked Gosling if there was a problem. Gosling told him the current operating systems were too restrictive and he wanted to create his own. McNealy told him to do it - no matter what the cost or how long it took.

After three years of hard work by Gosling and a handpicked team of programmers and hardware specialists, the result was Java. Its original purpose was to embed a common operating system in household and office appliances and network them together - a revolutionary idea, but the $20,000 price tag for a "super" remote made it impractical. The Java language itself, however, was quickly adopted around the world for its other virtues: it was cross-platform, object-oriented, network-secure, and easy to program.

                                    32.A Bigger Pipe

Traffic on the Internet today includes Java applets, streaming video and audio, subscription channels, as well as HTML and email. The government has handed over several sections of the Internet to private companies, and the capacity of the Internet's backbone has been increased to keep up with the exponential growth in traffic. Over the last few years new technologies have widened bandwidth to handle the increased traffic, but engineers don't know how long they'll be able to keep ahead of demands on the network.

Yogi Berra once said that "nobody goes to that restaurant anymore because it's too crowded." Many experts in computer networks, such as Vint Cerf, predict an equivalent problem for the Internet in the near future. They warn that too much traffic could shut everything down. To prevent the problem, we'll need faster networks and more efficient protocols.

In answer, two competing LAN technologies promise a ten-fold increase in network speeds: ATM (Asynchronous Transfer Mode) and gigabit Ethernet. To handle the faster speeds, a new Internet Protocol has been proposed, IPng (IP next generation, or IPv6), designed to handle the growing size of the Internet and faster network speeds. Just as the ARPAnet was based on open standards, all three of these technologies are nonproprietary. And just as the ARPAnet spawned a new industry, new companies are popping up to market products - and some of the players are very familiar.

Larry Roberts is currently president and CEO of Packetcom, a company that designs switches for ATM. One of the leaders in gigabit Ethernet technologies, Granite Systems, was founded by Andy Bechtolsheim (from Sun), who eventually sold the company to Cisco Systems. Cisco is laying bets on all possible outcomes, with products for gigabit Ethernet, ATM, and IPng. To help administrators manage their routers, Cisco recently licensed Novell Directory Services (NDS) technology from Novell; NDS is written in Java. On the browser front, AOL has offered to buy Netscape for over $4 billion. Meanwhile, Microsoft still fights an antitrust lawsuit with witnesses from Sun, Novell, Netscape, AOL, and others.

The Internet has a rich history, with colorful characters and complex plots. This Web site has presented only a small part of that history, and many more stories remain mostly untold. As for the future of the Internet, most people admit it is uncertain - but it will certainly be interesting, and it will remain an important part of our lives.


                                              Terms

ALOHAnet - Norm Abramson wanted to surf - so he moved to Hawaii in 1969. Abramson wanted to network with the other islands - so he built the ALOHAnet in 1970. From the University of Hawaii, Abramson connected computers over a network of radio transmitters using a protocol telling the computers how to share the airwaves. more of the story...

ARPA - Advanced Research Projects Agency, founded in 1958 in response to the Russian scientists beating our scientists in putting a satellite into orbit. more of the story...

ARPAnet - Advanced Research Projects Agency Network. Bob Taylor came up with the idea of networking all the ARPA-funded computers together so he wouldn't have to change seats. more of the story...

Bandwidth - how much stuff you can cram onto the network. A wider bandwidth means more information in a shorter amount of time.

BBN - Bolt, Beranek and Newman, in Cambridge, MA - founded by three partners in the 1950s as a consulting business in acoustic engineering. BBN shifted its business to computers as they became more important. In 1969, BBN was awarded the contract to build the first IMPs. more of the story...

Browser - software for navigating the Web, retrieving documents and other files, and displaying them on the user's screen. Two of the most popular browsers are Netscape Navigator and Microsoft Internet Explorer.

Bulletin Board System (BBS) - the cyberspace equivalent to the office bulletin board, a BBS is software that allows users to post and read messages left by other users. Bulletin Board Systems were very popular in the 1980's when computer enthusiasts set up their own systems on personal computers. more of the story...

Domain Name - When the keepers of the Internet realized that the number of computers on the network was becoming too much to handle with simple computer names, they came up with a new addressing system. They added the school, organization, or company name and a domain identifier to tell if it was commercial (com), educational (edu), or something else (org, etc.). The domain for the PBS Web server is "pbs.org" and the full address "www.pbs.org" is the domain name. Other countries have an additional identifier to tell which country the address comes from - for example, ".uk" means it's located in the United Kingdom.

Ethernet - a networking technology to connect computers over a local area network invented by Bob Metcalfe and David Boggs at Xerox PARC. Named after the invisible, massless substance that 19th century scientists believed filled the universe. more of the story...

FTP - File Transfer Protocol. One of the first applications developed for the ARPAnet, it's still used to send and retrieve files across the Internet.

Graphical User Interface (GUI) - a visual, icon-driven interface for an operating system or other application. A nice little acronym pronounced "gooey."

Host - Just like a party's host is responsible for all the guests, a computer host takes care of any other computers visiting over a network. In the early days of networking, any computer was a potential host, so now any computer connected to a network is called a host.

HTML - HyperText Markup Language. Publishers have always needed to write down instructions telling the printer how they wanted a document to look. Eventually, the printing business developed a standard set of shorthand "markup" instructions, or "tags." On the Web, publishers use the Hypertext Markup Language to tell Web browsers how a document should look. Berners-Lee came up with the first set of HTML tags using the tag style defined by the ISO's Standard Generalized Markup Language (SGML). The HTML standard is currently defined and controlled by the World Wide Web Consortium.
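
As a small illustration of what those tags look like in practice, the Python sketch below (using the standard library's HTML parser; the snippet of markup is made up for the example) walks a fragment of HTML and reports each tag and piece of text it meets - exactly the information a browser uses to decide how the document should look.

    from html.parser import HTMLParser

    page = "<html><body><h1>Hello</h1><p>A <b>bold</b> word.</p></body></html>"

    class TagLister(HTMLParser):
        """Print each markup instruction and the text it surrounds."""
        def handle_starttag(self, tag, attrs):
            print("open :", tag)
        def handle_endtag(self, tag):
            print("close:", tag)
        def handle_data(self, data):
            print("text :", data)

    TagLister().feed(page)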

HTTP - Hypertext Transfer Protocol. This is a set of instructions on how Web browsers and servers talk to each other.
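
A bare-bones example of that conversation, using Python's standard library (www.example.com is just a placeholder host, and a live network connection is assumed): the client asks for a document, and the server answers with a status line, headers, and the document itself.

    import http.client

    conn = http.client.HTTPConnection("www.example.com")
    conn.request("GET", "/")                # "please send me this document"
    resp = conn.getresponse()               # status line, headers, then the body
    print(resp.status, resp.reason)         # e.g. "200 OK"
    print(resp.getheader("Content-Type"))   # how the document is labeled
    body = resp.read()                      # the document itself
    conn.close()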

Hypertext - a document format that allows documents to be linked by making certain words or phrases "clickable." Following a link leads to a second document related to the word or phrase in the first. Hypertext is the format used on the World Wide Web.

IMP - Interface Message Processor - the minicomputers that connected each node on the ARPAnet to the network. Built by BBN, each was a refrigerator-sized Honeywell DDP-516 computer with a whopping 12K of memory. more of the story...

Internet - An internet is a group of networks connected together. The Internet (note the capital "I") refers to the global connection of networks around the world.

InterNIC - a collaborative project by Network Solutions, Inc. and AT&T (supported by the NSF) which provides four services to the Internet community: a "white pages" directory of domain names, IP addresses, and publicly accessible databases; domain name and IP address registration; support services for the Internet community; and an online publication summarizing information of interest to the online community.

IP - Internet Protocol, a protocol telling how packets on an internet are addressed and routed. The second part of TCP/IP.

Java - a high-level, object oriented programming language developed by Sun Microsystems that runs on most operating platforms. One of the original purposes of the language was to create a common language for all the "smart" appliances in the house. The ultimate in cross-platform, Java was going to let your TV and toaster speak the same language. Its new mission is to provide a language that programmers can use to write applications anyone can use on any computer. more of the story...

Javascript - A scripting language developed by Netscape Communications to add interactivity to Web pages. It really has little to do with Java, but Javascript is supposed to work across platforms and browsers.

Killer Application - Every step in the development of computers had a special application that made that step work and succeed - a killer app. For the personal computer it was the spreadsheet; for the Internet it was email.

Local Area Network (LAN) - a group of computers, usually all in the same room or building, connected for the purpose of sharing files, exchanging email, and collaboration.

Mainframe - a large, multi-user computer. Before personal computers were available, businesses and universities purchased large and expensive mainframes and housed them away in large, air-conditioned rooms.

Metcalfe's Law - Metcalfe believes that a network's worth is directly related to the number of people on the network. In the language of math: if N is the number of nodes, the value of the network is proportional to N squared. Doubling the number of users, for example, quadruples the network's value.

Modem - modulator/demodulator - a device that converts digital (binary) signals from a computer into analog signals suitable for transmission over a phone line. On the other end, another modem receives the analog signals from the phone line and translates them back into digital bits.

MOSAIC - Soon after Marc Andreessen saw what the new World Wide Web could do in 1992, he thought a graphical interface for the browser would let everyone use the Web. He and a group of fellow student programmers at the University of Illinois wrote the world's first graphical Web browser, Mosaic, in 1992. more of the story...

NSFnet - A wide-area network developed by the National Science Foundation (NSF) in 1985. NSFnet replaced ARPAnet as the main government network linking universities and research facilities in 1990.

Node - a processing location on a network.

Packet - to send a message over a packet-switched network, the whole message is first cut up into smaller "packets," and each is numbered and labeled with an address saying where it came from and another saying where it's going.

Packet switching - the technology that made large-scale computer networking possible. Instead of a dedicated connection between two computers, messages are divided up into packets and transmitted over a decentralized network. Once all the packets arrive at the destination, they are recompiled into the original message. more of the story...
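
A toy sketch of the idea in Python (illustrative only -- real packets also carry source and destination addresses, checksums, and more): cut a message into numbered pieces, let the network deliver them in any order, and reassemble them by sequence number at the destination.

    import random

    def to_packets(message, size=8):
        """Cut a message into numbered chunks of `size` bytes."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        """Put the chunks back in order, whatever order they arrived in."""
        return b"".join(chunk for _, chunk in sorted(packets))

    packets = to_packets(b"packet switching in miniature")
    random.shuffle(packets)                 # the network may deliver out of order
    assert reassemble(packets) == b"packet switching in miniature"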

Protocol - format or set of rules for communication, either over a network or between applications.

Router - a descendant of the IMP, a router directs packets between separate local area networks. To make the connection more efficient, a router reads each packet's header and sends it along the fastest path. more of the story...

Search Engine - a program accessible on the Web which keeps a catalog of scanned Web sites in a large database. The user enters keywords or search parameters, and the search engine produces a list of matches for the user to choose from.

TCP/IP - Transmission Control Protocol/Internet Protocol, first defined by Vint Cerf and Bob Kahn in 1973, the protocol made the Internet possible and has become the default network protocol around the world. more of the story...

TELNET - Terminal Emulation. Telnet allows a user to log on to a remote computer over a network and enter commands at a prompt as if their terminal were directly connected to that computer.

Unix - an operating system developed by Ken Thompson and Dennis Ritchie at AT&T Bell Labs in the late 1960's. It was later rewritten in the C programming language, which made it easier to port to other platforms. It is still the primary operating system for the biggest servers on the Internet.

URL - Uniform Resource Locator, the address of a document or other resource reachable on the Internet. A URL has three components, specifying the protocol, the server domain name, and the file location. For example, "http://www.pbs.org/nerds201/index.html" specifies using the HTTP protocol (others include ftp and gopher), on the www.pbs.org server, for the file "/nerds201/index.html."
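
Python's standard library can pull those three components apart; the URL below is just the example from this entry:

    from urllib.parse import urlparse

    url = urlparse("http://www.pbs.org/nerds201/index.html")
    print(url.scheme)    # protocol: "http"
    print(url.netloc)    # server domain name: "www.pbs.org"
    print(url.path)      # file location: "/nerds201/index.html"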

Usenet - A worldwide bulletin board system that can be accessed through the Internet or through many online services. The USENET contains more than 14,000 forums, called newsgroups, which cover almost every imaginable interest group. Created years before the Web, it is still used daily by millions of people around the world.

World Wide Web (WWW) - The protocol devised and implemented by Tim Berners-Lee in 1990 to help researchers at CERN share information across a diverse computer network. more of the story...

Xanadu - a networked, non-sequential, hyperlinked system of documents and multimedia objects first proposed by Ted Nelson in 1965. Nelson's system was similar to the World Wide Web, but included the ability to compose documents from sections scattered around the network and a method of making micro-payments to copyright holders. more of the story...

Xerox PARC - The Palo Alto Research Center was built by Xerox in the early 1970s to keep the company ahead of the other office equipment makers in developing the office of the future. It is the birthplace of many of the innovations that have changed computing and communications. more of the story...

 

                             TIMELINE

1945 - Vannevar Bush publishes his paper on the memex machine.

1957 - U.S.S.R. launches Sputnik, first artificial earth satellite.

1958 - In response, U.S. forms the Advanced Research Projects Agency (ARPA) within the Department of Defense (DoD) to establish US lead in science and technology applicable to the military

1960 - J.C.R. Licklider publishes his landmark paper, "Man-Computer Symbiosis"

1961 - Leonard Kleinrock, MIT: "Information Flow in Large Communication Nets". First paper on packet-switching theory 

1962 - J.C.R. Licklider & W. Clark, MIT: "On-Line Man Computer Communication". Galactic Network concept encompassing distributed social interactions

Licklider becomes the founding director of ARPA's Information Processing Techniques Office and the behavioral science division.

Paul Baran, RAND: "On Distributed Communications Networks" Packet-switching networks; no single outage point

1963 - Licklider funds Engelbart's new "Augmentation Research Center" at the Stanford Research Institute.

President Kennedy is assassinated in Dallas.

1965 - Paul Baran gets funding from the U.S. Air Force to experiment with a block-switching network to protect communications during a nuclear war. However, he withdraws his proposal when the project is shifted to military managers.

ARPA sponsors study on "cooperative network of time-sharing computers". TX-2 at MIT Lincoln Lab and Q-32 at System Development Corporation (Santa Monica, CA) are directly linked, without packet switches (more of the story...)

1966 - Larry Roberts, MIT: "Towards a Cooperative Network of Time-Shared Computers" First ARPANET plan.

(more of the story...)

1967 - ACM Symposium on Operating Systems Principles: plan presented for a packet-switching network; first design paper on ARPANET published by Lawrence G. Roberts

National Physical Laboratory (NPL) in Middlesex, England develops NPL Data Network under D. W. Davies

1968 - ARPA mails out 140 Requests for Proposals to prospective contractors to build the first four IMPs.

1969 - ARPAnet commissioned by DoD for research into networking. First nodes were UCLA, Stanford Research Institute, UCSB, and University of Utah. Use of Interface Message Processors (IMP) [Honeywell 516 mini computer with 12K of memory] developed by Bolt Beranek and Newman, Inc. (BBN)

First node-to-node message sent between UCLA and SRI - which was also the first ARPAnet crash

First Request for Comments (RFC): "Host Software" by Steve Crocker, written overnight in a bathroom so he wouldn't wake anyone up.

1970 - ALOHAnet developed by Norm Abramson.

ARPANET hosts start using Network Control Protocol (NCP).

1971 - 15 nodes (23 hosts): UCLA, SRI, UCSB, Univ of Utah, BBN, MIT, RAND, SDC, Harvard, Lincoln Lab, Stanford, UIU(C), CWRU, CMU, NASA/Ames.

1972 - International Conference on Computer Communications with demonstration of ARPANET between 40 machines and the Terminal Interface Processor (TIP) organized by Bob Kahn. (October) more of the story...

InterNetworking Working Group (INWG) created to address need for establishing agreed upon protocols. Chairman: Vinton Cerf.

Telnet specification. 

1973 - First international connections to the ARPANET: University College of London (England) and Royal Radar Establishment (Norway)

Bob Metcalfe's Harvard PhD Thesis outlines idea for Ethernet.

Bob Kahn poses Internet problem, starts internetting research program at ARPA. Vinton Cerf sketches gateway architecture in March on back of envelope in hotel lobby in San Francisco.

Cerf and Kahn present basic Internet ideas at INWG in September at Univ of Sussex, Brighton, UK.

File Transfer Protocol specification (RFC 454)

Network Voice Protocol (NVP) specification (RFC 741) and implementation enabling conference calls over ARPAnet.

1974 - Vint Cerf and Bob Kahn publish "A Protocol for Packet Network Intercommunication" which specified in detail the design of a Transmission Control Program (TCP). [IEEE Trans Comm]

Larry Roberts founds Telenet, the first commercial packet-switched data service

1975 - Operational management of Internet transferred to DCA (now DISA)

1976 - Elizabeth II, Queen of the United Kingdom sends out an e-mail

UUCP (Unix-to-Unix CoPy) developed at AT&T Bell Labs and distributed with UNIX one year later.

1977 - THEORYNET created by Larry Landweber at Univ of Wisconsin providing electronic mail to over 100 researchers in computer science (using a locally developed email system and TELENET for access to server).

Mail specification (RFC 733)

Tymshare launches Tymnet, competition for Telenet.

First demonstration of ARPANET/Packet Radio Net/SATNET operation of Internet protocols with BBN-supplied gateways in July

1979 - Meeting between Univ of Wisconsin, DARPA, NSF, and computer scientists from many universities to establish a Computer Science Department research computer network (organized by Larry Landweber).

USENET established using UUCP between Duke and UNC by Tom Truscott, Jim Ellis, and Steve Bellovin. All original groups were under net.* hierarchy.

ARPA establishes the Internet Configuration Control Board (ICCB)

Packet Radio Network (PRNET) experiment starts with DARPA funding. Most communications take place between mobile vans. ARPANET connection via SRI.

1981 - BITNET, the "Because It's Time NETwork" started as a cooperative network at the City University of New York, with the first connection to Yale.

Provides electronic mail and listserv servers to distribute information, as well as file transfers.

CSNET (Computer Science NETwork) built by a collaboration of computer scientists and Univ of Delaware, Purdue Univ, Univ of Wisconsin, RAND Corporation and BBN through seed money granted by NSF to provide networking services (especially email) to university scientists with no access to ARPANET. CSNET later becomes known as the Computer and Science Network.

1982 - DCA and ARPA establish the Transmission Control Protocol (TCP) and Internet Protocol (IP), commonly known as TCP/IP, as the protocol suite for ARPANET. This leads to one of the first definitions of an "internet" as a connected set of networks, specifically those using TCP/IP, and of the "Internet" as connected TCP/IP internets. DoD declares the TCP/IP suite to be standard for DoD

1983 - Name server developed at Univ of Wisconsin, no longer requiring users to know the exact path to other systems.

Cutover from NCP to TCP/IP (1 January)

CSNET / ARPANET gateway put in place

ARPANET split into ARPANET and MILNET; the latter became integrated with the Defense Data Network created the previous year.

Desktop workstations come into being, many with Berkeley UNIX which includes IP networking software.

Berkeley releases 4.2BSD incorporating TCP/IP, with much of the programming done by Bill Joy

1984 - Domain Name System (DNS) introduced.

Number of hosts breaks 1,000

Moderated newsgroups introduced on USENET (mod.*)

George Orwell's prophecy of the universal loss of individual rights doesn't come true.

1985 - Whole Earth 'Lectronic Link (WELL), operated by Stewart Brand on his houseboat, is open for calls.

On March 15th, Symbolics.com is assigned the first registered domain. Other firsts: cmu.edu, purdue.edu, rice.edu, ucla.edu (April); css.gov (June); mitre.org, .uk (July)

100 years to the day of the last spike being driven on the cross-Canada railroad, the last Canadian university is connected to NetNorth in a one year effort to have coast-to-coast connectivity.

 1986 - NSFNET created (backbone speed of 56Kbps)

NSF establishes 5 super-computing centers to provide high-computing power for all (JVNC@Princeton, PSC@Pittsburgh, SDSC@UCSD, NCSA@UIUC, Theory Center@Cornell). This allows an explosion of connections, especially from universities.

NSF-funded SDSCNET, JVNCNET, SURANET, and NYSERNET operational

Internet Engineering Task Force (IETF) and Internet Research Task Force (IRTF) come into existence under the IAB. First IETF meeting held in January at Linkabit in San Diego

The first Freenet (Cleveland) comes on-line 16 July under the auspices of the Society for Public Access Computing (SoPAC). Later Freenet program management assumed by the National Public Telecomputing Network (NPTN) in 1989

Network News Transfer Protocol (NNTP) designed to enhance Usenet news performance over TCP/IP.

1987 - Number of hosts breaks 10,000

NSF signs a cooperative agreement with Merit Network, Inc. to manage the NSFNET backbone; IBM and MCI involvement was through an agreement with Merit. Merit, IBM, and MCI later founded ANS.

UUNET is founded with Usenix funds to provide commercial UUCP and Usenet access. Originally an experiment by Rick Adams and Mike O'Dell

 1988 - 2 November - Internet worm burrows through the Net, affecting ~6,000 of the 60,000 hosts on the Internet

CERT (Computer Emergency Response Team) formed by DARPA in response to the needs exhibited during the Morris worm incident. The worm is the subject of the only advisory issued this year.

DoD chooses to adopt OSI and sees use of TCP/IP as an interim. US Government OSI Profile (GOSIP) defines the set of protocols to be supported by Government purchased products

NSFNET backbone upgraded to T1 (1.544Mbps)

CERFnet (California Education and Research Federation network) founded by Susan Estrada, named after Vint Cerf

Internet Relay Chat (IRC) developed by Jarkko Oikarinen

FidoNet gets connected to the Net, enabling the exchange of e-mail and news

1989 - Number of hosts breaks 100,000

First relays between a commercial electronic mail carrier and the Internet: MCI Mail through the Corporation for the National Research Initiative (CNRI), and Compuserve through Ohio State Univ.

First Interop conference in San Jose, CA, created to promote the use of TCP/IP packet-switched networking

Countries connecting to NSFNET: Australia (AU), Germany (DE), Israel (IL), Italy (IT), Japan (JP), Mexico (MX),Netherlands (NL), New Zealand (NZ), Puerto Rico (PR), United Kingdom (UK) 

1990 - ARPANET ceases to exist

Electronic Frontier Foundation (EFF) is founded by Mitch Kapor and John Perry Barlow

Archie released by Peter Deutsch, Alan Emtage, and Bill Heelan at McGill

Hytelnet released by Peter Scott (Univ of Saskatchewan)

The World comes on-line (world.std.com), becoming the first commercial provider of Internet dial-up access

ISO Development Environment (ISODE) developed to provide an approach for OSI migration for the DoD. ISODE software allows OSI applications to operate over TCP/IP

Countries connecting to NSFNET: Argentina (AR), Austria (AT), Belgium (BE), Brazil (BR), Chile (CL), Greece (GR), India (IN), Ireland (IE), Korea (KR), Spain (ES), Switzerland (CH)

1991 - Gopher released by Paul Lindner and Mark P. McCahill from the Univ of Minnesota

World-Wide Web (WWW) released by CERN, developed by Tim Berners-Lee

PGP (Pretty Good Privacy) released by Philip Zimmermann

NSFNET backbone upgraded to T3 (44.736Mbps)

NSFNET traffic passes 1 trillion bytes/month and 10 billion packets/month

Defense Data Network NIC contract awarded by DISA to Government Systems Inc., which takes over from SRI in May

1992 - Internet Society (ISOC) is chartered

Number of hosts breaks 1,000,000

Veronica, a gopherspace search tool, is released by Univ of Nevada

The term "Surfing the Internet" is coined by Jean Armour Polly

1993 - InterNIC created by NSF to provide specific Internet services:

• directory and database services (AT&T)

• registration services (Network Solutions Inc.)

• information services (General Atomics/CERFnet)

US National Information Infrastructure Act

Mosaic takes the Internet by storm; WWW proliferates at a 341,634% annual growth rate of service traffic.

Gopher's growth is 997%.

1994 - ARPANET/Internet celebrates 25th anniversary

NSFNET traffic passes 10 trillion bytes/month

WWW edges out telnet to become 2nd most popular service on the Net (behind ftp-data) based on % of packets and bytes traffic distribution on NSFNET

1995 - NSFNET reverts to a research network. Main US backbone traffic now routed through interconnected network providers

The new NSFNET is born as NSF establishes the very high speed Backbone Network Service (vBNS) linking super-computing centers: NCAR, NCSA, SDSC, CTC, PSC

RealAudio, an audio streaming technology, lets the Net hear in near real-time

WWW surpasses ftp-data in March as the service with greatest traffic on NSFNET based on packet count, and in April based on byte count

Traditional online dial-up systems (Compuserve, America Online, Prodigy) begin to provide Internet access

A number of Net related companies go public, with Netscape leading the pack with the 3rd largest ever NASDAQ IPO share value (9 August)

Registration of domain names is no longer free. Beginning 14 September, a $50 annual fee is imposed; registration had until then been subsidized by NSF. NSF continues to pay for .edu registration, and on an interim basis for .gov

1996 - Internet phones catch the attention of US telecommunication companies who ask the US Congress to ban the technology (which has been around for years)

MCI upgrades Internet backbone adding ~13,000 ports, bringing the effective speed from 155Mbps to 622Mbps.

The Internet Ad Hoc Committee announces plans to add 7 new generic Top Level Domains (gTLD): .firm, .store, .web, .arts, .rec, .info, .nom. The IAHC plan also calls for a competing group of domain registrars worldwide.

The WWW browser war, fought primarily between Netscape and Microsoft, has ushered in a new age in software development, whereby new releases are made quarterly with the help of Internet users eager to test upcoming (beta) versions.

1997 - 2000th RFC: "Internet Official Protocol Standards"

71,618 mailing lists registered at Liszt, a mailing list directory

The American Registry for Internet Numbers (ARIN) is established to handle administration and registration of IP numbers to the geographical areas currently handled by Network Solutions (InterNIC), starting March 1998.

101,803 Name Servers in whois database

1998 - Netscape releases the source code for its Navigator browser to the public.

Microsoft releases Windows 98. Months later a federal judge orders Microsoft to change its Java virtual machine to pass Sun's Java compatibility test.

Microsoft is taken to court for allegations of anti-trust violations.

http://www.pbs.org/opb/nerds2.0.1/timeline

                         

 

                      Test your knowledge of Internet stuff!

1. How many colors are in a "Web Safe" palette?

2. Name the first graphical web browser.

3. Which came first, ".org" or ".com"?     

4. The first mouse was made out of _______.   

5. In what year did the Queen of England send her first email?     

6. What does "hex" stand for?        

7. Which of the following is not a protocol?     

8. What organization assigns domain names?   

9. What year did Tim Berners-Lee develop the World Wide Web?         

10. What is the previous name for Java?

                       Choose the correct answers:

1. 256; 200; 216

2. Visu text; Netview; Mosaic

3. .org; .com

4. metal; plastic; wood

5. 1993; 1976; 1984

6. hexadom; nothing; hexadecimal

7. TCP/IP; telnet; WWW; ISP

8. InterNIC; W3C; WWWBoard

9. 1989; 1991; 1987

10. Oak; Latte; Power+
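
An aside on answers 1 and 6: the "Web Safe" palette has exactly 216 colors because each of the three RGB channels is limited to six values (00, 33, 66, 99, CC, and FF in hexadecimal, i.e., base-16 notation), so there are 6 × 6 × 6 = 216 combinations. A minimal Python sketch enumerating the palette (the helper name websafe_palette is ours, for illustration only):

    # Each RGB channel of a web-safe color takes one of six hex values.
    LEVELS = (0x00, 0x33, 0x66, 0x99, 0xCC, 0xFF)

    def websafe_palette():
        """Return the 6**3 = 216 web-safe colors as '#RRGGBB' strings."""
        return [f"#{r:02X}{g:02X}{b:02X}"
                for r in LEVELS for g in LEVELS for b in LEVELS]

    palette = websafe_palette()
    print(len(palette))               # -> 216
    print(palette[0], palette[-1])    # -> #000000 #FFFFFF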

                

 

 


 

