CONTENDING “INFORMATION SUPERHIGHWAYS”
Posted on | July 16, 2016 | No Comments
During the 1980s, before the reality of the Internet, a new communications infrastructure was initiated based on digital technologies. Propelled largely by growing demand for new microprocessor-based business services and fueled by the availability of low-grade “junk bonds,” companies like MCI, Tele-Communications Inc. (TCI), Turner Broadcasting, and McCaw Cellular raised over $20 billion to lay fiber optic networks and implement new digital services such as videotext and interactive television.
Soon several modes of telecommunications were competing for the title of “information superhighway,” a popular metaphor for the changes happening to data communications and the potential for expanded telecommunications services such as interactive television. Generally attributed to then Senator Al Gore, the term was co-opted by the Bell companies who finally saw new opportunities coming with the digital revolution. For instance, Bell Atlantic and TCI attempted to form a merger that would offer interactive information services and video-on-demand over both cable and telephone lines.
Although the Internet was already twenty years old, it still had not achieved the technical robustness needed to capture popular and commercial attention. Wireless communication was growing, but was still used primarily by an elite business class due to the lack of dedicated spectrum and wide-scale infrastructure problems. Cable TV was also a contender, with an extensive subscriber base and a coaxial and fiber infrastructure built up during the late 1980s. Satellite was also in the running, with dramatically increased power capabilities resulting from the continuing development of solar power. The ability to efficiently transform sunlight into signal-radiating power allowed smaller and smaller “earth station” antennas to pick up broadcast and narrowcast signals. A longshot was the power utilities, which had developed technology to transmit data along their electrical lines. They lacked installed capacity but had good maintenance teams, billing systems, and ready access to homes and other buildings.
Which technology was going to rise to this status? Despite two decades of existence, the Internet was relatively archaic with no World Wide Web and few high-speed backbone networks. Wireless systems lacked the spectrum or infrastructure for broadband transmission over significant geographic domains. Interactive television was becoming a possibility as the FCC rolled back restrictions on the common carriers providing content, but despite ADSL over copper and fiber-to-the-home, software and content factors proved major limitations.
Interactive consumer services got their start with videotext offerings, but the terminals were large and awkward, and they displayed only textual information. Telephone companies soon began testing richer electronic services over their networks.
Divested from AT&T in the early 1980s and deprived of the lucrative long-distance services, the Regional Bell Operating Companies (RBOCs) such as Ameritech, Bell Atlantic, and BellSouth sought to take advantage of their monopolies over local telecommunications by providing such services as ISDN and interactive television.
By the early 1990s, the Baby Bells were conducting tests using ADSL (Asymmetric Digital Subscriber Line) to provide video over existing copper lines to the home. Disillusioned by the costs of providing fiber to the home, telcos looked to leverage their existing plant. ADSL could send compressed video over the established telephone lines. It was suited to this task because it could send data downstream to the subscriber much faster (256 Kbps-9 Mbps) than upstream (64 Kbps-1.54 Mbps) to the provider.[1]
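The practical effect of that asymmetry is easy to see with some rough arithmetic. Below is a minimal back-of-the-envelope sketch in Python; the rates and payload size are illustrative assumptions drawn from the ranges above, not figures from the original sources:

```python
# Rough sketch: why ADSL's downstream/upstream asymmetry suited video delivery.
# Rates and payload size are illustrative assumptions, not measured figures.

def transfer_seconds(payload_megabytes: float, rate_kbps: float) -> float:
    """Seconds to move a payload at a given line rate (8 bits per byte)."""
    bits = payload_megabytes * 8 * 1_000_000
    return bits / (rate_kbps * 1_000)

downstream_kbps = 6_000   # a mid-range downstream rate from the span above
upstream_kbps = 640       # an assumed upstream rate; carries only requests

one_minute_of_video_mb = 45.0  # ~1 minute of compressed video at ~6 Mbps

print(f"Downstream: {transfer_seconds(one_minute_of_video_mb, downstream_kbps):6.0f} s")
print(f"Upstream:   {transfer_seconds(one_minute_of_video_mb, upstream_kbps):6.0f} s")
# The subscriber's requests are tiny; the heavy video traffic flows the fast way.
```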
The Bell telcos were also intensively lobbying Washington DC to create a favorable regulatory environment, specifically trying to derail the Cable Communications Policy Act of 1984 that excluded them from offering television services. Bell Atlantic even attempted to merge with cable TV giant TCI in anticipation of their control of the new information highways.
The Cable Communications Policy Act of 1984 triggered strong concerns that the cable TV industry was becoming too powerful in relation to other parts of the communications/media industry. Growing horizontal and vertical integration, as well as a subscriber base encompassing over 60% of American homes, threatened the telecommunications companies, which began to press their own claim to the household imagination.[2]
The information superhighway, as envisioned by the Bell Atlantic-TCI merger, ran into a roadblock when Congress overrode President Bush’s veto of the 1992 Cable Act. The new rules allowed the FCC under Clinton to lower cable rates; the Commission implemented new econometric models that allowed it to reduce cable TV rates in select markets around the country without reducing cable companies’ revenues. TCI’s stock dropped, and when the FCC announced its new rulings in February 1994, both companies declared that the new regulations had killed their deal.[3]
Well, we know how this story turns out. With its TCP/IP software, the Internet became the world’s “information superhighway.” Its ability to connect differing computers and operating systems had given it unprecedented connectivity in the computer world, and over the course of the 1990s, it became the preferred conduit for communications and netcentric commerce.
Notes
[1] Speed rates on ADSL are listed as their more current rates as provided by Heidi V. Anderson, “Jump-Start Your Internet Connection with DSL,” in How the Internet Works, Part I. SMART COMPUTING REFERENCE SERIES. p. 105
[2] Logue, T. (1990) “Who’s Holding the Phone? Policy Issues in an Age of Transnational Communications Enterprises,” PTC ’90 Pacific Telecommunications Council Annual Conference Proceedings, Honolulu, Hawaii. p. 96.
[3] Hundt, R.E (2000) You Say You Want a Revolution: A Story of Information Age Politics. Yale University Press. pp. 30-34.
© ALL RIGHTS RESERVED

Tags: ADSL > Asynchronous Digital Subscriber Line > Bell Atlantic-TCI merger > Cable Communications Policy Act of 1984 > information superhighway > TCP/IP > Tele-Communications Inc. (TCI)
Digital Games and Meaningful Play
Posted on | June 30, 2016 | No Comments
What is a game? What makes it fun? How can you design a game to provide a meaningful and rewarding experience? Rules of Play: Game Design Fundamentals by Katie Salen Tekinbaş and Eric Zimmerman is a great blend of theory and practical application that helps us understand the importance of “gameplay” – the emotional relationship between player actions and game outcomes. The book helps explain what makes games, from baseball to virtual reality games, effective and meaningful. In this post, I look at some of the key ideas involved in understanding games, using baseball as a primary example.
A game is a structured form of play that involves choices of action – an organized way of making choices, taking action, and experiencing some kind of feedback. In other words, game players take a visible action, and the game responds with information that provides feedback to the player and changes the status of the game. Below is a picture of my daughter playing a virtual reality game of baseball.
The actions and subsequent outcomes need to be discernible – understandable and visible. And they need to be integrated into the game. In baseball, for example, a batter makes a decision to swing at a ball thrown by the pitcher. Several things can happen based on the trajectory of the pitch and the way the batter swings. She can swing and miss, or hit the ball for one of several results: foul ball, base hit, home run, pop out, etc.
The result of the action needs to be evident and contribute to the game. The foul ball is registered as a strike; a base hit can move runners or at least get the batter on base. A home run is an ultimate action in baseball as it adds immediately to the final score of the game.
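A toy sketch can make this action-outcome loop concrete. The following Python fragment is my own illustration, not something from Salen and Zimmerman: it models a simplified at-bat in which each action produces a discernible outcome that updates the game state and feeds information back to the player.

```python
# Toy model of the action -> outcome -> feedback loop, using a simplified at-bat.
game = {"strikes": 0, "outs": 0, "runs": 0}

def swing(outcome: str) -> str:
    """Apply one outcome to the game state and return the feedback message."""
    if outcome == "miss" or (outcome == "foul" and game["strikes"] < 2):
        game["strikes"] += 1              # the foul ball is registered as a strike
        if game["strikes"] == 3:
            game["strikes"] = 0
            game["outs"] += 1             # outs end batters, and eventually innings
            return "Strike three: the batter is out."
        return f"Strike {game['strikes']}."
    if outcome == "foul":
        return "Foul with two strikes: the count is unchanged."
    if outcome == "home run":
        game["strikes"] = 0
        game["runs"] += 1                 # a home run changes the score immediately
        return "Home run!"
    game["strikes"] = 0
    return "Base hit: runners advance."

for action in ["foul", "miss", "home run"]:
    print(swing(action))                  # discernible feedback after every action
```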
The many recognizable actions of a game are one of the reasons “datatainment” is a prominent part of baseball. Hits, home runs, ERA, and strikeouts are all significant acts that can be distinguished and statistically registered on a baseball scoreboard or in a player’s season “stats.” Not only are players evaluated based on these measures, but fans of professional sports often take a keen interest in these numbers as part of an identification process with players. Sports teams look to deepen fan engagement by going beyond box scores to digitally-enabled fantasy sports and other forms of social involvement and entertainment.
Choices and actions change the game and create new meanings. They move the game forward. Strikes end batters; outs end innings. As the game moves forward, new meanings are created. Heroes emerge, a team pulls ahead, a team comes from behind. A good game drives emotional and psychological interest, either through a tribal allegiance to a team, an interest in a player, or a recognition of the stakes of a game, as in a championship such as the World Series. But in every case, the game must have discernible actions that have a meaningful impact on the progress and result of the game.
Citation APA (7th Edition)
Pennings, A.J. (2016, Jun 30). Digital Games and Meaningful Play. apennings.com https://apennings.com/meaningful_play/games-and-meaningful-play/
© ALL RIGHTS RESERVED

Tags: datatainment > games > meaningful play > Rules of Play > Zimmerman
Xanadu to World Wide Web
Posted on | June 11, 2016 | No Comments
Tim Berners-Lee, a British citizen and a software consultant at CERN (Conseil Européen pour la Recherche Nucléaire), developed what came to be known as the World Wide Web (WWW). Located in Switzerland, CERN was Europe’s largest nuclear research institute, although its name was later changed to the European Laboratory for Particle Physics to avoid the stigma attached to nuclear research.
In March of 1989, Berners-Lee proposed a project to create a system for sharing information among CERN’s dispersed High Energy Physics research participants. This information management system would form the basis of the World Wide Web, especially after 1994, when he founded the World Wide Web Consortium (W3C), a standards organization that began to guide the Web’s interoperable technologies with specifications, guidelines, software, and tools for web addresses (URLs), the Hypertext Transfer Protocol (HTTP), and the Hypertext Markup Language (HTML). These technologies allowed web browsers like Mosaic, Netscape, Internet Explorer, and later Firefox and Chrome to access data and display web pages.
Berners-Lee wanted to create a system where information from various sources could be linked and accessed, creating a “pool of human knowledge.” Using a NeXT computer built by Steve Jobs’ post-Apple company, he wrote the prototype for the World Wide Web and a basic text-based browser called Nexus. The NeXT computer had a UNIX operating system, built-in Ethernet, and a version of the Xerox PARC graphical user interface that Jobs had earlier transformed into the Apple Mac. Berners-Lee credited the NeXT computer with having the functionality to speed up the process, saving him perhaps a year of coding.
Dissatisfied with the limitations of the Internet, Berners-Lee developed this new software around the concept of “hypertext,” which Ted Nelson had championed in Computer Lib, a 1974 manifesto about the possibilities of computers. Nelson warned against leaving the future of computing to a priesthood of computer center guardians that served the dictates of the mainframe computer.
Ted Nelson’s Xanadu project allowed him to coin the terms “hypertext” and “hypermedia” as early as 1965. Xanadu is the original name for Kublai Khan’s mythical summer palace, described by the enigmatic Marco Polo. “There is at this place a very fine marble palace, the rooms of which are all gold and painted with figures of men and beasts and birds, and with a variety of trees and flowers, all executed with such exquisite art that you regard them with delight and astonishment.” Nelson strove to transform the computer experience with software and display technology that would make reading and writing an equally rich “Xanadu” experience.
An important transition technology was HyperCard, a computer application that allowed the user to create stacks of connected cards that could be displayed as visual pages on an Apple Macintosh screen. Using a scripting language called HyperTalk, each card could show text, tables, and even images. “Buttons” could be installed on each card that linked it to other cards within the stack with a characteristic “boing” sound clip. Later, images could be turned into buttons. HyperCard missed out on historical significance because of Apple’s “box-centric culture,” according to HyperCard inventor Bill Atkinson. He later lamented, “If I’d grown up in a network-centric culture, like Sun, HyperCard might have been the first Web browser.” [1]
Berners-Lee accessed the first web page on the CERN web server on Christmas Day, 1990. He spent the next year adding content and flying around the world to convince others to use the software. Concerned that a commercial company would copy the software and create a private network, he convinced CERN to release the source code under a general license so that developers could use it freely. One example was a group of students at the University of Illinois at Urbana-Champaign’s National Center for Supercomputing Applications (NCSA), which was part of the NSFNET. Marc Andreessen and other students created the Mosaic browser, which they distributed for free using the Internet’s FTP (File Transfer Protocol). They soon left for Silicon Valley, where they got venture capital to create Netscape, a company designed around their web browser, Netscape Navigator.[2]
Berners-Lee designed the WWW with several features that made it extremely effective.
First, it was based on open systems that allowed it to run on many computing platforms. It was not designed for a specific proprietary technology but rather would allow Apples, PCs, Sun Workstations, etc. to connect and exchange information. Berners-Lee compared it to a market economy where anyone can trade with anyone on a voluntary basis.
Second, it actualized the dream of hypertext, the linking of a “multiverse” of documents on the WWW. While Ted Nelson would criticize its reliance on documents, files, and traditional directories, the “web” would grow rapidly.[3]
Third, it used a hypertext transfer protocol (HTTP) to create a direct connection from the client to the server. With this protocol and taking advantage of packet-switching data communications, the request for a specific document is sent to the server and either the requested document is sent or the client is notified that the document does not exist. The power of this system meant that the connection was closed quickly after the transaction, saving bandwidth and freeing the network for other connections.
Fourth, it worked with the existing Internet infrastructure and integrated many of its basic protocols, including FTP, Telnet, Gopher, e-mail, and News. FTP was particularly important for the distribution of software, including browsers. Newsgroups informed people all around the NSFNET that the technology and associated browsers were available.
Another crucial feature was that content could be created using a relatively easy-to-use markup language called the Hypertext Markup Language (HTML). HTML was a simplified version of another markup language used by large corporations called the Standard Generalized Markup Language (SGML). HTML was geared more towards page layout and format, while SGML was better for document description. Generalized markup describes the document to whatever system it works within. HTML and SGML would form a symbiotic relationship and eventually lead to powerful new languages for e-commerce and other net-centric uses like XML (eXtensible Markup Language) and HTML5.
Finally, Berners-Lee developed the uniform resource locator (URL) as a way of addressing information. The URL gave every file on the WWW, whether it was a text file, an image file, or a multimedia file, a specific address that could be used to request and download it.[4]
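A quick illustration of that addressing scheme (the URL below is a made-up example, not an address from the original Web): Python’s standard library can split an address into the parts the client and server need.

```python
# How a URL names a specific file: scheme, server, and path.
from urllib.parse import urlparse

url = "http://example.com/pictures/photo.gif"  # hypothetical address
parts = urlparse(url)

print(parts.scheme)   # "http" -> which protocol to speak
print(parts.netloc)   # "example.com" -> which server to contact
print(parts.path)     # "/pictures/photo.gif" -> which file to request
```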
Together, these features defined a simple transaction that was the basis of the World Wide Web. In summary, the user or “client” establishes a connection to the server over the packet-switched data network of the Internet. Using the address, or URL, the client issues a request to the server specifying the precise web document to be retrieved. Next, the server responds with a status code and, if available, the content of the information requested. Finally, either the client or the server disconnects the link.
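For the curious, here is a minimal sketch of that client-side transaction, written as HTTP/1.0 over a raw socket so each of the four steps stays visible. The host is a placeholder, and real browsers send many more headers:

```python
import socket

host, path = "example.com", "/"   # placeholder server and document

# 1. The client establishes a connection to the server.
sock = socket.create_connection((host, 80))

# 2. Using the address, the client requests a specific document.
sock.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii"))

# 3. The server responds with a status code and, if available, the content.
response = b""
while chunk := sock.recv(4096):
    response += chunk
print(response.split(b"\r\n", 1)[0].decode())  # e.g. "HTTP/1.0 200 OK"

# 4. The connection is closed as soon as the transaction completes,
#    freeing the network for other connections.
sock.close()
```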
The beauty of the system was that its drain on the Internet was limited. Rather than tying up a whole telecommunications line as a telephone call would do, HTTP allowed the information to be downloaded (or not) and the connection then terminated. The World Wide Web began to allow unprecedented amounts of data to flow through the Internet, changing the world’s economy and its communicative tissue.
In another post I discuss how important hypertext clicks are to the advertising industry through the tracking of various metrics.
Notes
[1] Kahney, L. (2002, August 14). HyperCard: What Could Have Been. Retrieved June 10, 2016, from http://www.wired.com/2002/08/hypercard-what-could-have-been/
[2] Greenemeier, L. (2009, March 12). Remembering the Day the World Wide Web Was Born. Retrieved June 11, 2016, from http://www.scientificamerican.com/article/day-the-web-was-born/
[3] Banks, L. (2011, April 15). Hypertext Creator Says Structure of World Wide Web ‘Completely Wrong.’ Retrieved June 11, 2016. Also Greenemeier, L. (2009, March 12). Remembering the Day the World Wide Web Was Born. Scientific American. Retrieved May 13, 2017.
[4] Richard, E. (1995) “Anatomy of the World Wide Web,” INTERNET WORLD, April. pp. 28-20.
Citation APA (7th Edition)
Pennings, A.J. (2016, Jun 11). From Xanadu to World Wide Web. apennings.com https://apennings.com/how-it-came-to-rule-the-world/xanadu-to-world-wide-web/
© ALL RIGHTS RESERVED

Tags: HTML > HTML5 > HTTP > HyperCard > NeXt Computer > Ted Nelson > Tim Berners-Lee > URL > World Wide Web > Xanadu Project
Cisco Systems: From Campus to the World’s Most Valuable Company, Part One: Stanford University
Posted on | May 24, 2016 | No Comments
Cisco Systems emerged from employees and students at Stanford University in the early 1980s to become the major supplier of the Internet’s enigmatic plumbing. In the process, its stock value increased dramatically, and it became the largest company in the world by market capitalization. Cisco originally produced homemade multi-platform routers to connect campus computers through an Ethernet LAN, and throughout the 1980s it built the networking technology for the National Science Foundation’s NSFNET. As the World Wide Web took off during the 1990s, Cisco helped countries around the world transition their telecommunications systems to Internet protocols. Cisco went public on February 4, 1990, with a valuation of $288 million. In 2000, Cisco Systems was calculated to be the world’s most valuable company, worth $579.1 billion to second-place Microsoft’s $578.2 billion. Microsoft had replaced General Electric’s No. 1 ranking in 1998.
This post will present the early years of Cisco Systems’ development and the creation of networking technology on the Stanford University campus. The next post will discuss the commercialization and success of Cisco Systems as it helped to create the global Internet, first by commercializing multi-protocol routers for local area networks.
Leonard Bosack and Sandra K. Lerner formed Cisco Systems in the early 1980s and were the driving forces of the young company. In the beginning, they each headed different computer centers on campus and were, incidentally (or perhaps consequently), dating. The couple met while managing the computer facilities of different departments at Stanford (Bosack earned a master’s in computer science in 1981; Lerner received a master’s in statistics the same year). The two facilities were located at different corners of the campus, and the couple began working together to link them to each other and to other organizations scattered around the campus. Drawing on work being conducted at Stanford and in Silicon Valley, they developed a multi-protocol router to connect the departments. Bosack and Lerner left Stanford University in December 1984 to launch Cisco Systems.
Robert X. Cringely, author of Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date, interviewed both founders for his PBS video series, Nerds 2.0.1.
Bosack and Lerner happened on their university positions during a very critical time in the development of computer networks. The Stanford Research Institute (SRI) was one of the four original ARPANET nodes, and the campus later received technology from Xerox PARC, particularly the Alto computers and the Aloha Network technology now known as Ethernet.[1] This technology, originally developed at the University of Hawaii to connect different islands, was improved by Robert Metcalfe and other Xerox PARC researchers and granted to Stanford University in late 1979.[2] Ethernet technologies needed routers to network effectively and to interconnect different computers and Ethernet segments.
A DARPA-funded effort at Stanford during the early 1970s involved research to design a new set of computer communication protocols that would allow several different packet networks to be interconnected. In June of 1973, Vinton G. Cerf started work on a novel network protocol with funding from the new IPTO director, Robert Kahn. DARPA was originally interested in supporting command-and-control applications and in creating a flexible network that was robust and could adjust to the changing situations to which military officers are accustomed. In July 1977, initial success led to a sustained effort to develop the Internet protocols known as TCP/IP (Transmission Control Protocol and Internet Protocol). DARPA and the Defense Communications Agency, which had taken over the operational management of the ARPANET, supplied continued funding for the project.[3]
The rapidly growing “Internet” was implementing the new DARPA-mandated TCP/IP protocols, and routers were needed to “route” packets of data to their intended destinations. Every packet of information has an address that helps it find its way through the physical infrastructure of the Internet. Stanford had been one of the original nodes on ARPANET, the first packet-switching network. In late 1980, Bill Yeager was assigned to build a network router as part of the SUMEX (Stanford University Medical Experimental) initiative. Using a PDP-11, he first developed a router and TIP (Terminal Interface Processor). Two years later, he developed a Motorola 68000-based router and TIP using experimental circuit boards that would later become the basis for the workstations sold by SUN Microsystems.[4]
Bosack and Lerner had operational rather than research or teaching jobs. Len Bosack was the Director of Computer Facilities for Stanford’s Department of Computer Science, while Sandy Lerner was Director of Computer Facilities for Stanford’s Graduate School of Business. Despite their fancy titles, they had to run wires, install the protocols, and get the computers and networks working and usable for the university. Bosack had worked for DEC, helping to design the memory management architecture for the PDP-10. At Stanford, many different types of computers (mainframes, minis, and microcomputers) were in demand and used by faculty, staff, and students.
Some 5000 computers were scattered around the campus. The Alto Computer, in particular, was proliferating on campus. Thanks to Ethernet, computers were often connected locally, within a building or a cluster of buildings, but no overall network existed throughout the campus. Bosack, Lerner, and other colleagues such as Ralph Gorin and Kirk Lougheed worked on “hacking together” some of these disparate computers into the multi-million dollar broadband network being built on campus. But it was running into difficulties. They needed to develop “bridges” between local area networks and then crude routers to move packets. At the time, routers were not offered commercially. Eventually, their “guerilla network” became the de facto Stanford University Network (SUNet).
Notes
[1] Stanford networking experiments included those in the AI Lab, at SUMEX, and the Institute for Mathematical Studies in the Social Sciences (IMSSS).
[2] Ethernet Patent No. 4,063,220, Metcalfe et al.
[3] Vinton G. Cerf was involved at Stanford University in developing TCP/IP and later became Senior Vice President, Data Services Division at MCI Telecommunications Corp. His article “Computer Networking: Global Infrastructure for the 21st Century” was published on the www and accessed in June 2003.
[4] Circuit boards for the 6800 TIP developed by Andy Bechtolsheim in the Computer Science Department at Stanford University.
© ALL RIGHTS RESERVED

Tags: Cisco Systems > Leonard Bosack > Sandra K. Lerner > Silicon Valley > SUNet
The NSFNET is the Internet
Posted on | May 20, 2016 | No Comments
An important intermediary in the transition of the military’s ARPANET into the commercial Internet was the National Science Foundation’s NSFNET. The NSFNET adopted TCP/IP and required all connecting nodes to use the protocols, as well as compliant network technology, mainly built by a small California startup company called Cisco. With government funding for advanced scientific and military research, the network expanded rapidly to form the initial Internet. Without the NSFNET and its mandate for TCP/IP, the Internet would have grown differently, probably using the X.25 protocols developed by the phone companies, and would have emerged with significantly less interoperability and diversity of services.
The NSFNET has its origins at the University of Maryland during the 1982-83 school year. The university was looking to connect its campus computers as well as to network with other colleges. It applied to the National Science Foundation (NSF) for funding but found the agency organizationally unprepared for such a request. In response, the NSF set up the Division of Networking and Computing Research Infrastructure to help allocate resources for such projects. The Southeastern Universities Research Association Network, or SURANET, adopted the newly sanctioned TCP/IP protocols, connecting the University of Maryland to other universities. It set a precedent, and nearly two years into the project, SURANET connected to IBM at Raleigh-Durham, North Carolina.
The National Science Foundation (NSF) was formed during the 1950s, before computer science emerged as a distinct discipline. It first established areas of research in biology, chemistry, mathematics, and physics before becoming a significant supporter of computing activities. It initially encouraged the use of computers in each of these fields, setting up university computer centers in the mid-1950s that would be available to all researchers, and in 1962 it created its first computing science program within its Mathematical Sciences Division. In 1968, an Office of Computing Activities began subsidizing computer networking, funding some 30 regional centers to help universities make more efficient use of scarce computer resources and timesharing capabilities. The NSF worked to “make sure that elite schools would not be the only ones to benefit from computers.”[1]
During the early 1980s, the NSF started to plan its own national network. In 1984, a year after TCP/IP was institutionalized by the military, the NSF created the Office of Advanced Scientific Computing, whose mandate was to create several supercomputing centers around the US.[2] Over the next year, five centers were funded by the NSF.
- General Atomics — San Diego Supercomputer Center, SDSC
- University of Illinois at Urbana-Champaign — National Center for Supercomputing Applications, NCSA
- Carnegie Mellon University — Pittsburgh Supercomputer Center, PSC
- Cornell University — Cornell Theory Center, CTC
- Princeton University — John von Neumann National Supercomputer Center, JvNC
However, it soon became apparent that they would not adequately serve the scientific community.
In 1986, Al Gore sponsored the Supercomputer Network Study Act to explore the possibilities of high-speed fiber optics linking the nation’s supercomputers. These were much needed for President Reagan’s “Star Wars” Strategic Defense Initiative (SDI), as well as for competing against the Japanese electronics juggernaut and its “Fifth Generation” artificial intelligence project.
Although the Supercomputer Network Study Act of 1986 never passed, it stimulated interest in the area, and as a result the NSF formulated a strategy to assume leadership responsibilities for the network systems that ARPA had previously championed. It took two steps to make networking more accessible. First, it convinced DARPA to expand its packet-switched network to the new centers. Second, it funded universities that had interests in connecting with the supercomputing facilities. In doing so, it also mandated specific communications protocols and specialized routing equipment configurations. It was this move that standardized the specific set of data communication protocols and drove the rapid spread of the Internet as universities around the country, and then around the world, connected. Just as the military had ordered the implementation of Vint Cerf’s TCP/IP protocols in 1982, the NSF directives standardized networking in the participating universities. All who wanted to connect to the NSF network had to buy routers (mainly built by Cisco) and other TCP/IP-compliant networking equipment.
The NSF funded a long-haul backbone network called NSFNET in 1986, with a data speed of 56 Kbps (upgraded to a T1, or 1.5 Mbps, the following year) to connect its supercomputing nodes. It also offered to let other interested universities connect. The network became very popular, not because of its supercomputing connectivity, but because of its electronic mail, file transfer protocols, and newsgroups. It was the technological simplicity of TCP/IP that made it sprout exponentially over the next few years.[3]
Unable to manage the technological demands of its growth, the NSF signed a cooperative agreement in November 1987 with IBM, MCI, and Merit Network, Inc. to manage the NSFNET backbone. By June of the next year, they had expanded the backbone network to 13 cities and developed a modern control center in Ann Arbor, Michigan. Soon it grew to over 170 nodes, with traffic expanding at a rate of 15% a month. In response to this demand, the NSF exercised a clause in its five-year agreement to implement a newer state-of-the-art network with faster speeds and more capacity. The three companies formed Advanced Network & Services Inc. (ANS), a non-profit organization, to provide a national high-speed network.
Additional funding by the High Performance Computing Act of 1991 helped expand the NSFNET into the Internet. By the end of 1991, ANS had created new links operating at T-3 speeds. T-3 traffic moves at up to 45 Mbps, and over the next year ANS replaced the entire T-1 NSFNET with new linkages capable of transferring the equivalent of 1,400 pages of single-spaced typed text a second. The funding allowed the University of Illinois at Urbana-Champaign’s NCSA (the National Center for Supercomputing Applications) to support graduate students at $6.85 an hour. A group including Marc Andreessen developed a software application called Mosaic for displaying words and images. Mosaic was the prototype for popular web browsers such as Chrome and Internet Explorer.
The NSFNET soon connected over 700 colleges and universities as well as nearly all federally funded research. High schools, libraries, community colleges and other educational institutions were also joining up. By 1993, it also had links to over 75 other countries.[4]
Pressures had been building to allow commercial activities on the Internet, but the NSF had strict regulations against for-profit activities on its network facilities. During the 1980s, the network was subject to the NSF acceptable use policy, including restrictions on commercial uses of the outcomes of NSF research. Congressman Rick Boucher (D-Virginia) introduced legislation on June 9th, 1992, that allowed commercial activities on the NSFNET.[5] In one of his last executive acts, President Bush finally allowed business to be conducted over its networks and those being built around it. Several months into its newly liberalized status, the NSFNET completed its transition to the upgraded T3 (45 Mbps) backbone – much, much faster than its original 56 Kbps speed.
The legacy of the NSFNET is that it ensured the proliferation of TCP/IP technologies, protocols, and associated hardware. These systems were designed as an open architecture that accepts any computer and connects to any network. The architecture was also minimalist, requiring little from the computer, and neutral toward the applications (e-mail, browsers, FTP) and content running on the network.
Although these are idealistic principles and not always followed in practice, they were largely responsible for unlocking the tremendous economic growth of the Internet age. For example, Marc Andreessen and some of his colleagues soon left the University of Illinois at Urbana-Champaign and formed Netscape to market their “browser.” Their IPO in 1995 helped spark the massive investments in the Internet that characterized the 1990s and the rise of the “dot-coms.”
Notes
[1] Janet Abbate’s (2000) Inventing the Internet from MIT Press is a classic exploration of the history of the Internet, p. 192.
[2] Abbate, p. 191.
[3] Kahin, B. (ed.) (1992) Building Information Infrastructure: Issues in the Development of a National Research and Education Network. McGraw-Hill Primis, Inc. This book contains a series of papers and appendixes giving an excellent overview of the discussion and legislation leading to the NREN.
[4] Information obtained from Merit, December 1992
[5] Segaller, S. (1998) Nerds 2.0.1: A Brief History of the Internet. pp. 297-306.
© ALL RIGHTS RESERVED

Tags: Advanced Network & Services Inc. > Al Gore > ANS > Congressman Rick Boucher > High Performance Computing Act of 1991 > Marc Andresson > Netscape > NSFNET > Supercomputer Network Study Act of 1986 > TCP/IP > The National Science Foundation > University of Illinois at Urbana Champaign’s NCSA (the National Center for Supercomputing Applications) > Vint Cerf’s TCP/IP protocols
We shape our buildings and afterwards our buildings shape us
Posted on | May 9, 2016 | No Comments
One of Marshall McLuhan’s most celebrated intellectual “probes” was a paraphrase of Winston Churchill’s famous “We shape our buildings, and afterwards our buildings shape us.” Churchill was addressing Parliament some two years after a devastating Nazi air raid destroyed the House of Commons and was arguing for its restoration, despite the major challenges of the war.
Churchill’s famous line was paraphrased in the 1960s as the more topical “We shape our tools and thereafter our tools shape us,” and was included in McLuhan’s classic recording The Medium is the Massage. With the diffusion of the television and the transistor radio, it was a time when electronic media were exploding in the American consciousness. McLuhan and others were committed to understanding the role of technology, particularly electronic media, in modern society.
The revised quote is often attributed to McLuhan, but it was actually reworded by his colleague, John M. Culkin. Culkin was responsible for inviting McLuhan to Fordham University for a year and subsequently greatly increasing his popularity in the US. Culkin later formed the Center for Understanding Media at Antioch College and started a master’s program to study media. Named after McLuhan’s famous book Understanding Media, the center later moved to the New School for Social Research in New York City after Culkin joined their faculty.
The probe/quote serves in my work to help analyze information technologies (IT), including communications and media technologies (ICT). It provides frames for inquiring into the forces that shaped ICT, while simultaneously examining how these technologies have shaped economic, social and political events and change. IT or ICT means many things but is meant here to traverse the historical chasm between technologies that run organizations and processes and those that educate, entertain and mobilize. This combination is crucial for developing a rich analysis of how information and communications technologies have become powerful forces in their own right.
My concern has to do with technology and its transformative relationship with society and institutions, in particular the reciprocal effects between technology and power. Majid Tehranian’s “technostructuralist” perspective was instructive because it examined information machines in the context of the structures of power that conceptualize, design, fund, and utilize these technologies. In Technologies of Power (1990), he compared this stance to a somewhat opposite one, a “techno-neutralist” position – the view that technologies are essentially neutral and their consequences are only a result of human agency.
In my series How IT Came to Rule the World, I examine the historical forces that shaped and continue to shape information and related technologies.
© ALL RIGHTS RESERVED

Digital Spreadsheets – Part 5 – Ease and Epistemology
Posted on | May 6, 2016 | No Comments
To pick up the story, I started this analysis of the spreadsheet, looking at the emergence of Lotus 1-2-3 within the context of the 1980s. This examination included the importance of the personal computer (PC) and the Reagan Revolution, which was characterized by the commercialization of Cold War technologies, the globalization of capital, and the increasing financialization of individual and organizational life. Part 2 on “spreadsheet capitalism” showed how spreadsheets came to be used by a new breed of analysts for leveraged buyouts (LBOs) and other financial manipulations. The PC-enabled spreadsheet soon became a fixture of modern organizational practice and management because of its ability to quickly store moderate amounts of data, make formulaic calculations and associated projections, and present this newly manufactured information in intelligible visual formats.
Part 3 of the story introduced the formal analysis of spreadsheets, examining how this new technology incorporates other media, starting with zero and the Indo-Arabic numerals. The base-10 decimal system, within a positional place-value notation, gave the zero-based numerical system an unprecedented calculative ability. Other representational systems covered in Part 4 include alphabetical writing and list technology. Below, I introduce the importance of the table as an epistemological technology that is empowered by the computerized spreadsheet and is part of a system of knowledge production that has transformed modern forms of bureaucratic and corporate organization.
Bonnie A. Nardi and James R. Miller of Hewlett-Packard Laboratories noted that the effectiveness of a spreadsheet program derives from two major properties: its ability to provide low-level calculation and computational capabilities to lightly trained users and its “table-oriented interface” that displays and organizes information in a perceptual geometric space that shows categories and relationships.[1]
The columnar spreadsheet had an important, if undistinguished, history as the bedrock of the accounting and bookkeeping professions. It became universally more relevant when it was programmed into the first electronic spreadsheet for the Apple II. VisiCalc was created when its co-inventor, Dan Bricklin, needed to make tedious calculations for an MBA course he was taking.
As its name suggests, VisiCalc combines the two most important components of the electronic spreadsheet – visual display and calculative ability. Tables are both communicative and analytical technologies that provide familiar ways to convey relatively simple or complex information as well as structure data so that new dimensional relationships are produced. These became particularly useful for accounting and sales, as well as finance.[1]
The interactivity of the personal computer keyboard and screen was a prominent advantage in the growing popularity of the electronic spreadsheet. It provided immediate feedback and reduced the learning curve of spreadsheet applications like VisiCalc and, later, Lotus 1-2-3 for the IBM PC and its clones. Getting immediate feedback, rather than waiting hours or days to pick up printouts from the computer center, made PC spreadsheets much more valuable than results produced on a mainframe or mini. PC users could quickly isolate individual cells and input information while applying formulas ranging from simple mathematical calculations to more complex statistical analyses.
Later, the graphical user interface and the mouse made the Apple Mac and Windows-based PCs even more interactive and user-friendly. Initially designed for the Apple Mac, Microsoft Excel emerged as the dominant spreadsheet program. Drop-down menus that could be accessed by the user with a mouse-controlled cursor provided instructions and options to help use the spreadsheet. Users could quickly learn how to complete various “high-level, task-specific functions” such as adding the values of a range of cells or finding the statistical median in a selected range within a column. More about this in the next post on formulas.
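A stripped-down sketch can show what such a “high-level, task-specific function” does under the hood. The grid, cell names, and range syntax below are my own simplification of the generic spreadsheet model, not any particular product’s internals:

```python
# A tiny model of a spreadsheet grid with two task-specific functions.
from statistics import median

# Column A holds a short series of values (say, monthly sales).
cells = {"A1": 120.0, "A2": 95.0, "A3": 143.0, "A4": 110.0}

def cell_range(start: str, end: str) -> list[float]:
    """Expand a single-column range reference like A1:A4 into its values."""
    col, first, last = start[0], int(start[1:]), int(end[1:])
    return [cells[f"{col}{row}"] for row in range(first, last + 1)]

# The spreadsheet equivalents: =SUM(A1:A4) and =MEDIAN(A1:A4)
print("SUM(A1:A4)    =", sum(cell_range("A1", "A4")))     # 468.0
print("MEDIAN(A1:A4) =", median(cell_range("A1", "A4")))  # 115.0
```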
Nardi and Miller also pointed to the visual aspects of the spreadsheet. The table combines an easily viewable display format in a simple geometric space. The tabular grid of the computer spreadsheet interface provides a powerful visual format for entering, viewing, and structuring data in meaningful ways. Users could view large quantities of data on a single screen without scrolling. Relationships between variables become discernible due to the table display. This grid facilitates pattern recognition and the ability to isolate specific items. Organized around the intersection of rows and columns at a point called the “cell,” spreadsheets provide relatively simple but critical computational and display solutions for information work.
Tables draw deeply on the meaning-creating capabilities of the list, an integral component of the spreadsheet. The list was one of the first uses of writing and an ancient technology of organization and management for the palace and military. It is a visual technology that produces a verbless and decontextualized organization of concepts or things that creates boundaries between categories and thus organizes knowledge in new ways. In the spreadsheet, lists form the rows and columns of the table. They can isolate individual items or connect them visually to other items, creating new categories and taxonomies.[2] The synergy of this new formulation in the spreadsheet, a conjunction of vertical lists or “columns” with horizontal lists called “rows” creates a powerful new intellectual dynamism for tracking resources and assigning associated values.
Lists and tables are “remediated” in the spreadsheet.[3] In the tradition of Marshall McLuhan, new media can be seen as incorporating and recombining previous media. In other words, the list takes on new significance in the spreadsheet as it is re-purposed horizontally as well as vertically to create the tabular format. The term remediate comes from the Latin remederi and means “to heal.” The list is “healed” as it becomes part of the table because the table attempts to provide more authentic access to reality. In the table, additional dimensions to a list are created, offering new lists and displaying the possibilities of new relationships. The history of media, including spreadsheets, is one of attempting to develop more advanced systems of representation through the process of remediation, striving to provide new and better ways to view and understand the world.
The table acts as a problem-solving medium, a cognitive tool that classifies and structures data and offers ways to analyze it. The rows and columns provide the parameters of the problem, and individual cells invite input and interpretation. The table format also suggests connections, dependencies, and relations between different sets of data. The table makes it easy to discern categories represented in the vertical and horizontal dimensions, scan for individual data values, and get a sense of the range of values and other patterns, such as a rough average.
To recap, the formal analysis of the spreadsheet focuses on its various components and how they combine to give it its extraordinary capabilities. The spreadsheet has emerged as both a personal productivity tool and a vehicle for group collaboration. Spreadsheets produce representational frameworks for garnering and organizing labor and material resources. Numerical innovation stemming from the adoption of zero and other systems of mathematical representation has made the computerized spreadsheet a central component of modern life. It has also helped establish the centrality of financial knowledge as the working philosophy of contemporary organizations.
In this post, I discuss how spreadsheets, in conjunction with other forms of financial and managerial representation, not only produce the ongoing flows of knowledge that run the modern economy but have created a technologically enhanced epistemology for garnering resources, registering value, and allocating wealth.[4]
Citation APA (7th Edition)
Pennings, A.J. (2016, May 6) Digital Spreadsheets – Part 5 – Ease and Epistemology. apennings.com https://apennings.com/meaning-makers/digital-spreadsheets-power-over-people-and-resources/
Notes
[1] Nardi, B.A., & Miller, J.R. (1990). “The Spreadsheet Interface: A Basis for End-user Programming.” In D. Diaper et al. (Eds.), Human-Computer Interaction: INTERACT ’90. Amsterdam: North-Holland.
[2] Jack Goody’s (1984) Writing and the Organization of Society is informative on the use of the list as a historical management tool.
[3] Bolter, J. and Grusin R. (2000) Remediation: Understanding New Media. MIT Press.
[4] Hood, John. “Capitalism and the Zero,” The Freeman: Foundation for Economic Education. N.p., 01 Dec. 2000. Web. 10 Jan. 2015.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor at the Department of Technology and Society at the State University of New York (SUNY) in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii in the 1990s.
Tags: #spreadsheetsaga > digital spreadsheet > financial epistemologies > Lotus 1-2-3 > Microsoft Excel > spreadsheet capitalism > tables > VisiCalc
Black Friday and Thomas Edison’s Stock Ticker
Posted on | April 22, 2016 | No Comments
This post continues the story about gold and the greenback dollar and how their trading dynamics led to the invention of early electro-mechanical technology to indicate financial prices. During the height of speculative activity in 1869, a young telegrapher, Thomas Edison, came to Wall Street and made a fortune. He didn’t make it on Wall Street, but rather he made it by helping to make “Wall Street,” the global financial phenomenon.
The early life of Thomas Alva Edison provides a useful index of the times. It can be argued that Edison would probably not have risen to prominence without the financial turmoil of the late 1860s. Edison happened to be in New York during the famous gold speculation of 1869 that resulted in the “Black Friday” crash. The crisis was precipitated by the actions of President Grant, who wanted to quash the attempts by two major speculators to corner the market for gold. Edison’s technical expertise as a skilled telegrapher and amateur electrical engineer proved timely during the financial crash, and he was soon given a job that would alter the course of his life.[1]
Edison was born on February 11, 1847, in Milan, Ohio, one of the busiest ports in the world due to the wheat trade. This commerce also made it an important railroad center, and Edison soon obtained a job hawking candies and other foods on the Grand Trunk Railroad. When he, by chance, saved a young boy from a near-certain train death, the boy’s father taught him to be a telegraph operator. At the age of 15, he got his first telegraph job, primarily decoding the news. He even printed and sold his own newspaper for a while.
Edison traveled to various cities working as a telegraph operator, making his way eventually to Boston, considered the hub of cultural and technological interchange at the time. Although working for Western Union, Edison “moonlighted” extensively and even attended lectures at Boston Tech, now known as MIT. He invented and patented a “vote-counter” for the Massachusetts legislature, but it interfered with the negotiating tactics of the minority and was never purchased. In 1869, after 17 months in Boston, Edison quit his job at Western Union, borrowed money from a friend, and took a steamship to New York City.[2]
At the time, New York City was buzzing with financial speculation, as the gold-greenback relationship was quite volatile. The Funding Act of 1866 had required pulling the paper money out of the US economy, but a recession in 1867 had dampened congressional enthusiasm for the move. The North still had a huge debt and wanted to pay off its war bonds in greenbacks that were not specified in gold. The former Union general and new president, Ulysses S. Grant, made his first act as national executive the promise to pay off U.S. bonds in “gold or its equivalent” and later redeem the greenbacks in gold. The precious metal’s value soon dropped to $130 an ounce, a low not seen since the beginning of the war.
Jay Gould and, later, James Fisk, perhaps representing the worst of the “Robber Baron” era, were heavily involved in gold speculation leading up to the events of the infamous 1869 “Black Friday.” Jay Gould was a country-store clerk who took his savings of $5,000 and became a financial speculator, eventually defeating Cornelius Vanderbilt and taking control of the Erie Railroad, New York City’s railroads, and, for a time, the telegraph giant Western Union. James Fisk worked for a circus in his youth but made a fortune selling cotton from the South to Northern firms. He also sold Confederate war bonds to the British during the Civil War. He teamed up with Gould and Daniel Drew to take control of the Erie Railroad and afterward manipulated its stock to the downfall of the gigantic railroad (and Cornelius Vanderbilt), while enriching his own bank account by millions. During the summer of 1869, Gould and Fisk set up their scam to corner the gold market.[3]
Edison was in quite a desperate situation when he searched for work in New York’s financial district. He applied almost immediately at the Western Union office but was forced to wait several days for a reply. While literally begging for food and sleeping in basements, he was also carefully surveying the new uses of the telegraph on Wall Street. It was a time of much speculation as the word was spreading that Jay Gould and James Fisk were attempting to rig the gold market. They were buying up gold and thereby reducing the supply available. Edison had acquaintances at the Gold and Stock Telegraph Company who let him sleep on a cot in the basement and watch the increasing financial trading and expanding bubble.
He was hanging out at the office when panic struck due to an equipment failure. Hundreds of “pad shovers” converged on the office, complaining that their broker offices’ gold tickers had stopped working. Within a few minutes, over three hundred boys crowded the office. The man in charge panicked and lost his concentration, especially after Dr. S.S. Laws appeared. Laws was President of the Gold Exchange and had also invented the device that displayed the price of gold. Edison made his way through the crowd and announced that he could solve the problem, having studied the machine over the previous few days. Edison later claimed that Laws shouted, “Fix it, Fix it, be quick!” [4]
Edison soon discovered that one of the contact springs had fallen and lodged itself within two gears. He removed the spring, “set the contact wheels to zero,” and sent out the company’s repairmen to reset each subscriber’s ticker. Within a few hours, the system was running normally again. As a reward, Dr. Laws offered him a job managing the entire plant for $300 a month, about twice the salary of a good electrical mechanic at the time. Edison took the job and continued to tinker with several technologies, particularly the stock ticker and a “quadruplex transmitter” for telegraphy that could send two messages each way simultaneously.
While Edison was in New York, gold speculation increased. A crucial detail was whether the government was going to release its gold holdings and drive the price down. The Federal government was a significant holder of gold relative to the total market, and it was essential to the speculating duo’s plan that it refrain from selling off a substantial amount of its reserves. General U.S. Grant, the North’s Civil War hero, had run for the Republican nomination and the Presidency on a hard-money stance. After his inauguration in March 1869, he continued to indicate he was not likely to sell off the nation’s gold. President Grant’s brother-in-law, Abel Rathbone Corbin, arranged a meeting for Gould and Fisk with the President during one of his visits to New York. (Corbin may have been involved earlier in the scheme; Grant expert Joe Rose believes he may even have approached Gould very early.) They tried to convince Grant that higher gold prices would benefit the country. Although Grant refused to indicate his plans, the scheming duo told the press that the President was not planning to sell gold. In early September, they began to increase their purchases substantially.
The infamous Black Friday came on September 24, 1869. Edison was operating a Laws Gold Indicator in a booth overlooking the Gold Exchange floor as prices fluctuated and the volume of trades grew heavy. The price of gold hovered between US$160 and $162 during the early hours of the day, with Fisk working the floor and claiming gold would reach $200 by the day’s end. At noon, Grant allowed his Secretary of the Treasury, George S. Boutwell, to sell $4 million in federal gold reserves to undermine the scheme of Gould and Fisk. When the news hit the Gold Room, the price fell within minutes to $133. Edison and the exchange staff desperately tried to keep the indicator’s wheels moving, as it acted much like a modern automobile’s odometer. They even added a weight to it, but the technology could not keep up as panic hit the streets, and many people were wiped out financially.
Edison benefitted from the whole affair. Besides his $300-a-month job, he was encouraged and supported in improving the stock ticker and related telegraph transmission equipment. As a result, he rose from near starvation to being able to send money home to his parents and begin building his invention empire. Edison would often recount the time as the most euphoric of his life because he “had been suddenly delivered from abject poverty to prosperity.”
Citation APA (7th Edition)
Pennings, A.J. (2016, Apr 22). Black Friday and Thomas Edison’s Stock Ticker. apennings.com https://apennings.com/telegraphic-political-economy/black-friday-and-thomas-edisons-stock-ticker/
Notes
[1] Edison’s timely circumstances on Wall Street from Edison: His Life and Inventions by Frank Lewis Dyer and Thomas Commerford Martin.
[2] According to an article published in 1901 in the Electrical World and Engineer by Edward A. Calahan, Edison’s cotton ticker was only a partial success. It was often replaced by one invented by G.M. Phelps with superior workmanship, speed, and accuracy. Its sales suffered from it being more expensive and delicate, which may account for its limited use. The article was written by the original inventor of the stock ticker and may not have been unbiased.
[3] A good account of Black Friday events appears on the New York Times website section “On This Day”. See http://www.nytimes.com/learning/general/onthisday/harp/1016.htm. Accessed on 2/27/15.
[4] For some background and an overview of related technology, see Introduction to Financial Technology by Roy S. Freedman
[5] The stocktickercompany website is a useful source for the history of the stock indicators and ticker-tape machines.
[6] Information on Gould, Eckert and Edison from Conot’s Thomas A. Edison: A Streak of Luck. pp. 65-69.
[7] It has been difficult to trace the exact timing of Edison’s activities at the time. Ultimately, I decided to follow the patents.
[8] http://heartbeatofwallstreet.com/PDF-Documentation/TheIllustratedHistoryofStockTickers.pdf
© ALL RIGHTS RESERVED

Tags: Black Friday > cotton instrument > screw-thread unison device > Thomas Edison > Universal Stock Printer > Universal Stock Ticker > Western Union