Anthony J. Pennings, PhD


ARPA and the Formation of the Modern Computer Industry, Part 2: Memex, Personal Computing, and the NSF

Posted on | September 26, 2021 | No Comments

With World War II winding down, President Roosevelt asked Vannevar Bush, his “czar” of all federally funded scientific research, for a set of recommendations on applying the lessons learned during the war. The President was particularly interested in how the scientific and technological advances achieved in the war effort could address issues like national employment, the creation of new industries, and the health of the nation’s population. This post looks at Bush’s contribution to the formation of the National Science Foundation and the computing developments that led to interactivity and networking.

Bush managed the Office of Scientific Research and Development under Roosevelt and later became President Truman’s science advisor. While not actually stationed in New Mexico, Bush had the overall supervisory responsibility for building the first atomic bomb and was uniquely positioned to understand the new technologies and scientific advances coming out of the war. As a result, Bush took up the President’s challenge and wrote two articles that would have far-reaching consequences.

Bush’s articles provided the rationale for funding a wide range of technological and scientific activities and inspired a new generation of researchers. His first written response, “Science, the Endless Frontier,” led to the formation of the National Science Foundation (NSF). The NSF provided widespread funding for scientific and technological research throughout the country’s universities and research institutes.[1] In the mid-1980s, it would take control of the non-military part of the ARPANET and link up supercomputing centers in response to the Japanese economic-technological threat. The NSFNET, as it was called, would also standardize the TCP/IP protocols and lead to the modern Internet.

Bush’s second response, “As We May Think,” was published in the Atlantic Monthly in July 1945, just a few months after Hitler committed suicide and a month before the atomic bombs Bush helped develop were dropped on Japan. The article received lukewarm attention at first, but it persisted and inspired many people, including J.C.R. Licklider, who pursued a vision of computing interactivity and communications based on Bush’s ideas.

In a war that increasingly turned to science and technology to provide the ultimate advantage, Bush’s responsibilities were crucial to the outcome of World War II. These burdens also placed him in the troublesome position of needing to read, digest, and organize an enormous amount of new scientific information. This responsibility led him to develop and promote the idea of an information technology device known as the “memex,” something he had been working on in the late 1930s while he was a professor and the Vice-President of MIT.[2]

The memex is arguably the model for the personal computer and presented a distinct vision of man-machine interactivity that motivated Licklider’s interest in time-sharing technologies and networking. Bush’s conception of a new device aimed at organizing and storing information at a personal level led to a trajectory of government-sponsored research projects that aimed to realize his vision. In 1960, Licklider, then a lecturer at MIT, published “Man-Computer Symbiosis,” a theoretical article on real-time interactive computing.[3] Licklider, a psychologist by training, moved to the Department of Defense’s Advanced Research Projects Agency (ARPA) in 1962 to become the first director of its Information Processing Techniques Office (IPTO).

It was the intersection of his vision with the momentum of the Cold War that led to the fruition of Bush’s ideas, largely through the work of Licklider. The first timesharing systems were constructed at MIT with funding from ARPA, as well as the Office of Naval Research. Constructed over the years 1959 to 1962, these efforts led to a working model called the Compatible Time-Sharing System (CTSS). Using the new IBM 7090 and 7094 computers, CTSS proved that the time-sharing concept could work, even though it initially served only a handful of terminals.

The military later supplied MIT with a $3 million grant to develop man-machine interfaces. By 1963, Project MAC, as it was called, connected some 160 typewriter consoles throughout the campus and in some faculty homes, with up to 30 users active at any one time. It allowed for simple calculations, programming, and eventually what became known as word processing. In 1963, the project was refunded and expanded into a larger system called MULTICS (Multiplexed Information and Computing Service), with Bell Labs also collaborating in the research. MULTICS demonstrated the capacity to handle 40-50 users, drive cathode ray tube (CRT) graphics displays, and accommodate users who were not professional programmers.[4]

As the cases of computing and timesharing show, the military-industrial tie drove early computing developments and created the foundation for the Internet to emerge. Funding for a permanent war economy introduced the information-processing regime to the modern world. IT got its start in conjunction with research institutes like MIT, MITRE, and RAND, and corporations such as IBM and GE, as well as the Bell System.

Licklider’s notion of an “Inter-Galactic Computer Network” began to circulate as a vague idea through a like-minded group of computer scientists who were beginning to see the potential of connected computers. The IPTO was beginning to seed the literal invention of computer science as a discipline and its establishment in universities around the country. In his memo of April 25, 1963, Licklider addressed the “members and affiliates” of the network that had coalesced around his vision and ARPA’s money. His concern was that computers should be able to communicate with each other easily and provide information on demand. The project was posed in terms of cross-cultural communications. The concept helped ARPA change its focus from what went on inside the computer to what went on between computers.

The technology was not quite there yet, but the expertise was coming together that would change computing and data communications forever. Using military money, Licklider began supporting projects to create computer technologies that expanded Bush’s vision. A little-known corporation called Bolt, Beranek, and Newman (BBN) was one of the most significant to come out of a new complex of agencies and companies working on computing projects. The bond between this small corporation, MIT, and ARPA produced a packet-switched network that became the precursor to today’s Internet.

In conjunction with the National Science Foundation, ARPA pursued human-computer interactivity and subsidized the creation of computer science departments throughout the country. It funded time-sharing projects and the first packet-switching technology, which would become the foundational technology of the Internet.


[1] Bush stayed with the government throughout the 1940s, directing science funding and advocating for the National Science Foundation, which was established in 1950.
[2] Information on Bush’s early conception of the memex from M. Mitchell Waldrop’s (2001) The Dream Machine: J.C.R. Licklider and the Revolution that Made Computing Personal. New York: The Penguin Group. Particularly useful is the second chapter that focuses on Bush.
[3] Licklider, J.C.R. (1960) “Man-Computer Symbiosis,” IRE Transactions on Human Factors in Electronics. March.
[4] Denicoff, M. (1979) “Sophisticated Software: The Road to Science and Utopia,” in Dertouzos, M.L. and Moses, J. (1979) The Computer Age: A Twenty Year View. Cambridge, Massachusetts: The MIT Press. pp. 370-74.



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at St. Edwards University in Austin, Texas. His first academic job was at Victoria University in Wellington, New Zealand. Most of his career was at New York University. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii, which also supported his Ph.D.

Engineering TCP/IP Politics and the Enabling Framework of the Internet

Posted on | September 22, 2021 | No Comments

The Internet was designed with a particular architecture – an open system that would accept any computer and connect it to any other computer. A set of data networking protocols allowed any application on any device to communicate through the network to another application on another device. Email, web files, text messages, and data from sensors could be sent quickly without using significant power or other resources from the device. This post explores the political engineering of protocols and subsequent policy framework for a national and global data communications network that empowered users and created an open environment for competition and innovation.

The Power of Protocols

What gives power to the Internet’s architecture are the protocols that shape the flows of data. With acronyms like TCP, IMAP, SMTP, HTTP, FTP, as well as UDP, BGP, and IP, these protocols formed the new networks that would slowly become the dominant venue for social participation, e-commerce, and entertainment. These protocols were based on a certain philosophy – that computer hosts talked to computer hosts, that networks were unreliable and prone to failure, and that hosts should confirm with other hosts that the information was passed successfully. The “TCP/IP suite” of protocols emerged to enact this philosophy and propel the development of the Internet.[1]

TCP/IP protocols allow packets of data to move from application to application or from web “clients” to “servers” and back again. They gather data such as keystrokes from an application and package them for transport through the network. Computer devices turn information into packets of data – 1s and 0s – that are sent independently through the Internet. Each packet of data has the address of its destination, including a header and the “payload” or content. The nodes in the network “route” the packets to the computer where they are headed. This involves some “handshaking” to acknowledge the connections and the packets received between what we have been alternately calling the edges, devices, hosts, applications, or processes.
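
To make the idea concrete, here is a minimal sketch in Python of what a packet bundles together. The field names are hypothetical and greatly simplified – real TCP/IP headers carry many more fields, such as checksums, flags, and ports:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    """Simplified, hypothetical packet: an addressing header plus a payload."""
    src_addr: str   # where the packet came from
    dst_addr: str   # where the network should route it
    seq: int        # position in the stream, so the receiver can reassemble
    payload: bytes  # the content itself, e.g., keystrokes from an application

# A message is split into independent packets that may take different routes
# through the network and be reassembled in order at the destination.
message = b"hello, internet"
packets = [Packet("", "", i, message[i:i + 4])
           for i in range(0, len(message), 4)]
print(packets[0])
```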

More specifically, a “process” in an application on one device talks to a “process” in an application on another device. So, for example, a text application like Line, WhatsApp, or WeChat communicates with the same application on another device. Working with the device’s operating system, TCP takes data from the application and sends it into the Internet. There it gets directed through network routers to its final destination. The data is checked on the other side, and if mistakes are found, the receiver requests that the data be sent again. IMAP and SMTP move email messages through the Internet, and most people will recognize HTTP (Hypertext Transfer Protocol) from accessing web pages. The protocol requests a file from a distant server, sets up a connection, and then terminates that connection when the files download successfully.
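
As a rough illustration of that request–connect–terminate cycle, here is a minimal HTTP exchange using Python’s standard http.client module (example.com is just a placeholder host):

```python
import http.client

# Open a connection to a web server (placeholder host).
conn = http.client.HTTPConnection("example.com", 80)

# Request a file from the distant server.
conn.request("GET", "/index.html")
response = conn.getresponse()
print(response.status, response.reason)  # e.g., "200 OK"
body = response.read()                   # the HTML payload

# Terminate the connection once the file has downloaded successfully.
conn.close()
```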

HTTP is at the center of what has been called the World Wide Web (WWW). Mostly called the “web” these days, it combined with the “browser” to provide a powerful new utility – the website. Hypertext Markup Language (HTML) enabled the browser to present text and images on a 2-D color screen. The WWW empowered the “dot-com” era and allowed many people to develop computer skills to produce websites. Every organization had to have an online “presence” to remain viable, and new organizations were started to take advantage of the fantastic reach of the web. Soon, server-side software empowered a myriad of new possibilities on the net, including browser-based email, e-commerce, search, and social media.

Devices access an Internet Service Provider (ISP) from a home or school connection, Wi-Fi at a café, or a public network in a train or park. Mobile subscriptions allow access to a wireless cell tower with a device antenna and SIM card. Satellite service is becoming more available, primarily through HughesNet, ViaSat, and increasingly SpaceX’s Starlink as more low-orbit satellites are launched. Physical media matter for good Internet access by providing the material connection to the ISP. Various wires and fiber optic cables, or combinations of them, provide the critical “last mile” connection from the premise or enterprise. Ethernet connections or wireless routers then link edge devices to the modem and router supplied by your cable company or telco, where communication with the wider network starts and ends.

Conceptually, the Internet has been divided into layers, sometimes referred to as the protocol stack. These are the Application, Transport, Network, Link, and Physical layers. The Internet layers outlasted the Open Systems Interconnection (OSI) model by offering a more efficient representation that simplified the process of developing applications. Layers help conceptualize the Internet’s architecture for instruction, service, and innovation. They visualize the services that one layer provides to another using the protocols. They provide discrete modules that are distinct from the other levels and serve as a guideline for application development and network design and maintenance.
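
A crude way to picture the stack is as a series of envelopes: each layer wraps the data handed down from the layer above with its own header. The sketch below uses plain Python dictionaries with simplified, made-up fields purely for illustration:

```python
# Illustrative encapsulation down the protocol stack (fields simplified).
app_data = b"GET /index.html"                      # Application layer message

tcp_segment = {"src_port": 49152, "dst_port": 80,  # Transport layer header
               "data": app_data}

ip_datagram = {"src_ip": "", "dst_ip": "",
               "data": tcp_segment}                # Network layer header

frame = {"src_mac": "aa:bb:cc:dd:ee:ff", "dst_mac": "11:22:33:44:55:66",
         "data": ip_datagram}                      # Link layer header

# The Physical layer then carries the frame as bits over wire, fiber, or radio.
print(frame["data"]["data"]["data"])               # unwraps back to the message
```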

The Internet’s protocol stack makes creating new applications easier because software needs to be written only for the applications at the endpoints (client and server) and not for the network core infrastructure. Developers use Application Programming Interfaces (APIs) to connect to sockets, the doorway from the Application layer to the next layer of the Internet. Developers have some control of the socket interface software with buffers and variables but do not have to code for the network routers.
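
Here is a minimal sketch of two such endpoints using Python’s standard socket API – a toy echo server and a client on the same machine (the port number 5000 is arbitrary). Note that neither endpoint contains any code for the routers in between; delivery is left to TCP/IP:

```python
import socket
import threading
import time

def echo_server(port: int) -> None:
    """Server endpoint: listen on a socket and echo back whatever arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("localhost", port))
        srv.listen(1)
        conn, _ = srv.accept()             # wait for one client to connect
        with conn:
            conn.sendall(conn.recv(1024))  # echo the bytes back

# Start the server endpoint in the background, then write the client endpoint.
threading.Thread(target=echo_server, args=(5000,), daemon=True).start()
time.sleep(0.5)  # give the server a moment to start listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("localhost", 5000))       # the socket "doorway" to transport
    cli.sendall(b"hello from the edge")
    print(cli.recv(1024))                  # b'hello from the edge'
```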

The Network layer is where the Internet Protocol (IP) does its work. At this layer, the packets are repackaged or “encapsulated” into larger packets called datagrams. These also carry an address that might look like The computers and networks only use numerical addresses, so they may need to use the Domain Name System (DNS) if the address is an alphabetical name like example.com. Large networks have many possible paths, and the routers’ algorithms pick the best routes to move the data along to the receiving host. Cisco Systems became the dominant supplier of network routers during the 1990s.
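
Since routers work only with numerical addresses, an application typically asks a DNS resolver to translate a name first. Python’s standard library exposes that lookup directly (example.com is a placeholder name; the printed address is whatever the resolver returns):

```python
import socket

# Translate an alphabetical name into the numerical address routers understand.
hostname = "example.com"  # placeholder domain name
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")  # e.g., an address like
```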

Although the central principle of the Internet is the primacy of the end-to-end connection and verification – hosts talk to hosts and verify the successful movement of data – the movement of the data through the network is also critical. The Network layer in the TCP/IP model transparently routes packets from a source device to a destination device. The job of the ISPs is to take the data encapsulated at the transport and network layers and transport it – sometimes over long distances via microwave towers, fiber optic cables, or satellites. The term “net neutrality” has emerged to protect the end-to-end principle and restrict ISPs from interfering with the packets at the Network layer. If ISPs were allowed to examine data from the Application layer, they could alter speed, pricing, or even content based on different protocols.

The diffusion of the TCP/IP protocols was not inevitable. Computer companies like IBM, Honeywell, and DEC developed their own proprietary data communications systems. Telecommunications companies had already established X.25 protocols for packet-switched data communications, with X.75 gateway protocols used by international banks and other major companies. TCP/IP looked like a long shot, but the military’s decision in 1982 to mandate it and the National Science Foundation’s NSFNET support secured its momentum. Then, in 1986, the Internet Activities Board (IAB) began to promote TCP/IP standards with publications and vendor conferences about its features and advantages. By the time the NSFNET was decommissioned in 1995, the protocols were well established.

The Philosophy of TCP

The military began to conceptualize the decentralized network as part of its defense against nuclear attack in the early 1960s. Conceived primarily by Paul Baran at RAND, packet-switching was developed as a way of moving communications around nodes in the network that were destroyed or rendered inoperable by attack. Packets could be routed around any part of the network that was congested or disabled. If packets going from San Francisco to New York City could not get through a node in Chicago, they could be routed around the Windy City through nodes in other cities. As networks were being considered for command and control operations, planners also had to anticipate that computers would eventually reside not only in fixed installations but also in airplanes, mobile vehicles, and ships at sea. The Defense Advanced Research Projects Agency (DARPA) funded Vint Cerf and others to create what became the TCP and IP protocols to connect them.
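
The rerouting idea can be sketched with a toy graph search: if the Chicago node goes down, a breadth-first search simply finds another path. This is only an illustration of the principle with a made-up topology, not an actual routing protocol:

```python
from collections import deque

# Toy network topology: each city lists its directly linked neighbors.
links = {
    "San Francisco": ["Chicago", "Denver"],
    "Denver": ["San Francisco", "Dallas"],
    "Chicago": ["San Francisco", "New York"],
    "Dallas": ["Denver", "New York"],
    "New York": ["Chicago", "Dallas"],
}

def find_route(src, dst, down=()):
    """Breadth-first search for a path that avoids any downed nodes."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route

print(find_route("San Francisco", "New York"))                     # via Chicago
print(find_route("San Francisco", "New York", down=("Chicago",)))  # around it
```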

The Internet was also informed by a “hacker ethic” that emerged at MIT in the late 1950s and early 1960s as computers moved away from punch-cards and began to “time-share” their resources. Early hacking stressed openness, decentralization, and sharing information. In addition, hackers championed merit, digital aesthetics, and the possibilities of computers in society. Ted Nelson’s Computer Lib/Dream Machines (1974) was influential as the computer world moved to California’s Silicon Valley.

The counter-culture movement, inspired by opposition to the Vietnam War, was also important. Apple founders Steve Jobs and Steve Wozniak were sympathetic to the movement, and their first invention was a “blue box” device to hack the telephone system. Shortly after, the Apple founders merged hacktivism with the entrepreneurial spirit as they emphasized personal empowerment through technology in developing the Apple II and Macintosh.

The term “hacker” has fallen out of favor because computers are so pervasive and people don’t like to be “hacked” and have their private data stolen or vandalized. But the hacker movement started with noble intentions and continues to be part of web culture.[2]

Developing an Enabling Policy Framework

Although the Internet was birthed in the military and nurtured as an academic and research network, it was later commercialized with the intention of providing an enabling framework for economic growth, education, and new sources of news and social participation. The Clinton-Gore administration was looking for a strategy to revitalize the struggling economy. “It’s the Economy, Stupid” was the mantra of the 1992 campaign that defeated President George H. W. Bush, and the new administration needed to make good on the promise. The early conceptualization of data networks as information highways framed them as infrastructure and earned the information and telecommunications sectors both government and private investment.

Initially, Vice-President Gore made the case for “information highways” as part of the National Information Infrastructure (NII) plan and encouraged government support to link up schools and universities around the US. He had been supporting similar projects as one of the “Atari Democrats” since the early 1980s, including the development of the NSFNET and the supercomputers it connected.

As part of the National Information Infrastructure (NII) plan, the US government handed over interconnection to four Network Access Points (NAPs) in different parts of the country. It contracted with big telecommunications companies to provide the backbone connections. These allowed ISPs to connect users to a national infrastructure, provide new e-business services, link classrooms, and create electronic public squares for democratic debate.

The US took an aggressive stance in both controlling the development of the Internet and pressing that agenda around the world. After the election, Gore pushed the idea of a Global Information Infrastructure (GII) designed to encourage competition both in the US and globally. This offensive resulted in a significant decision by the World Trade Organization (WTO) that reduced tariffs on IT and network equipment. Later, the WTO encouraged the breakup of the national post, telephone, and telegraph administrations (PTTs) that dominated national telecommunications systems. The Telecommunications Act of 1996 and the administration’s Framework for Global E-Commerce were additional key policy positions on Internet policy. The result of this process was essentially the global Internet structure that gives us relatively free international data, phone, and video service.


As Lotus and Electronic Frontier Foundation founder Mitch Kapor once said: “Architecture is politics.” He added, “The structure of a network itself, more than the regulations which govern its use, significantly determines what people can and cannot do.” The technical “architecture” of the Internet was primarily designed to empower the network’s edges – the users and their hardware. Its power has been borne out as those edges are no longer large mainframes and supercomputers but laptops, smartphones, and the tiniest of sensors in the emerging Internet of Things (IoT). Many of these devices have as much processing power as, or more than, the computers on which the Internet was invented and developed. The design of the Internet turned out to be a unique project in political engineering.


[1] Larsen, Rebekah (2012) “The Political Nature of TCP/IP,” Momentum: Vol. 1, Iss. 1, Article 20.

[2] Levy also described more specific hacker ethics and beliefs in chapter 2 of Hackers: Heroes of the Computer Revolution. These include openness, decentralization, free access to computers, world improvement, and upholding democracy.


Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College and spending most of his career at New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in the Republic of Korea, he lives in Austin, Texas.

ARPA and the Formation of the Modern Computer Industry, Part I: Transforming SAGE

Posted on | September 12, 2021 | No Comments

In response to the Russian Sputnik satellites launched in late 1957, US President Dwight D. Eisenhower formed the Advanced Research Projects Agency (ARPA) within the Department of Defense (DoD). As the former leader of the Allied forces during D-Day and the invasion of the European theater, he was all-too-aware of the problems facing the military in a technology-intensive era. ARPA was created, in part, to research and develop high technology for the military and bridge the divide between the Air Force, Army, Marines, and Navy.

Under pressure because of the USSR’s continuous rocket launches, the Republican President set up ARPA despite considerable Congressional and military dissent. Although it scaled back some of its original goals, ARPA went on to subsidize the creation of computer science departments throughout the country, funded the Internet, and consistently supported projects that enhanced human/computer interactivity.

Forming ARPA

Headquartered in the Pentagon, ARPA was established to develop the US lead in science and technology innovations applicable to the military and help it respond quickly to any new challenges. Eisenhower harbored suspicions about the military and its industrial connections. However, he did believe in basic research and appointed a man with similar notions, Neil McElroy, the head of Procter & Gamble, as his Secretary of Defense. McElroy pushed his vision of a “single manager” for all military-related research through Congress. Despite objections by the heads of the various armed forces, Eisenhower sent a request to Congress on January 7, 1958, for startup funds to create ARPA and appointed its director, a vice-president from General Electric. Shortly after, Congress appropriated funds for ARPA as a line item in an Air Force appropriations bill.[1]

Roy Johnson came to head ARPA from GE, dreaming of human-crewed space stations, military moon bases, orbital weapons systems, global surveillance satellites, and geostationary communications satellites. But by the end of ARPA’s first year, Eisenhower had established NASA, dashing Johnson’s space fantasies. Space projects moved to the new civilian agency or back to the individual military services, including the covert ones like those of the CIA’s spy planes and satellites. ARPA desperately searched for a new mission and argued effectively for going into “basic research” areas that were considered too “far out” for the other services and agencies.

With the Kennedy Administration taking office and its appeal for the nation’s “best and brightest” to enter government service, ARPA found its prospects improving. It looked aggressively for talent to develop the best new technologies. Behavioral research, command and control, missile defense, and nuclear test detection were some of the newest projects taken on by ARPA, although not necessarily “basic” research. The new agency also got increasingly involved with computers, especially after Joseph Carl Robnett “JCR” Licklider joined the staff in October 1962.[2]

ARPA’s Information Processing Techniques Office (IPTO)

The IPTO emerged in the early 1960s with the charge of supporting the nation’s advanced computing and networking projects. Initially called the Office of Command and Control Research, its mandate was to take the knowledge gained from researching and developing the multi-billion dollar SAGE (Semi-Automatic Ground Environment) project and extend it to other command and control systems for the military.

SAGE was a joint project by MIT and IBM with the military to computerize and network the nation’s air defense system. It linked a wide array of radar and other sensing equipment throughout Canada and the US to what was to become the Colorado-based NORAD headquarters. SAGE was meant to detect bombers (and later ICBMs) coming over the Arctic to drop nuclear bombs on Canada and the US. The “semi-automatic” in SAGE meant that humans would be a crucial component of the air defense system, and that provided an opening for Licklider’s ideas.

SAGE consisted of some 50 computer systems located throughout North America. Although each was a 250-ton monster, SAGE computers had many innovations that further sparked the dream of man-machine interactivity. These included data communications over telephone lines, cathode ray terminals to display incoming data, and light pens to pinpoint potential hostile aircraft on the screen. ARPA’s IPTO helped transform SAGE innovations into the modern IT environment.

From Batch to Timesharing

Throughout the 1960s, three directors at IPTO poured millions of dollars into projects that created the field of computer science and got computers “talking” to people and to each other. Licklider had the Office of Command and Control Research renamed the Information Processing Techniques Office (IPTO) when he moved from BBN to ARPA to become its first director. Licklider was also from MIT, but what made him unusual was that he was a psychologist among a majority of engineers. He earned his Ph.D. from the University of Rochester in 1942 and lectured at Harvard University before working with the Air Force. Foremost on his agenda was to encourage the transition from “batch processing” to a new system called “timesharing” to promote a more real-time experience with computers, or at least a delay measured in seconds rather than hours or days.

These new developments meant the opportunity for new directions, and Licklider would provide the guidance and the government’s cash. During the mid-1950s, Licklider worked on the SAGE project, focusing mainly on the “human-factors design of radar console displays.”[3] From 1959 to 1962, he was a Vice-President at BBN, overseeing engineering, information systems, and psycho-acoustics projects. He was also involved in one of the first time-sharing experiments at BBN with a DEC PDP-1 before taking a leave of absence to join ARPA for a year.[4]

Licklider swiftly moved IPTO’s agenda towards increasing the interactivity of computers by stressing Vannevar Bush’s ideas and the notion of a more personal and interactive computing experience. An influential military project at MIT was the TX-2, one of the first computers to be built with transistors and a predecessor to the PDP line of computers. It also had a graphics display, unlike most computers of the time, which used punch cards or a teletypewriter. The TX-2 was located at MIT’s Lincoln Laboratory and had a major influence on Licklider. The brilliant psychologist would ride the waves of Cold War grant monies and champion research and development for man-machine interactivity, including a radical new computer-communications technology called timesharing.

Early computer users submitted their requests and punch cards to a receptionist at a computer center. Then a team of computer operators would run several (or a “batch”) of these programs at a time. The results were usually picked up a day or two after submitting the requests. After Bell Labs developed transistor technology, individual transistors were wired into circuit boards, creating the “second generation” computer series. This new technology allowed vacuum tubes to be replaced by a smaller, cheaper, and more reliable technology and produced an exciting increase in processing speeds. Faster technology eventually led to machines that could handle several different computing jobs at one time – timesharing.

Time-sharing would allow several users to share a computer by taking advantage of its increasing processing speeds. It also enhanced computer communications by allowing users to connect via teletype and later cathode-ray terminals. Rather than punching out programs on stacks of paper cards and submitting them for eventual processing, time-sharing made computing a more personal experience by making it more immediately interactive. Users could interact with a large mainframe computer via the teletypewriters originally used for telex communications and the cathode-ray tubes used in televisions.

Timesharing emerged from the MIT environment and its support by the US government. Sets of procedures used for timesharing originated at MIT after it received an IBM 704 in 1957. John McCarthy, a Sloan Fellow from Dartmouth, recognized the possibilities of sharing the computer’s capabilities among several users. As the keyboard replaced punch cards and magnetic-tape-to-magnetic-tape communication as the primary means of data entry, it became easier for the new computers to switch their attention among various users.[5]

As its human users paused to think or look up new information, the computer could handle the requests of other users. Licklider pressed the notion of timesharing to increase the machine’s interactivity with humans, but the rather grandiose vision would not be immediately accepted throughout the military-related sphere of ARPA. Computing was still in a relatively primitive state in the early 1960s, but ARPA would soon be won over.
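
A toy round-robin scheduler illustrates the principle, with hypothetical user jobs: the machine cycles through the users, giving each a short slice of attention, so every user experiences nearly continuous service:

```python
from collections import deque

# Hypothetical jobs: each user needs some number of short work "slices".
jobs = deque([("alice", 3), ("bob", 2), ("carol", 4)])

# Round-robin time-sharing: give each user one slice, then move to the next.
while jobs:
    user, remaining = jobs.popleft()
    print(f"running one slice for {user}")
    if remaining > 1:
        jobs.append((user, remaining - 1))  # user rejoins the back of the queue
    else:
        print(f"{user}'s job is finished")
```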

First on Licklider’s list was Systems Development Corporation (SDC), a RAND spin-off that had done most of the programming for the SAGE project. ARPA had inherited SDC, and a major part of the IPTO budget was set to help it transition from the SAGE air defense project to command and control computing. SDC had been given one of SAGE’s AN/FSQ-32 mainframes, but to Licklider’s chagrin, it was used for batch processing. Licklider thought it ridiculous to use the machine in this manner, with responses often taking hours or even days, when it was supposed to help a commander react to battle situations.[6] He immediately went to work persuading SDC to switch from batch processing to time-sharing, including bringing in allied colleagues such as Marvin Minsky for seminars to cajole SDC.

Soon they were convinced, and Licklider moved on to other time-sharing projects, pouring ARPA money into like-minded projects at MIT and Carnegie Mellon. Luckily, he had joined ARPA the same month as the Cuban Missile Crisis. The event raised concerns about the ability of the President and others high on the chain of command to get effective information. In fact, Kennedy had been pushing for better command and control support in the budget, reflecting his concerns about being the Commander-in-Chief of a major nuclear power.

In the next part I will examine timesharing and the first attempts to commercialize it as a utility.


[1] Background on ARPA from Hafner, K. and Lyon, M. (1998) Where Wizards Stay Up Late. New York: Touchstone. pp. 20-27.
[2] A much more detailed version of these events can be found in a chapter called “The Fastest Million Dollars,” in Hafner, K. and Lyon, M. (1998) Where Wizards Stay Up Late. New York: Touchstone. pp. 11-42.
[3] Information on Licklider’s involvement with SAGE from Campbell-Kelly, M. and Aspray, W. (1996) Computer: A History of the Information Machine. Basic Books, pp. 212-213.
[4] Information on JCR Licklider’s background at BBN from the (2002) Computing Encyclopedia Volume 5: People. Smart Computing Reference Series.
[5] Evans, B.O. “Computers and Communications” in Dertouzos, M.L. and Moses, J. (1979) The Computer Age: A Twenty Year View. Cambridge, Massachusetts: The MIT Press. p. 344.
[6] A good investigative job on Licklider and SDC was done by Waldrop, M. Mitchell (2001) The Dream Machine: J.C.R. Licklider and the Revolution that Made Computing Personal. New York: The Penguin Group.


Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

ICT and Sustainable Development: Some Origins

Posted on | August 17, 2021 | No Comments

I teach a course called ICT for Sustainable Development (ICT4SD) every year. It refers to information and communications technologies (ICT) enlisted in the service of cities, communities, and countries to help them be economically and environmentally healthy. An important consideration for sustainability is that they don’t compromise conditions or resources that will be needed by future generations. Sustainable Development (SD) is an offshoot of traditional “development,” which dealt primarily with national economies organizing to “take off” into westernized, pro-growth, industrial scenarios, with some consideration of the colonial vestiges they needed to take into account.

While development was also cognizant of the need to support agriculture, education, governance, and health activities, SD put a major focus on related environmental issues and social justice. (See Heeks) SD has been embraced by the United Nations (UN), which came out with seventeen Sustainable Development Goals (SDGs) that were adopted by all UN organizations in 2015.


In this post, I briefly introduce ICT4D and its connection to SD, how it emerged, and why it is beneficial. Of particular importance are the economic benefits of ICT and the recognition of similar economics in the renewable energies so crucial to sustainable development.

ICT was not well understood by development economists and was largely ignored by funding agencies, except for telephone infrastructure. Literacy and education were early concerns. Book production, radio, and then television sets were monitored as crucial indicators of development progress. Telephones and telegraphs helped transact business over longer distances but were installed and managed by government agencies called Post, Telephone, and Telegraph (PTT) administrations. PTTs found funding difficult and were challenging to manage, given their technical complexity and enormous geographical scope. Satellites were used in some countries like India and Indonesia and facilitated better mass communications as well as distance education and disaster management.

Most of the economic focus in “developing countries” was on the extraction and growing of various commodities, utilizing low-cost labor for manufacturing, or adding to the production processes of global supply chains. It was only when television and films became important domestic industries that “information products” were recognized economically in the development process.

New dynamics were introduced to development and economic processes with computerization and ICTs. I began my career as an intern on a National Computerization Policy program at the East-West Center in Honolulu, Hawaii. Inspired by the Nora-Minc Report in France, it was part of the overall emphasis on development at the Center’s Communications Institute. I had an office next to Wilbur Schramm, one of the most influential development pioneers with his Mass Media and National Development: The Role of Information in the Developing Countries (1964).[1]

With my mentor, Syed Rahim, I co-authored Computerization and Development in Southeast Asia (1987), which serves as a benchmark study in understanding the role of ICT in development. One objective of the book was to study the mainframe computers that were implemented, starting in the mid-1960s, for development activities. These “large” computers, some of them with merely 14K of RAM, were installed in many government agencies dealing with development activities: agriculture, education, health, and some statistical organizations. We also looked at what narratives were being created to talk about computerization at that time. For example, the term “Information Society” was becoming popular. Also, with the rise of the “microcomputer” or personal computer (PC), the idea of computer technology empowering individuals was diffusing through advertisements and other media.

Information economics opened up some interesting avenues for ICT4D and sustainable development. Initially, it was concerned with measuring different industrial sectors and how many people were employed in each area, such as agriculture, manufacturing, information, and services. Fritz Machlup wrote The Production and Distribution of Knowledge in the United States (1962), which showed that information goods and services accounted for nearly 30 percent of the U.S. gross national product. A major contributor to information economics, he concluded that the “knowledge industry” employed 43 percent of the civilian labor force.

Machlup was also a student of Ludwig von Mises, known today as a founder of the so-called “Austrian School of Economics.” But he was soon overshadowed by fellow “members” Friedrich von Hayek and Milton Friedman, and by the resurgence of von Mises himself. While this debate was waged primarily against mainstream Keynesian economics, it was also significant for development studies, as these economists saw government activities as running counter to the dynamics of the market. The main nemesis of the Austrian school was socialism and government planning activities. While most developing countries were not communist, the Cold War was a significant issue playing out in countries worldwide.

The Austrian movement had a significant impact in the 1970s and 1980s. Its economists saw transactions in the economy as knowledge-producing activities and focused on the use of prices as communication or signaling devices. This led to a new emphasis on markets, and Hayek and Friedman both received Nobel Prizes for their work.

For context, President Nixon had taken the US off the gold standard in August 1971, and the value of the US dollar dropped sharply. Currency markets were then free to operate on market principles. It was also the time when the microprocessor was invented and computers were becoming more prominent. In 1973, Reuters set up its Money Monitor Rates, the first virtual market for foreign exchange transactions. It used computer terminals to display news and currency prices and charged banks both to subscribe to the prices and to post them. With the help of the Group of 5 nations, it brought order to international financial markets, especially after the Arab-Israeli War broke out in late 1973. The volatility of the war ensured the economic success of the Reuters technology, and currency markets have been digitally linked ever since.

Many development theorists by that time were becoming frustrated by the slow progress of capitalism in the “Third World.” Although the Middle East war was short, it resulted in increasing oil prices around the world. This was a major strain on developing countries that had bought into mechanized development and the “Green Revolution” of the 1960s, which emphasized petroleum-based fertilizers and pesticides. The Arab-dominated Organization of Petroleum Exporting Countries (OPEC) began an embargo of western countries for their support of Israel, which refused to withdraw from the occupied territories. Oil prices increased by 70 percent, and the US suffered additional setbacks as it ended the war in Vietnam and inflation raged.

A split occurred between traditional development studies and market fundamentalists. British Prime Minister Margaret Thatcher and US President Ronald Reagan were strong advocates of the Austrian School. Both had been taken by Hayek’s Road to Serfdom (1944) and stressed a pro-market approach to development economics. The IMF was mobilized to pressure countries to undergo “structural adjustment” towards more market-oriented approaches to economic development. The PTTs were a primary target, and investment strategies were utilized to turn them into state-owned enterprises (SOEs), with parts sold off to domestic and international investors.

Researchers began to focus on the characteristics or “nature” of information. As economies became more dependent on information, more scholarship was conducted. It became understood that information is not diminished by use or by sharing. Certainly, the value of information varies, often with time. The ability to easily share information by email and FTP created interest in network effects and the viral diffusion of information.

These characteristics became particularly important after the development of the Internet, which quickly globalized. Vice-President Gore’s Global Information Infrastructure (GII) became the foundation for the World Trade Organization’s Information Technology Agreements (ITA) and the privatization of telecommunications services. Tariffs on information and communications technologies decreased significantly. Countries that had gotten into debt in the 1970s were pressured into selling off their telecommunications infrastructure to private interests, and they quickly adopted the TCP/IP protocols.

Other studies focused on efficiencies of production brought on by science and technology, specifically reducing the marginal costs of producing additional units of a product. Marginal costs have been a major issue in media economics because electronic and then digital technologies have allowed the increasing efficiency of producing these types of products. Media products have historically had high production costs, but decreasing marginal costs on the “manufacture” or reproduction of each additional unit of that product.

If we start with books, for example, we know it is time-consuming to write a book, and the first physical copies are likely to be expensive, especially if only a small number of them are actually printed. But as traditional economies of scale are applied, each additional book becomes cheaper to produce. Electronic copies of books, in particular, have become very cheap to produce and even distribute through the Internet, although that hasn’t necessarily resulted in major price decreases.
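
A quick back-of-the-envelope calculation, using made-up numbers, shows why the average cost falls: a large fixed cost for the first copy is spread over more and more units whose marginal cost is small or near zero:

```python
# Illustrative numbers only: a fixed cost to produce the first copy,
# plus a per-unit marginal cost of reproducing each additional copy.
fixed_cost = 50_000.00     # writing, editing, and typesetting the book
marginal_print = 4.00      # printing and binding one more physical copy
marginal_digital = 0.01    # delivering one more electronic copy

for units in (100, 10_000, 1_000_000):
    avg_print = (fixed_cost + marginal_print * units) / units
    avg_digital = (fixed_cost + marginal_digital * units) / units
    print(f"{units:>9,} copies: print ${avg_print:,.2f} each, "
          f"digital ${avg_digital:,.2f} each")
```

At a million copies, the per-unit cost is dominated almost entirely by the marginal cost, which for digital goods approaches zero.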

Digital outputs are generally unique economic products. They have unusual characteristics that make it difficult to exclude people from using them, and they are also not used up in consumption. Microsoft faced this problem in the early days of the microcomputer when it was getting started, criticizing computer hobbyists for sharing tapes of its computer programs. Later, its investment in the MS-DOS operating system and subsequently Windows paid off handsomely when it was able to sell them with enormous margins for IBM PCs and then “IBM compatibles” such as Acer, Compaq, and Dell. That is how Bill Gates became the richest man in the world (or one of them).

The issue of marginal costs has resonated with me for a long time, due to my work on media economics and what economists call “public goods.” In some of my previous posts, I addressed the taxonomy of goods based on key economic characteristics. Public goods, such as digital and media products, are misbehaving economic goods in that they are not used up in consumption and are difficult to exclude from use. These writings examined what kinds of products are conducive to reduced marginal costs and what social systems are conducive to managing these different types of goods. Originally, the focus was more on media products like film, radio, and television, but then digital products like games and operating systems. Will these efficiencies apply to sustainable development?

Can the economics of media products apply to other products? More recently, sustainable technologies like solar and wind are being examined for their near-zero marginal costs. A major voice on this topic is Jeremy Rifkin, most noted for his book The Third Industrial Revolution (2011), which refers to the importance of concurrent communications, energy, and transportation transitions. We have moved from an integrated political economy based on telephone/telex communications and carbon combustion energy and transportation to one based on digital communications and clean energy. Two other books by Rifkin, The Zero Marginal Cost Society and The Green New Deal, are significant points of departure for sustainable development.

Sustainable development initiatives by definition look to economize and reduce costs for the future. It is important to analyze the characteristics of economic goods and their social implications, as this level of understanding informs market structure and the types of regulation needed.

ICT4D has struggled to claim a strong narrative and research stake in the trajectory of development. The Earth Institute’s ICTs for SDGs: Final Report: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals (2015) and the World Bank’s (2016) World Development Report were significant boosts for ICT4D, especially for economic development, and the move towards sustainable development.


[1] Mass Media and National Development: The Role of Information in the Developing Countries. Stanford University Press. 1964.

[2] Sachs, J. et al. (2016) ICT & SDGs: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals. The Earth Institute: Columbia University. Accessed 15 Jan 2019.


Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College and spending most of his career at New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in the Republic of Korea, he lives in Austin, Texas.

Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains

Posted on | August 12, 2021 | No Comments

Are we ready for the age of diamonds? Instead of mining gold or even Bitcoin, why not use diamonds from carbon recaptured from the atmosphere? Instead of “conflict diamonds,” why not have “climate diamonds?”

Can we move to a diamond standard? Can diamonds replace gold and back other currencies? Can this be done in an economical and sustainable process? This post examines the processes of biological carbon sequestration and the manufacture of diamonds that can be used in a fintech environment.

The process of creating “industrial diamonds” is well established and has produced impressive results. Diamonds can be “grown” using small ones as seeds and adding to the crystalline structure in two ways. One uses superheated “greenhouses” with pressurized methane and hydrogen. The other uses CVD (chemical vapor deposition), a low-pressure, vacuum-based system that uses heat or microwaves to bond carbon-rich gases to the diamond seeds.

I wrote my Ph.D. dissertation about money and standards, so I’ve been thinking about this topic a lot. In Symbolic Economies and the Politics of Global Cyberspaces (1993), I examined the social forces that drive us to the use of general equivalents like money and the forces that establish monetary standardization in a digital environment. So, I’m not entirely convinced of my argument so far, but I want to consider available systems of regenerative agriculture and manufacturing that can be mobilized for making climate diamonds and tie them into newer generation cryptocurrency developments.

I was intrigued by a video from “Have a Think.” It presents an intriguing industrial scenario for growing and using hemp to sequester CO2 from the air and produce several very valuable byproducts, one of which can be transformed into sparkling diamonds. Hemp is controversial due to its connection with THC, a psychoactive substance, but that is only present in certain strains. Hemp has a rich history and was particularly important for the ropes needed by sea-faring ships that relied entirely on wind.

Hemp is a dynamic plant that grows quickly. In the process, it can produce several types of industrial products, including lubricants for cars and wind turbines as well as ingredients for cosmetics, soaps, and printer ink. In addition, it has fiber substances that can make products like cloth, paper, and rope. The seeds have positive nutritional benefits and may be a replacement for soy proteins. It can also be processed to produce biochar, a type of charcoal used for fertilizers, graphene products, and the manufactured diamonds mentioned extensively in this post.

Agriculture is going through a technological transformation, with increased use of big data, hydroponics, and robotics. Hydroponics is a way of growing plants in a water-based medium in a protective environment. Hemp farms can remove significant amounts of carbon dioxide (CO2) from the air and produce the oils and fibers mentioned above in a clean and economical way. Biological carbon sequestering is probably better than geologic sequestration, which injects carbon into underground porous rock formations. Both may be necessary for reducing climate threats, but sequestration is better done by plants that can produce many valuable byproducts.

Can we create a new monetary standard based on climate diamonds? Is it feasible? Is that something we want to do? Globally, we have been on an information standard since the 1970s, anchored by the strength of the US dollar and hedged by multiple financial instruments worldwide. The new diamond market will likely grow within the cryptocurrency environment.

Much of this depends on the future of cryptocurrency platforms and the digital ledger systems emerging in new generations. Cardano’s blockchain platform, for example, is evolving to create a peer-to-peer transactional system to trade many types of value, such as labor and portions of investment vehicles like houses, art, etc. Imagine, for example, going to Home Depot and buying your gardening supplies with informational “deeds” to fractions of your car, your house, your future work, or your diamonds. Diamonds are likely to be another “value” in a chain of intersecting commodities classified on blockchains with dynamic pricing and smart contracts.

Diamonds have utility based on their beauty but also their durability and strength. Most notable is the sparkling effervescence that makes diamond jewelry a treasured symbol of relationship and commitment. Their crystalline structure is used in high-tech products like audio speakers, as they can vibrate rapidly without deforming or degrading sound quality. Their high heat and voltage tolerance make diamond-based microprocessors an increasingly viable component of digital technologies.

Their hardness also makes them extremely valuable for drill bits. They have practical uses in delicate drilling for art, dentistry, and manufacturing. With a melting point of around 3550 degrees Celsius, they have the durability to drill industrial metals, geothermal wells, and underground tunnels.

Diamonds can also be money. They are portable, durable, measurable, and difficult to counterfeit. For diamonds, size matters, although color and clarity matter as well. Bigger is better, and they pack a lot more value per size than gold. Gold has higher storage costs that quickly eat up any gains from appreciating prices. Granted, neither diamonds nor gold are generally used as currency, primarily because they lack acceptability by merchants and interoperability between financial systems. Try cashing in your diamonds at Home Depot.

That is why the cryptocurrency environment is the most likely solution to that problem, barring an economic collapse. While the dollar is going digital, officially known as a CBDC (Central Bank Digital Currency), it will not be replaced by Bitcoin or other cryptocurrencies like Ethereum and Litecoin. Instead, other “values” will line up in relation to the dollar. The Fed will regulate but protect cryptocurrencies because it knows that the financial system wants to trade anything, anywhere, anytime. So cryptocurrencies will survive, and diamonds will find their place within their ethereal blockchained cyberspace. In the future, who knows?

This is an exploratory essay stimulated by the “Have a Think” video but also shaped by my interest in fintech, monetary policy, and cryptocurrencies. The reintroduction of hemp in the modern economy and its potential for absorbing carbon dioxide can be a powerful addition to the global economy. It’s not entirely about taking the CO2 out of the air and bonding the carbon to diamonds. Rather, I am hopeful that the green manufacturing of diamonds will help incentivize and stimulate an industrial process with multiple benefits for the economy and the environment.


Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Modern Monetary “Practice” and the Fear of Inflation in a Low-Supply Economy

Posted on | August 1, 2021 | No Comments

One of America’s most potent political myths is that you, as a citizen, pay for government spending. People talk about paying for this or that program, but what really happens is that Congress appropriates the money for spending. Then the Treasury instructs the Federal Reserve Bank to credit the spending accounts. Taxes and borrowing are separate entries. The issue is gaining scrutiny as the US economy reconciles 2020’s economic output slowdown due to COVID-19 with record government spending.

A relatively new area of economic analysis, called Modern Monetary Theory (MMT), emerged from practitioners in the finance industry telling a different story about government spending. It is worth examining, as it is based more on financial traders’ practices, particularly those of traders who work with government bonds. Warren Mosler was critical in formulating MMT and shedding light on the actual processes involved in government spending. Thus my emphasis on Modern Monetary “Practice”: it starts with this description of the spending process, which allows us to reframe its dynamics.

Spending should not be seen as a panacea for the economy. It can be wasteful and lead to inflation; it needs to be productive. The $28 trillion debt accumulated by May 2021 is worth monitoring, but what does it really mean? What are its implications?

Taxes are registered by the government, yes, but it’s not like household economics. A household needs a breadwinner, someone to bring home the bacon, to load up on the metaphors. Someone needs to have money to pay the bills. Governments operate under a different set of rules and responsibilities. They can print or mint minor amounts of money and use the Fed for larger quantities. Government provides the money for the economy to operate, and the incentive – taxes – that makes people want to hold it. Mosler argues that governments have a monopoly on their currency and the responsibility to get it into the economy, by spending, so that markets can work.

Central banks can purchase bonds and, quite frankly, whatever else they want to buy. The Fed traditionally bought only government treasuries but now regularly buys mortgage-backed securities in a process called quantitative easing. Ideally, it can sell these treasuries and securities to absorb money from the economy if it smells inflation. The banking sector also creates money when it lends to consumers.
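
The reserve mechanics can be sketched with a similarly simplified toy model. The two-variable balance below is an assumption for illustration, not the Fed's actual accounting:

```python
# Simplified toy model of quantitative easing (an assumption for
# illustration, not the Fed's actual accounting).
fed_assets = {"treasuries": 0.0, "mbs": 0.0}
bank_reserves = 0.0  # money held by the banking system at the Fed

def fed_buys(asset, amount):
    """The Fed buys securities; its payment adds reserves to the banks."""
    global bank_reserves
    fed_assets[asset] += amount
    bank_reserves += amount

def fed_sells(asset, amount):
    """The Fed sells securities; the payment drains reserves."""
    global bank_reserves
    fed_assets[asset] -= amount
    bank_reserves -= amount

fed_buys("treasuries", 500)
fed_buys("mbs", 200)          # mortgage-backed securities
fed_sells("treasuries", 100)  # absorbing money if the Fed "smells inflation"
print(fed_assets, "| reserves:", bank_reserves)
```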

The Treasury auctions bonds. But do these pay for spending? They essentially provide:

  • Time deposits for investors;
  • Hedge instruments for traders;
  • Opportunities for foreign countries to keep their currencies cheap versus the dollar;
  • A vehicle for the Fed to influence the money supply and coordinate interest rates.

Borrowing should be seen as a political strategy to keep the financial system secure, provide a stable hedge, and manage the dollar’s value.

So, rather than worrying about “paying” for something, US citizens should be active in deciding how taxes should be used in public policy. US tax policy should be designed to tax what the country doesn’t want. That isn’t going to be easy, but it is what democracy is about. Spending should be determined by what will keep the US safe and secure, and it should keep the economy productive while providing opportunities and avoiding excessive inflation.

This last point is important. Inflation is the primary limiting factor when it comes to spending, and it is a calculus between supply and “effective” demand. “Too many dollars chasing too few goods” is the standard explanation offered by economists. Spending is easier for a government to coordinate than supply. A good example occurred during the COVID-19 pandemic, when the US government passed several emergency spending packages to support businesses and families, especially airlines hurt by the shutdown in travel. While the economy rebounded sharply due to the fiscal and monetary stimulus, slowdowns in production, disruptions of supply chains, and people staying at home caused a significant spike in inflation in early 2021.
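
The “too many dollars chasing too few goods” intuition corresponds to the textbook quantity-of-money identity, MV = PQ. The minimal sketch below, with invented numbers, shows how raising the money supply while output falls pushes up the price level:

```python
# The quantity-of-money identity M * V = P * Q, solved for the price
# level P. All numbers are invented for illustration.
def price_level(money_supply, velocity, real_output):
    return money_supply * velocity / real_output

p_before = price_level(money_supply=100, velocity=1.5, real_output=150)

# Stimulus raises M while supply-chain trouble lowers Q:
p_after = price_level(money_supply=120, velocity=1.5, real_output=135)

print(f"Price level rises {100 * (p_after / p_before - 1):.0f}%")  # ~33%
```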

Inflation had been largely absent from the US economy since the early 1980s. It had flared after Nixon took the country off the gold standard in the early 1970s: the dollar depreciated, and OPEC countries restricted oil production to drive up prices and make up for the diminishing value of the US greenback. Meanwhile, lacking deep domestic banking systems, partly due to Islamic restrictions on credit, the oil producers recycled their US dollars through the global eurodollar system. Banks worldwide coordinated syndicated loans with these “petrodollar” funds for countries needing dollars for energy purchases and development projects. “Economic hit men” scoured the world and pressured countries to borrow the money, eventually creating what became known as the “Third World Debt Crisis” of the early 1980s.

Since the 1980s, financialization and the commercialization of Cold War technologies created enough competition and disruption to keep prices down. Primarily information technologies, they increased productivity, reducing labor and resource costs. Globalization also created new forms of interstate competition and cooperation, as supply chains supported innovation and higher-quality products. The US government also sold bonds internationally to countries like England and Japan, which strengthened the dollar and kept it the world’s reserve currency.

The COVID-19 pandemic presents unprecedented economic challenges, particularly with its 2021 resurgence as the delta variant. A rising stock market saw the DJIA hit 35,000 and the S&P 500 reach 4,395.26, with a total market capitalization of US$38.2 trillion. But concerns about inflation grew as commodities such as copper, lumber, and oil increased in price. A computer chip shortage also raised concerns about the production and pricing of cars, computers, and other goods that depend on microprocessing capability. But the Fed and others saw the phenomenon as “transitory,” citing disruption, demographics, debt, and productivity as factors that would reduce inflationary pressures.

So the economy looks to be at risk in late 2021. Will the practical application of MMT provide operational guidance for a new era of prosperity? Can infrastructure and climate-change solutions provide sufficient returns on these investments? The big question is whether government spending for such programs can avoid significant inflationary pressures. With COVID, we are struggling with how to spend in a low-output economy.


Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College; he spent most of his career at New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in the Republic of Korea, he lives in Austin, Texas.

Thomas Edison Builds the Universal Ticker-Tape Machine

Posted on | July 10, 2021 | No Comments

In previous posts, I described the importance of the telegraph and other electro-mechanical devices to the emergence of Wall Street. Thomas Edison, the famous inventor, came to New York during the infamous Black Friday financial crash in the autumn of 1869. He immediately became involved in the technological innovations that were introducing new ways to represent and understand financial transactions.[1]

Electricity was being harnessed to expand finance’s reach throughout the country and across the seas. New networks of telegraphed information and news added layers of both certainty and volatility for Wall Street and for investors around the world. Edison’s inventions and patents helped solidify these technological innovations and the financial practices they supported, and provided the means for him to escape New York and set up his own laboratories and facilities for creative innovation and manufacturing.

A few weeks after Black Friday, Edison formed a partnership with Franklin L. Pope, a telegraph engineer who was also associated with Doctor Laws. Earlier in the year, Edison had conducted telegraph tests with Pope between Boston and Rochester, which was one reason he came to New York. On October 1, 1869, Edison and Pope announced they were creating both the Financial and Commercial Telegraph Company and the American Printing Telegraph Company.[2]

It was only a few months after the completion of the transcontinental railroad, and the US was expanding quickly westward. Immigration was on the rise, and the economy was growing rapidly. Edison and Pope jointly filed several patents in printing telegraphy during this period. Two printer designs were for stock tickers to provide gold and stock quotations in the New York area. Another printer, called the “Pope and Edison Type-Printing Telegraph,” was meant to be used on “private lines.” These transmitted messages other than commercial quotations from an easy-to-use terminal for individuals and business houses and were, in a sense, the first “emails.” While their partnership was successful, it ended when the Gold and Stock Telegraph Company absorbed their businesses.

Like much of the managerial talent of the time, General Marshall Lefferts had served during the Civil War. After several years heading Western Union, he became the President of the Gold & Stock Telegraph Company. Very quickly, the company acquired the patents to Laws’ Gold Indicator and Stock Printer, as well as the Calahan Stock Ticker.

In 1870, the Gold and Stock Company created the Exchange Telegraph Company with partners from London, England, to expand internationally. Investment capital was streaming into the US, primarily for railroad development, and telegraph messages and ticker prices were streaming across the Atlantic as well. Lefferts became a friend and mentor to the young Edison and both encouraged and funded his work.

Edison had gained extensive experience with the Laws Gold Indicator and the Calahan Stock Ticker and was ideally positioned to develop a general stock ticker for mass production. By the end of the year, Edison had made a number of innovations and obtained patents for the Electrical Printing Instrument. Anxious to go into mass production, General Lefferts, as head of the Gold and Stock Telegraph Company, gave him $40,000 for the rights to use his telegraphic and ticker inventions.[2] The event was depicted in the movie Edison the Man (1940), in which Spencer Tracy played the famous inventor.

His first months in New York City were an extraordinary time for Edison, who had barely reached his mid-twenties. In early 1870, with the money from his stock ticker inventions, he opened his first factory in nearby Newark, New Jersey, for the manufacture of gold and stock tickers. Edison worked on telegraph instruments as well, including duplex and quadruplex instruments that could send multiple electrical signals at the same time.

Another device he continued to work on allowed remote synchronization of tickers from the central station. If a ticker in a broker’s office went out of “unison” and began to print garbled results, it needed to be quickly reset. In the first years of the stock ticker, a runner would have to go to each client’s location and reset the ticker by hand. Every mechanism needed to be synchronized so that multiple machines could print the same information simultaneously. Also, demand for the tickers was spreading beyond New York, and it was crucial that the machines operate simply, without the need for mechanical troubleshooters. A device had been invented by Henry Van Hoevenbergh, but it didn’t work very well. Edison tested a new design on the Laws Stock Printer during the spring of 1871, and he soon received a patent for his “screw-thread unison” innovation, which could reset the printing machines with an electrical signal. All stock tickers developed after this innovation incorporated the unison device, contributing to the stock ticker’s rapid diffusion.[3]
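
As a thought experiment, the unison idea can be modeled in a few lines of code: each type wheel drifts independently, and the reset signal simply drives every wheel forward until it hits a stop at a known position. This is a conceptual toy only; the actual screw-thread unison was a mechanical device, and the wheel size below is an arbitrary assumption:

```python
# Conceptual toy model of the unison reset. The real screw-thread unison
# was mechanical; this only illustrates the "drive every wheel to a known
# stop" idea. The wheel size is an arbitrary assumption.
WHEEL_POSITIONS = 30

class Ticker:
    def __init__(self, position):
        self.position = position  # a wheel that may drift out of unison

    def step(self):
        self.position = (self.position + 1) % WHEEL_POSITIONS

    def unison_reset(self):
        # The reset signal keeps stepping the wheel; the mechanical stop
        # at position 0 is what actually realigns it.
        while self.position != 0:
            self.step()

tickers = [Ticker(5), Ticker(17), Ticker(0)]
for t in tickers:
    t.unison_reset()
print([t.position for t in tickers])  # all wheels aligned at 0
```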

As their tickers came into use in many remote cities in the US and around the world, it became important to be able to fix them without local skilled workers. Edison continued to make improvements to the stock ticker to create a reliable machine that could be marketed on a mass basis. The concept of interchangeable parts had been important for the production of clocks, guns, and sewing machines, and it was applied to the manufacture of tickers as well. Several variations of stock tickers were initially manufactured in small numbers, but in 1871 Edison constructed the Universal Stock Printer for New York’s Gold and Stock Telegraph Company.

The “Universal” was very dependable and could be manufactured in high volume. The New York Stock Exchange was a major beneficiary of the new stock tickers, which enabled a higher volume of trading and spread the reach of the Wall Street exchange. This allowed the NYSE to unseat smaller exchanges in other cities, as people around the country could get price quotes transmitted directly to their stock tickers from New York. Over 5,000 of these devices were produced and used by investors around the world.

The ticker also got the attention of Western Union, which was envious of the Gold and Stock Telegraph Company’s monopoly on transmitting market information from the New York Stock Exchange. Another concern for the giant was that the Gold and Stock Telegraph Company was using Edison’s Universal Private Line Printers to send out financial information. Edison’s printer had only moderate success, though, while a faster printer from George M. Phelps gave Western Union considerable technological leverage. Armed with a new stock ticker in 1870, later called the “Financial Instrument,” that was faster, more efficient, and more reliable, Western Union went on the offensive. When it threatened to enter the New York market with this new ticker, Gold and Stock arranged a merger in 1871 and took over the manufacturing and distribution of the stock ticker equipment.

Edison worked closely with Phelps (“Mr. P”) after the merger. Over the next few years, they would patent the Quadruplex, considered Edison’s major contribution to telegraphy. It allowed four simultaneous telegraph transmissions on a single conducting line and saved Western Union a considerable amount of money. Edison also helped with Phelps’ Electro-Motor Telegraph, which was ten years in development and based on an electro-motor governor able to achieve speeds of up to 60 words per minute. By 1875, Phelps’ transmitting apparatus allowed an operator to simultaneously transmit stock information to hundreds of different offices. It could also operate between New York and that other emerging financial center, Chicago, without using a single repeater.
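
The quadruplex principle can be illustrated conceptually: in each direction, one message was read from the polarity of the line current and a second from its magnitude, and combined with duplex operation this yielded four channels on one wire. The encoding below is an invented simplification for illustration, not Edison's actual circuit:

```python
# Invented simplification of the quadruplex principle: one message rides
# on the polarity of the line current, a second on its magnitude. With
# duplex operation the same trick runs in both directions, giving four
# channels on one wire. Current levels are arbitrary units.
def encode(bit_a, bit_b):
    sign = 1 if bit_a else -1   # message A sets the polarity
    level = 3 if bit_b else 1   # message B sets the magnitude
    return sign * level

def decode(current):
    return current > 0, abs(current) > 2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    line = encode(a, b)
    assert decode(line) == (bool(a), bool(b))
    print(f"A={a} B={b} -> line current {line:+d}")
```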

The Universal Stock Ticker was Edison’s first commercial success and, in many ways, the foundation of his later career. The first 40 of his 1,093 patents filed with the US Patent Office came from his work with stock tickers and printing telegraphs. The success of his stock tickers provided capital and connections with wealthy investors that helped fund his many inventions, including perhaps his greatest achievement, the electric light bulb. In 1875, he moved to Menlo Park, New Jersey, where he set up his famous laboratory. (The Black Maria movie studio came later, at his West Orange, New Jersey, facilities.)[4]


[1] A very good account of the Black Friday events appears in the “On This Day” section of the New York Times website. Accessed May 10, 2015.
[2] Anecdotal information on Edison’s timely circumstances on Wall Street from Edison: His Life and Inventions by Frank Lewis Dyer and Thomas Commerford Martin.
[3] The stocktickercompany website has very good information on the history of the stock indicators and ticker-tape machines.
[4] It has been difficult to trace the exact timing of Edison’s activities at the time. Ultimately, I decided to follow the patents.



Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Honolulu, Hawaii, during the 1990s.

Show-Biz: The Televisual Re-mediation of the Modern Global Economy

Posted on | June 8, 2021 | No Comments

My use of “Show-Biz” refers to the meaning-making techniques of financial journalism and their relationship to the narratives that drive the economy. Media industries “show” business and finance through various camera, editing, and special-effects techniques, drawing in data from many sources and presenting them in different windows on high-resolution screens. These techniques create ways of seeing and showing the economy. Consequently, they influence public opinion as well as the investment and trading strategies that shape global, national, and local economic activity.

This post concerns the televisual surveying systems that monitor and display global business and financial activities. It starts with a theory of media, called remediation, and then examines different elements or media that are combined into the broadcast or streaming of financial news. Two key concepts, transparent immediacy and hypermediation, help us understand the way the media operates. These transmissions of mediated financial information have consequences for the global economy.

This is not to say that such representations are necessarily valid constructions of reality or deliberate distortions of truth. One of the central themes of this post is that strategies of visual mediation are intertwined with authentic experiences and facts, and that strategies of interpretation, and a measure of skepticism, are therefore required.

Television news expanded significantly in the 1970s with the creation of cable systems and satellite networks, and several networks came to be dedicated to financial news. Cable and traditional TV combined when CNBC (Consumer News and Business Channel) was established in April 1989 as a joint venture between Cablevision and NBC. Bloomberg Television was launched in the United States on January 1, 1994, and drew on a decade of financial analytics provided through the famous “Bloomberg Box.”

Another successful financial news network was Fox Business News, launched on October 15, 2007. Yahoo also emerged as a major financial news provider. Acquired by Verizon in 2017, it attracted over 100 million global monthly visitors on average in 2019, according to the media analytics company ComScore. Yahoo Finance was recently sold by Verizon, along with AOL, to Apollo Global Management.

How is financial news mediated in these networks? What signifying practices are brought into play and for what purposes? What are the implications of their mediating styles and techniques for how we understand the health of the global economy, levels and types of employment, and the potential of innovative new industries and companies?

Financial television news plays constantly in many trading operations and other business environments. It is also popular in homes, whether with day traders or interested citizens. Many people may be invested in Bitcoin or other cryptocurrencies, concerned about housing prices, or following their 401(k)s and other investments. Televised news and economic indicators play a vital role in shaping various audiences’ perceptions of the economy.

Anchored by personalities who are informed and presentable, the television screen combines live human commentary with indexical information, graphs, and other numerical representations of different parts and states of the economy. The news anchor fixes meaning, guiding the narrative while transfixing the audience with the immediacy of their presence.

Remediation is literally the re-mediating of content through the inclusion of old media into new media. Or sometimes, including new media in an old medium, such as the use of computer and web techniques in modern television. One of the earliest media theorists, Marshall McLuhan, made these observations in the 1960s. Print remediates writing, which remediates speech. The TV anchor, an actual person of authority who “anchors” the meaning of the broadcast, is remediated on the television news screen.

However, Bolter and Grusin made a more systematic analysis in Remediation, published by MIT Press (2000).[1] They echoed McLuhan’s observation that the content of new media is old media. They also ventured that the motivation for remediation is the desire for “healed” media that provide better access to a more “authentic” version of reality. Bolter and Grusin pointed to a “double logic” of remediation: two different modes of representation that strive for better access to the real. Television has coped with this dual system of remediated representation since its origins through a variety of incorporations and innovations.

One mode of remediation is transparent immediacy, the desire for a type of immersion into the medium that erases the technology and provides an unblemished experience. The cinematic movie experience strives for this authenticity with its large screen, darkened room, and conservative camera editing practices. The viewer wants to forget the presence and techniques of the movie apparatus and believe they are in the presence of the objects of representation – the actors and sets. TV less so.

McLuhan and others argued that TV was primarily an acoustic medium, mainly because sound anchors the meaning of the visual elements. Television is a storyteller, primarily an oral one. So it is no surprise that human “anchors” on broadcast news play an important role. Anchors read the news and also conduct live interviews with guest experts for additional credibility and information. They present the major topics of the day in real-time, fixing the meaning of the broadcast, organizing the narratives of the day. Financial television borrows this sonic dominance, although it streams many other sources of data and textual news.

Many financial analysts have become celebrities, such as Mohamed A. El-Erian, Jared Bernstein, and Bill Gross. Neel Kashkari and other Fed district presidents are also very popular. These “talking heads” are brought in to contribute to the narrative, bringing their expertise remotely from different cities and countries and representing key companies or government positions.

Television news is occasionally interrupted by “Breaking News” segments that reinforce immediacy. These usually include live reporting by a journalist at a relevant location. Drone or helicopter “bird’s-eye” views enhance the dominant perspective of television news. Reports by the Fed Chair after FOMC meetings on interest rates are very popular. These events keep viewers “glued” to the screen.

[Figure: Bloomberg intraday trading screen]

Hypermediation is the other strategy; it uses techniques of representation that “foreground the medium.” Television has taken on the multi-windowed look of the computer, with different windows gathering in activities, data, and events. While the anchor is prominent (although most trading environments turn off the sound), other windows display hypermediated representations of economic and financial data streaming in from around the world. This information comes primarily in the form of charts, graphs, and indices presenting a quantitative view of the world. The reliability of this global gaze often draws on the truth claims of numeracy and remediates the spreadsheet. In particular, the visual techniques of the table are used to quickly communicate an augmented view of the economy.
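
A minimal sketch can suggest what this remediation of the spreadsheet looks like: several “windows” of indexical data composed into one screen. The panel names and values below are invented placeholders, not live quotes:

```python
# Invented placeholder values composed into text "windows," suggesting how
# the hypermediated screen remediates the spreadsheet's tabular view.
panels = {
    "INDICES":     {"DJIA": 35_000.00, "S&P 500": 4_395.26},
    "COMMODITIES": {"Gold ($/oz)": 1_800.00, "Copper ($/lb)": 4.30},
    "FX":          {"EUR/USD": 1.18, "USD/JPY": 109.90},
}

for title, quotes in panels.items():
    print(f"--- {title} ---")
    for name, value in quotes.items():
        print(f"{name:>14}: {value:>10,.2f}")
```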

Financial hypermediation has moved away from transparency. Instead, it integrates an augmented reality with indexical denotations of stock markets, prices of commodities like gold and silver, and currency exchange fluctuations. These range up to macro-indicators such as GDP, which was invented to mobilize industrial responses to the Great Depression and World War II. Marilyn Waring’s If Women Counted was a major critique of GDP because it did not count domestic work. The age of big data is also returning information that gives us a better picture of the larger economy. Unemployment statistics are a major indicator, as are the prices of commodities like gold, silver, and copper.

Financial news probably owes a debt to sports broadcasting and news. Notably, American sports like baseball, basketball, and football (gridiron) have embraced hypermediated techniques in the service of sports entertainment. While transparent immediacy is a crucial part of the enjoyment of sport, a new word, “datatainment,” has emerged as a moniker for the pleasure many people take in statistics related to their favorite teams and players. In baseball, for example, scores remain the major source of numerical pleasure, as they indicate winners and losers. But batting averages, earned run averages (ERA), and runs batted in (RBIs) are additional sources of statistical satisfaction, as the formulas sketched below suggest.
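
For readers who want the numbers behind the “datatainment,” the two averages have standard formulas (RBIs are a simple count and need no computing):

```python
# Standard baseball formulas: batting average is hits per at-bat; ERA is
# earned runs scaled to a nine-inning game.
def batting_average(hits, at_bats):
    return hits / at_bats

def earned_run_average(earned_runs, innings_pitched):
    return 9 * earned_runs / innings_pitched

print(f"AVG: {batting_average(180, 600):.3f}")    # a .300 hitter
print(f"ERA: {earned_run_average(70, 200):.2f}")  # 3.15
```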


Financial news on television combines several earlier and newer types of media to represent views of the global economy. It uses anchors and interviews with guests. It distributes economic indicators across the screen, along with scrolling ticker tapes. It tries to survey the world and paint a picture of an authentic reality that it believes its viewers will be interested in. But what are the limitations of these strategies of representation?


[1] Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 2000.
[2] Cook, Patrick J. Rev. of Remediation by Jay David Bolter and Richard Grusin. Resource Center for Cyberculture Studies (December 1999). 14 January 2001.


Anthony J. Pennings, PhD is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012 he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.
