Anthony J. Pennings, PhD


It’s the E-Commerce, Stupid

Posted on | May 7, 2023 | No Comments

The U.S. developed a comparative advantage in Internet e-commerce partly because of its policy stance. It recognized from early on, when it amended the charter of the National Science Foundation to allow commercial traffic, that the Internet had vast economic potential. Consequently, a strategic U.S. policy agenda on e-commerce emerged as a high priority in the mid-1990s.

On July 1, 1997, the Clinton administration held a ceremony in the East Room of the White House to announce their new initiative, A Framework for Global E-Commerce. Essentially, it was a hands-off approach to net business to be guided by the following five principles:

– The private sector should lead the development of the Internet and electronic commerce.
– Government should avoid undue restrictions on electronic commerce.
– Where government is needed, its aim should be to support and enforce a predictable, minimalist, consistent and simple legal environment for commerce.
– Governments should recognize the unique qualities of the Internet.
– Electronic commerce over the Internet should be facilitated on a global basis.

Clinton also asked Treasury Secretary Robert Rubin to prevent “discriminatory taxes on electronic commerce” and U.S. Trade Representative Charlene Barshefsky to petition the World Trade Organization to make the Internet a free-trade zone within the year. On February 19, 1998, the U.S. submitted a proposal to the WTO General Council requesting that bit-based electronic transmissions continue to be spared arduous tariffs.[1]

The WTO adopted the Declaration on Global Electronic Commerce on May 20, 1998. Members agreed to “continue their current practice of not imposing customs duties on electronic transmissions.” They also set out to study the trade-related aspects of global Internet commerce, including the needs of developing countries and related work in other international forums.[2]

Later that year, the OECD held a ministerial meeting on electronic commerce in Canada, and the WTO General Council adopted the Work Program on Electronic Commerce. The September meeting also mandated the WTO’s Council for Trade in Services to examine and report on the treatment of electronic commerce in the GATS legal framework.

By the time of the “Battle for Seattle” ministerial meeting in the state of Washington, the WTO already protected e-commerce from customs duties until 2001. But concerns were growing as e-commerce took off during the “dot-com craze” of the late 1990s. The E.U. and other trading partners worried in particular about the unfettered ability of software products to be downloaded across borders. France also attacked Yahoo! because its auction site trafficked in Nazi memorabilia; it got the company to remove the items and established a precedent for a nation-state policing a website in another country. The WTO produced a definition of e-commerce that suggests some of the difficulties in developing meaningful trade policy.

The WTO defined e-commerce as “the production, advertising, sale, and distribution of products via telecommunications networks.” This extensive characterization has made it challenging to classify e-commerce as falling under the framework of the GATT, GATS, or TRIPs agreements. Each had different parameters that influenced the rollout of e-commerce technologies such as the Internet and Internet Protocol Television (IPTV). Nevertheless, the long-awaited convergence of digital technologies required an overarching multilateral trade framework.


[1] WTO information on e-commerce from Patrick Grady and Kathleen MacMillan’s (1999) Seattle and Beyond: The WTO Millennium Round. Ottawa, Ontario: Global Economics Ltd.
[2] The Geneva Ministerial Declaration on Global Electronic Commerce. Second Session, Geneva, 18 and 20 May 1998.

Citation APA (7th Edition)

Pennings, A.J. (2023, May 7). It’s the E-Commerce, Stupid.



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.

Deregulating U.S. Data Communications

Posted on | May 6, 2023 | No Comments

While deregulation has largely been seen as a phenomenon of the 1980s, tensions in the telecommunications policy structure can be traced back to 1959 when the FCC announced its Above 890 Decision. This determination held that other carriers besides AT&T were free to use the radio spectrum above 890 MHz. The FCC maintained that the spectrum above that level was large enough and technically available for other potential service providers. The fact that no one applied to use the frequencies until many years later does not lessen its significance–the FCC was responding to those who might use the spectrum for data communications and other new services. Because of the Above 890 Decision, part of the telecommunications spectrum was now deregulated for non-AT&T use.[1]

The Bell organization responded to the FCC’s decision with a discounted bulk private-line service called Telpak. Although the FCC would declare AT&T’s low-end offerings (12 and 24 voice circuits) discriminatory in 1976, and illegal because of their extremely low pricing, Telpak (60 and 240 voice circuits) would persist until the middle of 1981. Users became accustomed to it and, over the years, developed property rights in it as it became part of their infrastructure. AT&T had offered Telpak to deter large users from building their own private telecommunications systems. The Above 890 Decision meant that large corporations such as General Motors could use the frequencies to connect facilities in several locations, and even with clients and suppliers. Telpak’s low tariffs, however, effectively undercut the costs of the private systems and convinced users to stay with Ma Bell.[2]

The Above 890 Decision took its toll on the major carriers, and it would have far-reaching consequences for the development of national and international telecommunications. One consequence was an alliance between potential manufacturers of microwave equipment that could operate on these frequencies and the potential new bandwidth users. The National Association of Manufacturers had a special committee on radio manufacture that lobbied hard for permission to produce equipment operating in these ranges. Retailers such as Montgomery Ward, for example, were investigating the potential of installing their own networks to connect their mail-order houses, catalog stores, and retail stores, which were dispersed widely around the country.[4]

The biggest success, however, occurred when a small company called MCI received permission to set up a microwave system between St. Louis and Chicago. The FCC was impressed with MCI’s market research, which indicated that the needs of large numbers of lower-volume users were not being met by AT&T’s Telpak services. So, despite objections from AT&T, the FCC granted the tariffs for the new routes, with both voice and customized data circuits. The MCI startup was the largest private venture initiative in Wall Street’s history up until that time.[5]

The Data Transmission Company (DATRAN) made a subsequent application to provide a nationwide data communication network that the Bell System was not offering. Other than a leased circuit over which the user could choose to transmit data, AT&T offered no specific data service. DATRAN provided its private-line service in December of 1973 and switched data service in early 1975.[6] But it ran into financial trouble and never became a threat to the Bell System. Unable to obtain funding, it ceased business in 1976. What it did do was stimulate AT&T into making a significant data-oriented response. In fact, AT&T initiated a crash program at Bell Labs to develop data transmission solutions and soon came up with Data-Under-Voice, an adequate solution for the time that required only minor adjustments to its existing long-line microwave systems.[7]

The term “online” emerged as a way to avoid the FCC’s mandate to regulate all communications. While the nascent computer industry was experimenting with data transfer over telephone lines, it was coming to the attention of the FCC, whose purview, according to the Communications Act of 1934, was to regulate “all communication by air or wire.”[8]

The agency initiated a series of “Computer Inquiries” to determine what, if any, stance it should take regarding data communications. The First Computer Inquiry, initiated during the 1960s, investigated whether data communications should be excluded from government regulation. But just as important, it provided an early voice for computer users to initiate change in the telecommunications network structure. It was, after all, a time in which the only things attached to the telephone network were black rotary phones and a few basic modems sanctioned by the Bell System. Computer One’s verdict in the early 1970s was to grant more power to corporate users to design and deploy a data communications infrastructure that would best suit their needs. The FCC subsequently created a distinction between unregulated computer services and regulated telecommunications.

Such a differentiation did not, however, ensure the successful growth and modernization of network services for eager corporate computer users. A Second Computer Inquiry was initiated in 1976 amidst widespread adoption of computer technologies by the Fortune 500, which needed to use the basic telecommunications infrastructure largely built by AT&T. Although AT&T’s Bell Labs had invented the transistor and connected SAGE’s radars over long distances to their central computers, the company was not moving fast enough for corporate users. The Bell telephone network was preoccupied with offering universal telephone service and did not, at first, see connecting large mainframes as a major market. Its hesitancy was also the result of previous regulation: the Consent Decree of 1956 had restricted AT&T from entering the computer business as well as from engaging in any international activities.

The FCC’s decision at the conclusion of the Second Computer Inquiry allowed AT&T to move into the data communications area through an unregulated subsidiary. However, the ultimate fate of domestic data communications would require the resolution of a 1974 antitrust suit against AT&T. In 1982, the Justice Department’s Consent Decree settled against the domestic blue-chip monopoly and broke up the company. This action had a dramatic influence on the shaping of data communications and the Internet until the Telecommunications Act of 1996 created a whole new regulatory model.

In retrospect, Computer One and Computer Two determined that the FCC would continue to work in the interests of the corporate users and the development of data communications, even if that meant ruling against the dominant communications carrier.


[1] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 38.
[2] ibid, p. 42.
[3] Martin, J. (1976) Telecommunications and the Computer. New York: Prentice Hall, p. 348.
[4] Phister, M. (1979) Data Processing Technology and Economics. Santa Monica, CA: Digital Press.
[5] Phister, M. (1979) Data Processing Technology and Economics. Santa Monica, CA: Digital Press. p.79.
[6] ibid, p. 549.
[7] McGillem, C.D. and McLauchlan, W.P. (1978) Hermes Bound. IN: Purdue University Office of Publications. p. 173.
[8] The transition to “online” from Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation.

Citation APA (7th Edition)

Pennings, A.J. (2023, May 6). Deregulating U.S. Data Communications.




“Survivable Communications,” Packet-Switching, and the Internet

Posted on | April 29, 2023 | No Comments

In 1956, President Eisenhower won reelection in a landslide a few months after he signed the legislation for a national interstate highway system. Although heavily lobbied for by the auto companies, Eisenhower justified the expensive project in terms of civil defense, arguing that major urban areas needed to be evacuated quickly in case of a USSR bomber attack. The year before, the USSR had detonated the infamous hydrogen bomb, with over 1,000 times the destructive force of the atomic bomb dropped on Hiroshima. A significant concern dealt with the possibility of a Soviet attack taking out crucial communications capabilities and leaving U.S. commanders without the ability to coordinate civil defense and armed forces. In particular, crucial points of the national communications system could be destroyed, bringing down significant parts of the communications network.

The need for a national air defense system fed the development of the first digital network in the 1950s, the Semi-Automatic Ground Environment (SAGE), which linked a system of radar sites to a centralized computer system developed by IBM and MIT. Its successor under NORAD found itself burrowed into the granite of Colorado’s Cheyenne Mountain. The multibillion-dollar project also created the rudiments of the modern computer industry and helped AT&T enter the data communications business.

Lincoln had set up a telegraph room in the War Department outside the White House, but President McKinley was the first U.S. president to centralize electronic information in the White House. During the Spanish-American War, at the turn of the century, McKinley followed activities in both the Caribbean and the Pacific through text messages coming into Washington, DC over telegraph lines. The Cuban Missile Crisis in 1962 made obvious the need for a new command and control system to effectively coordinate military activities and intelligence. President Kennedy’s face-off with Nikita Khrushchev over the deployment of Soviet missiles in Cuba, off the coast of Florida, sparked increasing interest in using computers for centralizing and controlling information flows.

The result was the Worldwide Military Command and Control Systems (WWMCCS), a network of centers worldwide organized into a hierarchy for moving information from dispersed military activities and sensors to the top executive. WWMCCS used leased telecommunications lines, although data rates were still so slow that the information was often put on magnetic tape and transported over land or via aircraft.[1] Unfortunately, this system failed during the Six-Day War between Egypt and Israel in 1967. Orders were sent by the Joint Chiefs of Staff to move the USS Liberty away from the Israeli coastline. Despite high-priority messages sent to the ship through WWMCCS, none were received for over 13 hours. By that time, the Israelis had attacked the ship, and 34 of the crew were killed.

In strategic terms, this communications approach had a fundamental weakness. Conventional or nuclear attacks could cut communication lines, disrupting the chain of command. The flexible response strategy of nuclear war relied on political leadership adapting and responding tactically to an escalating confrontation. The notion of “survivable communications” began to circulate in the early 1960s as a way of ensuring centralized command and control, as well as decreasing the temptation to launch a preemptive first strike.

Paul Baran of RAND, an Air Force-sponsored think tank, took an interest in this problem and set out to design a distributed network of switching nodes. Baran had worked with Hughes Aircraft during the late 1950s, helping to create SAGE-like systems for the Army and Navy using transistor technology.[2]

Baran’s eleven-volume On Distributed Communications (1964) set out a plan to develop a store-and-forward message-switching system with redundant communication links that would be automatically used if the others went out of commission. Store-and-forward techniques had been used successfully by telegraph companies. They devised methods for storing incoming messages on paper tape at transitional stations before sending them to their destination or the next intermediate stop when a line was free. At first, this was done manually, but by the time Baran confronted this issue, the telegraph companies were already beginning to use computers.[3] But this was only one part of the solution that would form the foundation for the Internet.

While Baran’s work focused more on making communication links redundant, the trajectory of his work increasingly encountered the need for computer-oriented solutions. AT&T had already built a distributed voice network for the Department of Defense, organized in “polygrids” to address survivability. Called AUTOVON, the network tried to protect itself by locating its switching centers in highly protected underground facilities away from urban areas. Baran studied this system and discovered three major weaknesses. The first was that although AT&T’s distributed system had dispersed switching nodes, the decision to switch was still located in a single operations control center.

The second problem was that the system was largely manual. Operators monitored the network from a control center, and if traffic needed to be rerouted, they would relay the proper instructions to operators at the switching nodes. The third problem was maintaining the quality of the transmission: a message would have to be rerouted many times before it reached its final destination, increasing the chances of transmission problems. His solution was a computerized network with both digital switching and digital transmission. Instead of the routing decisions coming from a staffed control center, the nodes would make the switching determinations themselves. The messages would need to be broken up into discrete packages that could be routed separately and resent if a problem occurred.

The proposed solution was packet-switching technology. This yet-to-be-devised equipment would transmit data via addressed “packets,” or what Paul Baran initially called “message blocks.” Digital bits were organized into individual blocks of information that could travel separately. Instead of single dedicated lines continuously transmitting data, packets could be sent through different routes of telecommunications lines. Still using the abstraction of store-and-forward, packets were stored briefly at the next node and switched to the best route to reach their destination. Each packet was equipped with an address as well as a portion of the message content; the scheme could eventually carry voice, video, or computer data.
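The scheme described above can be sketched in a few lines of code. This is only a toy illustration of the idea of addressed message blocks and store-and-forward routing over redundant links; the node names, packet layout, and sizes here are invented for the example, not part of Baran's actual design.

```python
import random

def make_packets(message, dest, size=8):
    """Break a message into addressed packets that can travel separately."""
    return [{"dest": dest, "seq": i, "data": message[i * size:(i + 1) * size]}
            for i in range((len(message) + size - 1) // size)]

def route(packet, links):
    """Store-and-forward: a node independently picks any surviving link."""
    working = [link for link in links if link["up"]]
    return random.choice(working)["name"] if working else None

def reassemble(packets):
    """Sequence numbers let the destination restore the original order."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

# Redundant links: one has been knocked out, but routing simply avoids it.
links = [{"name": "via-node-A", "up": True},
         {"name": "via-node-B", "up": False},
         {"name": "via-node-C", "up": True}]

packets = make_packets("SURVIVABLE COMMUNICATIONS", dest="HQ")
paths = [route(p, links) for p in packets]  # packets may take different paths
assert all(path != "via-node-B" for path in paths)  # dead link is never used
assert reassemble(packets) == "SURVIVABLE COMMUNICATIONS"
```

The key property the sketch shows is that no single link is essential: as long as any link survives, every packet still finds a way through and the message is rebuilt at the destination.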

The term “packet-switching” was actually coined by Donald Davies of the National Physical Laboratory (NPL) in England. The British government had spearheaded computer technology to win the Second World War, but in its aftermath, it sought to “win the peace” by cutting down on its military and nationalizing key industries. Having booted Winston Churchill out of the Prime Minister’s office, Britain started down a long road toward rebuilding its war-torn nation.

Britain was concerned about using computers efficiently and subsidized programs to develop data communications, but the country could not compete with the U.S.’s Cold War mobilization, which shaped computer and data communications through its massive budgets, the G.I. Bill, the Space Race, and fear of nuclear attack. The British initiatives were soon outpaced by a newly created military agency called ARPA, dedicated to researching and developing new military technologies for all the branches of the U.S. Armed Forces. It would contract out for the realization of Baran’s ideas on survivable communications, creating ARPANET, the first operational packet-switching network.[4]

Citation APA (7th Edition)

Pennings, A.J. (2023, Apr 29). “Survivable Communications,” Packet-Switching, and the Internet.


[1] Abbate, J. (1999) Inventing the Internet. Cambridge, MA: The MIT Press. pp. 31, 134.
[2] Founded by the enigmatic Howard Hughes during the Great Depression, the company was a major government contractor. Stewart Brand interview with Paul Baran, in Wired, March 2001, p. 146.
[3] Abbate, J. pp. 9-21.
[4] Abbate, J. pp. 9-21.




The Digital Spreadsheet: Interface to Space-Time, and Beyond?

Posted on | April 16, 2023 | No Comments

“I must confess that I was not able to find a way to explain the atomistic character of nature. My opinion is that … one has to find a possibility to avoid the space-time continuum altogether. But I have not the slightest idea what kind of elementary concepts could be used in such a theory.” — Albert Einstein (1954)

As an avid bike rider, I’m intrigued by the perception of reality. Accurately perceiving road conditions, nearby traffic, and even the dynamics of staying balanced on my bike all seem crucial to avoiding unpleasant experiences. I’m also intrigued by the “perceptual” characteristics of digital spreadsheets. What do they let us see? What are the power implications of these perceptions? And, at another level, what are the calculative and predictive qualities of spreadsheet formulas? Do the mathematics of spreadsheets have correspondence with reality? Are they illusions? Or do they create new realities?

This post examines connections between my investigation of spreadsheets and some of the cutting-edge theories of quantum physics, neuroscience, and how the human perceptual apparatus interacts with the world. This is not my usual fare, and it’s a big gap to traverse, so this post is exploratory and more of a search for concepts and language to frame the connection. It may not produce the intelligible results I’m hoping for, which is an understanding of how language, numbers, and mathematics operating in the grids of the digital spreadsheet are a source of productivity and power in the world. Still, a few valuable ideas may emerge about understanding digital spreadsheets and their interaction with the objective world, and possibly beyond.

Historically, I’m influenced by remediation theory, the notion that new media incorporate old media as they try to improve on previous technologies and “heal” our perception of reality. It investigates how these remediated technologies converge and produce a more “authentic” version of the world.[1] Television, for example, not only remediates the sound of radio and the optics of film to improve its experience, but now also the windowed look of computers, especially on financial channels. Even the microcomputer (PC), which went from the command-line interface in early versions to the graphical user interface (GUI) known as WIMP (“windows, icons, menus, and pointer”), became a more rectified and easier-to-use computing experience.

I’ve been using this framework to explore spreadsheet components and how they come together to create an incredible tool, or “interface,” with the world. Previous work on remediation established that its objective is to examine a medium’s relationship with reality. Each component or medium in the spreadsheet (writing, lists, tables, cells, formulas) introduces its own utility, and with it, a type of perceptual and organizing power. Furthermore, they pair up or work integratively to create a more complex cognitive/media experience.

To start off, the list is an ancient writing technology that has proven powerful for organizing armies, monasteries, and palaces.[2] The book and movie Schindler’s List (1993) showed how lists can operate as a technology of social control, as well as a technology of resistance. The list is integrated into the “table-oriented interface” of the spreadsheet, which displays and organizes information in a perceptual geometric space to show categories, cells, and relationships between data sets. The spreadsheet was the “killer app” for the PC, meaning people started to buy the personal computer just to run the application.

Spreadsheets produce a more “abstracted” version of reality. But do we really know what this means? Remediation involves cognitive/social media tools such as language, writing, numerals, and other forms of media and representation, such as simulations and mathematical formulations. In basic counting, fingers represent items that can then be displaced. Likewise, written numerals represent and aggregate large quantities. With zero and positional notation, numbers can get very large and still be manageable, manipulated to produce operations that yield intelligible and meaningful data, including information that can inform policy and strategy at the corporate and governmental level. The abstraction process renders events and inventories into quantities, dealing with ideas rather than events and items.
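The leverage of zero and positional notation can be shown in a few lines of code. This is a minimal sketch, with arbitrary example digits: each symbol contributes its value times a power of the base, which is what lets a short string of symbols represent and manipulate arbitrarily large quantities.

```python
def from_digits(digits, base=10):
    # Positional notation: each digit is worth digit * base**position,
    # accumulated from the most significant digit down.
    value = 0
    for d in digits:
        value = value * base + d
    return value

assert from_digits([4, 0, 7]) == 407            # zero works as a placeholder
assert from_digits([1, 0, 1, 1], base=2) == 11  # the same scheme works in any base
```

The same handful of symbols scales without limit: appending one digit multiplies the representable range by the base, which is the manageability the paragraph above describes.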

Words play an important function in organizing the spreadsheet. They provide both context and content. Lists are labelled, giving context to other words and numbers. Words identify things, start to substitute for them, and create relationships between them. Words become abstractions as they represent concepts or ideas rather than tangible physical objects. Words allow things to disappear and yet still linger. What, and where, is the relationship between items in cells and rows? Together, these elements transform data into aggregated or recategorized information.

Here, I want to consider the implications of Donald Hoffman’s theories of the biological and cognitive construction of space-time reality for understanding the power of spreadsheets. First, I’m struck by Hoffman’s contention, shared by physicists such as Nima Arkani-Hamed, that space-time is doomed, dissolving into more fundamental structures. This contention was bolstered by the work behind the 2022 Nobel Prize in Physics, where quantum theory won out over local realism.[3] The framework of reality sculpted by Isaac Newton, James Clerk Maxwell, Albert Einstein, and others has been incredibly useful as a rigorous exploration and extension of our corporeal space-time capabilities, but is it a sufficient explanation of full reality?

Donald David Hoffman is an American cognitive psychologist and professor at the University of California, Irvine. Interestingly, he also has joint appointments in the School of Computer Science, the Department of Philosophy, and the Department of Logic and Philosophy of Science. He works on cognitive and mathematical theories that tie perception with how we construct an objective world. His “Interface Theory of Perception (ITP)” is central to this post’s inquiry into the power of the spreadsheet.[4]

Hoffman contends that new orders of objective complexity and structure are made perceptible by mathematical computations of formulas. Hoffman argues his theory is further supported by Kurt Gödel’s famous “Incompleteness Theorem” that showed that any consistent mathematical system contains propositions that cannot be proved or disproved within the system. Therefore, digital spreadsheets can possibly be viewed as an interface to multiple and successive stages of reality, including the levels of the objective space-time paradigm, but perhaps infinitely beyond.

Wolfgang Smith echoed Arthur Eddington’s mathematics, which conjectured that the act of measurement itself summons the quantum into corporeal reality. Once you measure, you invoke the reality. Smith also contends that the act of measurement brings what he calls the physical (quantum) world into the corporeal (perceptual) world. Mathematical measurement triggers the transition from the physical to the corporeal. It brings the subcorporeal potential into the world of objective possibilities.

The world we see with our five senses is real and consequential, but our language and mathematical models invoke the quantum world in a process Smith calls vertical causation. Unlike horizontal causation, which occurs in space-time, vertical causation instantaneously links the physicist’s world with the corporeal world. But by distinguishing two ontological planes, the lived world and the physical world, you can observe certain discontinuities. Drawing on the famous Heisenberg uncertainty principle, Smith argues that the act of observation interrupts the multilocality of particles and brings them to a precise ontological place, the corporeal instrument. So, do the observational characteristics of the spreadsheet disrupt the multilocational potentials of the quantum world and bring them to an exact corporeal location?

Hoffman claims that this world is quite different from the world we construct with our perceptual apparatus. He likes to use the example of the desktop icon of a document on a personal computer. The icon is not the document. But you also don’t want to drag it over to a trash can unless you are prepared to part with that document, and probably lose hours of work. The desktop icon hides the reality of computing devices because that much information is not necessary for using them effectively, but it gives us an indexical connection. In other words, the icons we perceive have an indexical connection with reality; they are connected, but they are not indicative of all the possible domains of that reality.

We don’t generally interact with the mechanics of the computer, just as we don’t generally interact with the quantum mechanics of reality. A long line of thinkers, from the Greek atomist Democritus to Descartes’s mind-body dualism and Alfred North Whitehead’s critique of bifurcation, has addressed this issue. But maps have proven to be tethered to reality. Hoffman suggests that what we see in the world is a construction, but nevertheless one that has payoffs and consequences.[5] Iconic representations can guide useful behaviors, such as crossing the street without being hit by a BMW icon.

So, let’s consider the spreadsheet to be an interface. An interface is a site where independent and often unrelated systems interact; they transcend a boundary to connect and act on or communicate with each other. For spreadsheet use, those systems are the graphical “gridmatic” display of the application on the screen and the “reality” they interface. Reality is a big term, but the Newton-Maxwell-Einstein notion of space-time applies, to a point. So, examining the interface of the digital spreadsheet means systematically investigating its power through a “formal” analysis: examining the meaning-making capabilities of the parts (writing, lists, tables, and formulas) and of the whole working together.

Getting to the latter part of the interface is difficult, but starting with the former, we can explore the perceptual aspects. The spreadsheet has several media components that begin to address the rift between the objective and post-spacetime worlds. The spreadsheet experience is initially structured with the words and/or numbers in cells that indicate corresponding indexical and symbolic connections. The base-10 Indo-Arabic numeral system, with zero used to create positional notation, has been largely accepted worldwide as the accounting standard. Numbers also create categories for lists and rows, while tables provide visualizations of 2-D matrix relationships. One critique of Hoffman invokes the bus timetable, but it gets things backwards: it’s not that the timetable gives some indication of when the bus is coming; the timetable comes before the bus and sets up the whole transportation framework. Most intriguing is the vast array of constructions produced by the myriad formulas incorporated into spreadsheets.

The remediation contention is that the spreadsheet emerged to give us a more healed or rectified representation of reality. It does this by successfully integrating several media components, including formulas. Given that arrangement, it should be possible to continue to examine the media components and formulas as providing particular points of view that produce knowledge about specific aspects of “reality.” The components combine and build up to provide increased utility. Here, we may not get evolutionary payoffs, but power in social contexts. Formulas should also confer a utility or power that is measurable.

Take, for example, the power of the ratio. A ratio sets up relationships across time and space. It is a technique that “fixes” or freezes a relationship in order to construct a moment of reality. Ratios have analytic capacity: they can identify high- and low-performing assets, track overall employee performance, and evaluate profitability.
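As a sketch of the idea, the following Python snippet (with hypothetical figures, not drawn from this post) shows how a single ratio, return on investment, freezes a profit-to-capital relationship so assets can be compared at a glance, just as a spreadsheet formula column would:

```python
# Hypothetical asset figures, used only to illustrate the analytic power of a ratio.
assets = {
    "Asset A": {"profit": 120_000, "invested": 1_000_000},
    "Asset B": {"profit": 45_000, "invested": 250_000},
    "Asset C": {"profit": 8_000, "invested": 400_000},
}

for name, a in assets.items():
    roi = a["profit"] / a["invested"]          # the ratio: return on investment
    label = "high" if roi >= 0.10 else "low"   # arbitrary 10% threshold for illustration
    print(f"{name}: ROI = {roi:.1%} ({label} performer)")
```

The threshold and figures are arbitrary; the point is that the ratio “fixes” one moment of a relationship so that otherwise incomparable assets become rankable.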

This type of analysis requires media strategies that analyze the components and formulas of spreadsheets as signifying or meaning-making processes. The spreadsheet needs to be examined closely as an interacting media interface, while strongly considering the insights provided by a range of social and physical sciences on the other side of the interface. While Hoffman’s ocular-centric approach does not apply intimately to the spreadsheet, his insistence on a scientific approach is worthy of note and study.


This is a quest for information about how spreadsheets operate effectively in the world, so ultimately, it is quite agnostic about many mathematical, philosophical, and scientific debates. For example, I venture into the realms of discourse initiated by the Greek philosopher Plato. It starts with remediation theory’s observation that new media incorporate older media to produce a more “healed” or authentic version of reality. Digital spreadsheets integrate several important media components towards this end, each of which organizes aspects of reality. The PC and its windowed graphical user interface empowered the spreadsheet to combine words, numbers, lists, tables, and mathematical formulas.

The digital spreadsheet, a visual interface, uses language and numerical information organized for intelligibility and uses media and mathematical formulas to initiate horizontal and vertical causation. The spreadsheet interfaces the conscious agent with space-time and the quantum realm to operate in the space-time reality and to “summon” new realities through the acts of writing and measurement. According to Hoffman, consciousness has an “earthsuit” (a biological machine hosting the ghost) with a set of perceptual tools shaped by evolutionary forces that helps me ride a bike safely. Perhaps it now has an additional tool or interface, the digital spreadsheet.

A last note refers to the overall social impact. How much of the social change since 1979 can be attributed to the digital spreadsheet? A professor of mine in graduate school, Majid Tehranian, used to refer to the rise of “spreadsheet capitalism.” To what extent can we attribute the massive shift towards financialization in the early 1980s to the technology of the PC-assisted digital spreadsheet? Can we say it “conjured” a new epoch into existence?[8] If so, what were the implications of this historical shift?


[1] Bolter, J. D, and Grusin, R.A. Remediation: Understanding New Media. Cambridge, Mass: MIT Press, 1999. Print.
[2] Jack Goody’s (1984) Writing and the Organization of Society is informative on the use of the list as a historical management tool.
[3] Alain Aspect, John Clauser, and Anton Zeilinger won the 2022 Nobel Prize in Physics for experiments that demonstrated the quantum nature of reality, building on the type of experiment Einstein himself configured to challenge quantum mechanics. Smith argues this supports his notion of vertical causality.
[4] So, the Hoffman conceptualization consists of the following three interconnected theories. The first is the evolutionary natural selection view that “Fitness Beats Truth” (FBT). Reproduction comes first; accurately gauging the domains of reality is not a necessary requirement. The second is the “Interface Theory of Perception” (ITP) that I draw on here, although I find his ocular-centric (vision-focused) perspective limited, as shown below. The third is “Conscious Realism,” a fascinating set of ideas that proposes the primacy of conscious agents in the world rather than an objective space-time reality. Evolutionary game theory is where Hoffman stays to keep his theory in the scientific regime. Our biological interface with the world is evolutionarily gained. Our eyes and brains, for example, are designed (or purposely shaped) for reproductive payoffs, staying alive to procreate. I have extraordinary capabilities for surviving my bike rides (so far). But evolution has no particular interest in the truth of reality. Biological agents don’t need to handle that much complexity. They are only interested in reproductive and survival payoffs.
[5] Hoffman, D.D. (2019). The Case Against Reality: Why Evolution Hid the Truth from Our Eyes. Illustrated edition. W. W. Norton & Company. Print.
[6] According to Dr. Wolfgang Smith, the act of measurement brings what he calls the physical (quantum) world into the corporeal (perceptual) world. Quantities and qualities are real. The world we see with our five senses is real, but our language and mathematical models invoke the quantum world. Measurement is a transition from the physical to the corporeal. It brings the subcorporeal potential into the world of possibilities. He says in one world the grass is green, and in the other, “it’s all quantum stuff.”
[7] Drawing on mathematician William Dembski’s “complex specified information,” Smith suggests this is how complex designs are formed. It argues mathematically that highly improbable specified events, such as the writing of a book, are not feasible by chance. He explains, “A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified.” Unfortunately, Dembski was embraced by the “intelligent design” movement, which ultimately caused him much distress and hampered his career success.
[8] I start to use terms like “summon,” “conjure,” and “evoke” as they have a mystical, almost magical resonance. This is on purpose, but not without recognizing a potential cost in terms of acceptability and viability. There is also a touch of numeromancy and even numerology here that I want to avoid.

I’ve often approached spreadsheets from what I call a techno-epistemological inquiry. It recognizes the unique knowledge-producing capabilities that emerged with digital technologies, particularly databases and spreadsheets. This strategy has been influenced by post-structuralism and “deconstruction” methods that expose the instability of meaning and how power centers in society use language and other signifying practices to intercede to produce and fix meaning.

For example, VisiCalc and Lotus 1-2-3 began to be used in the early 1980s financial revolution to perform complex business calculations and interact with the data to evaluate different scenarios. Digital spreadsheets allowed financial analysts to inventory and value the assets of corporations and state-owned enterprises (SOEs). Subsequently, the information could be used to borrow money and take over other businesses, as well as enable government agencies and SOEs to be privatized and sold in global capital markets.

Citation APA (7th Edition)

Pennings, A.J. (2023, April 16). The Digital Spreadsheet: Interface to Space-Time and Beyond?



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. Although his major focus is on ICT, he sometimes teaches quantum theory in his introduction to science and technology class. From 2002 to 2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edward’s University in Austin, Texas, where he lives when not in the Republic of Korea.

Zeihan’s Global Prognostics and Sustainable Development, Part II: Implications of Tesla’s Master Plan 3

Posted on | March 19, 2023 | No Comments

“Prediction is very difficult, especially if it’s about the future.”

– Niels Bohr (and Yogi Berra)

This post continues the examination of Peter Zeihan’s geopolitical forecasting by contrasting it with Tesla’s guidance on achieving a sustainable global energy economy as presented in its Master Plan 3. Tesla faces extraordinary supply chain challenges in achieving its goals for building electric vehicles (EVs), charging stations, as well as battery packs for homes, businesses, and electric grids. Zeihan has warned about material constraints that will threaten sustainable development with what he considers will be the end of globalization. Nevertheless, Tesla argues that the “sustainable energy economy” is achievable in our lifetime, and a desirable goal.[1]

Tesla’s recent “Investor Day” presentations differed from previous events as it started by highlighting the so-called Master Plan 3. More of a vision for sustainable energy than a venue for future products, the Austin event wasn’t a big hit for investors looking for short-term guidance. Rather, it provided a macro view of a potential sustainable energy economy and Tesla’s contribution to that future. This plan included information on the total amount of energy currently being used worldwide, how much of that comes from sustainable renewable sources, and how much comes from fossil fuels.

Tesla Master Plan 3

Not surprisingly, it highlighted solutions that mainly favored Tesla’s product lines: large-scale batteries for homes, industrial mini-grids, and electricity producers; solar panels for houses and commercial properties; and electric vehicles for consumers and semis for long-haul transport. Particularly interesting was the management of charging and the construction of a network infrastructure of superchargers. The rest of the event elaborated on the innovations for profitable production and recycling systems, as well as the efficiencies in geographical range and charging times for EVs.

Less attention was paid to the political context for reconstituting supply chains for needed materials and minerals in the post-Covid, high-inflation, Russia-Ukraine war environment. This is more the domain of Peter Zeihan, a geostrategist with four books to his credit, mostly about the political economy of global energy flows. He worked in Austin, Texas for a decade as part of Stratfor, a forecasting company. In 2012, he started his own firm, Zeihan on Geopolitics. His primary areas of expertise are economic geography, population studies (demographics), and energy, but he considers himself a generalist who designs forecasts for specific clients. He has become a popular YouTube star primarily because of his statements about the possible end of globalization and its impacts on various countries and the supply challenges of different industries.

He also gained traction with his analysis of the Russian war on Ukraine and the consequences of continued fighting, particularly its implications for sustainable development worldwide. With both countries preoccupied or sanctioned, we face losing major suppliers of critical materials needed for the green economy. Russia is the second-largest exporter of crude and refined petroleum products, the largest source of palladium, and the second-largest source of platinum group metals (ruthenium, rhodium, palladium, osmium, iridium, and platinum), which often occur together in the same mineral deposits. We are also losing the third-largest sources of copper, titanium, and aluminum. Russia and particularly Ukraine, in combination, are also first in neon gases, which are critical for laser technology. All of these are critical for the green revolution and the continued development of the information and communication technologies (ICT) revolution.

Zeihan is also concerned that the withdrawal of political support for the US Navy operating worldwide to ensure freedom of navigation will be problematic for global trade. During World War II, the Battle of the Atlantic took over 5,000 commercial ships and 60,000 lives from Allied and Axis powers. Lack of maritime protection could collapse the intricate supply chains for the materials and sophisticated technologies needed for the sustainable energy revolution. Is that something we could see in our modern era of globalization?

While not a critic of sustainable energy, Zeihan is less confident than Tesla about its prospects. Claiming to have solar panels on his Colorado house, he is particularly concerned about the geographical distribution of good sunlight and wind energy. He points to Germany’s attempts to go green as particularly problematic. It has poor solar and wind potential and was recently cut off from Russian natural gas and oil. As a result, it has been forced to return to coal and lignite, both significant emitters of carbon dioxide and other pollutants that threaten its climate and pollution goals.

Zeihan points out that the border area around Mexico and the southwest US has significant solar and wind potential. They can provide the new efficiencies of smaller electrical grids run by renewables while still having ready access to the necessary hydrocarbons for paints, plastics, PVC resins, and other carbon-based industrial materials. Even companies from Germany are moving to the area to take advantage of cheap energy and carbon.

This geographic advantage appears to be no mystery to Tesla as it built a major “Gigafactory” in Austin, Texas, with its rooftop solar panels spelling out its logo that can be seen from space. It also announced a new Gigafactory facility across the border in Monterrey, Mexico, rumored to be designed to produce a new $25,000 consumer EV. Tesla is also building a major lithium refinery in Corpus Christi that will be designed to support 50GWh a year of storage capacity and easily draw in the abundant metal from significant producers in Central and South America. In addition, Musk’s related company, SpaceX, has been building and testing rockets in Boca Chica on the coast of the Gulf of Mexico for years now, very close to the border.

Looking back at Tesla’s Master Plan 2 in 2016, it emphasized several important strategies. These included solar roofs with integrated battery storage, expansion of the EV product line, and developing full self-driving (FSD) that is 10x safer due to massive machine learning from vehicle fleets on the road sending back information. They also suggested opportunities for using your car with FSD as a “robo-taxi” when you weren’t using it. FSD is still a work in progress, but Tesla Dojo supercomputers are collecting “big data” from over 400,000 participating drivers who have driven over 100 million miles. Tesla estimates that some 6 billion miles of driving will be needed to make FSD relatively foolproof. Modeling with digital twin vehicles is taking up some of the slack in self-driving testing, but FSD is not universally accepted nor fully tested for its impact on sustainability.

In retrospect, Tesla’s Powerwalls are now in many homes, and its Megapacks (Megapack, Megapack 2, Megapack 2XL) are making a significant difference in both mini and major electric grids. For the latter, the Megapacks have drawn praise for their Virtual Machine Mode (VMM) firmware that smooths out oscillations in long-range electrical grid transmissions. In addition, Megapacks have been standardized to an optimal size based on the legal requirements for transporting them over common road infrastructure. This standard means they can also be easily and quickly transported and deployed in various storage arrangements that can be scaled up quickly.

Master Plan 3 has a more global macro-perspective examining what Tesla thinks is needed for a sustainable civilization. It proposes that the sustainable energy economy is achievable and within reach during our lifetimes. It starts with some calculations dealing with quantities of electricity produced. The Master Plan suggests that the world needs some 30 TWh of ongoing renewable power capture/generation and a backup of 240 TWh of battery storage capacity. This storage number is a lot of battery capacity, but the trend is toward reducing cobalt, nickel, and even lithium. Instead, using more metals like iron, magnesium, and phosphorus provides safer, longer-term energy storage. The TWh (terawatt-hour) is used in measuring energy and is equal to a million (1,000,000) megawatt-hours (MWh) or a billion (1,000,000,000) kilowatt-hours (kWh). A Tesla uses about 34 kWh of electricity to go 100 miles. That’s 34,000 kWh per 100,000 miles of travel. Those figures are hard to fathom, and keep in mind that petawatt-hour (PWh) units are even larger than TWh.
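The unit conversions above can be checked with a few lines of arithmetic (the consumption figure is Tesla’s round number as quoted here):

```python
# Sanity-checking the energy-unit conversions quoted in the text.
kwh_per_mwh = 1_000
mwh_per_twh = 1_000_000        # 1 TWh = one million MWh
kwh_per_twh = 1_000_000_000    # 1 TWh = one billion kWh
assert mwh_per_twh * kwh_per_mwh == kwh_per_twh

kwh_per_100_miles = 34         # Tesla's approximate EV consumption figure
print(kwh_per_100_miles * 1_000, "kWh per 100,000 miles")  # 34000
```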

PWh units (a trillion kWh) are helpful when considering fossil fuels. The world uses some 165 PWh/year, roughly 80% of which comes from combusting extracted hydrocarbons. Furthermore, about 2/3 of that is wasted. For example, an ICE car only uses about 25% of the energy of the fuel pumped into it at the fuel station. Factor in the mining and transportation of carbon-based fuels, and you get even less efficiency. So Tesla argues that instead of the 165 PWh/yr of current energy consumption (of which 20% is renewable), the world only needs 82 PWh/yr of energy if a transition occurs to sustainable sources. That means the world needs half as much energy as current consumption levels if it converts to an electric economy, because of the waste factor in hydrocarbon combustion.
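A minimal sketch of that halving arithmetic, using only the round figures quoted above:

```python
# All figures are Tesla's round numbers, reproduced here only to check consistency.
total_pwh = 165                        # current world energy use, PWh/yr
fossil_share = 0.80
fossil_pwh = total_pwh * fossil_share  # ~132 PWh/yr from hydrocarbons
wasted_pwh = fossil_pwh * (2 / 3)      # ~2/3 of combustion energy lost as waste heat
needed_pwh = 82                        # Tesla's estimate after electrification, PWh/yr

print(f"fossil: {fossil_pwh:.0f} PWh/yr, wasted: {wasted_pwh:.0f} PWh/yr")
print(f"needed after electrification: ~{needed_pwh} PWh/yr "
      f"(~{needed_pwh / total_pwh:.0%} of today's consumption)")
```

The 82 PWh figure works out to roughly half of current consumption, which is the basis for Tesla’s claim.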

Significant transitions will likely occur in the following areas. First, the switch to renewable sources for electricity grids (instead of coal, natural gas, and oil) is expected (by Tesla) to replace 46 PWh/yr of fossil fuel combustion (35%). This transition is already occurring quite rapidly. Sixty percent of new investments in the existing grid have been in renewables.

EVs will replace 28 PWh/yr (21%), and high-temperature heat (stored heat for industrial purposes) and hydrogen will displace another 22 PWh/yr (7%). Bill Gates has invested heavily in micro-grids using mirrors to concentrate solar energy and heat liquids to temperatures of over 1,000 degrees Celsius. Geothermal and hydrogen are used as well. Replacing fossil fuels in boats and aircraft would reduce another 7 PWh/yr (5%). Electric vertical take-off and landing vehicles (eVTOLs) are on pace to reconfigure certain supply runs and delivery speeds, bypassing trucking and trains. Shipping is already energy efficient compared to other types of transportation but still accounts for 3% of CO2 greenhouse gas (GHG) emissions. Finally, heat pumps in buildings will be vital to the move to sustainable energy. These work like air conditioning (AC) units in reverse, moving heat by circulating refrigerants. Tesla has no current plans to produce them, but they could replace some 29 PWh/yr of fossil fuels, primarily natural gas.

Another issue is real estate. Tesla says that only 0.2% of the Earth’s 510 million square kilometers of surface area is required. But less than 30%, or 153 million square kilometers, is land. So 0.2% of that is about 306,000 square km. They further calculate that the solar direct land area needed is 0.14%, while the wind direct land area is 0.03%. So, that is a lot of necessary land, but this is energy we are talking about, and it is absolutely critical for modern life.
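The land-area figure follows directly from the percentages (a quick check, using rounded Earth figures):

```python
# Checking the land-area arithmetic (Tesla's percentages; Earth figures rounded).
earth_surface_km2 = 510_000_000
land_km2 = earth_surface_km2 * 0.30    # ~153 million square km of land
required_km2 = land_km2 * 0.002        # 0.2% of the land area
print(f"{required_km2:,.0f} square km required")  # ~306,000 square km
```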

Lastly, Tesla suggests this can be done with roughly USD 10 trillion in capital investment. Most of that (70%) will be required to switch to EVs, and another 10% for planes and ships. USD 2 trillion will be needed for the renewable energy grid, heat pumps for buildings, and high-temperature heating for industrial processes. Ten trillion dollars is about 10% of the 2022 World GDP of USD 95 trillion.
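The investment breakdown can be cross-checked quickly (all figures are Tesla’s, rounded):

```python
# Cross-checking the USD 10 trillion investment breakdown quoted in the text.
total_usd_t = 10.0
ev_usd_t = 0.70 * total_usd_t            # 70% for the switch to EVs
planes_ships_usd_t = 0.10 * total_usd_t  # 10% for planes and ships
grid_heat_usd_t = 2.0                    # grid, heat pumps, industrial heat

assert ev_usd_t + planes_ships_usd_t + grid_heat_usd_t == total_usd_t

world_gdp_usd_t = 95.0                   # 2022 world GDP, USD trillions
print(f"{total_usd_t / world_gdp_usd_t:.1%} of 2022 world GDP")  # 10.5%
```

The three components do sum to the USD 10 trillion total, and the total comes to roughly a tenth of 2022 world GDP, as the text states.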

Tesla’s Investor Day presentation broadly sketched a vision for a sustainable energy economy and how the company would contribute to that plan. However, Peter Zeihan’s work suggests a tougher road ahead with limited premium locations for solar and wind. Furthermore, a deglobalization trend and geopolitical conflict threaten access to critical resources needed for a green energy revolution.


[1] This post contains links, but most of the information comes from either Tesla or Zeihan. Additional examination of the energy numbers, geopolitical concepts, and technological possibilities is needed.

Citation APA (7th Edition)

Pennings, A.J. (2023, Mar 19). Zeihan’s Global Prognostics and Sustainable Development, Part II: Implications of Tesla’s Master Plan 3.


Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edward’s University in Austin, Texas, where he lives when not in the Republic of Korea.

Technostructural Stages of Global ICT for Development (ICT4D)

Posted on | January 31, 2023 | No Comments

Summarized remarks from my presentation on 30 January 2023 to the Interdisciplinary Doctoral Program in Communication and Information Sciences (CIS) at the University of Hawaii. In this talk, I also attempted to emphasize the contributions made by some of the faculty and research fellows at the University of Hawaii and the East-West Center who pioneered the area of information and communication in development (ICT4D).

“We shape our buildings, and afterwards our buildings shape us.” – From Winston Churchill’s address to the British Parliament that was reworked by the Marshall McLuhan movement into “We shape our tools and thereafter our tools shape us.”

This presentation will outline and explore economic and social development stages or phases utilizing information and communication technologies (ICT). The ICT acronym has emerged as a popular moniker, especially in international usage, for the digital technology revolution and is often combined with “development” to form ICT4D. Development is a contested term with currency in several areas. Still, in global political economy, it refers to building enabling environments and infrastructure needed to improve the quality of human life and bridge equity divides. Often this means enhancing a nation’s agriculture, education, health, and other public goods that are not strictly economy-related but improve well-being and intellectual capital. But it has developed commercial and e-commerce applications as well.

ICT4D is increasingly tied to the Sustainable Development Goals (SDGs) developed by the United Nations in 2015. Created during the time of Secretary-General Ban Ki-moon, 17 Sustainable Development Goals (SDGs) were approved by the General Assembly for completion by 2030. They replaced the UN Millennium Development Goals (MDGs) that were designed for poor countries in the “South.” Now, SDGs recognize that all countries need to implement sustainable development practices, especially countries in the “North” that are hindered by legacy infrastructure and policy frameworks.

Returning to the ICT4D acronym, Martin Hilbert presents a 3-D framework in the video below that is helpful for those unfamiliar with the topic. It does not provide a historical perspective, but the cube framework helps distinguish the development of ICT technologies, services, and skills, from how ICT is used for development as well as how policy instruments can shape ICT implementation and regulation in a national action plan.

While Hilbert’s framework is useful, it lacks the historical depth that can provide a richer analysis of global ICT4D. My research shows that information and communication issues, often associated with ICT4D, have been at the forefront of international discussion and debate for decades. While technological convergence has changed how entertainment, news, and commercial content flow within and among countries, most nations have struggled with the cultural, economic, and political consequences of those flows and sought collective protection. Understanding the significance of global ICT for development will be useful for shaping relevant governance and investment strategies.

Of particular interest in this study is the transformation of national public-switched telecommunications from analog transmission into digital networks based on Internet Protocols (IP). While this change was not warmly welcomed, it allowed these networks to implement data, video, and voice services and eventually offer wireless mobile capabilities, all of which have become crucial technologies for ICT4D.[1] The resulting hypertextual World Wide Web also became the foundation for innovative data-intensive solutions that are being applied to many development issues with search applications and behavior-based advertising platforms. However, a growing concern is that “big data” is being collected extensively and used intrusively to manipulate personal behaviors, shape markets, and influence policy.

The ICT4D stages or phases that have played out globally over time and that will be addressed are categorized as:

1) Containment/Modernization; 2) New World Economic and Information Orders; 3) Structural Adjustment and Re-subordination, 4) Global ICT Integration, 5) Digital Borders and Authoritarianism (recently added), and 6) Smart/Sustainable Development.

Using a techno-structural approach, the explication of these stages will provide historical context for understanding trends in ICT innovation and implementation. This approach recognizes the reciprocal effects between technological developments and institutional power. ICT, in particular, can enhance corporate and government power. It is important however to consider the “dual-effects hypothesis” to acknowledge the parallel potentials of ICT4D as both a disempowering and an enabling force. ICTs have often “played a dual role in the service of centralization and the dispersion of power.”[2]

Research on ICT4D implementation and policy is being conducted around the world. This research project differs from much of that work by acknowledging the deeper roots of ICT4D internationally. This includes a historical perspective that considers the implications of earlier versions of communications and computer technology, and an analysis of national computerization policies that will provide insights into the trajectories and possibilities of ICT diffusion in developing environments.

1) Containment/Modernization

The US emerged as the primary economic and military power in the aftermath of World War II. Arms and materiel sales before the war had transferred much of the world’s gold to the US, where it was moved inland to Fort Knox, Kentucky, to avoid Nazi capture. Franklin Delano Roosevelt (FDR) had sought to limit finance in the aftermath of the “Crash of 1929” and continued it globally with the initiative at Bretton Woods Hotel in New Hampshire. “Bretton Woods” created the International Monetary Fund (IMF), the World Bank, the International Trade Organization (rejected by Congress), and also instituted a dollar-gold standard that tied the US dollar to gold at $35 an ounce (oz) and other international currencies to the US dollar at set rates. This system was designed to “contain” international financial speculation and have the IMF coordinate currency values while individual nation-states would manage trade-based economies with Keynesian macroeconomic coordination.

Another aspect of containment, more widely known, is the containment of the Communist agenda. The painful success of the USSR in World War II in countering the Nazis on the eastern European front and their appropriation of German atomic and rocketry technology presented an ideological and military threat to the US and its allies. The US reaction to the USSR’s launch of Sputnik satellites in 1957 resulted in the formation of NASA and ARPA (which later developed packet-switching). The Communist revolution in China in 1949 and the explosion of an atomic bomb on Oct. 16, 1964, spurred additional concerns and US military intervention in Vietnam to contain Communism in Southeast Asia. The resultant “Cold War” and “Space Race” spurred technological development and competition for “developing countries” worldwide. This included competition over the appropriate development models for economic growth and the “development” of key sectors such as agriculture, education, and health.

“Modernization” characterized the U.S. post-World War II prescription for economic development worldwide, especially in newly decolonized nation-states. Modernization referred to a transitional process of moving from “traditional” or “primitive” communities to modern societies based on scientific rationality, abstract thought, and the belief in progress. It included urbanization, democratization, economic growth, and “psychic mobility” that types of media could influence. Also, the free flow model of information was championed by modernists to counter state regulation and censorship of the media and especially propaganda by communist opponents.

The general view of modernization theories was that developing countries needed to be transformed from agricultural-subsistence economies to liberalized modern industrial countries. Inspired by the success of the Marshall Plan, the North (both Western and Communist) studied strategies for transforming these traditional societies into modern, industrialized economies. Economists and communication scholars in the West generally described the process as one of progressive developmental stages and presumed a linear, evolutionary and universal process. Higher levels of civil order and industrial development could be achieved through change and capital.

The leading proponent of this economic philosophy was Walter Whitman Rostow, whose Stages of Economic Growth: A Non-Communist Manifesto (1960) outlined five stages that would lead to a modern nation. These were: 1) traditional society, 2) preconditions for takeoff, 3) takeoff, 4) drive towards maturity, and 5) age of high mass consumption. In a later book (1965), he would add a sixth: the search for quality. Traditional societies needed to overcome their “resistances” to economic growth in order to achieve the all-important economic take-off stage that would lead them to the dream of economic and political modernization.

Scholars following Walter W. Rostow adopted the “stages” of economic development leading to an eventual “takeoff” into the realm of a modern society. If the proper regimen was followed, particularly the adoption of new agricultural techniques termed the “Green Revolution” that was heavily dependent on fertilizers and pesticides, developing countries could modernize.[3] Through the process of emulating the West, developing countries could attain increasing rates of economic growth and political democracy. Systems of bureaucracy and tradition, seen as endemic to developing countries, could be overcome through policies which liberalized markets, democratized politics, encouraged foreign trade and investment, and encouraged breaking down traditional cultures.

Interestingly, orthodox communism also presumed a linear path for economic growth, but one based not on capital but in labor and the “forces of production.” Capitalism was merely one stage in their development schemata that would eventually lead to a communist apex, liberating the working classes. The highest level to be achieved would be one of socialist equality, not mass consumption. Communism proved a staunch competitor to the West, especially in Asia and South America. India, for example, while keeping its British legal system and adopting a parliamentary system of democracy, largely incorporated the Soviet mode of industrialization. Jawaharlal Nehru, newly decolonialized India’s first Prime Minister, split from his friend and mentor Mahatma Gandhi, who stressed self-sufficiency and domestic production. Nehru would follow a state-led form of economic development based on the Russian industrial model.

The South was influenced by both these ideologies but began to perceive their own conditions as a deformation or underdevelopment due to unequal relations between North and South countries. Raul Prebisch, former head of the Argentine Central Bank, started using the terms “center” and “periphery” in his lectures during the 1940s. He critiqued the influence of the most powerful countries on the international division of labor, the declining prices of commodity goods (agriculture, minerals, and forestry), and their negative impact on local industrialization.[i] Several forums became important for expressing these concerns, notably the United Nations Conference on Trade and Development (UNCTAD) formed after the 1964 meeting that sought to protect young industries in the Third World.

1) C4D to ICT4D

Daniel Lerner’s The Passing of Traditional Society (1958) argued that the psychic mobility of “peasants” could be manipulated by mass media, transforming traditional societies into modern consumer societies. It provided a crucial intellectual foundation for communication scholars such as Wilbur Schramm and Everett Rogers and was instrumental in the field known as “Communication for Development” (C4D), a precursor to ICT4D. Part of the debate was whether it was Communication for Development or Development Communication. Later, the 2006 World Congress on Communication for Development defined C4D as ‘a social process based on dialogue using a broad range of tools and methods. It is also about seeking change at different levels including listening, building trust, sharing knowledge and skills, building policies, debating and learning for sustained and meaningful change.’

Information and Communications Technologies (ICTs) were rarely stressed, but “five communication revolutions” (print, film, radio, television, and later, satellites) were beginning to be recognized as contributing to national development.[4] These technologies were beginning to spread information and imagery about practices in agriculture, health, education, and national governance. The hope of social mobility was also enhanced by entertainment and advertising images of a modern consumer society in magazines and television.

Some early computerization projects continued population analysis, such as census work that had begun with tabulation machines. Mainframes and minicomputers were increasingly utilized by government agencies to gather statistics for development planning.

Telegraphy and telephones were strangely absent from much of the early discussion but were important for government activities as well as commercial operations such as large-scale plantations, mining operations, transportation coordination, and maritime shipping. Because of their large capital requirements and geographic expanse, countries uniformly developed state-controlled Post, Telephone, and Telegraph (PTT) entities with the help and guidance of the International Telecommunication Union (ITU). The ITU is the oldest United Nations entity, having been formed in the 1860s with the widespread adoption of the telegraph. PTTs struggled to provide basic voice and telegraphic services; however, they provided much-needed jobs, technical resources, and currency for the national treasury.

International Telephone & Telegraph (ITT) was a private company that transcended borders and sold electronic equipment to PTTs, laid undersea telecommunication cables, and bought into telephone and telegraph companies worldwide. However, ITT suffered from political complications and diversified its holdings in the 1960s and 1970s, diminishing its role in international telecommunications.

Wilbur Schramm’s (1964) book Mass Media and National Development made crucial links between media and national development and became very influential in developing countries. Published by Stanford University Press and UNESCO, it examined the role of newspapers, radio, and television. Its emphasis on the role of communication and information in development also laid the foundation for the analysis of computerization and ICT in the development process.

I had an office next to Schramm for many years while I worked on the National Computerization Policy project at the East-West Center’s Communication Institute that he founded. We were inspired by The Computerization of Society: A Report to the President of France (1982), otherwise known as the Nora-Minc Report. Syed Rahim and I (as an intern) worked with the Asian Media Information and Communication Centre (AMIC) in Singapore to produce Computerization and Development in Southeast Asia (1987), which proved to be a benchmark study on ICT4D. AMIC is an international, non-profit, non-government organization currently based in Manila that has been a major research hub for media and information technologies and their influence on development.

Herbert Dordick, Meheroo Jussawalla, and Deane Neubauer were other key scholars conducting research on the transition from Communication for Development (C4D) to ICT4D at the East-West Center in Honolulu. But others dealing with news flow were important as well and can give us insights into concerns that erupted in response to modernization.

2) New World Economic and Information Orders

In the early 1970s, Nixon ended the Bretton Woods regulation of the dollar-gold standard, resulting in very volatile currency markets. The Nixon administration convinced Saudi Arabia to sell its oil only for dollars, so countries needed to procure dollars for their energy needs. Oil prices increased, and dollars flowed into OPEC countries, only to be lent out to cash-poor developing countries in syndicated lending networks known as petrodollar markets. These instabilities created “stagflation” around the world and a general sense of “malaise,” as President Jimmy Carter called it. They also brought the concerns of the emerging South to the forefront of debates in international forums.

Rising frustrations and concerns about neo-colonialism due to the power of transnational corporations (TNCs), especially news companies, resulted in a collective call by developing countries for a New International Economic Order (NIEO) and a “New World Information and Communication Order.” These calls were echoed by many NGOs in the wake of the OPEC oil shocks and the pressures of the impending Third World debt crisis.

From 1964, the United Nations Conference on Trade and Development (UNCTAD), along with the Group of 77 and the Non-Aligned Movement (NAM), were forums for discussions about the NIEO. Major concerns involved:

  • Commodity price stabilization.
  • Restructuring international trade to reduce developed country tariff and non-tariff barriers.
  • Diversifying developing economies through industrialization.
  • Integrating developing countries’ economies into regional free trade blocs like the Caribbean Community.

These concerns would lead, in part, to the Marxist-informed dependency and underdevelopment paradigms that saw global economic development in terms of power relationships between core geographic areas and weaker periphery areas. This dynamic had first been discerned in intra-national conditions, where major cities developed advantages over the countryside, and was then extended to international conditions, where developed countries held advantages over lesser developed countries.

Other important issues were news flow and the imbalanced flow of information between North and South. Developing countries were concerned about the flows of news and data between developing and developed countries. In part, it was the preponderance of journalism dealing with disasters, coups, and other calamities that many felt portrayed their countries in a negative light and restricted incoming foreign investment.[5] Also, magnetic tape, conditioned leased telephone lines, and value-added networks were making computer-based information easier to transport and transmit.

The term New World Information and Communication Order (NWICO), sometimes shortened to New World Information Order (NWIO), also began circulating in these forums. It was picked up by the MacBride Commission, set up by the United Nations Educational, Scientific and Cultural Organization (UNESCO) in 1977 and led by Nobel Peace Prize winner Seán MacBride to address media concerns. Its October 1980 report, Many Voices, One World, called for the democratization of communication and news flow and for strengthening national media and journalistic practices. Its recommendations included making global media representation more equitable and national media less dependent on external sources, particularly news agencies such as Agence France-Presse (AFP), Associated Press (AP), Reuters, and United Press International (UPI).

It was followed by concerns about obstacles hindering communications infrastructure development and how telecommunications access across the world could be stimulated. In 1983, UNESCO’s World Communication Year, an Independent Commission met several times to discuss the importance of communication infrastructure for social and economic development and to make recommendations for spurring its growth.

The Commission consisted of seventeen members – primarily communication elites from the private and public sectors representing several countries. Spurred on by growing optimism about the development potential of telecommunications, they investigated ways that Third World countries could be supported in this area. They published their recommendations in The Missing Link (1984), soon to be called the “Maitland Report” after its Chair, Sir Donald Maitland of the United Kingdom. This report brought recognition to the role of telecommunications in development and opened up resources from international organizations such as the World Bank.

The transition from telegraph and telex machines to computers and data communications also raised concerns about information transcending national boundaries. As a result, the Intergovernmental Bureau for Informatics (IBI), which had been set up as the International Computation Centre (ICC) in 1951 to help countries access major computers, began to study national computerization policy issues in the mid-1970s. The IBI assisted its member countries in understanding the impact of technology on society and taking advantage of the opportunities it presented.

The IBI increasingly focused on transborder data flows (TDF) that moved sensitive corporate, government, and personal information across national boundaries. The first International Conference on Transborder Data Flows was organized in September 1980, followed by a second in 1984; both were held in Rome, Italy. The increasing use of computers raised questions about accounting and economic data that avoided political and tax scrutiny. The concern was that these data movements could act like a “Trojan horse,” compromising a country’s credit ratings, national sovereignty, and citizens’ privacy.

3) Financialization and Structural Adjustment

Instead of a New International Economic Order, a new era of “structural adjustment” enforced by the International Monetary Fund emerged that targeted government entities and restrictions on the flow of financial news, investment, and lending. A major target was State-Owned Enterprises (SOEs), especially post, telephone, and telegraph (PTT) agencies and other aspects of government administration and commercial ownership.

The North–South Summit, officially the International Meeting on Cooperation and Development, was held in Cancun, Mexico at the end of October 1981. The summit was attended by representatives of 22 countries from 5 continents, including the leaders of Britain, France, and the US. Newly elected US president Ronald Reagan told world leaders that in order to get foreign aid, developing nations needed to lower taxes, reduce government spending, pursue market-oriented economic policies, and privatize government agencies. Development, he proclaimed, was not a question of “East versus West” but of “sense versus nonsense.” Development, he added, is created and sustained by private initiative, not coerced by governments. The sentiment was echoed by Prime Minister Margaret Thatcher, who said developing countries needed to attract private investment and the “continuing flow of bank lending” with fair treatment for investment from abroad and the continual demonstration of creditworthiness.

This new system would require a higher level of news flow and transparency. Reagan ordered that financial support for UNESCO end in December 1983, citing the politicization of “a free market and a free press.” The move was applauded by the news companies, which had balked at the proposed licensing of journalists, an international code of press ethics, and increased government control over media content. It was opposed by many decolonized states in Africa and Asia that were leaning towards Soviet bloc positions. But as the Space Race placed satellites in geosynchronous orbit and fiber optic cables began to traverse the ocean floors, a new paradigm of global information flow was emerging for data-enhanced supply chains and credit flows.

The Reagan administration retasked the IMF from harmonizing currency relationships to changing the political economy in debt-ridden countries. Petrodollars had circulated to international banks to be lent out to countries needing cash for energy purchases and infrastructure projects such as bridges and highways. The resultant “Third World Debt Crisis” provided the lever to force the Reagan-Thatcher agenda on countries as they needed IMF approval for additional loans. A primary target was the telecommunications infrastructure.

Long considered agents of national development and employment, PTTs came under increasing criticism for their antiquated technologies and lack of customer service. Thatcher aggressively pursued the sale of British Telecom shares to the public, while the US broke up the behemoth AT&T into regional Bell operating companies (RBOCs). New Zealand adopted the privatization model, citing the opportunity to retire a third of its national debt; it sold its national telecommunications operator to two RBOCs, which were required to sell 51 percent of the “Kiwi shares” to the domestic public. It was soon followed by Australia and other countries such as Singapore.

The flow of petrodollar lending and rising national debt in borrowing countries put pressure on PTTs to add new value-added data networks and undergo satellite deregulation. Global circuits of digital money and news emerged, such as Reuters Money Monitor Rates and SWIFT (Society for Worldwide Interbank Financial Telecommunication). These networks, among the first to use packet-switching, linked currency exchange markets worldwide in arguably the first virtual market. The volatility of financial markets due to the continual decontainment of finance added to the interest in telecommunications.

A new techno-economic imperative emerged that changed the relationship between government agencies and global capital. PC-based spreadsheet technologies were utilized to inventory, value, and privatize PTTs to be corporatized and listed on electronically linked share-market exchanges. Communications markets were liberalized to allow domestic and international competition for new telecommunications services. Sales of digital switches and fiber optic networks increased. Globalization was proceeding at an unprecedented scale.

Developing countries became “emerging markets,” consistently disciplined by what became known as the “Washington Consensus,” a set of policy prescriptions stressing continued openness to transborder data flows and international investment.[6] Citicorp CEO Walter Wriston talked of the movement from the gold-dollar standard to an electronic “information standard,” coordinating the movement of capital to places that treated it well.

The gold-backed international currency framework that shaped the global economy after World War II was replaced by a new techno-financial system of computerized transactions, news flow, risk management, and technical analysis. US treasuries were the new foundation for the continuation of the Eurodollar as the US dollar strengthened its status as the global reserve currency and vehicle for transactions. Gold became just another dataset in the complex trading algorithms that shaped the prices and trades of the global financial system and consequently shaped the policy decisions of nations and the flows of investment capital.[7]

4) Global ICT Integration

The Internet became useful after Mosaic was developed at the National Center for Supercomputing Applications (NCSA) in late 1992 and released in 1993. The NCSA was funded by the High-Performance Computing Act of 1991, otherwise known as the “Gore Bill,” after its sponsor, Senator Al Gore Jr. Mosaic was developed by Marc Andreessen and other graduate students to take advantage of the hypertext protocols developed at CERN and the resource-sharing capabilities of the National Science Foundation’s NSFNET. They would go on to form Netscape, and its new web browser would be freely distributed to individuals and licensed to companies. When Congress altered NSFNET’s acceptable use policy in March 1991 to allow commercial traffic, the Internet was seen as the harbinger of a new economy.

Vice President Al Gore Jr. targeted national PTT monopolies and government regulatory agencies when he introduced the Global Information Infrastructure (GII) at the annual ITU meeting in Buenos Aires in March of 1994. There he proposed a new model of global telecommunications based on competition instead of monopoly. He stressed the rule of law and the interconnection of new networks to existing networks at fair prices. In some ways he was echoing his father, Al Gore Sr., who had championed the interstate highway system across the US during the 1950s.

    I am very proud to have the opportunity to address the first development conference of the ITU because the President of the United States and I believe that an essential prerequisite to sustainable development, for all members of the human family, is the creation of this network of networks. To accomplish this purpose, legislators, regulators, and business people must do this: build and operate a Global Information Infrastructure. This GII will circle the globe with information superhighways on which all people can travel.

Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations. The Marrakesh Agreement aimed to create a multilateral trading system encompassing the GATT as well as the results of all the previous trade rounds conducted since 1947. Significantly, that included the GATS (General Agreement on Trade in Services), which covered everything from circuses, education, radio, and television to telephone and data services. This process eventually made international data-based services, including video, very cheap. Also at Marrakesh, the parties called for the creation of the World Trade Organization (WTO).

Formed in early 1995, the WTO held two meetings in 1996 and 1997 that created the new era of global communications and development. Members party to the new multilateral arrangement met quickly in Singapore in 1996 to reduce tariffs on the international sales of a wide variety of information technologies. The Information Technology Agreement (ITA) was signed by 29 participants in December 1996. The agreement was expanded at the Nairobi Ministerial Conference in December 2015 to cover an additional 201 products valued at over $1.3 trillion per year. The agreements allowed Korea to successfully market early CDMA mobile handsets and develop a trajectory of success in the global smartphone market.

In 1997, the WTO met in Geneva and established rules for the continued privatization of national telecommunications operations. Sixty-nine WTO member nations, including the US, signed the Agreement on Basic Telecommunications Services in 1997, which codified new rules for telecommunications deregulation: countries agreed to privatize and open their telecommunications infrastructures to foreign investment and competition from other telcos.

The agreements came at a crucial technological time. The World Wide Web (WWW) was a working technology, but it would not have lived up to its namesake if the WTO had not: 1) negotiated reduced tariffs for crucial networking technologies and computer devices, and 2) negotiated a new services arrangement that reduced the prices of international communications, including video. The resultant liberalization of data and mobile services around the world made possible a new stage in global development.

Packet-switching technologies that had been standardized into the ITU’s X.25 and X.75 protocols for PTT data networks gave way to ubiquitous TCP/IP networks by the late 1990s. Cisco Systems became the principal enabler with multi-protocol routers designed for enterprises, governments, and telcos. Meanwhile, Lucent, Northern Telecom, and other telecommunications equipment suppliers quickly lost market share as the Internet protocols, mandated by the US military’s ARPANET and later by the National Science Foundation’s NSFNET, were integrated into ISDN, ATM, and SONET telecommunications technologies around the world.

5) Hypertext, Ad Markets, and Search Engines

The online economy emerged with the Internet and its hypertext click environment. Starting with advertising and the keyword search and auctioning system, a new means of economic production and political participation emerged, based on the wide-scale collection and rendition of behavioral data for prediction products and services.

As Shoshana Zuboff points out in Surveillance Capitalism (2019), the economy expands by finding new things to commodify, and the Internet provided a multitude of new products and services that could be sold. When the Internet was privatized in the early 1990s and the World Wide Web (WWW) established the protocols for hypertext and webpages, new virtual worlds of online media spaces were enabled. These were called “inventory” – ad space available on a website, news site, or other web publication. The type of ad served was based on the information gathered about the viewer from registered clicks.

Behavioral data is the information produced as a result of actions that can be measured on a range of devices connected to the Internet, such as a PC, tablet, or smartphone. Behavioral data tracks the sites visited, the apps downloaded, or the games played. Cloud platforms claim human experience as free raw material for translation into behavioral data. Some of this data is applied to product or service improvements; the rest is declared a proprietary behavioral surplus and fed into advanced manufacturing processes known as ‘machine intelligence.’ Automated machine processes can capture knowledge about behaviors but also shape behaviors.

Surplus behavioral and instrumental data is turned into prediction products, such as recommendation engines for e-commerce and entertainment, that anticipate what people will do now, soon, and later. Prediction products are traded in a new kind of marketplace for behavioral predictions called behavioral futures markets. These are currently used primarily in advertising systems based on click-through rates (CTR), Pay-Per-Click (PPC), and real-time bidding auction systems.
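The auction mechanics behind such PPC systems can be illustrated with a minimal sketch. The Python below shows a simplified generalized second-price auction, where ads are ranked by bid times predicted CTR and the winner pays only enough to outrank the runner-up; the advertiser names, bids, and click-through rates are hypothetical, not drawn from any real system.

```python
# Simplified sketch of a generalized second-price (GSP) keyword auction,
# the kind of mechanism popularized by PPC advertising systems.
# All advertisers, bids, and CTRs below are hypothetical.

def run_auction(ads):
    # Rank ads by expected revenue per impression: bid x predicted CTR
    ranked = sorted(ads, key=lambda a: a["bid"] * a["ctr"], reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    # Second-price rule: the winner pays just enough per click
    # to keep its rank above the runner-up
    price_per_click = runner_up["bid"] * runner_up["ctr"] / winner["ctr"]
    return winner["name"], round(price_per_click, 2)

ads = [
    {"name": "a", "bid": 2.00, "ctr": 0.05},  # expected value 0.100
    {"name": "b", "bid": 3.00, "ctr": 0.02},  # expected value 0.060
    {"name": "c", "bid": 1.00, "ctr": 0.04},  # expected value 0.040
]

winner, price = run_auction(ads)
print(winner, price)  # "a" wins, paying less than its $2.00 bid
```

Note that ad “a” wins despite not having the highest bid: its predicted click-through rate makes it more valuable per impression, and it pays only what is needed to stay ahead of the runner-up.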

While ICT has a long history in global affairs, digital technologies largely emerged from the commercialization of Cold War and Space Race technologies. They were deeply molded by financialization and globalization processes in the 1970s and 1980s. Shaping the global infrastructure for the Internet involved aggressive political stances in remaking telecommunications systems. Deep changes were needed to allow hypertext protocols and TCP/IP technologies to be standardized and implemented worldwide for economic and social uses.

6) Digital Borders and Authoritarianism

Despite cyberspace’s promise of a world without digital borders, nationalistic concerns started to re-emerge. In the post-colonial era, domestic powers were often keen to limit flows of financial information that might act as a type of “Trojan horse,” passing national boundaries without scrutiny. Before the leverage of petrodollar debt opened up the flows of information and capital that characterized neo-liberal financialization, national telecommunications borders worldwide were policed technologically and politically. The PTTs operated as an electronic moat that would be nearly unrecognizable by the turn of the century due to the push to liberalize and privatize the telco environment.

7) Smart/Sustainable Development

We recently passed the halfway point for the Sustainable Development Goals (SDGs) outlined by the United Nations in 2015. The SDGs provide additional impetus for global ICT4D, encouraging infrastructure building and support for crucial development activities that ICT can assist, such as monitoring land and sea resources, encouraging entrepreneurship, and providing affordable health information and communication activities.

The aggressive trade negotiations and agreements in the 1990s significantly reduced the costs of ICT devices and communication exchanges worldwide, making possible a wide variety of new commercial and development activities based on ICT capabilities. Countries significantly reduced their tariffs on electronics and services imports, making possible their adoption and diffusion among broader populations. Now, smart technologies are produced from complex international supply chains that advance technology and reduce prices – but are subject to the vagaries of globalization.

ICT4D is highly reliant on affordable digital devices and network equipment. A key variable going forward is the value of the dollar, the world’s primary reserve and transacting currency. A global shortage of dollars due to higher interest rates and greater political risk means increased prices for imported goods, regardless of agreements that lower import tariffs. In addition, the post-Covid crisis and the invasion of Ukraine have stressed supply chains of critical materials and rare earth minerals, further adding to the potential costs of ICT and the geopolitical risk facing sustainable development.


But there are characteristics of ICT4D that can offset the potential problems of digital equipment. The near-zero marginal costs for digital products make information content and services more accessible for developing countries. Books, MOOCs, and other online services provide value to a vast population with minimal costs to reach each additional person. Platform-based services providing agricultural, health, and other development services provide low-cost accessibility and outreach. They allow new applications to scale significantly with low costs.
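The economics of near-zero marginal cost can be made concrete with a back-of-the-envelope calculation. The sketch below uses entirely hypothetical figures – a $1M fixed cost to build a digital service and $0.02 to serve each additional user – to show how the average cost per user collapses as adoption scales.

```python
# Illustrative arithmetic of near-zero marginal cost for a digital service.
# The fixed cost ($1M to build) and per-user cost ($0.02) are hypothetical.

def average_cost(fixed_cost, marginal_cost, users):
    # Total cost (fixed plus per-user) spread across every user served
    return (fixed_cost + marginal_cost * users) / users

for users in (1_000, 100_000, 10_000_000):
    cost = average_cost(1_000_000, 0.02, users)
    print(f"{users:>10,} users -> ${cost:,.2f} per user")
```

At a thousand users the fixed cost dominates; at ten million, the average cost per user approaches the tiny marginal cost itself, which is why platform-based books, MOOCs, and health services can scale so cheaply.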

Incidentally and significantly, renewable energy sources like solar and wind also provide near-zero marginal costs for producing electricity. Like digital products, they require high initial investments but produce continuous output at low cost once operational. Offshore windmills, for example, can figuratively print money once the facility is set up and operating.

Mobility, broadband, and cloud services are three significant technologies presenting positive prospects for ICT4D. Mobile broadband technologies that bypass traditional wireline “last mile” infrastructure have been a major boost for ICT4D. They provide significant connectivity across a wide range of the population and with key commercial and government entities. 4G LTE technologies currently provide the optimal service, as 5G towers consume over 60% more power than LTE. 5G also requires more base stations and antennas because its range is shorter.

Enhanced connectivity strengthens network effects. Blockchain technologies and cryptocurrencies, the Internet of Things (IoT), and the proliferation of web platforms are some of the current conceptions of how reduced costs for communications and information analysis are enhancing network effects and creating value from the collection and processing of unstructured data.

Concluding Remarks

This project looks at the historical phases and international dynamics of ICT4D. The “five communication revolutions” (print, film, radio, television, and later, satellites) had a significant impact on the development of national identities within a global context. These technologies were often enlisted in the contest between the US and its allies and the Communist countries. The telegraph and telephone were powerful commercial technologies that often synchronized with tabulating machines to create information collection processes and the nascent data processing industry. They were also central to the news agencies that were expanding and feeding newspapers worldwide.

These technologies provided a powerful edge for globalizing companies that many countries considered a new type of colonialism. Many countries joined in the “non-aligned movement” to protest the strong globalizing trends and call for a “New World Information and Communication Order” in UN forums that would address economic and information imbalances between the powerful “North” and the weaker “South” countries.

The North doubled down on the emerging technological imperatives and the markets they enabled. They stressed breaking up state-owned enterprises and especially government-owned telecommunications systems that impeded international data flows. PC-enabled spreadsheets helped investment banks and financial traders inventory and value SOEs to be sold off to investors. The accumulation of debt during the 1970s was a lever that opened up countries to flows of capital and associated news.

Fiber optics and digital switches enhanced national and international telecommunications. The development of Internet technologies brought additional pressures on national information systems. The WTO brought together services and technology concerns into a wide set of agreements to reduce tariffs on both to make ICT widely available and relatively cheap worldwide.

Mobile, broadband, and online platforms are currently deemed most useful for ICT4D. They have shown their potential, primarily in smaller-scale projects, to assist health, education, and other sustainable development activities. But ICT is a global technology that is difficult to replicate at a national scale. Technologies cheaper than leading-edge digital devices, such as Orange’s 4G Sanza Touch, are becoming more available but still have ties to more extensive economic interests. Orange, for example, is a French-based telecommunications provider that also operates in many African countries.

In explaining these phases further, I will provide context for a deeper investigation of ICT for development, drawing on historical and current research. Of particular concern is the implementation of policies and practices related to contemporary sustainable development, but commercial and monetization techniques are important as well.


[1] Dordick, Herbert S. and Neubauer, Deane. 1985. “Information as Currency: Organizational Restructuring Under the Impact of the Information Revolution.” Bulletin of the Institute for Communications Research, Keio University, No 25, 12–13. This journal article was particularly insightful into the dynamics of the PTTs that would lead to pressures on them to adapt IP technologies leading to the World Wide Web.
[2] The techno-structural approach in development can be attributed to a debate between Ithiel de Sola Pool and Majid Tehranian. Pool’s Technologies of Freedom, written by the MIT professor primarily at the University of Hawaii Law Library, put forward a strong libertarian thesis about communications technologies and how technical change in this dynamic area should be treated by the legal/regulatory system. Tehranian felt compelled to respond with his “technostructuralist” book Technologies of Power, emphasizing that new technologies are guided in their development and deployment along existing lines of corporate, governmental, and military power. Other scholars, especially Richard Heeks and Tim Unwin, have reframed the Communications/Computerization/Development debate in terms of ICT4D but put its origins in the 1980s. Heeks, R. (2017) Information and Communication Technology for Development (ICT4D) (Routledge Perspectives on Development), 1st Edition. Also Unwin, T. (2017) Reclaiming Information and Communication Technologies for Development. Oxford University Press.
[3] Rostow, W.W. (1960) Stages of Economic Growth: A Non-Communist Manifesto. Cambridge: Cambridge University Press. See also Rostow W.W., (1965) Stages of Political Development. Cambridge: Cambridge University Press.
[4] Lerner, D. (1976). 16. Technology, Communication, and Change. In W. Schramm & D. Lerner (Ed.), Communication and Change: The Last Ten Years – and the Next (pp. 285-301). Honolulu: University of Hawaii Press.
[5] An excellent discussion of the various development and new world communication and economic order calls can be found in Vincent, R. C., Galtung, J. (1992). Global Glasnost: Toward a New World Information and Communication Order? United States: Hampton Press. Also, see Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communications Institute. Honolulu, Hawaii.
[6] The Washington Consensus was a set of policy prescriptions involving fiscal discipline, liberalization of trade and the inflows of capital, privatization, tax reform, and the deregulation of a wide range of industries to allow competition and foreign ownership.
[7] Wriston, W.W. (1992) The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World.

[i] Chilcote, R.H. (1984) Theories of Development and Underdevelopment. Boulder, CO: Westview Press.

Citation APA (7th Edition)

Pennings, A.J. (2020, Feb 19). Five Technostructural Stages of ICT for Global Development.



Anthony J. Pennings, Ph.D. is a Professor in the Department of Technology and Society, State University of New York, Korea, where he manages an undergraduate program with a specialization in ICT4D. His background in ICT4D goes back to the Computerization Policy Project at the East-West Center in Honolulu, Hawaii, in the mid-1980s as an undergraduate research intern. After getting his PhD from the University of Hawaii, he moved to New Zealand to teach at Victoria University in Wellington. There he saw the complex international pressures shaping the island nation. From 2002-2012 he was on the faculty of New York University. His US home is in Austin, Texas.

The Expanse: Cyberpunk in Space

Posted on | January 21, 2023 | No Comments

The cyberpunk movement emerged in the mid-1980s and has lingered on, with occasional stories enticing audiences' imaginations with their futuristic narratives and visions. Frances Bonner offered the 4 Cs of cyberpunk to examine various science fiction stories in visual media.[1] These four categories of computers, corporations, corporeality, and criminality provide a way to gauge whether a narrative can be classified as part of the cyberpunk genre. I also use them as categories for socio-technical analysis of current conditions, but in this case I will conduct a more traditional genre analysis by examining a science fiction TV series.

In these posts, I use Bonner’s 4 Cs of cyberpunk to examine The Expanse, which premiered in 2015 on the Syfy network and has continued on Amazon Prime, completing its 6th season in 2022. The cyberpunk genre is interesting to me for its insights into social dynamics.

Recent examples of cyberpunk in visual media include Blade Runner 2049 (2017), Ghost in the Shell (2017), and several video games such as the controversial Cyberpunk 2077. Bonner considered the ABC Network show Max Headroom probably the most characteristic example of cyberpunk, although her goal was not necessarily to create a canon of cyberpunk.

The Expanse (2015) is based on the novels of James S. A. Corey, the joint pen name of authors Daniel Abraham and Ty Franck. Their first novel, Leviathan Wakes, won the famous science fiction Hugo Award in 2012. They wrote eight more books in the series, which centers on the crew of the Rocinante, a fusion-drive-powered spaceship salvaged from the Martian Congressional Republic Navy (MCRN).

The Expanse is about the colonization and commercialization of our solar system and some exploration beyond. It focuses on the terraforming of Mars and the mining of the Asteroid Belt. Earth is under the governance of the United Nations and in conflict with Mars, and with another faction called the Outer Planets Alliance (OPA). The OPA is an organization consisting mainly of the "Belters," who do most of the construction and mining and are bitter about their conditions and exploitation. These workers colonized the Asteroid Belt, the dwarf planet Ceres, and some moons around the outer planets. Belters have developed their own culture, most notably a creole "pidgin" language, but even their bodies have diverged from those on Earth and Mars, growing long and thin due to the reduced gravity.

The 4 Cs is a framework for conducting a genre inquiry and building knowledge by addressing the categories and letting them discipline the investigation. I've been using it more to develop a socio-technical analysis of various tech products in areas such as AI, energy, biochemistry, nanotechnology, robotics, and even space travel. Still, in this case, I'm using it for traditional genre analysis to get a sense of where science fiction is guiding our imagination. What is it saying about who we are and where we are heading as a civilization?

In general, I’m more interested in applying insights provided by cyberpunk to the social analysis of various technologies and their interrelationship with culture, economics and politics.[2] This is an exercise I often do with students. Each category of the 4Cs is an opportunity to produce evidence towards a better understanding of the complex world we live in.

But it's useful to stay in touch with the stories and do the genre analysis. As I'm on winter break, I'm enjoying watching the series again. In the next post of this series, I will discuss how each of the 4 Cs (Corporations, Criminality, Corporeality, and Cyberspace) relates to The Expanse.


[1] Bonner, F. (1992). Separate Development: Cyberpunk in Film and Television. In G. Slusser & T. Shippey (Eds.), Fiction 2000: Cyberpunk and the Future of Narrative (pp. 191-206). Athens, GA: The University of Georgia Press.

[2] See Pennings, A. (2018, August 13). The Cyberpunk Genre as Social And Technological Analysis. Retrieved December 08, 2022, from

Citation APA (7th Edition)

Pennings, A.J. (2023, Jan 21). The Expanse: Cyberpunk in Space https://

Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea, teaching sustainable development and visual rhetoric. From 2002-2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and media production. He also taught film and television production at Marist College in New York and Victoria University in Wellington, New Zealand. When not in the Republic of Korea, he lives in Austin, Texas.

ICT4D and Digital Development in a Changing World

Posted on | December 14, 2022 | No Comments

Prepared remarks as moderator for the 2nd Annual ICT4D Faculty Panel: Digital Development in a Changing World. Monday, Nov 28, 2022 at the State University of New York, Korea (SUNY Korea) in Songdo, Republic of Korea.

This year, 2022, marks the halfway point for the 2030 Agenda for Sustainable Development. Created by the United Nations during the tenure of Secretary-General Ban Ki-moon, the 17 Sustainable Development Goals (SDGs) were approved by the General Assembly for completion by 2030. The SDGs were meant to create a "shared blueprint for peace and prosperity for people and the planet, now and into the future." At this halfway point, the SDGs face new challenges and opportunities.

This panel will address issues related to Information and Communications Technologies for Development (ICT4D) and some of the challenges coming out of the COVID-19 pandemic and the war in Ukraine, including geopolitical constraints on material supplies and digital technologies.

SDGs are Sustainable Development Goals

A contemporary concern threatening digital and sustainable development is the strength of the US dollar, the primary global reserve currency and the vehicle for over 80% of international transactions. A worldwide shortage of the dollar (except in the US, apparently) has increased its nominal value and turned it into an appreciating investment instrument, raising its purchase price. A strong US dollar makes imports more expensive, including digital and renewable energy technologies. Purchasing smart devices, even less expensive ones like Orange's Sanza Touch, a 4G Android device, can become more costly. Solar panels, windmills, and other renewable technologies are also likely to be imported, and additional dollars are needed for those as well. An alternative to the dollar may emerge in the next several decades, maybe even a crypto version, but that doesn't seem likely right now.[1]
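The underlying mechanics are simple currency arithmetic. A minimal sketch in Python, using hypothetical figures (the $60 device price, the exchange rate of 550 local units per dollar, and the 20% appreciation are illustrative assumptions, not actual data):

```python
# Hypothetical illustration: a good invoiced in US dollars gets more
# expensive in local currency when the dollar appreciates.

def local_import_price(usd_price: float, units_per_usd: float) -> float:
    """Local-currency cost of a dollar-priced import."""
    return usd_price * units_per_usd

# A $60 smart device at an assumed rate of 550 local units per dollar:
before = local_import_price(60, 550)        # 33,000 local units
# After a 20% dollar appreciation (rate rises to 660 per dollar):
after = local_import_price(60, 550 * 1.2)   # 39,600 local units
# The import now costs 20% more locally, though its dollar price is unchanged.
```

The same proportional increase hits every dollar-invoiced import, which is why a strengthening dollar squeezes technology budgets in developing economies across the board.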

Another issue is the looming limitation on material supplies, particularly the valuable elements and metals needed for ICTs and renewable energy technologies. A sustainable revolution needs massive shifts toward infrastructure development, and that means higher demand for a wide range of resources that need to be mined, refined, and distributed. The loss of crucial materials from Russia and Ukraine means energy, food, and natural resource shortages that will influence green development. Steel, nickel, aluminum, copper, bauxite, neon, palladium, etc., critical elements of modern technologies, have suffered heavy supply shocks. Food supplies are also increasingly insecure, with potash and nitrogen-based fertilizers produced from natural gas becoming more scarce. Constraints on ICTs, natural resources, and renewable energies will present significant challenges for meeting the objectives and targets of the SDGs.

Relentless pollution and atmospheric changes affecting climate security also drive strategic questions affecting digital development. At stake are biodiversity, ocean acidification, and human habitats prone to droughts, floods, and wildfires. ICT solutions have been identified for their potential to reduce greenhouse gas (GHG) emissions by up to 15 percent by 2030.[2] Greenhouse gases increase the atmosphere's temperature, allowing it to absorb more water. More moisture in the air means more weather volatility, leading to droughts in some areas while increasing the intensity of hurricanes/typhoons along ocean coasts. For example, Hurricane Ida in late August of 2021 was extremely intense. After it came out of the Gulf of Mexico to hit Louisiana, it had enough energy and moisture to travel north across the US mainland and drop in on New York City and surrounding areas, killing an additional 50 people.

Meanwhile, a number of factors offer hope for a sustainable future and the role of ICT. Green funding has increased, particularly with the Inflation Reduction Act (IRA) in the US, and hopefully, recent commitments at the 2022 United Nations Climate Change Conference (COP 27) will be fruitful for climate justice as well as continued reductions in air pollution. Green New Deals are spreading worldwide, particularly in the stressed EU, which had counted too much on Russian imports of hydrocarbons.

Korea has also envisaged a Green New Deal along with a Digital New Deal under the previous Moon administration, stressing its "fast follower" strategy to capitalize on its export industries to solve world problems while providing a protective safety net at home. Japan even recently voted Hyundai Motor's all-electric Ioniq 5 SUV its "Import Car of the Year."

Korean Green New Deal

Technological innovations have also been striking. Battery innovations, especially for long-term storage by companies like Ambri, can ease the shortages caused by intermittent energy sources such as wind and solar. While many battery materials are scarce, the cobalt, lead, and lithium in spent batteries can be successfully recycled.

Most significantly, the economics of renewables, with admittedly huge upfront investment requirements, have near-zero marginal costs over medium and long-term time frames. For example, once a windmill is constructed and operating, the subsequent costs are minimal, and as it produces electricity, wind “prints” money.
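The windmill point can be made with back-of-the-envelope arithmetic. All figures below are hypothetical, chosen only to illustrate how a large sunk capital cost coexists with a near-zero marginal cost per kilowatt-hour:

```python
# Hypothetical figures for one onshore wind turbine (illustrative only).
capex = 1_500_000        # upfront capital cost, USD (sunk once built)
annual_om = 30_000       # yearly operations & maintenance, USD
annual_kwh = 6_000_000   # yearly electricity output, kWh
lifetime_years = 20

# Simple levelized cost per kWh (discounting ignored for clarity):
levelized = (capex + annual_om * lifetime_years) / (annual_kwh * lifetime_years)

# Marginal cost of one more kWh, once the turbine is running, is just O&M:
marginal = annual_om / annual_kwh

# levelized comes to $0.0175/kWh; marginal is only $0.005/kWh. Once the
# capital is sunk, each additional kWh costs a fraction of the all-in average.
```

With these assumed numbers, nearly three-quarters of the all-in cost is the sunk investment; every kilowatt-hour sold thereafter recovers capital at almost no additional cost, which is the sense in which wind "prints" money.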

Climate awareness offers hope and motivation. Many climate deniers are still questioning the basic science and physical evidence, but the political will is growing to take impactful action to mitigate weather-related disasters and adapt to the changes.

What is the continuing role of ICT for Development (ICT4D) in this emerging post-COVID-19 environment? The Earth Institute has identified three key ICT (Information and Communications Technology) accelerators of the SDGs. These are mobility, broadband, and cloud services, such as platforms using AI and “big data.” These technologies can often “leapfrog” legacy communication and database systems to provide new efficiencies and opportunities in agriculture, education, energy, and healthcare. For example, while needing to be monitored with data justice precepts, digital ID programs can provide new eligibilities for financial inclusion, personal ownership, and social programs such as health services.

These key accelerators were woven into the BS in Technological Systems Management (TSM) specialization in ICT4D (Formerly ICT4SD) at SUNY Korea. This forum brings together a panel of faculty that teach these courses, including EST 371 – Data Science Management, EST 372 – The Mobile Revolution in Development, and EST 320 – Communication System Technologies, as well as today’s audience from EST 230 – ICT for Sustainable Development. We also cover related topics in our MS degree in Technological Systems Management, which includes a specialization in Digital Technologies in Disaster Risk Reduction. And, of course, several of our Ph.D. students in the Technology, Policy, and Innovation (TPI) program have pursued related research topics for their dissertations. All our degrees are conferred by Stony Brook University in New York.

Please welcome our faculty to today’s panel discussion.

James Larson
Suzana Brown 
Sira Maliphol
Jinsang Lee
Joseph Cabuay
Sangchan Park

Citation APA (7th Edition)

Pennings, A.J. (2022, December 14). ICT4D and Digital Development in a Changing World.

[1] The US dollar has been global since the 1950s as eurodollars, and has had electronic versions since the late 1960s.
[2] Sachs, J. D. (n.d.). ICT and SDGs: How Information and Communications Technology Can Accelerate Action on the Sustainable Development Goals. Retrieved from


Anthony J. Pennings, PhD is a professor at the State University of New York (SUNY), Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s. He heads up the ICT4D specialization at SUNY Korea.
