Anthony J. Pennings, PhD

WRITINGS ON DIGITAL STRATEGIES, ICT ECONOMICS, AND GLOBAL COMMUNICATIONS

Five Technostructural Stages of Global ICT4D

Posted on | January 31, 2023 | No Comments

Summarized remarks from "Five Technostructural Stages of Global ICT4D," presented to the CIS PhD program at the University of Hawaii on 30 January 2023. In this talk, I also emphasized the contributions made by some of the faculty and research fellows at the University of Hawaii and the East-West Center.


This presentation will explore and outline the following stages of economic and social development utilizing information and communications technologies (ICT). The ICT acronym has emerged as a popular moniker, especially in international usage, for the digital technology revolution and is often combined with "development" to form ICT4D. Development is a contested term with currency in several areas. Still, in global political economy, it refers to the process of building the environments and infrastructure needed to improve the quality of human life and bridge equity divides. Often this means enhancing a nation's agriculture, education, health, and other public goods that are not strictly economic but improve well-being and intellectual capital.

Of particular interest is the transformation of public-switched networks to Internet Protocol (IP)-based services and then to WiFi and mobile use. Data-intensive solutions are beginning to address many development issues. However, a growing concern is that data is being collected extensively and used intrusively to manipulate behaviors.

The stages are categorized as:

1) Containment/Development/Modernization; 2) New World Economic and Information Orders; 3) Structural Adjustment and Re-subordination; 4) Global ICT Integration; and 5) Smart/Sustainable Mobile and Data-Driven Development.

Using a techno-structural approach, the explication of these stages will provide historical context for understanding trends in ICT innovation and implementation. This approach recognizes the reciprocal effects between technological developments and institutional power and employs a "dual-effects hypothesis" to illustrate the parallel potentials of ICT4D as both a democratizing and a totalizing force. This research will also provide insights into the possibilities of ICT diffusion in developing environments.

1) Containment/Development/Modernization

The US emerged as the primary economic and military power in the aftermath of World War II. Arms and materiel sales before the war had transferred much of the world's gold to the US, where it was wisely moved inland to Fort Knox, Kentucky. Franklin Delano Roosevelt (FDR) had sought to limit finance in the aftermath of the "Crash of 1929" and extended that effort globally with the 1944 conference at Bretton Woods, New Hampshire. "Bretton Woods" created the International Monetary Fund (IMF), the World Bank, and the International Trade Organization (rejected by Congress), and also instituted a dollar-gold standard that tied the US dollar to gold at $35 an ounce and other international currencies to the US dollar at set rates. This system was designed to "contain" financial speculation and encourage trade and development.

The other aspect of containment, more widely known, is the containment of Communism. The painful success of the USSR in World War II in countering the Nazis on the eastern front and its appropriation of German atomic and rocketry technology presented an ideological and military threat to the US and its allies. The USSR's launch of the Sputnik satellites in 1957 resulted in the US formation of NASA and ARPA. The Communist revolution in China in 1949 and China's explosion of an atomic bomb on October 16, 1964, spurred additional concern. The resultant Cold War and Space Race drove technological development and competition for "developing countries" worldwide.

"Development" and "modernization" characterized the post-World War II US prescription for economic development around the world, especially in newly decolonized nation-states. Modernization referred to a transitional process of moving from "traditional" or "primitive" communities to modern societies based on scientific rationality, abstract thought, and the belief in progress. It included urbanization, economic growth, and a "psychic mobility" that could be influenced by various types of media. Scholars talked of an eventual "takeoff" to a modern society if the proper regimen was followed, particularly the adoption of new agricultural techniques termed the "Green Revolution."[1] Information and Communications Technologies (ICTs) were rarely stressed, but "five communication revolutions" (print, film, radio, television, and later, satellites) were beginning to be recognized as making a contribution to national development.

Communication technologies were beginning to spread information about modern practices in agriculture, health, education, and national governance. Some early computerization projects continued population analysis, such as census work that had started with tabulation machines, while mainframes and minicomputers were increasingly utilized by government agencies for statistical gathering in support of development processes.

Telegraphy and telephones were strangely absent from much of the discussion but were important for government activities as well as large-scale plantations, mining operations, transportation coordination, and maritime shipping. Because of their large capital requirements and geographic expanse, countries uniformly developed state-controlled Post, Telephone, and Telegraph (PTT) entities. Organized with the help and guidance of the International Telecommunication Union (ITU), the oldest United Nations specialized agency, PTTs struggled to provide basic voice and telegraphic services. However, they provided needed jobs, technical resources, and currency for the national treasury.

Wilbur Schramm's (1964) book Mass Media and National Development made crucial links between media and national development. Published by Stanford University Press and UNESCO, it examined the role of newspapers, radio, and television. Its emphasis on the role of information in development also laid the foundation for the analysis of computerization and ICT in the development process. I had an office next to Schramm for many years at the East-West Center's Communication Institute, which he founded, while I worked on the National Computerization Policy project that resulted in the ICT4D benchmark study Computerization and Development in Southeast Asia (1987). Herbert Dordick, Meheroo Jussawalla, Deane Neubauer, and Syed Rahim were key scholars in the early years of ICT4D at the East-West Center.[1a]

2) New World Economic and Information Orders

Rising frustrations and concerns about neo-colonialism due to the power of transnational corporations (TNCs), especially news companies, resulted in a collective call by developing countries for various conceptions of a "New World Economic and Communication Order." It was echoed by UNESCO in the wake of the OPEC oil shocks and the resulting Third World debt crisis. The issue was primarily news flow and the imbalanced flow of information between North and South. Developing countries were also concerned about the unequal flows of data from developing to developed countries. In part, the concern was the preponderance of news dealing with disasters, coups, and other calamities, which many countries felt restricted flows of foreign investment. The calls caused a backlash in the US and other developed countries concerned about the independence of journalism and the free flow of trade.[2]

These debates were followed by concerns about obstacles hindering communications infrastructure development and about how telecommunications access could be stimulated across the world. In 1983, the World Communication Year, the ITU's Independent Commission for Worldwide Telecommunications Development met several times to discuss the importance of communication infrastructure for social and economic development and to make recommendations for spurring its growth.

The Commission consisted of seventeen members, communication elites from both the private and public sectors representing a number of countries. Spurred on by growing optimism about the development potential of telecommunications, they investigated ways Third World countries could be supported in this area. They published their recommendations in The Missing Link (1984), soon to be known as the "Maitland Report" after its chair, Sir Donald Maitland of the United Kingdom. This report brought recognition to the role of telecommunications in development and opened up resources from international organizations such as the World Bank.

The transition from telegraph and telex machines to computers also resulted in concerns about data transcending national boundaries. The Intergovernmental Bureau for Informatics (IBI), which had been set up as the International Computation Centre (ICC) in 1951 to help countries get access to major computers, began to study national computerization policy issues in the mid-1970s.

They increasingly focused on transborder data flows (TDF) that moved sensitive corporate, government, and personal information across national boundaries. The first International Conference on Transborder Data Flows was organized in September 1980, followed by a second in 1984; both were held in Rome, Italy. The increasing use of computers raised questions about accounting and economic data avoiding political and tax scrutiny. The concern was that these data movements could act like a "Trojan horse" and compromise a country's credit ratings and national sovereignty, as well as individual privacy.

3) Structural Adjustment and Re-subordination

Instead of a new international economic order, a new era of "structural adjustment" enforced by the International Monetary Fund emerged that targeted national post, telephone, and telegraph (PTT) agencies and other aspects of government administration and ownership. Long considered agents of national development and employment, PTTs came under increasing criticism for their antiquated technologies and lack of customer service.

In the early 1970s, Nixon ended the Bretton Woods regulation of the dollar-gold standard, resulting in very volatile currency markets. Oil prices increased, and dollars flowed into OPEC countries, only to be lent out to cash-poor developing countries. The flow of petrodollar lending and rising "Third World debt" pressured PTTs to add new value-added data networks and undergo satellite deregulation. Global circuits of digital money and news emerged, such as Reuters Money Monitor Rates and SWIFT (Society for Worldwide Interbank Financial Telecommunication). These networks, early adopters of packet switching, linked currency exchange markets worldwide in arguably the first virtual market.

A new techno-economic imperative emerged that changed the relationship between government agencies and global capital. PC spreadsheet technologies were utilized to inventory, value, and privatize PTTs so they could be corporatized and listed on electronically linked share-market exchanges. Communications markets were liberalized to allow domestic and international competition for new telecommunications services and for sales of digital switches and fiber-optic networks. Developing countries became "emerging markets," consistently disciplined by the "Washington Consensus," a set of policy prescriptions designed to keep them open to transborder data flows and international trade.[3]

4) Global ICT Integration

Packet-switching technologies, standardized into the ITU's X.25 and X.75 protocols for PTT data networks, gave way to ubiquitous TCP/IP networks by the late 1990s. Cisco Systems became the principal enabler with a series of multi-protocol routers designed for enterprises, governments, and eventually telcos. Lucent, Northern Telecom, and other telecommunications equipment suppliers quickly lost market share as the Internet protocols, mandated by the US military's ARPANET and later by the National Science Foundation's NSFNET, were integrated into ISDN, ATM, and SONET technologies in telcos around the world.

The Global Information Infrastructure (GII), introduced by Vice President Al Gore at the ITU's World Telecommunication Development Conference in Buenos Aires in March 1994, targeted national PTT monopolies and government regulatory agencies. He proposed a new model of global telecommunications based on competition instead of monopoly. He stressed the rule of law and the interconnection of new networks to existing networks at fair prices. Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations, which called for the inclusion of the GATS (General Agreement on Trade in Services), covering everything from circuses to education, radio and television, and telecommunications services. At this meeting, the parties also called for the creation of the World Trade Organization (WTO).

Formed in 1995, the WTO held two meetings in 1996 and 1997 that ushered in a new era of global communications and development. Members of the new multilateral arrangement met quickly in Singapore in 1996 to reduce tariffs on international sales of a wide variety of information technologies. The Information Technology Agreement (ITA) was signed by 29 participants in December 1996. The agreement was expanded at the Nairobi Ministerial Conference in December 2015 to cover an additional 201 products valued at over $1.3 trillion per year. These agreements allowed Korea to successfully market early CDMA mobile handsets and develop a trajectory of success in the smartphone market.

In 1997, the WTO met in Geneva and established rules for the continued privatization of national telecommunications operations. Sixty-nine WTO member nations, including the US, signed the Agreement on Basic Telecommunications Services in 1997, which codified new rules for telecommunications deregulation under which countries agreed to privatize and open their telecommunications infrastructures to foreign penetration and competition by other telcos.

The agreements came at a crucial technological time. The World Wide Web (WWW) was a working technology, but it would not have lived up to its name if the WTO had not negotiated reduced tariffs for crucial networking and computer equipment. The resulting liberalization of data and mobile services around the world made possible a new stage in global development.

Hypertext, Ad Markets, and Search Engines

The online economy emerged with the Internet and its hypertext click environment. Starting with advertising and the keyword search and auctioning system, a new means of economic production and political participation emerged, based on the wide-scale collection and rendition of surplus behavioral data for prediction products and services.

As Shoshana Zuboff points out in Surveillance Capitalism (2019), the economy expands by finding new things to commodify, and the Internet provided a multitude of new products and services that could be sold. When the Internet was privatized in the early 1990s and the World Wide Web (WWW) established the protocols for hypertext and webpages, new virtual worlds of online media spaces were enabled. These were called "inventory." Or you can call them ad space.

Behavioral data is the information produced as a result of actions that can be measured on a range of devices connected to the Internet, such as a PC, tablet, or smartphone. Behavioral data tracks the sites visited, the apps downloaded, and the games played. Cloud platforms claim human experience as free raw material for translation into behavioral data. Some of this data is applied to product or service improvements; the rest is declared proprietary behavioral surplus and fed into advanced manufacturing processes known as "machine intelligence." Automated machine processes can capture knowledge about behaviors but also shape behaviors.

Surplus behavioral and instrumental data is turned into prediction products such as recommendation engines for e-commerce and entertainment. These anticipate what people will do now, soon, and later. Prediction products are traded in a new kind of marketplace for behavioral predictions called behavioral futures markets. These are currently used primarily in advertising systems based on click-through rates (CTR), Pay-Per-Click (PPC), and real-time bidding auction systems.
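
To make the auction mechanics concrete, here is a minimal sketch in Python of how a keyword ad auction might combine advertiser bids with predicted click-through rates. The figures and the simplified second-price rule are illustrative assumptions for this post, not any particular platform's actual algorithm.

# Rank ads by expected revenue per impression (bid x predicted CTR);
# the winner pays just enough per click to outrank the runner-up.
ads = [
    {"advertiser": "A", "bid": 2.50, "predicted_ctr": 0.010},
    {"advertiser": "B", "bid": 1.80, "predicted_ctr": 0.020},
    {"advertiser": "C", "bid": 3.00, "predicted_ctr": 0.005},
]

ranked = sorted(ads, key=lambda a: a["bid"] * a["predicted_ctr"], reverse=True)
winner, runner_up = ranked[0], ranked[1]

# Minimum price per click that still beats the runner-up's expected value
price_per_click = (runner_up["bid"] * runner_up["predicted_ctr"]) / winner["predicted_ctr"]

print(winner["advertiser"], round(price_per_click, 2))   # B pays about $1.25 per click

In this toy example, advertiser B wins despite having the lowest bid because its higher predicted click-through rate yields the highest expected value per impression, which is precisely how behavioral prediction gets priced.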

5) Smart/Sustainable Mobile and Data-Driven Development

The aggressive trade negotiations and agreements of the 1990s significantly reduced the costs of ICT devices and communication exchanges worldwide, making possible a wide variety of new commercial and development activities based on ICT capabilities. We are at the halfway point for the Sustainable Development Goals (SDGs) outlined by the United Nations in 2015. The SDGs provide an additional impetus for ICT4D, as they encourage infrastructure building and support for key development activities that ICT can assist, such as monitoring Earth and sea resources and providing affordable health information and communication services.

A key variable is the value of the US dollar, the world's primary transacting currency. A global shortage of dollars due to high interest rates or political risk means higher prices for imported goods, regardless of lower tariffs. The post-Covid crisis in Ukraine has stressed supply chains for key materials and rare earth minerals from Russia and Ukraine, further adding to potential costs and geopolitical risk. ICT4D is highly reliant on global supply chains that make digital devices readily available at reasonable prices.

The near-zero marginal costs of digital products make information content and services more accessible for developing countries. Books, MOOCs, and other online services provide value to a vast population with minimal costs to reach each additional person. Platform-based services providing agricultural, health, and other development services offer low-cost accessibility and outreach. They allow new applications to scale significantly at low cost. Incidentally and significantly, renewable energy sources like solar and wind also have near-zero marginal costs for producing electricity. Like digital products, they require high initial investments but produce output at low cost once operational.
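
A small, hypothetical calculation in Python illustrates the near-zero marginal cost logic; the fixed and per-user cost figures are invented for illustration only.

# Average cost per user collapses toward the (tiny) marginal cost as scale grows.
fixed_cost = 100_000.00     # e.g., producing a MOOC or e-book (one-time outlay)
marginal_cost = 0.02        # e.g., bandwidth and storage per additional learner

for users in (1_000, 100_000, 10_000_000):
    average_cost = (fixed_cost + marginal_cost * users) / users
    print(f"{users:>10,} users -> average cost per user: ${average_cost:,.2f}")

At a thousand users, the average cost is about $100 each; at ten million, it falls to roughly three cents. This is why digital services and, analogously, renewable generation can scale outreach so cheaply once the upfront investment is made.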

Mobility, broadband, and cloud services are three significant technologies presenting positive prospects for ICT4D. Mobile broadband technologies that bypass traditional wireline "last mile" infrastructure have been a major boost to the prospects for ICT4D. They provide significant connectivity across a wide range of the population and with key commercial and government entities. 4G LTE technologies currently provide the optimal service, as 5G base stations reportedly consume over 60% more power than LTE and also require more sites because their range is shorter.

Enhanced connectivity strengthens network effects. Blockchain technologies and cryptocurrencies, the Internet of Things (IoT), and the proliferation of web platforms are some of the current conceptions of how reduced costs for communications and information analysis are enhancing network effects and creating value from the collection and processing of unstructured data.

This project will expand on these stages and provide a context for further investigation of ICT for development, drawing on historical and current research. Of particular concern is the implementation of policies and practices related to contemporary development, but commercial and monetization techniques are important as well.

Notes

[1a] Dordick, Herbert S. and Deane Neubauer. 1985. "Information as Currency: Organizational Restructuring Under the Impact of the Information Revolution." Bulletin of the Institute for Communications Research, Keio University, No. 25, 12–13. This journal article was particularly insightful into the dynamics of the PTTs and the pressures that would lead them to adopt IP technologies, paving the way for the World Wide Web.

[1] Rostow, W.W. (1960) Stages of Economic Growth: A Non-Communist Manifesto. Cambridge: Cambridge University Press. See also Rostow W.W., (1965) Stages of Political Development. Cambridge: Cambridge University Press.
[2] An excellent discussion of the various development and new world communication and economic order discourses can be found in Majid Tehranian's (1999) Global Communication and World Politics: Domination, Development, and Discourse. Boulder, CO: Lynne Rienner Publishers. pp. 40-41. Also, see Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communication Institute. Honolulu, Hawaii.
[3] Wriston, W. (1992) The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World.

Citation APA (7th Edition)

Pennings, A.J. (2020, Feb 19). Five Technostructural Stages of ICT for Global Development. apennings.com https://apennings.com/how-it-came-to-rule-the-world/planting-to-platforms-five-stages-of-ict-for-global-development/


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D. is a Professor in the Department of Technology and Society, State University of New York, Korea, where he manages an undergraduate program with a specialization in ICT4D. His background in ICT4D goes back to the Computerization Policy Project at the East-West Center in Honolulu, Hawaii, in the mid-1980s as an undergraduate research intern. After getting his PhD from the University of Hawaii, he moved to New Zealand to teach at Victoria University in Wellington. There he saw the complex international pressures shaping the island nation. From 2002-2012 he was on the faculty of New York University. His US home is in Austin, Texas.

The Expanse: Cyberpunk in Space

Posted on | January 21, 2023 | No Comments

The cyberpunk movement emerged in the mid-1980s and has lingered on, with occasional stories enticing audiences' imaginations with their futuristic narratives and visions. Frances Bonner offered the 4 Cs of cyberpunk to examine various science fiction stories in visual media.[1] These four categories of computers, corporations, corporeality, and criminality provide a way to gauge whether a narrative can be classified as part of the cyberpunk genre. I also use them as categories for socio-technical analysis of current conditions, but in this case I will conduct a more traditional genre analysis by examining a science fiction TV series.

In these posts, I use Bonner’s 4 Cs of cyberpunk to examine The Expanse, which premiered in 2015 on the Syfy network and has continued on Amazon Prime, completing its 6th season in 2022. The cyberpunk genre is interesting to me for its insights into social dynamics.

Recent examples of cyberpunk in visual media include Blade Runner 2049 (2017), Ghost in the Shell (2017), and several video games such as the controversial Cyberpunk 2077. Bonner considered the ABC network show Max Headroom probably the most characteristic example of cyberpunk, although her goal was not necessarily to create a canon of cyberpunk.

The Expanse (2015-2022) is based on the novels of James S. A. Corey, the joint pen name of authors Daniel Abraham and Ty Franck. Their first novel, Leviathan Wakes, was nominated for the famous science fiction Hugo Award in 2012. They wrote eight more books in the series, which centers on the crew of the Rocinante, a fusion-drive-powered spaceship salvaged from the Martian Congressional Republic Navy (MCRN).

The Expanse is about the colonization and commercialization of our solar system and some exploration beyond. It focuses on the terraforming of Mars and the mining of the Asteroid Belt. Earth is under the governance of the United Nations and in conflict with Mars, as well as with another faction called the Outer Planets Alliance (OPA). The OPA is an organization consisting mainly of the "Belters," who do most of the construction and mining and are bitter about their conditions and exploitation. These workers colonized the Asteroid Belt, the dwarf planet Ceres, and some moons around the outer planets. Belters have developed their own culture, most notably a creole "pidgin" language, but even their bodies have diverged from those on Earth and Mars, growing long and thin due to the reduced gravity.

The 4 Cs is a framework to conduct a genre inquiry and build knowledge by addressing the categories and letting them discipline the investigation. I've been using it more to develop a socio-technical analysis of various tech products in areas such as AI, energy, biochemistry, nanotechnology, robotics, and even space travel. Still, in this case, I'm using it for traditional genre analysis to get a sense of where science fiction is guiding our imagination. What is it saying about who we are and where we are heading as a civilization?

In general, I’m more interested in applying insights provided by cyberpunk to the social analysis of various technologies and their interrelationship with culture, economics and politics.[2] This is an exercise I often do with students. Each category of the 4Cs is an opportunity to produce evidence towards a better understanding of the complex world we live in.

But it's useful to stay in touch with the stories and do the genre analysis. As I'm on winter break, I'm enjoying watching the series again. In the next post of this series, I will discuss how each of the 4 Cs (Corporations, Criminality, Corporeality, and Cyberspace) relates to The Expanse.

Notes

[1] Bonner, F. (1992). Separate Development: Cyberpunk in Film and Television. In G. Slusser & T. Shippey (Eds.), Fiction 2000: Cyberpunk and the Future of Narrative (pp. 191-206). Athens, GA: The University of Georgia Press.

[2] See Pennings, A. (2018, August 13). The Cyberpunk Genre as Social and Technological Analysis. Retrieved December 08, 2022, from https://apennings.com/dystopian-economies/the-cyberpunk-genre-as-social-analysis/

Citation APA (7th Edition)

Pennings, A.J. (2023, Jan 21). The Expanse: Cyberpunk in Space. apennings.com https://apennings.com/political-economies-in-sf/cyberpunk-in-space-the-expanse/



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea teaching sustainable development and visual rhetoric. From 2002-2012, he was on the faculty of New York University where he taught comparative political economy, digital economics, and media production. He also taught film and television production at Marist College in New York and Victoria University in Wellington, New Zealand. When not in the Republic of Korea he lives in Austin, Texas.

ICT4D and Digital Development in a Changing World

Posted on | December 14, 2022 | No Comments

Prepared remarks as moderator for the 2nd Annual ICT4D Faculty Panel: Digital Development in a Changing World. Monday, Nov 28, 2022 at the State University of New York, Korea (SUNY Korea) in Songdo, Republic of Korea.

This year, 2022, marks the halfway point for the Agenda for Sustainable Development. Created by the United Nations during the tenure of Secretary-General Ban Ki-moon, its 17 Sustainable Development Goals (SDGs) were approved by the General Assembly for completion by 2030. The SDGs were meant to create a "shared blueprint for peace and prosperity for people and the planet, now and into the future." At this halfway point, the SDGs face new challenges and opportunities.

This panel will address issues related to Information and Communications Technologies for Development (ICT4D) and some of the challenges coming out of the COVID-19 pandemic and the war in Ukraine, including geopolitical constraints on material supplies and digital technologies.

SDGs are Sustainable Development Goals

A contemporary concern threatening digital and sustainable development is the strength of the US dollar, the primary global reserve currency and vehicle for over 80% of international transactions. A worldwide shortage of the dollar (except in the US, apparently) increased its nominal value and turned it into an appreciating investment instrument, raising its purchase price. A strong US dollar makes imports more expensive, including digital and renewable energy technologies. Purchasing smart devices, even less expensive ones like Orange's Sanza Touch, a 4G Android device, can become more costly. Solar panels, windmills, and other renewable technologies are also likely to be imported, and additional dollars are needed for those as well. It may be that an alternative to the dollar emerges in the next several decades, maybe even a crypto version, but that doesn't seem likely right now.[1]

Another issue is the looming limitation on material supplies, particularly the valuable elements and metals needed for ICTs and renewable energy technologies. A sustainable revolution needs massive shifts toward infrastructure development, and that means higher demand for a wide range of resources that need to be mined, refined, and distributed. The loss of crucial materials from Russia and Ukraine means energy, food, and natural resource shortages that will influence green development. Steel, nickel, aluminum, copper, bauxite, neon, palladium, and other critical elements of modern technologies have suffered heavy supply shocks. Food supplies are also increasingly insecure, with potash and nitrogen-based fertilizers produced from natural gas becoming more scarce. Constraints on ICTs, natural resources, and renewable energies will present significant challenges for meeting the objectives and targets of the SDGs.

Relentless pollution and atmospheric changes affecting climate security also drive strategic questions about digital development. At stake are biodiversity, ocean acidification, and human habitats prone to droughts, floods, and wildfires. ICT solutions have been identified for their potential to reduce greenhouse gas (GHG) emissions by up to 15 percent by 2030.[2] Greenhouse gases increase the atmosphere's temperature, allowing it to absorb more water. More moisture in the air means more weather volatility, leading to droughts in some areas while increasing the intensity of hurricanes/typhoons along ocean coasts. For example, Hurricane Ida in late August of 2021 was extremely intense. After it came out of the Gulf of Mexico to hit Louisiana, it had enough energy and moisture to travel north across the US mainland and drop in on New York City and surrounding areas, killing an additional 50 people.

Meanwhile, a number of factors present hope for a sustainable future and the role of ICT. Green funding has increased, particularly with the Inflation Reduction Act (IRA) in the US, and hopefully, recent commitments at the 2022 United Nations Climate Change Conference (COP 27) will be fruitful for climate justice as well as continued reductions in air pollution. Green New Deals are spreading worldwide, particularly in the stressed EU, which had counted too much on Russian imports of hydrocarbons.

Korea has also envisaged a Green New Deal along with a Digital New Deal under the previous Moon administration, stressing its "fast follower" strategy to capitalize on its export industries to solve world problems while providing a protective safety net at home. Japan even recently voted Hyundai Motor's all-electric Ioniq 5 SUV as its "Import Car of the Year."

Korean Green New Deal

Technological innovations have also been striking. Battery innovations, especially for long-term storage by companies like Ambri, can ease the shortfalls of intermittent energy sources such as wind and solar. While many battery materials are scarce, spent cobalt, lead, and lithium in batteries can be successfully recycled.

Most significantly, the economics of renewables, with admittedly huge upfront investment requirements, have near-zero marginal costs over medium and long-term time frames. For example, once a windmill is constructed and operating, the subsequent costs are minimal, and as it produces electricity, wind “prints” money.

Climate awareness offers hope and motivation. Many climate deniers are still questioning the basic science and physical evidence, but the political will is growing to take impactful action to mitigate weather-related disasters and adapt to the changes.

What is the continuing role of ICT for Development (ICT4D) in this emerging post-COVID-19 environment? The Earth Institute has identified three key ICT (Information and Communications Technology) accelerators of the SDGs. These are mobility, broadband, and cloud services, such as platforms using AI and “big data.” These technologies can often “leapfrog” legacy communication and database systems to provide new efficiencies and opportunities in agriculture, education, energy, and healthcare. For example, while needing to be monitored with data justice precepts, digital ID programs can provide new eligibilities for financial inclusion, personal ownership, and social programs such as health services.

These key accelerators were woven into the BS in Technological Systems Management (TSM) specialization in ICT4D (formerly ICT4SD) at SUNY Korea. This forum brings together a panel of faculty who teach these courses, including EST 371 – Data Science Management, EST 372 – The Mobile Revolution in Development, and EST 320 – Communication System Technologies, as well as today's audience from EST 230 – ICT for Sustainable Development. We also cover related topics in our MS degree in Technological Systems Management, which includes a specialization in Digital Technologies in Disaster Risk Reduction. And, of course, several of our Ph.D. students in the Technology, Policy, and Innovation (TPI) program have pursued related research topics for their dissertations. All our degrees are conferred by Stony Brook University in New York.

Please welcome our faculty to today’s panel discussion.

James Larson
Suzana Brown 
Sira Maliphol
Jinsang Lee
Joseph Cabuay
Sangchan Park

Citation APA (7th Edition)

Pennings, A.J. (2022, December 14). ICT4D and Digital Development in a Changing World. apennings.com https://apennings.com/sustainable-development/ict4d-and-digital-development-in-a-changing-world/

[1] The US dollar has been a global currency since the 1950s in the form of Eurodollars and has had electronic versions since the late 1860s.
[2] Sachs, J. D. (n.d.). ICT and SDGs: How Information and Communications Technology Can Accelerate Action on the Sustainable Development Goals. Retrieved from http://www.fao.org/e-agriculture/news/ict-and-sdgs-how-information-and-communications-technology-can-accelerate-action-sustainable

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor at the State University of New York (SUNY), Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii in the 1990s. He heads up the ICT4D specialization at SUNY Korea.

The Techno-Epistemology of Databases, Part I

Posted on | November 6, 2022 | No Comments

How is meaning created in an environment of computer databases and digital spreadsheets? How is that meaning fixed? How is it organized? What are the processes and consequences of categorization and classification? [1]

This post posits that databases are an epistemological technology, part of a system of knowledge production and management. It explores the unique meaning-producing capabilities that emerged with databases, particularly table-based relational databases, and how they affect modern organizations and society. Databases have transformed modern forms of bureaucratic and corporate organization and are major forms of "time-space power," the ability to endure over time and operate over distances.

Databases, like digital spreadsheets, are constitutive technologies that can shape perceptions and knowledge through their ability to capture data, store it, process it, and render it accessible for retrieval, presentation, and analysis. Consequently, they organize resources and empower control over the lived experiences of people and the dynamics of social organizations.

Epistemology is generally known as the philosophical attempt to understand knowledge and produce theories of how we comprehend the world. Philosophers such as Plato, John Locke, Immanuel Kant, and Bertrand Russell, among many others, have attempted to understand how we know, what knowledge means to the knower and the society they live in, and how knowledge can be verified and justified.

Techno-epistemology refers to knowledge produced, categorized, and certified within a particular technological context. In particular, it asks how the characteristics of the technology shape the ways of perceiving, recording, and organizing knowledge. Database technology and management are at the nexus of systematizing knowledge in the organization.

Stuart Hall's insights into culture and the creation of meaning are a helpful starting point. In his classic work on Representation, he discussed the role of concepts and the processes of classification, as well as how meaning is created and shared. Hall also raised issues regarding the role of culture in creating and organizing meaning.

Categorizing or classifying information, for Hall, is a basic human social process that produces meaning and separates information into that which “belongs” and that which does not, based on distinguishing characteristics such as resemblances or differences. It involves grouping or distributing according to some common relations or attributes. A list, for example, is a category of items that delineates what is within the group, and what is on the outside.

Classification is a cultural phenomenon that involves negotiating society's conceptual maps. Stuart Hall argued that while the ability to conceptualize and classify is an inherent human biological trait, the systems of classification themselves are socially produced and learned. This is because humans are meaning-making entities who also operate in environments that require the systemization of knowledge. "Socialization" is the process of learning society's cultural categories and the value structures associated with them.

To investigate the topic of techno-epistemology more thoroughly, it is useful to analyze a particular database. Using a PC desktop database is sufficient for this initial formal analysis. Formal analysis is a strategy for describing artifacts and visual information that can be applied to any work of art or cultural artifact. I also draw on the notion of remediation, which describes how new media incorporate older media to be functional.

I previously analyzed digital spreadsheets to examine their components and how they use older media to work together to provide the spreadsheet's organizational power and its influence on modern capitalism. That formal analysis discussed five components that came together to invest spreadsheets with their power: alphanumerical representations, lists, tables, cells, and formulas.

The purpose of analyzing databases is to continue the examination of how technology shapes the organization of knowledge by examining the components of a database. Databases have transformed modern forms of bureaucratic and corporate organization by shaping knowledge production and management. A central question is: What is remediated in the database? In the next installment, I will analyze the Microsoft Access relational database program, which includes tables, fields, and records as well as queries, forms, reports, macros, and modules.
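
As a preview of those components, here is a minimal sketch using Python's built-in sqlite3 module (not Microsoft Access itself, but the same relational concepts) showing how a table acts as a category, fields as its attributes, records as its members, and a query as a rule for deciding what "belongs." The table and data are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
cur = conn.cursor()

# The table defines the category; its fields fix which attributes count
cur.execute("""
    CREATE TABLE projects (
        id INTEGER PRIMARY KEY,
        name TEXT,
        sector TEXT,
        budget_usd REAL
    )
""")

# Records are the items classified into the category
cur.executemany(
    "INSERT INTO projects (name, sector, budget_usd) VALUES (?, ?, ?)",
    [
        ("Mobile health outreach", "health", 120000.0),
        ("Rural broadband pilot", "connectivity", 450000.0),
        ("Crop price SMS alerts", "agriculture", 80000.0),
    ],
)

# A query re-classifies the records: which projects "belong" to the large-budget group?
for name, budget in cur.execute(
    "SELECT name, budget_usd FROM projects WHERE budget_usd > 100000 ORDER BY budget_usd DESC"
):
    print(name, budget)

conn.close()

Even in this toy case, the schema and the query do epistemological work: they decide in advance what can be recorded and which groupings count as meaningful.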

Notes

[1] Fadaie, G. (2008). "The Influence of Classification on World View and Epistemology." Faculty of Psychology & Education, University of Tehran, Iran. Proceedings of the Informing Science & IT Education Conference (InSITE) 2008. http://proceedings.informingscience.org/InSITE2008/InSITE08p001-013Fadaie410.pdf

Citation APA (7th Edition)

Pennings, A.J. (2022, Nov 6). The Techno-Epistemology of Databases, Part I. apennings.com https://apennings.com/technologies-of-meaning/the-techno-epistemology-of-databases-part-i/

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002-2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.

Zeihan’s Global Prognostics and Sustainable Development, Part I

Posted on | October 11, 2022 | No Comments

Peter Zeihan has become a popular author and media celebrity for his ideas on declining demographic patterns, the shale oil revolution, and the end of globalization. He formerly worked for Stratfor, a private geopolitical forecasting company in Austin, Texas, started by noted geopolitical futurist George Friedman.[1] His new firm Zeihan on Geopolitics provides custom analytical products and briefings on global trends and how they might impact his corporate and government clients.

This post looks at Zeihan’s hypotheses and their implications for sustainable development, roughly defined by the United Nations Brundtland Commission as meeting the needs of the present without compromising future generations. While one might say that all countries in the world are undergoing a transition to sustainable development, countries have different circumstances and need to develop unique economic policies and solutions. In any case, they must maintain situational awareness regarding their financial, geographical, and military affairs. They should also consider the relevance of the UN’s sustainable development goals (SDGs) that plot a trajectory based on economic, social, and environmental objectives.[1] Zeihan’s model, which is not beyond critique, is a helpful point of departure for assessing the development needs in each and every country and the possibilities of the SDGs to help identify and meet those needs.

Zeihan has written four books: The Accidental Superpower: The Next Generation of American Preeminence and the Coming Global Disorder (2016), The Absent Superpower (2017), Disunited Nations: The Scramble for Power in an Ungoverned World (2020), and, most recently, The End of the World Is Just the Beginning: Mapping the Collapse of Globalization (2022). But YouTube has probably been his most significant outlet. You can view the videos below and follow the recommended videos or search directly.

Zeihan’s approach is primarily focused on demographic trends, geographic constraints, military concerns, and legacy energy availability. For example, he discusses the Russian Federation’s situation primarily in terms of its declining population, reliance on metals, natural gas, and oil exports, and a history of being invaded by foreign armies through several geographical vulnerabilities (particularly the plains of Ukraine). His major concern is that Russia’s younger population is significantly declining, which means fewer workers, fewer consumers, and less capital investment capability from savings. He considers the war in Ukraine one last chance to start barricading the 11 potential invasion points that have haunted Russia in the past. Otherwise, Russia is likely to be invaded or break up in the next few decades.

He also weaves agriculture, finance, maritime shipping, rare materials, manufacturing, and military activities into his prognostications. His general conclusion is that globalization is ending, and countries will have problems dealing with their populations getting older and with fewer younger generations. And all this will mean clear winners and losers as globalization comes to an end. And why is it ending?

The starting point for his major arguments is that the US-led world order that provided financial stability and military protection for global trade is ending due to a lack of interest and political will. Since 1945, the US has provided the primary currency for international reserves and transactions. They also orchestrated maritime and military protection for a worldwide system of trade. However, he argues the United States has been stepping down from its role as the global bank and policeman, and it will leave many countries vulnerable to drastic changes in international commerce and military affairs.

The Bretton Woods Conference in New Hampshire during World War II designed a new financial and trading order. Replacing the old gold standard, economists and policymakers from around the world created a new financial system. It was based on connecting the US dollar to gold at $35 an ounce. At the same time, other countries were required to tie the value of their currencies to the dollar at fixed rates. They also created the International Monetary Fund (IMF) and the World Bank to help countries maintain their prescribed currency exchange rate and develop economically.

But just as crucial for Zeihan’s thesis is the US creation of a global security system to protect any country’s maritime exports and imports. The US Navy has over 200 ships that patrol the world’s waterways, including 11 “super” aircraft carriers and 9 “helo” carriers. Using nuclear power, the supercarriers can operate for over 20 years without refueling. Aircraft carriers usually carry dozens of fighter jets and project tremendous power and national prestige. While China has an overall numerical advantage, its ships have a limited range that restricts the area they can patrol. China has two aircraft carriers and one helo carrier. Russia has a powerful navy, but it is divided into four areas with limited access to the world’s oceans. The United Kingdom, France, and Japan are considered other powerful global navies.

Caveat: A country gets protection provided it sides with the US against Communism (and later "terrorism"). He calls this "the bribe."

The main problem with this global protection system, Zeihan argues, is that it died in the early 1990s with the fall of the USSR. The US populace has subsequently not been particularly interested in its continuance, although it got renewed emphasis after 9/11 and the war on "terror." The US has steadily turned inward and away from the world's problems. The end of this protection will leave much of the world's shipping at the mercy of hostile states and pirates.

Global shipping has several choke points, particularly the Strait of Malacca, between Singapore and Indonesia, and the Strait of Hormuz, between Iran and Oman. These routes are vulnerable to naval attacks and blockades. China's energy imports, for example, are quite vulnerable, as are its exports to the world. A naval blockade near Sri Lanka could seriously hamper China's industries and shut down much of the country. Zeihan argues that US protection keeps China from falling into chaos. Without the US, essential trade routes will require convoy protection capabilities that China has yet to achieve and will require new forms of shipping insurance that could be very expensive.

Likewise, the US dollar is the mediating global currency, which is not likely to change soon, despite US neglect. Recurrent dollar shortages have created havoc in world markets. These shortages are due to declining US purchases of foreign oil, diminished military presence overseas, and smaller trade deficits that have made the dollar much more scarce and, thus, more expensive. The upward trend in dollar strength has recently been accelerated by rising US interest rates and global capital flight into US dollar assets due to the Russian invasion of Ukraine and China’s potential attack on Taiwan.

His second major concern is the rapidly aging populations in the developed and newly industrialized worlds that will soon send many economies into free fall. These countries are facing severe demographic problems with increased older generations and smaller new generations, primarily due to urbanization. Populations provide workers, consumers, and investors. Industrialization and scientific advances have meant a population boom worldwide, but that trend is ending. As a result, consumer spending and tax revenues dry up, while pension payments and medical care consume much of what is left. Japan has been leading the way, but China is Zeihan’s primary concern. Add in the one-child policy, and Zeihan believes China is on the verge of a significant collapse. Europe is also vulnerable, as is Russia. He argues that Russia needed to have its war with Ukraine now to expand its empire and plug the geographical holes in its defenses or lapse into a slow, unrecoverable decline.

Energy is extremely important. The US is now energy-independent primarily based on the “Shale Revolution,” which refers to the combination of hydraulic fracturing (“fracking”) and horizontal drilling that significantly increases the acquisition of natural gas and oil. Overproduction in 2017 caused price declines culminating in the tragic 2020 negative prices, but the market recovered as Covid-19 subsided and “revenge travel” ensued. Middle East countries are big producers of energy. Russia was important but is quickly going offline due to sanctions and a lack of technical expertise and may see many oil wells freeze up indefinitely. Venezuela has tremendous quantities but of a quality that is more expensive to refine.

Almost as important for his global analysis is which countries are heavily dependent on gas and oil imports. China, Japan, Korea, and Taiwan are all severely dependent on foreign energy that must travel significant distances. These countries rely on large oil tankers (paid for in US dollars) to ship in their energy, tankers primarily built by France, Japan, and Sweden, and more recently South Korea. Zeihan says that a couple of destroyers in the Indian Ocean stopping oil tankers going to China could bring down the country within a year. For him, China is the most vulnerable country in the world.

Zeihan is not the biggest fan of renewable energies, although he claims to have solar panels on his home. He doesn’t think they will significantly scale in the quantities to replace oil and argues that many places are not suitable locations for wind or solar. He also points out that many of the rare elements and materials needed for the green revolution require inputs from many countries. If globalization continues to stall, Net Zero carbon pledges will be nearly meaningless.

In conclusion, Zeihan has a model of the world with several key variables that he uses to analyze the prospects for individual countries. Most are based on material conditions: geography, populations, oil, metals, etc. They produce a significant hypothesis: that globalization based on US protection is falling apart and that individual countries will have to assess their situations and decide whether to find partners or go it alone.

His model does not seem to place a high value on collective action, education, and innovation. These include cultural movements (including markets) and governmental momentum and policy that can assess and adapt to situations. It was, after all, the unlikely US mobilization for WWII, along with the technological innovations of radar, code decryption, and the atomic bomb, among other successes, that led to victories in both the Atlantic and Pacific theaters and propelled the US to the global power it became and has sustained for nearly 80 years. He didn't believe Ukraine would last long against Russia. (Who did?) Zeihan also downplays political influence in foreign policy and the tendency of the "military-industrial complex" to shape US action.

However, the model he presents is a useful starting point for country assessments of the relevance of sustainable development goals. Every country has to evaluate its situation within the context of the world's trading and financial arrangements. It needs to understand its demographic status to assess labor opportunities, investment trends, and consumption projections. It also needs to understand the topographical characteristics and limitations that determine the logistical capabilities of its navigable waterways, seaports, highways, and passable railways. These will help it realize what means it has available to mobilize resources and products for export, as well as the access it has to world markets for imports such as energy, plastics, and steel that its producers need for everything from manufacturing to home building.

In Part II, I will go into more detail about how this model can serve the analysis of sustainable development in various countries.

Notes

[1] I’m testing this analytical approach in my undergraduate EST 230 ICT and Sustainable Development course. It’s important for the students to comprehend the context for sustainable development and understand the different challenges that different countries will face.

Citation APA (7th Edition)

Pennings, A.J. (2022, Oct 11). Zeihan’s Global Prognostics and Sustainable Development, Part I. apennings.com https://apennings.com/digital-geography/zeihans-global-prognostics-and-sustainable-development-part-i/

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002-2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.

Weak Domestic Dollar, Strong Global Dollar

Posted on | August 3, 2022 | No Comments

The summer of 2022 has been a good time for Americans to travel overseas. Everything is expensive at home and cheap abroad. In other words, the dollar is weak domestically and strong globally. In this post, I examine the dynamics of the US dollar and why it operates differently within the domestic US and globally.[1]

We are still in the wake of the coronavirus (COVID-19) epidemic and its stimulus spending, supply shocks, and other dramatic tragedies that included the deaths of over a million people in the US alone. We can add the Russian invasion of Ukraine to that turbulence with its prospects for reduced trade in natural gas, metals, oil, wheat, uranium, and other resources.

In the US, we also have political divisions that have polarized policy perspectives, particularly on major spending issues such as the latest Senate budget reconciliation deal tentatively titled the Inflation Reduction Act (IRA), which will address inflation through tax changes, capping drug costs, and climate change measures.

These are some of the main issues that influence the dollar’s strength internally and out in the global system. The consequences are immense as we interrogate government spending and consider alternatives to the US dollar as the world’s primary transacting currency, such as Bitcoin or the Chinese Yuan.

Fed M1 Money Supply

How are Dollars Created?

In both the US and internationally, US-denominated dollars are primarily created by private banks as they make loans. The fractional reserve banking system that most of the world runs on takes deposits and lends out money at prescribed interest rates, contingent on levels of risk, compliance, and the quality of collateral. Banks are usually restricted in loan creation by a reserve requirement set by a central bank. The Federal Reserve had traditionally required US banks to keep 10 percent of their deposits in their vaults or stored at the Fed. But since the pandemic started, it reduced that requirement to zero percent. This meant that US banks could lend out anything that came in the door.
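
A short Python sketch of the textbook deposit-and-loan cycle shows why the reserve requirement matters; it assumes, for simplicity, that every loan is redeposited in the banking system and that banks lend out everything above the required reserve.

def total_deposits_created(initial_deposit, reserve_ratio, rounds=50):
    """Simulate successive deposit-loan-redeposit cycles."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)   # the portion lent out and redeposited
    return total

print(total_deposits_created(1000, 0.10))  # ~9,948: approaching the 1/0.10 multiplier limit of 10,000
print(total_deposits_created(1000, 0.0))   # 50,000 after 50 rounds; with a 0% requirement the series never converges

With a 10 percent requirement, an initial $1,000 deposit can support roughly $10,000 in total deposits; with the requirement at zero, the mechanical limit disappears and lending is constrained instead by capital rules, risk assessments, and the demand for loans.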

Internationally, banks have developed a system of lending based on what are called "Eurodollars." These are US dollar deposits held outside the jurisdiction of the Federal Reserve and other US regulatory agencies, and they provide the majority of the world's transacting currency. This shadow money exists on the ledgers of banks and operates with no reserve requirements. Eurodollars have been active since the 1950s but exploded in the 1970s when three things happened: Nixon took the US off the Bretton Woods dollar-gold standard; global oil transactions were set in US dollars; and communications networks went digital. More on Eurodollars below.[2]

While bank lending is the primary way dollars are created, some other processes are mentioned below that are relevant to the creation of the dollar in the US and its spending implications. These include actually printing them, interest rates, reserve requirements, and government spending.

The Domestic Dollar

The Fed can literally print money, which it does for the US Treasury. It keeps about $1.5 trillion in paper dollars circulating in the economy. That includes about 12 billion one-dollar bills, 9 billion twenty-dollar bills, and 12 billion hundred-dollar bills. Another $2 billion is minted as various coins.
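A quick back-of-the-envelope check on those counts shows that the hundred-dollar bills account for most of the paper total (the figures are the approximate ones cited above):

```python
# Approximate counts of notes in circulation, as cited in the paragraph above.
notes = {1: 12e9, 20: 9e9, 100: 12e9}  # denomination (USD) -> approximate number of bills

paper_total = sum(denomination * count for denomination, count in notes.items())
print(f"${paper_total / 1e12:.2f} trillion")  # roughly $1.4 trillion; the $2, $5, $10,
                                              # and $50 notes make up most of the rest
```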

But mostly, what people call "printing" money is the Fed creating bank reserves in the process of targeting interest rates and conducting "quantitative easing" (QE). It does this by buying government bonds from banks with its magical computer mouse, an unlimited credit card. This buying increases the reserves that banks can lend out to borrowers. But bank reserves are not real money unless they are converted into loans for people and businesses. Conversely, the Fed can reduce the amount of bank reserves by selling government securities. That transaction absorbs bank reserves and can potentially reduce lending and thus pressures on inflation. So, in practice, the Fed's role in money creation is not always predictable, and it is certainly not "money printing" unless you consider its role as the government's bank.

Most people don't understand how the US federal government creates money. It's quite simple, although politicians don't like talking about it. As Warren Mosler, a financial trader and author of Soft Currency Economics, put it, "Congress appropriates the money and then the Treasury instructs the Fed to credit the appropriate accounts." That's it. The Fed just sends an electronic message changing figures on computer-based ledgers. But government spending does need Congressional approval, and all US spending bills must originate in the House of Representatives.

The government doesn’t need to tax to get the money to spend. It doesn’t need to borrow the money. Taxing and borrowing have their purposes, but they are not required for the government to spend money because the US government issues its own currency. So why tax and why borrow if the government can just issue money to provision itself and pay its obligations?

Warren Mosler, the founder of MMT, argued that taxing keeps everyone on the national currency system and can also reduce inflationary pressures. The purpose of taxing is to keep the national currency relevant and to reduce the amount of whatever is taxed. Taxes create the demand for government-issued currency because they must be paid in US dollars. Taxes can reduce discretionary spending and alleviate social inequality. Taxes can also channel resources toward national priorities, as they did during the Cold War and Space Race when the top marginal rates were above 90 percent.[3]

Borrowing money by auctioning off Treasury bonds creates a monetary instrument that pays interest. It's basically another way for the government to issue its currency and keep markets operating in a dollar system. Also important, in fact quite critical, is the Treasury security's role as collateral in borrowing and as a hedge against financial risk. Its larger denominations are useful for repurchase agreements ("repos") and futures markets.

Repos are collateralized loans in which a borrower of cash exchanges securities such as US Treasuries (the collateral) with the lender under an agreement to repurchase, or buy them back, at a specified price and time. Treasuries were critical for the Reagan Revolution's global finance offensive, which tripled the US debt from $738 billion to $2.1 trillion and made the US the world's largest debtor nation. This debt also provided the collateral needed for the expansion of US money-capital. US Treasuries provided the liquidity that made the modern financial sphere possible.
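Here is a minimal sketch of the repo arithmetic described above, with hypothetical figures rather than actual market quotes:

```python
# Sketch of an overnight repo (hypothetical figures, not market data).
# A borrower posts Treasuries, receives cash against them, and agrees to buy the
# collateral back the next day at a slightly higher price; the markup is the repo rate.

collateral_value = 10_000_000          # market value of the Treasuries posted
haircut = 0.02                         # lender's discount on the collateral (2%)
cash_lent = collateral_value * (1 - haircut)

repo_rate_annual = 0.0225              # agreed annualized repo rate
days = 1                               # overnight
repurchase_price = cash_lent * (1 + repo_rate_annual * days / 360)

print(f"Cash lent:        ${cash_lent:,.2f}")
print(f"Repurchase price: ${repurchase_price:,.2f}")
print(f"Interest (1 day): ${repurchase_price - cash_lent:,.2f}")
```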

Consequently, according to Stephanie Kelton, debt and deficits are a "myth" for a government that can issue currency. The COVID-19 response was unprecedented spending by the US government. The Senate used an old House bill to create the CARES Act of 2020, with $2.2 trillion in initial stimulus spending. Trump's son-in-law Jared Kushner would manage the Paycheck Protection Program (PPP) while the COVID-19 vaccines were being developed by Operation Warp Speed. Another $900 billion was added before the end of the year, mostly for direct payments to individuals for the holiday season.

In 2021, the Democrats legislated another $2 trillion in the American Rescue Plan to address the K-shaped recovery. It expanded unemployment insurance, extended the enhanced child tax credit, and supported the Centers for Medicare & Medicaid Services (CMS) to ensure all Americans had access to free vaccinations. Was the COVID-19 spending the cause of the inflation that rose steadily, hitting 9 percent in June 2022?

We are still in the wake of the COVID-19 pandemic. Shortages emerged from businesses shutting down, factories closing, and shipping containers stranded at sea or in ports waiting for trucks to distribute their goods. Disposable income increased from government stimulus and inflated assets. People forced to study or work at home shifted spending from services to goods. Instead of going to a restaurant, they bought food at the local supermarket. Instead of going to the cinema, they watched Hulu, Netflix, or one of the new Internet-based streaming services.

Corporations raised prices, and purchasing power decreased. As a result, the dollar weakened domestically as the money supply inflated in 2020, and to a lesser extent in 2021. The chart above shows the M1 money supply levels since 1990 (In billions of US dollars). The increase in 2020 was dramatic.

The International US Dollar

As former US Treasury Secretary John Connally once said to his counterparts around the world at an economic summit, the dollar “is our currency, but it’s your problem.” The value of the dollar is currently at a 20-year high against other major currencies, creating a massive problem for everyone outside America buying dollar-denominated goods, which is about 85 percent of all international trade. Oil, and nearly all raw materials, from aluminum to wheat and zinc, are priced in US dollars.

As the world's "reserve" currency, dollars are in demand around the world, primarily for transactional purposes. The dollar is the mediating currency for transactions among dozens of different countries. If Argentina wants to do business with Australia, it conducts that business through US dollars. Why is the dollar currently so strong overseas? The problem is a significant shortage of US dollars.

Two interrelated systems loosely organize US-denominated dollars overseas. The more official system is the US dollar reserves held by central banks. The US dollar became the world’s primary reserve currency due to the Bretton Woods Agreements at the end of World War II. The post-war trading system would be based on the US dollar backed by the gold in vaults at Fort Knox and the NY Federal Reserve Bank. The US dollar maintained this link with gold until the 1970s, when President Nixon ended the dollar’s convertibility into gold. Most countries stayed with the dollar and also began buying US Treasury securities as a safe store of the currency.

More than half of official currency reserves are US dollars, with the euro taking up another quarter. This is actually a historic low. The Japanese yen, the pound sterling, and, to a lesser extent, the Chinese renminbi constitute the rest of the major reserves held by central banks. Bitcoin and other blockchain cryptocurrencies are possibilities for the future, but as we see, currencies have been digital for quite a while; they just haven't been blockchain-based. Central banks are also developing central bank digital currencies (CBDCs) that may make a multilateral transacting system outside the Eurodollar system more feasible.

Central banks get US dollars through their economies' international trade, Foreign Direct Investment (FDI), and liquidity swaps. US aid programs, military bases, and trade deficits, especially purchases of foreign cars and oil, contribute to the spread of the US dollar. It was important for the US to run trade deficits to support the world's need for dollars, and US oil independence has been particularly problematic for the world's currency system. Many refer to a "dollar hegemony," as the system is challenging to manage, often destructive, and the US often doesn't even seem politically motivated to try.

The other system involves the digitally created Eurodollar deposits in internationally networked banks. Eurodollars are not "Euros," the currency of the European Union. Instead, these are US-denominated virtual currencies created and used outside the US. They operate not only outside the geography of the US but also outside its legal jurisdiction. Eurodollars are used by various big banks, money market and hedge funds, insurance companies, and many big corporations.

Eurodollars are primarily lent out (created) in exchange for collateral such as US Treasuries. Depending on the quality of the collateral, the lender also applies a small discount to its value, called a "haircut," to cover any potential liquidity costs. Since the Eurodollar system works on lending, it makes sense that any bank or corporation borrowing money would want to make that transaction as cheap as possible. The price goes down by using good collateral to make the loan more secure. One can envision a hierarchy of collateral types, with US Treasuries on top and corporate junk bonds, mortgage-backed securities, sovereign debt, and even gold as other grades.
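To illustrate that hierarchy, here is a small sketch in which the haircut grows as the collateral grade declines; the percentages are hypothetical, not market quotes:

```python
# Hypothetical haircuts by collateral grade (illustrative only).
# The better the collateral, the smaller the discount, and the more cash a
# borrower can raise against the same market value.

haircuts = {
    "US Treasuries": 0.005,
    "Other sovereign debt": 0.03,
    "Mortgage-backed securities": 0.06,
    "Gold": 0.10,
    "Corporate junk bonds": 0.15,
}

collateral_value = 100_000_000  # market value posted, in USD

for asset, haircut in haircuts.items():
    cash_raised = collateral_value * (1 - haircut)
    print(f"{asset:28s} haircut {haircut:5.1%} -> cash raised ${cash_raised:,.0f}")
```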

Overseas dollar markets are unregulated and proprietary, so accurate figures are hard to come by. As it is based on lending, the loans are often refinanced. A major issue is liquidity, being able to respond to the changing conditions in global financial markets that reach over a hundred trillion dollars. The cost and availability of the US dollar can shift for many reasons, including:

  • Changes in US interest rates
  • Shifts in global risk assessments
  • Periods of market stress, such as the coronavirus pandemic.

As a result, we currently see a considerable dollar shortage across the globe, dramatically increasing its value while causing the depreciation of other currencies worldwide. While international US dollar demand remains high, we can expect energy and food shortages, as well as the transmission of economic shocks such as Sri Lanka recently experienced. Goldman Sachs calculates that dollar strength is adding about $20 to a barrel of oil in local-currency terms. These pressures will have significant effects on both the global financial system and the global economy.

Currency swaps are one way to alleviate stresses in the global monetary system. They are repo-like agreements in which a central bank sells a specified amount of its currency to another central bank in exchange for that bank's currency at the prevailing market exchange rate. The first central bank agrees to buy back its currency at the same exchange rate (with interest) on a specified future date. A central bank initiates the swap so it can lend the currency it receives into its domestic economy. The Swiss National Bank, in conjunction with the New York Fed, regularly auctions off dollars to other central banks in liquidity swap operations. These are repos with the needed collateral that used to be called foreign exchange swaps, and they have become more utilized as dollar liquidity has diminished.
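A stylized sketch of the two legs of such a swap, again with hypothetical figures, looks like this:

```python
# Stylized central bank dollar liquidity swap (hypothetical figures).
# Leg 1: the foreign central bank sells its own currency to the Fed at the spot
#        rate and receives dollars it can lend on to its domestic banks.
# Leg 2: at maturity it buys its currency back at the SAME exchange rate and
#        returns the dollars plus interest.

spot_rate = 0.95                      # hypothetical: US dollars per unit of foreign currency
dollars_drawn = 5_000_000_000         # dollars received in leg 1
foreign_currency_posted = dollars_drawn / spot_rate

swap_rate_annual = 0.025              # interest charged on the dollar leg
days = 7                              # one-week operation
interest = dollars_drawn * swap_rate_annual * days / 360

print(f"Foreign currency posted:      {foreign_currency_posted:,.0f}")
print(f"Dollars returned at maturity: {dollars_drawn + interest:,.2f}")
```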

Significant data gaps make assessing the risks and vulnerabilities of the global currency system challenging. Additional disclosure and data collection are needed to improve the transparency and possible regulation of the global US dollar system. In the meantime, we need to theoretically assess the dynamics of the domestic dollar administration and the possibilities of transforming the international monetary system. While it is highly unlikely that the US dollar will lose its reserve currency status, the current worldwide dollar shortage and its consequences are bringing increased scrutiny and calls for monetary reform, including more multicurrency settlements of energy.

Notes

[1] If you want a refresher on what causes inflation, see my Great Monetary Surge of 2020 and the Return of Inflation from earlier this year.
[2] I did my master's thesis on the emergence of the Eurodollar system and how it became electronic in the 1970s. I want to acknowledge Jeff Snider and the Eurodollar University for sparking the resurgence of my interest in the global currency system.
[3] Unlike monetary policy enacted by the Federal Reserve, which works to manage the economy by "lifting all boats" rather than targeting individual industries, MMT will likely have to be more targeted, although it can still focus on infrastructure and other enabling systems.

Citation APA (7th Edition)

Pennings, A.J. (2022, August 2). Weak Dollar, Strong Dollar. apennings.com. https://apennings.com/dystopian-economies/weak-dollar-strong-dollar/

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012 he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.

MMT in a Post-Covid-19 Environment

Posted on | July 29, 2022 | No Comments

Modern Monetary Theory (MMT) is a set of descriptions of government (fiscal) policy that originated in the financial world. Traders of government bonds began to inquire about federal spending and fiscal-monetary processes in the post-Reagan environment as the US debt increased substantially. They looked behind the curtain at the processes of money creation and the roles of bonds and taxes in the political economy.

What is the role of money? How does the US government get its money? How does it spend money? What are the constraints on government spending? This post looks at MMT and its prospects by examining the economic problems associated with the COVID-19 pandemic.

We hear the term “printing money” a lot, usually by gold or bitcoin enthusiasts who believe in establishing strict financial constraints. By establishing “hard money” and limiting the quantity of money in an economy, they hope to see their assets rise in value while keeping prices down. Certainly, governments do print some of their money for public use, but the preponderance of funds are entries in digital ledger accounts. I prefer the imagery of the “magical mouse” to the printing press. But I digress.

So how does the US government spend money? In the US, pursuant to Congressional legislative action, the Federal Reserve credits the appropriate accounts at the Department of the Treasury. That’s pretty much it.

Currency-issuing governments do not have the same financial constraints as a household, a business, or even a municipal government, state, or province. They can use their sovereign currency to pay for whatever is available to buy in their own denominated currency. Those are pretty important points, so you may want to read them again. They don't need to print, tax, or auction off bonds to spend. But the spending is recorded.
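As a minimal illustration of that description, the bookkeeping can be sketched as a pair of ledger entries; this is my own toy example, not the actual Fed or Treasury systems:

```python
# Toy double-entry sketch of the spending description above
# (illustrative only; not the actual Fed or Treasury accounting).

from collections import defaultdict

ledger = defaultdict(float)  # account name -> balance

def government_spend(amount: float, recipient_bank: str) -> None:
    """Record an appropriation being spent: the Fed marks down the Treasury's
    account and marks up the recipient bank's reserve account."""
    ledger["Treasury General Account"] -= amount
    ledger[f"{recipient_bank} reserves"] += amount

government_spend(1_000_000, "Example Commercial Bank")
print(dict(ledger))
```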

Stephanie Kelton at Stony Brook University in New York has been a strong academic and policy proponent of MMT and its possibilities. A professor of economics, she has also been a consultant to Democratic candidates. In this video, she explains how MMT sees the spending process and some of the implications of seeing money and the deficit in a new way.

Kelton and the MMT theorists have been clear about the downsides of these spending processes. While a currency may not have the restrictions many believe, resources can face limitations. Labor, education, land, materials, and even imagination are called into service under such conditions. Shortages can mean rising prices, as we have seen in the COVID shutdown of businesses and supply chains. MMT proponents have been mindful of this stress and the potential for rising prices from the theory's inception.

But the problem of rising costs from resource limitations had not really emerged since the 1970s, until the COVID-19 pandemic. Globalization (particularly the Chinese workforce), a large working population (including more women), and technological innovations such as IT and the Internet increased productivity and kept prices at relatively low levels. Also, spending, despite deficits, stayed within manageable levels. Then the pandemic began to shut down businesses and disrupt traditional supply lines. The Federal Reserve, the US central bank, reduced interest rates and increased bank reserves. At the same time, the federal government began to spend trillions on the Paycheck Protection Program (PPP) and other stimulus programs.

Stimulus spending continued into 2021 to address the K-shaped economy, a split between those who could adjust to the pandemic and those hit the hardest. As people hunkered down at home, many adjusted by working online or taking classes via Zoom. Others lost their jobs or had to quit to take care of children not attending school. Some $5 trillion was spent as COVID-19 stimulus, while the pandemic severely limited work and the availability of goods and services. As a result, the economy was strained and thrown into disarray.

During 2021, prices started to rise, especially oil, which had crashed in 2018 and fallen to less than $20 a barrel in 2020 as the pandemic took hold. Oil production was shut down until the roaring return of demand in late 2021 and 2022. In February 2022, as Russia attacked Ukraine, oil access was further reduced, as was the availability of fertilizers and materials needed to fight climate change.

In addition to resources, MMT also needs the ideas and policies to put the currency to work efficiently. Wars have been the de facto spending rationalization for governments. Military spending creates jobs and innovative technologies that often spill over into the larger economy.[1]

How do we collectively decide on alternative spending strategies? Should consumers drive social demand? Can universal healthcare drive down business costs? Can renewable energy infrastructure investment kickstart a new industrial era based on cheap electricity inputs? Should increased spending go into a war on climate change?

How do we recognize the constraints in these strategies? Materials? Labor? Criticisms of centrally planned economies abounded in the 20th century. Most concerns have lamented the lack of price signals aggregated from individual consumers and profit-seeking private firms.

Others have argued that government spending would “crowd out” the private sector by absorbing available investment capital. This would decrease private sector activity. But Kelton argues current economics gets it backward. Spending comes first.

Do such spending programs sufficiently allocate resources in a society? The Paycheck Protection Program (PPP) stimulus under the Trump administration was probably the wildest expenditure of government money in modern history. But did it matter if the goal was to stop the slide into a major recession?

Notes

[1] Unlike monetary policy enacted by the Federal Reserve, which works to manage the economy by "lifting all boats" rather than targeting individual industries, MMT will likely have to be more targeted.

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012 he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.

Al Gore, Atari Democrats, and the “Invention” of the Internet

Posted on | July 9, 2022 | No Comments

This is the fourth part of a narrative about how the Internet changed from a military network to a wide-scale global system of interconnected networks. Part I discussed the impact of the Strategic Defense Initiative (SDI), or "Star Wars," on funding for the National Science Foundation's adoption of the ARPANET. In Part II, I looked at how fears about nuclear war and Japan's artificial intelligence (AI) push propelled early funding of the Internet. Finally, Part III introduced the "Atari Democrats" and their early role in crafting the legislation that created the Internet.

This post is a follow-up to make some points about Al Gore and the Atari Democrats' involvement in the success of the Internet and how political leadership is needed for technological innovation and implementation. What is particularly needed is to draw lessons for future infrastructure, open-source software, and other enabling systems of innovation, social connectedness, and civic-minded entrepreneurship. These include smart energy grids, mobility networks, healthcare systems, e-commerce platforms, and crypto-blockchain exchanges.

The story of Al Gore "inventing" the Internet started after CNN's Wolf Blitzer interviewed the Vice President in 1999 and gained traction during the 2000 presidential campaign against George W. Bush. The accusation circulated that Gore claimed he "invented the Internet," and the phrase was used to tag the Vietnam vet and Vice President as a "liar" and someone who couldn't be trusted. The issue says a lot about the way election campaigns operate, about the role of science and technology in the economy, and especially about the impact of governance and statecraft in economic and technological development. Here is what he actually said:

Of course, the most controversial part of this interview about Vice President Gore’s plans to announce his presidential candidacy was this statement, “During my service in the United States Congress, I took the initiative in creating the Internet.” That was turned into “inventing the Internet” and was used against him in the 2000 presidential elections. The meanings are quite different.

"Inventing" suggests a combination of technical imagination and physical manipulation usually reserved for engineers. We do, after all, want our buildings to remain upright and our tires to stay on our cars as we ride down the road. To "create" has a much more flexible meaning, indicating more of an art or a craft. There was no reason to say he invented the Internet except to frame it in a way that suggested he designed it technically, which does sound implausible.

Gore never claimed to have engineering prowess but was never able to adequately counter this critique. Gore would win the popular vote in 2000 but failed in his bid for the Presidency. The Supreme Court ruled he had lost Florida’s electoral votes in a close and controversial election in the “Sunshine” state. It’s hard to say how much this particular meme contributed to the loss but the “inventing” narrative stuck, and has persisted in modern politics in subtle ways.

The controversy probably says more about how little we understand innovation and technological development and how impoverished our conversations have been about the development of data networks and information technologies. The history of information technologies and particularly communications networking has been one of the interplay between technical innovation, market dynamics and intellectual leadership guiding policy actions, including military and research funding.

The data communications infrastructure, undoubtedly the world’s largest machine, required a set of political skills, both collective and individualized, to be implemented. In addition to the engineering skills that created the famed data packets and their TCP/IP (Transmission Control Protocol and Internet Protocol) protocols, political skills were needed for the funding, for the regulatory changes, and the global power needed to guide the international frameworks that shape what are now often called Information and Communications Technologies (ICT). These frameworks included key developments at the International Telecommunications Union (ITU), the World Intellectual Property Organization (WIPO) and the World Trade Organization (WTO).

Al Gore got support from the people generally considered to be the "real inventors" of the Internet. While Republicans continued to ridicule and "swiftboat" Gore for supposedly claiming he "invented the Internet," many in the scientific community, including the engineers who designed the Internet, verified Gore's role. Robert Kahn and Vinton Cerf acknowledged Gore's initiatives as both a Congressman and a Senator:

    As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s.

    But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises. – Vint Cerf

Senator Gore was heavily involved in the 1980s in sponsoring legislation to research and connect supercomputers. Gore was an important member of the "Atari Democrats." Along with Senator Gary Hart, Robert Reich, and other Democrats, he pushed forward "high tech" ideas and legislation for funding and research. The meaning of the term varied, but "Atari Democrat" generally referred to a pro-technology, pro-free-trade, socially liberal Democrat.

Atari was a very successful arcade game, console, and software company in the 1970s. Started by Nolan Bushnell, it gave a start to Steve Jobs and Steve Wozniak, among others. The term began to stick to some Democrats around 1982 and generally linked them to the Democrats' Greens and an emerging "neo-liberal" wing. It also suggested that they were "young moderates who saw investment and 'high technology' as an answer and complement to the New Deal." [1]

The New York Times discussed the Atari Democrats and the tensions that emerged during the 1980s between traditional Democratic liberals and the Atari Democrats. The latter were attempting to find a middle ground with the Republicans on the economy and international affairs, while the former courted union workers, many of whom had shifted to Reagan and the Republicans.[2]

One of the emerging issues of the time was the trade deficit with Japan, whose cars and electronics were making significant inroads into the US economy. Gore and other Democrats were particularly concerned about losing the race for artificial intelligence. The Japanese "Fifth Generation" AI project was launched in 1982 by Japan's Ministry of International Trade and Industry (MITI), which had a reputation at the time for guiding Japan's very successful export strategy.

Known as the national Fifth Generation Computer Systems (FGCS) project, the AI effort was carried out by ICOT (later the Advanced IT Research Group, AITRG), part of the Japan Information Processing Development Corporation (JIPDEC), a research institute that brought in the Japanese computer manufacturers (JCMs) and a few other electronics industry firms. A major US concern was that the combination of government involvement and the Keiretsu corporate/industrial structure of the Japanese political economy would give Japan a major advantage in advanced computing innovations.

Congress was concerned about the competition over high-speed processors and new software systems that were recognized at the time as crucial components in developing a number of new military armaments, especially the space-based "Star Wars" missile defense system that President Reagan had proposed as the Strategic Defense Initiative (SDI). Any system of satellites and weaponry forming a defensive shield against nuclear attack would need advanced microprocessors and supercomputing capabilities. It would need artificial intelligence (AI).

The likely vehicle for this research was the National Science Foundation (NSF), the brainchild of Vannevar Bush, who managed science and technology for the US during World War II, including the Manhattan Project that created the atomic bomb. The NSF was formed in 1950 with established areas of research in biology, chemistry, mathematics, and physics. In 1962, it set up its first computing science program within its Mathematical Sciences Division. At first, it encouraged the use of computers in each of these fields and later moved toward providing a general computing infrastructure, including setting up university computer centers, available to all researchers, in the mid-1960s. In 1968, an Office of Computing Activities began subsidizing computer networking, funding some 30 regional centers to help universities make more efficient use of scarce computer resources and timesharing capabilities.

In 1984, a year after TCP/IP was institutionalized by the military, the NSF created the Office of Advanced Scientific Computing, whose mandate was to create several supercomputing centers around the US. [2] Over the next year, five centers were funded by the NSF.

General Atomics — San Diego Supercomputer Center, SDSC
University of Illinois at Urbana-Champaign — National Center for Supercomputing Applications, NCSA
Carnegie Mellon University — Pittsburgh Supercomputer Center, PSC
Cornell University — Cornell Theory Center, CTC
Princeton University — John von Neumann National Supercomputer Center, JvNC

However, it soon became apparent that these centers alone would not adequately serve the scientific community. Gore began to support high-performance computing and networking projects, particularly the National Science Foundation Authorization Act, to which he added two amendments, one calling for more research on the "Greenhouse Effect" and the other calling for an investigation of future options for communications networks connecting research computers. This Computer Network Study Act would specifically examine the requirements for data transmission conducted through fiber optics, data security, and software capability. The NSF's decision in 1985 to choose TCP/IP as the protocol for the planned National Science Foundation Network (NSFNET) would pave the way for the Internet.

In the Supercomputer Network Study Act of 1986, Gore proposed to direct the Office of Science and Technology Policy (the Office) to study critical issues and options regarding communications networks for supercomputers at universities and Federal research facilities in the United States, and to require the Office to report the results to Congress within a year. The bill was attached to Senate Bill S. 2184, the National Science Foundation Authorization Act for Fiscal Year 1987, but it was never passed.

Still, a report was produced that pointed to the potential role of the NSF in networking supercomputers, and in 1987 the NSF agreed to manage the NSFNET backbone with Merit and IBM. In October 1988, Gore sponsored additional legislation for "data superhighways" in the 100th Congress: S. 2918, the National High-Performance Computer Technology Act of 1988. It was followed by H.R. 3131, the National High-Performance Computer Technology Act of 1989, sponsored by Rep. Doug Walgren, which would amend the National Science and Technology Policy, Organization, and Priorities Act of 1976 and direct the President, through the Federal Coordinating Council for Science, Engineering, and Technology (the Council), to create a National High-Performance Computer Technology Plan and to fund a 3-gigabit network for the NSFNET.

This work paved the way for S. 272, the High-Performance Computing Act of 1991, sponsored by Al Gore, which passed and was signed by President George H.W. Bush on December 9, 1991. Often called the Gore Bill, it led to the development of the National Information Infrastructure (NII) and the funding of the National Research and Education Network (NREN).

The Gore Bill began discussion of the “Information Superhighway” that enticed cable, broadcast, telecommunications, satellite, and wireless companies to start developing their digital strategies. It also provided the groundwork for Gore’s international campaign for a Global Information Infrastructure (GII) that would lead to the relatively seamless and cheap data communications of the World Wide Web.

Its $600 million appropriation also funded the National Center for Supercomputing Applications (NCSA) at the University of Illinois, where graduate student Marc Andreessen and others created Mosaic, the early Web browser that became Netscape. The Netscape IPO started the Internet’s commercial boom of the 1990s.

As chairman of the Senate Subcommittee on Science, Technology, and Space, Gore held hearings on these issues. During a 1989 hearing colloquy with Dr. Craig Fields of ARPA and Dr. William Wulf of NSF, Gore solicited information about what constituted a high-speed network and where technology was headed. He asked how much sooner NSFnet speed could be enhanced 30-fold if more Federal funding was provided. During this hearing, Gore made fun of himself during an exchange about high-speed networking speeds: “That’s all right. I think of my [1988] presidential campaign as a gigaflop.” [The witness had explained that “gigaflop” referred to one billion floating point operations per second.]

It's not my intention to obsess over the man or the personality. Rather, Gore is interesting because he has been a successful legislator and a mover of public opinion. He can also take credit for much of the climate change discussion. He has worked hard to bring the topic to the public's attention and to mobilize action on markets for carbon credits and the acceleration of developments in alternative energy. His actions and tactics are worth studying, as we need more leaders, perhaps "Atari Democrats," who can create positive futures rather than obsessing over tearing down what we have.

Notes

[1] E. J. Dionne, Special to The New York Times. WASHINGTON TALK; Greening of Democrats: An 80’s Mix of Idealism And Shrewd Politics. The New York Times, 14 June 1989, www.nytimes.com/1989/06/14/us/washington-talk-greening-democrats-80-s-mix-idealism-shrewd-politics.html. Accessed April 24th, 2019.

[2] Wayne, Leslie. Designing a New Economics for the “Atari Democrats.” The New York Times, 26 Sept. 1982, www.nytimes.com/1982/09/26/business/designing-a-new-economics-for-the-atari-democrats.html.


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is a Professor at the State University of New York, South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He began his teaching career at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.


