Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

US INTERNET POLICY, PART 2: THE SHIFT TO BROADBAND

Posted on | March 24, 2020 | No Comments

This post is the second in a series that I am producing during the COVID-19 pandemic about the importance of telecommunications policy in ensuring the widespread availability of affordable high-speed Internet access. Teaching online and working from home have gone from fringe activities to being central components of life. As we move toward a Smart New Deal to transform American life, how we structure our digital environments will be central. This post discusses the transition from dial-up modems in the early days of the Internet to high-speed broadband connections. With that technical transition and the FCC's 2005 decision, the competitive environment that the Clinton-Gore administration built collapsed into the cable-telco duopoly we see today and made "net neutrality" an issue.

The Internet Service Providers (ISPs) mentioned in US Internet Policy, Part 1 facilitated the process of getting individuals, businesses, and government "online" by linking to the Internet backbone and going "retail" with dial-up modems and then high-speed broadband connections. The term "online" emerged as a way to distinguish data communications from telephony, which was highly regulated by the Communications Act of 1934. ISPs offered businesses and consumers high-speed data services for accessing the World Wide Web, hosting websites, and handling large file transfers (FTP). The key was accessing the rapidly expanding Internet packet-switching backbone network that had been developed by the National Science Foundation (NSF).

The National Science Foundation's backbone network (NSFNET) began data transmissions at 56 kilobits per second (Kbit/s) but was upgraded to T1 lines in 1988, sending at 1.544 megabits per second (Mbit/s). It eventually consisted of some 170 smaller networks connecting research centers and universities. In 1991, the NSFNET backbone was upgraded to T3 lines sending data at 45 Mbit/s. Starting in 1993, the NSFNET was privatized, and a new backbone architecture incorporating the private sector was solicited.

The next-generation very-high-performance Backbone Network Service (vBNS) was developed as the successor to the NSFNET. vBNS began operation in April 1995 and was developed with MCI Communications, now part of Verizon. The new backbone consisted primarily of Optical Carrier (OC) lines, each of which bundled several fiber-optic strands to increase the line's total capacity. These interconnected Optical Carrier (OCx) trunk lines operated at 155 Mbit/s and higher, and their capacity soon multiplied from OC-3 (155 Mbit/s) to OC-12 (622 Mbit/s), OC-24 (1,244 Mbit/s), and OC-48 (2,488 Mbit/s). By 2005, OC-48 was surpassed by OC-192 (9,953.28 Mbit/s) and 10 Gigabit Ethernet.
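These OC designations follow a simple arithmetic: each OC-n rate is a multiple of the 51.84 Mbit/s SONET base rate. A quick sketch (illustrative only) reproduces the figures above:

```python
# SONET Optical Carrier (OC-n) rates are multiples of the 51.84 Mbit/s base rate.
OC1_MBITS = 51.84

for n in (3, 12, 24, 48, 192):
    print(f"OC-{n}: {n * OC1_MBITS:,.2f} Mbit/s")

# OC-3: 155.52, OC-12: 622.08, OC-24: 1,244.16, OC-48: 2,488.32, OC-192: 9,953.28
```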

As part of the NSFNET decommissioning in 1995, these backbone links connected to the four national network access points (NAPs) in California, Chicago, New Jersey, and Washington, D.C. The backbone expanded to multiple carriers that coordinated with ISPs to provide high-speed connections for homes and businesses.

At first, consumers used analog dial-up modems over telephone lines, at speeds that increased from 14.4 kilobits per second (Kbit/s, or just "k") in 1991 to 28.8 Kbit/s in 1994. The 33.6 Kbit/s modem followed soon after, and many thought it represented the upper limit for phone-line transmission. But the 56K modem was soon available, and a new set of standards continued to push the speed of data over the telephone system. The 56K modem was invented by Dr. Brent Townshend for an early music streaming service. The new design avoided an analog-to-digital conversion that seriously hampered data speeds and allowed content to be switched digitally to the consumer's terminal device, usually a PC.
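To put those modem speeds in perspective, a back-of-the-envelope calculation shows how long a modest 2-megabyte download would take at each generation (a rough estimate that ignores compression, protocol overhead, and line noise):

```python
# Approximate download time for a 2 MB file at successive dial-up speeds.
FILE_BITS = 2 * 8_000_000  # 2 megabytes expressed in bits

for label, kbps in [("14.4k", 14.4), ("28.8k", 28.8), ("33.6k", 33.6), ("56k", 56.0)]:
    minutes = FILE_BITS / (kbps * 1000) / 60
    print(f"{label}: about {minutes:.1f} minutes")
```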

Also during the 1990s, the telcos were conducting tests of a new technology called ADSL (Asymmetric Digital Subscriber Line). It was initially designed to provide video over copper lines to the home. The Baby Bells, in particular, wanted to offer television services to compete with cable television. It was called asymmetric because it could send data downstream to the subscriber faster (256 Kbit/s-9 Mbit/s) than upstream (64 Kbit/s-1.54 Mbit/s) to the provider.

ADSL was able to utilize electromagnetic frequencies that telephone wires carry but do not normally use. ADSL services separated the telephone signal into three frequency bands: one for telephone calls and the other two for uploading and downloading Internet traffic. Different versions and speeds emerged based on the local telco's ability and willingness to bring an optical fiber link close to the neighborhood or "to the curb" next to a household or business location.
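The approximate frequency plan is easy to sketch. The band edges below are typical of the common ADSL-over-POTS arrangement and are given here only as illustrative assumptions; actual splits vary by standard and deployment:

```python
# Approximate ADSL-over-POTS frequency plan (illustrative band edges).
BANDS_KHZ = [
    ("voice (POTS)", 0, 4),
    ("upstream data", 25, 138),
    ("downstream data", 138, 1104),
]

def band_for(freq_khz: float) -> str:
    """Return which band a given tone frequency falls into."""
    for name, lo, hi in BANDS_KHZ:
        if lo <= freq_khz <= hi:
            return name
    return "guard band / unused"

print(band_for(1))    # voice (POTS)
print(band_for(100))  # upstream data
print(band_for(600))  # downstream data
```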

They were soon called Digital Subscriber Lines (DSL), and they began to replace dial-up modems. High demand and competition from cable companies with high-speed coaxial lines pressured ISPs and telcos to adopt DSL technologies. DSL and the new cable technologies that carried Internet traffic as well as television came to be collectively called "broadband" communications.

Internet traffic grew at a fantastic rate during the late 1990s as individuals and corporations rushed to “get on the web.” The rhetoric of the “new economy” circulated and fueled investments in web-based companies and telecommunications providers.

A temporary investment bubble emerged as many companies lacked the technology or business expertise to obtain profits. Dot.coms such as Drkoop.com, eToys.com, Flooz.com, GeoCities, Go.com, Kozmo.com, Pets.com, theGlobe.com, and Webvan.com failed for a variety of reasons, but mainly because of flawed business plans and the premature expenditure of investment capital.

Similarly, many carriers such as Global Crossing, WorldCom, and ISPs overestimated web traffic and built excess capacity. In the wake of the dot.com crash in 2000 and the telecom crash in 2002, many ISPs filed for bankruptcy, including Wall Street darlings like Covad, Excite@home, NorthPoint, PSINet, Rhythms NetConnections, and Winstar Communications.

The broadband industry changed significantly after the 2000 election. Its technological infrastructure was devastated by the dot.com crash of 2000 and the telecom crash of 2002.

Furthermore, Internet policy changed after the Bush administration was reelected in 2004. The FCC effectively revoked Computer II in 2005 when it reclassified carrier-based broadband as an information service.

This meant that broadband was effectively unregulated and that telcos could compete directly with the ISPs. Instead of offering backbone services and being required to interconnect with the ISPs, the telcos became ISPs themselves and were no longer required to provide independent ISPs with their connection to the Internet. The competitive environment that had nurtured Internet growth was effectively decimated.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 1: The Rise of ISPs

Posted on | March 15, 2020 | No Comments

Much of the early success of the Internet in the USA can be attributed to the emergence of a unique organizational form, the Internet Service Provider or "ISP," which became the dominant provider of Internet and broadband services in the 1990s. These organizations resulted from a set of regulatory directives that pushed the Internet's development and created a competitive environment that encouraged the proliferation of ISPs and the spread of the World Wide Web.

In this series on Internet policy, I look at the rise of the World Wide Web, the shift to broadband, deregulation, and the consolidation of broadband services. Later in the series, I address the issue of net neutrality and raise the questions: "Is Internet service a utility? Is it an essential service that should be made universally available to all Americans at regulated prices?"

The first ISPs began as US government-funded entities that served the research and education communities of the early Internet. Secured by Al Gore in 1991, legislation signed by President George H. W. Bush created the model of the National Research and Education Network (NREN), a government-sponsored Internet service provider dedicated to supporting the needs of the research and education communities within the US. Internet2, Merit, NYSERNET, OARnet, and KanRen were a few of the systems that provided schools and other non-profit organizations access to the World Wide Web. While dial-up services like CompuServe existed in the early 1980s, only later were the ISPs opened for commercial traffic and services.

While telecommunications carriers had been moving some Internet traffic since the late 1980s, their role expanded dramatically after the Internet began to allow commercial activities. In June of 1992, Congressman Rick Boucher (D-VA) introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the US National Science Foundation Network (NSFNET). "A few months later, while waiting for Arkansas Governor William Jefferson Clinton to take over the Presidency, outgoing President George Bush, Sr. signed the Act into law." The amendment allowed advertising and sales activities on the NSFNET and marked the advent of online commercial activities.

As part of the National Information Infrastructure (NII) plan, the US government decommissioned the US National Science Foundation Network (NSFNET) in 1995. It had been the publicly financed backbone for most IP traffic in the US. The NII handed over interconnection to four Network Access Points (NAPs) in different parts of the country to create a bridge to the modern Internet of many private-sector competitors.

These NAPs contracted with the big commercial carriers of the time, such as Ameritech, Pacific Bell, and Sprint, for new facilities to form a network-of-networks anchored around Internet Exchange Points (IXPs). The former regional Bell companies were to be primarily wholesalers, interconnecting with ISPs. This relatively easy process of connecting routers put the "inter" in the Internet, but the interconnection points also became sites of performance degradation and unequal power relations.

As the Internet took off in the late 1990s, thousands of new ISPs set up business to commercialize the Internet. The major markets for ISPs were: 1) access services, 2) wholesale IP services, and 3) value-added services offered to individuals and corporations. Access services were provided for both individual and corporate accounts and involved connecting them to the Internet via dial-up, ISDN, T-1, frame relay, or other network connections. Wholesale IP services were primarily offered by facilities-based providers like MCI, Sprint, and WorldCom's UUNET (a spinoff of a DOD-funded seismic research facility) and involved providing leased capacity over their backbone networks. Value-added services included web hosting, e-commerce, and network-resident security services. By the end of 1997, over 4,900 ISPs existed in North America, although most of them had fewer than 3,000 subscribers.[2] See the below video and this response for how much things have changed.

FCC policy had allowed unlimited local phone calling for enhanced computer services, and early Internet users connected to their local ISP using modems over POTS (Plain Old Telephone Service). ISPs quickly developed software, distributed on CD-ROMs, that could be easily installed on a personal computer. The software usually put an icon on the desktop that, when clicked, would dial the ISP automatically, provide the password, and connect the user to the Internet. A company called Netscape created a popular "browser" that allowed text and images to be displayed on the screen. The browser used what was called the World Wide Web, a system for accessing files quickly from computer servers all over the globe.
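Behind that desktop icon, the dialer software was essentially automating a Hayes-style AT command exchange with the modem before handing the line over to the networking stack. The sketch below, using the pyserial library, is only a rough illustration of the idea; the port name, phone number, and flow are hypothetical placeholders, not any particular dialer's actual code:

```python
import serial  # pyserial

PORT = "COM1"           # hypothetical serial port
ISP_NUMBER = "5551234"  # hypothetical ISP access number

modem = serial.Serial(PORT, 57600, timeout=10)

def send(cmd: str) -> str:
    """Send a Hayes AT command and return the modem's reply."""
    modem.write((cmd + "\r").encode())
    return modem.read(64).decode(errors="ignore")

print(send("ATZ"))                # reset the modem
print(send("ATDT" + ISP_NUMBER))  # tone-dial the ISP's access number
# After a "CONNECT" response, the software would negotiate a PPP or SLIP
# session, supply the stored password, and hand the link to the browser.
```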

The ISPs emerged as an important component of the Internet's accessibility and were greatly aided by US government policy. The distinctions made in the FCC's Second Computer Inquiry in 1981 allowed ISPs to bypass many of the regulatory roadblocks experienced by traditional communication carriers. They opened up possibilities and created protections for computer communications. Telcos were to provide regulated basic services, while "enhanced services" were to stay unregulated. Dan Schiller explained:

    Under federal regulation, U.S. ISPs had been classed as providers of enhanced service. This designation conferred on ISPs a characteristically privileged status within the liberalized zone of network development. It exempted them from the interconnection, or access, charges levied on other systems that tie in with local telephone networks; it also meant that ISPs did not have to pay into the government’s universal service fund, which provided subsidies to support telephone access in low-income and rural areas. As a result of this sustained federal policy, ISPs enjoyed a substantial cross-subsidy, which was borne by ordinary voice users of the local telecommunications network.[3]

ISPs looked to equip themselves for potential new markets and to connect with other companies. For example, IBM and telecom provider Qwest hooked up to offer web hosting services. PSINet bought Metamor not only to transfer data but to host, design, and move companies from the old software environment to the new digital environment. ISPs increasingly saw themselves not only as providers of a transparent data pipe but also as providers of value-added services such as web hosting, colocation, and support for domain name registration.

The next part of this series will discuss the shift to higher-speed broadband capabilities. Later posts will examine the consolidation of these industries, starting in 2005 when the FCC changed the regulatory regime for wireline broadband services.

Notes

[1] Hundt, R. (2000) You Say You Want a Revolution? A Story of Information Age Politics. Yale University Press. p. 25.
[2] McCarthy, B. (1999) “Introduction to the Directory of Internet Service Providers,” Boardwatch Magazine’s Directory of Internet Service Providers. Winter 1998-Spring 1999. p. 4.
[3] Schiller, D. (1999) Digital Capitalism. The MIT Press. p. 31.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Undergraduate Director in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Five Stages of ICT for Global Development

Posted on | February 19, 2020 | No Comments

Summarized remarks from “Five Stages of Global ICT4D: Governance, Network Transformation, and the Increasing Utilization of Surplus Behavioral Data for Prediction Products and Services,” presented at the 27th AMIC Annual Conference on 17-19 June 2019 at Chulalongkorn University, Bangkok, Thailand.

This presentation will explore and outline the following stages of economic and social development utilizing information and communications technologies (ICT). The ICT acronym has emerged as a popular moniker, especially in international usage, for the digital technology revolution and is often combined with "development" to form ICT4D. Development is a contested term with currency in several areas. Still, in global political economy, it refers to the process of building environments and infrastructure needed to improve the quality of human life and bridge equity divides. Often this means enhancing a nation's agriculture, education, health, and other public goods that are not strictly economy-related but improve well-being and intellectual capital.

Of particular interest is the transformation of public-switched networks for Internet Protocol (IP)-based services and then WiFi and mobile use. Data-intensive solutions are beginning to address many development issues. However, a growing concern is that data is being collected extensively and used intrusively to manipulate behaviors.

The stages are categorized as:

1) Containment/Development/Modernization; 2) New World Economic and Information Orders; 3) Structural Adjustment and Re-subordination; 4) Global ICT Integration; and 5) Smart/Sustainable Mobile and Data-Driven Development.

Using a techno-structural approach, the explication of these stages will provide historical context for understanding trends in ICT innovation and implementation. This approach recognizes the reciprocal effects between technological developments and institutional power and uses a "dual-effects hypothesis" to illustrate the parallel potentials of ICT4D as both a democratizing and a totalizing force. This research will also provide insights into the possibilities of ICT diffusion in developing environments.

1) Containment/Development/Modernization

The US emerged as the primary economic and military power in the aftermath of World War II. Arms and materiel sales before the war had transferred much of the world's gold to the US, where it was wisely stored inland at Fort Knox, Kentucky. Franklin Delano Roosevelt (FDR) had sought to limit finance in the aftermath of the "Crash of 1929" and continued that effort globally with the initiative at Bretton Woods, New Hampshire. "Bretton Woods" created the International Monetary Fund (IMF), the World Bank, and the International Trade Organization (rejected by Congress), and also instituted a dollar-gold standard that tied the US dollar to gold at $35 an ounce (oz) and other international currencies to the US dollar at set rates. This system was designed to "contain" financial speculation and encourage trade and development.

The other aspect of containment, more widely known, is the containment of Communism. The painful success of the USSR in World War II in countering the Nazis on the eastern front, and its appropriation of German atomic and rocketry technology, presented an ideological and military threat to the US and its allies. The USSR's launch of the Sputnik satellites in 1957 resulted in the US's formation of NASA and ARPA. The Communist revolution in China in 1949 and its explosion of an atomic bomb on Oct. 16, 1964, spurred additional concern. The resultant Cold War and Space Race spurred technological development and competition for "developing countries" worldwide.

"Development" and "modernization" characterized the post-World War II US prescription for economic development around the world, especially in newly decolonized nation-states. Modernization referred to a transitional process of moving from "traditional" or "primitive" communities to modern societies based on scientific rationality, abstract thought, and the belief in progress. It included urbanization, economic growth, and a "psychic mobility" that could be influenced by types of media. Scholars talked of an eventual "takeoff" if the proper regimen was followed, particularly the adoption of new agricultural techniques termed the "Green Revolution."[1] Information and Communications Technologies (ICTs) were rarely stressed, but "five communication revolutions" (print, film, radio, television, and later, satellites) were beginning to be recognized as making a contribution to national development.

Communication technologies were beginning to spread information about modern practices in agriculture, health, education, and national governance. Some early computerization projects continued population analysis, such as the census that had started with tabulation machines, while mainframes and minicomputers were increasingly utilized for statistical gathering by government agencies for development processes.

Telegraphy and telephones were strangely absent from much of the discussion but were important for government activities as well as large-scale plantations, mining operations, transportation coordination, and maritime shipping. Because of their large capital requirements and geographic expanse, countries uniformly developed state-controlled Post, Telephone, and Telegraph (PTT) entities. Organized with the help and guidance of the International Telecommunication Union (ITU), the oldest United Nations entity, PTTs struggled to provide basic voice and telegraphic services. However, they provided needed jobs, technical resources, and currency for the national treasury.

Wilbur Schramm's (1964) book Mass Media and National Development made crucial links between media and national development. Published by Stanford University Press and UNESCO, it examined the role of newspapers, radio, and television. Its emphasis on the role of information in development also laid the foundation for the analysis of computerization and ICT in the development process. I had an office next to Schramm for many years at the East-West Center's Communication Institute, which he founded, while I worked on the National Computerization Policy project that resulted in the ICT4D benchmark study Computerization and Development in Southeast Asia (1987). Herbert Dordick, Meheroo Jussawalla, Deane Neubauer, and Syed Rahim were key scholars in the early years of ICT4D at the East-West Center.[1a]

2) New World Economic and Information Orders

Rising frustrations and concerns about neo-colonialism due to the power of transnational corporations (TNCs), especially news companies, resulted in a collective call by developing countries for various conceptions of a "New World Economic and Communication Order." The call was echoed by UNESCO in the wake of the OPEC oil shocks and the resulting Third World debt crisis. The issue was primarily news flow and the imbalanced flow of information between North and South. Developing countries were concerned about the unequal exchange of news and data between developed and developing countries. In part, it was the preponderance of news dealing with disasters, coups, and other calamities that many countries felt restricted flows of foreign investment. The calls caused a backlash in the US and other developed countries concerned about the independence of journalism and the free flow of trade.[2]

It was followed by concerns about obstacles hindering communications infrastructure development and how telecommunications access across the world could be stimulated. In 1983, UNESCO’s World Communication Year, the Independent Commission met several times to discuss the importance of communication infrastructure for social and economic development and to make recommendations for spurring its growth.

The Commission consisted of seventeen members – communication elites from both the private and public sectors, representing a number of countries. Spurred on by the growing optimism about the development potential of telecommunications, they investigated ways Third World countries could be supported in this area. They published their recommendations in The Missing Link (1984), or what soon came to be called the "Maitland Report" after its Chair, Sir Donald Maitland from the United Kingdom. This report brought recognition to the role of telecommunications in development and opened up resources from international organizations such as the World Bank.

The transition from telegraph and telex machines to computers also resulted in concerns about data transcending national boundaries. The Intergovernmental Bureau for Informatics (IBI), which had been set up as the International Computation Centre (ICC) in 1951 to help countries get access to major computers, began to study national computerization policy issues in the mid-1970s.

They increasingly focused on transborder data flows (TDF) that moved sensitive corporate, government, and personal information across national boundaries. The first International Conference on Transborder Data Flows was organized in September 1980, followed by a second in 1984; both were held in Rome, Italy. The increasing use of computers raised questions about accounting and economic data avoiding political and tax scrutiny. The concern was that these data movements could act like a "Trojan horse" and compromise a country's credit ratings and national sovereignty, as well as individual privacy.

3) Structural Adjustment and Re-subordination

Instead of a new international order, a new era of "structural adjustment" enforced by the International Monetary Fund emerged that targeted national post, telephone, and telegraph (PTT) agencies and other aspects of government administration and ownership. Long considered agents of national development and employment, PTTs came under increasing criticism for their antiquated technologies and lack of customer service.

In the early 1970s, Nixon ended the Bretton Woods regulation of the dollar-gold standard, resulting in very volatile currency markets. Oil prices increased, and dollars flowed into OPEC countries, only to be lent out to cash-poor developing countries. The flow of petrodollar lending and rising "third world debt" pressured PTTs to add new value-added data networks and undergo satellite deregulation. Global circuits of digital money and news emerged, such as Reuters Money Monitor Rates and SWIFT (Society for Worldwide Interbank Financial Telecommunication). These networks, among the first commercial systems to use packet switching, linked currency exchange markets worldwide in arguably the first virtual market.

A new techno-economic imperative emerged that changed the relationship between government agencies and global capital. PC-spreadsheet technologies were utilized to inventory, value, and privatize PTTs so they could be corporatized and listed on electronically linked share-market exchanges. Communications markets were liberalized to allow domestic and international competition for new telecommunications services and for sales of digital switches and fiber-optic networks. Developing countries became "emerging markets," consistently disciplined by the "Washington Consensus," a set of policy prescriptions meant to continue opening them up to transborder data flows and international trade.[3]

4) Global ICT Integration

Packet-switching technologies, standardized into the ITU's X.25 and X.75 protocols for PTT data networks, transformed into ubiquitous TCP/IP networks by the late 1990s. Cisco Systems became the principal enabler with a series of multi-protocol routers designed for enterprises, governments, and eventually telcos. Lucent, Northern Telecom, and other telecommunications equipment suppliers quickly lost market share as the Internet protocols, mandated by the US military's ARPANET and later by the National Science Foundation's NSFNET, were integrated into ISDN, ATM, and SONET technologies in telcos around the world.

The Global Information Infrastructure (GII), introduced at the annual ITU meeting in Buenos Aires in March of 1994 by Vice President Gore, targeted national PTT monopolies and government regulatory agencies. He proposed a new model of global telecommunications based on competition instead of monopoly. He stressed the rule of law and the interconnection of new networks to existing networks at fair prices. Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations, which called for the inclusion of the GATS (General Agreement on Trade in Services) covering everything from circuses to education, radio and television, and telecommunications services. At this meeting, the parties also called for the creation of the World Trade Organization (WTO).

Formed in 1995, the WTO held two meetings in 1996 and 1997 that created a new era of global communications and development. Members party to the new multilateral arrangement met quickly in Singapore in 1996 to reduce tariffs on international sales of a wide variety of information technologies. The Information Technology Agreement (ITA) was signed by 29 participants in December 1996. The agreement was expanded at the Nairobi Ministerial Conference in December 2015 to cover an additional 201 products valued at over $1.3 trillion per year. These agreements allowed Korea to successfully market early CDMA mobile handsets and develop a trajectory of success in the smartphone market.

In 1997, the WTO met in Geneva and established rules for the continued privatization of national telecommunications operations. Sixty-nine nations party to the WTO, including the U.S., signed the Agreement on Basic Telecommunications Services in 1997, which codified new rules for telecommunications deregulation, with countries agreeing to privatize and open their own telecommunications infrastructures to foreign penetration and competition by other telcos.

The agreements came at a crucial technological time. The World Wide Web (WWW) was a working technology, but it would not have lived up to its namesake if the WTO had not negotiated and reduced tariffs for crucial networking and computer equipment. The resultant liberalization of data and mobile services around the world made possible a new stage in global development.

Hypertext, Ad Markets, and Search Engines

The online economy emerged with the Internet and its hypertext click environment. Starting with advertising and the keyword search and auctioning system, a new means of economic production and political participation based on the wide-scale collection and rendition of surplus behavioral data emerged for prediction products and services.

As Shoshana Zuboff points out in Surveillance Capitalism (2019), the economy expands by finding new things to commodify, and the Internet provided a multitude of new products and services that could be sold. When the Internet was privatized in the early 1990s and the World Wide Web (WWW) established the protocols for hypertext and webpages, new virtual worlds of online media spaces were enabled. These were called "inventory." Or, you could call them ad space.

Behavioral data is the information produced as a result of actions that are measured on a range of devices connected to the Internet, such as a PC, tablet, or smartphone. Behavioral data tracks the sites visited, the apps downloaded, and the games played. Cloud platforms claim human experience as free raw material for translation into behavioral data. Some of this data is applied to product or service improvements; the rest is declared proprietary behavioral surplus and fed into advanced manufacturing processes known as "machine intelligence." Automated machine processes can capture knowledge about behaviors but also shape behaviors.

Surplus behavioral and instrumental data is turned into prediction products such as recommendation engines for e-commerce and entertainment. These anticipate what people will do now, soon, and later. Prediction products are traded in a new kind of marketplace for behavioral predictions called behavioral futures markets. These are currently used primarily in advertising systems based on click-through rates (CTR), Pay-Per-Click (PPC), and real-time bidding auction systems.
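As a simplified illustration of how such advertising systems work (this is a generic sketch of a generalized second-price auction ranked by bid times predicted CTR, not any particular platform's actual algorithm):

```python
# Toy ad auction: rank by expected value (bid x predicted click-through rate),
# then charge the winner just enough to keep its rank (second-price logic).
bids = [
    {"ad": "A", "bid": 2.00, "pctr": 0.010},
    {"ad": "B", "bid": 1.50, "pctr": 0.020},
    {"ad": "C", "bid": 3.00, "pctr": 0.005},
]

ranked = sorted(bids, key=lambda a: a["bid"] * a["pctr"], reverse=True)
winner, runner_up = ranked[0], ranked[1]

# Minimum price per click that still beats the runner-up's score.
price = runner_up["bid"] * runner_up["pctr"] / winner["pctr"]
print(winner["ad"], round(price, 2))  # ad B wins and pays about $1.00 per click
```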

5) Smart/Sustainable Mobile and Data-Centric Development

The aggressive trade negotiations and agreements in the 1990s significantly reduced the costs of ICT devices and communication exchanges worldwide, making possible a wide variety of new commercial and development activities based on ICT capabilities. We are at the halfway point for the sustainable development goals (SDGs) outlined by the United Nations in 2015. The SDGs are providing an additional impetus for ICT4D as it encourages infrastructure building and support for key development activities that ICT can assist, such as monitoring Earth and sea resources and providing affordable health information and communication activities.

A key variable is the value of the dollar, the world's primary transacting currency. A global shortage of dollars due to high interest rates or political risk means higher prices for imported goods, regardless of lower tariffs. The post-COVID crisis in Ukraine has stressed supply chains of key materials and rare earth minerals from Russia and Ukraine, further adding to potential costs and geopolitical risk. ICT4D is highly reliant on global supply chains making digital devices readily available at reasonable prices.

The near-zero marginal costs for digital products make information content and services more accessible for developing countries. Books, MOOCs, and other online services provide value to a vast population with minimal costs to reach each additional person. Platform-based services providing agricultural, health, and other development services provide low-cost accessibility and outreach. They allow new applications to scale significantly at low cost. Incidentally and significantly, renewable energy sources like solar and wind also provide near-zero marginal costs for producing electricity. Like digital products, they require high initial investments but produce output at low cost once operational.
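A toy calculation illustrates why near-zero marginal costs matter: once the large fixed cost is paid, the average cost per additional user collapses as a digital service scales (all figures below are hypothetical).

```python
# Hypothetical digital service: large up-front cost, tiny per-user cost.
fixed_cost = 1_000_000   # produce the platform or content once
marginal_cost = 0.002    # bandwidth/compute per additional user

for users in (1_000, 100_000, 10_000_000):
    average = (fixed_cost + marginal_cost * users) / users
    print(f"{users:>10,} users -> ${average:,.2f} average cost per user")
```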

Mobility, broadband, and cloud services are three significant technologies presenting positive prospects for ICT4D. Mobile broadband technologies that bypass traditional wireline "last mile" infrastructure have been a major boost to the prospects for ICT4D. They provide significant connectivity across a wide range of the population and with key commercial and government entities. 4G LTE technologies currently provide the optimal service, as 5G towers consume over 60% more power than LTE and also require more stations because their range is shorter.

Enhanced connectivity strengthens network effects. Blockchain technologies and cryptocurrencies, the Internet of Things (IoT), and the proliferation of web platforms are some of the current conceptions of how reduced costs for communications and information analysis are enhancing network effects and creating value from the collection and processing of unstructured data.
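One rough but common way to express network effects is Metcalfe's law, which values a network by the number of possible connections among its users rather than by the user count itself; a small sketch:

```python
# Metcalfe's law: potential connections grow as n*(n-1)/2,
# so they rise much faster than the number of users.
def possible_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 100, 10_000):
    print(f"{n:>6,} users -> {possible_links(n):,} possible connections")
```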

This project will expand on these stages and provide a context for a further investigation of ICT for development drawing on historical and current research. Of particular concern is the implementation of policies and practices related to contemporary development practices, but commercial and monetization techniques are important as well.

Notes

[1a] Dordick, Herbert S. and Deane Neubauer. 1985. “Information as Currency: Organizational Restructuring Under the Impact of the Information Revolution.” Bulletin of the Institute for Communications Research, Keio University, No 25, 12–13. This journal article was particularly insightful into the dynamics of the PTTs that would lead to pressures on them to adapt IP technologies leading to the World Wide Web.

[1] Rostow, W.W. (1960) Stages of Economic Growth: A Non-Communist Manifesto. Cambridge: Cambridge University Press. See also Rostow W.W., (1965) Stages of Political Development. Cambridge: Cambridge University Press.
[2] An excellent discussion of the various development and new world communication and economic order discourses can be found in Majid Tehranian's (1999) Global Communication and World Politics: Domination, Development, and Discourse. Boulder, CO: Lynne Rienner Publishers. pp. 40-41. Also, see Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communications Institute. Honolulu, Hawaii.
[3] Wriston, W.W. (1992) The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World.

Citation APA (7th Edition)

Pennings, A.J. (2020, Feb 19). Five Stages of ICT for Global Development. apennings.com https://apennings.com/how-it-came-to-rule-the-world/planting-to-platforms-five-stages-of-ict-for-global-development/


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D. is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002-2012 he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in Wellington, New Zealand. His American home is in Austin, Texas, where he taught in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

Digital Spreadsheets – Part 5 – Numeracy and the Power of Zero

Posted on | February 17, 2020 | No Comments

Previously, I explored the electronic spreadsheet as a meaning-making application that was central to the financial explosion of the 1980s and its economic aftershocks. Spreadsheets framed and produced information and meaning consequential to monetary and organizational practices as they became part of the daily routines of information workers. Although initially the purview of accountants and bookkeepers, spreadsheet usage became ubiquitous throughout work practices in conjunction with word processing and databases. The spreadsheet incorporated numerical formulations and innovations with systems of categorization and inventorying to become a tool of productivity and power over people and resources.

In my last post, I explored the importance of symbolic representation systems, mainly writing, in the workings of a spreadsheet. Written alphanumerical symbols have shaped Western society. For example, tables and lists are epistemological technologies that have historically organized administrative knowledge in castles, military camps, and monasteries. With the invention of ASCII characters, PC-based applications, including the spreadsheet, became powerful tools for organizing written and numerical facts in modern corporations and other organizations worldwide.

In this post, I will focus specifically on the power of numeracy, with a special emphasis on the role of zero. The zero is an extraordinary cognitive invention that has been central to the quantitative workings of the spreadsheet. In conjunction with Indo-Arabic numerals and double-entry accounting techniques, the spreadsheet has been crucial to the rise of modern capitalism and that peculiar historical manifestation, the corporation.

Although still used on occasion for style, Roman numerals have been mathematically obsolete for several hundred years. Initially based on scratching or tallying systems for sheep and other items, Roman numbers most likely represented hand figurations or gestures. For example, the number 10 or X probably represented two thumbs crisscrossed. Addition and subtraction were relatively straightforward, but division and multiplication were not as "easy or obvious." Roman numerals included thousands ("M"), but the system never developed a representation for a million or beyond.

The modern system of numeration is based on representations using ten different digits, 0 through 9, imported from the Middle East and Asia; it will be called Indo-Arabic in this post. These numerals are said to have been designed based on the number of angles each numeral contained, and over the years the way they are written has rounded out. The Arabian interest in Indian numerals based on zero arose to solve practical problems such as inheritances, purchases, sales contracts, tax collection, and wills. Indo-Arabic numerals, moving from India to the Middle East and finally to Europe, were crucial for accounting and financial systems that have since become global standards and key ingredients in spreadsheet formulations.

Evidence dates the zero (also known as the naught, or nil) back some 2000 years to the Angkor Wat civilization in Cambodia, although it is generally recognized that India refined its use around 500 AD, and it came to Europe in the early 1200s from Arabia. Muhammed ibn-Musa al-Khwarizmi, or "Algorismus," as his name was Latinized, was probably one of the most influential mathematicians in the transfer of this knowledge to the West. The Persian scholar taught in Baghdad sometime between 800 and 850. He wrote a book on the Hindu number system that was translated into Latin as De numero indorum or "On the Hindu numbers." He later wrote another seminal book, Al-jabr w'al muqabalah, which became known in Europe as Algebra; his Latinized name is also the root of the English word "algorithm."

One of the mathematicians who introduced these numbers to Europe was Leonardo of Pisa, or Leonardo Pisano, famously known as "Fibonacci," short for filius Bonacci, the son of Bonaccio. In his book Liber abaci (Book of the Abacus, or Book of Calculation), completed in 1202, he showed how the Indo-Arabic numbers could be used. The book was divided into four parts. The first introduced the Indo-Arabic numbers, especially zephirum, which became zefiro in Italian and zero in the Venetian dialect. The second section showed how calculations dealing with currency conversions, compound interest, and the determination of profit could benefit businesses. The third and fourth sections addressed a number of mathematical problems, including irrational numbers and the Fibonacci sequence that the author is most known for today. The video below introduces his relevance to commercial activities.
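The sequence that bears Fibonacci's name is simple to state: each term is the sum of the two before it. A short sketch:

```python
# The Fibonacci sequence popularized in Liber abaci: each term is the
# sum of the previous two.
def fibonacci(count: int) -> list[int]:
    seq = [1, 1]
    while len(seq) < count:
        seq.append(seq[-1] + seq[-2])
    return seq[:count]

print(fibonacci(10))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```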

It was the development of the zero and the related positional system that made modern "Western" calculation systems so effective. It has become necessary for a variety of mathematical purposes, including decimals, sets, and, quite significantly, the mathematical systems that make it easier to work with large quantities. Using the place-holding system, nine numbers plus zero can represent an infinity of figures. The same symbol, such as 7, takes on different meanings (7, 70, 700, etc.) depending on its location within the representation of the number. The positional base-10 system using ten different digits, 0 through 9, has been globally accepted as the primary mathematical standard for human calculation.
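A small sketch makes the place-holding idea concrete: the zero anchors each digit to its power of ten, so the same symbol contributes a different value depending on its position.

```python
# Positional (base-10) expansion: zero "holds the place" so each digit
# is multiplied by the power of ten its position represents.
def expand(number: int) -> str:
    digits = str(number)
    terms = [f"{d} x 10^{len(digits) - i - 1}" for i, d in enumerate(digits)]
    return " + ".join(terms)

print(expand(7))    # 7 x 10^0
print(expand(70))   # 7 x 10^1 + 0 x 10^0
print(expand(704))  # 7 x 10^2 + 0 x 10^1 + 4 x 10^0
```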

The base-10 positional system probably emerged from counting on our fingers, but it is well suited to arithmetical computations as it needs only ten different symbols and uses the zero to mark the place of a power of the base not actually occurring. Think of a car's odometer: every ten miles causes the next dial to turn, and milestones such as 10,000 and 100,000 miles are markers of a car's age to which we often ascribe significance.

One of the strengths of the spreadsheet is its ability to combine complex calculations with human understanding of the base-10 mathematical system. While the “alien intelligence” of computers can now handle more complex base systems such as the duodecimal (base-12) and sexagesimal (base-60) place-holding systems used in time and geographic calculations, base-10 is useful because, quite frankly, humans are used to it. Although computers use a base-2 system with nearly infinite combinations of 1s and 0s, the positional base-10 system has been globally accepted as the mathematical standard for human calculation and a key component of spreadsheet usability.

The calculative abilities of zero with other Indo-Arabic numbers brought new levels of certainty and confidence to commerce and eventually science in the West. By 1300, zero-based accounting and other numerical techniques were being adopted by the merchant classes. Double-entry accounting techniques emerged first for tracking resources and checking for errors, but later resulted in the conceptual separation of a business from its owner, a precursor condition for the emergence of the modern corporation.
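A minimal sketch of the error-checking logic behind double-entry bookkeeping: every transaction is recorded as both a debit and a credit, and the books balance only if the two totals agree (the accounts and amounts here are hypothetical).

```python
# Each transaction touches two accounts: a debit in one, a credit in another.
transactions = [
    {"debit": ("Inventory", 500), "credit": ("Cash", 500)},
    {"debit": ("Cash", 800),      "credit": ("Sales", 800)},
]

total_debits = sum(t["debit"][1] for t in transactions)
total_credits = sum(t["credit"][1] for t in transactions)

# If the totals disagree, an entry was miskeyed somewhere.
print("Books balance:", total_debits == total_credits)
```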

The spreadsheet drew on this history of numerical innovation to become a tool of organizational productivity and power. As part of my “formulating power” project, I will continue to examine the ways spreadsheets construct techno-epistemological knowledge using combinations of algorithms and other meaning-making practices.


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is Professor in the Department of Technology and Society at the State University of New York (SUNY) in Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii in the 1990s.

“It’s the Infrastructure, Stupid”

Posted on | February 9, 2020 | No Comments

Infrastructure, at the deepest level, is a techno-socio bond that brings together new communications technologies, new energy sources, new modes of mobility and logistics, and new built environments, enabling communities to more efficiently manage, power, and move their economic activity, social life, and governance. – Jeremy Rifkin, The Green New Deal.[1]

The 1992 US presidential election was notable for the phrase "It's the Economy, Stupid," as William Jefferson Clinton attacked incumbent president George H. W. Bush for ignoring economic and social problems at home. President Bush had won a dramatic military victory that drove the Iraqis out of Kuwait with "Desert Storm," but the economy at home languished with high unemployment and unprecedented debt. After the election, the new Clinton-Gore administration implemented the Global Information Infrastructure (GII) initiative that helped the obscure data networks used by academia and research institutes blossom into the Internet and its World Wide Web.

The framework combined with rapidly emerging technology and financial resources to create the famous "Bull Run" economic revival of the late 1990s. Starting with the Netscape IPO in August 1995, investment poured into "dot.coms" and telecoms. This information infrastructure continues as the dominant economic engine and social dynamic of the 21st century. However, opportunities exist to scale the networks and to include energy and mobility.

This week, I had a chance to attend the Japan-Texas Infrastructure Investment Forum in Austin. Texas and Japan are prolific trade partners and got together to discuss more “public goods” like transportation, water, and the possibilities of “smart cities.”

In this post, I want to connect the current imperative to build infrastructure in the US with green and smart technologies and this series’ emphasis on the New Deal. New thinking and infrastructure investments offer the opportunity to create enabling environments for sustainable and replenishing economic activity. These frameworks require examining political and juridical structures to open up avenues for new investments and evaluation systems. We may not want to reinvent the New Deal, but it’s a point of reference for examining the way forward.

If Texas were a country, it would have the 10th largest economy in the world, bigger than Australia, Canada, and even Russia. Japan, of course, is number 3, after the US and China. Japan is moving many of its US operations to Texas due to the Lone Star state's business environment, rich resources, and cosmopolitan cities. Texas exports chemicals, food, oil and gas, and electronic products.

I was primarily interested in seeing the presentations on Smart City developments, but was intrigued by the talks on water management in Texas (I’m Dutch, it’s in my genes) and transportation. I didn’t know, for example, that Texas desalinates over 150 million gallons of water every day. Also, Texans drive over 550 million miles a day. What would be the implications of renewable-powered desalination for agriculture and general water needs? How do you manage that much road maintenance? What alternatives are available to the traditional internal combustion vehicle? Just a few tidbits that set the context for the day’s presentations on building infrastructure in Texas.

One of the objectives for Japan was to pursue contracts for their Shinkansen, the high-speed railroad that makes Japan fairly easy to traverse. It’s only a matter of time before Dallas, Austin, and Houston are connected with more efficient lines, and Japan wants to get in on that.

They even brought in the Japan Overseas Infrastructure Investment Corporation for Transport & Urban Development (JOIN) and the Japan Bank for International Cooperation (JBIC) to support the funding of the operation. Having both driven the routes to Dallas and Houston as well as taken the railroad in Japan, I certainly would enjoy the Shinkansen bullet train for my next Texas trip.

Texas is unique because it is a major carbon-extracting and exporting state. But like Saudi Arabia, it recognizes the importance of pursuing infrastructural projects with a green tinge. Growth in Texas is expected to increase substantially over the next several decades, and that means new strategies for mobility, water availability, and disaster risk reduction.

Now we have to confront and analyze the implications of a post-petroleum age. Exploring the New Deal gives us a better perspective on the size of that task, and on how deep and sometimes intrusive the process will be. The New Deal was a monumental, multi-decade endeavor that made America great and will not be easily matched.

Roosevelt’s New Deal was primarily an infrastructure program. Facing economic collapse and massive unemployment, FDR promised “a new deal for the American people.”[2] In the first 100 days of the FDR administration, some 15 bills were passed to assist the recovery and put people to work. Some of the major projects were the Civilian Conservation Corps (CCC), the Public Works Administration (PWA), the Tennessee Valley Authority (TVA), and the related Rural Electrification Act. These were all designed to get people back to work and build an infrastructure that would support the new energy paradigm – hydrocarbons and electricity.

While economic recovery sputtered throughout the 1930s, federally-funded infrastructure projects built the roads, tunnels, and bridges for cars and trucks, the dams and coal-fired utilities for electrification, and some 125,000 public buildings including schools, hospitals, and government facilities. Even airports like LaGuardia outside Manhattan were a product of the New Deal.

The Hoover Dam tapped the Colorado River and provided electricity for the entire Southwest, including Los Angeles and a small town called Las Vegas. When the dam was completed in 1936, it was the largest electricity-producing facility in the world, providing power to Arizona, California, and Nevada. It electrified homes, entertainment, industry, and agriculture.

Another big infrastructure project of the New Deal era was the Interstate Highway System, whose planning began in 1938 and which eventually laid down almost 47,000 miles of public roads. Before the attack on Pearl Harbor, Roosevelt appointed a National Interregional Highway Committee to study the need for several cross-country interstate highways. The building of the "Autobahn" in Nazi Germany was a major source of motivation. In its report, Interregional Highways, the committee recommended constructing a 40,000-mile (64,000 km) interstate highway system. It was an extraordinary push for the mobilization and motorization of the US, providing unprecedented interconnection between cities and becoming the "killer app" for the automobile.

Vice-President Gore was influenced by his father, Senator Al Gore Sr., who co-authored the Federal-Aid Highway Act of 1956 infrastructure program during the Eisenhower Administration. Dwight Eisenhower had studied the German Reichsautobahnen as he plotted the invasion of Europe during World War II and was committed to building a nationwide highway system in the US. It created a network of roadways that sparked the US economy, eventually reaching some 46,000 miles. The son used that inspiration to conceptualize the National Information Infrastructure plan that turned the NSFNET into the Internet.

Citation APA (7th Edition)

Pennings, A.J. (2020, Feb 9). It’s the Infrastructure, Stupid. apennings.com https://apennings.com/democratic-political-economies/from-new-deal-to-green-new-deal-part-3-its-the-infrastructure-stupid/

Notes

[1] The quote "It's the Infrastructure, Stupid" in the title is also from Jeremy Rifkin, The Green New Deal. "It's the Economy, Stupid" was coined by campaign strategist James Carville during the Clinton-Gore 1992 presidential campaign to counter George H. W. Bush's success with the first Iraq War.
[2] Blitz, M. (2017, November 20). When America’s Infrastructure Saved Democracy. Retrieved February 8, 2020, from https://www.popularmechanics.com/technology/infrastructure/a24692/fdr-new-deal-wpa-infrastructure/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

FROM NEW DEAL TO GREEN NEW DEAL, Part 2: The Failure of the National Industrial Recovery Act of 1933

Posted on | February 6, 2020 | No Comments

This is the second of an ongoing series about how the New Deal restructured the American political economy and what lessons it has for transitioning to a Green New Deal. The first post dealt with the New Deal emerging from the wake of the Great Depression and the immediate policy responses by the Roosevelt administration to deal with the banking and financial crisis. This post looks at a failure: the National Industrial Recovery Agency's (NIRA) attempts to administer a wide range of codes and practices for business and industry. What are the lessons of the NIRA for the Green New Deal?

The green energy revolution strives for zero emissions of harmful carbon by-products and near-zero marginal costs after installation. Hydrocarbons from coal or oil are incredible sources of energy, but they emit deadly carbon monoxide and climate-threatening carbon dioxide. They are also used up in consumption and require constant replenishment: good for petro-states like Russia and Saudi Arabia, but a constant currency drain for the rest of the world. Renewables produce a constant supply of energy. They don't last forever and eventually require replacement, but their economics are extraordinary and will make possible exciting new opportunities such as the desalination and purification of saltwater.

The Green New Deal will reach deep into the commerce and industrialization of the global economy. While the movement is gaining momentum, a few sold Teslas, and some homes with solar panels do not a revolution make. Although I recently had my first ride in a Tesla and it was awesome.

The Green New Deal will need to continue to build the Smart Grid technologically while changing utility regulation to allow smaller microgrids that can utilize local resources, including buying energy from residences and small businesses. Broadband networks and the Internet of Things (IoT) will be crucial to this convergence and to providing the "smart" aspects of the grid. Other industries that will be affected include agriculture, architecture, automobiles, construction, supply chain logistics, the military, etc. [1]

How will they all work together? Not only between different industries but between different companies within the same sector. How will winners emerge? What will happen to losers? Solyndra, for example, became a major political issue when it filed for bankruptcy in 2011. It was a manufacturer of innovative thin-film solar cells based in Fremont, California. Solyndra received significant subsidies in the form of guaranteed loans from the Department of Energy as part of the economic stimulus plan. But it still couldn't compete with more traditional solar cell technology companies, especially from China. What are we to make of situations like this?

The Green New Deal faces many significant and complicated issues. What are the electrical standards, for example, the building codes? The sewage interconnections? Establishing networks of automobile recharging (or hydrogen refueling) stations? Can prices within an industry be regularized without penalizing consumers? Does labor organization need to be revived from its decimation during the Reagan years?

A step back for the New Deal…

In his larger attempt at industrial structuring, President Roosevelt sent a plan to Congress that became the National Industrial Recovery Act of 1933. Congress passed it into law on June 16 of that year. The Act created the National Industrial Recovery Agency (NIRA) to administer codes of practice for business and industry. The Act was “a clear victory for the many prominent businessmen who were backing cartelization as a solution to the nation’s industrial problems.” [1]

The Act allowed industries to create "codes of fair practice." By suspending antitrust laws, these codes allowed corporations in particular sectors to set prices, restrict output, and increase profits. They were agreements that enabled the NRA, working with the dominant trade associations, to administer what were in effect national cartels. Although the Act was later declared unconstitutional by the Supreme Court, many of the codes found their way into later legislation and became part of the national restructuring of the US economy.

While Hoover had limited his activism to establishing the Reconstruction Finance Corporation, which served as a lender of last resort to banks and the railroads, he opposed cartelization and thus alienated himself from many business leaders. The NIRA placated most big business concerns. However, for political reasons, Roosevelt's plan was designed to serve many other constituencies. He had made concessions to reassure conservatives that socialism was nowhere near the path he was taking, nor was he promoting "monopoly." But opposition mounted from small business people and farmers who saw the codes being dominated by big business.

Finally, the Act began to antagonize large corporations because of Section 7(a), which encouraged labor organization and had led to a series of violent strikes.

A Brooklyn, NY poultry company convicted of violating the codes challenged the law, and the case went all the way to the Supreme Court. In Schechter Poultry Corp. v. United States, the Court rejected the compulsory-code system, arguing that the NIRA improperly delegated legislative powers to the executive and that regulating the poultry trade did not meet the standard of interstate commerce, a constitutional requirement for federal regulation. In May 1935, the Supreme Court declared the Act unconstitutional.

The New Deal shows us how massive and complicated a major economic reorganization can be. Proponents of the Green New Deal should seriously study the issues that FDR confronted as he worked to revive the economy and chart a new course for the US that avoided revolution. The case of the NIRA codes gives us some idea of the scale of the transition and the challenges of government intervention in the economy.

Notes

[1] Rifkin, J. (2020) Green New Deal.

[2] McQuail, K. (1982) Big Business and Presidential Power. NY: William Morrow and Company. p. 27.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

YouTube Meaning-Creating (and Money-Making) Practices

Posted on | February 2, 2020 | No Comments

Note: This is required reading for my Visual Rhetoric and IT class.

YouTube has emerged as the primary global televisual medium, attracting about 1.3 billion viewers from countries around the world, with over 5 billion videos watched every day. People consume some 3.25 billion hours of YouTube videos each month, and more than ten thousand YouTube videos have generated over 1 billion views since they were posted. YouTube content ranges from homemade DIY videos to professional high-definition television productions.

Alphabet recently announced that YouTube made $15 billion in advertising revenues in 2019, growing 36% over 2018, which implies roughly $11 billion the year before. That is a lot of money to spread around.

YouTube provides opportunities for new publishers or "vloggers" covering a wide range of topics. Every minute, some 400 hours of video are uploaded to YouTube from all around the world. Not many of those channel owners get rich, but some have done extraordinarily well. Together, the world's 10 highest-paid YouTube stars made $180 million in the year between June 1, 2017, and June 1, 2018, almost double the year before.

One big star to emerge on YouTube is Daniel Middleton (DanTDM), who made US$18.5 million in 2018. Middleton is a British professional gamer, and his videos primarily cover games like Minecraft, Plants vs. Zombies, and other favorites that DanTDM's primary audience, young kids, enjoy. Here he reviews the massive hit called Fortnite.

What makes DanTDM’s YouTube videos successful? What does he do to keep the viewer’s interested in his content and what keeps his audience coming back for more? How does he create entertainment and meaning for those who watch his show?

Even more extraordinary is Ryan ToysReview (now called Ryan's World!). Ryan is the 7-year-old host of the show, and his was the top-earning YouTube channel at $22.5 million for the year ending June 1, 2018.

What knowledge can we gather about Ryan’s World!? What observations can we make about his show and other popular channels?

This series of posts will set out to explore a crucial relationship in (digital) media studies – between cultural/technical production practices and the meanings, emotions, and feelings that are produced by those practices. Media production involves a combination of equipment and processes to capture and construct various images, edit sequences, and integrate audio and sound effects to produce specific results. These are the meaning-making techniques that construct our blockbuster movies, our Netflix binge favorites, our local newscasts, and also the YouTube channels that educate and entertain us. Can we use some of the same analytical techniques to “interrogate” YouTube channels?

A good deal of related work has been done on film and television. By exploring camera shots (close-ups, zooms, pans, shot composition) as well as montage (cutting rates, parallel editing, reaction shots, wipes), film studies and even television studies have given us a literacy for understanding the power of these media. These important meaning-making practices can also be discerned in the realm of YouTube videos.

Social media apps like YouTube present significant new complications in understanding the power of the global mediasphere. One area of concern is the set of metrics associated with YouTube. Ratings have always been a significant part of how television services determine the value of programming. YouTube measures "views" and adds likes, dislikes, shares, play-listing, and subscribers to gauge the credibility and commercial viability of a channel. But vulnerabilities in the system allow many of these numbers to be tweaked by the "fake-view" ecosystem that has grown around YouTube. The sketch below illustrates how such metrics might be combined into a single engagement score.
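YouTube does not publish its ranking or monetization formulas, so the following is only a hypothetical sketch; the weights, the function name, and the example numbers are assumptions chosen to make the idea of an engagement metric concrete.

def engagement_score(views, likes, dislikes, shares, new_subscribers):
    """Weight active interactions more heavily than passive views, normalized per view."""
    if views == 0:
        return 0.0
    interactions = (2 * likes) - dislikes + (3 * shares) + (5 * new_subscribers)
    return round(interactions / views, 4)

# Two videos with identical view counts but very different audience behavior.
print(engagement_score(100_000, 8_000, 200, 1_500, 600))  # 0.233
print(engagement_score(100_000, 1_000, 900, 50, 20))      # 0.0135

Any such score is only as trustworthy as its inputs, which is exactly why the fake-view ecosystem matters: inflating the raw view count distorts whatever formula sits on top of it.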

YouTube has become a new frontier for media studies. The opportunity exists now to pioneer strategies for understanding this intriguing visual medium and the sets of meanings it creates. What techniques are used in YouTube channels? What types of persuasive techniques are effective there? How do they differ from techniques used in film and television? Who is driving the narration of the video, and what voices are they using?

But there are broader issues to address as well. What are the cultural, economic, and social implications of YouTube? What new ideas and cultural forms diffuse via YouTube? What economic activities and opportunities are made available through the platform? What impact will YouTube have on existing institutions?

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

FROM NEW DEAL TO GREEN NEW DEAL, Part 1: Roosevelt Saves Capitalism

Posted on | February 2, 2020 | No Comments

“I pledge you, I pledge myself, to a new deal for the American people.” – FDR in the summer of 1932.

Recent discussions about a proposed Green New Deal encouraged me to review some of my notes on the original New Deal and to consider whether it could provide relevant insights into our current situation. The New Deal began in the early 1930s as a response to the economic crash of the late 1920s and built up momentum during the "Great Depression" of the next decade. It ended, arguably, just before the end of the millennium with the repeal of the Glass-Steagall Act on November 12, 1999, when President Clinton signed the Financial Services Modernization Act. The New Deal was one of the most influential sets of legislation in American history, and it set the course of the modern United States.

On October 1, 1928, the Dow Jones Industrial Average closed at 240.07. Earlier that year, "the Dow" had been expanded from the traditional twelve stocks to thirty. The "Roaring Twenties" had been a good decade for many investors. But that was about to end for most.

The index would continue to rise over the next year to 381 before beginning its dramatic descent. In October 1929, the Dow began its steep decline as investors hastened to liquidate their positions. From Thursday, October 24 to Tuesday, October 29, the stock market crashed dramatically, and the Dow eventually bottomed out at just 41 in the summer of 1932.

Capitalism had run into deep trouble. Never before had public confidence been shaken so thoroughly. Unemployment was estimated to have risen to 25% in the US and England. It was even worse in Germany, which had been strapped with war reparations under the Treaty of Versailles. Political unrest was brewing in democratic political economies around the world. Little more than a decade earlier, a Communist revolution had occurred in Russia, and many believed that the problems of the industrial economy needed a communist or socialist solution.

In the US, protest marches became frequent. Rent riots broke out as people organized, often physically, to prevent home evictions and farm foreclosures. The country was in such dire straits that many believed it could have gone in any political direction. This turmoil carried over into the term of the next President, Franklin D. Roosevelt, who beat Republican incumbent Herbert Hoover and was inaugurated on March 4, 1933.[1]

Roosevelt moved immediately to save the banking system, which had been experiencing significant drains on deposits. He closed the banks until federal auditors could review their books. Although many were deemed insolvent, he decided to save the remaining banks by declaring a bank moratorium that suspended acknowledgment of their demise.

After deliberating with Treasury Department officials, former Hoover advisers, and several leading bankers to develop intervention measures and reform practices, Roosevelt reopened many banks. However, he decided to leave them in the hands of their original owners instead of nationalizing them, as many thought he might.

The Emergency Banking Act of March 9 allowed the Federal Reserve to make loans to businesses and nonmember banks against assets that could be defined very broadly. The Reconstruction Finance Corporation, Hoover's singular response to the economic collapse, was authorized to buy stock in banks and thus provide them with working capital. Three days later, Roosevelt delivered his first "fireside chat," in which he pleaded with citizens to redeposit their money. The legislation, plus his encouragement, was a success, and by the end of the month he had saved the banking system.

Immediately after the inauguration, Roosevelt took controversial measures to stop the hoarding of gold. In April, he took the US off the gold standard and called on Americans to turn in their gold coins for paper currency. On January 31, 1934, the US devalued the dollar, raising the official price of gold from $20.67 to $35 an ounce, a reduction of roughly 41 percent in the dollar's gold content.

In the mid-1930s, amidst concerns about the growing threat of fascism in Europe, Roosevelt decided to move US gold reserves from New York to Fort Knox in Kentucky. Moving the gold past the Appalachian Mountains and next to a military tank battalion and training facility reduced the risk that a Nazi attack on Manhattan might capture significant amounts of gold bullion. Central bank vaults were primary targets of the Nazis in their early invasions of Czechoslovakia and Poland, as gold was essential to Germany's efforts to procure oil and other critical supplies internationally for its war effort.

Despite the boldness and swiftness with which FDR carried out his plan, it was basically a very conservative response to the national disaster. The program he embarked on was essentially to save capitalism. At the time, Russia was firmly in the grip of its Communist leaders, and Hitler and the National Socialists (Nazis) had strengthened their hold on Germany. Many business leaders, ensconced in the rhetoric of laissez-faire and free-enterprise economics, were slow to recognize the conservatism of the New Deal. They chastised Roosevelt in the newspapers and on the radio. But he was very popular with the people, including Ronald Reagan, the future US president who would later become one of the New Deal's fiercest critics.

Over the next few years, the administration established a reformed capitalist system based on the rationalization of business, finance, and labor practices and focused on long-term stability. Commercial banks were separated from other activities, such as investment banking and stock brokerage, through the Glass-Steagall Act of 1933. The Securities Exchange Act, passed the next year, created the Securities and Exchange Commission (SEC) to oversee the stock markets and prevent manipulation and rigging. The Federal Reserve Board in Washington was also given greater powers to oversee the regional Reserve Banks. The Federal Deposit Insurance Corporation (FDIC) was instituted to prevent further bank panics and restore depositor confidence.[2]

Hopefully, the transition to a green economy will not require such a dramatic event as the Great Depression. Tragically, we might be facing an even bigger threat in climate change and environmental pollution. The New Deal was central to transitioning to a carbon-based industrial model that was still heavily reliant on manual labor. Coal and oil were central to the New Deal, as was the process of electrification.

A crucial component of the new economic transition will be "green finance": how will the Green New Deal be paid for, and what are the ideological implications of the process? Will the current banking system suffice? Can pension funds and other wealth management funds be sufficiently incentivized to make the crucial investments? Will the political will be developed to stop subsidizing fossil fuels? Perhaps most importantly, we will have to reconcile the roles of government and the private sector. Specifically, will the Green New Deal be "socialist," or can capitalism be harnessed for the transition?

Part 2 of this series will discuss an early failure of the New Deal: the National Recovery Administration's (NRA) attempts to administer the economy through a wide range of codes and practices for business and industry under the National Industrial Recovery Act (NIRA).

Notes

[1] The Three Roosevelts. (n.d.). Retrieved February 16, 2020.
[2] McQuail, K. (1982) Big Business and Presidential Power. NY: William Morrow and Company.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

    Follow apennings on Twitter

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.