Anthony J. Pennings, PhD

WRITINGS ON AI POLICY, DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL E-COMMERCE

Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera

Posted on | June 9, 2020 | No Comments

Keanu Reeves produced a thoughtful and timely documentary on the move from photochemical to digital film. In Side by Side (2012), he interviews some of the best directors and directors of photography (DPs) about the transformation of the filmmaking process from celluloid to silicon. The transition, which has taken decades, is worth examining through the lens of Clayton Christensen's theory of disruptive innovation, which examines how a technology can start out "under the radar" as an inferior and cheaper version that is continuously improved until it disrupts a major industry.

The documentary looks at the development of digital cameras, editing, and special effects processes, as well as distribution and projection systems. For each category, it examines the differences between film and digital video. In interviews with prominent actors, directors and editors, the pros and cons of each are discussed along with the obvious arc of the movement towards the increased use of digital processes in cinema.

In this post, I introduce some of the issues in the move to digital cameras within the context of disruptive innovation theory. The theory is instrumental in understanding how a technology like digital video, which was obviously inferior for so long, could emerge to seriously challenge the beauty and institutional momentum of celluloid cinema.

Film emerged as a working technology during the 1880s and 1890s, with significant improvements in cameras and projection technologies made primarily by Thomas Edison and the French Lumière brothers. Innovations occurred over the next 100 years, but the invention of the charge-coupled device (CCD) at Bell Labs in 1969 marked the beginning of a disruptive trend: digital cameras would slowly improve until they became a major competitor to film cameras.

The CCD was initially developed for spy satellites during the Cold War but was later integrated into consumer and commercial video products. Steven Sasson used it at Kodak to build the first digital camera in 1975. The technology was very crude, however, and Kodak did not see it as a worthy replacement for its film-based cameras. Neither did the film industry.

It was the Japanese electronics company SONY that developed the camcorder in the 1980s based on CCD technology and continued development into the generation of 4K cameras. Similar resolution was achieved by the Red One camera that rocked the film industry in 2007; its maker, Red Digital Cinema, announced the Red Dragon 6K sensor at NAB 2013.

CCDs were largely replaced by complementary metal-oxide semiconductor (CMOS) technology in digital cameras. A spinoff from NASA's Jet Propulsion Laboratory (JPL), CMOS sensors used less power and allowed smaller cameras. The technology made an enormous impact on social media, bringing awareness to the uprisings of the Arab Spring and other crises around the world.

"Digital cinematography" emerged by 2002 with George Lucas's Star Wars: Episode II – Attack of the Clones, which was shot entirely in a high-definition digital format. Because theaters lacked digital projection systems, the footage had to be transferred to film for exhibition, but the movie still caused major controversy as the industry debated digital's pros and cons. While these cameras were initially inferior to their film counterparts, SONY and other companies stayed on the digital trail, improving them unmistakably to the point where early adopters like Lucas took a chance.

By committing first to consumer markets, digital cameras found the resources to improve continually. Later they showed additional characteristics that film couldn't match, such as the ability to store huge amounts of footage in very small devices. This meant no more transporting cans of film – a major consideration when shooting in remote and hazardous locations. Increased storage capacity also meant the ability to shoot for longer periods of time – often to the actors' chagrin but with the benefit of maintaining momentum.

Another benefit was being able to watch what was being shot instantaneously on a monitor and to review the shots while still on the set. This capability was very popular with directors but shifted power away from the directors of photography.

The last fifteen years of camera development have witnessed the disruption of the entire global complex known as “Hollywood.” New cameras such as the Red Dragon and other digital technologies for editing, special effects, and distribution played havoc with the entire chain of creative processes established in the film world and its production and distribution circuits. Digital convergence has also broken down the barriers between film and television and expanded cinematic presentations to a wide variety of screens, from mobile phones to stadium jumbotrons.

Are we still in the infancy of an entirely new array of digital televisual experiences? The year 2007 marked a critical threshold for the use of digital technologies in cameras, but the innovations continued, including the integration of the digital camera into the smartphone and the wide-scale adoption of digital high definition. Rather than LIDAR, it is the digital camera that has been adopted by Tesla and other EV makers to enhance automation and safety.

Film continues to be championed by directors like Christopher Nolan who used it in Interstellar (2014) and Dunkirk (2017). Quentin Tarantino is also known for being dedicated to film in his movies, including Once Upon a Time in Hollywood (2019).

In the next part of this series, we look at the disruptive influence of digital editing. Following that is a look at the non-linear editing (NLE) systems also made possible by the digital revolution.

Citation APA (7th Edition)

Pennings, A.J. (2020, Jun 9). Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera. apennings.com. https://apennings.com/multimedia/digital-disruption-in-the-film-industry-gains-and-losses-part-1-the-camera/




Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. He wrote this during a brief stay at the Digital Media Management program at St. Edwards University in Austin, Texas. Before joining SUNY, he taught at Hannam University in South Korea and from 2002 to 2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

“Letteracy” and Logos

Posted on | May 5, 2020 | No Comments

More than ever, people are interested in how visual design influences the production of meaning and its intellectual and emotional effects. In the new media age, a new type of literacy has emerged that Seymour Papert (1993) and others began to call “letteracy.” Papert was critical of the idea of introducing letters too early in a child’s development but recognized that connecting with culture and history required understanding alphabets and their significance.

"Letteracy" suggests a larger conversation about global visual culture and why people are increasingly interested in the impact of letters, typographies, and logos in our media world. A twist on "literacy," it points to the discrepancy between a world in which reading is pervasive and our relative ignorance of how letters are designed and how they influence us.

This blog post introduces letteracy by examining the significance of the alphabet and then focusing on the importance of fonts and typography in visual design, such as in a magazine or webpage layout, as well as in the use of logos.

Letters Capture Sound

One of the first questions to ask is “What are letters?” Letters typically refer to characters of the alphabet, which are used in written language to represent sounds. Letters are the building blocks of words and are fundamental to written communication in many languages. Each letter typically represents one or more sounds in a spoken language, and when combined in various sequences, they form words that convey meaning.

Letters are phonographic – they code the sounds of language in scripted figures. A few writing systems, like Chinese characters, are ideographic: they code ideas into their figures. Phonographic writing has the advantage of coding everyday language in its letters while being flexible enough to incorporate new words. Ideographic writing requires extensive memorization and social mentoring to enforce meanings and consistency in sound reproduction.

Asian societies like Korea, and to a lesser extent, Japan, have replaced Chinese characters with phonographic characters. Korea instituted "Hangul," which is phonographic but has some iconic aspects: the characters represent the movements of the tongue and lips used to achieve the sounds. The change allowed Korea's population to achieve a high rate of reading literacy. Japan has two sets of phonographic characters, hiragana and katakana. Both are sound-based, but each character represents a whole syllable – the vowel and the consonant. To make the situation a bit more complicated, Japan still uses "Kanji" ideographic characters borrowed from China.

Fonts and Typography

A key distinction in letteracy is between the terms "font" and "typography," which are often used interchangeably but refer to different aspects of written or printed text. A font is a specific style or design of a set of characters that share consistent visual characteristics. This set includes letters, numbers, punctuation marks, and symbols. Font characteristics include attributes such as typeface, weight, style (e.g., regular, italic, bold), size, and spacing. Examples of fonts include Arial, Times New Roman, Helvetica, and Comic Sans.

Typography, on the other hand, encompasses the art and technique of arranging type to make written language readable, legible, and visually appealing. It is the design and arrangement of letters and text so that the writing is easy to understand, appealing, and conveys an appropriate set of feelings and meanings. Typography involves the selection and use of fonts, as well as considerations such as layout, spacing, alignment, hierarchy, color, and overall design. Good typography involves careful attention to detail and consideration of the intended audience, context, and purpose of the text.

Below is one of my favorite TED talks about typography.

Spacing

Part of this literacy is understanding the various meanings associated with typography. Type fonts can be designed and used with various purposes in mind. The "Power of Typography" video above explains this in more detail. As the speaker points out, spacing is an important issue in typography. Kerning, tracking, and leading are three terms that describe the use of space and help us with denotative analysis.

Kerning deals with the distance between two letters. Words are indecipherable if the letters are tooclose or too far apart. They can also be awkward to read when some letters have wi d e r spacing and others narrower.

Tracking involves adjusting the spacing throughout the e n t i r e word. It can be used to change the spacing equally between every letter at once. Tracking can make a single word seem airy and impressive but can quickly lead to difficulty in reading if used excessively.

Leading is a design aspect that determines how text is spaced vertically in lines. It deals with the distance from the bottom of the words above to the top of the words below in order to make them legible.

From Printing Press to Desktop Publishing

Johannes Gutenberg is credited with inventing both the printing press and the production of durable typefaces around 1450 AD. The technology had also been developed in China and Korea, but conditions in Europe were better for its expansion. Printing presses in China and Korea were state-based projects that eventually withered. In Europe, by contrast, religious, market, and political conditions improved the technology's chances of success.

The first best-seller? The Christian Bible. In 1517, the Protestant Reformation began, emphasizing reading of the Bible over the services of the Catholic Church and its priests. It also helped Europe develop separate nation-states as people became more literate in their local languages. Printed materials in different dialects began to consolidate community identities. People began to identify with others who spoke the same dialect and to recognize them as sharing the same national or "tribal" values. Benedict Anderson called these "imagined communities" in his book of the same name.

Thanks to Steve Jobs and the Apple Macintosh graphical user interface (GUI), different typefaces were added to computers. Along with WYSIWYG ("what you see is what you get") displays, the new GUI enabled desktop publishing and democratized the printing "press." Consequently, understanding the importance of different styles of letters became an important literacy of the digital age.

Logos

A logo is the graphic signature of a person or organization. It is meant to encapsulate and communicate the preferred symbolic meanings of an organization that cannot be imparted through words alone. A logo is a simple, abstracted representation of an individual or corporate identity. It is a constructed icon meant to immediately denote and connote meaning.

A logo should maintain its clarity in many different sizes. It should be designed as an easily remembered icon that represents its bearer. It is meant to be seen and recognized instantly. It is also meant to be reduced in size without disappearing from view or losing its legibility. 

A logo may include visual elements such as colors, forms, fonts, and shapes, and even dress codes. It should be effective in both black-and-white and color, and it should reproduce well across different media (print, RGB screens, etc.) and packaging. Few logos achieve these standards, but those that succeed play an important role in determining success.

Conclusion

Letteracy provides a framework for understanding the significance of letters, typography, and logos in visual design. By appreciating the art and science of typography and recognizing the power of logos, we can better comprehend and communicate through the visual language of text and symbols in our media-saturated world.

Citation APA (7th Edition)

Pennings, A.J. (2020, May 5). "Letteracy" and Logos. apennings.com. https://apennings.com/visual-literacy-and-rhetoric/letteracy-and-logos/




Anthony J. Pennings, Ph.D. is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. His American home is in Austin, Texas, where he has taught in the Digital Media BS and MBA programs at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 2: The Shift to Broadband

Posted on | March 24, 2020 | No Comments

This post is the second in a series that I am producing during the COVID-19 pandemic about the importance of telecommunications policy in ensuring the widespread availability of affordable high-speed Internet access. Teaching online and working from home have gone from fringe activities to being central components of life. As we move to a Smart New Deal to transform American life, how we structure our digital environments will be central. This post discusses the transition from dial-up modems in the early days of the Internet to high-speed broadband connections. With that technical transition and the FCC's 2005 decision, the competitive environment that the Clinton-Gore administration built collapsed into the cable-telco duopoly we see today, making "net neutrality" an issue.

The Internet Service Providers (ISPs) mentioned in US Internet Policy, Part 1 facilitated the process of getting individuals, businesses, and government "online" by linking to the Internet backbone and going "retail" with dial-up modems and then high-speed broadband connections. The term "online" emerged as a way to distinguish data communications from telephony, which was highly regulated by the Communications Act of 1934. ISPs offered businesses and consumers high-speed data services for accessing the World Wide Web, hosting websites, and providing large file transfers (FTP). The key was access to the rapidly expanding Internet packet-switching backbone network that had been developed by the National Science Foundation (NSF).

The National Science Foundation's backbone network (NSFNET) began data transmissions at 56 kilobits per second (Kbit/s) but was upgraded to T1 lines in 1988, sending at 1.544 megabits per second (Mbit/s). It eventually consisted of some 170 smaller networks connecting research centers and universities. In 1991, the NSFNET backbone was upgraded to T3 lines sending data at 45 Mbit/s. Starting in 1993, NSFNET was privatized, and a new backbone architecture incorporating the private sector was solicited.

The next-generation very-high-performance Backbone Network Service (vBNS) was developed as the successor to the NSFNET. vBNS began operation in April 1995 and was developed with MCI Communications, now a part of Verizon. The new backbone consisted primarily of Optical Carrier (OC) lines, each of which had several fiber-optic cables banded together to increase the total capacity of the line. The interconnected Optical Carrier (OCx) lines operated at 155 Mbit/s and higher. These high-speed trunk lines soon multiplied their capacity from OC-3 operating at 155 Mbit/s, to OC-12 (622 Mbit/s), OC-24 (1,244 Mbit/s), and OC-48 (2,488 Mbit/s). By 2005, OC-48 was surpassed by OC-192 (9,953 Mbit/s) and 10 Gigabit Ethernet.
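
The OC hierarchy is simple arithmetic: each OC-n level runs at n times the 51.84 Mbit/s OC-1 base rate. A few lines of Python verify the figures above:

```python
# SONET Optical Carrier arithmetic: each OC-n level runs at
# n times the 51.84 Mbit/s OC-1 base rate.
OC1_MBITS = 51.84

for n in (3, 12, 24, 48, 192):
    print(f"OC-{n}: {n * OC1_MBITS:.2f} Mbit/s")
# OC-3: 155.52, OC-12: 622.08, OC-24: 1244.16,
# OC-48: 2488.32, OC-192: 9953.28
```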

As part of the NSFNET decommissioning in 1995, these backbone links connected to the four national network access points (NAPs) in California, Chicago, New Jersey, and Washington, D.C. The backbone expanded to multiple carriers that coordinated with ISPs to provide high-speed connections for homes and businesses.

At first, consumers used analog dial-up modems over telephone lines at speeds that increased from 14.4 kilobits per second (Kbit/s) in 1991 to 28.8 Kbit/s in 1994. Soon the 33.6 Kbit/s modem appeared, which many thought to be the upper limit for phone-line transmissions. But the 56K modem was soon available, and new sets of standards continued to push the speed of data over the telephone system. The 56K modem was invented by Dr. Brent Townshend for an early music streaming service. His system avoided the analog-to-digital conversion that seriously hampered data speeds and allowed content to be switched digitally to the consumer's terminal device, usually a PC.
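
To make these rates concrete, here is a rough back-of-the-envelope sketch in Python (illustrative figures only, ignoring compression, protocol overhead, and line noise) of how long a 5-megabyte file takes at dial-up speeds compared with an early broadband rate:

```python
# Rough transfer times for a 5-megabyte file at various line rates.
FILE_BITS = 5 * 8 * 1_000_000  # 5 MB expressed in bits

for name, kbps in [("14.4K modem", 14.4), ("28.8K modem", 28.8),
                   ("56K modem", 56.0), ("1.5 Mbit/s broadband", 1_500.0)]:
    minutes = FILE_BITS / (kbps * 1_000) / 60
    print(f"{name}: {minutes:.1f} minutes")
# 14.4K: ~46 min; 28.8K: ~23 min; 56K: ~12 min; 1.5 Mbit/s: ~0.4 min
```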

Also during the 1990s, the telcos were conducting tests with a new technology called ADSL (Asymmetric Digital Subscriber Line). It was initially designed to provide video over copper lines to the home; the Baby Bells, in particular, wanted to offer television services to compete with cable television. It was called asymmetric because it could send data downstream to the subscriber faster (256 Kbit/s–9 Mbit/s) than upstream (64 Kbit/s–1.54 Mbit/s) to the provider.

ADSL was able to utilize electromagnetic frequencies that telephone wires carry but don't use. ADSL services separated the telephone signal into three bands of frequencies – one for telephone calls and the other two for uploading and downloading Internet traffic. Different versions and speeds emerged based on the local telco's ability and willingness to get an optical fiber link close to the neighborhood or "to the curb" next to a household or business location.

These services were soon called Digital Subscriber Lines (DSL), and they began to replace dial-up modems. High demand and competition from cable companies with high-speed coaxial lines pressured ISPs and telcos to adopt DSL technologies. DSL and the new cable technologies that carried Internet traffic as well as television came to be collectively called "broadband" communications.

Internet traffic grew at a fantastic rate during the late 1990s as individuals and corporations rushed to “get on the web.” The rhetoric of the “new economy” circulated and fueled investments in web-based companies and telecommunications providers.

A temporary investment bubble emerged as many companies lacked the technology or business expertise to obtain profits. Dot-coms such as Drkoop.com, eToys.com, Flooz.com, GeoCities, Go.com, Kozmo.com, Pets.com, theGlobe.com, and Webvan.com failed for a variety of reasons, but mainly flawed business plans and the premature expenditure of investment capital.

Similarly, many carriers such as Global Crossing and WorldCom, as well as ISPs, overestimated web traffic and built excess capacity. In the wake of the dot-com crash in 2000 and the telecom crash in 2002, many ISPs filed for bankruptcy, including Wall Street darlings like Covad, Excite@Home, NorthPoint, PSINet, Rhythms NetConnections, and Winstar Communications.

The broadband industry changed significantly after the 2000 election, with its technological infrastructure devalued by the dot-com crash of 2000 and the telecom crash of 2002.

Furthermore, Internet policy changed when the Bush administration was reelected in 2004. The FCC revoked Computer II in 2005 when it redefined carrier-based broadband as an information service.

This meant that broadband was effectively unregulated, and the telcos could go on to compete with the ISPs. Instead of offering backbone services and being required to interconnect with the ISPs, they became ISPs themselves and were no longer required to provide ISPs their connection to the Internet. The competitive environment that had nurtured Internet growth was decimated.




Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002 to 2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 1: The Rise of ISPs

Posted on | March 15, 2020 | No Comments

Much of the early success of the Internet in the USA can be attributed to the emergence of a unique organizational form, the Internet Service Provider or “ISP,” which became the dominant provider of Internet and broadband services in the 1990s. These organizations resulted from a unique set of regulatory directives that pushed the Internet’s development and created a competitive environment that encouraged the proliferation of ISPs and the spread of the World Wide Web.

In this series on Internet policy, I look at the rise of the World Wide Web, the shift to broadband, and the deregulation and consolidation of broadband services. Later in the series, I address the issue of net neutrality and raise the question: "Is Internet service a utility? Is it an essential service that should be made universally available to all Americans at regulated prices?"

The first ISPs began as US government-funded entities that served the research and education communities of the early Internet. Secured by Al Gore in 1991, legislation signed by President George H.W. Bush created the model of the National Research and Education Network (NREN), a government-sponsored Internet service provider dedicated to supporting the needs of the research and education communities within the US. Internet2, Merit, NYSERNET, OARnet, and KanRen were a few of the systems that provided schools and other non-profit organizations access to the World Wide Web. While dial-up services like CompuServe existed in the early 1980s, only later were the ISPs released for commercial traffic and services.

While telecommunications carriers had been moving some Internet traffic since the late 1980s, their role expanded dramatically after the Internet began to allow commercial activities. In June of 1992, Congressman Rick Boucher (D-Va) introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the US National Science Foundation Network (NSFNET). "A few months later, while waiting for Arkansas Governor William Jefferson Clinton to take over the Presidency, outgoing President George Bush, Sr. signed the Act into law."[1] The amendment allowed advertising and sales activities on the NSFNET and marked the advent of online commercial activities.

As part of the National Information Infrastructure (NII) plan, the US government decommissioned the NSFNET in 1995. It had been the publicly financed backbone for most IP traffic in the US. The NII plan handed over interconnection to four Network Access Points (NAPs) in different parts of the country to create a bridge to the modern Internet of many private-sector competitors.

These NAPs contracted with the big commercial carriers of the time, such as Ameritech, Pacific Bell, and Sprint, for new facilities to form a network-of-networks anchored around Internet Exchange Points (IXPs). The former regional Bell companies were to be primarily wholesalers, interconnecting with ISPs. This relatively easy process of connecting routers put the "inter" in the Internet but also became a site of performance degradation and unequal power relations.

As the Internet took off in the late 1990s, thousands of new ISPs set up business to commercialize the Internet. The major markets for ISPs were: 1) access services, 2) wholesale IP services, and 3) value-added services offered to individuals and corporations. Access services were provided for both individual and corporate accounts and involved connecting them to the Internet via dial-up, ISDN, T-1, frame-relay, or other network connections. Wholesale IP services were primarily offered by facilities-based providers like MCI, Sprint, and WorldCom's UUNET (a spinoff of a DOD-funded seismic research facility) and involved providing leased capacity over their backbone networks. Value-added services included web hosting, e-commerce, and network-resident security services. By the end of 1997, over 4,900 ISPs existed in North America, although most of them had fewer than 3,000 subscribers.[2] See the video below and this response for how much things have changed.

FCC policy had allowed unlimited local phone calling for enhanced computer services, and early Internet users connected to their local ISP using modems over POTS (Plain Old Telephone Service). ISPs quickly developed software, distributed on CD-ROMs, that could be easily installed on a personal computer. The software usually put an icon on the computer's desktop screen that, when clicked, would dial the ISP automatically, provide the password, and connect the user to the Internet. A company called Netscape created a popular "browser" that allowed text and images to be displayed on the screen. The browser used what was called the World Wide Web, a system for accessing files quickly from computer servers all over the globe.

The ISPs emerged as an important component of the Internet's accessibility and were greatly aided by US government policy. The distinctions made in the FCC's Second Computer Inquiry in 1981 allowed ISPs to bypass many of the regulatory roadblocks experienced by traditional communication carriers. They opened up possibilities and created protections for computer communications. Telcos were to provide regulated basic services, while "enhanced services" were to stay unregulated. Dan Schiller explained:

    Under federal regulation, U.S. ISPs had been classed as providers of enhanced service. This designation conferred on ISPs a characteristically privileged status within the liberalized zone of network development. It exempted them from the interconnection, or access, charges levied on other systems that tie in with local telephone networks; it also meant that ISPs did not have to pay into the government’s universal service fund, which provided subsidies to support telephone access in low-income and rural areas. As a result of this sustained federal policy, ISPs enjoyed a substantial cross-subsidy, which was borne by ordinary voice users of the local telecommunications network.[3]

ISPs looked to equip themselves for potential new markets and to connect with other companies. For example, IBM and telecom provider Qwest teamed up to offer web-hosting services. PSINet bought Metamor not only to transfer data but to host, design, and move companies from old software environments to the new digital environment. ISPs increasingly saw themselves not only as providers of a transparent data pipe but also as providers of value-added services such as web hosting, colocation, and support for domain name registration.

The next part of this series will discuss the shift to higher-speed broadband capabilities. Later posts will cover the consolidation of the industry that began in 2005, when the FCC changed the regulatory regime for wireline broadband services.

Notes

[1] Hundt, R. (2000) You Say You Want a Revolution? A Story of Information Age Politics. Yale University Press. p. 25.
[2] McCarthy, B. (1999) “Introduction to the Directory of Internet Service Providers,” Boardwatch Magazine’s Directory of Internet Service Providers. Winter 1998-Spring 1999. p. 4.
[3] Schiller, D. (1999) Digital Capitalism. The MIT Press. p. 31.



Anthony J. Pennings, PhD is Professor and Undergraduate Director at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002 to 2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Five Stages of ICT for Global Development

Posted on | February 19, 2020 | No Comments

Summarized remarks from “Five Stages of Global ICT4D: Governance, Network Transformation, and the Increasing Utilization of Surplus Behavioral Data for Prediction Products and Services,” presented at the 27th AMIC Annual Conference on 17-19 June 2019 at Chulalongkorn University, Bangkok, Thailand.

This presentation will explore and outline the following stages of economic and social development utilizing information and communications technologies (ICT). The ICT acronym has emerged as a popular moniker, especially in international usage, for the digital technology revolution and is often combined with "development" to form ICT4D. Development is a contested term with currency in several areas. Still, in global political economy, it refers to the process of building the environments and infrastructure needed to improve the quality of human life and bridge equity divides. Often this means enhancing a nation's agriculture, education, health, and other public goods that are not strictly economy-related but improve well-being and intellectual capital.

Of particular interest is the transformation of public-switched networks for Internet Protocol (IP)-based services and then WiFi and mobile use. Data-intensive solutions are beginning to address many development issues. However, a growing concern is that data is being collected extensively and used intrusively to manipulate behaviors.

The stages are categorized as:

1) Containment/Development/Modernization; 2) New World Economic and Information Orders; 3) Structural Adjustment and Re-subordination; 4) Global ICT Integration; and 5) Smart/Sustainable Mobile and Data-Driven Development.

Using a techno-structural approach, the explication of these stages will provide historical context for understanding trends in ICT innovation and implementation. This approach recognizes the reciprocal effects between technological developments and institutional power, and it employs a "dual-effects hypothesis" to illustrate the parallel potentials of ICT4D as both a democratizing and a totalizing force. This research will also provide insights into the possibilities of ICT diffusion in developing environments.

1) Containment/Development/Modernization

The US emerged as the primary economic and military power in the aftermath of World War II. Arms and materiel sales before the war had transferred much of the world's gold to the US, where it was wisely moved inland to Fort Knox, Kentucky. Franklin Delano Roosevelt (FDR) had sought to limit finance in the aftermath of the "Crash of 1929" and continued that effort globally with the 1944 initiative at Bretton Woods, New Hampshire. "Bretton Woods" created the International Monetary Fund (IMF), the World Bank, and the International Trade Organization (rejected by Congress), and it instituted a dollar-gold standard that tied the US dollar to gold at $35 an ounce and other international currencies to the US dollar at set rates. This system was designed to "contain" financial speculation and encourage trade and development.

The other aspect of containment, more widely known, is the containment of Communism. The USSR's painful success in countering the Nazis on the Eastern Front in World War II, and its appropriation of German atomic and rocketry technology, presented an ideological and military threat to the US and its allies. The USSR's launch of the Sputnik satellites in 1957 resulted in the US's formation of NASA and ARPA. The Communist revolution in China in 1949 and China's explosion of an atomic bomb on October 16, 1964, spurred additional concern. The resultant Cold War and Space Race spurred technological development and competition for "developing countries" worldwide.

"Development" and "modernization" characterized the post-World War II US prescription for economic development around the world, especially in newly decolonized nation-states. Modernization referred to a transitional process of moving from "traditional" or "primitive" communities to modern societies based on scientific rationality, abstract thought, and the belief in progress. It included urbanization, economic growth, and a "psychic mobility" that could be influenced by mass media. Scholars talked of an eventual "takeoff" if the proper regimen was followed, particularly the adoption of new agricultural techniques termed the "Green Revolution."[1] Information and communications technologies were rarely stressed, but "five communication revolutions" (print, film, radio, television, and later, satellites) were beginning to be recognized as contributing to national development.

Communication technologies were beginning to spread information about modern practices in agriculture, health, education, and national governance. Some early computerization projects continued population analysis, such as censuses that had started with tabulation machines, while mainframes and minicomputers were increasingly utilized by government agencies for statistical gathering in support of development processes.

Telegraphy and telephones were strangely absent from much of the discussion but were important for government activities as well as large-scale plantations, mining operations, transportation coordination, and maritime shipping. Because of their large capital requirements and geographic expanse, countries uniformly developed state-controlled Post, Telephone, and Telegraph (PTT) entities. Organized with the help and guidance of the International Telecommunication Union (ITU), the oldest United Nations entity, PTTs struggled to provide basic voice and telegraphic services. However, they provided needed jobs, technical resources, and currency for the national treasury.

Wilbur Schramm's (1964) book Mass Media and National Development made crucial links between media and national development. Published by Stanford University Press and UNESCO, it examined the role of newspapers, radio, and television. Its emphasis on the role of information in development also laid the foundation for the analysis of computerization and ICT in the development process. For many years, I had an office next to Schramm at the East-West Center's Communication Institute, which he founded, while I worked on the National Computerization Policy project that resulted in the ICT4D benchmark study Computerization and Development in Southeast Asia (1987). Herbert Dordick, Meheroo Jussawalla, Deane Neubauer, and Syed Rahim were key scholars in the early years of ICT4D at the East-West Center.[1a]

2) New World Economic and Information Orders

Rising frustrations and concerns about neo-colonialism due to the power of transnational corporations (TNCs), especially news companies, resulted in a collective call by developing countries for various conceptions of a "New World Economic and Information Order." It was echoed by UNESCO in the wake of the OPEC oil shocks and the resulting Third World debt crisis. The primary issue was the imbalanced flow of news and information between North and South, with developing countries concerned about the unequal flows of news and data moving from developing to developed countries. In part, it was the preponderance of news dealing with disasters, coups, and other calamities that many countries felt restricted flows of foreign investment. The calls caused a backlash in the US and other developed countries concerned about the independence of journalism and the free flow of trade.[2]

These calls were followed by concerns about the obstacles hindering communications infrastructure development and how telecommunications access across the world could be stimulated. In 1983, designated World Communication Year, the Independent Commission for Worldwide Telecommunications Development met several times to discuss the importance of communication infrastructure for social and economic development and to make recommendations for spurring its growth.

The Commission consisted of seventeen members – communication elites from both the private and public sectors, representing a number of countries. Spurred on by the growing optimism about the development potential of telecommunications, they investigated ways Third World countries could be supported in this area. They published their recommendations in The Missing Link (1984), soon called the "Maitland Report" after its chair, Sir Donald Maitland of the United Kingdom. This report brought recognition to the role of telecommunications in development and opened up resources from international organizations such as the World Bank.

The transition from telegraph and telex machines to computers also raised concerns about data transcending national boundaries. The Intergovernmental Bureau for Informatics (IBI), which had been set up as the International Computation Centre (ICC) in 1951 to help countries get access to major computers, began to study national computerization policy issues in the mid-1970s.

The IBI increasingly focused on transborder data flows (TDF) that moved sensitive corporate, government, and personal information across national boundaries. The first International Conference on Transborder Data Flows was organized in September 1980, followed by a second in 1984; both were held in Rome, Italy. The increasing use of computers raised questions about accounting and economic data avoiding political and tax scrutiny. The concern was that these data movements could act like a "Trojan horse" and compromise a country's credit ratings and national sovereignty, as well as individual privacy.

3) Structural Adjustment and Re-subordination

Instead of a new international order, an era of "structural adjustment" enforced by the International Monetary Fund emerged, targeting national post, telephone, and telegraph (PTT) agencies and other aspects of government administration and ownership. Long considered agents of national development and employment, PTTs came under increasing criticism for their antiquated technologies and lack of customer service.

In the early 1970s, Nixon ended the Bretton Woods regulation of the dollar-gold standard, resulting in very volatile currency markets. Oil prices increased, and dollars flowed into OPEC countries, only to be lent out to cash-poor developing countries. The flow of petrodollar lending and rising "Third World debt" pressured PTTs to add new value-added data networks and undergo satellite deregulation. Global circuits of digital money and news emerged, such as Reuters Money Monitor Rates and SWIFT (Society for Worldwide Interbank Financial Telecommunication). These networks, among the first to use packet-switching, linked currency exchange markets worldwide in arguably the first virtual market.

A new techno-economic imperative emerged that changed the relationship between government agencies and global capital. PC spreadsheet technologies were utilized to inventory, value, and privatize PTTs so they could be corporatized and listed on electronically linked share-market exchanges. Communications markets were liberalized to allow domestic and international competition for new telecommunications services and for sales of digital switches and fiber-optic networks. Developing countries became "emerging markets," consistently disciplined by the "Washington Consensus," a set of policy prescriptions meant to continue opening them up to transborder data flows and international trade.[3]

4) Global ICT Integration

Packet-switching technologies, standardized in the ITU's X.25 and X.75 protocols for PTT data networks, gave way to ubiquitous TCP/IP networks by the late 1990s. Cisco Systems became the principal enabler with a series of multi-protocol routers designed for enterprises, governments, and eventually telcos. Lucent, Northern Telecom, and other telecommunications equipment suppliers quickly lost market share as the Internet protocols, mandated by the US military's ARPANET and later by the National Science Foundation's NSFNET, were integrated with ISDN, ATM, and SONET technologies in telcos around the world.

The Global Information Infrastructure (GII), introduced by Vice President Gore at the annual ITU meeting in Buenos Aires in March of 1994, targeted national PTT monopolies and government regulatory agencies. He proposed a new model of global telecommunications based on competition instead of monopoly. He stressed the rule of law and the interconnection of new networks to existing networks at fair prices. Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations, which called for the inclusion of GATS (General Agreement on Trade in Services) covering everything from circuses to education, radio and television, and telecommunications services. At this meeting, the parties also called for the creation of the World Trade Organization (WTO).

Formed in 1995, the WTO held two meetings in 1996 and 1997 that created a new era of global communications and development. Members party to the new multilateral arrangement met quickly in Singapore in 1996 to reduce tariffs on international sales of a wide variety of information technologies. The resulting Information Technology Agreement (ITA) was signed by 29 participants in December 1996. The agreement was expanded at the Nairobi Ministerial Conference in December 2015 to cover an additional 201 products valued at over $1.3 trillion per year. These agreements allowed Korea to market early CDMA mobile handsets successfully and develop a trajectory of success in the smartphone market.

In 1997, the WTO met in Geneva and established rules for the continued privatization of national telecommunications operations. Sixty-nine nations party to the WTO, including the U.S., signed the Agreement on Basic Telecommunications Services in 1997, which codified new rules for telecommunications deregulation; countries agreed to privatize and open their own telecommunications infrastructures to foreign penetration and competition by other telcos.

The agreements came at a crucial technological time. The World Wide Web (WWW) was a working technology, but it would not have lived up to its namesake if the WTO had not negotiated and reduced tariffs for crucial networking and computer equipment. The resultant liberalization of data and mobile services around the world made possible a new stage in global development.

Hypertext, Ad Markets, and Search Engines

The online economy emerged with the Internet and its hypertext click environment. Starting with advertising and the keyword search-and-auction system, a new means of economic production and political participation emerged, based on the wide-scale collection and rendition of surplus behavioral data for prediction products and services.

As Shoshana Zuboff points out in Surveillance Capitalism (2019), the economy expands by finding new things to commodify, and the Internet provided a multitude of new products and services that could be sold. When the Internet was privatized in the early 1990s and the World Wide Web (WWW) established the protocols for hypertext and webpages, new virtual worlds of online media spaces were enabled. These were called "inventory." Or you can call them ad space.

Behavioral data is the information produced as a result of actions that can be measured on a range of devices connected to the Internet, such as a PC, tablet, or smartphone. Behavioral data tracks the sites visited, the apps downloaded, and the games played. Cloud platforms claim human experience as free raw material for translation into behavioral data. Some of this data is applied to product or service improvements; the rest is declared proprietary behavioral surplus and fed into advanced manufacturing processes known as "machine intelligence." Automated machine processes can capture knowledge about behaviors but also shape behaviors.

Surplus behavioral and instrumental data is turned into prediction products, such as recommendation engines for e-commerce and entertainment, that anticipate what people will do now, soon, and later. Prediction products are traded in a new kind of marketplace for behavioral predictions called behavioral futures markets. These are currently used primarily in advertising systems based on click-through rates (CTR), Pay-Per-Click (PPC), and real-time bidding auctions.
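
As a rough illustration – a simplified sketch, not any particular platform's actual algorithm – the generalized second-price logic behind many pay-per-click systems can be expressed in a few lines of Python: ads are ranked by bid times predicted click-through rate, and the winner pays just enough to hold its rank. The ad names, bids, and CTRs below are hypothetical.

```python
# Simplified generalized second-price (GSP) ad auction sketch.
def run_auction(ads):
    """Each ad dict has a per-click 'bid' and a predicted 'ctr'."""
    ranked = sorted(ads, key=lambda a: a["bid"] * a["ctr"], reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    # Winner pays the minimum bid that would still beat the runner-up.
    price = runner_up["bid"] * runner_up["ctr"] / winner["ctr"]
    return winner["name"], round(price, 2)

ads = [
    {"name": "A", "bid": 2.00, "ctr": 0.05},  # expected value 0.10
    {"name": "B", "bid": 3.00, "ctr": 0.02},  # expected value 0.06
    {"name": "C", "bid": 1.50, "ctr": 0.04},  # expected value 0.06
]
print(run_auction(ads))  # ('A', 1.2) -- A wins but pays $1.20, not $2.00
```

The predicted CTR itself is a prediction product: the more behavioral surplus a platform renders, the better its predictions and the more valuable its auction inventory.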

5) Smart/Sustainable Mobile and Data-Driven Development

The aggressive trade negotiations and agreements of the 1990s significantly reduced the costs of ICT devices and communication exchanges worldwide, making possible a wide variety of new commercial and development activities based on ICT capabilities. We are now at the halfway point for the Sustainable Development Goals (SDGs) outlined by the United Nations in 2015. The SDGs provide an additional impetus for ICT4D, encouraging infrastructure building and support for key development activities that ICT can assist, such as monitoring Earth and sea resources and providing affordable health information and communication activities.

A key variable is the value of the dollar, the world's primary transacting currency. A global shortage of dollars due to high interest rates or political risk means higher prices for imported goods, regardless of lower tariffs. The post-COVID crisis in Ukraine has stressed supply chains for key materials and rare earth minerals from Russia and Ukraine, further adding to potential costs and geopolitical risk. ICT4D is highly reliant on global supply chains making digital devices readily available at reasonable prices.

The near-zero marginal costs of digital products make information content and services more accessible for developing countries. Books, MOOCs, and other online services provide value to a vast population with minimal cost to reach each additional person. Platform-based services providing agricultural, health, and other development services offer low-cost accessibility and outreach, allowing new applications to scale significantly at low cost. Incidentally and significantly, renewable energy sources like solar and wind also provide near-zero marginal costs for producing electricity. Like digital products, they require high initial investments but produce output at low cost once operational.

Mobility, broadband, and cloud services are three significant technologies presenting positive prospects for ICT4D. Mobile broadband technologies that bypass traditional wireline "last mile" infrastructure have been a major boost to the prospects for ICT4D. They provide significant connectivity across a wide range of the population and with key commercial and government entities. 4G LTE technologies currently provide the optimal service, as 5G towers consume over 60% more power than LTE and also require more stations because their range is shorter.

Enhanced connectivity strengthens network effects. Blockchain technologies and cryptocurrencies, the Internet of Things (IoT), and the proliferation of web platforms are some of the current examples of how reduced costs for communications and information analysis are creating value from the collection and processing of unstructured data.

This project will expand on these stages and provide a context for further investigation of ICT for development, drawing on historical and current research. Of particular concern is the implementation of policies related to contemporary development practices, but commercial and monetization techniques are important as well.

Notes

[1a] Dordick, Herbert S. and Deane Neubauer. 1985. “Information as Currency: Organizational Restructuring Under the Impact of the Information Revolution.” Bulletin of the Institute for Communications Research, Keio University, No 25, 12–13. This journal article was particularly insightful into the dynamics of the PTTs that would lead to pressures on them to adapt IP technologies leading to the World Wide Web.

[1] Rostow, W.W. (1960) Stages of Economic Growth: A Non-Communist Manifesto. Cambridge: Cambridge University Press. See also Rostow W.W., (1965) Stages of Political Development. Cambridge: Cambridge University Press.
[2] An excellent discussion of the various development and new world communication and economic order discourses can be found in Majid Tehranian's (1999) Global Communication and World Politics: Domination, Development, and Discourse. Boulder, CO: Lynne Rienner Publishers. pp. 40-41. Also see Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communications Institute. Honolulu, Hawaii.
[3] Wriston, W.B. (1992) The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World.

Citation APA (7th Edition)

Pennings, A.J. (2020, Feb 19). Five Stages of ICT for Global Development. apennings.com. https://apennings.com/how-it-came-to-rule-the-world/planting-to-platforms-five-stages-of-ict-for-global-development/




Anthony J. Pennings, Ph.D. is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in Wellington, New Zealand. His American home is in Austin, Texas, where he taught in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

Digital Spreadsheets – Part 5 – Numeracy and the Power of Zero

Posted on | February 17, 2020 | No Comments

Previously, I explored the digital spreadsheet as a meaning-making application that was central to the financial explosion of the 1980s and its economic aftershocks. Spreadsheets framed and produced information and meaning consequential to monetary and organizational practices as they became part of the daily routines of a new class of professional and managerial information workers. Initially the purview of accountants and bookkeepers, spreadsheet usage became ubiquitous throughout work practices in conjunction with word processing and databases. The spreadsheet incorporated numerical formulations and innovations with systems of categorization and inventorying to become a tool for productivity and power over people and resources.

In my last post, I explored the importance of symbolic representation systems, mainly writing, in the workings of the spreadsheet. Written alphanumerical symbols have shaped Western society. For example, tables and lists are epistemological technologies that have historically organized administrative knowledge in castles, military camps, and monasteries. With the invention of ASCII characters, PC-based applications, including the spreadsheet, became powerful tools for organizing written and numerical facts in modern corporations and other organizations worldwide.

In this post, I will focus specifically on the power of numeracy, with a special emphasis on the role of zero. The zero is an extraordinary cognitive invention that has been central to the quantitative workings of the spreadsheet. In conjunction with Indo-Arabic numerals and double-entry accounting techniques, the spreadsheet has been crucial to the rise of modern capitalism and that peculiar historical manifestation, the corporation.

Although still used on occasion for style, Roman numerals have been mathematically obsolete for several hundred years. Initially based on scratching or tallying systems for sheep and other items, Roman numbers most likely represented hand figurations or gestures. For example, the number 10, or X, probably represented two thumbs crisscrossed. Addition and subtraction were relatively straightforward, but division and multiplication were not as "easy or obvious." Roman numerals included thousands ("M"), but the system never developed a representation for a million or beyond.

The modern system of numeration is based on representations using ten different digits, 0 through 9, imported from the Middle East and Asia; it will be called Indo-Arabic in this post. These numerals are said to have been designed based on the number of angles each numeral contained, and over the years their written forms have rounded out. The Arabian interest in Indian numerals based on zero arose to solve practical problems such as inheritances, purchases, sales contracts, tax collection, and wills. Indo-Arabic numerals, moving from India to the Middle East and finally to Europe, were crucial for the accounting and financial systems that have since become global standards and key ingredients in spreadsheet formulations.

Evidence dates the zero (also known as the naught or nil) back some 2,000 years to the Angkor Wat civilization in Cambodia, although it is generally recognized that India refined its use around 500 AD, and it came to Europe in the early 1200s from Arabia. Muhammed ibn-Musa al-Khwarizmi, or "Algorismus," as his name was Latinized, was probably one of the most influential mathematicians in the transfer of this knowledge to the West. The Persian scholar taught in Baghdad sometime between 800 and 850. He wrote a book on the Hindu number system that was translated into Latin as De numero indorum, or "On the Hindu numbers." He later wrote another seminal book, Al-jabr w'al muqabalah, whose title became the root of the word "algebra," while the author's Latinized name is the root of the English word "algorithm."

One of the mathematicians who introduced these numbers to Europe was Leonardo of Pisa, or Leonardo Pisano, famously known as "Fibonacci" – short for filius Bonacci, the son of Bonaccio. In his book Liber abaci (Book of the Abacus or Book of Calculating), completed in 1202, he showed how the Indo-Arabic numbers could be used. The book was divided into four parts. The first introduced Indo-Arabic numbers, especially the zephirum, which became zefiro in Italian and zero in the Venetian dialect. The second section showed how calculations dealing with currency conversions, compound interest, and the determination of profit could benefit businesses. The third and fourth sections addressed a number of mathematical problems, including irrational numbers and the Fibonacci sequence that the author is most known for today. The video below introduces his relevance to commercial activities.
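
As a small illustration of why merchants adopted the new numerals, here is a sketch in modern Python of two calculations from Liber abaci: compound interest and the rabbit-population (Fibonacci) sequence. The principal, rate, and periods are hypothetical figures.

```python
# Two Liber abaci calculations in modern form (figures illustrative).
def compound(principal, rate, periods):
    """Value of principal compounded at `rate` per period."""
    return principal * (1 + rate) ** periods

def fibonacci(n):
    """First n terms of the Fibonacci sequence (the rabbit problem)."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

print(round(compound(100.0, 0.05, 10), 2))  # 100 at 5% for 10 periods -> 162.89
print(fibonacci(12))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]
```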

It was the development of the zero and the related positional system that made modern "Western" calculation so effective. The zero became necessary for a variety of mathematical purposes, including decimals and, quite significantly, the notation that makes it easier to work with large quantities. Using the place-holding system, nine digits plus a zero can represent an infinity of figures, because the same symbol, such as 7, takes on different meanings (7, 70, 700, etc.) depending on its location within the representation of the number.
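A few lines of Python (an illustration of the place-holding idea, not anything specific to spreadsheets; place_values is a hypothetical helper) make the point explicit: each digit contributes its own value times a power of ten determined by its position, and the zero keeps otherwise empty positions from collapsing.

def place_values(number):
    """Split a base-10 integer into its digit-times-power-of-ten parts."""
    digits = str(number)
    highest = len(digits) - 1
    return [int(d) * 10 ** (highest - i) for i, d in enumerate(digits)]

print(place_values(7))    # [7]
print(place_values(70))   # [70, 0]
print(place_values(707))  # [700, 0, 7] -- the zero holds the empty tens place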

The base-10 positional system probably emerged from counting on our fingers, but it is well suited to arithmetical computation, as it needs only ten different symbols and uses the zero to mark the place of a power of the base not actually occurring. Think of a car's odometer: every ten miles a dial turns over, and milestones such as 10,000 and 100,000 miles are markers of a car's age to which we often ascribe significance.

One of the strengths of the spreadsheet is its ability to combine complex calculations with human familiarity with the base-10 system. While the "alien intelligence" of computers can now handle more complex base systems, such as the duodecimal (base-12) and sexagesimal (base-60) place-holding systems used in time and geographic calculations, base-10 is useful because, quite frankly, humans are used to it. Although computers internally use a base-2 system of 1s and 0s, the positional base-10 system has been globally accepted as the mathematical standard for human calculation and a key component of spreadsheet usability.
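The point is easy to demonstrate: any positional base works the same way, differing only in how many symbols it needs. The sketch below, a minimal illustration with a hypothetical to_base helper, writes the same quantity in the bases mentioned above.

def to_base(n, base):
    """Express a non-negative integer as a list of digits in the given base."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    return digits[::-1]

print(to_base(90, 2))   # [1, 0, 1, 1, 0, 1, 0] -- the machine's base-2 view
print(to_base(90, 10))  # [9, 0] -- the base-10 view humans are used to
print(to_base(90, 60))  # [1, 30] -- sexagesimal: 90 minutes is 1 hour, 30 minutes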

The calculative abilities of zero and the other Indo-Arabic numbers brought new levels of certainty and confidence to commerce, and eventually science, in the West. By 1300, zero-based accounting and other numerical techniques were being adopted by the merchant classes. Double-entry accounting techniques emerged first for tracking resources and checking for errors, but they later resulted in the conceptual separation of a business from its owner, a precondition for the emergence of the modern corporation.
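The error-checking logic of double-entry bookkeeping is simple enough to sketch in a few lines of Python. This is a toy model with hypothetical account names, not a reconstruction of medieval practice: every transaction posts two entries of equal and opposite value, so a ledger that fails to sum to zero contains a recording error.

# Each transaction posts a debit (+) and a matching credit (-).
ledger = [
    ("Inventory", +500), ("Cash", -500),  # buy goods for cash
    ("Cash", +800), ("Sales", -800),      # sell goods for cash
]

balance = sum(amount for _account, amount in ledger)
if balance == 0:
    print("The books balance.")
else:
    print(f"Recording error: ledger off by {balance}.")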

The spreadsheet drew on this history of numerical innovation to become a tool of organizational productivity and power. As part of my “formulating power” project, I will continue to examine the ways spreadsheets construct techno-epistemological knowledge using combinations of algorithms and other meaning-making practices.

Citation APA (7th Edition)

Pennings, A. J. (2020, February 17). Digital Spreadsheets – Part 5 – Numeracy and the Power of Zero. apennings.com. https://apennings.com/dystopian-economies/electric-money/lotus-spreadsheets-part-5-numeracy-and-the-power-of-zero/


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is Professor at the Department of Technology and Society at the State University of New York (SUNY) in Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

“It’s the Infrastructure, Stupid”

Posted on | February 9, 2020 | No Comments

Infrastructure, at the deepest level, is a techno-socio bond that brings together new communications technologies, new energy sources, new modes of mobility and logistics, and new built environments, enabling communities to more efficiently manage, power, and move their economic activity, social life, and governance. – Jeremy Rifkin, The Green New Deal.[1]

The 1992 US presidential election was notable for the phrase "It's the Economy, Stupid," as William Jefferson Clinton attacked incumbent president George H. W. Bush for ignoring economic and social problems at home. President Bush had won a dramatic military victory with "Desert Storm," which drove the Iraqis out of Kuwait, but the economy at home languished with high unemployment and unprecedented debt. After the election, the new Clinton-Gore administration implemented the Global Information Infrastructure (GII) initiative that helped the obscure data networks used by academia and research institutes blossom into the Internet and its World Wide Web.

The framework combined with rapidly emerging technology and financial resources to create the famous "Bull Run" economic revival of the late 1990s. Starting with the Netscape IPO in August 1995, investment poured into "dot-coms" and telecoms. This information infrastructure continues as the dominant economic engine and social dynamic of the 21st century. However, opportunities now exist to scale these networks and to include energy and mobility.

This week, I had a chance to attend the Japan-Texas Infrastructure Investment Forum in Austin. Texas and Japan are prolific trading partners, and the forum brought them together to discuss "public goods" like transportation and water, as well as the possibilities of "smart cities."

In this post, I want to connect the current imperative to build infrastructure in the US with green and smart technologies and this series’ emphasis on the New Deal. New thinking and infrastructure investments offer the opportunity to create enabling environments for sustainable and replenishing economic activity. These frameworks require examining political and juridical structures to open up avenues for new investments and evaluation systems. We may not want to reinvent the New Deal, but it’s a point of reference for examining the way forward.

If Texas were a country, it would have the 10th-largest economy in the world, bigger than Australia, Canada, and even Russia. Japan, of course, is number three, after the US and China. Japan is moving many of its US operations to Texas due to the Lone Star State's business environment, rich resources, and cosmopolitan cities. Texas exports chemicals, food, oil and gas, and electronic products.

I was primarily interested in seeing the presentations on smart-city developments but was intrigued by the talks on transportation and on water management in Texas (I'm Dutch; it's in my genes). I didn't know, for example, that Texas desalinates over 150 million gallons of water every day, or that Texans drive over 550 million miles a day. What would be the implications of renewable-powered desalination for agriculture and general water needs? How do you manage that much road maintenance? What alternatives are available to the traditional internal-combustion vehicle? Just a few tidbits that set the context for the day's presentations on building infrastructure in Texas.

One of Japan's objectives was to pursue contracts for its Shinkansen, the high-speed railroad that makes Japan fairly easy to traverse. It is only a matter of time before Dallas, Austin, and Houston are connected with more efficient lines, and Japan wants to get in on that.

Japan even brought in the Japan Overseas Infrastructure Investment Corporation for Transport & Urban Development (JOIN) and the Japan Bank for International Cooperation (JBIC) to support the funding of the operation. Having both driven the routes to Dallas and Houston and ridden the rails in Japan, I would certainly enjoy taking the Shinkansen bullet train on my next Texas trip.

Texas is unique because it is a major carbon-extracting and carbon-exporting state. But like Saudi Arabia, it recognizes the importance of pursuing infrastructure projects with a green tinge. Texas is expected to grow substantially over the next several decades, and that means new strategies for mobility, water availability, and disaster risk reduction.

Now we have to confront and analyze the implications of a post-petroleum age. Exploring the New Deal gives us a better perspective on the size of that task and on how deep, and sometimes intrusive, the process will be. The New Deal was a monumental, multi-decade endeavor that made America great and will not be easily matched.

Roosevelt’s New Deal was primarily an infrastructure program. Facing economic collapse and massive unemployment, FDR promised “a new deal for the American people.”[2] In the first 100 days of the FDR administration, some 15 bills were passed to assist the recovery and put people to work. Some of the major projects were the Civilian Conservation Corps (CCC), the Public Works Administration (PWA), the Tennessee Valley Authority (TVA), and the related Rural Electrification Act. These were all designed to get people back to work and build an infrastructure that would support the new energy paradigm – hydrocarbons and electricity.

While economic recovery sputtered throughout the 1930s, federally funded infrastructure projects built the roads, tunnels, and bridges for cars and trucks; the dams and coal-fired utilities for electrification; and some 125,000 public buildings, including schools, hospitals, and government facilities. Even airports like LaGuardia outside Manhattan were a product of the New Deal.

The Hoover Dam tapped the Colorado River and provided electricity for the entire Southwest, including Los Angeles and a small town called Las Vegas. When the dam was completed in 1936, it was the largest electricity-producing facility in the world, supplying power to Arizona, California, and Nevada. It electrified homes, entertainment, industry, and agriculture.

Another big infrastructure initiative with New Deal roots was the Interstate Highway System, which eventually laid down almost 47,000 miles of public roads. Before the attack on Pearl Harbor, Roosevelt appointed a National Interregional Highway Committee to study the need for several cross-country interstate highways. The building of the "Autobahn" in Nazi Germany was a major source of motivation. In its report, Interregional Highways, the committee recommended constructing a 40,000-mile (64,000 km) interstate highway system. It was an extraordinary push for the mobilization and motorization of the US, interconnecting its cities, and it proved to be the "killer app" for the automobile.

Vice President Gore was influenced by his father, Senator Al Gore Sr., who co-authored the Federal-Aid Highway Act of 1956, the infrastructure program of the Eisenhower administration. Dwight Eisenhower had studied the German Reichsautobahnen as he plotted the invasion of Europe during World War II and was committed to building a nationwide highway system in the US. The Act created a network of roadways that sparked the US economy, eventually reaching some 46,000 miles. The son drew on that inspiration to conceptualize the National Information Infrastructure plan that turned the NSFNET into the Internet.

Citation APA (7th Edition)

Pennings, A. J. (2020, February 9). It's the Infrastructure, Stupid. apennings.com. https://apennings.com/democratic-political-economies/from-new-deal-to-green-new-deal-part-3-its-the-infrastructure-stupid/

Notes

[1] The quote "It's the Infrastructure, Stupid" in the title is also from Jeremy Rifkin, The Green New Deal. "It's the Economy, Stupid" was coined by campaign strategist James Carville during the 1992 Clinton-Gore presidential campaign to counter George H. W. Bush's success with the first Iraq War.
[2] Blitz, M. (2017, November 20). When America’s Infrastructure Saved Democracy. Retrieved February 8, 2020, from https://www.popularmechanics.com/technology/infrastructure/a24692/fdr-new-deal-wpa-infrastructure/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

FROM NEW DEAL TO GREEN NEW DEAL, Part 2: The Failure of the National Industrial Recovery Act of 1933

Posted on | February 6, 2020 | No Comments

This is the second in an ongoing series about how the New Deal restructured the American political economy and what lessons it offers for transitioning to a Green New Deal. The first post dealt with the New Deal emerging from the wake of the Great Depression and the Roosevelt administration's immediate policy responses to the banking and financial crisis. This post looks at a failure: the National Recovery Administration's attempts to administer a wide range of codes and practices for business and industry under the National Industrial Recovery Act (NIRA) of 1933. What are the lessons of the NIRA for the Green New Deal?

The green energy revolution strives for zero emissions of harmful carbon by-products and near-zero marginal costs after installation. Hydrocarbons from coal or oil are incredible sources of energy, but they emit deadly carbon monoxide and climate-threatening carbon dioxide. They are also used up in consumption and require constant replenishment, which is good for petro-states like Russia and Saudi Arabia but a constant currency drain for the rest of the world. Renewables produce a continuous supply of energy. The installations don't last forever and eventually require replacement, but their economics are extraordinary and will make possible exciting new opportunities, such as the desalination and purification of saltwater.

The Green New Deal will reach deep into the commerce and industrialization of the global economy. While the movement is gaining momentum, a few Teslas sold and some homes with solar panels do not a revolution make. (Although I recently had my first ride in a Tesla, and it was awesome.)

The Green New Deal will need to continue building out the Smart Grid technologically while changing utility regulation to allow smaller microgrids that can utilize local resources, including buying energy from residences and small businesses. Broadband networks and the Internet of Things (IoT) will be crucial to this convergence, providing the "smart" aspects of the grid. Other industries that will be affected include agriculture, architecture, automobiles, construction, supply-chain logistics, and the military. [1]

How will they all work together, not only across different industries but among different companies within the same sector? How will winners emerge? What will happen to losers? Solyndra, for example, became a major political issue when it filed for bankruptcy in 2011. It was a manufacturer of innovative thin-film solar cells based in Fremont, California. Solyndra received significant subsidies in the form of guaranteed loans from the Department of Energy as part of the economic stimulus plan, but it still couldn't compete with more traditional solar-cell technology companies, especially from China. What are we to make of situations like this?

The Green New Deal faces many significant and complicated issues. What are the electrical standards, for example? The building codes? The sewage interconnections? How will networks of automobile recharging (or hydrogen refueling) stations be established? Can prices within an industry be regularized without penalizing consumers? Does labor organization need to be revived from its decimation during the Reagan years?

A step back for the New Deal…

In his larger attempt at industrial restructuring, President Roosevelt sent a plan to Congress that became the National Industrial Recovery Act of 1933. Congress passed it into law on June 16 of that year. The Act created the National Recovery Administration (NRA) to administer codes of practice for business and industry. The Act was "a clear victory for the many prominent businessmen who were backing cartelization as a solution to the nation's industrial problems." [2]

The Act allowed industries to create "codes of fair practice." By suspending antitrust laws, these codes allowed corporations in particular sectors to set prices, restrict output, and increase profits. They were agreements that enabled the NRA to administer, with the dominant trade associations, what were in effect national cartels. Although the Act was later declared unconstitutional by the Supreme Court, the codes resurfaced in later legislation and became part of the national restructuring of the US economy.

While Hoover had limited his activism to establishing the Reconstruction Finance Corporation, which served as a lender of last resort to banks and the railroads, he opposed cartelization and thus alienated himself from many business leaders. The NIRA placated most big-business concerns. For political reasons, however, Roosevelt's plan was designed to serve many other constituencies. He made concessions to liberal conservatives to reassure them that socialism was nowhere near the path he was taking, nor was he promoting "monopoly." But opposition did mount from small-business people and farmers who saw the codes being dominated by big business.

Finally, the Act started to antagonize large corporations because of Section 7(a), which encouraged labor organization and had led to a series of violent strikes.

A poultry company from Brooklyn, NY, challenged the codes, and the case went all the way to the Supreme Court. In Schechter Poultry Corp. v. United States, the Court rejected the compulsory-code system, arguing that the NIRA improperly delegated legislative powers to the executive and that regulating the poultry codes did not meet the standards of interstate commerce, a constitutional requirement for federal regulation. In May 1935, the Supreme Court declared the Act unconstitutional.

The New Deal shows us how massive and complicated a major economic reorganization can be. Advocates of the Green New Deal should seriously study the issues FDR confronted as he revived the economy and charted a new course for the US that avoided revolution. The case of the NIRA gives us some idea of the scale of the transition and of the challenges of government intervention in the economy.

Notes

[1] Rifkin, J. (2020). The Green New Deal.

[2] McQuail, K. (1982). Big Business and Presidential Power. New York: William Morrow and Company. p. 27.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.
