Anthony J. Pennings, PhD

WRITINGS ON DIGITAL STRATEGIES, ICT ECONOMICS, AND GLOBAL COMMUNICATIONS

Letteracy and Logos

Posted on | May 5, 2020 | No Comments

More than ever, people are interested in how visual design influences the production of meaning as well as its intellectual and emotional effects. Typography is the design and arrangement of letters and text so that the writing is easy to understand, appealing, and conveys an appropriate set of feelings and meanings. A logo is the graphic signature of a person or organization, meant to encapsulate and communicate its preferred symbolic meanings.

Below is one of my favorite TED talks about typography.

This blog post discusses the importance of typography in visual design, such as magazine or webpage layouts, as well as its use in logos. A new type of literacy has emerged in the new media age that Seymour Papert (1993) and others began to call “letteracy.” Papert was critical of the idea of introducing letters too early in a child’s development, but recognized that connecting with culture and history required alphabetical literacy.

“Letteracy” suggests a larger conversation about global visual culture and why people are increasingly interested in the impact of typography in our media world. A twist on “literacy,” it points to the discrepancy between a world in which reading is pervasive and our relative ignorance of how letters are designed and how they influence us. One of the first questions to ask is “What are letters?”

Capturing Sound

Letters are phonographic – they code the sounds of language in scripted figures. A few writing systems, like Chinese characters, are ideographic – they code ideas into their figures. Phonographic writing has the advantage of coding everyday language in its letters while being flexible enough to incorporate new words. Ideographic writing requires extensive memorization and social mentoring to enforce meanings and consistency in sound reproduction.

Asian societies like Korea, and to a lesser extent, Japan, have replaced Chinese characters with phonographic characters. Korea instituted “Hangul,” which is phonographic but has some iconic aspects: the characters represent the movements of the tongue and lips used to achieve those sounds. The change helped Korea achieve a high rate of reading literacy among its population. Japan has two sets of phonographic characters, hiragana and katakana. Both are sound-based, but each character represents a whole syllable – the consonant and vowel together. To make the situation a bit more complicated, Japan still uses “Kanji,” ideographic characters borrowed from China.
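
Hangul’s phonographic design is so systematic that it can be expressed in a few lines of code. As a small illustration (my own sketch, not drawn from any Korean-language library), the following uses the standard Unicode Hangul syllable algorithm to decompose syllable blocks into their component letters (jamo):

```python
# Every Hangul syllable block in Unicode is built algorithmically from an
# initial consonant (choseong), a vowel (jungseong), and an optional final
# consonant (jongseong). Decomposition is therefore pure arithmetic.

SYLLABLE_BASE = 0xAC00  # code point of the first syllable block, '가'
NUM_VOWELS = 21         # number of medial vowels (jungseong)
NUM_FINALS = 28         # number of final consonants (jongseong), incl. "none"

CHOSEONG = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")
JUNGSEONG = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")
JONGSEONG = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")

def decompose(syllable: str) -> tuple:
    """Split one Hangul syllable block into its component jamo."""
    index = ord(syllable) - SYLLABLE_BASE
    initial = index // (NUM_VOWELS * NUM_FINALS)
    vowel = (index % (NUM_VOWELS * NUM_FINALS)) // NUM_FINALS
    final = index % NUM_FINALS
    return CHOSEONG[initial], JUNGSEONG[vowel], JONGSEONG[final]

print(decompose("한"))  # ('ㅎ', 'ㅏ', 'ㄴ')
print(decompose("글"))  # ('ㄱ', 'ㅡ', 'ㄹ')
```

Each block is simply an arithmetic combination of an initial consonant, a vowel, and an optional final consonant – writing that codes sound by construction.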

From Printing Press to Desktop Publishing

Johannes Gutenberg is credited with inventing both the printing press and the production of durable metal typefaces around 1450 AD. The technology had also been developed in China and Korea, but conditions in Europe were better for its expansion. Printing presses in China and Korea were state-based projects that eventually withered. Conversely, religious, market, and political conditions in Europe improved the technology’s chances of success.

The first best-seller? The Christian Bible. In 1517, the Protestant Reformation began, emphasizing reading of the Bible over the services of the Catholic church and its priests. It also helped Europe develop separate nation-states as people became more literate in their local languages. Printed materials in different dialects began to consolidate community identities. People began to identify with others who spoke the same dialect and recognize them as sharing the same national values. Benedict Anderson called these “imagined communities” in the book of the same name.

Thanks to Steve Jobs and the Apple Macintosh graphical user interface, different typefaces were added to computers. Along with WYSIWYG (“what you see is what you get”) displays, the new GUI enabled desktop publishing. This democratized the printing “press.” Consequently, understanding the importance of different styles of letters became an important literacy of the digital age.

Part of this literacy is understanding the various meanings associated with typography. Type fonts can be designed and used with various purposes in mind. The “Power of Typography” video above explains in more detail.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

US INTERNET POLICY, PART 2: THE SHIFT TO BROADBAND

Posted on | March 24, 2020 | No Comments

This post is the second in a series that I am producing during the COVID-19 pandemic about the importance of telecommunications policy in ensuring the widespread availability of affordable high-speed Internet access. Teaching online and working from home have gone from fringe activities to being central components of life. As we move to a Smart New Deal to transform American life, how we structure our digital environments will be central. This post discusses the transition from dial-up modems in the early days of the Internet to high-speed broadband connections. With that technical transition and the FCC’s 2005 decision, the competitive environment that the Clinton-Gore administration built collapsed into the cable-telco duopoly we see today, making “net neutrality” an issue.

The Internet Service Providers (ISPs) mentioned in US Internet Policy, Part 1 facilitated the process of getting individuals, businesses, and government “online” by linking to the Internet backbone and going “retail” with dial-up modems and then high-speed broadband connections. The term “online” emerged as a way to distinguish data communications from telephony, which was highly regulated by the Communications Act of 1934. ISPs offered businesses and consumers high-speed data services for accessing the World Wide Web, hosting websites, and providing large file transfers (FTP). The key was accessing the rapidly expanding Internet packet-switching backbone network that had been developed by the National Science Foundation (NSF).

The National Science Foundation’s backbone network (NSFNET) began data transmissions at 56 kilobits per second (Kbit/s) but was upgraded to T1 lines in 1988, sending at 1.544 megabits per second (Mbit/s). It eventually consisted of some 170 smaller networks connecting research centers and universities. In 1991, the NSFNET backbone was upgraded to T3 lines sending data at 45 Mbit/s. Beginning in 1993, NSFNET was privatized, and a new backbone architecture incorporating the private sector was solicited.

The next-generation very-high-performance Backbone Network Service (vBNS) was developed as the successor to the NSFNET. vBNS began operation in April 1995 and was developed with MCI Communications, now a part of Verizon. The new backbone consisted primarily of Optical Carrier (OC) lines, each of which had several fiber-optic cables banded together to increase the total capacity of the line. The interconnected Optical Carrier (OCx) lines operated at 155 Mbit/s and higher. These high-speed trunk lines soon multiplied their capabilities from OC-3 operating at 155 Mbit/s, to OC-12 (622 Mbit/s), OC-24 (1244 Mbit/s), and OC-48 (2488 Mbit/s). By 2005, OC-48 was surpassed by OC-192 (9953.28 Mbit/s) and 10 Gigabit Ethernet.
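
The arithmetic behind these OC designations is simple: an OC-n line runs at n times the SONET base rate of 51.84 Mbit/s. A quick sketch (my own illustration, not vendor code) reproduces the figures above:

```python
# SONET Optical Carrier levels scale linearly from the 51.84 Mbit/s
# base rate (OC-1), so OC-n = n x 51.84 Mbit/s.

OC1_RATE_MBPS = 51.84  # SONET STS-1/OC-1 base line rate

def oc_rate(n: int) -> float:
    """Line rate of an OC-n circuit in Mbit/s."""
    return n * OC1_RATE_MBPS

for n in (3, 12, 24, 48, 192):
    print(f"OC-{n:<3} {oc_rate(n):>9,.2f} Mbit/s")

# OC-3:    155.52   OC-12:   622.08   OC-24: 1,244.16
# OC-48: 2,488.32   OC-192: 9,953.28 Mbit/s
```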

As part of the NSFNET decommissioning in 1995, these backbone links connected to the four national network access points (NAPs) in California, Chicago, New Jersey, and Washington D.C. The backbone expanded to multiple carriers that coordinated with ISPs to provide high-speed connections for homes and businesses.

At first, consumers used analog dial-up modems over the telephone lines at speeds that increased from 14.4 kilobits per second (Kbit/s, or just “k”) in 1991 to 28.8 Kbit/s in 1994. Soon the 33.6 Kbit/s modem appeared, which many thought to be the upper limit for phone-line transmissions. But the 56K modem soon became available, and a new set of standards continued to push the speed of data over the telephone system. The 56K modem was invented by Dr. Brent Townshend for an early music streaming service. This new system avoided an analog-to-digital conversion that seriously hampered data speeds and allowed content to be switched digitally to the consumer’s terminal device, usually a PC.
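
To appreciate what those speed increments meant in practice, consider how long a single 3.5-megabyte file (roughly one MP3 song; the file size is my illustrative assumption) would take to download at each stage. These are raw line rates; real-world throughput was lower still:

```python
# Time to move a 3.5 MB file at the raw line rate of each era's connection.
# Real sessions were slower: protocol overhead and line noise intervened.

FILE_SIZE_BITS = 3.5 * 8 * 1_000_000  # 3.5 megabytes expressed in bits

CONNECTIONS_KBPS = {
    "14.4K modem (1991)": 14.4,
    "28.8K modem (1994)": 28.8,
    "33.6K modem": 33.6,
    "56K modem": 56.0,
    "T1 line (1.544 Mbit/s)": 1544.0,
}

for name, kbps in CONNECTIONS_KBPS.items():
    minutes = FILE_SIZE_BITS / (kbps * 1000) / 60
    print(f"{name:<24} {minutes:5.1f} minutes")

# 14.4K: ~32.4 min, 28.8K: ~16.2 min, 33.6K: ~13.9 min,
# 56K: ~8.3 min, T1: ~0.3 min
```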

Also during the 1990s, the telcos were conducting tests using a new technology called ADSL (Asymmetric Digital Subscriber Line). It was initially designed to provide video over copper lines to the home. Baby Bells, in particular, wanted to offer television services to compete with cable television. It was called asymmetric because it could send data downstream to the subscriber faster (256 Kbit/s-9 Mbit/s) than upstream (64 Kbit/s-1.54 Mbit/s) to the provider.

ADSL was able to utilize electromagnetic frequencies that telephone wires carry but that voice calls don’t use. ADSL services separated the telephone signals into three bands of frequencies: one for telephone calls and the other two for uploading and downloading Internet activities. Different versions and speeds emerged based on the local telco’s ability and willingness to get an optical fiber link close to the neighborhood or “to the curb” next to a household or business location.
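
A rough sketch of the conventional ADSL-over-POTS frequency plan makes the asymmetry visible. The band edges below follow the common ITU G.992.1 “Annex A” arrangement and are illustrative; exact figures varied by standard and vendor:

```python
# Approximate ADSL (G.992.1 Annex A) frequency plan over ordinary copper.
# The downstream band is far wider than the upstream band, which is why
# download speeds so outpaced upload speeds.

BANDS_KHZ = {
    "voice (POTS)": (0.0, 4.0),        # ordinary telephone calls
    "upstream":     (25.875, 138.0),   # subscriber -> provider
    "downstream":   (138.0, 1104.0),   # provider -> subscriber
}

for name, (low, high) in BANDS_KHZ.items():
    print(f"{name:<13} {low:8.3f} to {high:7.1f} kHz  (width {high - low:8.3f} kHz)")

# The downstream band is roughly 8.6x wider than the upstream band.
```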

These services were soon called Digital Subscriber Lines (DSL), and they began to replace dial-up modems. High demand and competition from cable companies with high-speed coaxial lines pressured ISPs and telcos to adopt DSL technologies. DSL and the new cable technologies that carried Internet traffic as well as television came to be collectively called “broadband” communications.

Internet traffic grew at a fantastic rate during the late 1990s as individuals and corporations rushed to “get on the web.” The rhetoric of the “new economy” circulated and fueled investments in web-based companies and telecommunications providers.

A temporary investment bubble emerged as many companies lacked the technology or business expertise to obtain profits. Dot-coms such as Drkoop.com, eToys.com, Flooz.com, GeoCities, Go.com, Kozmo.com, Pets.com, theGlobe.com, and Webvan.com failed for a variety of reasons, but mainly flawed business plans and the premature expenditure of investment capital.

Similarly, many carriers, such as Global Crossing and WorldCom, as well as ISPs, overestimated web traffic and built excess capacity. In the wake of the dot-com crash in 2000 and the telecom crash in 2002, many ISPs filed for bankruptcy, including Wall Street darlings like Covad, Excite@home, NorthPoint, PSINet, Rhythms NetConnections, and Winstar Communications.

The broadband industry changed significantly after the 2000 election. Its technological infrastructure had been devastated by the dot-com crash of 2000 and the telecom crash of 2002.

Furthermore, Internet policy changed after the Bush administration was reelected in 2004. The FCC effectively revoked Computer II in 2005 when it redefined carrier-based broadband as an information service.

This meant that broadband was effectively not regulated, and telcos could go on to compete with the ISPs. Instead of offering backbone services and being required to interconnect with the ISPs, they became ISPs themselves and were no longer required to provide ISPs their connection to the Internet. The competitive environment that had nurtured Internet growth was decimated.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 1: The Rise of ISPs

Posted on | March 15, 2020 | No Comments

Much of the early success of the Internet in the USA can be attributed to the emergence of a unique organizational form, the Internet Service Provider or “ISP,” which became the dominant provider of Internet and broadband services in the 1990s. These organizations resulted from a unique set of regulatory directives that pushed the Internet’s development and created a competitive environment that encouraged the proliferation of ISPs and the spread of the World Wide Web.

In this series on Internet policy, I look at the rise of the World Wide Web, the shift to broadband, and the deregulation and consolidation of broadband services. Later in the series, I address the issue of net neutrality and raise the questions, “Is Internet service a utility? Is it an essential service that should be made universally available to all Americans at regulated prices?”

The first ISPs began as US government-funded entities that served the research and education communities of the early Internet. Secured by Al Gore in 1991, legislation signed by President George H.W. Bush created the model of the National Research and Education Network (NREN), a government-sponsored internet service provider dedicated to supporting the needs of the research and education communities within the US. Internet2, Merit, NYSERNET, OARnet, and KanRen were a few of the systems that provided schools and other non-profit organizations access to the World Wide Web. While dial-up services like CompuServe existed in the early 1980s, only later were the ISPs released for commercial traffic and services.

While telecommunications carriers had been moving some Internet traffic since the late 1980s, their role expanded dramatically after the Internet began to allow commercial activities. In June of 1992, Congressman Rick Boucher (D-Va) introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the US National Science Foundation Network (NSFNET). “A few months later, while waiting for Arkansas Governor William Jefferson Clinton to take over the Presidency, outgoing President George Bush, Sr. signed the Act into law.” The amendment allowed advertising and sales activities on the NSFNET.

As part of the National Information Infrastructure (NII) plan, the US government decommissioned the US National Science Foundation Network (NSFNET) in 1995. It had been the publicly financed backbone for most IP traffic in the US. The NII handed over interconnection to four Network Access Points (NAPs) in different parts of the country to create a bridge to the modern Internet of many private-sector competitors.

These NAPs contracted with big commercial carriers such as Ameritech, Pacific Bell, and Sprint for new facilities to form a network-of-networks, anchored around Internet Exchange Points (IXPs). The former regional Bell companies were to be primarily wholesalers, interconnecting with ISPs. This relatively easy process of connecting routers put the “inter” in the Internet, but the interconnection points also became sites of performance degradation and unequal power relations.

As the Internet took off in the late 1990s, thousands of new ISPs set up business to commercialize the Internet. The major markets for ISPs were: 1) access services, 2) wholesale IP services, and 3) value-added services offered to individuals and corporations. Access services were provided for both individual and corporate accounts and involved connecting them to the Internet via dial-up, ISDN, T-1, frame-relay, or other network connections. Wholesale IP services were primarily offered by facilities-based providers like MCI, Sprint, and WorldCom’s UUNET (a spinoff of a DOD-funded seismic research facility) and involved providing leased capacity over their backbone networks. Value-added services included web hosting, e-commerce, and network-resident security services. By the end of 1997, over 4,900 ISPs existed in North America, although most of them had fewer than 3,000 subscribers.[2] See the video below and this response for how much things have changed.

FCC policy had allowed unlimited local phone calling for enhanced computer services, and early Internet users connected to their local ISP using their modems over POTS (Plain Old Telephone Service). ISPs quickly developed software distributed on CD-ROMs that could be easily installed on a personal computer. The software usually put an icon on the desktop screen of the computer that, when clicked, would dial the ISP automatically, provide the password, and connect the user to the Internet. A company called Netscape created a popular “browser” that allowed text and images to be displayed on the screen. The browser used what was called the World Wide Web, a system of accessing files quickly from computer servers all over the globe.

The ISPs emerged as an important component of the Internet’s accessibility and were greatly aided by US government policy. The distinctions made in the FCC’s Second Computer Inquiry in 1981 allowed ISPs to bypass many of the regulatory roadblocks experienced by traditional communication carriers. Telcos were to provide regulated basic services, while “enhanced services” were to stay unregulated. Schiller explained:

    Under federal regulation, U.S. ISPs had been classed as providers of enhanced service. This designation conferred on ISPs a characteristically privileged status within the liberalized zone of network development. It exempted them from the interconnection, or access, charges levied on other systems that tie in with local telephone networks; it also meant that ISPs did not have to pay into the government’s universal service fund, which provided subsidies to support telephone access in low-income and rural areas. As a result of this sustained federal policy, ISPs enjoyed a substantial cross-subsidy, which was borne by ordinary voice users of the local telecommunications network.[3]

ISPs looked to equip themselves for potential new markets and also to connect with other companies. For example, IBM and telecom provider Qwest hooked up to offer web hosting services. PSINet bought Metamor to not only transfer data but to host, design, and move companies from old software environments to new ones. ISPs increasingly saw themselves not only as providers of a transparent data pipe but also as providers of value-added services such as web hosting, colocation, and support for domain name registration.

The next part of this series will discuss the shift to higher-speed broadband capabilities. Later posts will examine the consolidation of the industry starting in 2005, when the FCC changed the regulatory regime for wireline broadband services.

Notes

[1] Hundt, R. (2000) You Say You Want a Revolution? A Story of Information Age Politics. Yale University Press. p. 25.
[2] McCarthy, B. (1999) “Introduction to the Directory of Internet Service Providers,” Boardwatch Magazine’s Directory of Internet Service Providers. Winter 1998-Spring 1999. p. 4.
[3] Schiller, D. (1999) Digital Capitalism. The MIT Press. p. 31.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor and Undergraduate Director in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Five Stages of ICT for Global Development

Posted on | February 19, 2020 | No Comments

Summarized remarks from “Five Stages of Global ICT4D: Governance, Network Transformation, and the Increasing Utilization of Surplus Behavioral Data for Prediction Products and Services,” presented at the 27th AMIC Annual Conference on 17-19 June 2019 at Chulalongkorn University, Bangkok, Thailand.

This presentation explores and outlines the following stages of development utilizing information and communications technologies (ICT). The ICT acronym has emerged as one of the most popular monikers for the digital revolution and is often combined with “development” to form ICT4D. Development is a contested term with currency in several areas. Still, in global political economy, it refers to the process of building environments and infrastructure needed to improve the quality of human life. Often this means enhancing a nation’s agriculture, education, health, and other public goods that are not strictly economy-related but improve well-being and intellectual capital. Of particular interest is the transformation of the network for WiFi and mobile use, as well as data-intensive solutions to many development issues. A growing concern is that data could be used intrusively to manipulate behaviors.

The stages are:

1) Development/Modernization; 2) New World Economic and Information Orders; 3) Structural Adjustment and Re-subordination; 4) Global ICT Integration; and 5) Smart/Sustainable Mobile and Data-Centric Development.

The explication of these stages will provide historical context for understanding trends in ICT innovation and implementation. This research will also provide insights into the possibilities of ICT diffusion in developing environments.

1) Development/Modernization

“Development” and “modernization” characterized the post-World War II prescription for economic development around the world, and especially in newly decolonized nation-states. Scholars talked of an eventual “takeoff” if the proper prescriptions were followed, particularly the adoption of new agricultural techniques termed the “Green Revolution.”[1] Information and Communications Technologies (ICTs) consisted primarily of the “five communication revolutions” (print, film, radio, television, and later satellite) that were utilized to spread information about modern development techniques in agriculture, health, education, and national governance.

Telegraphy and telephones were strangely absent from much of the discussion but were important for government activities as well as large-scale plantations, mining operations, transportation, and shipping. Because of their large capital requirements and geographic expanse, countries uniformly developed state-controlled Post, Telephone, and Telegraph (PTT) entities. Organized with the help and guidance of the International Telecommunication Union (ITU), the oldest United Nations entity, PTTs struggled to provide basic voice and telegraphic services.

Wilbur Schramm’s (1964) book Mass Media and National Development made crucial links between media and national development. Published by Stanford University Press and UNESCO, it examined the role of newspapers, radio, and television. Its emphasis on the role of information in development also laid the foundation for the analysis of computerization and ICT in the development process. I had an office next to Schramm for many years at the East-West Center while we worked on the National Computerization Policy project that resulted in the benchmark study Computerization and Development in Southeast Asia (1987).

2) New World Economic and Information Orders

Rising frustrations and concerns about neo-colonialism resulted in a collective call by developing countries for various conceptions of a “New World Economic and Communication Order,” echoed by UNESCO in the wake of the OPEC oil crises and the resulting rise in Third World debt. The primary issue was news flow and the imbalanced flow of information between North and South. In part, it was the preponderance of news dealing with disasters, coups, and other calamities that many developing countries felt restricted flows of capital. The calls caused a backlash in the US and other developed countries concerned about the independence of journalism and the free flow of trade.[2]

These calls were followed by concerns about the obstacles hindering communications infrastructure development and how telecommunications access across the world could be stimulated. In 1983, World Communication Year, the Independent Commission for Worldwide Telecommunications Development met several times to discuss the importance of communication infrastructure for social and economic development and to make recommendations for spurring its growth.

The Commission consisted of seventeen members – communication elites from both the private and public sectors, representing a number of countries. Spurred on by growing optimism about the development potential of telecommunications, they investigated ways Third World countries could be supported in this area. They published their recommendations in The Missing Link (1984), or what soon was to be called the “Maitland Report” after its Chair, Sir Donald Maitland of the United Kingdom.

The transition from telegraph and telex machines to computers also resulted in concerns about data transcending national boundaries. The Intergovernmental Bureau for Informatics (IBI), which had been set up as the International Computation Centre (ICC) in 1951 to help countries get access to major computers, began to study national computerization policy issues in the mid-1970s.

They increasingly focused on transborder data flows (TDF) that moved sensitive corporate, government, and personal information across national boundaries. The first International Conference on Transborder Data Flows was organized in September 1980, followed by a second in 1984; both were held in Rome, Italy. The increasing use of computers raised questions about accounting and economic data avoiding political and tax scrutiny. The concern was that these data movements could act like a “trojan horse” and compromise a country’s credit ratings and national sovereignty, as well as individual privacy.

3) Structural Adjustment and Re-subordination

Instead of a new international order, a new era of “structural adjustment” enforced by the International Monetary Fund emerged that targeted national post, telephone, and telegraph (PTT) agencies and other aspects of government administration and ownership. Long considered agents of national development and employment, PTTs came under increasing criticism for their antiquated technologies and lack of customer service. The flow of petrodollars pressured PTTs to add new value-added data networks and undergo satellite deregulation. Global circuits of digital money and news emerged, such as Reuters’ Money Monitor Rates, which linked currency exchange markets around the world in arguably the first virtual market.

A new techno-economic imperative emerged that changed the relationship between government agencies and global capital. PC spreadsheet technologies were utilized to value and privatize PTTs so they could be corporatized and listed on electronically linked share-market exchanges. Communications markets were liberalized to allow international competition for sales of digital switches and fiber-optic networks. Developing countries became “emerging markets,” consistently disciplined by the “Washington Consensus,” a set of policy prescriptions meant to keep opening them up to transborder data flows and international trade.[3]

4) Global ICT Integration

Packet-switching technologies, standardized into the ITU’s X.25 and X.75 protocols for PTT data networks, gave way to ubiquitous TCP/IP networks. Cisco Systems became the principal enabler with a series of multi-protocol routers designed for enterprises, governments, and eventually telcos. Lucent, Northern Telecom, and other telecommunications equipment suppliers quickly lost market share as the Internet protocols, mandated first by the US military’s ARPANET and later by the National Science Foundation’s NSFNET, were integrated into ISDN, ATM, and SONET technologies in telcos around the world.

The Global Information Infrastructure (GII), introduced at the annual ITU meeting in Buenos Aires in March of 1994 by Vice President Gore, targeted national PTT monopolies. He proposed a new model of global telecommunications based on competition instead of monopoly, on the rule of law, and on the interconnection of new networks to existing networks at fair prices. Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations, which called for the creation of the World Trade Organization (WTO).

Formed in 1995, the WTO held two meetings in 1996 and 1997 that created a new era of global communications development. Members party to the new multilateral arrangement met quickly in Singapore in 1996 to reduce tariffs on international sales of a wide variety of information technologies. The next year, the WTO met in Geneva and established rules for the continued privatization of national telecommunications operations. Sixty-nine nations party to the WTO, including the U.S., signed the Agreement on Basic Telecommunications Services in 1997, which codified new rules for telecommunications deregulation; signatories agreed to privatize and open their telecommunications infrastructures to foreign penetration and competition by other telcos.

The agreements came at a crucial technological time. The World Wide Web (WWW) was a working technology, but it would not have lived up to its name if the WTO had not negotiated reduced tariffs for crucial networking and computer equipment. The resulting liberalization of data and mobile services around the world made possible a new stage in global development.

5) Smart/Sustainable Mobile and Data-Centric Development

The aggressive trade negotiations and agreements in the 1990s significantly reduced the costs of ICT devices and communication exchanges throughout the world, making possible a wide variety of new commercial and development activities based on web capabilities. Blockchain technologies and cryptocurrencies, the Internet of Things (IoT), and the proliferation of web platforms are some of the current conceptions of how reduced costs for communications and information analysis are enhancing network effects and creating value from the collection and processing of unstructured data.

This project will expand on these stages and provide a context for a further investigation of ICT for development drawing on historical and current research. Of particular concern is the implementation of policies and practices related to contemporary development practices, but commercial and monetization techniques are important as well.

Notes

[1] Rostow, W.W. (1960) Stages of Economic Growth: A Non-Communist Manifesto. Cambridge: Cambridge University Press. See also Rostow, W.W. (1965) Stages of Political Development. Cambridge: Cambridge University Press.
[2] An excellent discussion of the various New World Order discourses is found in Majid Tehranian’s (1999) Global Communication and World Politics: Domination, Development, and Discourse. Boulder, CO: Lynne Rienner Publishers. pp. 40-41. Also, see Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communications Institute. Honolulu, Hawaii.
[3] Wriston, W.W. (1992) The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in Wellington, New Zealand. His American home is in Austin, Texas, where he taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

From New Deal to Green New Deal, Part 3: “It’s the Infrastructure Stupid”

Posted on | February 9, 2020 | No Comments

“Infrastructure, at the deepest level, is a techno-socio bond that brings together new communications technologies, new energy sources, new modes of mobility and logistics, and new built environments, enabling communities to more efficiently manage, power, and move their economic activity, social life, and governance.” – Jeremy Rifkin, The Green New Deal.[1]

This week, I had a chance to attend the Japan-Texas Infrastructure Investment Forum in Austin. Texas and Japan are prolific trade partners and got together to discuss “public goods” like transportation and water, and the possibilities of “smart cities.” In this post, I want to connect the current imperative to build infrastructure in the US with green technologies, and also with this series’ emphasis on the New Deal.

If Texas were a country, it would have the 10th largest economy in the world, bigger than Australia, Canada, and even Russia. Japan, of course, is number 3, after the US and China. Japan is moving many of its US operations to Texas due to the Lone Star State’s business environment, rich resources, and cosmopolitan cities. Texas exports chemicals, food, oil and gas, and electronic products.

I was primarily interested in seeing the presentations on Smart City developments, but was intrigued by the talks on water management in Texas (I’m Dutch, it’s in my genes) and transportation. I didn’t know, for example, that Texas desalinates over 150 million gallons of water every day. Also, Texans drive over 550 million miles a day. What would be the implications of renewable-powered desalination for agriculture and general water needs? How do you manage that much road maintenance? What alternatives are available to the traditional internal combustion vehicle? Just a few tidbits that set the context for the day’s presentations on building infrastructure in Texas.

One of the objectives for Japan was to pursue contracts for its Shinkansen, the high-speed railroad that makes Japan fairly easy to traverse. It’s only a matter of time before Dallas, Austin, and Houston are connected with more efficient lines, and Japan wants to get in on that.

They even brought in the Japan Overseas Infrastructure Investment Corporation for Transport & Urban Development (JOIN) and the Japan Bank for International Cooperation (JBIC) to support the funding of the operation. Having both driven the routes to Dallas and Houston and taken the railroad in Japan, I would certainly enjoy the Shinkansen bullet train for my next Texas trip.

Texas is unique because it is a major carbon-extracting and exporting state. But like Saudi Arabia, it recognizes the importance of pursuing infrastructural projects with a green tinge. Texas is expected to grow substantially over the next several decades, and that means new strategies for mobility, water availability, and disaster risk reduction.

Now we have to confront and analyze the implications of a post-petroleum age. Exploring the New Deal gives us a better perspective on the size of that task and on how deep and sometimes intrusive the process will be. The New Deal was a monumental, multi-decade endeavor that made America great and will not be easily matched.

Roosevelt’s New Deal was primarily an infrastructure program. Facing economic collapse and massive unemployment, FDR promised “a new deal for the American people.”[2] In the first 100 days of the FDR administration, some 15 bills were passed to assist the recovery and put people to work. Some of the major projects were the Civilian Conservation Corps (CCC), the Public Works Administration (PWA), the Tennessee Valley Authority (TVA), and the related Rural Electrification Act. These were all designed to get people back to work and build an infrastructure that would support the new energy paradigm – hydrocarbons and electricity.

While economic recovery sputtered throughout the 1930s, federally funded infrastructure projects built the roads, tunnels, and bridges for cars and trucks, the dams and coal-fired utilities for electrification, and some 125,000 public buildings, including schools, hospitals, and government facilities. Even airports like LaGuardia outside Manhattan were a product of the New Deal.

The Hoover Dam tapped the Colorado River and provided electricity for the entire Southwest, including Los Angeles and a small town called Las Vegas. When the dam was completed in 1936, it was the largest electricity-producing facility in the world, providing power to Arizona, California, and Nevada. It electrified homes, entertainment, industry, and agriculture.

Another big infrastructure project rooted in the New Deal was the Interstate Highway System, whose planning began with the Federal-Aid Highway Act of 1938 and which eventually laid down almost 47,000 miles of public roads. Before the attack on Pearl Harbor, Roosevelt appointed a National Interregional Highway Committee to study the need for several cross-country interstate highways. The building of the “Autobahn” in Nazi Germany was a major source of motivation. In its report, Interregional Highways, the committee recommended constructing a 40,000-mile (64,000 km) interstate highway system. It was an extraordinary push for the mobilization and motorization of the US, providing unprecedented interconnection between cities, and it was the “killer app” for the automobile.

Notes

[1] The quote “It’s the Infrastructure Stupid” in the title is also from Jeremy Rifkin’s The Green New Deal. “It’s the Economy, Stupid” was coined by campaign strategist James Carville during the Clinton-Gore 1992 presidential campaign to counter George H.W. Bush’s success with the first Iraq War.
[2] Blitz, M. (2017, November 20). When America’s Infrastructure Saved Democracy. Retrieved February 8, 2020, from https://www.popularmechanics.com/technology/infrastructure/a24692/fdr-new-deal-wpa-infrastructure/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is a Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

FROM NEW DEAL TO GREEN NEW DEAL, Part 2: The Failure of the National Industrial Recovery Act of 1933

Posted on | February 6, 2020 | No Comments

This is the second of an ongoing series about how the New Deal restructured the American political economy and what lessons it holds for transitioning to a Green New Deal. The first post dealt with the New Deal emerging from the wake of the Great Depression and the immediate policy responses by the Roosevelt administration to the banking and financial crisis. This post looks at a failure: the National Recovery Administration’s (NRA) attempts to administer a wide range of codes and practices for business and industry. What are the lessons of the NRA for the Green New Deal?

The green energy revolution strives for zero emissions of harmful carbon by-products and near-zero marginal costs after installation. Hydrocarbons from coal or oil are incredible sources of energy, but they emit deadly carbon monoxide and climate-threatening carbon dioxide. They are also used up in consumption and require constant replenishment. That is good for petro-states like Russia and Saudi Arabia, but a constant currency drain for the rest of the world. Renewables produce a constant supply of energy. They don’t last forever and eventually require replacement, but their economics are extraordinary and will make possible exciting new opportunities, such as the desalination and purification of saltwater.

The Green New Deal will reach deep into the commerce and industry of the global economy. While the movement is gaining momentum, a few sold Teslas and some homes with solar panels do not a revolution make. (Although I recently had my first ride in a Tesla, and it was awesome.)

The Green New Deal will need to continue to build the Smart Grid technologically while changing utility regulation to allow smaller microgrids that can utilize local resources, including buying energy from residences and small businesses. Broadband networks and the Internet of Things (IoT) will be crucial to this convergence, providing the “smart” aspects of the grid. Other industries that will be affected include agriculture, architecture, automobiles, construction, supply chain logistics, the military, etc. [1]

How will they all work together? Not only between different industries but between different companies within the same sector. How will winners emerge? What will happen to losers? Solyndra, for example, became a major political issue when it filed for bankruptcy in 2011. It was a manufacturer of innovative thin-film solar cells based in Fremont, California. Solyndra received significant subsidies in the form of guaranteed loans from the Department of Energy as part of the economic stimulus plan. But it still couldn’t compete with more traditional solar cell technology companies, especially from China. What are we to make of situations like this?

The Green New Deal faces many significant and complicated issues. What are the electrical standards, for example, the building codes? The sewage interconnections? Establishing networks of automobile recharging (or hydrogen refueling) stations? Can prices within an industry be regularized without penalizing consumers? Does labor organization need to be revived from its decimation during the Reagan years?

A step back for the New Deal…

In his larger attempt at industrial restructuring, President Roosevelt sent a plan to Congress that became the National Industrial Recovery Act of 1933. Congress passed it into law on June 16 of that year. The Act created the National Recovery Administration (NRA) to administer codes of practice for business and industry. The Act was “a clear victory for the many prominent businessmen who were backing cartelization as a solution to the nation’s industrial problems.” [2]

The Act allowed industries to create “codes of fair practice.” By suspending antitrust laws, these codes allowed corporations in particular sectors to set prices, restrict output, and increase profits. These agreements enabled the NRA to administer, with the dominant trade associations, what were in effect national cartels. Although the Act was later declared unconstitutional by the Supreme Court, the codes found their way into later legislation and became part of the national restructuring of the US economy.

While Hoover had limited his activism to the establishment of the Reconstruction Finance Corporation, which served as a lender of last resort to banks and the railroads, he opposed cartelization and thus alienated himself from many business leaders. The NRA placated most of the big-business concerns. However, for political reasons, Roosevelt’s plan was designed to serve many other constituencies. He had made concessions to liberal conservatives to reassure them that socialism was not anywhere near the path he was taking, nor was he forwarding “monopoly.” But opposition did mount from small business people and farmers who saw the codes being dominated by big business.

Finally, the Act started to antagonize large corporations because of Section 7a, which encouraged labor organization and had led to a series of violent strikes.

A poultry company from Brooklyn, NY, challenged the codes, and the case went all the way to the Supreme Court. In Schechter Poultry Corp. v. United States, the U.S. Supreme Court rejected the compulsory-code system. The Court argued that the Act improperly delegated legislative powers to the executive and that regulating poultry codes did not meet the standards of interstate commerce, a constitutional requirement for Federal regulation. In May of 1935, the Supreme Court declared the Act unconstitutional.

The New Deal shows us how massive and complicated a major economic reorganization can be. Proponents of the Green New Deal should seriously study the issues that FDR confronted as he revived the economy and charted a new course for the US that avoided revolution. The case of the NRA gives us some idea of the scale of the transition and the challenges of government intervention in the economy.

Notes

[1] Rifkin, J. (2019) The Green New Deal. New York: St. Martin’s Press.
[2] McQuail, K. (1982) Big Business and Presidential Power. NY: William Morrow and Company. p. 27.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

YouTube Meaning-Creating (and Money-Making) Practices

Posted on | February 2, 2020 | No Comments

Note: This is required reading for my Visual Rhetoric and IT class.

YouTube has emerged as the primary global televisual medium, attracting about 1.3 billion viewers from countries around the world, with over 5 billion videos watched every day. People suck up some 3.25 billion hours of YouTube videos each month, and over ten thousand YouTube videos have generated more than 1 billion views since they were posted. YouTube content ranges from homemade DIY videos to professional high-definition television productions.

Alphabet recently announced that YouTube made $15 billion in advertising revenues in 2019, growing 36% over 2018. That is a lot of money to spread around.

YouTube provides opportunities for new publishers or “vloggers” covering a wide range of topics. Every minute, some 400 hours of video are uploaded to YouTube from all around the world. Not many of those creators get rich, but some have done extraordinarily well. Together, the world’s 10 highest-paid YouTube stars made $180 million in the year between June 1, 2017, and June 1, 2018, almost double the year before.

One big star to emerge on YouTube is Daniel Middleton (DanTDM), who made US$18.5 million in 2018. Middleton is a British professional gamer, and his videos primarily cover games like Minecraft, Plants vs. Zombies, and other favorite games that DanTDM’s primary audience, young kids, enjoy. Here he reviews the massive hit called Fortnite.

What makes DanTDM’s YouTube videos successful? What does he do to keep viewers interested in his content, and what keeps his audience coming back for more? How does he create entertainment and meaning for those who watch his show?

Even more extraordinary is Ryan ToysReview (now called Ryan’s World!). Ryan, the 7-year-old host of the show, headed the top-earning YouTube channel, at $22.5 million for the year up to June 1, 2018.

What knowledge can we gather about Ryan’s World!? What observations can we make about his show and other popular channels?

This series of posts will set out to explore a crucial relationship in (digital) media studies – between cultural/technical production practices and the meanings, emotions, and feelings produced by those practices. Media production involves a combination of equipment and processes to capture and construct images, edit sequences, and integrate audio and sound effects to produce specific results. These are the meaning-making techniques that construct our blockbuster movies, our Netflix binge favorites, our local newscasts, and also the YouTube channels that educate and entertain us. Can we use some of the same analytical techniques to “interrogate” YouTube channels?

A good deal of related work has been done on film and television. By exploring camera shots (close-ups, zooms, pans, shot composition) as well as montage (cutting rates, parallel editing, reaction shots, wipes, etc.), film studies and even television studies have given us a literacy for understanding the power of these media. These important meaning-making practices can also be discerned in the realm of YouTube videos.

Social media apps like YouTube present significant new complications for understanding the power of the global mediasphere. One area of concern is the metrics associated with YouTube. Ratings were always a significant part of television services for determining the value of programming. YouTube measures “views” and adds likes, dislikes, shares, playlisting, and subscribers to measure the credibility and commercial viability of a channel. But vulnerabilities in the system allow many of these numbers to be tweaked by the “fake-view” ecosystem that has grown around YouTube.
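
As a toy illustration of how such signals might be combined – emphatically not YouTube’s actual ranking formula, which is proprietary – one could weight the public metrics into a single per-view engagement score. The weights below are arbitrary assumptions of my own:

```python
# A hypothetical engagement score: public signals weighted per view.
# This is an illustration only; YouTube's real ranking system is
# proprietary and far more complex (watch time, session data, etc.).

def engagement_score(views: int, likes: int, dislikes: int,
                     shares: int, comments: int) -> float:
    """Weighted interactions per view, as a percentage (arbitrary weights)."""
    if views == 0:
        return 0.0
    interactions = likes + 2.0 * shares + 1.5 * comments - dislikes
    return 100.0 * interactions / views

# Example: 1M views, 50K likes, 2K dislikes, 10K shares, 8K comments
print(f"{engagement_score(1_000_000, 50_000, 2_000, 10_000, 8_000):.1f}%")  # 8.0%
```

Any such composite is only as trustworthy as its inputs, which is precisely what the fake-view ecosystem undermines.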

YouTube has become a new frontier for media studies. The opportunity exists now to pioneer strategies for understanding this intriguing visual medium and the sets of meanings it creates. What techniques are used in YouTube “channels”? What types of persuasive techniques are effective on YouTube channels? How do they differ from techniques used in film and television? Who is driving the narration of the video, and what voices are they using?

But there are broader issues to address as well. What are the cultural, economic, and social implications of YouTube? What new ideas and cultural forms diffuse via Youtube? What economic activities and opportunities are made available through the platform? What impact will YouTube have on existing institutions?

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

FROM NEW DEAL TO GREEN NEW DEAL, Part 1: Roosevelt Saves Capitalism

Posted on | February 2, 2020 | No Comments

“I pledge you, I pledge myself, to a new deal for the American people.” – FDR in the summer of 1932.

Recent discussions about a proposed Green New Deal encouraged me to review some of my notes on the original New Deal to see if it could provide relevant insights into our current situation. The New Deal began in the early 1930s as a response to the economic crash of the late 1920s and built up momentum during the “Great Depression” of the next decade. It ended, arguably, just before the end of the millennium with the repeal of the Glass-Steagall Act on November 12, 1999, when President Clinton signed the Financial Services Modernization Act. The New Deal was one of the most influential sets of legislation in American history, and it set the course of the modern US political economy.

On October 1, 1928, the Dow Industrial Average closed at 240.07. Earlier that year, “the Dow” had been expanded to thirty stocks from the traditional twelve. The “Roaring Twenties” had been a good decade for many investors. But that was about to end for most.

The Dow would continue to rise over the next year to 381 before beginning its dramatic descent. In October 1929, it began its steep decline as investors hastened to liquidate their positions. From Thursday, October 24 to Tuesday, October 29, the stock market crashed dramatically, and the Dow eventually bottomed out at just 41 in 1932.

Capitalism had run into deep trouble. Never before had public confidence been shaken so thoroughly. Unemployment was estimated to have risen to 25% in the US and England. It was even worse in Germany, which had been strapped with war reparations by the Treaty of Versailles. Political unrest was brewing in democratic political economies around the world. It was little more than a decade earlier that a Communist revolution had occurred in Russia, and many believed that the problems of the industrial economy needed a communist or socialist solution.

In the US, protest marches became frequent. Rent riots broke out as people organized, often physically, to prevent home evictions and farm foreclosures. The country was in such dire straits that many believed it could have gone in any political direction. This turmoil carried into the term of the next President, Franklin D. Roosevelt, who beat Republican incumbent Herbert Hoover and was inaugurated on March 4, 1933.[1]

Roosevelt moved immediately to save the banking system, which had been experiencing significant drains on deposits. He closed the banks until federal auditors could review their books. Although many banks were deemed insolvent, he decided to save the remainder by signing the Bank Moratorium, which suspended acknowledgment of their demise.

After deliberating with Treasury Department officials, former Hoover advisers, and several leading bankers to come up with intervention measures and reform practices, Roosevelt reopened many banks. He decided to leave them in the hands of their original owners instead of nationalizing them, as many thought he might.

The Emergency Banking Act of March 9 allowed the Federal Reserve to make loans to businesses and nonmember banks against assets that it was allowed to define very broadly. The Reconstruction Finance Corporation, Hoover’s singular response to the economic collapse, was authorized to buy stock in banks and thus provide them with working capital. Three days later, Roosevelt made his first “fireside chat,” in which he made a plea for citizens to redeposit their money. The legislation, plus his encouragement, was a success, and by the end of the month, he had saved the banking system.

Immediately after inauguration, Roosevelt also took controversial measures to stop the hoarding of gold. In April, he took the US off the gold standard and called on all Americans to turn in their gold coins for paper currency. On January 31, 1934, the US devalued the dollar, raising the official price of gold from $20.67 to $35 per ounce.

In 1935, amidst concerns about the threat of fascism growing in Europe, Roosevelt decided to move US gold reserves from New York to Fort Knox in Kentucky. Moving the gold past the Appalachian Mountains and next to a military tank battalion and training facility reduced the risk of a Nazi attack on Manhattan that might capture significant amounts of gold bullion. Central bank vaults were primary targets of the Nazis in their early invasions of Czechoslovakia and Poland, as gold was essential to Germany’s efforts to procure oil and other critical supplies internationally for its war effort.

Despite the boldness and swiftness with which FDR carried out his plan, it was basically a very conservative response to the national disaster. The plan he embarked on was essentially to save capitalism. At the time, Russia was firmly in the grip of its Communist leaders, and Hitler and the National Socialists (Nazis) had strengthened their hold on Germany. Many business leaders, ensconced in the rhetoric of laissez-faire and free-enterprise economics, were slow to realize the conservativeness of the New Deal. They chastised Roosevelt in the newspapers and on the radio. But he was very popular with the people, including Ronald Reagan, the future US president who would later become one of the New Deal’s fiercest critics.

Over the next few years, the administration set into place a reformed capitalist system based on the rationalization of business, finance, and labor practices and focused on long-term stability. Commercial banks were separated from other activities, such as investment banking and stock brokerage, through the Glass-Steagall Act of 1933. The Securities Exchange Act, passed the next year, created the Securities and Exchange Commission (SEC) to oversee the stock markets and prevent manipulation and rigging. The Federal Reserve Board in Washington was also given greater powers to oversee the regional Reserve Banks, and the Federal Deposit Insurance Corporation (FDIC) was instituted to prevent further bank panics and to restore depositor confidence.[2]

Hopefully, the transition to a green economy will not take such a dramatic event as the Great Depression, although tragically, we might be facing a bigger threat with climate change and environmental pollution. The New Deal was central to the transition to a carbon-based industrial model still heavily reliant on manual labor. Coal and oil were central to the New Deal, as was the process of electrification.

A crucial component of the new economic transition will be “green finance” – how will the Green New Deal be paid for, and what are the ideological implications of the process? Will the current banking system suffice? Can pension funds and other wealth-management funds be sufficiently incentivized to make the crucial investments? Will the political will be developed to stop subsidizing fossil fuels?

Finally, will the Green New Deal be “socialist,” or can capitalism be harnessed for the transition? The 2020 elections will be an important forum for discussion of these matters.

Part 2 of this series discusses an early failure of the New Deal: the National Recovery Administration’s (NRA) attempts to administer the economy through a wide range of codes and practices for business and industry.

Notes

[1] The Three Roosevelts. (n.d.). Retrieved February 16, 2020.
[2] McQuail, K. (1982) Big Business and Presidential Power. NY: William Morrow and Company.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from http://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor and Associate Chair at State University of New York (SUNY) Korea. Recently taught at Hannam University in Daejeon, South Korea. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, media economics, and strategic communications.

    You can reach me at:

    anthony.pennings@gmail.com
    anthony.pennings@sunykorea.ac.kr

    Follow apennings on Twitter

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.