Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

ICT and Sustainable Development: Some Origins

Posted on | August 17, 2021 | No Comments

I teach a course called ICT for Sustainable Development (ICT4SD) every year. It examines information and communications technologies (ICT) enlisted in the service of cities, communities, and countries to help them become economically and environmentally healthy. An important consideration for sustainability is that current activities do not compromise the conditions or resources that future generations will need. Sustainable Development (SD) is an offshoot of traditional “development,” which dealt primarily with national economies organizing to “take off” into a westernized, pro-growth, industrial scenario, with some consideration of the colonial vestiges they needed to take into account.

While development was also cognizant of the need to support agriculture, education, governance, and health activities, SD put a major focus on related environmental issues and social justice (see Heeks). SD has been embraced by the United Nations (UN), which adopted seventeen Sustainable Development Goals (SDGs) across all UN organizations in 2015.


In this post, I briefly introduce ICT4D and its connection to SD, how it emerged, and why it is beneficial. Of particular importance are the economic benefits of ICT and their role in the renewable energies so crucial to sustainable development.

ICT was not well understood by development economists and was largely ignored by funding agencies, except for telephone infrastructure. Literacy and education were early concerns. Book production, radio, and then television sets were monitored as crucial indicators of development progress. Telephones and telegraphs helped transact business over longer distances but were installed and managed by government agencies called Post, Telephone, and Telegraph (PTT) administrations. PTTs found funding difficult and were challenging to manage, given their technical complexity and enormous geographical scope. Satellites were used in some countries like India and Indonesia and facilitated better mass communications as well as distance education and disaster management.

Most of the economic focus in “developing countries” was on the extraction and growing of various commodities, utilizing low-cost labor for manufacturing, or adding to the production processes of global supply chains. It was only when television and films became important domestic industries that “information products” were recognized economically in the development process.

New dynamics were introduced to development and economic processes with computerization and ICTs. I began my career as an intern on a National Computerization Policy program at the East-West Center in Honolulu, Hawaii. Inspired by the Nora-Minc Report in France, it was part of the overall emphasis on development at the Center’s Communications Institute. I had an office next to Wilbur Schramm, one of the most influential development pioneers with his Mass Media and National Development: The Role of Information in the Developing Countries (1964).[1]

With my mentor, Syed Rahim, I co-authored Computerization and Development in Southeast Asia (1987), which serves as a benchmark study in understanding the role of ICT in development. One objective of the book was to study the mainframe computers that were implemented, starting in the mid-1960s, for development activities. These “large” computers, some with merely 14K of RAM, were installed in many government agencies dealing with development activities: agriculture, education, health, and some statistical organizations. We also looked at the narratives being created to talk about computerization at that time. For example, the term “Information Society” was becoming popular. Also, with the rise of the “microcomputer” or personal computer (PC), the idea of computer technology empowering individuals was diffusing through advertisements and other media.

Information economics opened up some interesting avenues for ICT4D and sustainable development. Initially, it was concerned with measuring different industrial sectors and how many people were employed in each area, such as agriculture, manufacturing, information, and services. Fritz Machlup wrote The Production and Distribution of Knowledge in the United States (1962), which showed that information goods and services accounted for nearly 30 percent of the U.S. gross national product. A major contributor to information economics, he concluded the “knowledge industry” employed 43 percent of the civilian labor force.

Machlup was also a student of Ludwig von Mises, a leading figure of the so-called “Austrian School” of economics. But Machlup was soon overshadowed by Friedrich von Hayek, by Milton Friedman of the allied Chicago School, and by the resurgence of von Mises himself. While their challenge was directed primarily against mainstream Keynesian economics, it was also significant for development studies, as these economists saw government activities as running counter to the dynamics of the market. The main nemesis of the Austrian school was socialism and government planning. While most developing countries were not communist, the Cold War was a significant issue playing out in countries worldwide.

The Austrian movement had a significant impact in the 1970s and 1980s. Its economists saw transactions as knowledge-producing activities and focused on prices as communication or signaling devices in the economy. This led to a new emphasis on markets, and Hayek and Friedman both received Nobel Memorial Prizes for their work.

For context, President Nixon had taken the US off the gold standard in August 1971, and the value of the US dollar dropped sharply. But currency markets were now free to operate on market principles. It was also a time when the microprocessor was invented and computers were becoming more prominent. In 1973, Reuters set up its Monitor Money Rates service, the first virtual market for foreign exchange transactions. It used computer terminals to display news and currency prices and charged banks both to subscribe to the prices and to post them. With the help of the Group of 5 nations, it brought order to international financial markets, especially after the Arab-Israeli War broke out in late 1973. The volatility of the war ensured the economic success of the Reuters technology, and currency markets have been digitally linked ever since.

Many development theorists by that time were becoming frustrated by the slow progress of capitalism in the “Third World.” Although the Middle East war was short, it resulted in rising oil prices around the world. This was a major strain on developing countries that had bought into mechanized development and the “Green Revolution” of the 1960s, which emphasized petroleum-based fertilizers and pesticides. The Arab-dominated Organization of Petroleum Exporting Countries (OPEC) began an embargo of western countries for supporting Israel, which refused to withdraw from the occupied territories. Oil prices increased by 70 percent, and the US suffered additional setbacks as it ended the war in Vietnam and inflation raged.

A split occurred between traditional development studies and market fundamentalists. British Prime Minister Margaret Thatcher and US President Ronald Reagan were strong advocates of the Austrian School. Both had been taken by Hayek’s Road to Serfdom (1944) and stressed a pro-market approach to development economics. The IMF was mobilized to pressure countries to undergo “structural adjustment” towards more market-oriented approaches to economic development. The PTTs were a primary target, and investment strategies were utilized to turn them into state-owned enterprises (SOEs) with parts sold off to domestic and international investors.

Researchers began to focus on the characteristics or “nature” of information. As economies became more dependent on information, more scholarship was conducted. It became understood that information is not diminished by use or by sharing, though its value certainly varies, often with time. The ability to easily share information by email and FTP created interest in network effects and the viral diffusion of information.
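One rough way to quantify the network-effects intuition is Metcalfe’s law, a heuristic from network economics rather than anything cited in the literature above: a network’s potential value grows with the number of possible pairwise connections.

```latex
% Metcalfe's heuristic: a network with n users has n(n-1)/2
% possible pairwise links, so potential value V scales roughly
% with the square of the number of users.
V(n) \propto \frac{n(n-1)}{2} \approx \frac{n^{2}}{2}
% Example: growing from 100 to 200 users roughly quadruples
% the possible connections (4,950 to 19,900).
```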

These characteristics became particularly important after the development of the Internet, which quickly globalized. Vice-President Gore’s Global Information Infrastructure (GII) became the foundation for the World Trade Organization’s Information Technology Agreement (ITA) and the privatization of telecommunications services. Tariffs on information and communications technologies decreased significantly. Countries that had gotten into debt in the 1970s were pressured into selling off their telecommunications infrastructure to private interests, and they quickly adopted the Internet’s TCP/IP protocols.

Other studies focused on efficiencies of production brought on by science and technology, specifically reducing the marginal costs of producing additional units of a product. Marginal costs have been a major issue in media economics because electronic and then digital technologies have steadily increased the efficiency of producing these types of products. Media products have historically had high initial production costs but decreasing marginal costs on the “manufacture” or reproduction of each additional unit.

Take books, for example. It is time-consuming to write a book, and the first physical copies are likely to be expensive, especially if only a small number are actually printed. But as traditional economies of scale are applied, each additional book becomes cheaper to produce. Electronic copies of books, in particular, have become very cheap to produce and even to distribute through the Internet, although that hasn’t necessarily resulted in major price decreases.
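A back-of-the-envelope sketch can make this arithmetic concrete. The figures below are hypothetical, chosen only to show how a large fixed “first-copy” cost is spread across units while the marginal cost of a digital copy approaches zero:

```python
# Hypothetical first-copy and per-unit costs for a book (illustrative only).
FIXED_COST = 50_000.0    # writing, editing, typesetting (assumed)
PRINT_MARGINAL = 4.00    # paper, printing, shipping per copy (assumed)
DIGITAL_MARGINAL = 0.02  # bandwidth/storage per download (assumed)

def average_cost(fixed: float, marginal: float, units: int) -> float:
    """Average total cost per unit: fixed cost spread over units plus marginal cost."""
    return fixed / units + marginal

for units in (1_000, 10_000, 100_000):
    print(f"{units:>7} copies: print ${average_cost(FIXED_COST, PRINT_MARGINAL, units):6.2f}, "
          f"digital ${average_cost(FIXED_COST, DIGITAL_MARGINAL, units):6.2f}")

# Average cost falls toward marginal cost as volume grows;
# for digital copies, that floor is nearly zero.
```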

Digital outputs are generally unique economic products. They have unusual characteristics that make it difficult to exclude people from using them, and they are also not used up in consumption. Microsoft faced this problem in the early days of the microcomputer when it was getting started. It famously criticized computer hobbyists for sharing tape copies of its software. Later, its investment in the MS-DOS operating system and subsequently Windows paid off handsomely when it was able to sell them at enormous margins for IBM PCs and then “IBM compatibles” from makers such as Acer, Compaq, and Dell. That is how Bill Gates became the richest man in the world (or one of them).

The issue of marginal costs has resonated with me for a long time, due to my work on media economics and what economists call “public goods.” In some of my previous posts, I addressed the taxonomy of goods based on key economic characteristics. Public goods, such as digital and media products, are misbehaving economic goods in that they are not used up in consumption and are difficult to exclude from use. These writings examined what kinds of products are conducive to reduced marginal costs and what social systems are conducive to managing these different types of goods. Originally, the focus was more on media products like film, radio, and television, and later on digital products like games and operating systems. Will these efficiencies apply to sustainable development?
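For readers unfamiliar with that taxonomy, it sorts goods along two axes: rivalry (is the good used up in consumption?) and excludability (can non-payers be kept out?). A minimal sketch using the conventional labels (the examples are my own):

```python
def classify_good(rivalrous: bool, excludable: bool) -> str:
    """Classic two-axis taxonomy of economic goods."""
    if rivalrous and excludable:
        return "private good"          # e.g., a printed book
    if rivalrous and not excludable:
        return "common-pool resource"  # e.g., fish stocks
    if not rivalrous and excludable:
        return "club good"             # e.g., a streaming subscription
    return "public good"               # e.g., broadcast television

# Digital media "misbehave": copies are non-rival, and exclusion is hard.
print(classify_good(rivalrous=False, excludable=False))  # -> public good
```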

Can the economics of media products apply to other products? More recently, sustainable technologies like solar and wind have been examined for their near-zero marginal costs. A major voice on this topic is Jeremy Rifkin, most noted for his book The Third Industrial Revolution (2011), which describes the importance of concurrent communications, energy, and transportation transitions. We have moved from an integrated political economy based on telephone/telex communications and carbon-combustion energy and transportation to one based on digital communications and clean energy. Two other books by Rifkin, The Zero Marginal Cost Society and The Green New Deal, are significant points of departure for sustainable development.

Sustainable development initiatives by definition look to economize and reduce costs for the future. Analyzing the characteristics of economic goods and their social implications is therefore essential for understanding market structures and the appropriate types of regulation.

ICT4D has struggled to claim a strong narrative and research stake in the trajectory of development. The Earth Institute’s ICTs for SDGs: Final Report: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals (2016) and the World Bank’s (2016) World Development Report were significant boosts for ICT4D, especially for economic development and the move towards sustainable development.

Citation APA (7th Edition)

Pennings, A.J. (2021, Aug 17). ICT and Sustainable Development: Some Origins. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/ict-and-sustainable-development-marginal-costs/


Notes

[1] Mass Media and National Development: The Role of Information in the Developing Countries. Stanford University Press. 1964.

[2] Sachs, J., et al. (2016). ICT & SDGs: How Information and Communications Technology Can Accelerate Action on the Sustainable Development Goals. The Earth Institute, Columbia University. Retrieved January 15, 2019, from https://www.ericsson.com/assets/local/about-ericsson/sustainability-and-corporate-responsibility/documents/ict-sdg.pdf

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College, and he spent most of his career at New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in the Republic of Korea, he lives in Austin, Texas.

Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains

Posted on | August 12, 2021 | No Comments

Are we ready for the age of diamonds? Instead of mining gold or even Bitcoin for a currency base, why not use diamonds made from captured carbon? And why not use carbon recaptured from the atmosphere via hemp? Instead of “conflict diamonds,” why not have “climate diamonds?”

Can we move to a diamond standard? Can diamonds replace gold and back other currencies? Can this be done in an economical and sustainable way? This post examines the processes of biological carbon sequestration and the manufacture of diamonds that could be used in a fintech environment. While I’m generally OK with the prospects of fiat money, if a hard-money alternative emerged that was dynamic and contributed to climate security, why not try it?

The process of creating “industrial diamonds” is well established and has produced impressive results. Diamonds can be “grown” using small ones as seeds and adding to the crystalline structure in two ways. One uses superheated, pressurized “greenhouses” filled with methane and hydrogen. The other uses chemical vapor deposition (CVD), a low-pressure, vacuum-based system that uses heat or microwaves to bond carbon-rich gases to the diamond seeds.

I wrote my Ph.D. dissertation about money and standards, so I’ve been thinking about this topic a lot. In Symbolic Economies and the Politics of Global Cyberspaces (1993), I examined the social forces that drive us to use general equivalents like money and the forces that establish monetary standardization in a digital environment. I’m not entirely convinced of my own argument so far, but I want to consider the available systems of regenerative agriculture and manufacturing that could be mobilized for making climate diamonds and tie them into newer-generation cryptocurrency developments.

I was intrigued by a video from “Have a Think.” It presents an intriguing industrial scenario for growing and using hemp to sequester CO2 from the air and produce several very valuable byproducts, one of which can be transformed into sparkling diamonds. Hemp is controversial due to its connection with THC, a psychoactive substance, but THC is only present in significant amounts in certain strains. Hemp has a rich history and was particularly important for the ropes needed by sea-faring ships that relied entirely on wind.

Hemp is a dynamic plant that grows quickly. In the process, it can produce several types of industrial products, including lubricants for cars and wind turbines as well as ingredients for cosmetics, soaps, and printer ink. In addition, its fibers can make products like cloth, paper, and rope. The seeds have positive nutritional benefits and may be a replacement for soy proteins. The plant can also be processed to produce biochar, a type of charcoal used for fertilizers, graphene products, and the manufactured diamonds mentioned extensively in this post.

Agriculture is going through a technological transformation, with increased use of big data, hydroponics, and robotics. Hydroponics is a way of growing plants in a water-based medium in a protective environment. Hemp farms can remove significant amounts of carbon dioxide (CO2) from the air and produce the oils and fibers mentioned above in a clean and economical way. Biological carbon sequestration is probably better than geologic sequestration, which injects carbon into underground porous rock formations. Both may be necessary for reducing climate threats, but sequestration is better done by plants that can produce many valuable byproducts.

Can we create a new monetary standard based on climate diamonds? Is it feasible? Is that something we want to do? Globally, we have been on an information standard since the 1970s, anchored by the strength of the US dollar and hedged by multiple financial instruments worldwide. The new diamond market will likely grow within the cryptocurrency environment.

Much of this depends on the future of cryptocurrency platforms and the digital ledger systems emerging in new generations. Cardano’s blockchain platform, for example, is evolving to create a peer-to-peer transactional system to trade many types of value, such as labor and fractions of investment vehicles like houses and art. Imagine, for example, going to Home Depot and buying your gardening supplies with informational “deeds” to fractions of your car, your house, your future work, or your diamonds. Diamonds are likely to be another “value” in a chain of intersecting commodities classified on a blockchain with dynamic pricing and smart contracts.
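To make the fractional-value idea concrete, here is a toy ledger that records and transfers fractional claims on assets. It is purely illustrative: the names and structure are my own invention and do not correspond to Cardano’s actual APIs or any real smart-contract platform.

```python
from dataclasses import dataclass, field

@dataclass
class FractionalLedger:
    """Toy ledger: who owns what fraction of which asset (illustrative only)."""
    holdings: dict = field(default_factory=dict)  # (owner, asset) -> fraction

    def issue(self, owner: str, asset: str, fraction: float) -> None:
        key = (owner, asset)
        self.holdings[key] = self.holdings.get(key, 0.0) + fraction

    def transfer(self, seller: str, buyer: str, asset: str, fraction: float) -> None:
        """Move a fractional claim, e.g., paying for goods with a sliver of a house."""
        if self.holdings.get((seller, asset), 0.0) < fraction:
            raise ValueError("insufficient fractional ownership")
        self.holdings[(seller, asset)] -= fraction
        self.issue(buyer, asset, fraction)

ledger = FractionalLedger()
ledger.issue("alice", "house-123", 1.0)                      # Alice owns her whole house
ledger.transfer("alice", "home-depot", "house-123", 0.001)   # pays with a 0.1% claim
print(ledger.holdings)
```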

Diamonds have utility based on their beauty but also their durability and strength. Most notable is the sparkling brilliance that makes diamond jewelry a treasured symbol of relationship and commitment. Their crystalline structure is used in high-tech products like audio speakers because it can vibrate rapidly without deforming and degrading sound quality. Their high heat and voltage tolerance make diamond-based microprocessors an increasingly viable component of digital technologies.

Their hardness also makes them extremely valuable for drill bits. They have practical uses in delicate drilling for art, dentistry, and manufacturing. With a melting point of around 3550 degrees Celsius, they have the durability to drill industrial metals, geothermal wells, and underground tunnels.

Diamonds can also be money. They are portable, durable, measurable, and difficult to counterfeit. For diamonds, size matters, although color and clarity matter as well. Bigger is better, and they pack a lot more value per unit of weight than gold. Gold also has higher storage costs that quickly eat up any gains from price appreciation. Granted, neither diamonds nor gold are generally used as currency, primarily because they lack acceptability by merchants and interoperability between financial systems. Try cashing in your diamonds at Home Depot.

That is why the cryptocurrency environment is the most likely solution to that problem, barring an economic collapse. While the dollar is going digital in the form of a CBDC (Central Bank Digital Currency), it will not be replaced by Bitcoin or other cryptocurrencies like Ethereum and Litecoin. Instead, other “values” will line up in relation to the dollar. The Fed will regulate but protect cryptocurrencies because it knows that the financial system wants to trade anything, anywhere, anytime. So cryptocurrencies will survive, and diamonds will find their place within their ethereal block-chained cyberspace. In the future, who knows?

This is an exploratory essay stimulated by the “Have a Think” video but also shaped by my interest in fintech, monetary policy, and cryptocurrencies. The reintroduction of hemp into the modern economy and its potential for absorbing carbon dioxide could be a powerful addition to the global economy. It’s not entirely about taking CO2 out of the air and bonding the carbon into diamonds. Rather, I am hopeful that the green manufacturing of diamonds will help incentivize and stimulate an industrial process with multiple benefits for the economy and the environment.

Citation APA (7th Edition)

Pennings, A.J. (2021, Aug 12). Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains. apennings.com https://apennings.com/dystopian-economies/electric-money/diamonds-are-a-worlds-best-friend/

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Modern Monetary “Practice” and the Fear of Inflation in a Low-Supply Economy

Posted on | August 1, 2021 | No Comments

One of America’s most potent political myths is that you, as a citizen, pay for government spending. People talk about paying for this or that program, but what really happens is that Congress appropriates the money for spending. Then the Treasury instructs the Federal Reserve to credit the relevant spending accounts. Taxes and borrowing are separate entries. The issue is gaining scrutiny as the US economy reconciles 2020’s economic slowdown due to COVID-19 with record government spending.

A relatively new area of economic analysis, called Modern Monetary Theory (MMT), emerged from practitioners in the finance industry telling a different story about government spending. It is worth examining because it is based on the practices of financial traders, particularly those who work with government bonds. Warren Mosler was critical in formulating MMT and shedding light on the actual processes involved in government spending. Thus my emphasis on Modern Monetary “Practice”: it starts with this description of the spending process, which allows us to reframe its dynamics.

Spending should not be seen as a panacea for the economy. Spending can be wasteful and lead to inflation. Spending needs to be productive. The $28 trillion debt accumulated by May 2021 is worthy of monitoring, but what does it really mean? What are its implications?

Taxes are registered by government, yes, but it’s not like household economics. A household needs a breadwinner, someone to bring home the bacon, to load up the metaphors. Someone needs to have money to pay the bills. Governments operate under a different set of rules and responsibilities. They can print or mint minor amounts of money and use the Fed for larger quantities. Government provides the money for the economy to operate, and the incentive – taxes – to make people want to own it. Mosler argues that governments have a monopoly on their currency and the responsibility to get it into the economy, by spending, to enable markets to work.

Central banks can purchase bonds and, quite frankly, whatever else they want to buy. The Fed traditionally bought only government treasuries but now regularly buys mortgage-backed securities in a process called quantitative easing. Ideally, it can sell these treasuries and securities to absorb money from the economy if it smells inflation. The banking sector also creates money when it lends to consumers.

The Treasury auctions bonds. But do these “pay for” spending? They essentially provide:

  • Time deposits for investors;
  • Hedge instruments for traders;
  • Opportunities for foreign countries to keep their currencies cheap versus the dollar;
  • A vehicle for the Fed to influence the money supply and coordinate interest rates.

Borrowing should be seen as a political strategy to keep the financial system secure, provide a stable hedge, and manage the dollar’s value.

So, rather than worrying about “paying” for something, US citizens should be active in deciding how taxes should be used in public policy. US policy should be designed to tax what society doesn’t want. Well, that isn’t going to be easy, but it is what democracy is about. Spending should also be determined by what will keep the US safe and secure. It should keep the economy productive while providing opportunities and avoiding excessive inflation.

This last point is important. Inflation is the primary limiting factor when it comes to spending and is a calculus between supply and “effective” demand. “Too many dollars chasing too few goods” is the standard explanation by economists. Spending is easier for a government to coordinate than supply. A good example occurred during the COVID-19 pandemic, when the US government passed several emergency spending packages to support businesses and families, especially airlines hurt by the shutdown in travel. While the economy skyrocketed due to the fiscal and monetary stimulus, slowdowns in production, disruptions of supply chains, and people staying at home caused a significant spike in inflation in early 2021.
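The “too many dollars chasing too few goods” intuition is often formalized with the equation of exchange, a standard textbook identity rather than anything specific to MMT:

```latex
% Equation of exchange: money supply M times its velocity V
% equals the price level P times real output Q.
M \cdot V = P \cdot Q
% With velocity roughly constant, a rising M (stimulus) combined
% with a falling Q (supply disruptions) pushes the price level P up.
```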

Inflation had been largely absent from the US economy since the turbulent 1970s, when Nixon took the country off the gold standard. At that time, the dollar deflated, and OPEC countries restricted oil production to drive up prices and make up for the diminishing value of the US greenback. Meanwhile, lacking banking systems due to Islamic restrictions on credit, they recycled US dollars through the global eurodollar system. Called “petrodollars,” these funds were channeled by banks worldwide into syndicated loans for countries needing dollars for energy purchases and development projects. “Economic hit men” scoured the globe and pressured countries to borrow the money, eventually creating what was called the “Third World Debt Crisis” of the early 1980s.

Since the 1980s, financialization and the commercialization of Cold War technologies created sufficient competition and disruption to keep prices down. Primarily information technologies, they increased productivity, reducing labor and resource costs. Globalization also created new forms of interstate competition and cooperation, as supply chains supported innovation and higher-quality products. The US government also floated bonds internationally to countries like England and Japan, which strengthened the dollar and kept it the world’s reserve currency.

The COVID-19 pandemic presents unprecedented economic challenges, particularly with its 2021 resurgence via the delta variant. The stock market rose, with the DJIA hitting 35,000 and the S&P 500 reaching 4,395.26, a market capitalization of US$38.2 trillion. But concerns about inflation grew as commodities such as copper, lumber, and oil increased in price. A computer chip shortage also raised concerns about the production and pricing of cars, computers, and other products dependent on microprocessors. But the Fed and others saw this phenomenon as “transitory,” citing disruption, demographics, debt, and productivity as factors that would reduce inflationary pressures.

So the economy looks to be at risk in late 2021. Will the practical application of MMT provide operational guidance for a new era of prosperity? Can infrastructure and climate change solutions provide sufficient returns on these investments? The big question is whether government spending for such programs can avoid significant inflationary pressures. With COVID, we are struggling with how to spend in a low-output economy.

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College, and he spent most of his career at New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in the Republic of Korea, he lives in Austin, Texas.

Thomas Edison Builds the Universal Ticker-Tape Machine

Posted on | July 10, 2021 | No Comments

In previous posts, I described the importance of the telegraph and other electro-mechanical devices and their impact on the Civil War and the expansion of the US economy. Thomas Edison, the famous inventor, came to New York during the Black Friday financial crash in the autumn of 1869. He immediately became involved in the technological innovations that were changing Wall Street and introducing new ways to represent and understand financial transactions.[1]

Electricity was harnessed to expand finance’s reach across the country and overseas. New networks of telegraphed information and news added layers of both certainty and volatility to Wall Street and for investors worldwide. Edison’s inventions and patents helped solidify these technological innovations and financial practices. They provided the means for him to escape New York and set up his laboratories and facilities for creative innovation and manufacture.

A few weeks after Black Friday, Edison formed a partnership with Franklin L. Pope, a telegraph engineer who was also associated with Dr. Laws. Earlier in the year, Edison had conducted telegraph tests with Pope between Boston and Rochester, one reason he came to New York City. On October 1, 1869, Edison and Pope announced they were going to create both the Financial and Commercial Telegraph Company and the American Printing Telegraph Company.[2]

They partnered only a few months after the completion of the transcontinental railroad, as the US expanded quickly westward. Immigration was on the rise, and the economy was growing rapidly. They jointly filed several patents in printing telegraphy during this period. Two printer designs were for stock tickers to provide gold and stock quotations in the New York area. Another printer, called “The Pope and Edison Type-Printing Telegraph,” was meant to be used on “private lines.” These transmitted messages other than commercial quotations from an easy-to-use terminal for individuals and business houses and were, in a sense, the first “emails.” While their partnership was successful, it ended when the Gold and Stock Telegraph Company absorbed their businesses.

Like much of the managerial talent of the time, General Marshall Lefferts had served during the Civil War. After several years heading Western Union, he became the President of the Gold & Stock Telegraph Company. Very quickly, the company acquired the patents to Laws’ Gold Indicator and his Stock Printer, as well as the Calahan Stock Ticker.

In 1870, the Gold and Stock Company created the Exchange Telegraph Company with partners from London, England, to expand internationally. Investment capital was streaming into the US, primarily for railroad development, and telegraph messages and ticker prices were streaming across the Atlantic as well. Lefferts became a friend and mentor to the young Edison and both encouraged and funded his work.

Edison had gained extensive experience with the Laws Gold Indicator and the Calahan Stock Ticker and was in an ideal situation to develop a general stock ticker for mass production. By the end of the year, Edison had made a number of innovations and obtained patents for the Electrical Printing Instrument. Anxious to go into mass production, General Lefferts, as head of the Gold and Stock Telegraph Company, gave him $40,000 for the rights to use his telegraphic and ticker inventions.[2] The event was depicted in the movie Edison, the Man (1940), in which Spencer Tracy played the famous inventor.

His first months in New York City were an extraordinary time for Edison, who had barely reached his mid-twenties. In early 1870, with the money from his stock ticker inventions, he opened his first factory in nearby Newark, New Jersey, for the manufacture of gold and stock tickers. Edison worked on ticker instruments as well as duplex and quadruplex telegraph instruments that could send multiple electrical signals over the same wire at the same time.

Another device he continued to work on allowed remote synchronization of tickers from the central station. If a ticker in a broker’s office went out of “unison” and began to print unstructured results, it needed to be quickly reset. In the first years of the stock ticker, a runner would have to go to each client’s location and reset the ticker by hand. Every mechanism needed to be synchronized so that multiple machines could print the same information simultaneously. Also, demand for the tickers was spreading beyond New York, so it was crucial that the machines operate simply and without the need for mechanical troubleshooters. A device had been invented by Henry Van Hoevenbergh, but it didn’t work very well. Edison tested a new design on the Laws Stock Printer during the spring of 1871, and he soon received a patent for his “screw-thread unison” innovation, which could reset the printing machines with an electrical signal. Subsequently, all stock tickers incorporated the unison device, resulting in the stock ticker’s rapid diffusion.[3]

As their tickers began to be used in many remote cities in the US and around the world, it was important to be able to fix the stock tickers without relying on local skilled workers. Edison continued to make improvements to the stock ticker to create a reliable machine that could be marketed on a mass basis. The concept of interchangeable parts had been important for the production of clocks, guns, and sewing machines, and it was applied to the manufacture of tickers as well. Several variations of stock tickers were initially manufactured in small numbers, but in 1871, Edison constructed the Universal Stock Printer for New York’s Gold and Stock Telegraph Company.

The “Universal” was very dependable and could be manufactured in high volume. The New York Stock Exchange was a major beneficiary of the new stock tickers. These machines enabled a higher volume of trading and spread the reach of the Wall Street exchange. This allowed the NYSE to unseat smaller exchanges in other cities as people around the country could get price quotes transmitted directly to their stock tickers from New York. Over 5,000 of these devices were produced and used by investors around the world.

The ticker also got the attention of Western Union, which was envious of the Gold and Stock Telegraph Company’s monopoly on transmitting market information from the New York Stock Exchange. Another concern for the giant was that the Gold and Stock Telegraph Company was using Edison’s Universal Private Line Printers to send out financial information. Edison’s printer had only moderate success, though, while a faster printer from George M. Phelps gave Western Union considerable technological competition. Along with a new stock ticker in 1870, later called the “Financial Instrument,” that was faster, more efficient, and more reliable, Western Union went on the offensive. When it threatened to enter the New York market with this new ticker, Gold and Stock arranged a merger in 1871 and took over the manufacturing and distribution of the stock ticker equipment.

Edison worked closely with “Mr. P” after the merger. Over the next few years, they would patent the Quadruplex, considered Edison’s major contribution to telegraphy. It allowed four simultaneous telegraph transmissions on a single conducting line and would save Western Union a considerable amount of money. He also helped with Phelps’ Electro-Motor Telegraph, which was ten years in development and, based on an electro-motor and governor, was able to achieve speeds of up to 60 wpm. By 1875, Phelps’ transmitting apparatus allowed an operator to simultaneously transmit stock information to hundreds of different offices. It could also operate between New York and that other emerging financial center, Chicago, without using a single repeater.

The Universal Stock Ticker was Edison’s first commercial success and, in many ways, the source of his future success. The first 40 of his 1,093 patents filed with the US Patent Office came from his work with stock tickers and printing telegraphs. The success of his stock tickers provided capital and connections with wealthy investors that helped fund his many inventions, including perhaps his most outstanding achievement, the electric light bulb. In 1875, he began moving his operations to Menlo Park, New Jersey, where he set up his famous invention laboratory. (The Black Maria movie studio came later, at his West Orange, New Jersey facility.)[4]

Citation APA (7th Edition)

Pennings, A.J. (2021, Jul 10). Thomas Edison Builds the Universal Ticker-Tape Machine. apennings.com https://apennings.com/telegraphic-political-economy/thomas-edison-builds-the-universal-ticker-tape-machine/

Notes

[1] A very good account of the Black Friday events appears in the “On This Day” section of the New York Times website. See http://www.nytimes.com/learning/general/onthisday/harp/1016.htm. Accessed May 10, 2015.
[2] Anecdotal information on Edison’s timely circumstances on Wall Street from Edison: His Life and Inventions by Frank Lewis Dyer and Thomas Commerford Martin.
[3] The stocktickercompany website has very good information on the history of the stock indicators and ticker-tape machines.
[4] It has been difficult to trace the exact timing of Edison’s activities at the time. Ultimately, I decided to follow the patents. http://www.prc68.com/I/StkTckPat.shtml#List


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D. is a professor at the State University of New York, Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

Show-Biz: The Televisual Re-mediation of the Modern Global Economy

Posted on | June 8, 2021 | No Comments

My use of “Show-Biz” refers to the meaning-making techniques of financial journalism and their relationship to the narratives that represent and drive the economy. Media industries “show” business and finance through various camera, editing, and special effects techniques, drawing in data from many sources and presenting them on different windows of high-resolution screens. These techniques create ways of seeing and showing the economy. Consequently, they influence public opinion as well as investment and trading strategies that shape global, local, and national economic activities and investment patterns.

This post concerns the televisual surveying systems that monitor and display global business and financial activities. It starts with a theory of media called remediation and then examines the different elements or media that are combined into the broadcast or streaming of financial news. Two key concepts, transparent immediacy and hypermediation, help us understand the way these media operate. These transmissions of mediated financial information have consequences for the global economy.

This is not to say that such representations are necessarily valid constructions of reality or distortions of truth. One of the central themes of this post is that strategies of visual mediation are intertwined with authentic experiences and facts and that strategies of interpretation and incredulity/skepticism are required.

Television news expanded significantly in the 1970s with the creation of cable systems and satellite networks, and several networks were dedicated to financial news. Cable and traditional TV combined when CNBC (Consumer News and Business Channel) was established in April 1989 as a joint venture between Cablevision and NBC. Bloomberg Television was launched in the United States on January 1, 1994, and drew on a decade of financial analytics provided through the famous “Bloomberg Box.” (See image of my daughter pretending to use a Bloomberg Box.)

Another successful financial news network was Fox Business News, launched on October 15, 2007. Yahoo also emerged as a major financial news provider. Bought by Verizon in 2016, it attracted over 100 million global monthly visitors on average in 2019, according to media analytics company ComScore. Yahoo Finance was recently sold by Verizon to Apollo Global along with AOL.

How is financial news mediated in these networks? What signifying practices are brought into play and for what purposes? What are the implications of their mediating styles and techniques for how we understand the health of the global economy, levels and types of employment, and the potential of innovative new industries and companies?

Financial television news broadcasts constantly in many trading operations and other business environments. It is also popular in homes, whether with day traders or interested citizens. Many people might be invested in Bitcoin or other cryptocurrencies, concerned about housing prices, or following their 401K and other types of investments. Televised news and economic indicators play a vital role in various audiences’ perceptions of the economy.

Anchored by personalities who are informed and presentable, the television screen combines live human commentary with indexical information, graphs, and other numerical representations of different parts and states of the economy. The news anchor fixes meaning, guiding the narrative while transfixing the audience with the immediacy of their presence.

Remediation is literally the re-mediating of content through the inclusion of old media in new media, or sometimes the inclusion of new media in an old medium, such as the use of computer and web techniques in modern television. One of the earliest media theorists, Marshall McLuhan, made these observations in the 1960s. Print remediates writing, which remediates speech. The TV anchor, an actual person of authority who “anchors” the meaning of the broadcast, is remediated on the television news screen.

However, Bolter and Grusin made a more systematic analysis in their Remediation, published by MIT Press (2000).[1] They echoed McLuhan’s observation that the content of new media is old media. They also ventured that the motivation for remediation is a desire for a “healed” media, one that provides better access to a more “authentic” version of reality. Bolter and Grusin pointed to a “double logic” of remediation – two different modes of representation that strive to better access the real. Television has coped with this dual system of remediated representation since its origins with a variety of incorporations and innovations.

One mode of remediation is transparent immediacy, the desire for a type of immersion into the medium that erases the technology and provides an unblemished experience. The cinematic movie experience strives for this authenticity with its large screen, darkened room, and conservative camera editing practices. The viewer wants to forget the presence and techniques of the movie apparatus and believe they are in the presence of the objects of representation – the actors and sets. TV less so.

McLuhan and others argued that TV was primarily an acoustic medium, mainly because sound anchors the meaning of the visual elements. Television is a storyteller, primarily an oral one. So it is no surprise that human “anchors” on broadcast news play an important role. Anchors read the news and also conduct live interviews with guest experts for additional credibility and information. They present the major topics of the day in real-time, fixing the meaning of the broadcast, organizing the narratives of the day. Financial television borrows this sonic dominance, although it streams many other sources of data and textual news.

Many financial analysts have become celebrities, such as Mohamed A. El-Erian, Jared Bernstein, and Bill Gross. Neel Kashkari and other Fed district presidents are also very popular. These “talking heads” are brought in to contribute to the narrative and lend their expertise remotely from different cities and countries, representing key companies or government positions.

Television news is interrupted occasionally by “Breaking News” that reinforces immediacy. This interruption usually includes live reporting by a journalist at a relevant location. Drone or helicopter “birds-eye” viewing enhances the dominant perspective of television news. Reports by the Fed Chair after their FOMC meeting on interest rates are very popular. These events keep viewers “glued” to the screen.


Hypermediation is the other strategy; it uses techniques of representation that “foreground the medium.” Television has taken on the multi-mediated look of the computer, with different windows gathering in activities, data, and events. While the anchor is prominent (although most trading environments turn off the sound), other windows display hyper-mediated representations of economic and financial data streaming in from around the world. This information comes primarily in the form of charts, graphs, and indices presenting a quantitative view of the world. The reliability of this global gaze often draws on the truth claims of numeracy and remediates the spreadsheet. In particular, the visual techniques of the table are utilized to quickly communicate an augmented view of the economy.

Financial hypermediation has moved away from transparency. Instead, it integrates an augmented reality with indexical denotations of stock markets, prices of commodities like gold and silver, and currency exchange fluctuations. Indicators include macro-indicators such as GDP, invented to mobilize industrial responses to the Great Depression and World War II. If Women Counted by Marilyn Waring was a major critique of GDP because it didn’t count domestic work. The age of big data is also returning information that gives us a better picture of the larger economy. Unemployment statistics are a major indicator, as are the prices of commodities like gold, silver, and copper.

Financial news probably owes a debt to sports broadcasting and news. Notably, American sports like baseball, basketball, and football (gridiron) have embraced hypermediated techniques in the service of sports entertainment. While transparent immediacy is a crucial part of sports enjoyment, a new word, “datatainment,” has emerged as the moniker for the joy many people get from statistics related to their favorite teams and players. In baseball, for example, scores remain the major source of numerical pleasure as they indicate winners and losers, but batting averages, earned run averages (ERA), and runs batted in (RBIs) are additional sources of statistical satisfaction.

Conclusion

Financial news on television combines several earlier and newer types of media to represent views of the global economy. It uses anchors and interviews with guests. It displays economic indicators throughout the screen along with scrolling tickertapes. It tries to survey the world and paint a picture of an authentic reality that it thinks its viewers will be interested in. But what are the limitations of these strategies of representation?

Citation APA (7th Edition)

Pennings, A.J. (2021, Jun 08). Show-Biz: The Televisual Re-mediation of the Modern Global Economy. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/show-biz-the-televisual-re-mediation-of-the-modern-global-economy/

Notes

[1] Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 2000.
[2] Cook, Patrick J. Rev. of Remediation by Jay David Bolter and Richard Grusin. Resource Center for Cyberculture Studies (December 1999). 14 January 2001.

© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D. is Professor of the Department of Technology and Society, State University of New York, Korea. From 2002-2012, he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 5: Trump, Title I, and the End of Net Neutrality

Posted on | May 16, 2021 | No Comments

The election of Donald Trump in 2016 presented new challenges to broadband policy and the net neutrality rules passed under the Obama administration. Tom Wheeler resigned from the Federal Communications Commission (FCC), allowing Trump to pick a Republican chair and swing the power to the GOP.

The major issue would be challenging the FCC’s 2015 classification of Internet Service Providers (ISPs) under Title II of the Communications Act of 1934, which emphasized common carriage, the commercial obligation to serve all customers equally and fairly. This post recaps the development of net neutrality rules for Internet broadband under the Obama administration’s FCC and their dissolution under the Trump administration’s FCC.

The FCC’s Computer Inquiries were conducted from the 1960s to the 1980s and identified three layers of technological services for data communications. Telecommunications companies offering basic services to homes and businesses would be classified as regulated Title II companies because of their monopoly positions. Data communications and processing service providers operating on top of the telco infrastructure would be lightly regulated Title I “enhanced” service companies. Content companies offering information services were not included in the regulations. This legal framework allowed the Internet to take off in the 1990s, including the creation of over 7,000 ISPs in the US, but this was before higher-speed broadband services became available.

Broadband companies became Title I “information services” under the George W. Bush administration’s FCC. Telephone companies that had carved up America during the breakup of AT&T in the 1980s became unregulated ISPs. Cable television companies had also developed IP broadband capabilities in the late 1990s and, with cable modems, competed or merged with telephone companies to provide “triple play” (TV, broadband, and voice) services to households. In 2009, cable companies were also deregulated by classification under Title I.

The result of these two decisions was a highly oligopolistic market structure for broadband services. These companies began to acquire smaller ISPs, often after making it difficult for them to interconnect to their facilities, as had been required of them as Title II companies. Customers soon found themselves limited to monopoly or duopoly ISPs in their area.

These newly deregulated companies also wanted to expand into new digital services, including payment systems and the provision of information, video, and search content. These actions violated the “maximum separation” rules that had restricted them from competing with their customers. They also had designs on operating as gateways that would package games, social media, geo-locational data, and email services into bundles offered at various prices. Concerns proliferated about pricing and service issues, and this led to the movement for “net neutrality” and the return of common carriage.

During the first Obama administration, the FCC began a major study of the broadband market structure of ISPs in the US.

In 2010, the FCC passed six broad “net neutrality principles:”

    Transparency: Consumers and innovators have a right to know the basic performance characteristics of their Internet access and how their network is being managed;

    No blocking: This includes a right to send and receive lawful traffic, prohibits the blocking of lawful content, apps, services and the connection of non-harmful devices to the network;

    Level playing field: Consumers and innovators have a right to a level playing field. This means a ban on unreasonable content discrimination. There is no approval for so-called “pay for priority” arrangements involving fast lanes for some companies but not others;

    Network management: This is an allowance for broadband providers to engage in reasonable network management. These rules do not forbid providers from offering subscribers tiers of services or charging based on bandwidth consumed;

    Mobile: The provisions adopted today do not apply as strongly to mobile devices, though some provisions do apply. Of those that do are the broadly applicable rules requiring transparency for mobile broadband providers and prohibiting them from blocking websites and certain competitive applications;

    Vigilance: The order creates an open Internet advisory committee to assist the commission in monitoring the state of Internet openness and the effects of the rules.[1]

The new rules faced a judicial challenge. The courts, while sympathetic to the goals of net neutrality, questioned the FCC’s authority to regulate Title I companies. After an appeal by Verizon, the DC Circuit Court sent the FCC back to the drawing board. Judge David Tatel said that the FCC did not have the authority under the current regulatory conditions to treat telcos as “common carriers” that must pass data content through their networks without interference or preference.

The result of Verizon v. FCC was that, without a new regulatory classification, the FCC wouldn’t have the authority to stop the big ISPs from blocking legal websites, throttling or degrading traffic on the basis of content, or enacting “paid prioritization” for Internet services. The latter, the so-called “fast lanes” for companies like Google and Netflix, were particularly contentious.[2]

President Obama got involved and supported reclassifying ISPs as common carriers under Title II of the Communications Act of 1934. This would give the FCC the authority it needed to regulate the ILEC ISPs. On February 26, 2015, the FCC passed the new Title II net neutrality rules in a 3–2 party-line vote, and they went into effect in the summer of 2015. The FCC’s open internet rules applied to both wired and wireless Internet connections.

Trump’s new FCC Chairman, Ajit Pai, argued that the web was too competitive to regulate effectively. Ignoring the impacts of deregulating cable and telephone companies on broadband competition, he argued ISPs did not have the incentive to throttle web speeds or restrict other services. He compared regulating ISPs with regulating websites, a clear deviation from the regulatory layers set out in the computer inquiries. Subsequently, the FCC began seeking comments on eliminating the Title II classification for broadband and removing the Obama era net neutrality rules.

On December 14, 2017, the Federal Communications Commission (FCC) voted in favor of repealing these policies, 3–2, along party lines. Pai voted with the majority of the FCC to reverse the decision to regulate the Internet under Title II of the Communications Act of 1934. Called the Restoring Internet Freedom Order, it repealed the net neutrality rules that were put in place two years earlier.

Pai’s justification speech argued that the Internet was not broken and didn’t need to be fixed. His contention was that the bureaucratic complexity of net neutrality was a burden on small ISPs and a disincentive to invest in new facilities and digital pipes. The new FCC voted to begin eliminating Obama’s net neutrality rules as it reclassified home and mobile broadband service providers as Title I information services.

Democrats in Congress responded with several strategies to reverse the rule. In 2018, they attempted to invoke the Congressional Review Act (CRA) to undo the FCC order. The CRA bypasses the filibuster and allows Congress to repeal recent administrative regulations. The motion passed the Republican-controlled Senate by 52–47 but did not get the necessary votes in the Republican-controlled House.

The Democrats tried again after gaining a majority in the 2018 midterm elections with the Save the Internet Act of 2019. The bill codified prohibitions on blocking or throttling websites and on bundling websites or apps like cable packages. It designated network access a “utility” under Title II of the Communications Act of 1934.

Rep. Mike Doyle (D-PA), the bill’s main sponsor and chair of the Subcommittee on Communications and Technology within the House Committee on Energy and Commerce, said he believed Internet access is a right for all: “We want that gatekeeper to be neutral.” The bill passed the House 232–190 but was declared dead on arrival by Senate Majority Leader Mitch McConnell.

Pai resigned with Trump’s departure in January 2021, leaving behind a mixed legacy. He was acknowledged for some internal changes, including creating the FCC’s Office of Economics and Analytics (OEA), which gathered the FCC’s economists into a central think tank rather than leaving them dispersed across separate bureaus. But the FCC was slow on 5G deployment and on making available the much-needed supply of spectrum in the mid-band (2 GHz–6 GHz) range. Rural buildout was weak, and the FCC was caught working with telcos to lower the benchmarks for mobile broadband, meaning that slower mobile service could be counted as broadband. But, by far, the so-called Restoring Internet Freedom Order that repealed net neutrality will be the legacy of the Trump-era FCC, with the central question being: was it a capitulation to telco lobbyists?

In the next post, I will examine the challenges for the Biden Administration in addressing broadband policy, including net neutrality, but also the Internet of Things, and the expansion of broadband infrastructure in rural and other underserved areas.

Notes

[1] Gustin, S. (2010, December 21). FCC Passes Compromise Net Neutrality Rules. Wired. https://www.wired.com/2010/12/fcc-order/
[2] Finley, K. (2017, May 18). Internet Providers Insist They Love Net Neutrality. Seriously? Wired.


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D. is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

Digital Spreadsheets as Remediated Technologies

Posted on | May 5, 2021 | No Comments

In his classic (1964) Understanding Media: The Extensions of Man, Marshall McLuhan argued that “the content of any medium is always another medium.”[1] For example, the content of print is the written word, and the content of writing is speech. Likewise, the content of the telex was writing, and the content of television was radio and cinema. The book was notable for coining the phrase, “the medium is the message,” and pointed to the radical psychological and social impacts of technology.

McLuhan focused on the effects of each medium rather than on the content it transmitted. He probed how new technologies extended the human senses and changed the activities of societies. He invited us to think of the lightbulb, not so much in terms of its luminous content, but in the way it influenced modern society: it created new environments and changed lifestyles, particularly at night. This post examines the media technologies embedded in the digital spreadsheet that have made it a transformative technology and changed modern life.

Mediating “Authentic” Realities

In Remediation: Understanding New Media, Jay David Bolter and Richard Grusin extended McLuhan’s ideas to a number of “new media,” including television, computer games, and the World Wide Web. They argued that new media technologies are designed to improve upon or “remedy” prior technologies in an attempt to capture or mediate a more “authentic” sense of reality. They used the term “remediation” to refer to this innovation process in media technologies.[2] For example, VR remediates perspectival art, which remediates human vision. TV not only remediates radio and film but now also the windowed look of computers, including the ticker-tape scrolling of information across the screen.

Unfortunately, but understandably, they neglected the spreadsheet.

And yet, the digital spreadsheet is exemplary of the remediation process. Several years ago, I initiated an analysis of the various “media” components of the spreadsheet and how they combine to give it its extraordinary capabilities. To recap, these are:

  1. writing and numerals;
  2. lists;
  3. tables;
  4. cells; and
  5. formulas.

The digital spreadsheet refashioned these prior media forms (writing, lists, tables, cells, and formulas) to create a dynamic meaning-producing technology. Writing and lists have rich historical significance in the organization of palaces, temples, and monasteries, as well as armies and navies. Indo-Arabic numerals replaced Roman numerals and expanded the realm of numerical calculation with the introduction of zero and the positional place-value system. Numbers and ledgers led to the development of double-entry accounting and the rise of merchants and, later, modern businesses.
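
To make the combination concrete, here is a minimal sketch in Python (all names are hypothetical, not drawn from any actual spreadsheet engine) of how the cell, the grid, and the formula interlock: each cell holds either a literal value or a formula referencing other cells, and reading a formula cell recalculates it from the current state of the grid.

    # A toy grid of cells: each holds a value or a formula over other cells.
    class Sheet:
        def __init__(self):
            self.cells = {}  # maps addresses like "A1" to values or formulas

        def set(self, address, content):
            self.cells[address] = content

        def get(self, address):
            content = self.cells.get(address, 0)
            # A formula is modeled as a callable that receives the sheet,
            # so it can "reference" other cells when it is evaluated.
            return content(self) if callable(content) else content

    sheet = Sheet()
    sheet.set("A1", 1200)   # a literal number: remediated writing/numerals
    sheet.set("A2", 800)
    sheet.set("B1", lambda s: s.get("A1") + s.get("A2"))  # in effect, =A1+A2

    print(sheet.get("B1"))  # 2000; change A1 or A2 and B1 updates on the next read

Even this toy version shows the dynamic quality: the table supplies position, the cell supplies identity, and the formula binds them into a calculation that stays live as the data changes.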

Tables helped knowledge disciplines emerge as systems of inquiry and classification, initially in areas like accounting, arithmetic, and political economy. Still later, areas such as astronomy, banking, construction, finance, insurance, and shipping depended on printed tables to replace constant calculation. Charles Babbage (1791–1871), a mathematician and an early innovator in mechanical computing, expressed his frustration with constructing tables when he famously said, “I wish to God these calculations had been executed by steam.”

First with VisiCalc and then Lotus 1-2-3, these media elements worked together to form the gridmatic intelligibility of the spreadsheet. Bolter and Grusin proposed a “double logic of remediation” in the representation of reality: transparent immediacy and hypermediacy. Both work to produce meaning. The former tries to make the mediation at work invisible and produce transparent immediacy, such as watching a live basketball game on television. The latter foregrounds the medium, especially through computer graphics. Financial news programs on TV such as Bloomberg Surveillance mix the immediacy of live news, using hosts and guests, with hypermediated indexes of stock markets (DJIA, S&P 500, NASDAQ, etc.) and other economic indicators such as GDP. How do spreadsheets attempt to perceive, display, and produce reality? How do they “heal” our perception of reality?

Windows to the World Wide Web

It was the personal computer (PC) that brought the spreadsheet to life. The Apple II brought us VisiCalc in 1979, with a small visible grid on the machine’s 40-column screen that could be navigated quickly using the arrow keys. One of the first formulas developed for the spreadsheet was net present value (NPV), which calculated the return on investment (ROI) for projects, including large purchases of equipment. Microsoft’s Disk Operating System (DOS) was the technical foundation for Lotus 1-2-3 as the IBM PC and “IBM compatibles” proliferated during the 1980s. The spreadsheet became known as the “killer app” that made buying the “microcomputer” worthwhile. But it was the graphical user interface (GUI) that popularized the PC, and thus the spreadsheet.
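
As a worked illustration of why NPV was such a natural early spreadsheet formula, here is the standard calculation in a short Python sketch; the discount rate and cash-flow figures are invented for the example:

    # Net present value: discount each period's cash flow back to the present.
    # NPV = sum( cash_flow[t] / (1 + rate)**t ) for t = 0, 1, 2, ...
    def npv(rate, cash_flows):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    # Hypothetical equipment purchase: $10,000 up front, then yearly returns.
    flows = [-10_000, 3_000, 4_000, 4_000, 3_500]
    print(round(npv(0.08, flows), 2))  # a positive result suggests the purchase pays off

On paper, recomputing this for every candidate discount rate was tedious; on a spreadsheet, changing one cell re-ran the whole analysis.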

The Apple Mac marked the shift to the GUI and the new desktop metaphor in computing. GUIs replaced the typed ASCII characters of the command-line interface with a more “natural” immediacy provided by the interactivity of the mouse, the pointable cursor, and drop-down menus. The desktop metaphor drew on the iconic necessities of the office: files, inboxes, trash cans, etc. A selection of fonts and typographies remediated both print and handwriting. Using the Mac required some suspension of disbelief, but humans have been conditioned for this alteration of reality by storytelling and visual narratives in movies and TV.

Microsoft’s Excel was the first spreadsheet to use the GUI developed by Xerox PARC and Apple. Designed initially for the Apple Macintosh, it became a powerful tool that combined the media elements of the spreadsheet to produce more “authentic” versions of reality. An ongoing issue is the way it became a powerful tool for organizing that reality in ways that benefited certain parties and not others.

Excel was at the center of Microsoft’s own shift to GUIs, which began with Windows in 1985, and it made spreadsheets a key part of the Office software applications package. Microsoft had captured the IBM-compatible PC market with DOS and initially built Windows on top of that OS. Windows 2.0 changed the OS to allow for overlapping windows. Excel became available on Windows in 1987 and soon became the dominant spreadsheet. Lotus had tried to make the transition to the GUI with Jazz but missed the market by aiming too low and treating the Mac as a toy.

Windows suggested transparent views onto different realities for the individual.

But while the emerging PC was moving towards transparent immediacy, the spreadsheet delved into what Bolter and Grusin would call hypermediacy, an alternate strategy for attaining authentic access to the real. Windows promised transparent views of the world, but the spreadsheet offered new extensions of the senses – a surveying and calculative gaze – by remediating writing, lists, tables, and formulas.

Spreadsheets drew on the truth-claims of both writing and arithmetic while combining them in powerful ways to organize and produce practical information. They combined and foregrounded the media involved to present, or remediate, a “healed” version of reality. Each medium provides a level of visibility or signification. The WYSIWYG (What You See Is What You Get) environment of the desktop metaphor provided a comfortable level of interactivity for defining categories, inputting data, organizing formulas, and displaying the results in charts and graphs.

The Political Economy of PC-based Spreadsheets

How has the digital spreadsheet changed modern society? Starting with VisiCalc and Lotus 1-2-3, the spreadsheet created new ways to see, categorize, and analyze the world. It combined and remediated previous media to create a signifying and pan-calculative gaze that enhanced the powers of accounting, finance, and management. Drawing on Bolter and Grusin, can we say that digital spreadsheets, as remediated technology, became a “healing” medium? This raises some important questions. What was its impact on the modern political economy? What was its impact on capitalism?

The spreadsheet amplified existing managerial processes and facilitated new analytical operations. Its grid structure allowed a tracking system to monitor people and things. It connected people with tasks and results, creating new methods of surveillance and evaluation. It could register millions of items as assets in multiple categories. It itemized, tracked, and valued resources while constructing scenarios of future opportunity and profit.

Digital spreadsheets introduced a major change of pace and scale to the financial revolution that started with Nixon’s decision to take the dollar off gold and onto an “information standard.” The spreadsheet facilitated quick analysis and the rapid recalculation of loan payment schedules in an era of inflation and dynamic interest rates. Spreadsheet proliferation started with accountants and bookkeepers, who quickly realized that they could do their jobs with new precision and alacrity. But their use soon became ubiquitous.
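
A small sketch shows why that mattered. The fixed-payment loan formula below is the standard annuity calculation; the principal, rates, and term are invented for illustration. Re-running it for each new interest rate is exactly the kind of what-if analysis that spreadsheets reduced from hours of hand calculation to a single changed cell:

    # Fixed-rate loan payment (standard annuity formula):
    # payment = principal * r / (1 - (1 + r)**-n), with periodic rate r and n periods
    def monthly_payment(principal, annual_rate, years):
        r = annual_rate / 12
        n = years * 12
        return principal * r / (1 - (1 + r) ** -n)

    # A 30-year, $100,000 mortgage as rates climb through an inflationary era:
    for rate in (0.08, 0.12, 0.16):
        print(f"{rate:.0%}: ${monthly_payment(100_000, rate, 30):,.2f}")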

PCs and spreadsheets started to show up in corporate offices, sometimes to the chagrin of the IT staff. The IBM PC legitimized the individual computer in the workplace, and new software applications emerged, including new spreadsheet applications such as Borland’s Quattro Pro. Spreadsheet capabilities increased dramatically through the 1980s, adding new functions from a wide range of disciplines, including accounting, engineering, operations management, and statistics. But it was the new processes of analyzing assets that allowed for the shift to a new era of spreadsheet capitalism.

Reaganomics’ emphasis on financial resurgence and the globalization of news opened channels for money-capital to flow more freely. It is no surprise that the digital spreadsheet ushered in the era of leveraged buyouts (LBOs) and the wide-scale privatization of public assets that characterized the late 1980s and the 1990s. Companies could be analyzed by “corporate raiders” and their assets separated into different categories or companies. Spreadsheets could determine NPV, and plans could be presented to investment bankers for short-term loans to purchase the company. Then certain assets could be sold off to pay down the loans and cash in big rewards.

Similarly, the assets of public agencies could be itemized, valued, and sold off or securitized and listed on stock exchanges. The “Third World Debt Crisis,” created by the oil shocks of the 1970s and the flood of Eurodollars made available to countries, created new incentives to find and sell off public assets to pay off government loans. This logic was applied to telecommunications companies worldwide.

Previously, PTTs (Post, Telephone, and Telegraph entities) were government-owned operations that provided relatively poor telephone services but returned profits to the nation’s treasury. The calculative rationality of the spreadsheet was quickly turned to analyzing the PTTs: summing the value of all the telephone poles, maintenance trucks, switches, and other assets. At first, these operations were turned into state-owned enterprises (SOEs), but they were eventually sold off to other companies or listed on share markets. By 2000, the top companies in most countries, in terms of market capitalization, were former PTTs transformed into privatized “telcos.”

World Trade Organization (WTO) agreements in 1996 and 1997 reduced tariffs on computers and other IT-related products. Together with the IMF, the WTO pressured countries to liberalize telecommunications and complete PTT privatization. In the US and other countries, the dot-com “bull run” was taking place, aided by a spreadsheet at WorldCom that projected the “doubling meme” – the continual fast growth of the Internet and all the technology associated with it.

By the late 1990s, these telcos were adopting the Internet Protocol (IP) technologies that enabled the World Wide Web. Cisco Systems and Juniper Networks were two companies instrumental in developing new switching and routing systems. While initially used by small Internet service providers (ISPs), these technologies soon allowed telcos to convert their PTT infrastructures into IP networks and dominate the broadband ISP markets.

A spreadsheet is a tool, but it is also a worldview – a reality rendered in categories, data sets, and numbers. As the world moved into the financialization and globalization of the post-oil-crisis Reagan era, the PC-based spreadsheet was forged into a powerful new “remediated” technology.

Was it responsible for a new era in capitalism, one in which combinations of media framed by computer windows guided and shaped perception? We now have Apple’s iWork Numbers, Google Sheets, and LibreOffice Calc, but Microsoft Excel is still the dominant spreadsheet. How has Microsoft repurposed and scaled Excel, particularly alongside Access and the SQL language? Excel was a foundational technology for an era of database technologies. And what about blockchain?

Capitalism is highly variable and subject to changes in regulations, legislation, and technologies, all of which can reshape the political economy and the flows of information and money. The spreadsheet was central to Reagan’s financial revolution, but also to the globalized world of the Internet. Digital spreadsheets became a new way of viewing and interacting with the world – not through transparent immediacy, but via a calculative rationality and hypermediated instrumentality that provided new perspectives and techniques to understand and shape the relationships between capital, innovation, and management.

Notes

[1] McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964. Print.
[2] Bolter, J. D, and Richard A. Grusin. Remediation: Understanding New Media. Cambridge, Mass: MIT Press, 1999. Print.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002 to 2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Internet Policy, Part 4: Obama and the Return of Net Neutrality, Temporarily

Posted on | March 26, 2021 | No Comments

The highly competitive Internet service provider (ISP) environment of the 1990s was significantly altered by the Federal Communications Commission (FCC) during the Bush Administration. Two Bush appointments to the FCC Chair guided ISP policy toward a more deregulated environment. The result, however, was a more oligopolistic market structure and less competition in the Internet space. Furthermore, these policies raised concerns that powerful ISPs could influence the flow of data through the Internet and discriminate against competing content providers to the detriment of consumers.

The FCC is an independent commission but can lean in political directions. Under the leadership of Michael Powell (January 22, 2001 – March 17, 2005), a Republican from Virginia and son of General Colin Powell, FCC decisions favored the cable companies. In the summer of 2005, the FCC, now led by Chairman Kevin J. Martin, a Republican from North Carolina (March 18, 2005 – January 19, 2009), made decisions that favored the telcos. The FCC classified cable modem services and then broadband services from telecommunications companies as Title I unregulated “information services.” This raised ongoing concerns that powerful ISPs could influence the flow and speed of data through the Internet and discriminate against competing content providers or users to the detriment of consumers.[1]

This post examines the Obama administration’s approach to Internet regulation and the issue of net neutrality. This involved reviving “Title II” regulation that works to guarantee the equal treatment of content throughout the Internet. Previously, I examined the legal and regulatory components of common carriage and the emergence of net neutrality as an enabling framework for Internet innovation and growth.

Comedian John Oliver explained net neutrality on his show Last Week Tonight, published on June 1, 2014.

The Internet’s political and social impact became more apparent with Barack Obama’s social media presidential campaign in 2008. The Pew Research Center found that some 74% of Internet users engaged with election information that year. Many citizens received news online, communicated with others about the elections, and received information from campaigns via email or other online sources.

In 2010, the Obama administration began to write new rules for Internet providers that would require ISPs to treat all traffic equally. In what were called the “Open Internet” rules, FCC Chairman Julius Genachowski, a Democrat from Washington, D.C. (June 29, 2009 – May 17, 2013), sought to restrict telecom providers from blocking or slowing down specific Internet services. Verizon sued the agency to overturn those rules in a case that was finally decided in early 2014. It determined that the FCC did not have the power to require ISPs to treat all traffic equally, given their new Title I designations. The judge was sympathetic to the consumer’s plight, though, and directed the ISPs to inform subscribers when they slow traffic or block services.

After the appeal by Verizon, the D.C. Circuit Court sent the FCC back to the drawing board. Judge David Tatel said that the FCC did not have the authority, under the current regulatory classifications, to treat telcos as “common carriers” that must pass data content through their networks without interference or preference. The result of Verizon v. FCC was that, without a new regulatory classification, the FCC lacked the authority to stop the big ISPs from banning or blocking legal websites, throttling or degrading traffic on the basis of content, and limiting “paid prioritization” of Internet services. The latter, the so-called “fast lanes” for companies like Google and Netflix, were particularly contentious.[2]

So, on November 10, 2014, President Obama went on the offensive and asked the FCC to “implement the strongest possible rules to protect net neutrality” and to stop oligopolistic ISPs from blocking, slowing down, or otherwise discriminating against lawful content. Tom Wheeler, the incoming FCC Chairman, from California (November 4, 2013 – January 20, 2017), found the needed classification in the legacy of the Communications Act of 1934 by invoking its Title II “common carrier” distinctions for broadband providers.

To its credit, the FCC had been extremely helpful in creating data communications networks in the past. The FCC’s classification of data services in Computer I as “online” services and not “communications” provided timely benefits. For example, it allowed early PCs with modems to connect to ISPs over telephone lines for hours without paying toll charges to the providers of local telephone service. But with a competitive Internet at stake, handing deregulated broadband capabilities to the telcos seemed excessive.

“Information services” under Title I is a more deregulatory classification that allows the telcos to impose more control over the Internet. Title I refers to “the offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications.” As mentioned previously, under George W. Bush’s FCC, cable companies in 2002 and then telcos in 2005 were classified as Title I information services. This led to a major consolidation of US broadband service, which came to be dominated by large integrated service providers such as AT&T, Comcast, Sprint, and Verizon. These companies began trying to merge with content providers, raising the specter of monolithic companies controlling information and invading privacy.

On February 26, 2015, the FCC adopted its new “Open Internet” rules, based on Title II of the Communications Act of 1934 and Section 706 of the Telecommunications Act of 1996. The latter gave the FCC authority to regulate broadband networks, including imposing net neutrality rules on Internet service providers. Section 706 directs the FCC and state utility commissions to encourage the deployment of advanced telecommunications capability to all Americans by removing barriers to infrastructure investment and promoting competition in local telecommunications markets.

But Section 706 authority only kicks in when the FCC finds that “advanced telecommunications capability” is “not being deployed to all Americans in a reasonable and timely fashion.”

In other words, the case needs to be made that the US Internet infrastructure is lacking. For example, the FCC established 25 Mbps download/3 Mbps upload as the new standard for “advanced telecommunications capability” for residential service. This is actually a fairly low benchmark for urban broadband users, as only 8% of America’s city dwellers lack access to that level of service. But that still left some 55 million Americans behind, as rural areas, and especially tribal lands, were largely underserved.

In early 2015, President Obama began to direct attention toward broadband access. Consequently, Chairman Wheeler announced that the FCC’s Connect America Fund would disburse $11 billion to support modernizing Internet infrastructure in rural areas. The FCC also reformed the E-rate program to support fiber deployment and Wi-Fi service for the nation’s schools and libraries.[3]

The Open Internet rules were meant to protect the free flow of content and promote innovation and investment in America’s broadband networks. They were grounded in multiple sources of authority, including Title II of the Communications Act of 1934 and Section 706 of the Telecommunications Act of 1996. In addition to protecting consumers by restricting the blocking, throttling, and paid prioritization of Internet services, the FCC strove to promote competition by ensuring that all broadband providers had access to the poles and conduits of the physical plant.

The rules did not require providers to get the FCC’s permission to offer new rate plans or launch new services. Nor did they require companies to lease access to their networks, a key earlier provision that had promoted ISP competition, or submit to close monitoring of interconnection complaints. A key dilemma was promoting the ubiquity of the Internet while exempting broadband customers from universal service fees.

The election of Donald Trump presented new challenges to net neutrality and the potential for another reversal. Tom Wheeler resigned from the FCC, allowing Trump to shift the commission toward a Republican majority. The new FCC voted 3–2 to begin eliminating Obama’s net neutrality rules and to reclassify home and mobile broadband service providers as Title I information services. The new FCC Chairman, Ajit Pai, argued that the web was too competitive to regulate effectively and that throttling some web applications and services might even help Internet users. The FCC began seeking comments on eliminating the Title II classification. Replacing the Obama net neutrality rules was put to a vote by the end of the year, and the FCC once again returned to Title I deregulation through a declaratory ruling.

Notes

[1] Ross, B. L., Shumate, B. A., & Rein, W. (2014, June 25). Regulating Broadband Under Title II? Not So Fast. Bloomberg BNA.
[2] Finley, K. (2017, May 18). Internet Providers Insist They Love Net Neutrality. Seriously? Wired.
[3] Benton Foundation. (2015, February 13). What Section 706 Means for Net Neutrality, Municipal Networks, and Universal Broadband.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002 to 2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
