ARPA and the Formation of the Modern Computer Industry, Part I: Transforming SAGE
Posted on | September 12, 2021 | No Comments
In response to the Soviet Sputnik satellites launched in late 1957, US President Dwight D. Eisenhower formed the Advanced Research Projects Agency (ARPA) within the Department of Defense (DoD). As the former Supreme Commander of the Allied forces during D-Day and the invasion of the European theater, he was all too aware of the problems facing the military in a technology-intensive era. ARPA was created, in part, to research and develop high technology for the military and to bridge the divides between the Air Force, Army, Marines, and Navy.
Under pressure because of the USSR’s continuous rocket launches, the Republican President set up ARPA despite considerable Congressional and military dissent. Although it scaled back some of its original goals, ARPA went on to subsidize the creation of computer science departments throughout the country, funded the Internet, and consistently supported projects that enhanced human/computer interactivity.
Forming ARPA
Headquartered in the Pentagon, ARPA was established to develop the US lead in science and technology applicable to the military and to help it respond quickly to new challenges. Eisenhower harbored suspicions about the military and its industrial connections. However, he did believe in basic research and appointed a man with similar notions, Neil McElroy, the head of Procter & Gamble, as his Secretary of Defense. McElroy pushed his vision of a “single manager” for all military-related research through Congress. Despite objections by the heads of the various armed forces, Eisenhower sent a request to Congress on January 7, 1958, for startup funds to create ARPA and appointed as its director a vice-president from General Electric. Shortly after, Congress appropriated funds for ARPA as a line item in an Air Force appropriations bill.[1]
Roy Johnson came to head ARPA from GE, dreaming of human-crewed space stations, military moon bases, orbital weapons systems, global surveillance satellites, and geostationary communications satellites. But by the end of ARPA’s first year, Eisenhower had established NASA, dashing Johnson’s space fantasies. Space projects moved to the new civilian agency or back to the individual military services, including the covert ones like those of the CIA’s spy planes and satellites. ARPA desperately searched for a new mission and argued effectively for going into “basic research” areas that were considered too “far out” for the other services and agencies.
With the Kennedy Administration taking office and its appeal for the nation’s “best and brightest” to enter government service, ARPA found its prospects improving. It looked aggressively for talent to develop the best new technologies. Behavioral research, command and control, missile defense, and nuclear test detection were some of the newest projects taken on by ARPA, although not necessarily “basic” research. The new agency also got increasingly involved with computers, especially after Joseph Carl Robnett “JCR” Licklider joined the staff in October 1962.[2]
ARPA’s Information Processing Techniques Office (IPTO)
The IPTO emerged in the early 1960s with the charge of supporting the nation’s advanced computing and networking projects. Initially called the Office of Command and Control Research, its mandate was to take the knowledge gained from researching and developing the multi-billion-dollar SAGE (Semi-Automatic Ground Environment) project and extend it to other command and control systems for the military.
SAGE was a joint project by MIT and IBM with the military to computerize and network the nation’s air defense system. It linked a wide array of radar and other sensing equipment throughout Canada and the US to what would become the Colorado-based NORAD headquarters. SAGE was meant to detect hostile aircraft, chiefly bombers, coming over the Arctic to drop nuclear bombs on Canada and the US (ICBMs would later present a different kind of threat). The “semi-automatic” in SAGE meant that humans would remain a crucial component of the air defense system, and that provided an opening for Licklider’s ideas.
SAGE consisted of some 50 computer systems located throughout North America. Although each was a 250-ton monster, the SAGE computers had many innovations that further sparked the dream of man-machine interactivity. These included data communications over telephone lines, cathode-ray terminals to display incoming data, and light pens to pinpoint potentially hostile aircraft on the screen. ARPA’s IPTO helped transform SAGE innovations into the modern IT environment.
From Batch to Timesharing
Throughout the 1960s, three directors at IPTO poured millions of dollars into projects that created the field of computer science and got computers “talking” to people and to each other. Licklider had the Office of Command and Control Research changed to Information Processing Techniques Office (IPTO) when he moved from BBN to ARPA to become its first director. Licklider was also from MIT, but what made him unusual was that he was a psychologist amongst a majority of engineers. He got his Ph.D. from the University of Rochester in 1942 and lectured at Harvard University before working with the Air Force. Foremost on his agenda was to encourage the transition from “batch processing” to a new system called “timesharing” to promote a more real-time experience with computers, or at least a delay measured in seconds rather than hours or days.
These new developments opened opportunities for new directions, and Licklider would provide the guidance and the government’s cash. During the mid-1950s, Licklider worked on the SAGE project, focusing mainly on the “human-factors design of radar console displays.”[3] From 1959 to 1962, he was a vice-president at BBN, overseeing engineering, information systems, and psycho-acoustics projects. He was also involved in one of the first time-sharing experiments at BBN, on a DEC PDP-1, before taking a leave of absence to join ARPA for a year.[4]
Licklider swiftly moved IPTO’s agenda towards increasing the interactivity of computers by stressing Vannevar Bush’s ideas and the notion of a more personal and interactive computing experience. An influential military project at MIT was the TX-2, one of the first computers built with transistors and a predecessor to the PDP line of computers. It also had a graphics display, unlike most computers of the day, which relied on punch cards or a teletypewriter. The TX-2 was located at MIT’s Lincoln Laboratory and had a major influence on Licklider. The brilliant psychologist would ride the waves of Cold War grant monies and champion research and development for man-machine interactivity, including a radical new computer-communications technology called timesharing.
Early computer users submitted their requests and punch cards to a receptionist at a computer center. Then a team of computer operators would run several (or a “batch”) of these programs at a time. The results were usually picked up a day or two after submitting the requests. After Bell Labs developed transistor technology, individual transistors were wired into circuit boards, creating the “second generation” of computers. This new technology allowed vacuum tubes to be replaced by smaller, cheaper, and more reliable components and produced an exciting increase in processing speeds. Faster technology eventually led to machines that could handle several different computing jobs at one time – timesharing.
Time-sharing allowed several users to share a single computer by taking advantage of its increasing processing speed. It also enhanced computer communications by allowing users to connect via teletypes and later cathode-ray terminals. Rather than punching out programs on stacks of paper cards and submitting them for eventual processing, time-sharing made computing a more personal experience by making it immediately interactive. Users could interact with a large mainframe computer via teletypewriters originally used for telex communications and the cathode-ray tubes used for televisions.
Timesharing emerged from the MIT environment and its support by the US government. The procedures used for timesharing originated at MIT after it received an IBM 704 in 1957; IBM had also built the AN/FSQ-7 computers developed for SAGE. John McCarthy, a Sloan Fellow who had come from Dartmouth, recognized the possibilities of sharing the computer’s capabilities among several users. As the keyboard replaced punch cards and magnetic-tape-to-magnetic-tape transfer as the primary means of data entry, it became easier for the new computers to switch their attention among various users.[5]
As its human users paused to think or look up new information, the computer could handle the requests of other users. Licklider pressed the notion of timesharing to increase the machine’s interactivity with humans, but the rather grandiose vision would not be immediately accepted throughout the military-related sphere of ARPA. Timesharing was still in a relatively primitive state in the early 1960s, but ARPA would soon be won over.
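The core mechanic is easy to sketch. The toy Python script below is a hypothetical illustration (not a reconstruction of any actual 1960s monitor program): it models a time-sharing supervisor as a round-robin loop that gives each user’s job one short slice of attention and moves on, which is how a single machine could serve several people who spend most of their time thinking or typing.

```python
from collections import deque

# Each "user job" is modeled as a generator that yields after every short
# burst of work, handing control back to the supervisor.
def user_job(name, steps):
    for i in range(steps):
        # ... a little computation happens here during this time slice ...
        yield f"{name}: finished step {i + 1} of {steps}"

def round_robin(jobs):
    """Give each job one short time slice in turn (the essence of time-sharing)."""
    queue = deque(jobs)
    while queue:
        job = queue.popleft()
        try:
            print(next(job))      # run one slice of this user's work
            queue.append(job)     # requeue it; others get the machine meanwhile
        except StopIteration:
            pass                  # job finished; drop it from the queue

round_robin([user_job("alice", 3), user_job("bob", 2), user_job("carol", 4)])
```

Running it interleaves the three users’ output line by line, the same effect that made a batch mainframe feel like a personal, interactive machine to each terminal user.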
First on Licklider’s list was Systems Development Corporation (SDC), a RAND spin-off that had done most of the programming for the SAGE project. ARPA had inherited SDC, and a major part of the IPTO budget was set aside to help the company transition from the SAGE air defense project to command and control computing. SDC had been given one of SAGE’s AN/FSQ-32 mainframes, but to Licklider’s chagrin, it was used for batch processing. Licklider thought it ridiculous to use it in this manner, where responses often took hours or even days to help a commander react to battle situations.[6] He immediately went to work persuading SDC to switch from batch processing to time-sharing, including bringing in allies such as Marvin Minsky to give seminars and cajole SDC.
Soon SDC was convinced, and Licklider moved on to other time-sharing projects, pouring ARPA money into like-minded efforts at MIT and Carnegie Mellon. Luckily, he had joined ARPA the same month as the Cuban Missile Crisis. The event raised concerns about the ability of the President and others high in the chain of command to get effective information. In fact, Kennedy had been pushing for better command and control support in the budget, reflecting his concerns about being the Commander-in-Chief of a major nuclear power.
In the next part I will examine timesharing and the first attempts to commercialize it as a utility.
Notes
[1] Background on ARPA from Hafner, K. and Lyon, M. (1998) Where Wizards Stay Up Late. New York: Touchstone. pp. 20-27.
[2] A much more detailed version of these events can be found in a chapter called “The Fastest Million Dollars,” in Hafner, K. and Lyon, M. (1998) Where Wizards Stay Up Late. New York: Touchstone. pp. 11-42.
[3] Information on Licklider’s involvement with SAGE from Campbell-Kelly, M. and Aspray, W. (1996) Computer: A History of the Information Machine. Basic Books, pp. 212-213.
[4] Information on JCR Licklider’s background at BBN from the (2002) Computing Encyclopedia Volume 5: People. Smart Computing Reference Series.
[5] Evans, B.O. “Computers and Communications” in Dertouzos, M.L. and Moses, J.(1979) The Computer Age: A Twenty Year View. Cambridge, Massachusetts: The MIT Press. p. 344.
[6] A good investigative job on Licklider and SDC was done by Waldrop, M. Mitchell (2001) The Dream Machine: J.C.R. Licklider and the Revolution that Made Computing Personal. New York: The Penguin Group.
Tags: AN/FSQ-7 > AN/FSQ-32 > ARPA > batch processing > J.C.R. Licklider > SAGE (Semi-Automatic Ground Environment) > Systems Development Corporation (SDC)
ICT and Sustainable Development: Some Origins
Posted on | August 17, 2021 | No Comments
I teach a course called ICT for Sustainable Development (ICT4SD) every year. It refers to information and communications technologies (ICT) enlisted in the service of cities, communities, and countries to help them be economically and environmentally healthy. An important consideration for sustainability is that these efforts do not compromise the conditions and resources that future generations will need. Sustainable Development (SD) is an offshoot of traditional “development,” which dealt primarily with national economies organizing to “take off” into westernized, pro-growth, industrial scenarios, with some consideration of the colonial vestiges they needed to overcome.
While development was also cognizant of the need to support agriculture, education, governance, and health activities, SD put a major focus on related environmental issues and social justice (see Heeks). SD has been embraced by the United Nations (UN), which set out seventeen Sustainable Development Goals (SDGs) adopted by all UN organizations in 2015.
In this post, I briefly introduce ICT4D and its connection to SD, how it emerged, and why it is beneficial. Of particular importance are the economic benefits of ICT and how they carry over to the renewable energies so crucial to sustainable development.
ICT was not well understood by development economists and was largely ignored by funding agencies, except for telephone infrastructure. Literacy and education were early concerns. Book production, radio, and then television sets were monitored as crucial indicators of development progress. Telephones and telegraphs helped transact business over longer distances but were installed and managed by government agencies known as Post, Telephone, and Telegraph (PTT) administrations. PTTs found funding difficult and were challenging to manage, given their technical complexity and enormous geographical scope. Satellites were used in some countries, like India and Indonesia, and facilitated better mass communications as well as distance education and disaster management.
Most of the economic focus in “developing countries” was on the extraction and growing of various commodities, utilizing low-cost labor for manufacturing, or adding to the production processes of global supply chains. It was only when television and films became important domestic industries that “information products” were recognized economically in the development process.
New dynamics were introduced to development and economic processes with computerization and ICTs. I began my career as an intern on a National Computerization Policy program at the East-West Center in Honolulu, Hawaii. Inspired by the Nora-Minc Report in France, the program was part of the overall emphasis on development at the Center’s Communications Institute. I had an office next to Wilbur Schramm, one of the most influential development pioneers with his Mass Media and National Development: The Role of Information in the Developing Countries (1964).[1]
With my mentor, Syed Rahim, I co-authored Computerization and Development in Southeast Asia (1987), which serves as a benchmark study in understanding the role of ICT in development. One objective of the book was to study the mainframe computers that were implemented, starting in the mid-1960s, for development activities. These “large” computers, some with merely 14K of RAM, were installed in many government agencies dealing with development activities: agriculture, education, health, and some statistical organizations. We also looked at the narratives being created to talk about computerization at that time. For example, the term “Information Society” was becoming popular. Also, with the rise of the “microcomputer” or personal computer (PC), the idea of computer technology empowering individuals was diffusing through advertisements and other media.
Information economics opened up some interesting avenues for ICT4D and sustainable development. Initially, it was concerned with measuring different industrial sectors and how many people were employed in each area, such as agriculture, manufacturing, information, and services. Fritz Machlup wrote The Production and Distribution of Knowledge in the United States (1962), which showed that information goods and services accounted for nearly 30 percent of the U.S. gross national product. A major contributor to information economics, he concluded that the “knowledge industry” employed 43 percent of the civilian labor force.
Machlup was also a student of Ludwig von Mises, a central figure of the so-called “Austrian School of Economics.” But Machlup was soon overshadowed by fellow Austrian Friedrich von Hayek, by Milton Friedman of the allied Chicago School, and by the resurgence of von Mises himself. While this debate was aimed primarily at mainstream Keynesian economics, it was also significant for development studies, as these economists saw government activities as running counter to the dynamics of the market. The main nemesis of the Austrian school was socialism and government planning. While most developing countries were not communist, the Cold War was a significant issue playing out in countries worldwide.
The Austrian movement had a significant impact in the 1970s and 1980s. Transactions in the economy were seen as knowledge-producing activities, and these economists focused on the use of prices as communication or signaling devices in the economy. This led to a new emphasis on markets, and Hayek and Friedman both received Nobel Prizes for their work.
For context, President Nixon had taken the US off the gold standard in August 1971, and the value of the US dollar dropped sharply. But currency markets were then free to operate on market principles. It was also a time when the microprocessor was invented and computers were becoming more prominent. In 1973, Reuters set up its Monitor Money Rates service, the first virtual market for foreign exchange transactions. It used computer terminals to display news and currency prices and charged banks both to subscribe to the prices and to post them. With the help of the Group of 5 nations, it brought order to international financial markets, especially after the Arab-Israeli War broke out in late 1973. The volatility of the war ensured the economic success of the Reuters technology, and currency markets have been digitally linked ever since.
Many development theorists by that time were becoming frustrated by the slow progress of capitalism in the “Third World.” Although the Middle East war was short, it resulted in rising oil prices around the world. This was a major strain on developing countries that had bought into mechanized development and the “Green Revolution” of the 1960s, which emphasized petroleum-based fertilizers and pesticides. The Arab members of the Organization of Petroleum Exporting Countries (OPEC) began an embargo of Western countries for their support of Israel, which refused to withdraw from the occupied territories. Oil prices increased by 70 percent, and the US suffered additional setbacks as it wound down the war in Vietnam and inflation raged.
A split occurred between traditional development studies and market fundamentalists. British Prime Minister Margaret Thatcher and US President Ronald Reagan were strong advocates of the Austrian School. Both had been taken by Hayek’s The Road to Serfdom (1944) and stressed a pro-market approach to development economics. The IMF was mobilized to pressure countries to undergo “structural adjustment” towards more market-oriented approaches to economic development. The PTTs were a primary target: they were turned into state-owned enterprises (SOEs), with parts sold off to domestic and international investors.
Researchers began to focus on the characteristics or “nature” of information. As economies became more dependent on information, more scholarship was conducted. It became understood that information is not diminished by use or by sharing, though its value certainly varies, often with time. The ability to easily share information by email and FTP created interest in network effects and the viral diffusion of information.
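One back-of-the-envelope way to see why network effects drew so much interest is Metcalfe’s often-cited heuristic, under which the number of possible connections grows roughly with the square of the number of users. This is a rough illustration, not a claim drawn from the sources cited here.

```python
def possible_links(n_users: int) -> int:
    """Metcalfe-style count of distinct pairwise connections among n users."""
    return n_users * (n_users - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> {possible_links(n):>12,} possible links")
```

Ten users have 45 possible links; ten thousand users have nearly fifty million, which is why each additional connected user makes a network disproportionately more valuable.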
These characteristics of information became particularly important after the development of the Internet, which quickly globalized. Vice-President Gore’s Global Information Infrastructure (GII) became the foundation for the World Trade Organization’s Information Technology Agreement (ITA) and the privatization of telecommunications services. Tariffs on information and communications technologies decreased significantly. Countries that had gotten into debt in the 1970s were pressured into selling off their telecommunications infrastructure to private interests, and these networks quickly adopted TCP/IP, the Internet protocols.
Other studies focused on efficiencies of production brought on by science and technology, specifically reducing the marginal costs of producing additional units of a product. Marginal costs have been a major issue in media economics because electronic and then digital technologies have allowed the increasing efficiency of producing these types of products. Media products have historically had high production costs, but decreasing marginal costs on the “manufacture” or reproduction of each additional unit of that product.
If we start with books, for example, we know it is time-consuming to write a book, and the first physical copies are likely to be expensive, especially if only a small number are printed. As traditional economies of scale are applied, each additional copy becomes cheaper to produce. Electronic copies in particular have become very cheap to produce and to distribute through the Internet, although that hasn’t necessarily resulted in major price decreases.
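A toy calculation makes the point (the figures below are invented purely for illustration): with a large fixed cost of authoring and a near-zero marginal cost of reproduction, the average cost per copy falls toward the marginal cost as the print or download run grows.

```python
def average_cost(fixed_cost: float, marginal_cost: float, copies: int) -> float:
    """Average cost per copy = (fixed cost + marginal cost * copies) / copies."""
    return (fixed_cost + marginal_cost * copies) / copies

FIXED = 50_000.00   # hypothetical cost of writing, editing, and typesetting a book
MARGINAL = 0.05     # hypothetical cost of delivering one more electronic copy

for n in (100, 1_000, 10_000, 1_000_000):
    print(f"{n:>9,} copies -> average cost ${average_cost(FIXED, MARGINAL, n):,.2f}")
```

At 100 copies each one carries roughly $500 of cost; at a million copies the average is a nickel plus a few cents of overhead, which is the economic shape of media and digital products generally.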
Digital outputs are peculiar economic products. They have unusual characteristics that make it difficult to exclude people from using them, and they are not used up in consumption. Microsoft faced this problem in the early days of the microcomputer, when it famously criticized computer hobbyists for sharing copies of its software. Later, its investment in the MS-DOS operating system and subsequently Windows paid off handsomely when it was able to sell them at enormous margins for IBM PCs and then “IBM compatibles” from companies such as Acer, Compaq, and Dell. That is how Bill Gates became the richest man in the world (or one of them).
The issue of marginal costs has resonated with me for a long time, due to my work on media economics and what economists call “public goods.” In some of my previous posts, I addressed the taxonomy of goods based on key economic characteristics. Public goods, such as digital and media products, are misbehaving economic goods in that they are not used up in consumption and are difficult to exclude from use. These writings examined what kinds of products are conducive to reduced marginal costs and what social systems are conducive to managing these different types of goods. Originally, the focus was on media products like film, radio, and television, and then on digital products like games and operating systems. Will these efficiencies apply to sustainable development?
Can the economics of media products apply to other products? More recently, sustainable technologies like solar and wind have been examined for their near-zero marginal costs. A major voice on this topic is Jeremy Rifkin, who is most noted for his book The Third Industrial Revolution (2011), which refers to the importance of concurrent communications, energy, and transportation transitions. We are moving from an integrated political economy based on telephone/telex communications and carbon-combustion energy and transportation to one based on digital communications and clean energy. Two other books by Rifkin, The Zero Marginal Cost Society and The Green New Deal, are significant points of departure for sustainable development.
Sustainable development initiatives by definition look to economize and reduce costs for the future. It is therefore important to analyze the characteristics of economic goods and their social implications; this level of understanding informs both market structure and the appropriate types of regulation.
ICT4D has struggled to claim a strong narrative and research stake in the trajectory of development. The Earth Institute’s ICTs for SDGs: Final Report: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals (2015) and the World Bank’s (2016) World Development Report were significant boosts for ICT4D, especially for economic development, and the move towards sustainable development.
Citation APA (7th Edition)
Pennings, A.J. (2021, Aug 21) ICT and Sustainable Development: Some Origins. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/ict-and-sustainable-development-marginal-costs/
Notes
[1] Mass Media and National Development: The Role of Information in the Developing Countries. Stanford University Press. 1964.
[2] Sachs J et al (2016) ICT & SDGs: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals. The Earth Institute: Columbia University. Accessed at https://www.ericsson.com/assets/local/about-ericsson/sustainability-and-corporate-responsibility/documents/ict-sdg.pdf. 15 Jan 2019
Tags: computerization > marginal costs > Network effects > Nora-Minc Report > Wilbur Schramm
Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains
Posted on | August 12, 2021 | No Comments
Are we ready for the age of diamonds? Instead of mining gold or even Bitcoin for a currency base, why not use diamonds from captured carbon? And why not use recaptured carbon from the atmosphere via hemp? Instead of “conflict diamonds,” why not have “climate diamonds?”
Can we move to a diamond standard? Can diamonds replace gold and back other currencies? Can this be done in an economical and sustainable process? This post examines the processes of biological carbon sequestration and the manufacture of diamonds that can be used in a fintech environment. While I’m generally ok with the prospects of fiat money, if a hard money alternative emerged that was dynamic and contributed to climate security, why not try it?
The process of creating “industrial diamonds” is well established and has produced impressive results. Diamonds can be “grown” by using small ones as seeds and adding to the crystalline structure in two ways. One uses superheated, pressurized “greenhouses” filled with methane and hydrogen. The other is CVD (chemical vapor deposition), a low-pressure, vacuum-based process that uses heat or microwaves to bond carbon from carbon-rich gases to the diamond seeds.
I wrote my Ph.D. dissertation about money and standards, so I’ve been thinking about this topic a lot. In Symbolic Economies and the Politics of Global Cyberspaces (1993), I examined the social forces that drive us to use general equivalents like money and the forces that establish monetary standardization in a digital environment. So, I’m not entirely convinced of my argument so far, but I want to consider available systems of regenerative agriculture and manufacturing that can be mobilized for making climate diamonds and tie them into newer generation cryptocurrency developments.
I was intrigued by a video from “Have a Think.” It presents a compelling industrial scenario for growing and using hemp to sequester CO2 from the air and produce several very valuable byproducts, one of which can be transformed into sparkling diamonds. Hemp is controversial due to its connection with THC, a psychoactive substance, but that is only present in certain strains. Hemp has a rich history and was particularly important for the ropes needed by sea-faring ships that relied entirely on wind.
Hemp is a dynamic plant that grows quickly. In the process, it can produce several types of industrial products, including lubricants for cars and wind turbines as well as ingredients for cosmetics, soaps, and printer ink. Its fibers can be made into products like cloth, paper, and rope. The seeds have nutritional benefits and may be a replacement for soy proteins. The plant can also be processed into biochar, a type of charcoal used for fertilizers, graphene products, and the manufactured diamonds discussed in this post.
Agriculture is going through a technological transformation, with increased use of big data, hydroponics, and robotics. Hydroponics is a way of growing plants in a water-based medium in a protective environment. Hemp farms can remove significant amounts of carbon dioxide (CO2) from the air and produce the oils and fibers mentioned above in a clean and economical way. Biological carbon sequestering is probably better than geologic sequestration, which injects carbon into underground porous rock formations. Both may be necessary for reducing climate threats, but sequestration is arguably better done by plants that can produce many valuable byproducts.
Can we create a new monetary standard based on climate diamonds? Is it feasible? Is that something we want to do? Globally, we have been on an information standard since the 1970s, anchored by the strength of the US dollar and hedged by multiple financial instruments worldwide. The new diamond market will likely grow within the cryptocurrency environment.
Much of this depends on the future of cryptocurrency platforms and the digital ledger systems emerging in new generations. Cardano’s blockchain platform, for example, is evolving to create a peer-to-peer transactional system to trade many types of value, such as labor and portions of investment vehicles like houses, art, and so on. Imagine, for example, going to Home Depot and buying your gardening supplies with informational “deeds” to fractions of your car, your house, your future work, or your diamonds. Diamonds are likely to be another “value” in a chain of intersecting commodities classified on blockchains with dynamic pricing and smart contracts.
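The underlying idea of fractional “deeds” can be sketched without reference to any particular blockchain. The `FractionalAsset` class, asset names, and quantities below are hypothetical, invented only for illustration; real platforms such as Cardano would implement this with on-chain smart contracts and tokens rather than a Python object.

```python
from dataclasses import dataclass, field

@dataclass
class FractionalAsset:
    """Toy model of an asset split into transferable fractional 'deeds'."""
    name: str
    total_fractions: int
    holdings: dict = field(default_factory=dict)  # owner -> number of fractions held

    def issue(self, owner: str) -> None:
        """Assign all fractions of the asset to its initial owner."""
        self.holdings[owner] = self.total_fractions

    def transfer(self, seller: str, buyer: str, fractions: int) -> None:
        """Move fractions between owners, refusing overdrafts."""
        if self.holdings.get(seller, 0) < fractions:
            raise ValueError("seller does not hold enough fractions")
        self.holdings[seller] -= fractions
        self.holdings[buyer] = self.holdings.get(buyer, 0) + fractions

# Hypothetical example: paying a retailer with 1% of a house deed
house = FractionalAsset("123 Main St deed", total_fractions=10_000)
house.issue("alice")
house.transfer("alice", "home_depot", fractions=100)   # 100 / 10,000 = 1%
print(house.holdings)
```

A smart contract would add the parts this sketch leaves out: dynamic pricing, settlement, and the cryptographic guarantees that make the ledger trustworthy between strangers.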
Diamonds have utility based on their beauty but also their durability and strength. Most notable is the sparkling effervescence that makes diamond jewelry a treasured symbol of relationship and commitment. Their crystalline structure is used in high-tech products like audio speakers because it can vibrate rapidly without deforming, preserving sound quality. Their high heat and voltage tolerance makes diamond-based microprocessors an increasingly viable component of digital technologies.
Their hardness also makes them extremely valuable for drill bits. They have practical uses in delicate drilling for art, dentistry, and manufacturing. With a melting point of around 3550 degrees Celsius, they have the durability to drill industrial metals, geothermal wells, and underground tunnels.
Diamonds can also be money. They are portable, durable, measurable, and difficult to counterfeit. For diamonds, size matters, although color and clarity matter as well. Bigger is better, and they pack far more value per unit of weight than gold. Gold also has higher storage costs that quickly eat up any gains from its appreciation in price. Granted, neither diamonds nor gold are generally used as currency, primarily because they lack acceptability by merchants and interoperability between financial systems. Try cashing in your diamonds at Home Depot.
That is why the cryptocurrency environment is the most likely solution to that problem, barring an economic collapse. While the dollar is going digital, officially known as a CBDC (Central Bank Digital Currency), it will not be replaced by Bitcoin or other cryptocurrencies like Ethereum and Litecoin. Instead, other “values” will line up in relation to the dollar. The Fed will regulate but protect cryptocurrencies because it knows that the financial system wants to trade anything, anywhere, anytime. So cryptocurrencies will survive, and diamonds will find their place within their ethereal block-chained cyberspace. In the future, who knows?
This is an exploratory essay stimulated by the “Have a Think” video but also shaped by my interest in fintech, monetary policy, and cryptocurrencies. The reintroduction of hemp in the modern economy and its potential for absorbing carbon dioxide can be a powerful addition to the global economy. It’s not entirely about taking the CO2 out of the air and bonding the carbon to diamonds. Rather, I am hopeful that the green manufacturing of diamonds will help incentivize and stimulate an industrial process with multiple benefits for the economy and the environment.
Citation APA (7th Edition)
Pennings, A.J. (2021, Aug 12). Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains apennings.com https://apennings.com/dystopian-economies/electric-money/diamonds-are-a-worlds-best-friend/
Tags: Chemical Vapor Deposition (CVD) > climate diamonds > conflict diamonds > hemp > hydroponics > industrial diamonds
Modern Monetary “Practice” and the Fear of Inflation in a Low-Supply Economy
Posted on | August 1, 2021 | No Comments
One of America’s most potent political myths is that you, as a citizen, pay for government spending. People talk about paying for this or that program, but what really happens is that Congress appropriates the money for spending. Then the Treasury instructs the Federal Reserve Bank to credit the spending accounts. Taxes and borrowing are separate entries. The issue is gaining scrutiny as the US economy reconciles 2020’s economic output slowdown due to COVID-19 in the context of record government spending.
A relatively new area of economic analysis, called Modern Monetary Theory (MMT), emerged from practitioners in the finance industry telling a different story about government spending. It is worth examining because it is grounded in the practices of financial traders, particularly those who work with government bonds. Warren Mosler was instrumental in formulating MMT and shedding light on the actual processes involved in government spending. Thus my emphasis on Modern Monetary “Practice”: it starts with this description of the spending process, which allows us to reframe its dynamics.
Spending should not be seen as a panacea for the economy. Spending can be wasteful and lead to inflation. Spending needs to be productive. The $28 trillion debt accumulated by May 2021 is worthy of monitoring, but what does it really mean? What are its implications?
Taxes are registered by government, yes, but it’s not like household economics. A household needs a breadwinner, someone to bring home the bacon, to load up the metaphors. Someone needs to have money to pay the bills. Governments operate under a different set of rules and responsibilities. They can print or mint minor amounts of money and use the Fed for larger quantities. Government provides the money for the economy to operate, and the incentive – taxes – to make people want to own it. Mosler argues that governments have a monopoly on their currency and the responsibility to get it into the economy, by spending, to enable markets to work.
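A stylized double-entry sketch of that description (a toy model for illustration only, not the Fed’s or Treasury’s actual accounting systems) makes the point that spending credits private accounts first, while taxes are a separate debit that removes money rather than “funding” anything.

```python
from collections import defaultdict

class ToyMonetarySystem:
    """Minimal ledger illustrating the MMT description of spending and taxing."""

    def __init__(self):
        self.bank_deposits = defaultdict(float)   # private-sector account balances
        self.net_money_issued = 0.0               # cumulative spending minus taxes

    def government_spend(self, recipient: str, amount: float) -> None:
        # Spending credits a private account; no prior "pot" of tax money is drawn down.
        self.bank_deposits[recipient] += amount
        self.net_money_issued += amount

    def collect_tax(self, payer: str, amount: float) -> None:
        # Taxing debits a private account; it removes money from circulation.
        self.bank_deposits[payer] -= amount
        self.net_money_issued -= amount

gov = ToyMonetarySystem()
gov.government_spend("contractor", 1_000.0)
gov.collect_tax("contractor", 300.0)
print(gov.bank_deposits["contractor"], gov.net_money_issued)  # 700.0 700.0
```

The point of the sketch is sequence, not bookkeeping detail: the credit comes first, and what is left after taxation is the net money the government has added to the economy.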
Central banks can purchase bonds and, quite frankly, whatever else they want to buy. The Fed traditionally bought only government treasuries but now regularly buys mortgage-backed securities in a process called quantitative easing. Ideally, it can sell these treasuries and securities to absorb money from the economy if it smells inflation. The banking sector also creates money when it loans money to consumers.
The Treasury auctions bonds, but do they really pay for spending? They essentially provide:
- Time deposits for investors;
- Hedge instruments for traders;
- Opportunities for foreign countries to keep their currencies cheap versus the dollar;
- A vehicle for the Fed to influence the money supply and coordinate interest rates.
Borrowing should be seen as a political strategy to keep the financial system secure, provide a stable hedge, and manage the dollar’s value.
So, rather than worrying about “paying” for something, US citizens should be active in deciding how taxes should be used in public policy. US policy should be designed to tax what it doesn’t want. Well, that isn’t going to be easy. But it is what democracy is about. Spending should also be determined by what will keep the US safe and secure. It should keep the economy productive while providing opportunities and avoiding excessive inflation.
This last point is important. Inflation is the primary limiting factor when it comes to spending, and it is a calculus between supply and “effective” demand. “Too many dollars chasing too few goods” is the standard explanation by economists. Spending is easier for a government to coordinate. A good example occurred during the COVID-19 pandemic, when the US government passed several emergency spending packages to support businesses and families, especially airlines hurt by the shutdown in travel. While the economy rebounded sharply due to the fiscal and monetary stimulus, slowdowns in production, disruptions of supply chains, and people staying at home caused a significant spike in inflation during early 2021.
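The “too many dollars chasing too few goods” intuition can be made concrete with the textbook quantity-of-money identity, MV = PQ. This is the conventional illustration with invented numbers, not a formulation MMT writers would necessarily endorse.

```python
def implied_price_level(money_supply: float, velocity: float, real_output: float) -> float:
    """Solve the quantity identity M * V = P * Q for the price level P."""
    return (money_supply * velocity) / real_output

# Hypothetical baseline economy
p0 = implied_price_level(money_supply=100.0, velocity=2.0, real_output=200.0)

# Stimulus raises spending power while supply-chain problems shrink real output
p1 = implied_price_level(money_supply=120.0, velocity=2.0, real_output=180.0)

print(f"price level rises by {100 * (p1 / p0 - 1):.1f}%")   # roughly 33%
```

Even in this crude arithmetic, the inflation comes from both sides at once: more dollars spent against fewer goods produced, which is exactly the pandemic combination described above.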
Serious inflation has been largely absent in the US since the turmoil that followed Nixon taking the dollar off the gold standard in the early 1970s. At that time, the dollar depreciated, and OPEC countries restricted oil production, seeking to drive up prices to make up for the diminishing value of the US greenback. Meanwhile, lacking banking systems that could absorb the windfall, partly because of Islamic restrictions on credit, they recycled US dollars through a global eurodollar system. Banks worldwide coordinated syndicated loans of these “petrodollars” for countries needing dollars for energy purchases and development projects. “Economic hit men” scoured the world and pressured countries to borrow the money, eventually creating what was called the “Third World Debt Crisis” in the early 1980s.
Since the 1980s, financialization and the commercialization of Cold War technologies, primarily information technologies, have created sufficient competition and disruption to keep prices down by increasing productivity and reducing labor and resource costs. Globalization also created new forms of interstate competition and cooperation, as supply chains supported innovation and higher-quality products. The US government also floated bonds internationally to countries like England and Japan, which strengthened the dollar and kept it the world’s reserve currency.
The COVID-19 pandemic presents unprecedented economic challenges, particularly with the 2021 resurgence of the delta variant. A rising stock market saw the DJIA hit 35,000 and the S&P 500 reach 4,395.26, with a total market capitalization of US$38.2 trillion. But concerns about inflation grew as the prices of commodities such as copper, lumber, and oil increased. A computer chip shortage also raised concerns about the production and pricing of cars, computers, and other goods that depend on microprocessors. But the Fed and others saw this phenomenon as “transitory,” citing disruption, demographics, debt, and productivity as factors that would reduce inflationary pressures.
So the economy looks to be at risk in late 2021. Will the practical application of MMT provide operational guidance for a new era of prosperity? Can infrastructure and climate change solutions provide sufficient returns on these investments? The big question is whether government spending for such programs can avoid significant inflationary pressures. With COVID, we are struggling with how to spend in a low-output economy.
Tags: inflation > Modern Monetary Theory (MMT) > petrodollars
Thomas Edison Builds the Universal Ticker-Tape Machine
Posted on | July 10, 2021 | No Comments
In previous posts, I described the importance of the telegraph and other electro-mechanical devices and their impact on the Civil War and the expansion of the US economy. Thomas Edison, the famous inventor, came to New York during the Black Friday financial crash in the autumn of 1869. He immediately became involved in the technological innovations that were changing Wall Street and introducing new ways to represent and understand financial transactions.[1]
Electricity was harnessed to expand finance’s reach throughout the country and across the seas. New networks of telegraphed information and news added layers of both certainty and volatility to Wall Street and to investors worldwide. Edison’s inventions and patents helped solidify these technological innovations and financial practices. They also provided the means for him to escape New York and set up his own laboratories and facilities for creative innovation and manufacturing.
A few weeks after Black Friday, Edison formed a partnership with Franklin L. Pope, a telegraph engineer who was also associated with Doctor Laws. Earlier in the year, Edison had conducted telegraph tests with Pope from Boston and Rochester, one reason he came to New York City. On October 1, 1869, Edison and Pope announced they were going to create both the Financial and Commercial Telegraph Company and the American Printing Telegraph Company.[2]
Their partnership came only a few months after the completion of the transcontinental railroad, as the US was expanding quickly westward, immigration was on the rise, and the economy was growing rapidly. They jointly filed several patents in printing telegraphy during this period. Two printer designs were for stock tickers to provide gold and stock quotations in the New York area. Another printer, called “The Pope and Edison Type-Printing Telegraph,” was meant to be used on “private lines.” These printers transmitted messages other than commercial quotations from easy-to-use terminals for individuals and business houses and, in a sense, carried the first “emails.” While their partnership was successful, it ended when the Gold and Stock Telegraph Company absorbed their businesses.
Like much of the managerial talent of the time, General Marshall Lefferts had served during the Civil War. After several years heading Western Union, he became the President of the Gold & Stock Telegraph Company. Very quickly, the company acquired the patents to Laws’ Gold Indicator and his Stock Printer, as well as the Calahan Stock Ticker.
In 1870, the Gold and Stock Company created the Exchange Telegraph Company with partners from London, England, to expand internationally. Investment capital was streaming into the US, primarily for railroad development, and telegraph messages and ticker prices were streaming across the Atlantic as well. Lefferts became a friend and mentor to the young Edison and both encouraged and funded his work.
Edison had gained extensive experience with the Laws Gold Indicator and the Calahan Stock Ticker and was in an ideal situation to develop a general stock ticker for mass production. By the end of the year, Edison had made a number of innovations and obtained patents for the Electrical Printing Instrument. Anxious to go into mass production, General Lefferts, as head of the Gold and Stock Telegraph Company, gave him $40,000 for the rights to his telegraphic and ticker inventions.[2] The event was depicted in the movie Edison the Man (1940), in which Spencer Tracy played the famous inventor.
His first months in New York City were an extraordinary time for Edison, who had barely reached his mid-twenties. In early 1870, with the money from his stock ticker inventions, he opened his first factory in nearby Newark, New Jersey, to manufacture gold and stock tickers. Edison also worked on telegraph instruments, including duplex and quadruplex designs that could send multiple electrical signals at the same time.
Another device he continued to work on allowed remote synchronization of tickers from the central station. If a ticker in a broker’s office went out of “unison” and began to print garbled results, it needed to be quickly reset. In the first years of the stock ticker, a runner would have to go to each client’s location and reset the ticker by hand. Every mechanism needed to be synchronized so that multiple machines printed the same information simultaneously. Also, demand for the tickers was spreading beyond New York, and it was crucial that the machines operated simply, without the need for mechanical troubleshooters. A resetting device had been invented by Henry Van Hoevenbergh, but it didn’t work very well. Edison tested a new design on the Laws Stock Printer during the spring of 1871, and he soon received a patent for his “screw-thread unison” innovation, which could reset the printing machines with an electrical signal. Subsequently, all stock tickers incorporated the unison device, contributing to the stock ticker’s rapid diffusion.[3]
As their tickers came to be used in many remote cities in the US and around the world, it was important to be able to fix them without relying on local skilled workers. Edison continued to make improvements to the stock ticker to create a reliable machine that could be marketed on a mass basis. The concept of interchangeable parts, already important for the production of clocks, guns, and sewing machines, was applied to the manufacture of tickers as well. Several variations of stock tickers were initially manufactured in small numbers, but in 1871, Edison constructed the Universal Stock Printer for New York’s Gold and Stock Telegraph Company.
The “Universal” was very dependable and could be manufactured in high volume. The New York Stock Exchange was a major beneficiary of the new stock tickers. These machines enabled a higher volume of trading and spread the reach of the Wall Street exchange. This allowed the NYSE to unseat smaller exchanges in other cities as people around the country could get price quotes transmitted directly to their stock tickers from New York. Over 5,000 of these devices were produced and used by investors around the world.
The ticker also got the attention of Western Union, which was envious of the Gold and Stock Telegraph Company’s monopoly on transmitting market information from the New York Stock Exchange. Another concern for the giant was that Gold and Stock was using Edison’s Universal Private Line Printers to send out financial information. Edison’s printer had only moderate success, though, while a faster printer from George M. Phelps gave Western Union a considerable technological edge. Along with a new stock ticker in 1870, later called the “Financial Instrument,” that was faster, more efficient, and more reliable, Western Union went on the offensive. When it threatened to enter the New York market with this new ticker, Gold and Stock arranged a merger in 1871 and took over the manufacturing and distribution of the stock ticker equipment.
Edison worked closely with “Mr. P” after the merger. Over the next few years, they would patent the Quadruplex, considered Edison’s major contribution to telegraphy. It allowed four simultaneous telegraph transmissions on a single conducting line and saved Western Union a considerable amount of money. Edison also helped with Phelps’ Electro-Motor Telegraph, which was ten years in development and based on an electro-motor and governor able to achieve speeds of up to 60 wpm. By 1875, Phelps’ transmitting apparatus allowed an operator to simultaneously transmit stock information to hundreds of different offices. It could also operate between New York and that other emerging financial center, Chicago, without using a single repeater.
The Universal Stock Ticker was Edison’s first commercial success and, in many ways, the source of his future success. The first 40 of his 1,093 patents filed with the US Patent Office came from his work with stock tickers and printing telegraphs. The success of his stock tickers provided capital and connections with wealthy investors that helped fund his many inventions, including perhaps his most outstanding achievement, the electric light bulb. In 1876, he moved to Menlo Park, New Jersey, where he set up his famous laboratory; the Black Maria movie studio came later, at his West Orange complex.[4]
Citation APA (7th Edition)
Pennings, A.J. (2021, Jul 21). Thomas Edison Builds the Universal Stock Ticker. apennings.com https://apennings.com/telegraphic-political-economy/thomas-edison-builds-the-universal-ticker-tape-machine/
Notes
[1] A very good account of the Black Friday events appears in the “On This Day” section of the New York Times website. See http://www.nytimes.com/learning/general/onthisday/harp/1016.htm. Accessed May 10, 2015.
[2] Anecdotal information on Edison’s timely circumstances on Wall Street from Edison: His Life and Inventions by Frank Lewis Dyer and Thomas Commerford Martin.
[3] The stocktickercompany website has very good information on the history of the stock indicators and ticker-tape machines.
[4] It has been difficult to trace the exact timing of Edison’s activities at the time. Ultimately, I decided to follow the patents. http://www.prc68.com/I/StkTckPat.shtml#List
Tags: Calahan Stock Ticker > Thomas Edison > Universal Stock Ticker
Show-Biz: The Televisual Re-mediation of the Modern Global Economy
Posted on | June 8, 2021 | No Comments
My use of “Show-Biz” refers to the meaning-making techniques of financial journalism and their relationship to the narratives that represent and drive the economy. Media industries “show” business and finance through various camera, editing, and special effects techniques, drawing in data from many sources and presenting them on different windows of high-resolution screens. These techniques create ways of seeing and showing the economy. Consequently, they influence public opinion as well as investment and trading strategies that shape global, local, and national economic activities and investment patterns.
This post concerns the televisual surveying systems that monitor and display global business and financial activities. It starts with a theory of media, called remediation, and then examines the different elements or media that are combined into the broadcast or streaming of financial news. Two key concepts, transparent immediacy and hypermediation, help us understand the way the media operates. These transmissions of mediated financial information have consequences for the global economy.
This is not to say that such representations are necessarily valid constructions of reality or distortions of truth. One of the central themes of this post is that strategies of visual mediation are intertwined with authentic experiences and facts and that strategies of interpretation and incredulity/skepticism are required.
Television news expanded significantly in the 1970s with the creation of cable systems and satellite networks. Several networks were dedicated to financial news. Cable and traditional TV combined when CNBC (Consumer News and Business Channel) was established in April 1989 as a joint venture between Cablevision and NBC. Bloomberg Television was launched in the United States on January 1, 1994, and drew on a decade of financial analytics provided through the famous “Bloomberg Box.” (See the image of my daughter pretending to use a Bloomberg Box.)
Another successful financial news network was Fox Business News, launched on October 15, 2007. Yahoo also emerged as a major financial news provider. Bought by Verizon in 2016, it attracted over 100 million global monthly visitors on average in 2019, according to media analytics company ComScore. Yahoo Finance was recently sold by Verizon to Apollo Global along with AOL.
How is financial news mediated in these networks? What signifying practices are brought into play and for what purposes? What are the implications of their mediating styles and techniques for how we understand the health of the global economy, levels and types of employment, and the potential of innovative new industries and companies?
Financial television news plays constantly in many trading operations and other business environments. It is also popular in homes, whether with day traders or interested citizens. Many people might be invested in Bitcoin or other cryptocurrencies, concerned about housing prices, or following their 401(k)s and other investments. Televised news and economic indicators play a vital role in various audiences’ perceptions of the economy.
Anchored by personalities that are informed and presentable, the television screen combines live human commentary with indexical information, graphs, and other numerical representations of different parts and states of the economy. The news anchor fixes meaning, guiding the narrative while transfixing the audience with the immediacy of their presence.
Remediation is literally the re-mediating of content through the inclusion of old media in new media, or sometimes the inclusion of new media in an old medium, such as the use of computer and web techniques in modern television. One of the earliest media theorists, Marshall McLuhan, made these observations in the 1960s. Print remediates writing, which remediates speech. The TV anchor, an actual person of authority who “anchors” the meaning of the broadcast, is remediated on the television news screen.
However, Bolter and Grusin offered a more systematic analysis in Remediation, published by MIT Press (2000).[1] They echoed McLuhan’s observation that the content of new media is old media. They also ventured that the motivation for remediation is a desire for a “healed” media, one that provides better access to a more “authentic” version of reality. Bolter and Grusin pointed to a “double logic” of remediation: two different modes of representation that strive to better access the real. Television has coped with this dual system of remediated representation since its origins through a variety of incorporations and innovations.
One mode of remediation is transparent immediacy, the desire for a type of immersion into the medium that erases the technology and provides an unblemished experience. The cinematic movie experience strives for this authenticity with its large screen, darkened room, and conservative camera and editing practices. The viewer wants to forget the presence and techniques of the movie apparatus and believe they are in the presence of the objects of representation – the actors and sets. Television achieves this less completely.
McLuhan and others argued that TV was primarily an acoustic medium, mainly because sound anchors the meaning of the visual elements. Television is a storyteller, primarily an oral one. So it is no surprise that human “anchors” on broadcast news play an important role. Anchors read the news and also conduct live interviews with guest experts for additional credibility and information. They present the major topics of the day in real-time, fixing the meaning of the broadcast, organizing the narratives of the day. Financial television borrows this sonic dominance, although it streams many other sources of data and textual news.
Many financial analysts have become celebrities, such as Mohamed A. El-Erian, Jared Bernstein, and Bill Gross. Neel Kashkari and other Fed district presidents are also very popular. These “talking heads” are brought in to contribute to the narrative, bringing their expertise remotely from different cities and countries and representing key companies or government positions.
Television news is interrupted occasionally by “Breaking News” that reinforces immediacy. This interruption usually includes live reporting by a journalist at a relevant location. Drone or helicopter “bird’s-eye” views enhance the dominant perspective of television news. Press conferences by the Fed Chair after FOMC meetings on interest rates are also widely watched. These events keep viewers “glued” to the screen.
Hypermediation is the other strategy, using techniques of representation that “foreground the medium.” Television has taken on the multi-mediated look of the computer, with different windows gathering activities, data, and events. While the anchor is prominent (although most trading environments turn off the sound), other windows display hypermediated representations of economic and financial data streaming in from around the world. This information comes primarily in the form of charts, graphs, and indices presenting a quantitative view of the world. The reliability of this global gaze often draws on the truth claims of numeracy and remediates the spreadsheet. In particular, the visual techniques of the table are utilized to quickly communicate an augmented view of the economy.
Financial hypermediation has moved away from transparency. Instead, it integrates an augmented reality with indexical denotations of stock markets, prices of commodities like gold and silver, and currency exchange fluctuations. Indicators include macro-measures such as GDP, invented to mobilize industrial responses to the Great Depression and World War II. If Women Counted by Marilyn Waring was a major critique of GDP because it did not count domestic work. The age of big data is also producing information that gives us a better picture of the larger economy. Unemployment statistics are a major indicator, as are the prices of commodities like gold, silver, and copper.
Financial news probably owes a debt to sports broadcasting and news. Notably, American sports like baseball, basketball, and football (gridiron) have embraced hypermediated techniques in the service of sports entertainment. While transparent immediacy is a crucial part of sports enjoyment, a new word, “datatainment,” has emerged as the moniker for the joy many people get from statistics related to their favorite teams and players. In baseball, for example, scores remain the major source of numerical pleasure as they indicate winners and losers. But batting averages, earned run averages (ERAs), and runs batted in (RBIs) are statistical sources of additional satisfaction.
Conclusion
Financial news on television combines several earlier and newer types of media to represent views of the global economy. It uses anchors and interviews with guests. It displays many economic indicators across the screen and in scrolling tickertapes. It tries to survey the world and paint a picture of an authentic reality that it thinks its viewers will be interested in. What are the limitations of these strategies of representation?
Citation APA (7th Edition)
Pennings, A.J. (2021, Jun 08). Show-Biz: The Televisual Re-mediation of the Modern Global Economy. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/show-biz-the-televisual-re-mediation-of-the-modern-global-economy/
Notes
[1] Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 2000.
[2] Cook, Patrick J. Review of Remediation: Understanding New Media, by Jay David Bolter and Richard Grusin. Resource Center for Cyberculture Studies, December 1999. Accessed 14 January 2001.
Tags: Bloomberg Surveillance > Bolter and Grusin > CNBC > double logic of remediation > hypermediation > Remediation
US Internet Policy, Part 5: Trump, Title I, and the End of Net Neutrality
Posted on | May 16, 2021 | No Comments
The election of Donald Trump in 2016 presented new challenges to broadband policy and the net neutrality rules passed under the Obama administration. Tom Wheeler resigned from the Federal Communications Commission (FCC), allowing Trump to pick a Republican chair and swing the power to the GOP.
The major issue would be to challenge the FCC’s 2015 classification of Internet Service Providers (ISPs) under Title II of the Communications Act of 1934, which emphasized common carriage, the commercial obligation to serve all customers equally and fairly. This post recaps the net neutrality rules for Internet broadband, from their development under the Obama administration’s FCC to their immediate dissolution under the Trump administration’s FCC.
The FCC’s Computer Inquiries were conducted from the 1960s to the 1980s and identified three layers of technological services for data communications. Telecommunications companies offering basic services to homes and businesses would be classified as regulated Title II companies because of their monopoly positions. Data communications and processing service providers operating on top of the telco infrastructure would be lightly regulated Title I “enhanced” companies. The content companies that would offer information services were not included in the regulations. This legal framework allowed the Internet to take off in the 1990s, including the creation of over 7,000 ISPs in the US. But this was before higher-speed broadband services became available.
Broadband companies became Title I “information services” under the George W. Bush administration’s FCC. Telephone companies that had carved up America during the breakup of AT&T in the 1980s became unregulated ISPs. Cable television companies had also developed IP broadband capabilities in the late 1990s and, with cable modems, competed or merged with telephone companies to provide “triple play” (TV, broadband, and voice) services to households. In 2009, cable companies were also deregulated by being classified under Title I.
The result of these two decisions was a highly oligopolistic market structure for broadband services. These companies began to acquire smaller ISPs, often after making it difficult for them to interconnect to their facilities as they had been required to do as Title II companies. Customers soon found themselves limited to monopoly or duopoly ISPs in their area.
These newly deregulated companies also wanted to expand into new digital services, including payment systems and providing information, video, and search content. These actions violated the “maximum separation” rules that restricted these companies from competing with their customers. They also had designs to operate as gateways that would package games, social media, geo-locational data, and email services into bundles offered at various prices. Concerns proliferated about pricing and service issues, and this led to the movement for “net neutrality” and the return of common carriage.
During the first Obama administration, the FCC began a major study of the broadband market structure of ISPs in the US.
In 2010, the FCC passed six broad “net neutrality principles”:
- Transparency: Consumers and innovators have a right to know the basic performance characteristics of their Internet access and how their network is being managed.
- No blocking: This includes a right to send and receive lawful traffic. It prohibits the blocking of lawful content, apps, services, and the connection of non-harmful devices to the network.
- Level playing field: Consumers and innovators have a right to a level playing field. This means a ban on unreasonable content discrimination and no approval for so-called “pay for priority” arrangements involving fast lanes for some companies but not others.
- Network management: This is an allowance for broadband providers to engage in reasonable network management. These rules do not forbid providers from offering subscribers tiers of service or charging based on bandwidth consumed.
- Mobile: The provisions do not apply as strongly to mobile devices, though some do, including the broadly applicable rules requiring transparency for mobile broadband providers and prohibiting them from blocking websites and certain competitive applications.
- Vigilance: The order creates an open Internet advisory committee to assist the commission in monitoring the state of Internet openness and the effects of the rules.[1]
The new rules faced a judicial challenge. The courts, while sympathetic to the goals of net neutrality, questioned the FCC’s authority to regulate Title I companies. After an appeal by Verizon, the DC Circuit court sent the FCC back to the drawing board. Judge David Tatel said that the FCC did not have the authority under the current regulatory conditions to treat telcos as “common carriers” that must pass data content through their networks without interference or preference.
The result of Verizon v. FCC was that, without a new regulatory classification, the FCC would not have the authority to restrict the big ISPs from banning or blocking legal websites, throttling or degrading traffic on the basis of content, or enacting “paid prioritization” for Internet services. The latter, the so-called “fast lanes” for companies like Google and Netflix, were particularly contentious.[2]
President Obama got involved and supported reclassifying ISPs as common carriers under Title II of the Communications Act of 1934. This would give the FCC the authority it needed to regulate the ILEC ISPs. On February 26, 2015, the FCC passed the new Title II Net Neutrality Rules in a 3–2 party-line vote, and they went into effect in the summer of 2015. The FCC’s open internet rules applied to both wired and wireless Internet connections.
Trump’s new FCC Chairman, Ajit Pai, argued that the web was too competitive to regulate effectively. Ignoring the impacts of deregulating cable and telephone companies on broadband competition, he argued ISPs did not have the incentive to throttle web speeds or restrict other services. He compared regulating ISPs with regulating websites, a clear deviation from the regulatory layers set out in the computer inquiries. Subsequently, the FCC began seeking comments on eliminating the Title II classification for broadband and removing the Obama era net neutrality rules.
On December 14, 2017, the Federal Communications Commission (FCC) voted in favor of repealing these policies, 3–2, along party lines. Pai voted with the majority of the FCC to reverse the decision to regulate the Internet under Title II of the Communications Act of 1934. Called the Restoring Internet Freedom Order, it repealed the net neutrality rules that were put in place two years earlier.
Pai’s justification speech argued that the Internet was not broken and didn’t need to be fixed. His contention was that the bureaucratic complexity of net neutrality was a burden on small ISPs and a disincentive to invest in new facilities and digital pipes. The new FCC voted to begin eliminating Obama’s net neutrality rules as it reclassified home and mobile broadband service providers as Title I information services.
Democrats, joined by a few Republicans, responded with several strategies to reverse the repeal order. In 2018, they attempted to invoke the Congressional Review Act (CRA) to undo the FCC order. This would bypass the filibuster and allow Congress to repeal recent administrative regulations. The motion passed the Republican-controlled Senate 52–47, but did not get the necessary votes in the Republican-controlled House.
The Democrats tried again after gaining a House majority in the 2018 midterm elections with the Save the Internet Act of 2019. The bill codified prohibitions on blocking or throttling websites and on bundling websites or apps like cable packages. It designated network access a “utility” under Title II of the Communications Act of 1934.
Rep. Mike Doyle (D-PA), the bill’s main sponsor and chair of the Subcommittee on Communications and Technology within the House Committee on Energy and Commerce, said he believes Internet access is a right for all and that “We want that gatekeeper to be neutral.” It passed the House 232-190 but was declared dead on arrival by Mitch McConnell, Senate Majority Leader.
Pai stepped down with Trump’s departure in January 2021, leaving behind a mixed legacy. He was acknowledged for some internal changes, including creating the FCC’s Office of Economics and Analytics (OEA), which collected FCC economists into a central think tank instead of leaving them in the separate bureaus. But the FCC was slow on 5G deployment and on making available the much-needed supply of spectrum in the mid-band (2 GHz–6 GHz) range. Rural buildout was weak, and the FCC was caught working with telcos to lower the mobile broadband requirements, meaning they could count lower mobile bandwidth capabilities as broadband. But, by far, the so-called Restoring Internet Freedom order that repealed net neutrality will be the legacy of the Trump era, with the central question being whether it was a capitulation to telco lobbyists.
In the next post, I will examine the challenges for the Biden Administration in addressing broadband policy, including net neutrality, but also the Internet of Things, and the expansion of broadband infrastructure in rural and other underserved areas.
Notes
[1] Gustin, S. (2018, September 11). FCC Passes Compromise Net Neutrality Rules. Wired. https://www.wired.com/2010/12/fcc-order/.
[2] Finley, K. (2017, May 18). Internet Providers Insist They Love Net Neutrality. Seriously? Wired.
Tags: Ajit Pai > Federal Communications Commission (FCC) > maximum separation > Net Neutrality > Title I Classification > Title II Classification > Title II Net Neutrality Rules
Digital Spreadsheets as Remediated Technologies
Posted on | May 5, 2021 | No Comments
In his classic Understanding Media: The Extensions of Man (1964), Marshall McLuhan argued that “the content of any medium is always another medium.”[1] For example, the content of print is the written word, and the content of writing is speech. Likewise, the content of the telex was writing, and the content of television was radio and cinema. The book was notable for coining the phrase “the medium is the message” and for pointing to the radical psychological and social impacts of technology.
McLuhan had a specific focus on the effects instead of the content transmitted by each medium. He probed how new forms of technologies extended the senses of humans and changed the activities of societies. He invited us to think of the lightbulb, not so much in terms of its luminous content, but in the way it influenced modern society. He noted it creates new environments and changes in lifestyles, particularly at night. This post will examine the media technologies embedded in the digital spreadsheet that have made it a transformative technology and changed modern life.
Mediating “Authentic” Realities
In Remediation: Understanding New Media, Jay Bolter and Richard Grusin extended McLuhan’s ideas to a number of “new media,” including television, computer games, and the World Wide Web. They argued that new media technologies are designed to improve upon or “remedy” prior technologies in an attempt to capture or mediate a more “authentic” sense of reality. They used the term “remediation” to refer to this innovation process in media technologies.[2] For example, VR remediates perspectival art, which remediates human vision. TV not only remediates radio and film but now the windowed look of computers, including the ticker-tape scrolling of information across the screen.
Unfortunately, but understandably, they neglected the spreadsheet.
And yet, the digital spreadsheet is exemplary of the remediation process. Several years ago, I initiated an analysis that focuses on the spreadsheet’s various “media” components and how they combine to give it its extraordinary capabilities. To recap, these are (a brief code sketch follows the list):
- writing and numerals;
- lists;
- tables;
- cells; and
- formulas.
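To make the combination concrete, here is a minimal, hypothetical sketch of how cells addressed within a table can hold either written values (numerals and text) or formulas that recalculate from other cells. The Sheet class and its methods are illustrative inventions, not drawn from any actual spreadsheet program.

```python
# A minimal sketch of the spreadsheet's media components: addressable cells
# (writing and numerals) arranged as a table, with formulas that derive new
# values from other cells. All names here are illustrative only.

class Sheet:
    def __init__(self):
        self.cells = {}  # maps addresses like "A1" to values or formulas

    def set(self, address, value):
        self.cells[address] = value

    def get(self, address):
        value = self.cells.get(address, 0)
        # A "formula" in this sketch is simply a callable that reads other cells.
        return value(self) if callable(value) else value


sheet = Sheet()
sheet.set("A1", 120_000)  # quarterly revenue
sheet.set("B1", 90_000)   # quarterly costs
# A formula cell, analogous to "=A1-B1" in a spreadsheet.
sheet.set("C1", lambda s: s.get("A1") - s.get("B1"))

print(sheet.get("C1"))    # 30000, recalculated whenever A1 or B1 changes
```

Even this toy grid shows why the combination matters: the table supplies addresses, the cells hold inscriptions, and the formulas turn a static record into a recalculating model.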
The digital spreadsheet refashioned these prior media forms – writing, lists, tables, cells, and formulas – to create a dynamic meaning-producing technology. Writing and lists have rich historical significance in the organization of palaces, temples, and monasteries, as well as armies and navies. Indo-Arabic numbers replaced Roman numerals and expanded the realm of numerical calculation with the introduction of zero and the positional place-value system. Numbers and ledgers led to the development of double-entry accounting systems and the rise of merchants and, later, modern businesses.
Tables helped knowledge disciplines emerge as systems of inquiry and classification, initially areas like accounting, arithmetic, and political economy. Still, later areas such as astronomy, banking, construction, finance, insurance, and shipping depended on printed tables to replace constant calculation. Charles Babbage (1791-1871), a mathematician and an early innovator in mechanical computing, expressed his frustration with constructing tables when he famously said, “I wish to God these calculations had been executed by steam.”
First with VisiCalc and then Lotus 1-2-3, these media elements worked together to form the gridmatic intelligibility of the spreadsheet. Bolter and Grusin proposed a “double logic of remediation” for the representation of reality: transparent immediacy and hypermediacy. Both work to produce meaning. However, the former tries to make us forget the mediation at work and produce transparent immediacy, such as watching a live basketball game on television. The latter tries to foreground the medium, especially through computer graphics. Financial news programs on TV such as Bloomberg Surveillance mix the immediacy of live news, using hosts and guests, with hypermediated indexes of stock markets (DJIA, S&P 500, NASDAQ, etc.) and other economic indicators such as GDP. How do spreadsheets attempt to perceive, display, and produce reality? How do they heal our perception of reality?
Windows to the World Wide Web
It was the personal computer (PC) that brought the spreadsheet to life. The Apple II brought us VisiCalc in 1979 with 40 columns and 25 rows, a small area that could be navigated quickly using the arrow keys. One of the first formulas developed for the spreadsheet was net present value (NPV), which calculated the return on investment (ROI) for projects, including large purchases of equipment. Microsoft’s Disk Operating System (DOS) was the technical foundation for Lotus 1-2-3 as the IBM PC and “IBM-compatibles” proliferated during the 1980s. The spreadsheet became known as the “killer app” that made buying the “microcomputer” worthwhile. But it was the Graphic User Interface (GUI) that popularized the PC, and thus the spreadsheet.
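For illustration, the NPV calculation that early spreadsheet users set up can be sketched in a few lines of code. The function below implements the standard formula, NPV = Σ CFt / (1 + r)^t; the discount rate and cash-flow figures are hypothetical.

```python
def npv(rate, cashflows):
    """Net present value of a series of cash flows.

    cashflows[0] is the initial outlay at time 0 (usually negative);
    each later entry is discounted by (1 + rate) ** t.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical equipment purchase: a $50,000 outlay followed by five years
# of $15,000 returns, evaluated at a 10% discount rate.
flows = [-50_000, 15_000, 15_000, 15_000, 15_000, 15_000]
print(round(npv(0.10, flows), 2))  # roughly 6861.80, so the project clears the hurdle rate
```

On a spreadsheet, the same logic sat in a single cell formula that could be recalculated instantly when an assumption changed, which was precisely its appeal to analysts.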
The Apple Mac marked the shift to the GUI and new desktop metaphor in computing. GUIs replaced the inputted ASCII characters of the command line interface with a more “natural” immediacy provided by the interactivity of the mouse, the point-able cursor, and drop-down menus. The desktop metaphor drew on the iconic necessities of the office: the file, inboxes, trash cans, etc. A selection of fonts and typographies remediated both print and handwriting. The use of the Mac required some suspension of disbelief, but humans have been conditioned for this alteration of reality by story-telling and visual narratives in movies and TV.
Microsoft’s Excel was the first spreadsheet to use the graphic user interface (GUI) developed by Xerox PARC and Apple. Designed for the Apple Macintosh, it became a powerful tool that combined the media elements of the spreadsheet to produce more “authentic” versions of reality. An ongoing issue is the way it became a powerful tool for organizing that reality in ways that benefitted certain parties, and not others.
Excel was at the center of Microsoft’s own shift to GUIs starting in 1985. That shift, called Windows, made spreadsheets a key part of its Office software applications package. Microsoft had captured the IBM-compatible PC market with DOS and initially built Windows on top of that OS. Windows 2.0 changed the OS to allow for overlapping windows. Excel became available on Windows in 1987 and soon became the dominant spreadsheet. Lotus had tried to make the transition to the GUI with Jazz but missed the market by aiming too low and treating the Mac as a toy.
Windows suggested transparent views for the individual onto different realities.
But while the emerging PC was moving towards transparent immediacy, the spreadsheet delved into what Bolter and Grusin would call hypermediacy. This is an alternate strategy for attaining authentic access to the real. Windows promised transparent views of the world, but the spreadsheet offered new extensions of the senses – a surveying and calculative gaze – by remediating these earlier media forms.
Spreadsheets drew on the truth claims of both writing and arithmetic while combining them in powerful ways to organize and produce practical information. They combined and foregrounded the media involved to present or remediate a “healed” version of reality. Each medium provides a level of visibility or signification. The WYSIWYG (What You See Is What You Get) environment of the desktop metaphor provided a comfortable level of interactivity for defining categories, inputting data, organizing formulas, and displaying that information in charts and graphs.
The Political Economy of PC-based Spreadsheets
How has the digital spreadsheet changed modern society? Starting with VisiCalc and Lotus 1-2-3, the spreadsheet created new ways to see, categorize, and analyze the world. It combined and remediated previous media to create a signifying and pan-calculative gaze that enhanced the powers of accounting, finance, and management. Drawing on Bolter and Grusin, can we say that digital spreadsheets, as remediated technology, became a “healing” medium? This raises some important questions. What was its impact on the modern political economy? What was its impact on capitalism?
The spreadsheet amplified existing managerial processes and facilitated new analytical operations. Its grid structure allowed a tracking system to monitor people and things. It connected people with tasks and results, creating new methods of surveillance and evaluation. It could register millions of items as assets in multiple categories. It itemized, tracked, and valued resources while constructing scenarios of future opportunity and profit.
Digital spreadsheets introduced a major change of pace and scale to the financial revolution that started with Nixon’s decision to go off gold and on to an “information standard.” The spreadsheet facilitated quick analysis and the recalculation of loan payment schedules in an era of inflation and dynamic interest rates. Spreadsheet proliferation started with accountants and bookkeepers, who quickly realized that they could do their jobs with new precision and alacrity. But their use soon became ubiquitous.
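As a hedged sketch of that kind of recalculation, the standard amortized loan payment formula, P·r / (1 − (1 + r)^−n), can be rerun as rates move. The loan amount, term, and interest rates below are hypothetical.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortized loan payment: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12      # monthly interest rate
    n = years * 12            # number of monthly payments
    if r == 0:
        return principal / n
    return principal * r / (1 - (1 + r) ** -n)

# A hypothetical $100,000 30-year mortgage, recalculated as rates change --
# the kind of "what if" a 1980s analyst could redo instantly in a spreadsheet.
for rate in (0.08, 0.12, 0.16):
    print(f"{rate:.0%}: ${monthly_payment(100_000, rate, 30):,.2f} per month")
```

Spreadsheets expose the same relation as a built-in payment function, so a whole column of candidate rates could be re-evaluated at once.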
PCs and spreadsheets started to show up in corporate offices, sometimes to the chagrin of the IT people. The IBM PC legitimized the individual computer in the workplace, and new software applications emerged, including new spreadsheet applications such as Borland’s Quattro Pro. Spreadsheet capabilities increased dramatically through the 1980s, adding new formulas from a wide range of disciplines such as accounting, engineering, operations management, and statistics. But it was the new processes of analyzing assets that allowed for the shift to a new era of spreadsheet capitalism.
Reaganomics’ emphasis on financial resurgence and the globalization of news created conditions in which money-capital could flow more freely. It is no surprise that the digital spreadsheet ushered in the era of leveraged buyouts (LBOs) and the wide-scale privatization of public assets that characterized the late 1980s and the 1990s. Companies could be analyzed by “corporate raiders” and their assets separated into different categories or companies. Spreadsheets could determine NPV, and plans could be presented to investment bankers for short-term loans to purchase the company. Then certain assets could be sold off to pay back the loans and cash in big rewards.
Similarly, the assets of public agencies could be itemized, valued, and sold off or securitized and listed on share markets and stock exchanges. The “Third World Debt Crisis” created by the oil shocks of the 1970s, and the flood of Eurodollars made available to countries, created new incentives to find and sell off public assets to pay down government loans. This logic was applied to telecommunications companies worldwide.
Previously, PTTs (Post, Telephone, and Telegraph) were government-owned operations that provided relatively poor telephone services but returned profits to the nation’s treasury. But the calculative rationality of the spreadsheet was quickly turned to analyzing the PTTs. Spreadsheets could be used to sum the value of all the telephone poles, maintenance trucks, switches, and other assets. At first, these companies were turned into state-owned enterprises (SOEs), but they were eventually sold off to other companies or listed on share markets. By 2000, the top companies in most countries, in terms of market capitalization, were former PTTs transformed into privatized “telcos.”
World Trade Organization (WTO) meetings in 1996 and 1997 reduced tariffs on computers and other IT-related products. Together with the IMF, they pressured countries to liberalize telecommunications and complete PTT privatization. In the US and other countries, the dot-com “bull run” was taking place, aided by a spreadsheet at WorldCom that projected the “doubling meme” – the continual fast growth of the Internet and all the technology associated with it.
By the late 1990s, these telcos were adopting the Internet Protocol (IP) technologies that allowed for the World Wide Web. Cisco Systems and Juniper Networks were two companies instrumental in developing new switching and routing systems. While initially used by small Internet Service Providers (ISPs), these technologies soon allowed telcos to convert their PTT infrastructures into IP networks and dominate the ISP broadband markets.
The spreadsheet is a tool, but it is also a worldview – a reality rendered in categories, data sets, and numbers. As the world moved into the financialization and globalization of the post-oil-crisis Reagan era, the PC-based spreadsheet was forged into a powerful new “remediated” technology.
Was it responsible for a new era in capitalism, one in which combinations of media framed by computer windows guided and shaped perceptions? We have Apple’s iWork Numbers, Google Sheets, and LibreOffice Calc, but Microsoft Excel is still the dominant spreadsheet. How has Microsoft repurposed and scaled Excel, particularly with Access and the SQL language? Excel was a foundational technology for an era of database technologies. What about blockchain?
Capitalism is highly variable and subject to changes in regulations, legislation, and technologies. These can change the political economy and shape the flows of information and money. The spreadsheet was central to Reagan’s financial revolution, but also to the globalized world of the Internet. Digital spreadsheets became a new way of viewing and interacting with the world – not through transparent immediacy, but rather via a calculative rationality and hypermediated instrumentality providing new perspectives and techniques to understand and shape the relationships between capital, innovation, and management.
Citation APA (7th Edition)
Pennings, A.J. (2021, May 5) Digital Spreadsheets as Remediated Technologies. apennings.com https://apennings.com/financial-technology/digital-spreadsheets/digital-spreadsheets-as-remediated-technologies/
Notes
[1] McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964. Print.
[2] Bolter, J. D, and Richard A. Grusin. Remediation: Understanding New Media. Cambridge, Mass: MIT Press, 1999. Print.
Tags: Cisco Systems > doubling meme > Jay Bolter > Marshall McLuhan > print capitalism > PTTs > Remediation > spreadsheet capitalism