Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

The Techno-Epistemology of Databases, Part I

Posted on November 6, 2022

How is meaning created in an environment of computer databases and digital spreadsheets? How is that meaning fixed? How is it organized? What are the processes and consequences of categorization and classification?[1]

This post posits that databases are an epistemological technology, part of a system of knowledge production and management. It explores the unique meaning-producing capabilities that emerged with databases, particularly table-based relational databases, and how they affect modern organizations and society. Databases have transformed modern forms of bureaucratic and corporate organization and are major forms of “time-space power,” the ability to endure over time and operate over distances.

Databases, like digital spreadsheets, are constitutive technologies that can shape perceptions and knowledge through their ability to capture data, store it, process it, and render it accessible for retrieval, presentation, and analysis. Consequently, they organize resources and empower control over the lived experiences of people and the dynamics of social organizations.

Epistemology is generally known as the philosophical attempt to understand knowledge and produce theories of how we comprehend the world. Philosophers such as Plato, John Locke, Immanuel Kant, and Bertrand Russell, among many others, have attempted to understand how we know, what knowledge means to the knower and the society they live in, and how knowledge can be verified and justified.

Techno-epistemology refers to knowledge produced, categorized, and certified within a particular technological context. In particular, it asks how the characteristics of the technology shape the ways of perceiving, recording, and organizing knowledge. Database technology and management are at the nexus of systematizing knowledge in the organization.

Stuart Hall’s insights into culture and the creation of meaning are a helpful starting point. In his classic work on Representation, he discussed the role of concepts and the processes of classification, as well as how meaning is created and shared. Hall also raised issues regarding the role of culture in creating and organizing meaning.

Categorizing or classifying information, for Hall, is a basic human social process that produces meaning and separates information into that which “belongs” and that which does not, based on distinguishing characteristics such as resemblances or differences. It involves grouping or distributing according to some common relations or attributes. A list, for example, is a category of items that delineates what is within the group and what is outside it.

Classification is a cultural phenomenon that involves negotiating society’s conceptual maps. Stuart Hall argued that while the ability to conceptualize and classify is an inherent human biological trait, the systems of classification themselves are socially produced and learned. This is because humans are meaning-making entities who also operate in environments that require the systematization of knowledge. “Socialization” is the process of learning society’s cultural categories and the value structures associated with them.

To investigate the topic of techno-epistemology more thoroughly, it is useful to analyze a particular database. A PC desktop database is sufficient for this initial formal analysis. Formal analysis is a strategy for describing artifacts and visual information that can be applied to any work of art or cultural artifact. I also draw on the notion of remediation, the way new media incorporate older media in order to function.

I previously analyzed digital spreadsheets to examine their components and how they work together, remediating older media, to provide the spreadsheet’s organizational power and its influence on modern capitalism. That formal analysis discussed five components that came together to invest spreadsheets with their power: alphanumerical representations, lists, tables, cells, and formulas.

The purpose of analyzing databases is to continue this examination of how technology shapes the organization of knowledge by examining the components of a database. Databases have transformed modern forms of bureaucratic and corporate organization by shaping knowledge production and management. A central question is: What is remediated in the database? In the next installment, I will analyze the Microsoft Access relational database program, which includes tables, fields, and records, as well as queries, forms, reports, macros, and modules.
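
For readers who want to see these components in action before the next installment, here is a minimal sketch using Python’s built-in sqlite3 module (standing in for Access purely for illustration; the table name, fields, and records are hypothetical) of how a schema fixes categories in advance and how a query redraws the boundary of what “belongs”:

```python
import sqlite3

# A minimal sketch of how a relational table classifies information.
# The schema (fields and their types) fixes the categories in advance;
# every record must fit them in order to "belong" to the table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE employees (
        id     INTEGER PRIMARY KEY,  -- unique identifier for each record
        name   TEXT NOT NULL,        -- field: who
        dept   TEXT NOT NULL,        -- field: classification by department
        salary REAL                  -- field: a quantified attribute
    )
""")

# Records: each row is an item placed inside the categories the schema defines.
cur.executemany(
    "INSERT INTO employees (name, dept, salary) VALUES (?, ?, ?)",
    [("Ada", "Engineering", 92000.0),
     ("Grace", "Engineering", 98000.0),
     ("Stuart", "Communications", 71000.0)],
)

# A query re-classifies on the fly: it draws a new boundary around what
# "belongs" (Engineering) and leaves everything else outside the list.
cur.execute("SELECT name, salary FROM employees WHERE dept = ?", ("Engineering",))
print(cur.fetchall())  # [('Ada', 92000.0), ('Grace', 98000.0)]

conn.close()
```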

Notes

[1] Fadaie, G. (2008). “The Influence of Classification on World View and Epistemology.” Proceedings of the Informing Science & IT Education Conference (InSITE) 2008. Faculty of Psychology & Education, University of Tehran, Iran. http://proceedings.informingscience.org/InSITE2008/InSITE08p001-013Fadaie410.pdf

Citation APA (7th Edition)

Pennings, A.J. (2022, November 6). The Techno-Epistemology of Databases, Part I. apennings.com. https://apennings.com/technologies-of-meaning/the-techno-epistemology-of-databases-part-i/

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA program at St. Edward’s University in Austin, Texas, where he lives when not in the Republic of Korea.

Zeihan’s Global Prognostics and Sustainable Development, Part I

Posted on October 11, 2022

Peter Zeihan has become a popular author and media celebrity for his ideas on declining demographic patterns, the shale oil revolution, and the end of globalization. He formerly worked for Stratfor, a private geopolitical forecasting company in Austin, Texas, started by noted geopolitical futurist George Friedman.[1] His new firm, Zeihan on Geopolitics, provides custom analytical products and briefings on global trends and how they might impact his corporate and government clients.

This post looks at Zeihan’s hypotheses and their implications for sustainable development, roughly defined by the United Nations Brundtland Commission as meeting the needs of the present without compromising future generations. While one might say that all countries in the world are undergoing a transition to sustainable development, countries have different circumstances and need to develop unique economic policies and solutions. In any case, they must maintain situational awareness regarding their financial, geographical, and military affairs. They should also consider the relevance of the UN’s Sustainable Development Goals (SDGs), which plot a trajectory based on economic, social, and environmental objectives.[1] Zeihan’s model, which is not beyond critique, is a helpful point of departure for assessing the development needs of each country and the potential of the SDGs to help identify and meet those needs.

Zeihan has written four books: The Accidental Superpower: The Next Generation of American Preeminence and the Coming Global Disorder (2016), The Absent Superpower (2017), Disunited Nations: The Scramble for Power in an Ungoverned World (2020), and, most recently, The End of the World Is Just the Beginning: Mapping the Collapse of Globalization (2022). But YouTube has probably been his most significant outlet, where you can view his talks, follow the recommended videos, or search directly.

Zeihan’s approach is primarily focused on demographic trends, geographic constraints, military concerns, and legacy energy availability. For example, he discusses the Russian Federation’s situation primarily in terms of its declining population, reliance on metals, natural gas, and oil exports, and a history of being invaded by foreign armies through several geographical vulnerabilities (particularly the plains of Ukraine). His major concern is that Russia’s younger population is significantly declining, which means fewer workers, fewer consumers, and less capital investment capability from savings. He considers the war in Ukraine one last chance to start barricading the 11 potential invasion points that have haunted Russia in the past. Otherwise, Russia is likely to be invaded or break up in the next few decades.

He also weaves agriculture, finance, maritime shipping, rare materials, manufacturing, and military activities into his prognostications. His general conclusion is that globalization is ending, and countries will have problems dealing with aging populations and smaller younger generations. All this will mean clear winners and losers as globalization comes to an end. And why is it ending?

The starting point for his major arguments is that the US-led world order that provided financial stability and military protection for global trade is ending due to a lack of interest and political will. Since 1945, the US has provided the primary currency for international reserves and transactions. It also orchestrated maritime and military protection for a worldwide system of trade. However, he argues, the United States has been stepping down from its role as the global bank and policeman, and this will leave many countries vulnerable to drastic changes in international commerce and military affairs.

The Bretton Woods Conference, held in New Hampshire in 1944, designed a new financial and trading order. Replacing the old gold standard, economists and policymakers from around the world created a new financial system based on connecting the US dollar to gold at $35 an ounce. At the same time, other countries were required to tie the value of their currencies to the dollar at fixed rates. They also created the International Monetary Fund (IMF) and the World Bank to help countries maintain their prescribed currency exchange rates and develop economically.

But just as crucial for Zeihan’s thesis is the US creation of a global security system to protect any country’s maritime exports and imports. The US Navy has over 200 ships that patrol the world’s waterways, including 11 “super” aircraft carriers and 9 “helo” carriers. Using nuclear power, the supercarriers can operate for over 20 years without refueling. Aircraft carriers usually carry dozens of fighter jets and project tremendous power and national prestige. While China has an overall numerical advantage in ships, its vessels have a limited range that restricts the area they can patrol. China has two aircraft carriers and one helo carrier. Russia has a powerful navy, but it is divided among four fleets with limited access to the world’s oceans. The United Kingdom, France, and Japan field the other powerful global navies.

Caveat: A country gets this protection provided it sides with the US against Communism (and later “terrorism”). He calls this “the bribe.”

The main problem with this global protection system, Zeihan argues, is that its rationale died in the early 1990s with the fall of the USSR. The US populace has subsequently shown little interest in its continuance, although it received renewed emphasis after 9/11 and the war on “terror.” The US has steadily turned inward and away from the world’s problems. The end of this protection will leave much of the world’s shipping at the mercy of hostile states and pirates.

Global shipping has several choke points, particularly the Strait of Malacca, between the Malay Peninsula and the Indonesian island of Sumatra, and the Strait of Hormuz, between Iran and the Arabian Peninsula. These routes are vulnerable to naval attacks and blockades. China’s energy imports, for example, are quite vulnerable, as are its exports to the world. A naval blockade near Sri Lanka could seriously hamper China’s industries and shut down much of the country. Zeihan argues that US protection keeps China from falling into chaos. Without the US, essential trade routes will require convoy protection capabilities that China has yet to achieve, as well as new forms of shipping insurance that could be very expensive.

Likewise, the US dollar is the mediating global currency, which is not likely to change soon, despite US neglect. Recurrent dollar shortages have created havoc in world markets. These shortages are due to declining US purchases of foreign oil, a diminished military presence overseas, and smaller trade deficits, all of which have made the dollar scarcer and, thus, more expensive. The upward trend in dollar strength has recently been accelerated by rising US interest rates and by global capital flight into US dollar assets due to the Russian invasion of Ukraine and China’s potential attack on Taiwan.

His second major concern is the rapidly aging populations in the developed and newly industrialized worlds, which will soon send many economies into free fall. These countries face severe demographic problems, with larger older generations and smaller younger ones, primarily due to urbanization. Populations provide workers, consumers, and investors. Industrialization and scientific advances produced a population boom worldwide, but that trend is ending. As a result, consumer spending and tax revenues dry up, while pension payments and medical care consume much of what is left. Japan has been leading the way, but China is Zeihan’s primary concern. Add in the one-child policy, and Zeihan believes China is on the verge of a significant collapse. Europe is also vulnerable, as is Russia. He argues that Russia needed to have its war with Ukraine now, to expand its empire and plug the geographical holes in its defenses, or lapse into a slow, unrecoverable decline.

Energy is extremely important to his analysis. The US is now energy independent, primarily because of the “Shale Revolution,” the combination of hydraulic fracturing (“fracking”) and horizontal drilling that significantly increased the recovery of natural gas and oil. Overproduction caused price declines that culminated in the unprecedented negative prices of 2020, but the market recovered as COVID-19 subsided and “revenge travel” ensued. Middle East countries remain big producers of energy. Russia was important but is quickly going offline due to sanctions and a lack of technical expertise, and it may see many oil wells freeze up indefinitely. Venezuela has tremendous quantities, but of a quality that is more expensive to refine.

Almost as important for his global analysis is which countries are heavily dependent on gas and oil imports. China, Japan, Korea, and Taiwan are all severely dependent on foreign energy that must travel significant distances. These countries rely on large oil tankers (paid for in US dollars) to ship in their energy, tankers primarily built by France, Japan, and Sweden, and more recently by South Korea. Zeihan says that a couple of destroyers in the Indian Ocean stopping oil tankers bound for China could bring down the country within a year. For him, China is the most vulnerable country in the world.

Zeihan is not the biggest fan of renewable energies, although he claims to have solar panels on his home. He doesn’t think they will scale to the quantities needed to replace oil and argues that many places are not suitable locations for wind or solar. He also points out that many of the rare elements and materials needed for the green revolution require inputs from many countries. If globalization continues to stall, Net Zero carbon pledges will be nearly meaningless.

In conclusion, Zeihan has a model of the world with several key variables that he uses to analyze the prospects for individual countries. Most are based on material conditions: geography, populations, oil, metals, etc. They produce a significant hypothesis: that globalization based on US protection is falling apart and that individual countries will have to assess their situations and decide whether to find partners or go it alone.

His model does not seem to place much value on collective action, education, and innovation, including cultural movements (and markets) and the governmental momentum and policy that can assess and adapt to situations. It was, after all, the unlikely US mobilization for WWII that led to success in both the Atlantic and Pacific theaters and to the technological innovations of radar, code decryption, and the atomic bomb, among other successes, that propelled the US to the global power it became and has sustained for nearly 80 years. He didn’t believe Ukraine would last long against Russia. (Who did?) Zeihan also downplays political influence in foreign policy and the tendency of the “military-industrial complex” to shape US action.

However, the model he presents is a useful starting point for country assessments of the relevance of sustainable development goals. Every country has to evaluate its situation within the context of the world’s trading and financial arrangements. It needs to understand its demographic status to assess labor opportunities, investment trends, and consumption projections. It also needs to understand the topographical characteristics and limitations that determine the logistical capabilities of its navigable waterways, seaports, highways, and passable railways. These factors help determine what means a country has to mobilize resources and products for export, as well as its access to world markets for imports such as energy, plastics, and steel that its producers need for everything from manufacturing to home building.

In Part II, I will go into more detail about how this model can serve the analysis of sustainable development in various countries.

Notes

[1] I’m testing this analytical approach in my undergraduate EST 230 ICT and Sustainable Development course. It’s important for the students to comprehend the context for sustainable development and understand the different challenges that different countries will face.

Citation APA (7th Edition)

Pennings, A.J. (2022, October 11). Zeihan’s Global Prognostics and Sustainable Development, Part I. apennings.com. https://apennings.com/digital-geography/zeihans-global-prognostics-and-sustainable-development-part-i/

Ⓒ ALL RIGHTS RESERVED




Weak Domestic Dollar, Strong Global Dollar

Posted on August 3, 2022

The summer of 2022 was a good time for Americans to travel overseas. Everything was expensive at home and cheap abroad. In other words, the dollar was weak domestically and strong globally. In this post, I examine the dynamics of the US dollar and why it operates differently within the domestic US and globally. This description explains how US dollars are created internally and internationally, as well as their significance as a global reserve currency and a mediating vehicle for transactions worldwide.[1]

In the wake of the coronavirus (COVID-19) pandemic and its stimulus spending, supply shocks, and other dramatic tragedies, including the deaths of over a million people in the US alone, the world experienced significant economic disruption, including a serious bout of rising prices. We can add to that turbulence the Russian invasion of Ukraine, with its prospects for reduced trade in natural gas, metals, oil, wheat, uranium, and other resources, which contributed to shortages worldwide and famine in many places in the global South that have become grain dependent. Also, the Chinese real estate crash is increasing the demand for gold.

In the US, we also have political divisions that have polarized policy perspectives, particularly on major spending issues such as the latest Senate budget reconciliation deal tentatively titled the Inflation Reduction Act (IRA), which will address inflation through tax changes, capping drug costs, and climate change measures.

These are some of the main issues that influence the dollar’s strength internally and out in the global system. The consequences are immense as we interrogate government spending and consider alternatives to the US dollar as the world’s primary transacting currency, such as Bitcoin or the Chinese Yuan. Or will BRICS actually produce an alternative currency?

Fed M1 Money Supply

How are Dollars Created?

In both the US and internationally, US dollar-denominated money is primarily created by private banks as they make loans. The fractional reserve banking system that most of the world runs on takes deposits and lends out money at prescribed interest rates, contingent on levels of risk, compliance, and the quality of collateral. Banks are usually restricted in loan creation by a reserve requirement set by a central bank. The Federal Reserve traditionally required US banks to hold 10 percent of their deposits in their vaults or on deposit at the Fed. But when the pandemic started in March 2020, it reduced that requirement to zero percent. This meant that US banks could lend out anything that came in the door.
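
As a rough illustration of why that requirement matters, the textbook money-multiplier arithmetic caps total deposit creation at about 1/r times an initial deposit when banks must hold a reserve ratio r. This is a simplification, since actual lending also depends on capital rules and loan demand, but a short sketch shows the logic:

```python
# Textbook money-multiplier sketch (a simplification): each bank keeps a
# fraction `reserve_ratio` of every deposit and lends out the rest, and
# each loan is assumed to be redeposited somewhere in the banking system.

def total_deposits(initial_deposit: float, reserve_ratio: float, rounds: int = 200) -> float:
    """Sum the chain of deposits created from one initial deposit."""
    if reserve_ratio <= 0:
        return float("inf")  # a 0% requirement places no mechanical cap on lending
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # the portion lent out and redeposited
    return total

print(total_deposits(1000, 0.10))  # ~10,000: the 1/r cap under a 10% requirement
print(total_deposits(1000, 0.0))   # inf: no reserve-based cap after March 2020
```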

Internationally, banks have developed a system of lending based on what are called “Eurodollars.” These are US dollar-denominated deposits held outside the jurisdiction of the Federal Reserve and other US regulatory agencies, and they provide the majority of the world’s transacting currency. These deposits are typically made in countries with fewer regulatory restrictions on foreign currency transactions. Once deposited, the U.S. dollars held in foreign banks can be lent out to other banks or financial institutions in the form of loans or interbank deposits.

This shadow money exists on the ledgers of banks and operates with no reserve requirements. Eurodollars have been active since the 1950s but exploded in the 1970s when three things happened: Nixon took the US off the Bretton Woods dollar-gold standard; global oil transactions were set in US dollars; and communications networks went digital. More on Eurodollars below.[2]

Financial institutions and market participants have developed various financial products and instruments that facilitate the creation and trading of Eurodollars. These innovations include Eurodollar deposits, certificates of deposit (CDs), Eurodollar loans, and Eurodollar futures contracts. Eurodollars can also be created through financial market transactions, such as the issuance of Eurodollar-denominated bonds or Eurodollar futures contracts. These instruments allow investors and institutions to gain exposure to U.S. dollar-denominated assets without directly holding physical dollars.
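
As an illustration of one of these instruments, CME Eurodollar futures have traditionally been quoted as 100 minus the implied three-month dollar interest rate, with each basis point worth $25 on the $1 million contract. A small sketch with hypothetical prices:

```python
# Eurodollar futures convention: quoted price = 100 - implied 3-month rate (%).
# Each contract has a $1,000,000 notional, so one basis point of rate is worth
# $1,000,000 * 0.0001 * (90/360) = $25 per contract.

def implied_rate(futures_price: float) -> float:
    """Implied annualized three-month rate (in percent) from a quoted price."""
    return 100.0 - futures_price

def pnl_per_contract(entry_price: float, exit_price: float) -> float:
    """Profit or loss in dollars for one long contract."""
    basis_points = (exit_price - entry_price) * 100  # price points -> basis points
    return basis_points * 25.0                       # $25 per basis point

print(implied_rate(97.50))             # 2.5 (%): the rate the market expects
print(pnl_per_contract(97.50, 97.25))  # -625.0: rates rose 25 bp, a long position loses
```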

Overall, the creation of Eurodollars is a result of international banking activities, financial market transactions, and the global demand for U.S. dollar-denominated assets. Eurodollars play a significant role in international finance and are essential for facilitating cross-border trade, investment, and financial transactions.

The Domestic Dollar

While bank lending is the primary way dollars are created, some other processes relevant to the creation of the dollar in the US and its spending implications are mentioned below. These include actually printing currency, interest rates, reserve requirements, and government spending.

The Fed can literally print money, which it does for the US Treasury. It keeps about $1.5 trillion in paper currency circulating in the economy. That includes about 12 billion one-dollar bills, 9 billion twenty-dollar bills, and 12 billion hundred-dollar bills. Another $2 billion is minted as various coins.

But mostly, what people call “printing money” is the Fed creating bank reserves in the process of targeting interest rates and conducting “quantitative easing” (QE). It does this by buying government bonds from banks with its magical computer mouse, an unlimited credit card. This buying increases the reserves that banks can lend out to borrowers. But bank reserves are not real money unless they are converted into loans for people and businesses. Conversely, the Fed can reduce the amount of bank reserves by selling government securities. That transaction absorbs bank reserves and can potentially reduce lending and thus pressures on inflation. So, in retrospect, the Fed’s role in money creation is not always predictable, and it is certainly not “money printing” unless you count its role as the government’s bank.

Most people don’t understand how the US federal government creates money. It’s quite simple, although politicians don’t like talking about it. As Warren Mosler, a financial trader and author of Soft Currency Economics, put it, “Congress appropriates the money and then the Treasury instructs the Fed to credit the appropriate accounts.” That’s it. The Fed just sends an electronic message changing figures on computer-based ledgers. But government spending does need Congressional approval, and all US spending bills must originate in the House of Representatives.

The government doesn’t need to tax to get the money to spend. It doesn’t need to borrow the money. Taxing and borrowing have their purposes, but they are not required for the government to spend money because the US government issues its own currency. So why tax and why borrow if the government can just issue money to provision itself and pay its obligations?

Warren Mosler, the founder of MMT, argued that taxing keeps everyone on the national currency system and can also reduce inflationary pressures. The purpose of taxing is to keep the national currency relevant and to reduce the amount of something. Taxing creates demand for the government-issued currency because taxes must be paid in US dollars. Taxes can reduce discretionary spending and alleviate social inequality. Taxes can also channel additional resources toward national activities, as they did during the Cold War and Space Race, when the top marginal rates were above 90 percent.[3]

Borrowing money by auctioning off Treasury bonds creates a monetary instrument that pays interest. It’s basically another way for the government to issue its currency and keep markets operating in a dollar system. Also important, in fact quite critical, is the Treasury security’s role as collateral in borrowing and as a hedge against financial risk. Its larger denominations are useful for repurchase agreements (“repos”) and futures markets.

Repos are collateralized loans in which a borrower of cash exchanges securities such as US Treasuries (the collateral) with the lender, along with an agreement to repurchase, or buy back, the securities at a specified price and time. Treasuries were critical for the Reagan Revolution’s global finance offensive, which tripled the US debt from $738 billion to $2.1 trillion and made the US the world’s largest debtor nation. This debt also provided the collateral needed for the expansion of US money-capital. US Treasuries provided the liquidity that made the modern financial sphere possible.
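
To make the mechanics concrete, here is a minimal sketch of a hypothetical term repo, assuming the common actual/360 money-market day count; the figures are illustrative, not market quotes:

```python
# Repo arithmetic sketch: cash is lent against collateral today and the
# collateral is repurchased later at the original price plus interest at the
# agreed repo rate. Figures and the actual/360 day count are assumptions.

def repurchase_price(cash_lent: float, repo_rate: float, days: int) -> float:
    """Price the cash borrower pays to buy the collateral back at maturity."""
    return cash_lent * (1 + repo_rate * days / 360)

cash_lent = 9_900_000.00  # cash advanced against roughly $10M of Treasuries
rate = 0.045              # 4.5% annualized repo rate
term = 7                  # one-week term repo

print(f"{repurchase_price(cash_lent, rate, term):,.2f}")  # 9,908,662.50
```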

Consequently, according to Stephanie Kelton, debt and deficits are a “myth” for a government that can issue its own currency. The COVID-19 response involved unprecedented spending by the US government. The Senate used an old House bill to create the CARES Act of 2020, with $2.2 trillion in initial stimulus spending. Trump’s son-in-law Jared Kushner would manage the PPP while the COVID-19 vaccines were being developed by Operation Warp Speed. Another $900 billion was added before the end of the year, mostly for direct payments to individuals for the holiday season.

In 2021, the Democrats legislated another $2 trillion in the American Rescue Plan to address the K-shaped recovery. It expanded unemployment insurance, enhanced the child tax credit, and supported the Centers for Medicare & Medicaid Services (CMS) to ensure all Americans had access to free vaccinations. Was the COVID-19 spending the cause of the inflation that rose steadily, hitting 9 percent in June of 2022?

We are still in the wake of the COVID-19 pandemic. Shortages emerged from businesses shutting down, factories closing, and shipping containers stranded at sea or in ports waiting for trucks to distribute their goods. Disposable income increased from government stimulus and inflated assets. People forced to study or work at home shifted spending from services to goods. Instead of going to a restaurant, they bought food at the local supermarket. Instead of going to the cinema, they watched Hulu, Netflix, or one of the new Internet-based streaming services.

Corporations raised prices, and purchasing power decreased. As a result, the dollar weakened domestically as the money supply inflated in 2020 and, to a lesser extent, in 2021. The chart above shows M1 money supply levels since 1990 (in billions of US dollars). The increase in 2020 was dramatic.

The International US Dollar

As US Treasury Secretary John Connally once said to his counterparts at an economic summit, the dollar “is our currency, but it’s your problem.” The value of the dollar is currently at a 20-year high against other major currencies, creating a massive problem for everyone outside America buying dollar-denominated goods, which account for about 85 percent of all international trade. Oil and nearly all raw materials, from aluminum to wheat and zinc, are priced in US dollars.

As the world’s “reserve” currency, dollars are in demand around the world, primarily for transactional purposes. The dollar is the mediating currency for transactions among dozens of different countries; if Argentina wants to do business with Australia, it conducts that business through US dollars. Why is the dollar currently so strong overseas? The short answer is a significant shortage of US dollars.

Two interrelated systems loosely organize US-denominated dollars overseas. The more official system is the US dollar reserves held by central banks. The US dollar became the world’s primary reserve currency due to the Bretton Woods Agreements at the end of World War II. The post-war trading system would be based on the US dollar backed by the gold in vaults at Fort Knox and the NY Federal Reserve Bank. The US dollar maintained this link with gold until the 1970s, when President Nixon ended the dollar’s convertibility into gold. Most countries stayed with the dollar and also began buying US Treasury securities as a safe store of the currency.

More than half of official currency reserves are held in US dollars, with the euro taking up another quarter. This is actually a historic low for the dollar. The Japanese yen, the pound sterling, and, to a lesser extent, the Chinese renminbi constitute the rest of the major reserves held by central banks. Bitcoin and other blockchain cryptocurrencies are possibilities for the future, but as we can see, currencies have been digital for quite a while; they just haven’t been blockchain-based. Central banks are also developing central bank digital currencies (CBDCs) that may make a multilateral transacting system outside the Eurodollar system more feasible.

Central banks get US dollars through their economies’ international trade, Foreign Direct Investment (FDI), and liquidity swaps. US aid programs, military bases, and trade deficits, especially American purchases of foreign cars and oil, contribute to the spread of the US dollar. It has been important for the US to run trade deficits to support the world’s need for dollars, and US oil independence has been particularly problematic for the world’s currency system. Many refer to “dollar hegemony,” as the system is challenging to manage, often destructive, and the US often doesn’t even seem politically motivated to try.

The other system involves the digitally created Eurodollar deposits in internationally networked banks. Eurodollars are not “Euros,” the currency of the European Union. Instead, these are US dollar-denominated virtual currencies created and used outside the US. They operate outside not only the geography of the US but also its legal jurisdiction. Eurodollars are used by various big banks, money market and hedge funds, insurance companies, and many big corporations.

Eurodollars are primarily lent out (created) in exchange for collateral such as US Treasuries. Depending on the quality of the collateral, the lender applies a small discount to the collateral’s value, called a “haircut,” to cover potential liquidity costs. Since the Eurodollar system works on lending, it makes sense that any bank or corporation borrowing money would want to make that transaction as cheap as possible. The price goes down when good collateral makes the loan more secure. One can envision a hierarchy of collateral types, with US Treasuries on top and corporate junk bonds, mortgage-backed securities, sovereign debt, and even gold as other grades.
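
A minimal sketch of how such a haircut works in collateralized lending, with a hypothetical hierarchy of haircut levels (the percentages below are illustrative assumptions, not market figures):

```python
# Haircut sketch: the lender discounts the collateral's market value by a
# percentage that grows as collateral quality falls, and lends the remainder.
# The haircut levels below are illustrative assumptions, not market quotes.

HAIRCUTS = {
    "us_treasuries": 0.02,               # top of the collateral hierarchy
    "mortgage_backed_securities": 0.08,
    "sovereign_debt": 0.10,
    "gold": 0.15,
    "corporate_junk_bonds": 0.20,
}

def loanable_amount(collateral_value: float, collateral_type: str) -> float:
    """Cash a lender will advance against collateral of a given grade."""
    return collateral_value * (1 - HAIRCUTS[collateral_type])

for kind in ("us_treasuries", "corporate_junk_bonds"):
    print(kind, loanable_amount(10_000_000, kind))
# us_treasuries 9800000.0        -- cheap, secure funding
# corporate_junk_bonds 8000000.0 -- same collateral value, much less cash
```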

Overseas dollar markets are unregulated and proprietary, so accurate figures are hard to come by. Since the system is based on lending, the loans are often refinanced. A major issue is liquidity: being able to respond to changing conditions in global financial markets that reach over a hundred trillion dollars. The cost and availability of the US dollar can shift for many reasons, including:

  • Changes in US interest rates
  • Shifts in global risk assessments
  • Periods of market stress, such as the coronavirus pandemic

As a result, we currently see a considerable dollar shortage across the globe, dramatically increasing the dollar’s value while causing the depreciation of other currencies worldwide. While international US dollar demand remains high, we can expect energy and food shortages, as well as the transmission of economic shocks such as the one Sri Lanka recently experienced. Goldman Sachs calculates that dollar strength is adding about $20 to a barrel of oil in local currency terms. These dynamics will have significant effects on both the global financial system and the global economy.

Currency swaps are one way to alleviate stresses in the global monetary system. They are repo-like agreements in which a central bank sells a specified amount of a currency to another central bank in exchange for that bank’s currency at the prevailing market exchange rate. The first central bank agrees to buy back its currency at the same exchange rate (with interest) on a specified future date. A central bank initiates such a swap so that it can lend the currency it receives into its domestic economy. The Swiss National Bank, in conjunction with the New York Fed, regularly auctions off dollars to other central banks in liquidity swap operations. These repo-style arrangements, with the needed collateral, used to be called foreign exchange swaps and have become more utilized as dollar liquidity has diminished.
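
A minimal sketch of the cash flows in such a liquidity swap, following the description above; the amounts, exchange rate, interest rate, and term are hypothetical:

```python
# Central bank liquidity swap sketch: dollars are exchanged for foreign
# currency at today's market rate and unwound at the SAME exchange rate on
# the maturity date, plus interest on the dollars. Figures are hypothetical.

dollars_drawn = 5_000_000_000  # USD provided to the foreign central bank
spot_rate = 0.98               # e.g., 0.98 CHF per USD at initiation
swap_rate = 0.033              # annualized interest charged on the dollars
term_days = 7

foreign_leg = dollars_drawn * spot_rate                 # CHF posted to the Fed
interest = dollars_drawn * swap_rate * term_days / 360  # USD interest owed
usd_returned = dollars_drawn + interest                 # repaid at maturity

print(f"CHF exchanged: {foreign_leg:,.0f}")         # 4,900,000,000
print(f"USD due at maturity: {usd_returned:,.2f}")  # 5,003,208,333.33
```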

Significant data gaps make assessing the risks and vulnerabilities of the global currency system challenging. Additional disclosure and data collection are needed to improve the transparency and possible regulation of the global US dollar system. In the meantime, we need to theoretically assess the dynamics of domestic dollar administration and the possibilities for transforming the international monetary system. While it is highly unlikely that the US dollar will lose its reserve currency status, the current worldwide dollar shortage and its consequences are bringing increased scrutiny and calls for monetary reform, including more multicurrency settlements of energy.

Notes

[1] If you want a refresher on what causes inflation, see my post Great Monetary Surge of 2020 and the Return of Inflation from earlier this year.
[2] I did my master’s thesis on the emergence of the Eurodollar system and how it became electronic in the 1970s. I want to acknowledge Jeff Snider and the Eurodollar University for sparking the resurgence of my interest in the global currency.
[3] Unlike monetary policy enacted by the Federal Reserve, which works to manage the economy by “lifting all boats” rather than targeting individual industries, MMT will likely have to be more targeted, although it can still focus on infrastructure and other enabling systems.

Citation APA (7th Edition)

Pennings, A.J. (2022, August 2). Weak Domestic Dollar, Strong Global Dollar. apennings.com. https://apennings.com/dystopian-economies/weak-dollar-strong-dollar/

Ⓒ ALL RIGHTS RESERVED




MMT in a Post-Covid-19 Environment

Posted on July 29, 2022

Modern Monetary Theory (MMT) is a set of descriptions about government (fiscal) policy originating from the financial world. Traders of government bonds began to inquire about Federal spending and fiscal monetary processes in the post-Reagan environment as the US debt increased substantially. They looked beyond the curtain to processes of money creation and the roles of bonds and taxes in the political economy.

What is the role of money? How does the US government get its money? How does it spend money? What are the constraints on government spending? This post looks at MMT and its prospects by examining the economic problems associated with the COVID-19 pandemic.

We hear the term “printing money” a lot, usually by gold or bitcoin enthusiasts who believe in establishing strict financial constraints. By establishing “hard money” and limiting the quantity of money in an economy, they hope to see their assets rise in value while keeping prices down. Certainly, governments do print some of their money for public use, but the preponderance of funds are entries in digital ledger accounts. I prefer the imagery of the “magical mouse” to the printing press. But I digress.

So how does the US government spend money? In the US, pursuant to Congressional legislative action, the Treasury instructs the Federal Reserve to credit the appropriate accounts. That’s pretty much it.

Currency-issuing governments do not have the same financial constraints as a household, a business, or even a municipal, state, or provincial government. They can use their sovereign currency to pay for whatever is available to buy in their own denominated currency. Those are pretty important points, so you may want to read them again. They don’t need to print, tax, or auction off bonds in order to spend. But every expenditure is recorded.

Stephanie Kelton at Stony Brook University in New York has been a strong academic and policy proponent of MMT and its possibilities. A professor of economics, she has also been a consultant to Democratic candidates. In this video, she explains how MMT sees the spending process and some of the implications of seeing money and the deficit in a new way.

Kelton and the MMT theorists have been clear about the downsides of these spending processes. While a currency may not have the restrictions many believe, resources can face limitations. Labor, education, land, materials, and even imagination are called into service under such conditions. Shortages can mean rising prices, as we have seen in the COVID shutdown of businesses and supply chains. MMT proponents have been mindful of this stress and the potential for rising prices from the theory’s inception.

But the problem of increasing costs from resource limitations never really emerged between the 1970s and the COVID-19 pandemic. Globalization (particularly the Chinese workforce), a large labor force (including more women), and technological innovations such as IT and the Internet increased productivity and kept prices at relatively low levels. Also, spending, despite deficits, stayed within manageable levels. Then, however, the pandemic began to shut down businesses and disrupt traditional supply lines. The Federal Reserve, the US central bank, reduced interest rates and increased bank reserves. At the same time, the Federal government began to spend trillions on the Paycheck Protection Program (PPP) and other stimulus programs.

Stimulus spending continued into 2021 to address the K-shaped economy, a split between those who could adjust to the pandemic and those hit the hardest. As people hunkered down at home, many adjusted by working online or taking classes via Zoom. Others lost their jobs or had to quit to take care of children not attending school. Some $5 trillion was spent as COVID-19 stimulus, while the pandemic severely limited work and the availability of goods and services. As a result, the economy was strained and thrown into disarray.

During 2021, prices started to rise, especially for oil, which had crashed in 2018 and fell to less than $20 a barrel in 2020 as the pandemic took hold. Oil production shut down until the roaring return of demand in late 2021 and 2022. In February 2022, when Russia attacked Ukraine, oil access was further reduced, as was the availability of fertilizers and materials needed to fight climate change.

In addition to resources, MMT also needs the ideas and policies to put the currency to work efficiently. Wars have been the de facto spending rationalization for governments. Military spending creates jobs and innovative technologies that often spill over into the larger economy.[1]

How do we collectively decide on alternative spending strategies? Should consumers drive social demand? Can universal healthcare drive down business costs? Can renewable energy infrastructure investment kickstart a new industrial era based on cheap electricity inputs? Should increased spending go into a war on climate change?

How do we recognize the constraints in these strategies? Materials? Labor? Criticisms of centrally planned economies abounded in the 20th century. Most concerns have lamented the lack of price signals aggregated from individual consumers and profit-seeking private firms.

Others have argued that government spending would “crowd out” the private sector by absorbing available investment capital. This would decrease private sector activity. But Kelton argues current economics gets it backward. Spending comes first.

Do these strategies sufficiently allocate resources in a society? The Paycheck Protection Program (PPP) stimulus under the Trump administration was probably the wildest expenditure of government money in modern history. But did that matter if the goal was to stop the slide into a major recession?

Notes

[1] Unlike monetary policy enacted by the Federal Reserve, which works to manage the economy by “lifting all boats” rather than targeting individual industries, MMT will likely have to be more targeted.

Ⓒ ALL RIGHTS RESERVED




Al Gore, Atari Democrats, and the “Invention” of the Internet

Posted on July 9, 2022

This is the fourth part of a narrative about how the Internet changed from a military network to a wide-scale global system of interconnected networks. Part I discussed the impact of the Strategic Defense Initiative (SDI) or “Star Wars” on funding for the National Science Foundation’s adoption of the ARPANET. In Part II, I looked at how the fears about nuclear war and Japan’s artificial intelligence (AI) propelled early funding on the Internet. Finally, Part III introduced the “Atari Democrats” and their early role in crafting the legislation in creating the Internet.

This post is a follow-up that makes some points about Al Gore and the Atari Democrats’ involvement in the success of the Internet and about how political leadership is needed for technological innovation and implementation. What is particularly needed is to draw lessons for future infrastructure, open-source software, and other enabling systems of innovation, social connectedness, and civic-minded entrepreneurship. These include smart energy grids, mobility networks, healthcare systems, e-commerce platforms, and crypto-blockchain exchanges.

The story of Al Gore “inventing” the Internet started after CNN’s Wolf Blitzer interviewed the Vice-President in 1999 and gained traction during the 2000 Presidential campaign against George W. Bush. The accusation circulated that Gore claimed he “invented the Internet,” and the phrase was used to tag the Vietnam vet and Vice President as a “liar” and someone who couldn’t be trusted. The issue says a lot about how election campaigns operate, the role of science and technology in the economy, and especially about the impact of governance and statecraft in economic and technological development. Here is what he actually said:

Of course, the most controversial part of this interview about Vice President Gore’s plans to announce his presidential candidacy was this statement, “During my service in the United States Congress, I took the initiative in creating the Internet.” That actual quote was turned into “inventing the Internet” and was used against him in the 2000 presidential elections. The meanings are quite different.

“Inventing” suggests a combination of technical imagination and physical manipulation usually reserved for engineers. We do, after all, want our buildings to remain upright and our tires to stay on our cars as we ride down the road. To “create” has a much more flexible meaning, indicating more of an art or a craft. There was no reason to say he invented the Internet except to frame it in a way that suggested he designed it technically, which does sound implausible.

Gore never claimed engineering prowess but could never adequately counter this critique. Gore won the popular vote in 2000 but failed in his bid for the Presidency when the Supreme Court halted the Florida recount, effectively awarding the “Sunshine State’s” electoral votes to Bush in a close and controversial election. It’s hard to say how much this particular meme contributed to the loss, but the “inventing” narrative stuck and has persisted in modern politics in subtle ways.

The controversy says more about how little we understand innovation and technological development and how impoverished our conversations have been about developing data networks and information technologies. The history of information technologies, particularly communications networking, has been one of interplay between technical innovation, market dynamics, and the intellectual leadership that guides policy actions, including military and research funding.

The data networking infrastructure, undoubtedly the world’s largest machine, required a set of political skills, both collective and individualized, to be implemented. In addition to the engineering skills that created the famed data packets and their TCP/IP (Transmission Control Protocol and Internet Protocol) protocols, political skills were needed for the funding, the regulatory changes, and the global power needed to guide the international frameworks that shape what is now often called Information and Communications Technologies (ICT). These frameworks included key developments at the International Telecommunications Union (ITU), the World Intellectual Property Organization (WIPO), and the World Trade Organization (WTO).

Al Gore got support from those generally considered the “real inventors” of the Internet. While Republicans continued to ridicule and “swiftboat” Gore for supposedly claiming he “invented the Internet,” many in the scientific community, including the engineers who designed the Internet, verified Gore’s role. Robert Kahn and Vinton Cerf acknowledged Gore’s initiatives as both a Congressman and a Senator.

    As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s.

    But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises. – Vint Cerf

Senator Gore was heavily involved in sponsoring legislation to research and connect supercomputers. Gore was an important member of the “Atari Democrats.” Along with Senator Gary Hart, Robert Reich, and other Democrats, he pushed forward “high-tech” ideas and legislation for funding and research. The term’s meaning varied, but “Atari Democrat” generally referred to a pro-technology and pro-free trade social liberal Democrat.

Atari was a successful arcade game, home game console, and software company in the 1970s. Started by Nolan Bushnell, it gave a start to Steve Jobs and Steve Wozniak, among others. The term began to stick to some Democrats around 1982 and generally linked them to the Democrats’ Greens and an emerging “neo-liberal” wing. It also suggested they were “young moderates who saw investment and ‘high technology’ as an answer and complement to the New Deal.”[1]

The New York Times discussed the Atari Democrats and tensions that emerged during the 1980s between the traditional Democratic liberals and the Atari Democrats. The latter attempted to find a middle ground on the economy and international affairs with the Republicans, while the former courted union workers, many of whom had shifted to Reagan and the Republicans.[2]

One of the emerging issues of the time was the trade deficit with Japan, whose cars and electronics were making significant inroads into the US economy. Gore and other Democrats were particularly concerned about losing the race for artificial intelligence. The Japanese “Fifth Generation” AI project was launched in 1982 by Japan’s Ministry of International Trade and Industry (MITI), which had a reputation at the time for guiding Japan’s very successful export strategy.

Known as the national Fifth Generation Computer Systems (FGCS) project, it was carried out by ICOT, later the Advanced IT Research Group (AITRG), part of the Japan Information Processing Development Corporation (JIPDEC), a research institute that brought in the Japanese computer manufacturers (JCMs) and a few other electronics industry firms. A primary US concern was that the combination of government involvement and the keiretsu corporate/industrial structure of the Japanese political economy would give Japan a significant advantage in advanced computing innovations.

Congress was concerned about the competition over high-speed processors and new software systems that were recognized at the time as crucial components in developing many new military armaments, especially the space-based “Star Wars” missile defense system that President Reagan had proposed as the Strategic Defense Initiative (SDI). Any system of satellites and weaponry forming a defensive shield against nuclear attack would need advanced microprocessors and supercomputing capabilities. It would require artificial intelligence (AI).

The likely vehicle for this research was the National Science Foundation (NSF), the brainchild of Vannevar Bush, who managed science and technology for the US during World War II, including the Manhattan Project that created the atomic bomb. The NSF was formed in 1950 with established research areas in biology, chemistry, mathematics, and physics. In 1962, it set up its first computing science program within its Mathematical Sciences Division. The NSF at first encouraged the use of computers in each of these fields and later provided a general computing infrastructure, including university computer centers, begun in the mid-1950s, that would be available to all researchers. In 1968, the Office of Computing Activities began subsidizing computer networking, funding some 30 regional centers to help universities efficiently use scarce computer resources and timesharing capabilities.

In 1984, a year after the military institutionalized TCP/IP, the NSF created the Office of Advanced Scientific Computing, whose mandate was to create several supercomputing centers around the US. [2] Over the next year, five centers were funded by the NSF:

General Atomics — San Diego Supercomputer Center, SDSC
University of Illinois at Urbana-Champaign — National Center for Supercomputing Applications, NCSA
Carnegie Mellon University — Pittsburgh Supercomputer Center, PSC
Cornell University — Cornell Theory Center, CTC
Princeton University — John von Neumann National Supercomputer Center, JvNC

However, it soon became apparent that these centers alone would not adequately serve the scientific community. Gore began to support high-performance computing and networking projects, particularly the National Science Foundation Authorization Act, to which he added two amendments: one calling for more research on the “Greenhouse Effect” driving climate change, and the other calling for an investigation of future options for communications networks connecting research computers. This Computer Network Study Act would specifically examine the requirements for data transmission conducted through fiber optics, data security, and software capability. The decision in 1985 to choose TCP/IP as the protocol for the planned National Science Foundation Network (NSFNET) would pave the way for the Internet.

In the Supercomputer Network Study Act of 1986, Gore proposed to direct the Office of Science and Technology Policy to study critical issues and options regarding communications networks for supercomputers at universities and Federal research facilities in the United States, and to report the results to Congress within a year. The proposal was attached to Senate Bill S. 2184, the National Science Foundation Authorization Act for Fiscal Year 1987, but it never passed.

Still, a report was produced that pointed to the potential role of the NSF in networking supercomputers, and in 1987, the NSF agreed to manage the NSFNET backbone with Merit, IBM, and MCI. In October 1988, Gore sponsored additional legislation for “data superhighways” in the 100th Congress, S. 2918, the National High-Performance Computer Technology Act of 1988. It was followed by H.R. 3131, the National High-Performance Computer Technology Act of 1989, sponsored by Rep. Doug Walgren to amend the National Science and Technology Policy, Organization, and Priorities Act of 1976. It directed the President, through the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), to create a National High-Performance Computer Technology Plan and to fund a 3-gigabit backbone network for the NSFNET.

It paved the way for S. 272, the High-Performance Computing Act of 1991, sponsored by Al Gore, which passed and was signed by President George H.W. Bush on December 9, 1991. Often called the Gore Bill, it led to the development of the National Information Infrastructure (NII) and the funding of the National Research and Education Network (NREN).

The Gore Bill began a discussion of the “Information Superhighway” that enticed cable, broadcast, telecommunications, satellite, and wireless companies to start developing their digital strategies. It also provided the groundwork for Gore’s international campaign for a Global Information Infrastructure (GII) that would lead to the relatively seamless and cheap data communications of the World Wide Web.

Its $600 million appropriation also funded the National Center for Supercomputing Applications (NCSA) at the University of Illinois, where graduate student Marc Andreessen and others created Mosaic, the early Web browser that became Netscape. The Netscape IPO started the Internet’s commercial boom of the 1990s.

As chairman of the Senate Subcommittee on Science, Technology, and Space, Gore held hearings on these issues. During a 1989 hearing colloquy with Dr. Craig Fields of DARPA and Dr. William Wulf of the NSF, Gore solicited information about what constituted a high-speed network and where the technology was headed. He asked how much sooner NSFNET speeds could be enhanced 30-fold if more Federal funding were provided. During this hearing, Gore poked fun at himself in an exchange about networking speeds: "That's all right. I think of my [1988] presidential campaign as a gigaflop." [The witness had explained that "gigaflop" refers to one billion floating-point operations per second.]

Al Gore is interesting because he has been both a successful legislator and a mover of public opinion on climate change and the global Internet. He can take credit for much of the climate change discussion, having worked hard to bring the topic to the public's attention, mobilize action on markets for carbon credits, and accelerate developments in alternative energy. His actions and tactics are worth studying, as we need more leaders, perhaps more "Atari Democrats," who can create positive futures rather than obsessing over tearing down what we have.

Summary

The transition of the Internet from a military network to a global infrastructure was shaped by key political and technological influences. This article examines the role of Al Gore and the “Atari Democrats” in fostering Internet development through policy, funding, and legislative initiatives.

The notion that Al Gore claimed to have “invented the Internet” originated from a mischaracterization of his 1999 CNN interview, where he stated, “During my service in the United States Congress, I took the initiative in creating the Internet.” This phrase was misinterpreted and politicized during the 2000 presidential election. Gore never asserted engineering credit but played a significant role in promoting the early legislative framework supporting the Internet’s expansion.

Al Gore’s contributions were validated by key Internet pioneers, such as Robert Kahn and Vinton Cerf, who acknowledged his leadership in recognizing the importance of high-speed networking and supercomputing for economic and educational development. Gore’s legislative efforts date back to the 1970s, when he advocated for advanced computing and communication networks, influencing the development of NSFNET, which became the foundation of the modern Internet.

The “Atari Democrats,” including Gore, were a faction of the Democratic Party in the 1980s that focused on technological innovation, economic growth, and global competitiveness. They sought to leverage advancements in computing and networking to maintain U.S. leadership in technology. Concerns about Japan’s artificial intelligence (AI) advancements in the 1980s further motivated U.S. policymakers to invest in high-performance computing.

Key legislative milestones include:

The 1986 Supercomputer Network Study Act, which promoted research on networking supercomputers.

The National High-Performance Computer Technology Act of 1988, which aimed to develop a national research and education network.

The High-Performance Computing Act of 1991 (commonly called the “Gore Bill”), which laid the foundation for the modern Internet by expanding the NSFNET backbone and influencing commercial Internet growth.

Gore's leadership extended to advocating for a "Global Information Infrastructure" (GII), emphasizing digital connectivity and international collaboration. His work not only contributed to the technological expansion of the Internet but also played a role in shaping public awareness of climate change and the importance of sustainability in innovation.

The history of the Internet highlights the interplay between technical expertise, political vision, and regulatory frameworks. Gore’s influence demonstrates the necessity of political leadership in fostering technological advancements that benefit society.

Citation APA (7th Edition)

Pennings, A.J. (2022, Jul 09) Al Gore, Atari Democrats, and the “Invention” of the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/al-gore-atari-democrats-and-the-invention-of-the-internet/

Notes

[1] E. J. Dionne, Special to The New York Times. WASHINGTON TALK; Greening of Democrats: An 80’s Mix of Idealism And Shrewd Politics. The New York Times, 14 June 1989, www.nytimes.com/1989/06/14/us/washington-talk-greening-democrats-80-s-mix-idealism-shrewd-politics.html. Accessed April 24th, 2019.
[2] Wayne, Leslie. Designing a New Economics for the “Atari Democrats.” The New York Times, 26 Sept. 1982, www.nytimes.com/1982/09/26/business/designing-a-new-economics-for-the-atari-democrats.html.

Linked References (APA 7th Edition)

Pennings, A. J. (2022). How “Star Wars” and the Japanese Artificial Intelligence (AI) Threat led to the Internet Japan. apennings.com. Retrieved from https://apennings.com/how-it-came-to-rule-the-world/how-star-wars-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-japan/

Pennings, A. J. (2022). NSFNET and the Atari Democrats. apennings.com. Retrieved from https://apennings.com/how-it-came-to-rule-the-world/how-star-wars-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-part-iii-nsfnet-and-the-atari-democrats/

Blitzer, W. (1999). Inventing the Internet [Video]. YouTube. Retrieved from https://youtu.be/HpMgne-X-Ns?t=689

Cerf, V. (2015). The Role of Al Gore in the Development of the Internet. BBC News. Retrieved from http://www.bbc.com/news/science-environment-31450389

New York Times. (1989, June 14). Greening Democrats: 80’s Mix Idealism & Shrewd Politics. The New York Times. Retrieved from https://www.nytimes.com/1989/06/14/us/washington-talk-greening-democrats-80-s-mix-idealism-shrewd-politics.html

New York Times. (1982, September 26). Designing a New Economics for the Atari Democrats. The New York Times. Retrieved from https://www.nytimes.com/1982/09/26/business/designing-a-new-economics-for-the-atari-democrats.html

Stanford University. (1982). The Japanese Fifth Generation AI Project. Stanford Digital Repository. Retrieved from https://stacks.stanford.edu/file/druid:wt917by4830/wt917by4830.pdf

ScienceDirect. (1993). Fifth Generation Computer Systems (FGCS). ScienceDirect Journal. Retrieved from https://www.sciencedirect.com/science/article/pii/0167739X93900038

National Science Foundation (NSF). (1985). NSFNET’s Decision to Choose TCP/IP. First Monday. Retrieved from https://firstmonday.org/ojs/index.php/fm/article/view/799/708

U.S. Congress. (1991). High-Performance Computing and the National Research and Education Network Act. Congress.gov. Retrieved from https://www.congress.gov/bill/102nd-congress/senate-bill/272/all-info#latestSummary-content

Wikipedia. (n.d.). National Center for Supercomputing Applications (NCSA). Retrieved from https://en.wikipedia.org/wiki/National_Center_for_Supercomputing_Applications

Pennings, A. J. (2022). Contending Information Superhighways. apennings.com. Retrieved from https://apennings.com/global-communications/contending-information-superhighways/

Pennings, A. J. (2022). Engineering TCP/IP Politics and the Enabling Framework of the Internet. apennings.com. Retrieved from https://apennings.com/telecom-policy/engineering-tcp-ip-politics-and-the-enabling-framework-of-the-internet/

Pennings, A. J. (2022). The Netscape IPO and the Internet’s commercial boom. apennings.com. Retrieved from https://apennings.com/democratic-political-economies/from-new-deal-to-green-new-deal-part-3-its-the-infrastructure-stupid/


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is a Professor at the State University of New York, South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He began his teaching career at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

U.S. Internet Policy, Part 6: Net Neutrality, Broadband Infrastructure and the Digital Divide

Posted on | June 4, 2022 | No Comments

In the era of the COVID-19 pandemic, the Internet proved to be more critical than ever. Parents telecommuted to work, kids telelearned at home, and streaming media services entertained both. Sick people began to use telemedicine instead of visiting the doctor or hospital. Many people applied for jobs online. Families also embraced the relative safety and convenience of e-commerce delivering commodities to the home.

Yet broadband services in the US were often not available, affordable, or up to the task of providing all the people in a home with the bandwidth they needed. Previously, I examined the decline of ISPs and the dominance of traditional telcos over broadband. I also wrote about the Trump administration's support for the FCC's repeal of net neutrality. This post looks at plans to increase funding for broadband infrastructure in the US and lists some of the challenges facing the Biden Administration's Internet policy.

The digital divide proved to be more consequential than ever as the K-shaped recovery took shape, exacerbating income divisions. The divide was particularly stressful for American families as schools and other activities for kids closed down during the COVID-19 pandemic. Some 20 million Americans had no Internet service, or only very slow service, while another 100 million could not afford broadband.

Inequalities were deepened by the types of jobs affected: contact jobs in service industries were particularly hard hit, while professional jobs that could be conducted online did well. Financial assets also continued to appreciate due to the Federal Reserve's low interest rates, and quantitative easing kept mortgages cheap and raised home prices.

During his first 100 days, Biden proposed a $2.25 trillion infrastructure package focused on updating transportation infrastructure, along with funding to fight climate change and other provisions to support American families. It also incorporated the Accessible, Affordable Internet for All Act, introduced in early 2021 by US Senator Amy Klobuchar (D-MN), Co-chair of the Senate Broadband Caucus, and House Majority Whip James E. Clyburn (D-SC). This plan to modernize broadband in underserved communities involved:

– $80 billion to deploy high-speed broadband infrastructure;
– $5 billion for a related secured loan program; and
– a new office within the National Telecommunications and Information Administration (NTIA) to monitor and ensure transparency in these projects.

In early November 2021, the House passed the Infrastructure Investment and Jobs Act by a vote of 228-206, allocating over $1 trillion in spending for much-needed infrastructure, with $579 billion in new spending, including $65 billion for broadband.

The Senate had organized and passed the bill in the summer, but it was held up in the House of Representatives. House progressives wanted the bill tied to a third phase of the Build Back Better program, with increased social spending for healthcare, new housing, and climate change. Eventually, Speaker of the House Nancy Pelosi organized enough votes to pass the measure independently, and President Biden signed the bill on November 15, 2021.

Infrastructure Bill

The infrastructure bill allocates money to three major areas: direct payments to consumers to pay for broadband services; support for municipal networks, including those in tribal areas; and subsidies for major companies to build out more broadband infrastructure, such as fiber-optic lines and wireless base stations. The money is destined for states that can demonstrate populations in need.

It allocates $14 billion to help low-income Americans pay for service at about $30 a month. In the spring of 2022, President Biden announced progress in the Affordable Connectivity Program, an extension and revision of the Emergency Broadband Benefit.

The digital divide reemerged as a pressing policy issue due to the priorities of the COVID-19 pandemic and changes in education and work. More and better access, especially in rural areas, became a high priority. Major broadband providers have organized to take advantage of infrastructure spending and to comply with the administration's push to provide $30 monthly broadband subsidies for eligible households.

Several issues linger in the public's consciousness, depending on media attention, and some have come to the forefront of public scrutiny. These include:

– Even more, better, and cheaper broadband access through mobile, satellite, and wireline facilities, especially in rural areas. Broadband strategies must also consider the implications of SpaceX’s Starlink satellite network.

Also, we should be looking at a wider range of Internet issues such as:

– Antitrust concerns about cable and telco ISPs, including net neutrality. [1]
– Privacy and the collection of behavioral data by platforms to predict, guide, and manipulate online user actions.
– Section 230 reform for Internet platforms and content producers, including assessing social media companies’ legal responsibilities for user-generated content.
– Security issues, including ransomware and other threats to infrastructure.
– Deep fakes, memes, and other issues of misrepresentation, including fake news.
– eGovernment and digital money, particularly the role of blockchain and cryptocurrencies as the Internet moves to Web 3.0.

Citation APA (7th Edition)

Pennings, A.J. (2022, Jun 4). U.S. Internet Policy, Part 6: Broadband Infrastructure and the Digital Divide. https://apennings.com/telecom-policy/u-s-internet-policy-part-6-broadband-infrastructure-and-the-digital-divide/


Notes

[1] Newman, R. (2016). The Debate Nobody Knows: Network Neutrality’s Neoliberal Roots and a Conundrum for Media Reform. International Journal of Communication, 10, 5969–5988.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Victoria University in New Zealand, where he studied the changes in that country's telecommunications industry. He also researched telecommunications issues at the East-West Center in Honolulu, Hawaii, and was associated with the Pacific Telecommunications Council (PTC). He keeps his American home in Austin, Texas.

Analyzing YouTube Channels

Posted on | May 2, 2022 | No Comments

The video capabilities of the Internet have made possible a new era of media analysis. The traditions of film and television criticism can be brought to the new online media, including YouTube. The challenge is both to apply traditional media analysis and to suggest new modes of televisual criticism for this relatively new medium.[1]

This post reviews some of the techniques used in cinematic and television production and suggests a strategy for analyzing YouTube channels based on work in my Visual Rhetoric and IT class. It uses a semiotic (study of meaning) framework to examine channel imagery (mise-en-scène) and editing (montage) to determine what might make such videos successful. It applies denotative and connotative strategies to describe meaning-making practices with an appropriate vocabulary and to explain what YouTube creators do to inform and entertain the audience.[2]

Storytelling is an important consideration. Who is telling what story, and how? What narrative devices are being used? How does the story engage the audience? The channel Lessons from the Screenplay (LFTS), for example, discusses how the new James Bond was introduced in 2006. Note who is speaking, whether you actually see them, and how the story is being told.

What are the main signifying practices that make a YouTube channel a success? A semiotic approach looks closely at the details visible in the video (denotation). It then connects the content with connotative meanings such as social myths and narratives. These details include various "signs" such as indexical metrics, mainly subscribers, views, likes, and the money YouTubers make. The approach also examines typographies, logos, and other meaning-making practices, such as camera images and how those images are spliced together.

The shot continues to be the major unit of analysis, reflecting the relationship between the camera and the scene. The primary visual grammar largely holds: the establishing or long shot (LS), the medium shot (MS), and the closeup (CU). The wider shot creates a meaningful context for the tighter images that provide detail and usually heightened emotion. The smartphone now offers an extraordinarily high-quality camera for single shots or high-definition video that can get YouTube channelers started.

Ask more about the mise-en-scène. What do you see in the image? The lighting? The props? The backgrounds? The special effects (FX)? Is a drone used for bird's-eye shots? How is the camera used to create the shot: zooms, pans, tilts? Who is doing the shooting, and what is the motivation or reason for each technique?

A bit more challenging is the editing. Applications like iMovie, or simply hiring someone on Fiverr.com, have been helpful for YouTube channelers. The analytical challenge is to keep up with the vocabulary and to understand what the montage is doing in terms of "building" meaning in the narrative.

A narrative is a series of connected events edited into an order of chronological significance. Montage editing can include parallel editing, flashbacks, and flash-forwards. What about the montage is noteworthy? What is the pace of the editing? What transitions are being used, and once again, what is the motivation?

Ask what drives the narrative. Who is the storyteller? Who is the narrator? Are they like the anchor in a news program? What is the mode of address: voice-over, talking to the camera, or a combination? Do we see the narrator, or is everything told from a voice of anonymous authority, the voice of "God" booming over the montage of images?

Also relevant are the microeconomics of the channel and the overall political economy of YouTube. Understanding how YouTube channels make money, whether through AdSense, brand arrangements, or, more recently, patronage, helps explain a channel's messaging. In the latter model, individuals and organizations become patrons by pledging to support a channel financially, including smaller payments of gratitude at websites such as Buy Me a Coffee.

Recommendation engines are key to understanding viewer captivity on YouTube. Using sophisticated algorithms and a data collection system, the platform finds content related to your search or to the video you are watching. These programs reduce what could become complex search and decision processes to just a few recommendations, listing a series of video thumbnails based on metadata from the currently viewed video and information gathered about your past viewings and searches.
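To make the idea concrete, here is a minimal sketch of a metadata-based recommender in Python. It is a toy illustration of the general principle, not YouTube's actual algorithm; the tag-overlap scoring, the weighting, and the sample video IDs are all hypothetical assumptions.

    # A toy metadata-based recommender: an illustration only, not YouTube's system.
    from collections import Counter

    def recommend(current_tags, history_tags, candidates, top_n=3):
        """Rank candidate videos by overlap with the current video's tags
        and with tags accumulated from the viewer's watch history."""
        history_counts = Counter(history_tags)  # how often each tag appears in past views
        scored = []
        for video_id, tags in candidates.items():
            overlap_current = len(set(tags) & set(current_tags))    # similarity to what is playing now
            overlap_history = sum(history_counts[t] for t in tags)  # affinity with past viewing
            score = 2 * overlap_current + overlap_history           # arbitrary weighting (an assumption)
            scored.append((score, video_id))
        return [vid for score, vid in sorted(scored, reverse=True)[:top_n]]

    # Hypothetical usage
    candidates = {
        "vidA": ["film-analysis", "james-bond", "screenwriting"],
        "vidB": ["cooking", "travel"],
        "vidC": ["editing", "film-analysis"],
    }
    print(recommend(["film-analysis", "editing"], ["screenwriting", "film-analysis"], candidates))
    # ['vidC', 'vidA', 'vidB'] under these made-up weights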

In February 2005, Chad Hurley, Steve Chen, and Jawed Karim established YouTube in San Bruno, California. YouTube has since become the second-most visited website worldwide, with 14 billion monthly views. Only Google, from the same parent company, Alphabet, has more. By 2022, almost 5 billion videos were watched on YouTube every single day, and 694,000 hours of video were streamed each minute. More than half of those views occurred on mobile devices.

This particular form of YouTube analysis asks: What signifying practices make a YouTube channel a success?[3] It applies denotative and connotative semiotic methods to describe and explain what creators do in their videos to educate and/or entertain the audience. This process trains students to understand what techniques are used and what impact they have. Hopefully, it also contributes to the growing area of YouTube Studies in academia.

Citation APA (7th Edition)

Pennings, A.J. (2022, May 2) Analyzing YouTube Channels. apennings.com https://apennings.com/media-strategies/analyzing-youtube-channels/


Notes

[1] Pennings, A. (2020, Feb 2). YouTube Meaning-Creating (and Money-Creating) Practices. Retrieved from https://apennings.com/media-strategies/youtube-meaning-creating-practices/ Accessed May 1, 2022.
[2] Our class uses a series of web pages by Daniel Chandler to learn basic televisual grammar and vocabulary, semiotics and signs, as well as denotation, connotation, and myth. His book Semiotics: The Basics is also very useful.
[3] According to Karol Krol, there are several features or qualities that make a YouTube channel successful: continuous posting, using an angle, content quality, and content that is international.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor in the Department of Technology and Society, State University of New York, Korea, and the Undergraduate Program Director for the BS in Technological Systems Management. Before joining SUNY, he was on the faculty of New York University, where he created and co-managed the BS in Digital Communications and Media. He began teaching cinema studies at the University of Hawaii while a PhD student at the East-West Center in Honolulu, Hawaii.

Wireless Charging Infrastructure for EVs: Snack and Sell?

Posted on | April 22, 2022 | No Comments

I recently chaired the defense of a PhD dissertation on patents for electric vehicle (EV) charging. Soonwoo (Daniel) Chang, with his advisor, Dr. Clovia Hamilton, did a great job mapping trends associated with the technology patents central to wireless electric charging, including resonant induction power transmission (RIPT).[1] The research got me interested in exploring more about the possibilities of wireless charging as part of my Automatrix series, especially since the US government is investing billions of dollars in developing electric charging infrastructure throughout the country.

This post discusses some of the issues related to the wireless charging of EVs. Cars, but also buses, vans, and long-haul trucks, are increasingly using electric batteries for locomotion instead of petroleum-fueled internal combustion engines (ICE). It suggests that we consider wireless charging infrastructure that would allow EVs to partially replenish their batteries ("snack") and, in some cases, sell electricity back to the grid. Currently, not many EV original equipment manufacturers (OEMs) are designing their automobiles for wireless charging. Why not?

Most likely, OEMs are quite aware of what they need to succeed in the short term. Tesla, for example, has developed an EV charging infrastructure viable for personal autos using cables that plug into a vehicle. But we should also be wary of some of the limitations of plug-in chargers and prepare to build a more dynamic infrastructure that would allow charging in multiple locations and without getting out of the vehicle. These are some of my speculations, and they do not necessarily reflect the results of the soon-to-be Dr. Chang, whose research sparked a lot of my thinking.

You may have used wireless charging for your smartphone. It’s helpful to get a quick charge without dealing with plugs and wires. Inductive charging has some limitations though. While you can play music or a podcast, or talk over a speaker mic, it’s typically immobile. Your phone also needs to be very close to the charging device, its coils aligned properly, and it gets hot. It’s generally not that energy-efficient either, often losing more than 50% of its electricity while charging. With several devices connected in nearly every home, the losses can add up, putting strain on a community’s electrical grid.

In EV wireless charging, a receiver with a coiled wire is placed underneath the vehicle and connected to the battery. This "near-field" charging requires the receiver to come close to a similar charging coil, embedded in a plate on the ground, that transmits the energy.

However, advances in magnetic resonance technologies have increased the distance and energy efficiency of wireless EV charging. Power transfer is increased by electromagnetically tuning the two devices to each other through a shared magnetic flux, allowing convenient replenishing of a vehicle's battery. Electrical devices generally contain conductive wire wound into coils, which initially stabilize a circuit by resisting or storing the flow of current. These coils are classified by the frequency of the current flowing through them: direct current (DC), audio frequency (AF), or radio frequency (RF). These frequencies can be managed and directed to transfer power over a short distance to another coil. The fields involved are not ionizing radiation with sufficient energy to detach electrons, so they appear to be relatively safe.
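As a rough illustration of the tuning involved, the resonant frequency of a simple inductor-capacitor (LC) circuit is f0 = 1 / (2π√(LC)). The short Python sketch below is my own example; the coil and capacitor values are hypothetical, chosen only to land near the 85 kHz band used for EV wireless charging, and do not reflect any vendor's actual design.

    import math

    def resonant_frequency_hz(inductance_henries, capacitance_farads):
        """Resonant frequency of an LC circuit: f0 = 1 / (2*pi*sqrt(L*C))."""
        return 1.0 / (2.0 * math.pi * math.sqrt(inductance_henries * capacitance_farads))

    L_coil = 50e-6   # 50 microhenries (hypothetical coil)
    C_tune = 70e-9   # 70 nanofarads (hypothetical tuning capacitor)
    print(f"{resonant_frequency_hz(L_coil, C_tune) / 1e3:.1f} kHz")  # about 85.1 kHz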

WiTricity, for example, uses frequencies around 85 kHz to tune the two coils and expand the charging range from a few millimeters to tens of centimeters. The Massachusetts company has spearheaded the development of this technology and has opened up many possibilities, particularly its use in the public sphere. This may also include dynamic charging, which allows a vehicle to charge while moving. See the video below about WiTricity:

It's no secret that charging issues are a limiting factor for EV diffusion. Drivers of ICE vehicles have resigned themselves to getting out of their cars, organizing a payment, inserting the nozzle into the gas tank, and waiting patiently for a few minutes while the liquid hydrocarbons pour into the tank. EV owners, however, are still dealing with a lack of available charging locations as well as challenges with plug standards, payment systems, long lines, and lengthy charging periods. Currently, most EV owners in the US charge at home with L2 chargers that can readily be bought online or at a Home Depot. EV owners in urban areas need to find other locations, such as parking garages, and may face fines for staying too long.

Standards both permit and restrict technological solutions. The Society of Automotive Engineers (SAE) published the J2954 standard for wireless charging in 2020, defining three power levels, WPT1 (3.7 kW), WPT2 (7 kW), and WPT3 (11 kW), for transfer across gaps of up to 10 inches. At these rates, fully charging a battery may take up to three and a half hours.[2]

Granted, these are not impressive numbers given the competition from high-end EV superchargers. Even the EVgo chargers at my local Walmart in Austin support several standards (J-1772, CHAdeMO, CCS/SAE) and charge at 14.4 to 50 kW. Note that EVs have onboard chargers with varying acceptance rates (in kW) that convert the AC electricity found in homes to the DC a car battery needs to store. A Chevy Volt with a 3.3 kW acceptance rate will not charge as fast as a Tesla Model S with 10 kW, no matter where it is plugged in. Other factors include charging cables, which can be awkward to handle. The experimental charging "snake" that reaches out and automatically finds and connects to the car's charging port does not seem to be viable yet. So, charging remains a limiting factor for EV adoption.
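A rough back-of-the-envelope calculation shows why acceptance rates matter: charging time is roughly the energy the battery still needs divided by the slower of the station's output and the vehicle's onboard acceptance rate. The Python sketch below is my own illustration, with a hypothetical battery size and an assumed efficiency figure rather than manufacturer specifications.

    def charge_hours(battery_kwh, state_of_charge, station_kw, acceptance_kw, efficiency=0.90):
        """Estimate hours to fully charge; the slower of station power and
        onboard acceptance rate limits the effective charging rate."""
        energy_needed_kwh = battery_kwh * (1.0 - state_of_charge)
        effective_kw = min(station_kw, acceptance_kw) * efficiency
        return energy_needed_kwh / effective_kw

    # Hypothetical 60 kWh pack starting at 20% charge on an 11 kW (WPT3-class) pad
    print(round(charge_hours(60, 0.20, 11, 10), 1))   # about 5.3 hours with a 10 kW acceptance rate
    print(round(charge_hours(60, 0.20, 11, 3.3), 1))  # about 16.2 hours with a 3.3 kW acceptance rate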

The strategy behind wireless charging will probably focus more on what WiTricity has coined "power snacking" than on full meals of electricity. Snacking is actually better for your battery, as longevity improves if you don't let capacity run down below 20% or recharge it to 100%. Keeping the ions in equilibrium across the battery reduces strain and increases the number of charge cycles before degradation occurs.

The snacking can be done by a waiting taxi, a bus stopping for a queue of passengers, a quick stop at a convenience store, and perhaps EVs waiting at a red light. Shopping centers are likely to “capture” customers with charging stalls, especially if they can reduce costs by having micro-grids with solar panels on roofs.

Many countries have tested infrastructure for charging EVs in motion, although this will require substantially more investment. “Dynamic” wireless charging appears to be feasible but comes with high costs as it needs to be embedded in existing infrastructure such as bridges and highways.

The major issues for wireless charging are the infrastructure changes needed and OEM buy-in. Wireless charging will require more planning and construction than current charging stations. It will also require monitoring and payment applications and systems. Most importantly, it will require electricity, and without significant capacity coming from renewable sources, much of the purpose will be defeated. Vehicle manufacturers will need to include the wireless charging pads and ensure their safety. On the positive side, they can use smaller batteries for many models, as constant recharging will reduce range anxiety.

Wireless technologies have also been successfully tested for "vehicle-to-grid" (V2G) transmission. This innovation means a car or truck can sell electricity back to the grid. It might be particularly useful for charging locations off the grid or in places challenging to connect, for instance, charging pads at a red light. So we might see a "snack or sell" option in future cars. Prices are likely to vary by time, location, and charging speed, but this setup will present some arbitrage opportunities.

The arbitrage economics are based on "valley filling," when EVs charge during low-demand hours, often overnight, and "peak shaving," when an EV transmits stored energy back into the grid during high-demand hours. So, for example, a vehicle charging at home overnight with cheaper grid electricity, or with excess power from solar panels, can sell it on the way to work or at the office parking lot. You might not get rich, but considering the money currently spent on diesel or petrol, it could still help your wallet.
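A simple back-of-the-envelope sketch of that arbitrage is shown below. The prices, energy volume, and round-trip efficiency are entirely hypothetical assumptions, not any utility's actual rates.

    def daily_arbitrage_profit(kwh_traded, off_peak_price, peak_price, round_trip_efficiency=0.85):
        """Buy energy cheap off-peak, sell it back at peak rates, minus conversion losses."""
        cost = kwh_traded * off_peak_price
        revenue = kwh_traded * round_trip_efficiency * peak_price
        return revenue - cost

    # Trading 20 kWh a day between a $0.08/kWh overnight rate and a $0.25/kWh peak rate
    print(round(daily_arbitrage_profit(20, 0.08, 0.25), 2))  # about $2.65 a day under these assumptions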

Effective infrastructure, like highways or the Internet, provides indirect network effects, allowing different entities to use the system and expanding the network's possibilities. The Global Positioning System (GPS), for example, uses some 27 satellites that transmit pulsed time codes. These signals enabled the invention of a wide range of devices that triangulate and compute a latitude, longitude, and altitude to provide different location services. In the same way, an effective wireless charging infrastructure would enable many different vehicles to use the electrical network. Much of the wireless charging infrastructure will be built by corporate fleets like Amazon, Best Buy, and Schindler Elevator. Hopefully, the US Post Office will catch up.
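For the curious, here is a toy illustration of that triangulation (more precisely, trilateration) idea: given known satellite positions and measured ranges, a receiver's position can be recovered by least squares. The numbers are hypothetical, and a real GPS receiver also solves for its clock bias and applies many corrections, so this is only a sketch of the principle.

    import numpy as np
    from scipy.optimize import least_squares

    sats = np.array([              # hypothetical satellite positions (km)
        [15600.0,  7540.0, 20140.0],
        [18760.0,  2750.0, 18610.0],
        [17610.0, 14630.0, 13480.0],
        [19170.0,   610.0, 18390.0],
    ])
    true_pos = np.array([1000.0, 2000.0, 3000.0])     # the "unknown" receiver position
    ranges = np.linalg.norm(sats - true_pos, axis=1)  # simulated range measurements

    def residuals(p):
        return np.linalg.norm(sats - p, axis=1) - ranges

    print(least_squares(residuals, x0=np.zeros(3)).x)  # recovers roughly [1000, 2000, 3000]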

However, the US government made a down payment on EV charging stations in the 2021 infrastructure bill. Legislation initially targeted $15 billion in the Infrastructure Investment and Jobs Act for "a national network of electric vehicle (EV) chargers along highways and in rural and disadvantaged communities." That amount was cut in half, to $7.5 billion, in order to provide much-needed funding for electric bus fleets for schools and municipalities.[3] Should future infrastructure spending target wireless charging?

Once we move beyond the internal combustion engine (ICE), we will see a lot more flexibility in the design of autos, buses, vans, trams, and other vehicles. EVs require fewer parts and are easier to construct. We see this on the lower end with electric bikes and even those controversial electric scooters. New forms of electric autonomous trolleys and vans are needed to revive urban transportation in a quiet and sustainable way. All these changes in mobility will require changes in the electrical infrastructure.

The term "Smart Mobility" has emerged as an important sociological and technical construct. The city of Austin, Texas, describes it this way:

    Smart Mobility involves deploying new technology to move people and goods through the city in faster, safer, cleaner, more affordable and more equitable ways. Our mission is “to lead Austin toward its mobility future through meaningful innovation, collaboration, and education.”

Smart devices have expanded to smart vehicles. Autonomy is becoming more prevalent, and some cars and trucks offer full self-driving (FSD) options, with or without a passenger. Wireless charging is central to this process. Auto-valet services, for example, will allow your car to drop you off and park itself, likely at a stall that can provide charging. Who is going to get out to plug it in?

Notes

[1] Gaining Competitive Advantage with a Performance-Oriented Assessment using Patent Mapping and Topic Trend Analysis: A Case for Comparing South Korea, United States and Europe’s EV Wireless Charging Patents. A 2022 PhD Dissertation by Soonwoo (Daniel) Chang for Stony Brook University in New York. He can be reached at sdchang8@gmail.com

[2] A kilowatt (kW) is 1,000 watts of electrical power, and a kilowatt-hour (kWh) is the energy delivered by one kilowatt sustained for an hour. Named after James Watt, who developed the modern steam engine, a watt is the unit of electrical power equal to one ampere flowing under the pressure of one volt (Volts x Amps = Watts). The time it takes to charge a car depends on both the car's acceptance rate and the amount of power sent by the charging station.

[3] Known officially as the Infrastructure Investment and Jobs Act, it authorized over half a trillion dollars in spending for airports, bridges, broadband, rail, roads, and water systems. It also included up to $108 billion in spending for public transportation such as rail as part of the largest federal investment in public transit in the nation’s history. Another $73 billion was for upgrades to the electrical grid to transmit higher loads while efficiently collecting and allocating energy.

Citation APA (7th Edition)

Pennings, A.J. (2022, Apr 22). Wireless Charging Infrastructure for EVs: Snack and Sell? apennings.com https://apennings.com/mobile-technologies/wireless-charging-infrastructure-for-evs-snack-and-sell/


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at St. Edwards University in Austin, Texas. Originally from New York, he taught at Marist College and from 2002 to 2012 was on the faculty of New York University, where he taught digital economics and information systems management. His first academic job was at Victoria University in New Zealand. He was also a Fellow at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

    Follow apennings on Twitter

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.