Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Show-Biz: The Televisual Re-mediation of the Modern Global Economy

Posted on | June 8, 2021 | No Comments

My use of “Show-Biz” refers to the meaning-making techniques of financial journalism and their relationship to the narratives that represent and drive the economy. Media industries “show” business and finance through various camera, editing, and special effects techniques, drawing in data from many sources and presenting it in different windows of high-resolution screens. These techniques create ways of seeing and showing the economy. Consequently, they influence public opinion as well as the investment and trading strategies that shape global, national, and local economic activity and investment patterns.

This post concerns the televisual surveying systems that monitor and display global business and financial activities. It starts with a theory of media, called remediation, and then examines the different elements or media that are combined into the broadcast or streaming of financial news. Two key concepts, transparent immediacy and hypermediation, help us understand the way this media operates. These transmissions of mediated financial information have consequences for the global economy.

This is not to say that such representations are necessarily valid constructions of reality or distortions of truth. One of the central themes of this post is that strategies of visual mediation are intertwined with authentic experiences and facts, and that strategies of interpretation and a measure of skepticism are therefore required.

Television news expanded significantly in the 1970s with the creation of cable systems and satellite networks. Several networks were dedicated to financial news. Cable and traditional TV combined when CNBC (Consumer News and Business Channel) was established in April 1989 as a joint venture between Cablevision and NBC. Bloomberg Television was launched in the United States on January 1, 1994, and drew on a decade of financial analytics provided through the famous “Bloomberg Box.” (See image of my daughter pretending to use a Bloomberg Box)

Another successful financial news network was Fox Business Network, launched on October 15, 2007. Yahoo also emerged as a major financial news provider. Bought by Verizon in 2016, it attracted over 100 million global monthly visitors on average in 2019, according to media analytics company ComScore. Yahoo Finance was recently sold by Verizon to Apollo Global Management along with AOL.

How is financial news mediated in these networks? What signifying practices are brought into play and for what purposes? What are the implications of their mediating styles and techniques for how we understand the health of the global economy, levels and types of employment, and the potential of innovative new industries and companies?

Financial television news plays constantly in trading rooms and other business environments. It is also popular in homes, whether watched by day traders or interested citizens. Many people might be invested in Bitcoin or other cryptocurrencies, concerned about housing prices, or following their 401(k)s and other investments. Televised news and economic indicators play a vital role in various audiences’ perceptions of the economy.

Anchored by personalities who are informed and presentable, the television screen combines live human commentary with indexical information, graphs, and other numerical representations of different parts and states of the economy. The news anchor fixes meaning, guiding the narrative while transfixing the audience with the immediacy of their presence.

Remediation is literally the re-mediating of content through the inclusion of old media into new media, or sometimes the inclusion of new media in an old medium, such as the use of computer and web techniques in modern television. One of the earliest media theorists, Marshall McLuhan, made these observations in the 1960s. Print remediates writing, which remediates speech. The TV anchor, an actual person of authority who “anchors” the meaning of the broadcast, is remediated on the television news screen.

However, Bolter and Grusin made a more systematic analysis in their Remediation, published by MIT Press (2000).[1] They echoed McLuhan’s observation that the content of new media is old media. They also ventured that the motivation for remediation is a desire for a “healed” media, one that provides better access to a more “authentic” version of reality. Bolter and Grusin pointed to a “double logic” of remediation – two different modes of representation that strive to better access the real. Television has coped with this dual system of remediated representation since its origins with a variety of incorporations and innovations.

One mode of remediation is transparent immediacy, the desire for a type of immersion into the medium that erases the technology and provides an unblemished experience. The cinematic movie experience strives for this authenticity with its large screen, darkened room, and conservative camera and editing practices. The viewer wants to forget the presence and techniques of the movie apparatus and believe they are in the presence of the objects of representation – the actors and sets. Television achieves this to a lesser degree.

McLuhan and others argued that TV was primarily an acoustic medium, mainly because sound anchors the meaning of the visual elements. Television is a storyteller, primarily an oral one. So it is no surprise that human “anchors” on broadcast news play an important role. Anchors read the news and also conduct live interviews with guest experts for additional credibility and information. They present the major topics of the day in real-time, fixing the meaning of the broadcast and organizing its narratives. Financial television borrows this sonic dominance, although it streams many other sources of data and textual news.

Many financial analysts have become celebrities, such as Mohamed A. El-Erian, Jared Bernstein, and Bill Gross. Neel Kashkari and other Fed district presidents are also very popular. These “talking heads” are brought in to contribute to the narrative and bring their expertise remotely from different cities and countries, representing key companies or government positions.

Television news is interrupted occasionally by “Breaking News” that reinforces immediacy. This interruption usually includes live reporting by a journalist at a relevant location. Drone or helicopter “birds-eye” viewing enhances the dominant perspective of television news. Reports by the Fed Chair after their FOMC meeting on interest rates are very popular. These events keep viewers “glued” to the screen.

(Image: Bloomberg intraday screen)

Hypermediation is the other strategy and uses techniques of representation that “foreground the medium.” Television has taken on the multi-mediated look of the computer, with different windows gathering in activities, data, and events. While the anchor is prominent (although most trading environments turn off the sound), other windows display hyper-mediated representations of economic and financial data streaming in from around the world. This information comes primarily in the form of charts, graphs, and indices presenting a quantitative view of the world. The reliability of this global gaze often draws on the truth claims of numeracy and remediates the spreadsheet. In particular, the visual techniques of the table are utilized to quickly communicate an augmented view of the economy.

Financial hypermediation has moved away from transparency. Instead, it integrates an augmented reality with indexical denotations of stock markets, prices of commodities like gold and silver, and currency exchange fluctuations. Indicators include macro-level measures such as GDP, which was invented to mobilize industrial responses to the Great Depression and World War II. If Women Counted by Marilyn Waring was a major critique of GDP because it didn’t count domestic work. Unemployment statistics are another major indicator, as are the prices of commodities like gold, silver, and copper. The age of big data is also returning information that gives us a better picture of the larger economy.

Financial news probably owes a debt to sports broadcasting and news. Notably, American sports like baseball, basketball, and football (gridiron) have embraced hypermediated techniques in the service of sports entertainment. While transparent immediacy is a crucial part of sports enjoyment, a new word, “datatainment,” has emerged as the moniker for the joy many people get from statistics related to their favorite teams and players. In baseball, for example, scores remain the major source of numerical pleasure as they indicate winners and losers. But batting averages, earned run averages (ERA), and runs batted in (RBIs) are statistical sources of additional satisfaction, as sketched in the example below.
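To make the “datatainment” arithmetic concrete, here is a minimal Python sketch of two of the statistics named above. The formulas are the standard baseball definitions; the player numbers are invented purely for illustration.

    def batting_average(hits: int, at_bats: int) -> float:
        """Hits divided by official at-bats."""
        return hits / at_bats

    def earned_run_average(earned_runs: int, innings_pitched: float) -> float:
        """Earned runs allowed per nine innings pitched."""
        return 9 * earned_runs / innings_pitched

    # Hypothetical season totals, for illustration only.
    print(f"BA:  {batting_average(180, 550):.3f}")    # 0.327
    print(f"ERA: {earned_run_average(70, 200):.2f}")  # 3.15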

Conclusion

Financial news on television combines several earlier and newer types of media to represent views of the global economy. It uses anchors and interviews with guests. It spreads economic indicators across the screen and scrolling tickertapes. It tries to survey the world and paint a picture of an authentic economy that it thinks its viewers will be interested in. What are the limitations of these strategies of representation?

Citation APA (7th Edition)

Pennings, A.J. (2021, Jun 08). Show-Biz: The Televisual Re-mediation of the Modern Global Economy. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/show-biz-the-televisual-re-mediation-of-the-modern-global-economy/

Notes

[1] Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 2000.
[2] Cook, Patrick J. Rev. of Remediation by Jay David Bolter and Richard Grusin. Resource Center for Cyberculture Studies (December 1999). 14 January 2001.

© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D., is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. He keeps his American home in Austin, Texas, and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 5: Trump, Title I, and the End of Net Neutrality

Posted on | May 16, 2021 | No Comments

The election of Donald Trump in 2016 presented new challenges to broadband policy and the net neutrality rules passed under the Obama administration. Tom Wheeler resigned from the Federal Communications Commission (FCC), allowing Trump to pick a Republican chair and swing the power to the GOP.

The major issue would be challenging the FCC’s 2015 classification of Internet Service Providers (ISPs) under Title II of the Communications Act of 1934, which emphasized common carriage, the commercial obligation to serve all customers equally and fairly. This post recaps the development of net neutrality rules for Internet broadband under the Obama administration’s FCC and their rapid dissolution under the Trump administration’s FCC.

The FCC’s Computer Inquiries were conducted from the 1960s to the 1980s and identified three layers of technological services for data communications. The telecommunications companies offering basic services to homes and businesses would be classified as regulated Title II companies because of their monopoly positions. Data communications and processing service providers operating on top of the telco infrastructure would be lightly regulated Title I “enhanced” companies. The content companies that would offer information services were not included in the regulations. This legal framework allowed the Internet to take off in the 1990s, including the creation of over 7,000 ISPs in the US. But this was before higher-speed broadband services became available.

Broadband companies became Title I “information services” during the George W. Bush administration’s FCC. Cable modem services were classified under Title I in 2002, and the telephone companies that had carved up America after the breakup of AT&T in the 1980s saw their broadband deregulated in 2005, making them largely unregulated ISPs. Cable television companies had developed IP broadband capabilities in the late 1990s and, with cable modems, competed or merged with telephone companies to provide “triple play” (TV, broadband, and voice) services to households.

The result of these two decisions was a highly oligopolistic market structure for broadband services. These companies began to acquire smaller ISPs, often after making it difficult for them to interconnect with their facilities, as they had been required to do as Title II companies. Customers soon found themselves limited to monopoly or duopoly ISPs in their area.

These newly deregulated companies also wanted to expand into new digital services, including payment systems and the provision of information, video, and search content. These actions violated the “maximum separation” rules that had restricted these companies from competing with their customers. They also had designs to operate as gateways that would package games, social media, geo-locational data, and email services into bundles offered at various prices. Concerns proliferated about pricing and service issues, and this led to the movement for “net neutrality” and the return of common carriage.

During the first Obama administration, the FCC began a major study of the broadband market structure of ISPs in the US.

In 2010, the FCC passed six broad “net neutrality principles:”

    Transparency: Consumers and innovators have a right to know the basic performance characteristics of their Internet access and how their network is being managed;

    No blocking: This includes a right to send and receive lawful traffic, prohibits the blocking of lawful content, apps, services and the connection of non-harmful devices to the network;

    Level playing field: Consumers and innovators have a right to a level playing field. This means a ban on unreasonable content discrimination. There is no approval for so-called “pay for priority” arrangements involving fast lanes for some companies but not others;

    Network management: This is an allowance for broadband providers to engage in reasonable network management. These rules do not forbid providers from offering subscribers tiers of services or charging based on bandwidth consumed;

    Mobile: The provisions adopted today do not apply as strongly to mobile devices, though some provisions do apply. Those that do include the broadly applicable rules requiring transparency for mobile broadband providers and prohibiting them from blocking websites and certain competitive applications;

    Vigilance: The order creates an open Internet advisory committee to assist the commission in monitoring the state of Internet openness and the effects of the rules.[1]

The new rules faced a judicial challenge. The courts, while sympathetic to the goals of net neutrality, questioned the FCC’s authority to regulate Title I companies. After an appeal by Verizon, the DC Circuit Court sent the FCC back to the drawing board. Judge David Tatel said that the FCC did not have the authority under the current regulatory conditions to treat telcos as “common carriers” that must pass data content through their networks without interference or preference.

The result of Verizon v. FCC was that, without a new regulatory classification, the FCC wouldn’t have the authority to actually stop the big ISPs from banning or blocking legal websites, throttling or degrading traffic on the basis of content, and enacting “paid prioritization” for Internet services. The latter, the so-called “fast lanes” for companies like Google and Netflix, was particularly contentious.[2]

President Obama got involved and supported reclassifying ISPs as common carriers under Title II of the Communications Act of 1934. This would give the FCC the authority it needed to regulate the ILEC ISPs. On February 26, 2015, the FCC passed the new Title II net neutrality rules in a 3–2 party-line vote, and they went into effect in the summer of 2015. The FCC’s Open Internet rules applied to both wired and wireless Internet connections.

Trump’s new FCC Chairman, Ajit Pai, argued that the web was too competitive to regulate effectively. Ignoring the impacts of deregulating cable and telephone companies on broadband competition, he argued ISPs did not have the incentive to throttle web speeds or restrict other services. He compared regulating ISPs with regulating websites, a clear deviation from the regulatory layers set out in the Computer Inquiries. Subsequently, the FCC began seeking comments on eliminating the Title II classification for broadband and removing the Obama-era net neutrality rules.

On December 14, 2017, the Federal Communications Commission (FCC) voted in favor of repealing these policies, 3–2, along party lines. Pai voted with the majority of the FCC to reverse the decision to regulate the Internet under Title II of the Communications Act of 1934. Called the Restoring Internet Freedom Order, it repealed the net neutrality rules that were put in place two years earlier.

Pai’s justification speech argued that the Internet was not broken and didn’t need to be fixed. His contention was that the bureaucratic complexity of net neutrality was a burden on small ISPs and a disincentive to invest in new facilities and digital pipes. The new FCC voted to begin eliminating Obama’s net neutrality rules as it reclassified home and mobile broadband service providers as Title I information services.

Democrats, joined by a few Republicans, responded with several strategies to reverse the order. In 2018, they attempted to invoke the Congressional Review Act (CRA) to undo the FCC order. The CRA bypasses the filibuster and allows Congress to repeal recent administrative regulations. The motion passed the Republican-controlled Senate 52–47 but did not get the necessary votes in the Republican-controlled House.

The Democrats tried again after gaining a House majority in the 2018 midterm elections with the Save the Internet Act of 2019. The bill codified prohibitions on blocking or throttling websites and on bundling websites or apps like cable packages. It designated network access a “utility” under Title II of the Communications Act.

Rep. Mike Doyle (D-PA), the bill’s main sponsor and chair of the Subcommittee on Communications and Technology within the House Committee on Energy and Commerce, said he believes Internet access is a right for all and that “We want that gatekeeper to be neutral.” The bill passed the House 232–190 but was declared dead on arrival by Senate Majority Leader Mitch McConnell.

Pai left the FCC with Trump’s departure in January 2021, leaving behind a mixed legacy. He was acknowledged for some internal changes, including creating the FCC’s Office of Economics and Analytics (OEA), which gathered FCC economists into a central think tank instead of separate bureaus. But the FCC was slow on 5G deployment and on making available the much-needed supply of spectrum in the mid-band (2 GHz-6 GHz) range. Rural buildout was weak, and the FCC was caught working with telcos to lower mobile broadband requirements so that slower mobile connections could be counted as broadband. But, by far, the so-called Restoring Internet Freedom Order that repealed net neutrality will be the legacy of the Trump-era FCC, with the central question being whether it was a capitulation to telco lobbyists.

In the next post, I will examine the challenges for the Biden Administration in addressing broadband policy, including net neutrality, but also the Internet of Things, and the expansion of broadband infrastructure in rural and other underserved areas.

Notes

[1] Gustin, S. (2010, December). FCC Passes Compromise Net Neutrality Rules. Wired. https://www.wired.com/2010/12/fcc-order/.
[2] Finley, K. (2017, May 18). Internet Providers Insist They Love Net Neutrality. Seriously? Wired.


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D., is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. He keeps his American home in Austin, Texas, and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

Digital Spreadsheets as Remediated Technologies

Posted on | May 5, 2021 | No Comments

In his classic (1964) Understanding Media: The Extensions of Man, Marshall McLuhan argued that “the content of any medium is always another medium.”[1] For example, the content of print is the written word, and the content of writing is speech. Likewise, the content of the telex was writing, and the content of television was radio and cinema. The book was notable for coining the phrase, “the medium is the message,” and pointed to the radical psychological and social impacts of technology.

McLuhan had a specific focus on the effects instead of the content transmitted by each medium. He probed how new forms of technologies extended the senses of humans and changed the activities of societies. He invited us to think of the lightbulb, not so much in terms of its luminous content, but in the way it influenced modern society. He noted it creates new environments and changes in lifestyles, particularly at night. This post will examine the media technologies embedded in the digital spreadsheet that have made it a transformative technology and changed modern life.

Mediating “Authentic” Realities

In Remediation: Understanding New Media, Jay David Bolter and Richard Grusin extended McLuhan’s ideas to a number of “new media,” including television, computer games, and the World Wide Web. They argued new media technologies are designed to improve upon or “remedy” prior technologies in an attempt to capture or mediate a more “authentic” sense of reality. They used the term “remediation” to refer to this innovation process in media technologies.[2] For example, VR remediates perspectival art, which remediates human vision. TV not only remediates radio and film but now the windowed look of computers, including the ticker-tape scrolling of information across the screen.

Unfortunately, but understandably, they neglected the spreadsheet.

And yet, the digital spreadsheet is exemplary of the remediation process. Several years ago, I initiated an analysis of the spreadsheet that focuses on the various “media” components of the spreadsheet and how they combine to give it its extraordinary capabilities. To recap, these are:

  1. writing and numerals;
  2. lists;
  3. tables;
  4. cells; and
  5. formulas.

The digital spreadsheet refashioned these prior media forms (writing, numerals, lists, tables, cells, and formulas) to create a dynamic meaning-producing technology. Writing and lists have rich historical significance in the organization of palaces, temples, and monasteries, as well as armies and navies. Indo-Arabic numerals replaced Roman numerals and expanded the realm of numerical calculation with the introduction of zero and the positional place-holding system. Numbers and ledgers led to the development of double-entry accounting systems and the rise of merchants and, later, modern businesses.

Tables helped knowledge disciplines emerge as systems of inquiry and classification, initially areas like accounting, arithmetic, and political economy. Still, later areas such as astronomy, banking, construction, finance, insurance, and shipping depended on printed tables to replace constant calculation. Charles Babbage (1791-1871), a mathematician and an early innovator in mechanical computing, expressed his frustration with constructing tables when he famously said, “I wish to God these calculations had been executed by steam.”

First with VisiCalc and then Lotus 1-2-3, these media elements worked together to form the gridmatic intelligibility of the spreadsheet. Bolter and Grusin proposed a “double logic of remediation” for the representation of reality: transparent immediacy and hypermediacy. Both work to produce meaning. However, the former tries to make us forget the mediation at work and produce transparent immediacy, such as watching a live basketball game on television. The latter tries to foreground the medium, especially through computer graphics. Financial news programs on TV such as Bloomberg Surveillance mix the immediacy of live news, using hosts and guests, with hypermediated indexes of stock markets (DJIA, S&P 500, NASDAQ, etc.) and other economic indicators such as GDP. How do spreadsheets attempt to perceive, display, and produce reality? How do they heal our perception of reality?

Windows to the World Wide Web

It was the personal computer (PC) that brought the spreadsheet to life. The Apple II brought us VisiCalc in 1979 with 40 columns and 25 rows, a small area that could be navigated quickly using the arrow keys. One of the first formulas developed for the spreadsheet was net present value (NPV), which calculated the return on investment (ROI) for projects, including large purchases of equipment. Microsoft’s Disk Operating System (DOS) was the technical foundation for Lotus 1-2-3 as the IBM PC and “IBM-compatibles” proliferated during the 1980s. The spreadsheet became known as the “killer app” that made buying the “microcomputer” worthwhile. But it was the Graphic User Interface (GUI) that popularized the PC, and thus the spreadsheet.
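As a rough illustration of the kind of calculation an early spreadsheet user might have set up, here is a minimal net present value sketch in Python. The cash flows, discount rate, and equipment cost are invented assumptions, not figures from the post.

    def npv(rate: float, cashflows: list[float]) -> float:
        """Discount future cash flows back to the present.
        cashflows[0] is assumed to occur one period from now."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

    initial_cost = 10_000                            # equipment purchase today (assumed)
    future_cashflows = [3_000, 4_000, 4_000, 3_000]  # returns over four years (assumed)
    project_value = npv(0.08, future_cashflows) - initial_cost
    print(f"NPV at 8%: {project_value:,.2f}")        # roughly 1,588; positive, so the purchase pays off

A spreadsheet made this kind of what-if analysis instantaneous: change the discount rate in one cell and the whole calculation updates.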

The Apple Mac marked the shift to the GUI and new desktop metaphor in computing. GUIs replaced the inputted ASCII characters of the command line interface with a more “natural” immediacy provided by the interactivity of the mouse, the point-able cursor, and drop-down menus. The desktop metaphor drew on the iconic necessities of the office: the file, inboxes, trash cans, etc. A selection of fonts and typographies remediated both print and handwriting. The use of the Mac required some suspension of disbelief, but humans have been conditioned for this alteration of reality by story-telling and visual narratives in movies and TV.

Microsoft’s Excel was the first spreadsheet to use the graphic user interface (GUI) developed by Xerox PARC and Apple. Designed for the Apple Macintosh, it became a powerful tool that combined the media elements of the spreadsheet to produce more “authentic” versions of reality. An ongoing issue is the way it became a powerful tool for organizing that reality in ways that benefitted certain parties, and not others.

Excel was at the center of Microsoft’s own shift to GUIs, which started with Windows in 1985, and spreadsheets became a key part of its Office software applications package. Microsoft had captured the IBM-compatible PC market with DOS and initially built Windows on top of that OS. Windows 2.0 changed the OS to allow for overlapping windows. Excel became available on Windows in 1987 and soon became the dominant spreadsheet. Lotus had tried to make the transition to the GUI with Jazz but missed the market by aiming too low and treating the Mac as a toy.

Windows suggested transparent views onto different realities for the individual user.

But while the emerging PC was moving towards transparent immediacy, the spreadsheet delved into what Bolter and Grusin would call hypermediacy. This is an alternate strategy for attaining authentic access to the real. Windows promised transparent views of the world, but the spreadsheet offered new extensions of the senses – a surveying and calculative gaze – achieved by remediating the list, the table, the cell, and the formula.

Spreadsheets drew on the truth-claims of both writing and arithmetic while combining them in powerful ways to organize and produce practical information. They combined and foregrounded the media involved to present or remediate a “healed” version of reality. Each medium provides a level of visibility or signification. The WYSIWYG (What You See Is What You Get) environment of the desktop metaphor provided a comfortable level of interactivity for defining categories, inputting data, organizing formulas, and displaying that information in charts and graphs.

The Political Economy of PC-based Spreadsheets

How has the digital spreadsheet changed modern society? Starting with VisiCalc and Lotus 1-2-3, the spreadsheet created new ways to see, categorize, and analyze the world. It combined and remediated previous media to create a signifying and pan-calculative gaze that enhanced the powers of accounting, finance, and management. Drawing on Bolter and Grusin, can we say that digital spreadsheets, as remediated technology, became a “healing” media? This raises some important questions. What was its impact on the modern political economy? What was its impact on capitalism?

The spreadsheet amplified existing managerial processes and facilitated new analytical operations. Its grid structure allowed a tracking system to monitor people and things. It connected people with tasks and results, creating new methods of surveillance and evaluation. It could register millions of items as assets in multiple categories. It itemized, tracked, and valued resources while constructing scenarios of future opportunity and profit.

Digital spreadsheets introduced a major change of pace and scale to the financial revolution that started with Nixon’s decision to go off gold and on to an “information standard.” The spreadsheet facilitated quick analysis and the recalculation of loan payment schedules in an era of inflation and dynamic interest rates, as sketched below. Spreadsheet proliferation started with accountants and bookkeepers who quickly realized that they could do their jobs with new precision and alacrity. But their use soon became ubiquitous.
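Here is a minimal sketch of the kind of recalculation spreadsheets made trivial: the standard fixed-payment amortization formula evaluated at different interest rates. The loan amount, term, and rates are illustrative assumptions only.

    def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
        """Standard amortization formula: P * r / (1 - (1 + r)**-n)."""
        r = annual_rate / 12   # monthly interest rate
        n = years * 12         # number of monthly payments
        return principal * r / (1 - (1 + r) ** -n)

    loan = 100_000                     # assumed 30-year loan amount
    for rate in (0.08, 0.12, 0.18):    # rates in the range seen in the late 1970s and 1980s
        print(f"{rate:.0%} -> ${monthly_payment(loan, rate, 30):,.2f}/month")
    # Roughly $734, $1,029, and $1,507 per month, respectively.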

PCs and spreadsheets started to show up in corporate offices, sometimes to the chagrin of the IT departments. The IBM PC legitimized the individual computer in the workplace, and new software applications emerged, including new spreadsheet applications such as Borland’s Quattro Pro. Spreadsheet capabilities increased dramatically through the 1980s, adding new formulas from a wide range of disciplines such as accounting, engineering, operations management, and statistics. But it was the new processes of analyzing assets that allowed for the shift to a new era of spreadsheet capitalism.

Reaganomics’ emphasis on financial resurgence and the globalization of news opened up ways for money-capital to flow more freely. It is no surprise that the digital spreadsheet ushered in the era of leveraged buyouts (LBOs) and the widescale privatization of public assets that characterized the late 1980s and the 1990s. Companies could be analyzed by “corporate raiders” and their assets separated into different categories or companies. Spreadsheets could determine NPV, and plans could be presented to investment bankers for short-term loans to purchase the company. Then certain assets could be sold off to pay down the loans and cash in big rewards.

Similarly, the assets of public agencies could be itemized, valued, and sold off or securitized and listed on stock exchanges. The “Third World Debt Crisis” created by the oil shocks of the 1970s and the flood of Eurodollars made available to countries created new incentives to find and sell off public assets to pay off government loans. This logic was applied to telecommunications companies worldwide.

Previously, PTTs (Post, Telephone, and Telegraph administrations) were government-owned operations that provided relatively poor telephone services but returned profits to the nation’s treasury. The calculative rationality of the spreadsheet was quickly turned to analyzing the PTTs. Spreadsheets could sum the value of all the telephone poles, maintenance trucks, switches, and other assets. At first, these operations were turned into state-owned enterprises (SOEs), but they were eventually sold off to other companies or listed on share markets. By 2000, the top companies in most countries, in terms of market capitalization, were former PTTs, now transformed into privatized “telcos.”

World Trade Organization (WTO) meetings in 1996 and 1997 reduced tariffs on computers and other IT-related products. Along with the IMF, the WTO pressured countries to liberalize telecommunications and complete PTT privatization. In the US and other countries, the dot-com “bull run” was taking place, aided by a spreadsheet at WorldCom that projected the “doubling meme” – the continual fast growth of the Internet and all the technology associated with it.

By the late 1990s, these telcos were adopting the new Internet Protocols (IP) that allowed for the World Wide Web. Cisco Systems and Juniper Networks were two companies instrumental in developing new switching and routing systems. While initially used by small Internet Service Providers (ISPs), these technologies soon allowed telcos to convert their PTT infrastructures into IP networks and dominate the ISP broadband markets.

A spreadsheet is a tool, but it is also a worldview – a reality framed by categories, data sets, and numbers. As the world moved into the financialization and globalization of the post-oil-crisis Reagan era, the PC-based spreadsheet was forged into a powerful new “remediated” technology.

Was it responsible for a new era in capitalism, one in which combinations of media framed by computer windows guided and shaped perceptions of the economy? We have Apple’s iWork Numbers, Google Sheets, and LibreOffice Calc, but Microsoft Excel is still the dominant spreadsheet. How has Microsoft repurposed and scaled Excel, particularly with Access and the SQL language? Excel was a foundational technology for an era of database technologies. What about blockchain?

Capitalism is highly variable and subject to changes in regulations, legislation, and technologies. These can change the political economy and shape the flows of information and money. The spreadsheet was central to Reagan’s financial revolution, but also to the globalized world of the Internet. Digital spreadsheets became a new way of viewing and interacting with the world, not through transparent immediacy but via a calculative rationality and hypermediated instrumentality that provides new perspectives and techniques to understand and shape the relationships between capital, innovation, and management.

Notes

[1] McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964. Print.
[2] Bolter, J. D, and Richard A. Grusin. Remediation: Understanding New Media. Cambridge, Mass: MIT Press, 1999. Print.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is a Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Internet Policy, Part 4: Obama and the Return of Net Neutrality, Temporarily

Posted on | March 26, 2021 | No Comments

The highly competitive Internet service provider (ISP) environment of the 1990s was significantly altered by the Federal Communications Commission (FCC) during the Bush administration. Two Bush appointments to the FCC Chair position guided ISP policies towards a more deregulated environment. The result, however, was a more oligopolistic market structure and less competition in the Internet space. Furthermore, these policies raised concerns that powerful ISPs could influence the flow of data through the Internet and discriminate against competing content providers to the detriment of consumers.

The FCC is an independent commission but can lean in political directions. Under the leadership of Michael Powell (January 22, 2001 – March 17, 2005), a Republican from Virginia and son of General Colin Powell, FCC decisions favored cable companies. In the summer of 2005, the FCC, now led by Chairman Kevin J. Martin, a Republican from North Carolina (March 18, 2005 – January 19, 2009), made decisions that favored telcos. The FCC made cable modem services and broadband services offered by telecommunications companies Title I unregulated “information services.” This raised ongoing concerns that powerful ISPs could influence the flow and speed of data through the Internet and discriminate against competing content providers or users to the detriment of consumers.[1]

This post examines the Obama administration’s approach to Internet regulation and the issue of net neutrality. This involved reviving “Title II” regulation that works to guarantee the equal treatment of content throughout the Internet. Previously, I examined the legal and regulatory components of common carriage and the emergence of net neutrality as an enabling framework for Internet innovation and growth.

Comedian John Oliver explained net neutrality on his show Last Week Tonight, published on June 1, 2014.

The Internet’s political and social impact became more apparent with the social media presidential campaign of Barack Obama in 2008. The Pew Research Center found that some 74% of Internet users interacted with election information that year. Many citizens received news online, communicated with others about the election, and received information from campaigns via email or other online sources.

In 2010, the Obama administration began to write new rules dealing with Internet providers that would require ISPs to treat all traffic equally. In what were called the “Open Internet” rules, FCC Chairman Julius Genachowski, a Democrat from Washington, D.C. (June 29, 2009 – May 17, 2013), sought to restrict telecom providers from blocking or slowing down specific Internet services. Verizon sued the agency to overturn those rules in a case that was finally decided in early 2014. The court determined the FCC didn’t have the power to require ISPs to treat all traffic equally due to their new Title I designations. The judge was sympathetic to consumers’ plight, though, and directed the ISPs to inform subscribers when they slow traffic or block services.

After the appeal by Verizon, the DC Circuit Court sent the FCC back to the drawing board. Judge David Tatel said that the FCC did not have the authority under the current regulatory conditions to treat telcos as “common carriers” that must pass data content through their networks without interference or preference. The result of Verizon v. FCC was that, without a new regulatory classification, the FCC wouldn’t have the authority to actually stop the big ISPs from banning or blocking legal websites, throttling or degrading traffic on the basis of content, and limiting “paid prioritization” of Internet services. The latter, the so-called “fast lanes” for companies like Google and Netflix, was particularly contentious.[2]

So, on November 10, 2014, President Obama went on the offensive and asked the FCC to “implement the strongest possible rules to protect net neutrality” and to stop oligopolistic ISPs from blocking, slowing down, or otherwise discriminating against lawful content. Tom Wheeler, the incoming FCC Chairman, from California (November 4, 2013 – January 20, 2017), sought a new classification from the legacy of the Communications Act of 1934 by invoking Title II “common carrier” distinctions for broadband providers.

To its credit, the FCC had been extremely helpful in creating data communications networks in the past. The FCC’s classification of data services in Computer I as “online” rather than “communications” services provided timely benefits. For example, it allowed early PCs with modems to connect to ISPs over telephone lines for hours without paying toll charges to the providers of local telephone service. But with a competitive Internet in place, handing the deregulated broadband market to the telcos seemed excessive.

“Information services” under Title I represent a more deregulatory stance that allows the telcos to impose more control over the Internet. Title I “information services” refers to “the offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications.” As mentioned previously, under George W. Bush’s FCC, cable companies in 2002 and then telcos in 2005 were classified as Title I information services. This led to a major consolidation of US broadband service, which came to be dominated by large integrated service providers such as AT&T, Comcast, Sprint, and Verizon. These companies began trying to merge with content providers, raising the specter of monolithic companies controlling information and invading privacy.

On February 26, 2015, the FCC adopted new “Open Internet” rules based on Title II of the Communications Act of 1934 and Section 706 of the Telecommunications Act of 1996. The latter gave the FCC authority to regulate broadband networks, including imposing net neutrality rules on Internet service providers. Section 706 directs the FCC and state utility commissions to encourage the deployment of advanced telecommunications capability to all Americans by removing barriers to infrastructure investment and promoting competition in local telecommunications markets.

But Section 706 authority only kicks in when the FCC finds that “advanced telecommunications capability” is “not being deployed to all Americans in a reasonable and timely fashion.”

In other words, the case needs to be made that the US Internet infrastructure is lacking. For example, the FCC established 25 Mbps download/3 Mbps upload as the new standard for “advanced telecommunications capability” for residential service. This is actually a fairly low benchmark for urban broadband users, as only 8% of America’s city dwellers lack access to that level of service. But that still left some 55 million Americans behind, as rural areas were largely underserved, especially in tribal lands.

In early 2015, President Obama began to direct attention toward broadband access. Consequently, Chairman Wheeler announced that the FCC’s Connect America Fund would disburse $11 billion to support modernizing Internet infrastructure in rural areas. The FCC also reformed the E-rate program to support fiber deployment and Wi-Fi service to the nation’s schools and libraries.[3]

The Open Internet rules were meant to protect the free flow of content and promote innovation and investment in America’s broadband networks. They were grounded in multiple sources of authority, including Title II of the Communications Act of 1934 and Section 706 of the Telecommunications Act of 1996. In addition to providing consumer protections by restricting the blocking, throttling, and paid prioritization of Internet services, the FCC strove to promote competition by ensuring that all broadband providers have access to poles and conduits for their physical plant.

The rules did not require providers to get the FCC’s permission to offer new rate plans or new services. Nor did they require companies to lease access to their networks, although the FCC did commit to monitoring interconnection complaints, a provision that promoted ISP competition. A key dilemma was how to promote the ubiquity of the Internet while exempting broadband customers from universal service fees.

The election of Donald Trump presented new challenges to net neutrality and the potential of a reversal. Tom Wheeler resigned from the FCC, allowing Trump to pick a Republican chair and swing the majority to the GOP. The new FCC voted 3–2 to begin eliminating Obama’s net neutrality rules and reclassifying home and mobile broadband service providers as Title I information services. The new FCC Chairman, Ajit Pai, argued that the web was too competitive to regulate effectively and that throttling some web applications and services might even help Internet users. The FCC began seeking comments about eliminating the Title II classification. Replacing the Obama net neutrality rules was put to a vote by the end of the year, and the FCC once again returned to Title I deregulation through a declaratory ruling.

Notes

[1] Ross, B.L. and Shumate, B.A., Rein, W. “Regulating Broadband Under Title II? Not So Fast.” Bloomberg BNA. N.p., 25 June 2014. Web. 18 June 2017.
[2] Finley, Klint. “Internet Providers Insist They Love Net Neutrality. Seriously?” Wired. Conde Nast, 18 May 2017. Web. 18 June 2017.
[3] “What Section 706 Means for Net Neutrality, Municipal Networks, and Universal Broadband.” Benton Foundation, 13 Feb. 2015. Web. 18 June 2017.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Will Offshore Wind Power Print Money?

Posted on | March 15, 2021 | No Comments

Research is showing that offshore wind farms can increase biodiversity in oceans. Like sunken ships, windmill installations present unique opportunities for facilitating marine life. These new habitats create artificial reefs and marine life-protection areas. Undersea hard surfaces rapidly collect a wide range of marine organisms that build and support local ecosystems. They also provide some refuge from trawlers and other industrial fishing operations.

This post will examine the prospects of wind energy, one of the promising alternative renewable energies that will work with hydropower, solar, and even small-scale nuclear energy to power the smart electrical grids of the future. Is offshore wind feasible? What are the downsides? Will it be profitable? Can it literally “print money” once it is operational?

Wind turns the rotor, and the resulting torque is what generators transform into electricity. The physics of windmills means that big is better. The larger the propellers that can be built, the more efficient they become. Bigger windmills capture more wind, and that produces more torque. The more wind power the propellers can harvest, the more electricity they can produce. The equation below describes the relationship between torque, rotational speed, and power output.

P = τ × ω, where P is the mechanical power delivered to the generator, τ is the torque on the rotor shaft, and ω is its angular (rotational) speed.
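As a rough numerical sketch of these relationships, the Python snippet below computes mechanical power from torque and rotational speed, along with the kinetic power available in the wind crossing the rotor disc. The turbine dimensions, torque figure, and wind speed are invented assumptions, not data from the post.

    import math

    def mechanical_power(torque_nm: float, angular_speed_rad_s: float) -> float:
        """P = torque x angular speed, in watts."""
        return torque_nm * angular_speed_rad_s

    def wind_power_available(rotor_diameter_m: float, wind_speed_m_s: float,
                             air_density: float = 1.225) -> float:
        """Kinetic power in the wind crossing the rotor disc: 0.5 * rho * A * v^3."""
        area = math.pi * (rotor_diameter_m / 2) ** 2
        return 0.5 * air_density * area * wind_speed_m_s ** 3

    # Why "big is better": doubling the rotor diameter quadruples the swept area.
    for diameter in (80, 160):
        mw = wind_power_available(diameter, 10) / 1e6
        print(f"{diameter} m rotor in a 10 m/s wind: about {mw:.1f} MW available")

    # A slow-turning rotor with very high torque can still deliver megawatts.
    print(f"Assumed 4 MN·m at 0.8 rad/s: {mechanical_power(4_000_000, 0.8) / 1e6:.1f} MW")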

Media economics can help us understand the economics of wind power. Media projects like books and films, and even digital software, require huge expenditures up front, but the cost of each succeeding unit of output is quite low. What is the cost of each individual copy of Windows 11? Renewable energies tend to have the same characteristics, what economists call low marginal costs. Once a windmill is manufactured and installed, the cost of each kilowatt-hour produced is low, and the average cost of each unit of output decreases as output grows. Granted, the costs of replacement, recycling, or reusing these large machines are valid points of concern.
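A minimal sketch of that cost structure, with invented numbers: once the high fixed cost is sunk, the average cost per kilowatt-hour falls toward the small marginal cost as output accumulates.

    # High upfront (fixed) cost, very low marginal cost per unit of output.
    # All figures are illustrative assumptions, not data from the post.
    FIXED_COST = 4_000_000      # build and install one offshore turbine, in dollars
    MARGINAL_COST = 0.005       # operations and maintenance cost per kWh, in dollars

    def average_cost_per_kwh(kwh_generated: float) -> float:
        return (FIXED_COST + MARGINAL_COST * kwh_generated) / kwh_generated

    for kwh in (1e6, 1e7, 1e8):
        print(f"{kwh:>12,.0f} kWh -> ${average_cost_per_kwh(kwh):.3f}/kWh")
    # Average cost approaches the marginal cost as cumulative output grows.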

Personally, wind power hasn’t impressed me in the past. In graduate school in Hawaii, I remember a big windmill near the North Shore surf spots that didn’t seem to do much. Driving up into San Francisco along Interstate 5, the windmills seem big and slow. Flying regularly over northern Texas and Oklahoma, the wind farms become a bit more impressive. Understanding the fundamental economics and the basic engineering and science of wind energy is useful for policy analysis.

Unlike solar, wind power is not directly contingent on solar rays but on larger climatic events. The US Department of the Interior‘s Bureau of Ocean Energy Management (BOEM) has been conducting environmental impact studies and is giving conditional permission to build offshore wind farms. Contracts to provide wind electricity as low as 5.8 cents per kilowatt-hour are being negotiated. Massachusetts, Virginia, and the far coast of Long Island, New York are some of the major sites under development. While previously a global laggard, the US is expected to become a major offshore electricity contributor after 2024.

The future of US offshore wind energy is dependent on several economic variables. One is power purchase agreements (PPAs) that businesses and other organizations use to solidify long-term purchases of electricity. Another is renewable portfolio standards (RPSs) that obligate US states to procure a certain percentage of renewable energy. RPSs have contributed to nearly half of the growth in renewable energies since 2000. Tax incentives are important and depend on political winds. The US Treasury extended safe harbor tax credits for renewable energies, including offshore wind in light of the COVID-19 pandemic. Offshore wind auctions are also crucial as the cry “location, location, location” resonates soundly in this industry.

Renewable critics like the Manhattan Institute have been critical of offshore windmills, arguing that they decline some 4.5% in efficiency every year. Another concern is who will pick up the decommissioning costs of deconstructing and recycling the windmills. But the technology is new, as are the maintenance, recycling, and regulatory practices.

Wind could be a significant boost for coastal communities. Major cities that were wedded to the ocean due to fishing and shipping are likely to benefit, as offshore wind might provide cheap electricity and much-needed economic benefits. Tourism will benefit from cheap electricity, as Las Vegas did when it had access to power from the Hoover Dam. In terms of jobs and the revitalization of shore-based businesses, a wide range of services will be needed. Energy control centers, undersea construction, equipment supply, and maintenance operations are just some of the opportunities emerging around ocean-based renewable energy sources.

In summary, the economics of offshore wind energy are very much like media economics – high upfront costs and low marginal costs. Book publishing requires editors and pays author royalties. It also needs paper, printing presses, and the distribution capabilities required to produce fiction and non-fiction works. While some books may not be profitable, a best-seller can provide significant returns for the publisher. Movies require extensive upfront expenses in production and post-production, but each showing in cinemas worldwide costs relatively little. Wind power requires a major capital influx to set up. But the wind is free, so once operational, the windmill begins to produce electricity. Lubrication and other maintenance activities are needed at times, but electricity is created as long as the wind is blowing. If the infrastructure is set up efficiently, it will print money.

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is a Professor in the Department of Technology and Society, State University of New York, Korea. Born in New York, he had a chance to teach at Marist College near his hometown of Goshen before spending most of his academic career at New York University. Before joining SUNY, he moved to Austin, Texas, and taught in the MBA program at St. Edwards University. He started his academic career at Victoria University in New Zealand. He has also spent a decade as a Fellow at the East-West Center in Honolulu, Hawaii.

COVID-19 and US Economic Policy Responses

Posted on | March 8, 2021 | No Comments

COVID-19 was recognized in early 2020 and began to spread rapidly in March of that year. The World Health Organization (WHO) identified the virus in January, and later that month, the CDC confirmed the first US coronavirus case. On March 13, President Trump declared the spreading coronavirus a national emergency as the US registered its 100th death. Many restaurants and other high-contact industries began to shut down. Transportation and tourism ground to a halt. As a result, the US’s economic management machinery worked to design a response.

In this post, I look at how the Federal Reserve and Congress (House and Senate), as well as two administrations, addressed the economic conditions and ramifications of the emerging viral pandemic. Starting in March 2020, they produced monetary and fiscal actions that reverberated through the US economy. What impact did it have on the so-called K-shaped recovery? How, if any, did the responses influence price deflation or inflation in the subsequent years?

(Chart: Stock market since 2020)

The US economy went into steep decline in the second quarter (April, May, June) while the virus spread and the Federal Reserve’s monetary policy and the CARES Act were being implemented. According to the Bureau of Economic Analysis (BEA), in the second quarter of 2020, US real Gross Domestic Product (GDP) contracted at an annual rate of 31.4 percent (about 9 percent at a quarterly rate). It was the starkest economic decline since the government started keeping records in 1947.
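The headline “annual rate” and the “quarterly rate” are related by compounding one quarter’s change over four quarters. A quick sketch checks the two figures quoted in this post; this is the standard annualization arithmetic, not the BEA’s full methodology.

    def annualize(quarterly_rate: float) -> float:
        """Compound a one-quarter growth rate over four quarters."""
        return (1 + quarterly_rate) ** 4 - 1

    print(f"{annualize(-0.09):.1%}")   # about -31.4%, the Q2 2020 contraction
    print(f"{annualize(0.074):.1%}")   # about +33.1%, the Q3 2020 rebound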

Starting March 3, the FOMC reduced the Fed Funds Rate by 1.5 percentage points to 0–0.25%, making the cuts official at its March 15th meeting. The Fed Funds Rate is the interest rate at which banks lend reserves to each other overnight, with transfers settled over the Fed’s FEDWIRE network. Lower rates give banks more reserves that can be lent out at higher rates for car loans, home mortgages, and industrial capacity. The money can also be invested in financial assets such as Bitcoin, currencies, equities, gold, etc.

Rather surprising was the Fed’s decision to reduce the reserve ratio to 0 from its traditional 10%. This reduction meant banks no longer had to hold a percentage of their deposits in their vaults or at the Federal Reserve. The Fed also offered a narrative framework, or “forward guidance,” on interest rates, stating they would remain low until unemployment receded and inflation increased to roughly 2 percent.
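To see why the reserve ratio mattered, here is a textbook deposit-expansion sketch showing how a reserve requirement caps the deposits that a given pool of reserves can support. It is a simplification for illustration, not a description of how banks actually lend today.

    def max_deposits(new_reserves: float, reserve_ratio: float) -> float:
        """Simple money multiplier: deposits <= reserves / reserve_ratio."""
        if reserve_ratio == 0:
            return float("inf")   # no reserve-based cap at a 0% requirement
        return new_reserves / reserve_ratio

    print(f"{max_deposits(100.0, 0.10):,.0f}")  # 1,000 under the traditional 10% ratio
    print(f"{max_deposits(100.0, 0.0):,.0f}")   # inf: the requirement no longer binds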

(Chart: COVID unemployment)

The Fed simultaneously announced that it would begin to purchase securities “in the amounts needed to support smooth market functioning and effective transmission of monetary policy to broader financial conditions.” After its mid-March meeting, the Fed said it would begin buying some $500 billion in Treasury securities and $200 billion in government-guaranteed mortgage-backed securities. A version of quantitative easing (QE) had been used, along with the $700 billion Troubled Asset Relief Program (TARP), to recover from the 2007-2008 financial crisis.

Over the course of the year, the Fed’s bond portfolio increased by some $2.7 trillion, from $3.9 trillion to $6.6 trillion. The purchases injected money into the economy, and QE kept interest rates low, helping to keep mortgages cheap and the housing industry booming. The $6.6 trillion balance is a lot, but it can also be used to draw money out of the economy to help reduce inflation. That is what distinguishes “printing money” from QE. Printing money puts cash into the economy without adequate means to extract it during inflationary periods. Ideally, the Fed can sell off its balances and subtract money from the economy. But QE and low interest rates became so embedded in the economy that it was not easy to let interest rates rise.

Congress worked on stimulating the economy as well. The Senate drew on the House of Representatives’ Middle Class Health Benefits Tax Repeal Act, originally introduced in the U.S. Congress on January 24, 2019. All spending bills must originate in the House of Representatives, so the Senate used it as a “shell bill” to begin working on economic and public health relief. They filled it in with additional content to combat the virus and protect the economy. On March 27, 2020, President Trump signed the CARES (Coronavirus Aid, Relief, and Economic Security) Act into law.

At over US$2 trillion, CARES was the largest rescue package in US history. It was twice the amount of the American Recovery and Reinvestment Act of 2009 (ARRA), which totaled $831 billion and helped revive the stalled US economy after the credit crisis. The CARES Act expanded unemployment benefits, including those for freelancers and gig workers, and gave direct payments to families. It also provided cash for grounded airlines, money for states and local governments, and half a trillion dollars in loans for corporations (while banning stock buybacks).

The result was a dramatic turnaround in GDP, which is not always the best economic indicator but remains a key historical one. The third quarter (July, August, and September) grew dramatically. According to the BEA, US real GDP increased at an annual rate of 33.1 percent (7.4 percent at a quarterly rate). Compared to the 9 percent contraction in the second quarter, this was a stunning reversal, the so-called V-shaped recovery. The BEA then reported that real GDP rose again at a 4 percent annual rate in the fourth quarter.
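These headline figures are annualized: the BEA compounds the quarter-over-quarter change over four quarters, which is why a 7.4 percent quarterly gain appears as roughly 33 percent and a 9 percent quarterly drop as roughly -31 percent. A quick check of the arithmetic:

```python
def annualize(quarterly_rate: float) -> float:
    """Compound a quarter-over-quarter growth rate over four quarters."""
    return (1 + quarterly_rate) ** 4 - 1

print(f"{annualize(0.074):+.1%}")   # ~+33.1%, the Q3 2020 rebound
print(f"{annualize(-0.09):+.1%}")   # ~-31.4%, the Q2 2020 contraction
```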

Despite the V-shaped rebound in GDP, talk of a K-shaped economy emerged, meaning that the economy was diverging. The economic crash hit different sectors unevenly, and the recovery even more so. The well-off and professionals, especially those who could telework, did well, while much of the rest of the economy faltered, often along racial, gender, sectoral, and geographic lines.

Another stimulus bill was signed by President Trump in late December of 2020. The $900 billion package averted a government shutdown and sent $600 to every eligible American. Trump had wanted $2,000 checks, but signed anyway because the delay was holding up vaccine distribution while many people faced eviction and the loss of unemployment benefits.

Fueled in part by Trump’s 2017 Tax Cuts and Jobs Act, significant amounts of money moved into appreciating assets; many well-off people simply had more money to invest. Combined with the Fed’s low interest rates, this spurred unprecedented speculation and borrowing on margin for investment purposes. With these monetary and fiscal stimulus packages, the financial markets recovered quickly and continued to rise into 2021.

In March 2020, the S&P 500 fell some 20% from its highs in a record 16 days. A key measure of the 500 largest listed companies and the market overall, it is also a major indicator of the economy. A year later, the S&P 500 had recovered from its 2,304 low to a near-record close of 3,931 on February 17, 2021. Overall, the S&P 500 returned 15.15% in 2020.
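Percentage gains off a low are always larger than the decline that produced it: the climb from 2,304 back to 3,931 works out to roughly a 70 percent gain, and a 20 percent drop requires a 25 percent rebound just to break even. A small illustration using the index levels cited above:

```python
def pct_change(start: float, end: float) -> float:
    """Simple percentage change between two index levels."""
    return (end - start) / start

march_2020_low, feb_2021_close = 2304, 3931
print(f"{pct_change(march_2020_low, feb_2021_close):.1%}")  # ~70.6% gain off the low

# A drop of d requires a rebound of d / (1 - d) to get back to the prior level.
drop = 0.20
print(f"{drop / (1 - drop):.0%}")  # a 20% fall needs a 25% recovery
```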

The Dow Jones Industrial Average (DJIA) is another important indicator of the economy and financial markets, and one of the oldest (shown above). It indexes 30 “blue chip” companies, that is, companies with pricing power over their products, such as Apple, Chevron, Coca-Cola, Disney, and Procter & Gamble. The “Dow” crashed to 18,591 on March 23, 2020, from a high of just over 29,300 several weeks earlier. The dollar was also down, as were crude oil and many commodities, including gold. The Dow then recovered, rising to nearly 31,500 two months into the Biden presidency.

On March 6, 2021, the Senate passed a new $1.9 trillion coronavirus relief package. It came when stock markets were at record highs, Bitcoin had ballooned to over $50,000, and concerns about inflation, driven by increased spending and strained supply chains, had emerged. The bill, known as the American Rescue Plan Act of 2021 or “Build Back Better I,” proved timely as the new Delta variant of the virus appeared in the summer of 2021.

The new COVID-19 response had three main areas: pandemic response ($400 billion), including $14 billion for vaccine distribution; direct relief to struggling families ($1 trillion), notably $1,400 checks for individuals and unemployment benefits of $300 per week; and support for communities and small businesses ($440 billion, in multi-year tranches), especially transit systems and tourism areas hit hard by the pandemic.

We entered 2021 with an unbalanced economy: a roaring stock market alongside massive poverty. Years of supply-side economics gave us a highly technological society and appreciating financial assets, but one built on globalized supply chains and highly dependent on Russia and Saudi Arabia to support petro-intensive lifestyles and economic practices. Tax cuts transferred much of US wealth to the higher income brackets. Trump’s US$1.3 trillion tax cuts exacerbated the imbalances as the former president racked up US$7.8 trillion in national debt from his inauguration on January 20, 2017, to the Capitol riots on January 6, 2021, when Congress tallied the Electoral College votes and declared Biden the winner.

A fifth major stimulus package, the $1.9 trillion American Rescue Plan, was signed into law by President Biden on March 11, 2021. It helped states, cities, counties, and tribal governments cover increased expenditures from the COVID-19 pandemic and replenish lost revenue. Vaccinations increased dramatically, but in the middle of 2021 the Delta variant emerged, bringing a new wave of hospitalizations and deaths. The K-shaped recovery took on a new meaning: the vaccinated could resume normal activities while the unvaccinated made up the majority of those hospitalized and dying.

Inflation began to rise, with the all-items CPI increasing 6.2 percent over the twelve months ending in October 2021. A year of stimulus spending, tax cuts, low interest rates, and rising wages promised a booming economy, but it ran into chip shortages, clogged shipping ports, and labor shortages. Most painfully, the price of crude oil had climbed steadily since collapsing to around $25 a barrel early in the pandemic. At roughly $83 a barrel, it was the sharpest run-up since mid-2008, when crude went over $145 a barrel.

Concerns about inflation entered US policy discussions during the fall of 2021. The Infrastructure Investment and Jobs Act, which passed the Senate in August, was delayed as progressives wanted it tied to the third part of President Biden’s “Build Back Better” agenda. Media pressure eventually led the progressives to relent and pass the infrastructure bill without the American Families Plan, which had been watered down to $1.7 trillion over ten years.


We had a medical emergency; the new COVID-19 legislation is paying the bill and, hopefully, taking some of the sting out of the K-shaped recovery. What is important in the current legislation is its support for the sick and dispossessed, including those affected by closed businesses and the 9.5 million jobs that disappeared over the last year. Is inflation a major problem? The best cure for inflation is stopping the pandemic and restoring the circuits of food and other vital commodities.

Citation APA (7th Edition)

Pennings, A.J. (2021, March 8). COVID-19 and US Economic Policy Response. apennings.com. https://apennings.com/dystopian-economies/covid-19-and-the-us-economic-policy-response/

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea, where he teaches financial economics and sustainable development. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College and New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in Korea, he lives in Austin, Texas.

Five Generations of Wireless Technology

Posted on | February 8, 2021 | No Comments

The ubiquity, ease, and sophistication of mobile services have proven to be an extraordinarily popular addition to modern social and productive life. The term “generations” has been applied to wireless technology as a way to refer to the major disruptions and innovations in the state of mobile technology and associated services. These innovations include the move to data and the Internet protocols associated with the convergence of multiple forms of communications media (cable, mobile, wireline) and the wide array of services becoming available on portable devices like laptops and smartphones. We are now on the cusp of the fifth-generation (5G) rollout of wireless services, with intriguing implications for enterprise mobility, “m-commerce,” public safety, and a wide array of new entertainment and personal productivity services.

By 1982, the Federal Communications Commission (FCC) had recognized the importance of the emerging wireless communications market and began to define Cellular Market Areas (CMAs) and assign area-based radio licenses. It split the 40 MHz of radio spectrum it had allocated to cellular into two market segments: half would go to the local telephone companies in each geographical area and the other half, by lottery, to interested non-telephone companies. Although AT&T’s Bell Labs had effectively begun the cellular market, it estimated that the market would reach slightly less than a million subscribers by 2000 and consequently abandoned it during the divestiture of the regional phone companies. Meanwhile, financier Michael Milken helped the McCaw family buy up many of the other licenses, making them multibillionaires when they sold out to AT&T in the mid-1990s.

The first generation (1G) of wireless phones consisted of large analog voice devices with virtually no data transmission capability. This initial generation was developed in the 1980s through a combination of license lotteries and the rollout of cellular sites and integrated networks. It used multiple base stations, each providing service to a small adjoining cell area. Its most popular phone was the Motorola DynaTAC, sometimes known as “the brick,” now immortalized by financier Gordon Gekko’s early morning beach stroll in Wall Street (1987). 1G was hampered by a multitude of competing standards such as AMPS, TACS, and NMT. The Advanced Mobile Phone System (AMPS) was the first standardized cellular service in the world and was used mainly in the US.

The second generation (2G) of wireless technology was the first to provide data services of any significance. By the early 1990s, GSM (Global System for Mobile Communications) was introduced, first in Europe and later in the U.S. by T-Mobile and in other countries worldwide. GSM standards were developed by the Groupe Spécial Mobile committee, formed in 1982 as an offshoot of the European Conference of Postal and Telecommunications Administrations (CEPT). GSM was the standard that would allow national telecoms around the world to provide mobile services. Although voice services improved significantly, the top data speed was only 14.4 Kbps.

The second generation also marked the introduction of CDMA (Code Division Multiple Access) techniques. Multiple access technologies cram multiple phone calls or Internet connections into one radio channel. AT&T utilized Time Division Multiple Access (TDMA)-based systems, while Bell Atlantic Mobile (later Verizon) introduced CDMA in 1996. This second-generation digital technology reduced power consumption and carried more traffic, while voice quality improved and security became more robust. The Motorola StarTAC, introduced in 1996, was originally developed for AMPS but was sold for both TDMA and CDMA systems.

Further innovations sparked the development of the 2.5G standards, which provided faster data speeds. The additional “half” generation referred to the use of data packets. Known as the General Packet Radio Service (GPRS), the new standards could provide 56-171 Kbps of digital service. GPRS was used for Short Message Service (SMS), otherwise known as “text messaging,” MMS (Multimedia Messaging Service), WAP (Wireless Application Protocol), and Internet access. Being able to send a message with emojis, pictures, video, and even audio content to another device provided a significant boost to the mobile phone’s utility.

An advanced form of GPRS called EDGE (Enhanced Data Rates for Global Evolution), sometimes labeled 2.75G, was used for the first Apple iPhone; true 3G connectivity did not arrive until the iPhone 3G a year later.

Third-generation (3G) network technology was pioneered by Japan’s NTT DoCoMo, which launched the first commercial 3G network in 2001. Still, it was adopted slowly in other countries, mainly because of the difficulties obtaining the additional electromagnetic spectrum needed for the new towers and services. 3G technologies provided a range of new services, including better voice quality and faster speeds. Multimedia services like Internet access, mobile TV, and video calls became available, and telecom and application services such as file downloads and file sharing made it easy to retrieve, install, and share apps. 3G radio standards were largely specified by the International Mobile Telecommunications-2000 (IMT-2000) framework of the International Telecommunication Union (ITU), but the major carriers continued to evolve their own systems, such as Sprint and Verizon’s CDMA2000 and AT&T and T-Mobile’s Universal Mobile Telecommunications System (UMTS). UMTS was an upgrade of GSM based on the ITU’s IMT-2000 standard set, but an expensive one, as it required new base stations and frequency allocations.

A 3.5G generation became available with the introduction of High Speed Packet Access (HSPA), which promised peak rates of 14.4 Mbps, although 3.5-7.2 Mbps was more typical.

Fourth-generation (4G) wireless technology sought to provide mobile all-IP communications and high-speed Internet access to laptops with USB wireless modems, smartphones, and other mobile devices. Sprint unveiled the first 4G phone, the HTC EVO, in March 2010 at the communication industry’s annual CTIA event in Las Vegas. With a 4.3-inch screen, two cameras, and the Android 2.1 OS, the new phone was able to tap into the new IP environment. 4G technology was rolled out in various forms, all dedicated to broadband data and Internet protocols, with services such as VoIP, IPTV, live video streams, online gaming, and multimedia applications for mobile users.

While 3G was based on two parallel infrastructures using both circuit-switched and packet-switched networking, 4G relied on packet-switching protocols. 4G LTE (Long Term Evolution) refers to wireless broadband IP technology developed by the Third Generation Partnership Project (3GPP). “Long Term Evolution” meant the progression from 2G GSM to 3G UMTS and into the future with LTE. The 3GPP, an industry group, designed the technology with the potential for 100 Mbps downstream and 30 Mbps upstream. Always subject to environmental influences, data rates were expected to reach 1 Gbps within the following decade.[2]
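The practical difference between generations is easiest to grasp as download time for the same file. A rough, back-of-the-envelope comparison using the nominal peak rates mentioned in this post (real-world throughput is typically far lower):

```python
# Nominal peak data rates mentioned above, in bits per second.
# Real-world throughput is usually a fraction of these figures.
peak_rates_bps = {
    "2G GSM (14.4 Kbps)": 14.4e3,
    "2.5G GPRS (171 Kbps)": 171e3,
    "3.5G HSPA (7.2 Mbps)": 7.2e6,
    "4G LTE (100 Mbps)": 100e6,
}

file_size_bits = 5 * 8e6  # a 5-megabyte file

for generation, rate in peak_rates_bps.items():
    seconds = file_size_bits / rate
    print(f"{generation}: {seconds:,.1f} seconds")
```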

Some early 4G phones, such as Sprint’s HTC EVO, accessed WiMax (Worldwide Interoperability for Microwave Access), based on the IEEE 802.16 standard, with a range of some 30 miles and transmission speeds of 75 Mbps to 200 Mbps. Later 4G phones from Apple (iPhone 5-7), Samsung, and others adopted LTE instead.

4G WiMax provided data rates similar to 802.11 Wi-Fi standards with the range and quality of cellular networks. The key difference has been the softer handoffs between base stations, which allow for more effective mobility over longer distances. Going to IP enables mobile technology to integrate into the all-IP next-generation network (NGN) forming to offer services across broadband, cable, and satellite communication media.

In October 2020, Apple unveiled the first iPhones to support fifth-generation (5G) connectivity with the iPhone 12. This meant Apple had to add new chips, antennas, and radio-frequency filters to the phone. 5G wireless communications represent a major new set of challenges and opportunities. The frequencies used require higher levels of power and more base stations because the range of transmission is shorter than LTE’s. 5G also affords new opportunities, such as connections up to ten times faster than LTE and reduced latency. Faster speeds mean new and enhanced cloud-based services for games and video, virtual and augmented reality, IoT in homes and factories, and enhanced telemedicine applications.

Some 5G bands use frequencies 10 to 100 times higher than the radio waves used for 4G and Wi-Fi networks. We need to know more about the power dynamics of 5G and under what conditions, if any, it can break molecular bonds or pose health risks from long-term exposure.
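One reason 5G cells must be smaller is wavelength: higher frequencies mean shorter wavelengths, which attenuate faster and penetrate walls poorly. A simple calculation using a few representative bands (example frequencies, not a complete band list):

```python
C = 3.0e8  # speed of light in meters per second

example_bands_hz = {
    "4G LTE (1.9 GHz)": 1.9e9,
    "Wi-Fi (2.4 GHz)": 2.4e9,
    "5G mid-band (3.5 GHz)": 3.5e9,
    "5G mmWave (28 GHz)": 28e9,
}

for band, freq in example_bands_hz.items():
    wavelength_cm = C / freq * 100  # wavelength = c / f, converted to centimeters
    print(f"{band}: ~{wavelength_cm:.1f} cm wavelength")
```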

Notes

[1] For a history of wireless communications.
[2] This is a great review of the 4 generations of wireless technologies.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 3: The FCC and Consolidation of Broadband

Posted on | February 5, 2021 | No Comments

In this post, I look at the transition of Internet data communications from a competitive market structure to one dominated by a few Internet Service Providers (ISPs). As digital technology allowed cable and telecommunications companies (telcos) to transition from traditional telephony to packet-switched Internet Protocol (IP) services, deregulation allowed them to dominate broadband services. It also allowed them not only to move data but to diverge from the traditional “common carriage” communications policy that separated the transfer of data from the provision of content like entertainment and news.

In Part I of this series, I looked at the emergence of the ISPs and the regulatory framework in the USA that classified them as “enhanced services.” This designation was based on the Federal Communications Commission’s (FCC) Second Computer Inquiry in 1981 that exempted online services from a number of requirements that had been imposed on telephone networks. Part II discussed the transition from dial-up modems in the early days of data communications to high-speed Digital Subscriber Lines (DSL). These “broadband” connections accelerated the business and consumer adoption of the Internet in the late 1990s. In Part 4, I will address issues of net neutrality facing the Biden administration in an era of “smart” or “edge technologies” that includes the Internet of Things (IoT) and “connected” cars.

Despite the design and efforts of the Clinton-Gore administration to create a competitive environment, the Internet came to be increasingly controlled by a small number of ISPs. It is important to understand the policy environment and administrative actions that produced the oligopolistic market structure that dominates broadband today. Policy changes allowed telcos to transition from neutral transmitters of communication to communicators themselves.

Broadband services in the USA are dominated by large integrated service providers such as AT&T, Comcast, Sprint, and Verizon. These companies have pursued “triple play” service bundles, combining high-speed Internet, cable TV, and IP phone services; some also provide mobile services. They have been merging with content providers to distribute entertainment, education, and news, as well as to move all the other Internet traffic. AT&T merged with Time Warner, giving it access to Warner Bros., HBO, and Turner/CNN. Comcast completed its merger with NBCUniversal, and Verizon bought AOL and Yahoo! Unfortunately, these deals have failed to return the huge rewards the companies were aiming for and have deterred sufficient broadband rollout.

The highly competitive Internet service provider environment of the 1990s was significantly compromised by the Bush administration’s Federal Communications Commission (FCC). Its decisions favored cable companies and telcos and led to a consolidation of control over the Internet. The FCC’s actions raised concerns that powerful ISPs could influence the flow of data through the Internet and discriminate against some content providers or users to the detriment of consumers.

In 2002, the FCC ruled that “cable modem service” was an information service and not a telecommunications service. Cable companies like Charter, Comcast, Cox, and Time Warner became unregulated broadband providers, exempted from the common-carrier regulations and network access requirements imposed on the telcos. The Supreme Court’s decision in National Cable & Telecommunications Association v. Brand X Internet Services upheld the classification of cable modem services as Title I “information services,” despite sharp criticism from Justice Scalia, who argued that cable modem service clearly included a telecommunications component alongside its content offerings. The Justice had no hesitation in calling it “bad law.”[2]

Then, in 2005, another FCC decision effectively made the telcos unregulated ISPs. FCC WC Docket 02-33 allowed their DSL broadband services to also become unregulated “information services.” This effectively allowed a few telcos, such as Verizon and BellSouth, to take over what had previously been a competitive ISP industry. The ruling allowed them to offer broadband fiber and DSL Internet access while rolling back previous requirements such as granting other ISPs “access to facilities” and interconnection. Smaller ISPs had been allowed to physically connect to the “common carrier” telco facilities so that their customers could access the larger Internet.

Internet innovation came from other sources and distracted the public from broadband carrier issues. Facebook and Flickr were launched in 2004, and Twitter went online in 2006. Microsoft’s Xbox Live expanded its online offerings, and the music streaming service Spotify launched in Europe in 2008. Google bought Android in 2005 and YouTube the following year. Netflix started its streaming service in 2007, and the first iPhone was also released that year.

The success of these innovations did not escape the telcos, who wanted a piece of the action. They wanted to move beyond being mere carriers of information to become providers of entertainment and informative content. This was evidenced by Verizon’s introduction of FiOS (Fiber Optic Service) TV in 2005 and AT&T’s U-verse in 2006. ISPs looked to dominate home broadband service by bundling TV, Internet, and telephone voice service over their high-speed IP networks.

In 2003, Columbia Law professor Tim Wu coined the term “net neutrality” to stress the importance of allowing the free flow of data for the Internet’s future. It is based on the notion of “common carriage,” a legal framework developed to ensure that railroads would serve all businesses and municipalities. It basically means that the network should stay neutral and let the bits flow uninterrupted from device to device at the highest speeds available. This is how the Internet was designed, but the carriers’ networks have been around since the telegraph and telephone and have developed their own legal and technical ways to survive.

The Internet’s political and social impact became more apparent with the presidential campaign of Barack Obama in 2008. The Pew Research Center found that some 74% of Internet users interacted with election information that year. A significant number of citizens received their news online, communicated with others about the elections, and received information from campaigns via email or other online sources.

In 2010, the Obama administration began to write new rules dealing with Internet providers that would require ISPs to treat all traffic equally. In what were called the “Open Internet” rules, the new administration began to design a framework to restrict telecom providers from blocking or slowing down specific Internet services.

In the next post, I will look at the development of net neutrality rules under the Obama administration. Later, the Trump administration renewed attempts to free ISPs from net neutrality obligations by returning them to Title I. A major question for the Biden administration is whether to return to Title II classification and strengthen net neutrality rules.

Ⓒ ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Originally from New York, he started his academic career at Victoria University in Wellington, New Zealand, before returning to New York to teach at Marist College and spending most of his career at New York University. He has also spent time at the East-West Center in Honolulu, Hawaii. When not in the Republic of Korea, he lives in Austin, Texas.
