Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Those Media Products are Misbehaving Economic Goods

Posted on | November 14, 2015 | No Comments

Bad, bad media. Or so economists would have us believe. Media and information products just don’t fit the mold, or should I say, model. Most economic thinking is based on models that make simple assumptions about the types of goods and services that are produced and purchased. Economists like their tight little models and the most relevant to their profession are the ones that bring together supply and demand curves and justify their belief in the power of the equilibrium price, that magical point where all suppliers and consumers are happy.

But this model works best when the product “behaves” – when it is purchased by one consumer and is consumed in its entirety. The models break down when applied to media products like books, broadcast television, broadband, and streaming music. To understand what type of economic goods these are – products or services that can command a price when sold – we have to start the analysis with two important concepts. These concepts help us understand the dynamics of media goods and their ramifications not only for pricing but also for public policy.

The following video makes distinctions between different economic goods: private goods and public goods. It also specifies the characteristics of common goods and club goods, topics that will be discussed in a future post.

Economists like to talk about two characteristics of goods and services called rivalry and excludability. Rivalry means that when someone purchases a good, it is consumed entirely by that person. This phenomenon is sometimes called subtractability because the purchase, in a sense, subtracts the product from the economy. Economists usually use some physical good like a hamburger as an example. When a burger is purchased, it is used up, consumed, subtracted from the market. A book, on the other hand, is not used up. It can be read again and again. It can be given to another person whole, and its content read in part or entirely without its consumption – without its subtraction. What about watching a movie on Netflix? Or watching it at the cinema? How about purchasing a DVD? How do these goods differ, and what are the economic and policy consequences?

Some media products can be consumed without paying, such as broadcast media and much of the content on the Internet. This raises the issue of excludability. Excludability is when consumption is limited to paying customers – the degree to which you can exclude non-paying customers. How can you exclude someone from using or consuming your media product if they do not pay? Broadcast media faced this problem. Radio, and then television, transmitted signals that could be picked up by anyone with the appropriate receiving equipment. The digitization of sound encoded into MP3s created a crisis in the music industry, as the files could easily be shared over the Internet with peer-to-peer applications like Napster. Broadcast media are nonrivalrous, meaning they are not consumed, but also nonexcludable, meaning people can access them without paying.

Even newspapers that are bought and discarded, say on a train, can be used by someone else. Newspapers were the first to start using advertising to alleviate this problem. In fact, the more people a newspaper reached, the more valuable an advertising expenditure became. Media goods became “dual product markets” where content is sold to audiences, and audiences are packaged and sold to advertisers. It becomes an interesting pricing calculus for media producers: charge more for content, and consequently shrink the audience, or focus on building a larger audience so that higher advertising rates can be charged.
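This pricing calculus can be made concrete with a toy model. All the numbers below are hypothetical illustrations, not estimates for any real publication; the point is only that raising the content price shrinks the audience that can then be sold to advertisers, so total revenue peaks somewhere in between.

```python
# Toy dual-product-market model. The demand and ad-rate parameters are
# made-up illustrations, not real figures for any publication.

def audience(price):
    """Hypothetical linear demand: a higher cover price means fewer readers."""
    return max(0, 1_000_000 - 400_000 * price)

def total_revenue(price, ad_rate_per_reader=0.50):
    """Content revenue plus advertising revenue (advertisers pay per reader)."""
    readers = audience(price)
    return price * readers + ad_rate_per_reader * readers

# Compare free, ad-supported distribution against two paid price points.
for p in (0.0, 1.0, 2.0):
    print(f"price ${p:.2f}: revenue ${total_revenue(p):,.0f}")
```

In this sketch, giving the content away earns $500,000 from advertising alone, charging $2.00 earns the same $500,000 from a much smaller audience, and the $1.00 price in between earns $900,000 – the “right mix” the producer is searching for.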

Digital media economic goods need to be better understood in terms of their economic characteristics. Consider smartphones, books, games, game consoles, movies, television shows, music CDs, music streaming, and broadband service. Each displays varying degrees of rivalry and excludability.

In a future post, I want to outline four different types of economic goods: private goods, public goods, common goods, and club goods. Each type displays varying degrees of rivalry and excludability, and media and IT goods fall along a continuum among them.
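The four categories follow directly from combining the two characteristics discussed above. A minimal sketch of the standard two-by-two taxonomy – the example goods in the comments are illustrative placements, not definitive classifications:

```python
# The standard 2x2 taxonomy: a good's category follows from whether it is
# rivalrous (used up by one consumer) and excludable (non-payers can be
# kept out). Example goods in the comments are illustrative only.

def classify(rivalrous: bool, excludable: bool) -> str:
    if rivalrous and excludable:
        return "private good"   # e.g., a hamburger or a DVD
    if rivalrous and not excludable:
        return "common good"    # e.g., fish in the open ocean
    if not rivalrous and excludable:
        return "club good"      # e.g., a streaming subscription
    return "public good"        # e.g., broadcast television

print(classify(rivalrous=False, excludable=False))  # broadcast TV -> public good
print(classify(rivalrous=False, excludable=True))   # Netflix stream -> club good
```

The interesting cases in media are exactly the ones that move around this grid: the same film is closer to a private good on DVD, a club good on Netflix, and a public good on broadcast television.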


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

TPP and the Role of Intellectual Property in the US Export Economy

Posted on | November 7, 2015 | No Comments

With the Trans-Pacific Partnership (TPP) trade deal under discussion, it’s useful to look at some of the changes in the world economy, and specifically the US export economy and the increasing role of intellectual property (IP). As it stands, I’m not for the trade deal, but I feel it’s important to parse through the details of the deal to understand what kind of economy we have developed, who benefits, who struggles, and what we can do going forward. In this post, I look at the trade context that the TPP is being negotiated in, particularly the status of IP.

What exactly is the TPP? It is a trade, investment, and juridical pact covering roughly 40 percent of the world’s economy that was negotiated between the US and 11 other Pacific Rim nations:

Australia, Brunei, Canada, Chile, Japan, Malaysia, Mexico, New Zealand, Peru, Singapore, and Vietnam.


Taiwan, the Philippines, and South Korea have expressed some interest in joining the TPP. China is being polite, but recognizes that it is a central target of this agreement. It is currently awaiting US Congressional approval. (Update: President Trump has recently signed an executive order to end the US involvement in the TPP)

It’s unclear what kind of impact the TPP will have on very relevant issues like agriculture, labor rights, public health, and sustainable energy. The TPP’s positions on Internet freedom and net neutrality are particularly important areas in need of public scrutiny. Also, the trade dispute settlement system suggests some real threats to American national sovereignty. These issues raise important questions about the type of society Progressives want to create, including aspects of the global economy that should be enhanced and sustained.

US exports have nearly tripled since the World Trade Organization (WTO) was created, although the US continues to run trade deficits, primarily due to oil imports from a number of countries. From slightly less than $800 billion in 1995, U.S. exports grew to about US$2.28 trillion in 2013.[1] Almost half of that was in intellectual property or “IP” industries, based on figures from the WTO.

We like to criticize China’s dominance in manufacturing exports, but its share of world manufacturing export revenues is about 18%, while the US share of intellectual property export revenue is nearly 40%. That is a commanding share of the IP market. While Europe is a close second, its figures are somewhat misleading, as they include cross-border transactions between European countries. Imagine if all the transactions between US states were counted as exports.

Of the world’s reported US$329 billion in IP export revenues, some $129 billion is captured by US interests. Food and Agriculture is still the largest US export category at $162 billion. Automotive is next at $125 billion, followed by Finance & Insurance at $110 billion, Information Technology goods at $108 billion, and Aerospace at $105 billion.[2]

It’s useful to break this IP figure down. Perhaps surprisingly, the largest category of IP is patents, accounting for US$45 billion. Royalty payments for patent use have gone up substantially as commerce and production have globalized. A close second is the export of software at $43 billion; this includes about $3 billion in video games. A distant but significant third, at $17 billion, is the traffic in trademarks, the legal right to use a logo, name, phrase, song, or symbol.

Tied for third is the major copyright category Film/TV/Music/Books at $17 billion. A note is in order here: the WTO calculates movies and music as part of “audiovisual services,” but Progressive Economy has added them into the IP statistics, as copyrights are an important component of IP. Intellectual property issues are coordinated internationally under agreements with the World Intellectual Property Organization (WIPO), based in Geneva, Switzerland.
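Adding up the categories is a useful sanity check on these figures. The itemized categories account for about $122 billion of the roughly $129 billion US total, leaving around $7 billion in smaller, unitemized IP revenues (amounts in billions of US dollars, as reported above):

```python
# Category breakdown of the roughly $129B in US IP export revenue
# (figures in US$ billions, taken from the post above).
ip_exports = {
    "patents": 45,
    "software": 43,             # includes about $3B in video games
    "trademarks": 17,
    "film/tv/music/books": 17,  # copyrights, per Progressive Economy
}

itemized = sum(ip_exports.values())
print(f"itemized categories: ${itemized}B of ~$129B")  # $122B
print(f"unitemized remainder: ~${129 - itemized}B")    # ~$7B
```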

For better or worse, American trade dynamics have changed over the last 50 years, especially with its Pacific neighbors. While manufacturing is important, it has been difficult for the US to compete with low-cost labor in other parts of the world. On the other hand, the US has been going through a knowledge and technological revolution largely based on its ongoing use of militarization as a national system of innovation.

The modern era of globalization was a result of commercializing the fruits of the Cold War and Middle East invasions. Computers, the Internet, space-based satellite systems, and fiber optic cables for communications came out of government-subsidized research and initial utilization by the military-industrial complex. These and other technologies like big data processing, geolocation recognition, mobile technologies, and remote sensing have come together to create new global circuits of production, logistics, and strategic communications that have transformed creative, financial, and manufacturing industries.

Modern capitalism is optimized to provide shareholder returns, not employee benefits or national production. Labor and national interests need strong democratic will formation and participation to ensure a flow of resources to citizens as well as education, infrastructure, military and other social needs. Economists will argue that the US has a comparative advantage in intellectual property, but employment opportunities are largely in high tech/creative clusters like San Francisco, Los Angeles, Boston, Austin, and New York.

As the trade figures show, globalization is creating a “rentier” economy. Wealth is flowing more to the owners of resources rather than the producers of goods and services. While landowners have long been the beneficiaries of rental income, owners of intellectual property have seen increasing profits from royalty and licensing fees. Global supply chains, an important cost factor in the products Americans like to buy, draw heavily on the use of key intellectual properties to reduce costs and increase profits. Patents, copyrights, trademarks, and even business methods have become a new global force, shaping the world’s political economy for the rentiers’ benefit.

One of the policy issues to be addressed is that profits on intellectual property can be sheltered by selling the rights to a series of subsidiaries (“Double Irish” or “Dutch Sandwich”) in tax havens like the Cayman Islands.

Intellectual property can be a serious contributor to US economic welfare but is not a panacea for a country that is desperately struggling in the global economy that it created. It may be that the Democrats need TPP to secure funding and electoral votes from California, but the US, in general, needs to do a lot of homework on TPP. It especially needs to assess how it wants to structure its international trade so that it can help tackle long-term domestic issues like climate change, debt reduction, energy independence, and food security.

Notes

[1] Exports in 1995 were about $800 billion ($794,387 million) and grew to US$2.3 trillion in 2014, according to the US Bureau of Economic Analysis. Exports also grew from about 5% of GDP to nearly 15% of GDP.
[2] The Progressive Economy website is a project of the GlobalWorks Foundation.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he was on the faculty of New York University for 15 years. Previously, he taught at Hannam University in South Korea, St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Microsoft and the IBM PC Case Study: The Deal of the Century

Posted on | November 1, 2015 | No Comments

The microcomputer was starting to become popular in the late 1970s and finally caught the attention of IBM. Initially resistant, “Big Blue” became worried that it was “losing the hearts and minds” of its clients. When the Apple II started to show up in many corporate environments, IBM officials grew concerned about the encroachment of these small “microcomputers” into their minicomputer and mainframe businesses.[1]

Although the Apple II’s capabilities were limited, it ran exciting software such as the VisiCalc spreadsheet. IBM had attempted a microcomputer of its own in September 1975, the 5100, but at 55 lbs and a cost of $20,000, the “portable” computer failed to take off. IBM responded to the growing popularity of the personal computer by going outside the company and contracting with new hardware and software suppliers to produce its own “Personal Computer,” or PC, as it came to be called. By mid-1980, “Big Blue” was moving quickly to assemble and market its version of a microcomputer.

The key to the emerging microcomputer business was the microprocessor “chip.” IBM chose to build the PC around the cheaper 8-bit Intel 8088 chip, despite the availability of the much more powerful 16-bit 8086 chip. Its decision was based on two reasons. The first was that IBM feared their microcomputer might compete with their mini and mainframe businesses and wanted the less powerful chip to keep the PC in a different class. The second and probably more important reason was that a small industry of hardware and software products had already been developed for the Intel 8088. Many more supporting chips were available for the older chip, and this meant that a functioning system could be assembled within a shorter period.

One of the new suppliers that IBM contacted was Microsoft. Initially, IBM approached them for software and a computer operating system. The meeting was facilitated by the connection between the IBM CEO and Gates’ mother, Mary; both served on the board of the same United Way charity. Apparently, IBM’s briefing books listed Microsoft as the major producer of operating systems. Bill Gates was surprised by the request for an operating system and quickly referred the IBM representatives to Gary Kildall and his operation called Intergalactic Digital Research. Kildall had written CP/M and was the predominant figure in microcomputer operating systems.

When the IBM representatives arrived, they were disappointed with their reception. The story varies, but apparently Dr. Kildall planned to fly his airplane that morning and asked his wife, Dorothy, to meet with the visitors. She stalled them and asked her lawyer to go over the IBM nondisclosure agreement (NDA). Together they decided not to sign it. Another version has Kildall returning in the afternoon and signing the NDA, but refusing IBM’s offer to buy CP/M outright for $250,000, holding out instead for licensing it at $10 a copy. In any case, IBM’s representatives left disappointed and returned to Microsoft. Afraid to lose the opportunity to sell BASIC and other languages with the new IBM microcomputer, Gates told them he could get them an operating system.

Their secret “ace-in-the-hole” was Paul Allen’s connection to a programmer who was working on a variation of the CP/M operating system that IBM wanted. The programmer was Tim Paterson, who worked at a company called Seattle Computer Products (SCP). Paterson had spent much of 1980 reworking the code from a CP/M manual he bought. When Allen approached the company, SCP was suspicious of Microsoft’s intentions. But they were very interested in getting a cash settlement. Eventually, they agreed to turn over the software, called Q-DOS (Quick and Dirty Operating System), for payments that totaled some $50,000. This was the software that would soon make Gates and Allen among the richest men in the world.

On August 12, 1981, Big Blue introduced its personal computer. The Intel 8088 chip operated at 4.77 MHz and contained some 29,000 transistors. It handled data internally in 16-bit words but was limited to 8 bits externally. It used ASCII code for representing information, which made its characters show up crisply on the 80-character monochrome screen, a major improvement over the Apple II, especially for word processing and other office applications.[2] At its core was PC-DOS, licensed from Microsoft, but CP/M-86 and a Pascal-based OS from the University of California at San Diego were also available.[3] The IBM PC sold initially for $2,880 and had 64 Kbytes of memory along with a floppy disk.[4] It was a major success that first year, with over 670,000 computers sold.[5]

Microsoft and the Clones

But, in one of the biggest business blunders of all time, IBM did not get an exclusive contract for PC-DOS. Gates pushed for an agreement that would allow Microsoft to license the OS to other manufacturers. Lessons from the mainframe business showed that companies could develop “plug-compatible” machines that ran software written for large IBM computers. The Amdahl Corporation, for example, developed a processor for its Model 470 V/6 that ran IBM 360 software. Japanese companies also made significant entries into the American computer market via the plug-compatible business. Fujitsu began building Amdahl computers in Japan, and Hitachi began selling similar machines in the USA under the name of National Advanced Systems, a division of the National Semiconductor Corporation. With this agreement, Gates was free to make similar deals with companies who could “legally” replicate the IBM computer and run the “MS-DOS” software.

While the IBM PC became the industry standard behind an expensive marketing campaign using a Charlie Chaplin look-alike, Compaq was leading the way for other microcomputer manufacturers looking to produce “IBM-compatible” machines. Compaq invested in the legal and technical expertise to “reverse engineer” a crucial part of the IBM architecture that was produced in-house by IBM. This was the BIOS, the Basic Input/Output System.

Originally developed by Gary Kildall, the BIOS concentrated the hardware-dependent aspect of his CP/M operating system into “a separate computer code module” that was stored in a read-only memory chip, thus the name ROM-BIOS. This chip allowed CP/M to be adapted to many different computers that were being developed around Intel chips.[6] IBM had developed and copyrighted its own BIOS, which needed to be “reverse engineered” to allow another manufacturer to produce it.

Compaq put 15 engineers on the job and invested over a million dollars in the project to produce a portable PC. Despite IBM’s strength, the clone makers ultimately had lower overhead and could compete with cheaper computers. What made these clones popular, though, was that they could run the same software as the IBM machine. Compaq and others such as Commodore, Tandy, Zenith, and most impressively, Dell provided “IBM-compatible” machines that could run the most popular non-Apple programs. The result was a proliferation of PCs, or as they came to be known, “PC clones.” Compaq sold over US$100 million of its “portable” personal computers in its first full year of business.

In one of the most extraordinary business arrangements in modern history, Microsoft leveraged its knowledge of the Intel microprocessor environment to outmaneuver IBM and establish its operating system as the dominant one for the PC. In a strategy Microsoft executive Steve Ballmer called “Riding the Bear,” Microsoft worked with IBM until it was strong enough to go on its own, ultimately becoming one of the richest companies in the world by having its software on nearly every PC. This meant first developing programming software for the fledgling Intel-based microcomputer industry and then, in association with IBM, standardizing an operating system for the non-Apple microcomputer industry.

Triumph of the Nerds video on reverse engineering and Compaq by Robert X. Cringely.

The development of the microprocessor-based computer industry by Intel and Apple resulted in the wide-scale use of electronic spreadsheets, what some would call the “killer app” of the personal computing revolution. VisiCalc, Multiplan, SuperCalc, Lotus 1-2-3, and Quattro were the first spreadsheets to become available. By the mid-1980s, electronic spreadsheets had made their way into the corporate world and become an integral tool for the progression of digital monetarism.

Citation APA (7th Edition)

Pennings, A.J. (2015, Nov 2) Microsoft and the IBM PC Case Study: The Deal of the Century. apennings.com https://apennings.com/how-it-came-to-rule-the-world/microsoft-and-the-ibm-pc-case-study-the-deal-of-the-century/

Share

Notes

[1] Quote from Jack Sams, IBM PC project manager as interviewed by Robert X. Cringely in Triumph of the Nerds based on his book Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date. Cringely has a new book about the decline of IBM.
[2] The impact of ASCII on the IBM PC from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 268.
[3] The three operating systems from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 268.
[4] Campbell-Kelly and Aspray’s (1996) Computer: A History of the Information Machine. Basic Books. p. 257.
[5] Information on IBM PC release from Michael J. Miller’s editorial in the September 4, 2001 issue of PC MAGAZINE dedicated to the PC 20th anniversary.
[6] BIOS information from Cringely, p. 58.





Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Mind Mapping in Higher Education

Posted on | October 22, 2015 | No Comments

An underutilized creative technology that enhances the educational experience is a writing process called mind mapping. Mind mapping is a visual form of contemplation and word association that can be used for note taking, journaling, and creative thinking. Connecting words with branching lines and adding colors and images creates a mental dynamism that facilitates getting ideas into a concrete form on paper or an electronic document that can be readily reviewed and shared.

This post describes mind maps, how they are used, their value to students, and how they can be used for lectures, class exercises, and brainstorming activities.

What are mind maps?

Inspiration Software, Inc. defines a mind map as “a visual representation of hierarchical information that includes a central idea surrounded by connected branches of associated topics.” Mind maps can be either drawn by hand or designed in a software application such as those offered by Inspiration, Mindjet, or Mindmeister. The basic technique draws on the human tendency to make mental connections by word association and the power of writing to stimulate thoughts and creative ideas.

How do they work?

Mind maps start in the middle of a page or document with a central idea and expand outward using keywords on branches. They move from the general to the specific, becoming more detailed as the map expands outward. Single keywords are better than phrases or sentences, and every distinct keyword or grouping of words (or image) should sit on its own line. Be sure the lines are the same length as the word or image they support. It is important to economize on space, as reaching the perimeter of the document restricts your thinking and means you are literally running out of room for ideas.

How students can use mind maps

These diagrams can be used in several ways. At their best, they help classify ideas or generate new connections between ideas, the very definition of creativity. They can make note-taking in classes more efficient, engaging, and fun. Notes taken in mind maps are also easier and faster to review, making studying for a test more efficient. Mind maps can help organize ideas for writing projects as they are basically non-linear outlines. Examine this mind map on Shakespeare.

Another way to use mind maps is to start a class by asking the students to map out the previous class. This facilitates learning by reinforcing key ideas, directing students’ attention to the class at hand and its topics, and giving them visible information to share in class. Because they are good at surfacing details, mind maps are also helpful for event planning, such as class trips, group presentations, or graduation parties.

Using mind maps for presentations

Mind maps can be used to present information in class lectures, although certain precautions should be taken. They are inherently personal, so other viewers should be guided through them step-by-step. Mind maps projected from a computer onto a screen should be accompanied by specific guidance. First, the principles of the hierarchical structure should be reviewed. The presentation should then proceed from the central idea to the details, showing the big picture and the significance of related concepts.

It is efficient to organize the critical ideas of the map clockwise, starting at 1 o’clock. This also makes the map more intelligible by providing a familiar structure.

Mind maps should be converted to JPEGs and then to PDFs for additional usability. With a PDF, you can highlight a section of the map to focus on during a presentation. Using the mouse, you can click and draw squares and rectangles around the words, phrases, or images relevant to the lecture.

Group brainstorming

Mind maps drawn on a whiteboard, blackboard, or large newsprint pad on an easel can be used with groups of students to brainstorm and exchange ideas. This means capturing each person’s thoughts while simultaneously stimulating the group’s best thinking. Words are again organized radially around the central idea. Prioritization can be postponed until subsequent stages, when more sequential or hierarchical structures can be arranged in a new mind map, a Gantt chart, or another form of visual organization. Presenting ideas visually makes it easier for students to follow along and contribute to the group process.

Summary

Mind mapping is a valuable tool for higher education activities. Students can use them for notetaking and reviewing previous lectures. They can also be used for class exercises that stimulate creative thinking. Mind maps have a magnetic quality where ideas attract similar thoughts. Therefore, they can be used to increase student concentration and focus.

Faculty can use mind maps to develop class presentations and actually use them to present their lectures, drawing on a board or projecting from a computer onto a screen. Although these diagrams are not immediately transparent to outside viewers, they can be used very effectively with proper guidance. They are also good for reviewing previous lectures or reading assignments. Lastly, they can be used for brainstorming and facilitating student participation.

Citation APA (7th Edition)

Pennings, A. J. (2015, Oct 22). Mind Mapping in Higher Education. https://apennings.com/educational-innovations/mind-mapping-in-higher-education/

Notes

[1] Images from http://www.tonybuzan.com/ website. Tony Buzan is generally considered the inventor of mind mapping.





Anthony J. Pennings, PhD is a Professor at the State University of New York, South Korea and a Research Professor at Stony Brook University on Long Island, NY. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

Markets: Pros and Cons

Posted on | October 4, 2015 | No Comments

The term “market” is widely used to refer to any arrangement, institution, or mechanism that facilitates contact between potential sellers and buyers and helps them negotiate the terms of a transaction. In other words, a market is any system that brings together the suppliers of goods and services with potential customers.

The term “market” evokes imagery of a medieval city center or town square filled with merchants peddling food or wares and haggling over prices with interested customers. A market depends on the conditions of voluntary exchange where buyers and sellers are free to accept or reject the terms offered by the other. Voluntary exchange assumes that the acts of trading between persons make both parties to the trade subjectively better off than they were before the trade.

Markets also assume that competition exists between sellers, and between buyers. Economic models of markets are based on the idea of perfect competition, where no one seller or buyer can control the price of an “economic good.” In this vision of a rather utopian economic system, the acts of individuals, working in their own self-interest, operate collectively to produce a self-correcting system. Prices move to an “equilibrium point” where producers of economic goods supply an adequate amount to meet the demand of consumers willing to pay that price. Unless someone was cheated, both parties end the transaction satisfied because the exchange has gained each of them some advantage or benefit.
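The textbook version of this equilibrium can be made concrete with linear supply and demand curves: demand falls as price rises, supply grows as price rises, and the equilibrium price is where the two quantities match. The coefficients below are arbitrary illustrations, not data:

```python
# Linear demand Qd = a - b*P and supply Qs = c + d*P intersect where
# Qd == Qs, giving the equilibrium price P* = (a - c) / (b + d).
# The coefficients are arbitrary, chosen only for illustration.

def equilibrium(a, b, c, d):
    p_star = (a - c) / (b + d)   # price where supply meets demand
    q_star = a - b * p_star      # quantity traded at that price
    return p_star, q_star

p, q = equilibrium(a=100, b=2, c=10, d=4)
print(f"equilibrium price: {p:.2f}, quantity: {q:.2f}")  # 15.00, 70.00
```

At any price above 15, suppliers offer more than consumers will buy and the surplus pushes the price down; below 15, a shortage pushes it up. That self-correction is the “magical point” the models celebrate, and it is exactly what misbehaving media goods disrupt.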

An important condition is the “effective demand” of consumers – do the buyers of economic goods have sufficient purchasing power, mainly money? Consumers must have the desire, plus the money to back it up. Central to any market is a mutually accepted means of payment. While parties may exchange goods and services by barter, most markets rely on sellers offering their goods or services (including labor) in exchange for currency from buyers. Any medium of exchange depends on trust in the monetary mechanism, as buyers and sellers must readily accept and part with it for a market to function effectively. Money has taken many physical forms over its long history, most notably gold. Gold has striking physical attributes: it doesn’t rust, it doesn’t chip, and it can be melted into a variety of shapes. Other metals such as silver and platinum have also served as money.

It is interesting that societies gravitate towards the use of some symbolic entity to facilitate these transactions. At its most basic level, money can be anything that a buyer and seller agree is money. At times, commodities such as rice, tobacco, and even alcohol have served the role of money. Market enthusiasts often overlook the importance of money, focusing instead on the behaviors of market participants.

The pros and cons of markets are hotly debated today. Some believe markets are an ideal system to organize society. They often cite Adam Smith’s famous “invisible hand” as the God-given mechanism that organizes a harmonious society based on market activity. Others believe markets are prone to failure and give rise to unequal conditions and challenge democratic participation.

One of the best explanations of the strengths and weaknesses of the market system comes from The Business of Media: Corporate Media and the Public Interest (2006) by David Croteau and William Hoynes. They point to the strengths of markets such as efficiency, responsiveness, flexibility, and innovation. They also discuss the limitations of markets, including enhancing inequality, amorality, the failure to meet social needs, and the failure to meet democratic needs. Below is a summary of some of their key ideas.

The market provides efficiency by forcing suppliers to compete with each other and into a relationship with consumers that requires their utmost attention. The suppliers of goods and services compete with one another to provide the best products, and the competition among them forces them to bring down prices and improve quality. Firms become organized around cutting costs and finding advantages over other companies. They have immediate incentives to produce efficiencies as sales and revenue numbers from market activities provide an important feedback mechanism.

Responsiveness is another feature of markets that draws on the dynamics of supply and demand. Companies strive to adapt to challenges in the marketplace. New technologies and expectations, incomes and tastes and preferences of consumers require companies to make changes in their products, delivery methods, and retail schedules. Likewise, consumers respond to new conditions in their ability to shop for bargains, find substitute goods, and adopt new trends.

Flexibility refers to the ability of companies to adapt to changing conditions. In the absence of a larger regulatory regime, companies are able to produce new products, new versions of products, or move in entirely new directions. In a market environment, companies can compete for consumers by making changes within their organizational structure, including adjustments in production, marketing, and finance.

Lastly, markets stimulate innovation in that they provide rewards for new ideas and products. The potential for rewards, and necessities of gaining competitive advantages, drive companies to innovate. Rewards include market share, but also increased profits. They point out that without competition, firms avoid risk, an essential component of innovation as many experiments fail.

Croteau and Hoynes also point out serious concerns about markets that economists do not generally address.

The tendency of markets to reproduce inequality is one important drawback. While some inequality produces contrast and incentives to work hard or to be entrepreneurial, a society with a major divide between haves and have-nots will tend towards dystopia, a “sick” place. Thomas Piketty’s Capital in the Twenty-First Century addresses this issue head-on and warns that capital gravitates towards more inequality, and that the trickle-down effects tend to lead to a slower and slower drip. Neo-elites benefiting from the rolling back of the estate tax have advantages that others don’t have. Croteau and Hoynes use the vote metaphor “one dollar, one vote” to refer to the advantages the rich have over the poor, as they have many more dollars, and thus many more votes.

The second concern they have about markets is that they are amoral. Not necessarily immoral, but rather that the market system only registers purchases and prices and doesn’t make moral distinctions between, for example, human trafficking, drug trafficking, and oil trafficking. The commerce in a drug to cure malaria does not register differently from a drug that provides a temporary physical stimulation. Markets do not judge products; they only register changes in demand. They do not favor child care, healthy foods, or fuel-efficient cars unless customers make their claims in currency.

Can markets meet social needs? This has been a pressing question for the last thirty years, as privatization was often advanced by market enthusiasts as an effective strategy to replace government services – for some of the reasons listed above. But a number of services, and sometimes goods, should probably be provided by some level of government – defense, education, family care and planning, fire protection, food safety, law enforcement, traffic management, roads, and parks.

Can markets meet democratic needs? Aldous Huxley warned of a society with too many distractions, too much trivia, steeped in drugged numbness and pleasures. Because markets are amoral, they can become saturated with economic goods that service vices rather than public spirit. Competition, in this case, may result in a race to the lowest common denominator rather than higher social ideals. Rather than political dialogue that would enhance democratic participation, the competition among media businesses tends to drive content towards sensationalist entertainment.

Comedian Robin Williams once quipped, “Cocaine is God’s way of telling you that you are making too much money.” Markets are a powerful system of material production and creative engagement, but they create inequalities, often with products and services that are of dubious social value. How a society enhances and/or tempers market forces continues to be a major challenge for countries around the world.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is the Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

The Telecom Crash of 2002

Posted on | September 27, 2015 | No Comments

One of economic history’s most fascinating questions concerns the stock market advances during the eight Clinton-Gore years, especially those in the Internet and telecommunications sectors. During their tenure in the 1990s, the stock markets boomed, and the investments created the lowest unemployment in years. They also contributed significantly to the administration’s goal of budget surpluses. Even Alan Greenspan’s speech on December 5, 1996, in which he questioned the “irrational exuberance” of stock purchases, came early in the historic “bull run” that would end with the Telecom Crash of 2002.

Billions of electronic dollars moved through newly computerized stock markets into established and new technology companies as the economy roared forward rapidly. ECNs (electronic communication networks) such as Island were approved by the SEC to match buyers and sellers and create a much-needed trading facility in the hectic new environment. While most people participated primarily through institutions providing mutual funds, 401Ks, and pension funds, day trading emerged through the Internet to provide a more direct interface to the financial markets. The American investment economy had not seen such universal participation since the 1920s when radio companies such as RCA set the pace of speculative investing.

The NASDAQ went from 675 in January 1993 to 2,796 at the end of January 2001 (a 314% increase), although it hit a high of 5,132 in March 2000. The DJIA went from 3,370 to 10,887 (a 223% increase) during that period, with a high of 11,497 in December 1999. The Dow dropped from 8,883 to 7,539 in August 1998 after the Asian and Russian financial crises but recovered quickly.[1]
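Figures like these are worth sanity-checking, since a ratio is easy to mislabel as an increase: rising from 675 to 2,796 brings an index to roughly 414% of its starting value, but that is a 314% increase, and 3,370 to 10,887 is a 223% increase. A minimal sketch using the figures from the text:

```python
def pct_increase(start, end):
    """Percentage increase from start to end (not the end/start ratio)."""
    return (end - start) / start * 100

print(round(pct_increase(675, 2796)))    # 314  (NASDAQ, Jan 1993 -> Jan 2001)
print(round(pct_increase(3370, 10887)))  # 223  (DJIA over the same period)
```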

Any “information superhighway” would need a legislative framework to build it. A significant element of this new regime was the rewriting of the basic telecommunications law. Central to the administration’s economic success was changing the Communications Act of 1934. Written and passed as part of the New Deal, it aimed to curb the abuses of monopoly while ensuring the development and availability of efficient and nationwide broadcasting, telegraph, and telephone services. The Act authorized separate monopolies in each of the communications sectors: radio broadcasting, telegraphy, and telephony (a list that would grow to include cable TV, satellites, and wireless), subject to FCC rulings on appropriate prices and services. A new framework was needed to allow digital convergence among these separated sectors.[2]

The Clinton Administration and its appointed FCC Chairman, Reed Hundt, worked with the Republican-controlled Congress to pass the Telecommunications Act of 1996. It was an awkward piece of legislation, but one that allowed the FCC to create a new environment for the Internet and e-commerce to grow. For instance, it required telcos to allow other companies to lease their networks for interconnection purposes at regulated rates. This led to a large number of ISPs offering low-cost Internet access. The legislation had a lot of political pork; in particular, the broadcast networks were bought off by giving them free spectrum for future offerings of digital television.[3]

Part of the Act broke open the segmentation that had separated the different sectors. After the Act, companies could invade each other’s turf. It removed barriers to entry to the once-protected monopoly-controlled sectors. For example, broadcasters could move into broadband, and carriers could offer content. It also allowed the consolidation of different media companies, creating a new frenzy of mergers. AT&T, for example, made major multi-billion-dollar purchases of MediaOne and TCI to become a cable behemoth offering broadband delivery via cable modems along with television services. AOL, by far the largest ISP, merged with Time Warner in an attempt to cross-pollinate between the “old economy” and the “new economy.”

In the wake of the Telecommunications Act of 1996, investors poured some $2 trillion into the telecommunications industry.[4] Enron, Global Crossing, Tyco, Winstar, and Worldcom were a few of the companies that attempted to position themselves at the forefront of the telecommunications revolution. Enron, primarily an energy company, bought and built a 15,000-mile fiber optic network throughout the US and set up a new division, Enron Broadband Services, to manage it. Enron also attempted to create a spot market for bandwidth capacity, serving its own network as well as other carriers, ISPs, and major users with leased lines. Global Crossing, led by Michael Milken protégé Gary Winnick, built an extensive fiber optic network, including a substantial international portion. It also took over Frontier, a Rochester, NY-based telephone company. A major conglomerate, Tyco, also made major investments in international optical IP networks, including a major hub on Oahu, Hawaii.[5] Worldcom absorbed telecom competition pioneer MCI and became the largest Internet backbone service in the world through its purchase of Internet provider UUNET. The investments in the telecommunications sector dramatically increased the speed and scope of its digital transmission capacity, transforming it into the major conduit for worldwide Internet traffic.

The newly deregulated environment heated up the battle for broadband access to the home. The FCC’s longstanding distinction between basic and enhanced services allowed a new breed of ISPs to connect home computer users through flat-rate local calls for unlimited time periods. But the Bell companies retaliated with services including ISDN and then DSL and would eventually work the system to gain control over most local ISPs.

Competition from cable television and the satellite industry was looming. New modems and switching capabilities allowed the cable companies to use their high-bandwidth coaxial and fiber optic facilities to provide high-speed access to the Internet. Satellite television services developed new compression techniques that allowed many more channels to be broadcast and, with encryption, enabled individual subscriber services. But satellites still suffered the signal delay that made it difficult to provide interactive services from a spacecraft located some 36,000 kilometers above the earth. Although all three were moving towards IP networking, cable television had the advantage of moving into broadband Internet service with its legacy infrastructure, which would soon include voice-over-IP (VoIP) telephone service.
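The signal delay is simple physics: radio waves travel at the speed of light, and a geostationary satellite orbits about 35,786 km above the equator. A back-of-the-envelope sketch, ignoring ground-station geometry and processing delays (both of which add more):

```python
C = 299_792.458          # speed of light, km/s
GEO_ALTITUDE_KM = 35_786 # geostationary orbit altitude above the equator

# One hop: ground -> satellite -> ground (signal goes up, then down)
one_way = 2 * GEO_ALTITUDE_KM / C   # ~0.24 s
round_trip = 2 * one_way            # request plus response, ~0.48 s
print(f"{round_trip * 1000:.0f} ms")  # ~477 ms
```

A near-half-second round trip is tolerable for one-way broadcasting but painful for interactive services, which is the disadvantage noted above.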

After Al Gore had wrapped up the Democratic nomination on Super Tuesday, March 7, 2000, the NASDAQ reached a high of over 5,000. But in April prices began to fall, although the Dow, the top 30 oligopolies, continued to rise throughout the summer. November’s election was extremely close, with Gore winning the nation’s popular vote but Bush narrowly winning the Presidency in the Electoral College, taking Florida’s 25 electoral votes by a mere 537 popular votes after a controversial Supreme Court decision.[6] From its high of over 5,000 in March 2000, the NASDAQ declined more than 3,000 points by the time the Bush administration settled into their new offices less than a year later.

With the new presidency, tech stocks fell to earth. The “new economy” was over, and the “old economy” had retaken Washington DC. President George Bush and most of his immediate staff (Cheney, Evans, Rice) were veterans of the oil business. The new administration represented a new set of economic values and geopolitical priorities. The NASDAQ continued to fall while the Dow-Jones Index of 30 established corporations climbed even higher, surpassing the 11,000 mark again in February 2001.

The telecommunications industry went into steep decline. The first revelation came with the meltdown of Enron, a company that was consistently listed in the top 10 of the Fortune 500. Heavily burdened by its junk bond debt, Enron desperately sought new ways to raise cash, including putting together an 18,000-mile fiber optic network.[7] One of its schemes embraced the Internet in an interesting but futile attempt to create a trading environment for bandwidth as it had for natural gas spot contracts. But it was too little, too late. Enron Online made its first trade in December 1999 as the markets were at their peak, and Enron was trading near $90. It would soon wind up as the largest bankruptcy in history and destroy accounting giant Arthur Andersen in the process. But not before FORTUNE would tout Enron as “The Most Innovative Company in America” for the fifth year in a row.[8]

As the Worldcom case would show, many companies started to engage in illegal accounting techniques after the markets faltered. In late June 2002, CNBC reported that Worldcom had been discovered to have accounting irregularities dating back to early 2001. Nearly US$4 billion had been illegally documented as capital expenditures. Worldcom had registered $14.73 billion in 2001 “line cost” expenses instead of the $17.79 billion it should have reported. The result was that it reported US$2.393 billion in 2001 profits instead of showing what should have been a $662 million loss. Shares dropped quickly. Although the stock had already fallen from its 1999 high of $64 a share to just over $2, it soon dropped to mere pennies before trading was halted.[9] Other companies such as Qwest and Tyco further eroded what remained of the general public’s confidence in the stock market, and particularly in its telecommunications companies.
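The restatement arithmetic hangs together: Worldcom shifted roughly $3 billion of line costs into capital expenditures, and that gap is almost exactly the swing from the reported profit to the actual loss. A quick check using the figures cited above (in billions of US dollars):

```python
# Figures from the article, in billions of US dollars:
actual_line_costs   = 17.79  # what should have been expensed in 2001
reported_line_costs = 14.73  # what was expensed (the rest was capitalized)
reported_profit     = 2.393  # 2001 profit as originally reported

understated = actual_line_costs - reported_line_costs  # expenses hidden as capex
true_result = reported_profit - understated            # negative => a loss

print(round(understated, 2))   # 3.06
print(round(true_result, 3))   # -0.667, close to the $662 million loss cited
```

The small gap between -0.667 and the $662 million figure is rounding in the source numbers.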

The stock markets continued to decline as new corporate scandals were revealed. The decline finally reached the DJIA in mid-2002. The “Dow,” representing mainly the stalwarts of the old economy, had maintained its strength during the new administration’s early years, dipping below but returning to highs over 10,000 intermittently until the summer of 2002, when the corporate scandals were exposed. Bush’s SEC chief, Harvey Pitt, failed to gain the confidence of investors and eventually resigned. The Wilshire Total Market Index fell from $17.25 trillion on March 24, 2000 to $10.03 trillion on July 18, 2002. By August 2002, over $7 trillion of stock value had dissipated. Particularly hard hit were the tech sectors.

Telecommunications services, which had accounted for 7.54% of the Wilshire Total Market Index at the end of March 2000, saw its share fall to only 3.63%. Information technology fell from 36.2% to 15.01%, and even Microsoft saw its market capitalization fall from $581.3 billion to $276.8 billion. Finally, Congress passed the Sarbanes-Oxley Act in August 2002, requiring corporate CEOs to sign off on their companies’ books. The fall abated, but at a cost of trillions of investor dollars from IRAs, mutual funds, individual investments, and pension funds.[10]

Notes

[1] Stock figures for both the NASDAQ and DJIA are from Yahoo! Finance, the “poor man’s Bloomberg”. They indicate monthly figures and may represent the daily or weekly highs. Greenspan “irrational exuberance” information from Daniel Gross’ (2000) Bull Run: Wall Street, the Democrats, and the New Politics of Personal Finance. New York: Public Affairs. p. 19.
[2] Information on Communications Act of 1934. Martin, J. (1976) Telecommunications and the Computer. (2nd Edition) NJ: Prentice Hall. p. 31.
[3] Republican Bob Dole was a particularly vocal critic of the spectrum giveaway.
[4] BUSINESS WEEK, (2002) “Inside the Telecom Game”. Cover Story, August 5, 2002. pp. 34-40.
[5] An article by John Duchemin on the Honolulu Advertiser’s website chronicled the travails of the Tyco telecommunications hub in Wai’anae. The 38,000 square foot center went unused when the telecom market collapsed. August 13, 2002.
[6] The USA operates its Presidential elections with an electoral college system. People don’t elect the President directly; when a candidate wins a state, a set number of electoral votes go to that candidate. The system keeps any one section of the country from dominating the election.
[7] We see the mark of Michael Milken, who was a consultant to Lay and helped him fend off Irwin Jacobs and other potential corporate raiders, but at the price of incurring huge debt. See Peter C. Fusaro and Ross M. Miller’s (2002) What Went Wrong at Enron. NJ: John Wiley & Sons.
[8] Fusaro, P.C. and Miller, R.M. (2002) What Went Wrong at Enron. NJ: John Wiley & Sons. p. 172.
[9] Worldcom figures from THE NEW YORK TIMES, Friday, June 28, 2002. pp. C1-C6. Articles by Michael Wilson and Leslie Wayne.
[10] Feaster, S.W. (2002) “The Incredible Shrinking Stock Market: More Than $7 Trillion Gone,” NEW YORK TIMES. July 21, 2002. p. 14 WK.

Citation APA (7th Edition)

Pennings, A.J. (2015, Sep 27). The Telecom Crash of 2002. apennings.com https://apennings.com/telecom-policy/the-telecom-crash-of-2002/




Anthony J. Pennings, PhD is a professor at the State University of New York in South Korea. Previously, he taught at New York University from 2002-2012. He started at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii. When not in Korea he lives in Austin, Texas, where he has also taught in the Digital Media MBA at St. Edwards University.

Hootsuite and the Social Media Curriculum

Posted on | September 21, 2015 | No Comments

I’ve been training people in Hootsuite for the last three years. The desktop and mobile dashboard system was designed to integrate and manage social media applications like Facebook, Google+, LinkedIn and Twitter. Some others you might recognize are Foursquare, MySpace, WordPress, TrendSpottr, and Mixi. While it has some competition, it is still the best social media management tool available.

Hootsuite is promoted as a brand management or communications management tool that is handy for day-to-day social networking management and/or campaign management. It can be used at a personal level, for social media consultants working for several clients, and for media management at large-scale enterprises.

I was originally exposed to Hootsuite at my first SXSW in Austin, TX in the spring of 2012. I was invited by my colleague David Altounian, who taught it in his classes at St. Edward’s University and was on a pre-conference panel put on by Hootsuite University about social media and education. The image below features Hootsuite moderator Kirsten Bailey, Dr. William Ward from the Newhouse School of Public Communications at Syracuse, Lea Lashley of Southern Methodist University, and David Altounian from St. Edward’s University.

Hootsuite University at SXSW

They discussed the value of social media education, problems in getting academia to recognize or respond to the social media environment, teaching skills vs. principles, and whether social media changes the marketing discipline.

A few years ago, I posted my thoughts on important curriculum issues for social media education. That curriculum became an NYU course that I have taught online, and I’m currently teaching Hootsuite in a Political Communication class at Hannam University in the Republic of Korea. These are the general areas that I proposed should be considered in a program on social media:

– New developments in social media technologies and techniques;
– Key communication and economic attributes that power this medium, including important metrics;
– How social media can be used as part of an organization’s communications strategy;
– Key skill sets and knowledge students can acquire for entrepreneurial innovation and employment in this area;
– Legal, privacy, and other unfolding social concerns that accompany this dynamic new medium;
– Issues of social change, citizen engagement, and democratic prospects;
– Research implications of social media and the theorization and methodological skills needed to conceptualize research projects.

I guess I would add the importance of learning a social media management tool like Hootsuite, Spredfast, or Sprout Social to that list, even though it is suggested in the first topic.






Fed Watcher’s Handbook is on Amazon

Posted on | August 24, 2015 | No Comments

The world is abuzz with talk about the US Federal Reserve and whether it will raise interest rates in September. Interest rates have hovered near 0 percent since 2008, and a rate hike to 0.25% or 0.5% is possible. Analysts are asking basic questions: Is the economy strong enough? What do we make of economic problems in China? Is the housing market heating up? Do declining oil prices mean a global slowdown? Are employment rates sufficient? Have we seen the end of the Greek crisis? How would an interest rate hike affect the volatile financial and stock markets?

My book, just out on Amazon, doesn’t predict what the Fed will do, but provides background on the Fed and how it works. It also introduces a process that classes and organizations can use to make their own analysis of the situation.

The Fed Watcher’s Handbook: Simulating the Federal Reserve in Classrooms and Organizations is available on Amazon either as a paperback or as an electronic Kindle version.

The book introduces the Federal Reserve System, the Federal Open Market Committee (FOMC), and a simulation I developed at New York University to make economics and monetary policy more relevant and easier to understand. I later used it at the MBA level, and I’m currently exploring its use as a team-building exercise for organizations.

In the simulation, each participant becomes a member of the US central bank, either a district president or someone on the Board of Governors. The “Fed” is made up of 12 districts, such as Chicago, Boston, Atlanta, New York, and Dallas. Participants study the bios of the Presidents or Governors and then report on their districts.

Then we meet a day or two before the actual Federal Reserve meets and simulate their discussions and voting process to determine interest rates. We even collectively write the press release after the group decides. These activities allow participants to compare the process and results of the simulation with what occurred at the actual FOMC meeting.
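The final vote of such a simulation can be tallied in a few lines. The names and votes below are hypothetical, just to show the shape of the exercise:

```python
from collections import Counter

# Hypothetical ballot from a classroom FOMC simulation:
votes = {
    "New York": "hold", "Chicago": "hold", "Boston": "raise",
    "Atlanta": "hold", "Dallas": "raise",
    "Governor A": "hold", "Governor B": "hold",
}

# Count each policy option and report the winner
tally = Counter(votes.values())
decision, count = tally.most_common(1)[0]
print(f"Decision: {decision} ({count} of {len(votes)} votes)")
```

The printed decision then becomes the basis for the group’s mock press release.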




