Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

How the US Mobile Industry Came Together, Part I: Transformation of the Network

Posted on | June 27, 2019 | No Comments

The ubiquity, ease, and sophistication of mobile “smartphone” services have made them an extraordinarily popular addition to modern social and productive life. We are now on the cusp of the 5th generation (5G) rollout of wireless services. It is difficult to gauge the characteristics of social change in the midst of such dramatic technological change, but mobile services have arguably extended our sense of self, allowed us to maintain caring (and some annoying) relationships, and provided new ways to create and to learn.

This post addresses the transformation of the telecommunications industry and its expansion into mobile services. The transformation was initiated by a series of actions that led to the breakup of the AT&T monopoly and opened the environment for the major business, regulatory, and technological disruptions that led to the proliferation of smartphones and other portable devices. The next post will examine the emergence of the wireless industry, which grew rather independently of the telcos, until the 1990s anyway. I first delved into my notes for the explanation below of the “breakup” of AT&T and how it opened up the telecommunications industry to new innovations, including wireless.

In 1980, IBM officially went into competition with AT&T when it offered its Satellite Business Systems (SBS) service, a partnership with Aetna Life & Casualty and Comsat. SBS was set up to provide “intranet” telecommunications services for banks and other businesses after the FCC deregulated satellite services in the early 1970s. Using new Ku-band technology, SBS services such as data, facsimile, electronic mail, telephony, and video conferencing could be transmitted directly to smaller antennas at the user’s premises. This meant that they could effectively bypass AT&T’s network and provide a type of direct broadcasting service, except with two-way capabilities. As Jill Hills put it: “It seems to have been the entry of IBM via its SBS consortium into direct competition that altered AT&T’s attitude from formal toleration of competition to outright aggression in defense of its monopoly.”[1] The Consent Decree of 1956 had restricted AT&T from computer activities, and now it saw IBM entering a potentially very lucrative market. With some US$1.26 billion invested in a competing satellite communications system that could circumvent Ma Bell’s existing facilities, SBS got AT&T’s full attention.[2]

Consequently, in the early 1980s, AT&T looked to the political realm for protection. “Action switched from the FCC to Congress as AT&T under its chairman, John deButts, set about raising political consciousness of the FCC decisions.”[3] However, members of Congress did not rush to support AT&T, a potent contributor, as a number of criticisms were being raised. These primarily concerned:

    1) competing equipment suppliers who wanted to break into the telecommunications and customer premise equipment (CPE) markets that were dominated by AT&T’s Western Electric;
    2) business users, especially banks, who wanted more sophisticated services, including those offered by SBS; and
    3) competing carriers who found it difficult to compete against AT&T’s dominance of long distance transmission facilities and “local loop” telephony services. In fact, the government was looking to create a more competitive, privatized data-networking environment free from the dominance of AT&T. [4]

Deregulatory actions during the 1960s and 1970s cut into AT&T’s legislatively mandated monopoly on all telecommunications, but it was still by far the major service and equipment provider. The Justice Department had filed an antitrust suit against Ma Bell in 1974, challenging its competitive practices and seeking to divest Western Electric, its manufacturing arm, in addition to its Bell Operating Companies. Then, in 1980, the FCC issued its Computer II decision, which was specifically focused on the future role of AT&T. It deregulated all data processing services and customer premises equipment, allowing AT&T to enter other markets, although through a separate subsidiary. Despite some restrictions, this officially allowed AT&T to compete outside its traditional purview for the first time.[5]

Despite his ideological convictions, Ronald Reagan was noncommittal about the breakup, mainly because his Secretary of Commerce, Malcolm Baldrige, presented the threat of foreign equipment manufacturers entering the US market, and also because his Secretary of Defense, Caspar Weinberger, argued forcefully “that an integrated AT&T was desirable for national security.” However, his Justice Department was ideologically motivated enough to carry on with the divestiture, particularly William Baxter, the Assistant Attorney General for Antitrust.[6] Baxter argued in 1981 that the radical restructuring of the world’s largest company would forward the new administration’s promise of reducing regulation and promoting competition. In the face of opposition from the Departments of Commerce and Defense, the DOJ continued with its antitrust case.

The result was a three-pronged approach. The Department of Justice continued its investigation of AT&T while the FCC sought, through Computer II, to distinguish between regulated basic service (both long distance and local loop services) and unregulated enhanced services such as data communications. The Republican-controlled Senate, led on this issue by Senator Bob Packwood, chair of the Senate Commerce Committee, proceeded with a legislative solution along the basic/enhanced distinctions desired by the FCC but with awkward accounting and regulatory requirements. Although passed by the Senate and supported by AT&T, the Packwood bill failed to make it through the House and never became law. It was the Justice Department that won out, agreeing to a Consent Decree with AT&T in January of 1982 that settled the antitrust suit filed in 1974. AT&T, facing the DOJ’s persuasive case, decided to settle. The result was a political decision bypassing the FCC and other traditional forms of telecommunications policy. Ultimately called the Modified Final Judgment (MFJ), it split AT&T into a competitive company, consisting of a long distance carrier and a manufacturing arm, and a set of regulated Bell Operating Companies (BOCs) that would provide local exchange services.

The MFJ created 22 Bell Operating Companies, 7 Regional Bell Operating Companies (RBOCs or “Baby Bells”) and AT&T.

While the MFJ did not provide the total datacom solution wanted in the corporate world, it initiated the opening up of what had previously been a significantly closed telecommunications system. The next year, in a coffee shop in Hattiesburg, Mississippi, Bernard J. Ebbers, Bill Fields, David Singleton, and Murray Waldron worked out details for starting a long distance company, LDDS (Long Distance Discount Service), the precursor to WorldCom. In 1984, President Reagan approved international competition for the INTELSAT international satellite system after a petition from Orion to provide transatlantic telecommunications services for corporate users. But while significant liberalization was taking place in the United States, countries around the world still held on to government-owned and controlled telecommunications systems. In order for the new internationalization of financial services and digital commerce to be enacted, telecommunications changes needed to occur worldwide.

As a result of the Department of Justice’s 1982 Consent Decree, AT&T divested its Regional Bell Operating Companies (RBOCs), which retained control over the basic telecommunications exchanges and local lines in their respective territories. A Consent Decree means that all parties agree to the remedy. The RBOCs, or “Baby Bells” as they were sometimes called, were restricted from entering the long-distance interstate communications markets and also from manufacturing telephone equipment. The parent company, AT&T, retained its continental Long Lines Division, its research center Bell Telephone Laboratories (a new consortium, Bellcore, was created to do research for the RBOCs), and Western Electric, its manufacturing arm. Two new subsidiaries were created to allow the company to expand into previously restricted areas. AT&T Information Systems was an unregulated subsidiary offering what the Second Computer Inquiry termed “enhanced” services. These included information services and customer premises equipment, including computers.

AT&T had been involved with the creation of the mobile phone through its Bell Labs division but lost the initiative to Motorola in the 1970s. It later effectively ceded the mobile business to the RBOCs after an AT&T report estimated the market for mobile phones in the year 2000 would be only 900,000 subscribers. In January of 1982, AT&T indicated that cellular would be given to the local companies, the RBOCs, after the FCC divided the 40 MHz it allocated to cellular into two segments. One segment would go to the local telephone company in each market, and the other would be open to non-telephone companies that wanted it.

As a result of this bifurcated market, the mobile telephone industry emerged in this “other” market, thanks largely to the controversial financing methods of Michael Milken and the now-defunct investment firm Drexel Burnham Lambert. Milken helped the McCaw family assemble the wireless industry during the 1980s, raising money for the formation of McCaw Cellular, which the family sold to AT&T in 1994.

Notes

[1] Hills, J. (1986) Deregulating Telecoms: Competition and Control in the United States, Japan and Britain. p. 66. Quote on IBM’s threat to AT&T.
[2] Hills, J. (1986) p. 63. Investment figures on SBS.
[3] Hills, J. (1986) p. 66. Raising concerns about FCC to members of Congress.
[4] Schiller, D. (1982) Telematics and Government. p. 92. Stance of the US government towards AT&T.
[5] For example, the subsidiary could not use Bell Labs software. Hills, J. (1986) Deregulating Telecoms. p. 66.
[6] Brock, G. (1981) The Telecommunications Industry: The Dynamics of Market Structure. Harvard University Press. pp. 156-157.
[7] Schiller, D. (1982), p. 163.


Metrics, Analytics, and Visualization

Posted on | May 21, 2019 | No Comments

Understanding metrics is a significant component in evaluating global media, social media, and culture industries. Measurements of magazine and newspaper circulation, book readership, as well as television and radio audiences, have each had a distinguished history in the media field. The Nielsen ratings, for example, have been a crucial part of the television industry and its ability to attract advertisers. The introduction of digital technologies has made them even more important, and Nielsen has expanded its monitoring activities beyond TV to PCs, mobile devices, and perhaps the automobile dashboard if the “driverless car” takes off.[1]

Search engines, social media platforms, and a myriad of new media applications on the web and mobile devices have increased the production of useful data, and new systems of analysis have augmented their utility. Analytics is directly involved in monitoring ongoing content delivery, registering user choices, and communicating with fans and other users. It also connects incoming flows of information to key organizational concerns such as financial sustainability, legal risks, and brand management in digital media activities. This type of information is increasingly important for the management of private and public organizations.

Organizations are beginning to acknowledge the importance of social media metrics across the range of corporate and non-profit objectives, especially those that involve legal, human resources, and advertising and marketing activities. These new metrics can be roughly categorized into three areas. At a basic level, granular metrics quantify activities like the number of retweets, check-ins, likes, and subscribers. Strategically, metrics can be useful for designing new products and services, facilitating support and advocacy for an organization, fostering dialogues about products and services, and monitoring marketing/social media campaigns. Lastly, metrics are of particular concern to upper management, as they can provide information on the sustainability of an organization. A simple example of turning granular counts into a strategic measure is sketched below.
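
For instance, an engagement rate combines several granular counts into one comparable figure that can be computed in a spreadsheet. This is a minimal sketch; the cell layout (likes in B2, shares in C2, comments in D2, follower count in E2) is a hypothetical illustration, not a standard:

    =(B2+C2+D2)/E2                   total interactions divided by followers
    =TEXT((B2+C2+D2)/E2,"0.0%")      the same figure formatted as a percentage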

Those in the “C-suites” (CIOs, CFOs, COOs, and CEOs) can use information on an organization’s financial status, technical performance, and legal risks to assist management decision-making. Metrics draw connections from social media investments to key concerns such as product development, service innovation, policy changes, market share, and stock market value. Recognizing the increasing utility of metrics, management has increasingly turned to digital dashboards as a way to collect and display data in a visually comprehensible way.

The increased attention to metrics suggests that an era of “big data” analytics has emerged in the digital media sphere. The collection of unstructured information from around the web (as opposed to pre-defined, structured data from traditional databases) presents unprecedented opportunities to conceptualize and capture empirical information from networked environments for various parts of an organization. Techniques such as pattern and phrase matching use new types of algorithms to capture and organize information from throughout the Internet, including the rapidly growing “Internet of Things” (IoT). The result is a transformation in the way commerce, politics, and news are organized and managed.

Combined with artificial intelligence (AI) and natural language processing (NLP), for instance, cognitive systems like IBM’s Watson are disrupting industries like healthcare and finance. Watson made a spectacular debut by beating two human contestants on the TV game show Jeopardy!, a challenging combination of cultural, historical, and scientific questions. While Watson has struggled to catch on, AI is emerging in autonomous vehicles, voice recognition, game systems, and educational technology. Apple’s Siri and Samsung’s Bixby are central to modern uses of smartphones. Amazon’s Alexa is becoming popular on kitchen counters around the US, shared by all members of the family. AI is likely to be a major influence on a wide range of cultural and experience industries.

Project managers and media producers can use these metrics to see connections between content/cultural products, audience participation, customer satisfaction, and market share. C-suite executives utilize the information on financial status, technical performance, and legal risks. Besides assisting management decision-making, analytics can provide useful performance information and improve the development of new content products, cultural expressions, and experience-based services while targeting them to individual customers. While new developments like AI-assisted genre innovation and other machine incursions into the creative process are a justifiable cause for concern, cognitive-logistical support for cultural event planners, film surveyors, and other creative content producers could be a welcome provision in the cultural/media industries.

This move to “big data” science requires the ability to conceptualize and enact strategies for data collection, analysis, and visualization. Employees and management should develop an appreciation for research and statistical thinking, as well as visual literacies and competencies for representing information and designing graphics that display data results in interesting and aesthetically appealing formats. The ability to identify and capture data from spreadsheets, the web, and other online sources is important, as is the ability to formulate pertinent questions and hypotheses that shape data research and models that can help explain or predict human or system behaviors.

Not everyone will be comfortable with these new types of data collection and intelligence-producing systems, but like them or not, AI and big data are encroaching rapidly on modern life.

Notes

[1] This is from a section in Pennings, A. (2017, October). Emerging Areas of Creative Expertise in the Global Digital Economy. [Electronic version] GDM Quarterly 1 (2), 1-11.


Lasswell and Hall – Power and Meaning in Media

Posted on | March 6, 2019 | No Comments

Harold Dwight Lasswell was one of the founding influences on the formation of the study of communication, media studies, and sociology. Stuart Hall was a Rhodes Scholar from Jamaica who helped pioneer an area of research at the University of Birmingham that came to be called “British Cultural Studies.” Both contributed significantly to media studies during their tenures, and some of their contributions can be discussed by examining the phrase, first stated by Lasswell:

“Who says what, in which channel, to whom, with what effect?”

Lasswell Model of Communication

Published in his 1948 essay, “The Structure and Function of Communication in Society,” the Lasswell model was easy to understand and works for various communication and media activities.[1] Noted for its emphasis on effects (how a message influences an audience member), the model was picked up by psychology-oriented scholars to examine the media’s influence on human behaviors, including consumption, violence, and voting. It was coined just after the Second World War, when radio was still dominant and social scientists were interested in the dramatic impacts of propaganda and advertising. Radio had been important in mobilizing Allied and Axis populations during the 1930s.

The Lasswell model was often criticized for stressing linear, one-way information flows. Feedback and message-disturbing “noise” were not stressed, as they would be in new areas of study called cybernetics (Wiener, Forrester) and information theory (Shannon and Weaver). Hall offered another criticism: the Lasswell model minimized the role of power in the communication process.

Hall reworked Lasswell’s formula: “Who has the power, in what channels, to circulate which meanings, to whom (with what effect)?” He wanted us to examine the meanings of images in such a way as to show how different interests work to hold a preferred interpretation of an image in place. What organizations (news media, industry boards, advertising, government, etc.) and areas of specialization (journalism, medicine, finance, etc.) have the power to enforce and police such meanings?

Drawing on the academic area of semiotics, Hall emphasized that because the meanings in images are fundamentally flexible and fluid, “power” works to arrest or fix the meanings associated with the image. Brand management, political communication, and public relations are primarily about establishing a set understanding of media images and continually policing their interpretations.

Here is part of a lecture by Stuart Hall (see the entire video) where he discusses how culture gives meaning to things. He argues that the human tendency to jointly build maps of intelligibility and create classification systems works to create “commune”-ication. What he calls “signifying practices” in media production, such as image composition, narration, and editing, work to make meaningful narratives and stories. At the end of this clip, he challenges the Lasswell model. (See the Media Education Foundation (MEF) for the video’s transcript.)[2]

Hall was especially interested in the relationship between language, power, and social structures. One way power operates to secure preferred meanings is through systems of classification. Classification often involves the establishment of standards and norms. Standardized classification systems ensure consistency and comparability, promoting uniformity in practices, measurements, or descriptions. One of the most powerful returns from generative AI is sets of categories or classifications.

Organizing the world into categories helps maintain order by discerning distinct differences in things and making sure they stay in certain boxes. Hall was interested in how language and media contribute to classifying and categorizing individuals and groups, reinforcing or challenging power dynamics. It is important to realize that culture creates and maintains these categories through differentiation and control processes. These include anecdotes, jokes, memes, metaphors, stories, etc.

For Hall, classification is generative. Once a system of classification is created, it provides a framework that simplifies the understanding and management of diverse elements by grouping them together. Items fall into or out of place based on shared or dissimilar characteristics.

Classification serves to maintain the order of the overall structure. In the US, the category of the presidency has recently been challenged by a series of elected candidates who upset traditional notions of who is eligible. George W. Bush was the son of a previous president, raising issues of nepotism. Barack Obama, being black, challenged many Americans’ sense of who was racially eligible to be in the “Oval Office.” More recently, Donald Trump, who had never held elected office or run a large organization, was considered by many to be unfit and unqualified to hold the office. These category fits are important to people and a major source of social strife in modern society.

Hall’s primary interest was cultural studies. He helped shift the study of culture from high culture to popular culture. “Culture, he argued, does not consist of what the educated élites happen to fancy, such as classical music or the fine arts.” The value in such studies is that they can tell us about other parts of society that have been marginalized. They can tell us things about how race, gender, and economic classes are rendered in modern society.[3]

His work on Race: The Floating Signifier is a classic on the fluidity of meanings associated with representations of race. Here he expands his work on classification, drawing on the area of semiotics. Semiotics is based on the study of signs, divided into the signifier and the signified: the form and its meaning. He used this approach to examine how culture influences the way we see race as part of a system of classification that is used to order society and the types of people within it.

For Hall, race has physical characteristics, primarily hair and skin color. But a range of meanings and values are associated with races. He uses a concept from Mary Douglas, “matter out of place,” to describe the implications of those classifications. In her Purity and Danger: An Analysis of Concepts of Pollution and Taboo (1966), she argued that we constantly construct symbolic orders that evaluate and rank items and events. Hall recounts her example of dirt in the bedroom and dirt in the garden. One is “dirty,” the other is natural. One is problematic and needs to be addressed at once; the other is invisible. A related example is the “back of the bus” that was allocated to black people during the height of segregation in the southern states of America.

For Hall, this is part of the Enlightenment’s project to bring all experience into observation and understanding. This panoptic view echoes Michel Foucault’s admonition in Power/Knowledge (1980) that it is not so much that information is power, but rather that it is power that shapes information.[4][5]

Stuart Hall died on February 10, 2014. He was a major contributor to the creation of cultural studies, specifically the Birmingham School of Cultural Studies. He engaged the study of signs – semiotics – and warned that power secures preferred meanings. This keeps the understanding of images from sliding into other interpretations and thus empowering alternative groups or individuals who would benefit from another set of meanings.[6]

Notes

[1] Lasswell, H. D. (1948). “The Structure and Function of Communication in Society.” In L. Bryson (Ed.), The Communication of Ideas. (pp. 37-51). New York: Harper and Row.
[2] The transcript and the video Representation & the Media, produced and directed by Dr. Sut Jhally; edited by Sanjay Talreja, Sut Jhally, and Mary Patierno; featuring a lecture by Stuart Hall, Professor, The Open University, with an introduction by Sut Jhally of the University of Massachusetts at Amherst. Distributed by the Media Education Foundation (MEF), which produces and “distributes documentary films and other educational resources to inspire critical thinking about the social, political, and cultural impact of American mass media.”
[3] Hsu, Hua. “Stuart Hall and the Rise of Cultural Studies.” The New Yorker, The New Yorker, 17 July 2017, www.newyorker.com/books/page-turner/stuart-hall-and-the-rise-of-cultural-studies.
[4] Foucault, M. (1980) Power/Knowledge: Selected Interviews and Other Writings, 1972-1977. Trans. Colin Gordon et al. New York: Pantheon.
[5] Mason, Moya K. Foucault and His Panopticon. 2019. Accessed March 6, 2019.
[6] Chandler, D. (2017) Semiotics for Beginners. Routledge.

Citation APA (7th Edition)

Pennings, A.J. (2019, Mar 6). Lasswell and Hall – Power and Meaning in Media. apennings.com https://apennings.com/media-strategies/lasswell-and-hall-power-and-meaning-in-media/


Spreadsheets and the Rhetoric of Ratios

Posted on | March 2, 2019 | No Comments

In this post, I examine the figuring of ratios as a conceptual technique for constructing systems of understanding in the modern political economy. The ratio is an important mathematical device for reality construction in a wide range of activities, but its role in financial and management environments is especially notable. These types of ratios result from dividing one account balance or financial measurement by another. Examples are debt-to-equity, return on assets (ROA), and return on investment (ROI).

This post continues my series on meaning-making practices and the power of digital spreadsheets. I previously wrote about the historical components of spreadsheets – the lists, rows, numbers, words, and tables that combine to give users of spreadsheets their visual and analytical proficiencies. Accounting, in particular, used the lists and tables of spreadsheets to create “time-space” power that propels organizations across the span of months and years, and over the span of long distances.

More recently, I’ve been examining the various formulas that provide additional calculative and analytical capabilities for the spreadsheet. There are almost 500 different types, ranging from simple arithmetic like AVERAGE, SUM, and MIN/MAX to more complex formulas such as CHOOSE, CONCAT/CONCATENATE, HLOOKUP, INDEX MATCH, PMT and IPMT, and XNPV and XIRR. The loan-payment sketch below illustrates two of the financial types.
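
For instance, PMT computes the periodic payment on a loan, and IPMT isolates the interest portion of a given payment. This is a minimal sketch; the loan figures are hypothetical:

    =PMT(0.05/12, 360, -300000)       monthly payment on a $300,000, 30-year loan at 5% (about $1,610)
    =IPMT(0.05/12, 1, 360, -300000)   interest portion of the first payment ($1,250)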

Below, I explore the communicative usage of ratios to construct an understanding of relationships, in this case, a corporation’s productivity.

Productivity ratios provide one evaluative structure for indicating the efficiency of a company. A productivity ratio is a fraction of output over input, where output is the number of goods or services produced by an industry, company, person, or machine, and input is the amount of resources you want to measure.

Ratios are used mainly to establish a quantitative relation between two numbers, showing how many times the value of one amount is contained within the other, such as total revenue divided by the number of employees. For example, in 2015, Apple had revenue of $182,800,000,000 and just under 98,000 employees. This meant that Apple made $1,865,306 per employee:

$182,800,000,000 / 98,000 = $1,865,306

Google was next with $1,154,896 of revenue per employee. Japan’s Softbank made $918,449 per employee for third place while Microsoft made $732,224 per employee. Measuring revenue per employee (R/E) provides an understanding of the productivity of the company and possibly how efficiently the company runs. It also provides a metric for comparing it with other companies.
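
In a spreadsheet, the R/E ratio is a single division. This is a minimal sketch, assuming a hypothetical layout with revenue in cell B2 and employee headcount in cell C2:

    =B2/C2                     revenue per employee
    =TEXT(B2/C2,"$#,##0")      the same figure formatted as currency, e.g., $1,865,306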

Displaying a ratio in Excel in a:b form typically requires the GCD (Greatest Common Divisor) function or a combination of the TEXT and SUBSTITUTE functions, as sketched below.
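
Here is a minimal sketch of both approaches, assuming the two amounts sit in cells A1 and B1 (GCD requires whole numbers, and the fraction format works best for small ratios):

    =A1/GCD(A1,B1) & ":" & B1/GCD(A1,B1)      reduces both amounts by their greatest common divisor
    =SUBSTITUTE(TEXT(A1/B1,"?/?"),"/",":")    formats the division as a fraction, then swaps the slash for a colon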

Spreadsheets vary in complexity and purpose, but they primarily organize and categorize data into logical formats by using rows and columns that intersect in active cells. They can store information, perform calculations, and reorganize data based on models. They display information in tabular form to show relationships and can help make elaborate visualizations with the data. Consequently, they make it easier to leverage organizational data to make relationships apparent and answer what-if questions. With the use of ratios, they can also identify high- and low-performing assets, track employee performance, and evaluate profitability.

A ratio denotes a relationship, usually between two numbers, but in any case, between amounts. They indicate how many times the first amount is “contained” in the second. They can be a valuable technique for comparison and a measurement of progress against set goals such as market share versus a key competitor. They can also be used for tracking trends over time or identifying potential problems such as a looming bankruptcy.

Ratios are a technique that “fixes” or freezes a relationship in order to construct a moment of reality. While they attempt to apprehend truth, they are instrumental in solidifying, at least temporarily, the appearance of concrete realities. Ratios also have analytic capacity, such as showing how much productivity comes from the average individual worker.


From Gold to G-20: Flexible Currency Rates and Global Power

Posted on | November 11, 2018 | No Comments

When British author Ian Fleming published the James Bond spy novel Goldfinger in 1959, the world was locking into a new framework for managing global trade and foreign exchange transactions. The New Deal’s Bretton Woods agreements tied participating currencies into a fixed exchange rate with the US dollar, which was itself tied to gold at $35 an ounce. The Goldfinger movie (1964) centered on the stocks of gold nestled away at the Fort Knox Bullion Depository in Kentucky. The plot (spoiler alert) initially suggests that Auric Goldfinger is conspiring with the mafia to rob the famous vault, but his actual plan is to work with the Chinese to nuke Fort Knox so that the rest of the world’s gold supplies (which he had been buying up) would increase dramatically in value.

This post discusses how the major industrialized nation-states organized to manage the transition from fixed exchange rates to the global, floating, digital trading environments that emerged in the 1970s and shaped the modern world.

President Franklin Delano Roosevelt (FDR) created the Fort Knox gold depository in 1935 during the New Deal, after outlawing the use of gold for private wealth holdings and as common currency. The Public Works Administration began construction of the giant vault as fears of spreading fascism in Europe convinced him to move US-held gold from mints and treasuries along the Eastern seaboard and store it far inland, next to Fort Knox, a US military training facility that had pioneered the use of tank tactics during World War I.

As the Nazis began to conquer neighboring countries and steal their gold, much of the durable metal escaped and flowed into Fort Knox. Britain bought US military equipment and supplies, and other countries sought refuge for their metallic wealth. By the end of the war, most of the world’s gold was tucked away at Fort Knox, and it formed the basis for a new international currency arrangement.

As the war was coming to an end, the Allies met in Bretton Woods, New Hampshire, to iron out a new international framework for currency control and trade. The resulting 1944 agreement created the International Monetary Fund (IMF), the World Bank, and a proposed International Trade Organization (ITO), a precursor to the World Trade Organization (WTO). The ITO was defeated by the US Congress at the time, but the main result of the conference was an agreement to stabilize world currencies by tying the US dollar to gold at $35 an ounce and requiring trading partners to keep the value of their currencies within 1-3 percent of the dollar. This arrangement was successful but started to fall apart due to the spread of dollars around the world that were circulating back to the US as claims on the gold at Fort Knox.

When Richard Nixon shocked the world by ending the Bretton Woods gold-dollar standard in 1971, he initiated what would become a new global economy based on flexible currency exchange rates, with the US dollar as the principal currency. The USD became “fiat” money, no longer tied to the promise of gold convertibility. His New Economic Policy (NEP) was partially responsible for the transition to digital currency trading as commercial financial institutions intensified their interest in the F/X markets. As the international financial system became more electronically based and transactions escalated in scale and scope, a new regime of power emerged. However, the process was a long and often painful one.

The announcement that Nixon merely “closed the gold window,” made to a national television audience on August 15th of that year, was a rhetorical understatement meant to lessen the startlingly dramatic changes he was imposing on the world economy. Nixon was stating that the US was not going to intervene in the foreign exchange markets to manage the price of the dollar or continue to deplete US gold reserves. To give himself a bargaining chip in future international negotiations, he also added a 10 percent surcharge on imports in his NEP. The move destabilized the world of fixed currency rates and ended the Bretton Woods control over international finance. It concluded what had in effect been a gold standard.[1]

The Nixon moves were strongly opposed by Japan and Western Europe, which were fast becoming major economic competitors with their automobiles and other cheap and attractive imports. The “Nixon Shokku” drove the value of the US dollar down and made exports to the US more expensive, increasing price inflation. Nixon responded with charges that his economic counterparts were trading unfairly with mercantile tactics and freeloading off the US by failing to contribute to the NATO defense effort. The US dollar lost nearly a third of its value during the first six years after its link to gold was severed.

Nixon was also working with the OPEC countries to raise oil prices and cripple his economic foes, who were much more reliant on external petroleum production than the US. OPEC was essentially a cartel designed to manage the price of oil, which was priced in US dollars. The floating exchange rates were leading to a dramatic fall in the value of the US dollar, and nations had to buy dollars first to purchase OPEC crude. Consequently, OPEC countries started to increase the price of oil.

Nixon had strong allies in the banking community who were intrigued by the profit possibilities in arbitrage, consulting, and hedging. The policy signaled a significant shift away from government control to a new, free-yielding environment where US banks could break out of their highly regulated and geographically limited reins. Walter Wriston, who would later coin the term “information standard,” argued that in this new environment, his Citicorp enterprise “ought to have a price-to-earnings (P/E) ratio of 15 to 20, typical of growth stocks, rather than a ratio of 5 to 6, which was typical of public utility companies.” In the next few years, Citicorp stock reached a P/E ratio of 25. The stock moves invigorated the banking industry as banks began to move into international activities such as Eurodollar lending.[2]

To buy time to sort out the implications of the broken Bretton Woods system, the US initiated a conference via the IMF on world monetary reform. The IMF was organized around its Articles of Agreement, consented to by its participating countries, although the US held a privileged place due to its overall power and the dollar’s strength. The IMF set up the Committee of 20 (C-20) in 1972 to prepare proposals for a new financial regime in the wake of Nixon’s closing of the gold window. The committee was made up of representatives of the twenty countries most involved in non-communist international trading and finance.

The C-20 met between 1972 and 1974 to develop a new international system that would subordinate currencies to Special Drawing Rights (SDRs), a global reserve asset created in 1969. Although SDR allocations provided some liquidity and supplemented member countries’ official reserves, they proved inadequate to stabilize the monetary system. The C-20 broke down in 1974 over concerns about countries being forced to inflate their currencies to buy up excessive US dollars.

Concerns were raised by the Group of Ten (G-10), which had created the infamous Interim Committee responsible for the ill-fated Smithsonian Agreement of December 1971. Those accords had established a new dollar-gold standard, in which the currencies of many industrialized nations were again pegged to the US dollar, but at an inflated $38 an ounce. A crisis ensued, and the US dollar continued to drop in value.[3]

The committee reconvened on March 25, 1973, at the White House to discuss the international monetary crisis at the invitation of US Secretary of the Treasury George Shultz and his undersecretary Paul Volcker (later appointed Federal Reserve Chairman). The invitation list included finance ministers from England, France, and Germany. Although they resolved to address the issue of destabilized currency rates, late that year, overwhelmed by the first Middle East “Oil Crisis,” the Group of Ten decided by default that exchange rates would float.[4]

The Oil Crisis, sparked by OPEC’s embargo of petroleum shipments to the US for its support of Israel, increased volatility in the foreign exchange markets. One beneficiary was a startup project by Reuters called Money Monitor Rates. A well-established news agency, Reuters created a two-sided electronic market for currencies: it charged selling banks to list currency prices on computer monitors, while also charging buying banks to access those prices. The crisis helped the Reuters endeavor become profitable as currency traders valued the electronic information.

Shultz and Volcker next agreed to invite the Japanese, and the news media labeled the group the “G-5” and began to refer to the “Group of Five” meetings. The next year, two of the finance ministers became heads of government. When Valéry Giscard d’Estaing of France and Helmut Schmidt of Germany became leaders of their respective countries, they attempted to elevate the meetings to include not only finance ministers but heads of government as well.

The new group met for the first time during November 15-17, 1975, at the Château de Rambouillet, France.[5] At that meeting, the heads of state and finance ministers of France, Germany, Japan, the United States, and Great Britain (the G-5) prepared, some reluctantly, the Declaration of Rambouillet. They forsook a system of stable monetary rates for a “stable system of exchange rates.”[6] Instead of governments agreeing to peg currencies according to political agreements, they would manage the macro-environment in which foreign exchange rates would be set by market forces.

The Second Amendment to the IMF’s Articles of Agreement officially eliminated the unique role of gold and legitimized floating exchange rates in 1978. The governments of the major currencies decided to let the “markets” decide the proper exchange rates for their currencies. In the future, it would be infeasible for sovereign governments, except perhaps the US, to dictate the value of their money.

The next year, Canada attended, and after the fall of the USSR, Russia began to participate. In the wake of the Bretton Woods demise, the G-# summits played an important role in providing overall stability for the foreign exchange markets and the world economy as a whole. Although America clearly led the group, and volatility was rampant, the group produced global coordination and symbolic power.[7] One of its most important roles has been to solidify the notion of national sovereignty. In the age of transnational money markets, the individual nation-state needed to be reified, that is, propped up in part to maintain the system of different currencies. Notable examples were the debt crises of the 1980s and the Asian currency crisis in 1997.

As Susan Strange wrote:

    The Group of Ten developed countries, by contributing to rescue funds for troubled Asian economies, reiterated more forcefully than ever the conviction that in the modern international financial system, bankruptcy was not an option, at least, not bankruptcy in the sense that a failed business closes down, gets sold off to the best bidder or taken over lock, stock, and barrel. The appearance of an immortal, sovereign state was to be preserved—not for its own sake, but for the greater security of the world system.[8]

The financial disruptions of the 1990s, particularly in Asia and Russia, led to the formation of the G-20 in 1999 at the G-7 Cologne Summit. The G-7 saw the need to include “systemically important countries” in discussions about the global economy. Initially, it was the finance ministers who met every year, until George W. Bush invited the G-20 leaders to Washington, DC, in 2008 to discuss the unfolding economic crisis.

After the inaugural national leaders’ summit in 2008, the G-20 announced in September 2009 that it would replace the G-8 as the main economic council of wealthy nations. The Obama administration pushed for the change in the midst of the Great Recession, in recognition of the influence of many other countries on the global economy and the need to coordinate policy among the major countries.

The G-# summits have been held annually since that time to address a number of macroeconomic issues affecting the global political economy as well as crime, drugs, the environment, and human rights. During the 1990s, they synchronized the “global information infrastructure” and its transition to the global Internet.

Meanwhile, the financial system’s “information standard” has solidified its power as the replacement for gold with the extraordinary increase in digital money and flows of currency. In conjunction with the G-#’s political power, a modicum of systemic stability and coordination keeps the world turning, as James Bond would have it.

Citation APA (7th Edition)

Pennings, A.J. (2018, Nov 11). From Gold to G-20: Flexible Currency Rates and Global Power. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/from-gold-to-g-5-flexible-currency-rates-and-global-power/


Notes

[1] Greider, W. (1987) Secrets of the Temple. How the Federal Reserve Runs the Country. New York: Simon and Schuster. pp. 337-338. A very extensive treatise on the economic conditions of the 1970s.
[2] Smith, R. (1989) The Global Bankers. New York: Truman Talley Books/ Plume. p. 34.
[3] The G-10 consisted of the governments of eight International Monetary Fund (IMF) members—Belgium, Canada, France, Italy, Japan, the Netherlands, the United Kingdom, and the United States—and the central banks of Germany and Sweden. This group was, by default, in charge of maintaining the economic growth and stability of international currencies. Although its powers were limited in effect, it still presented an important image of national sovereignty.
[4] Gowan, P. (1999) Global Gamble: Washington’s Faustian Bid for World Dominance. London: Verso. pp. 20-21.
[5] G-7/G-8 Summits: History and Purpose. Fact sheet released by the Bureau of European and Canadian Affairs, U.S. Department of State, April 30, 1998 accessed on May 01, 2001 from: http://www.state.gov/www/issues/economic/summit/fs_980430_g8_sumhistory.html.
[6] Eichengreen, B. (1996) Globalizing Capital: A History of the International Monetary System. Princeton, NJ: Princeton University Press. I am indebted to his fifth chapter, “From Floating to Monetary Unification,” which contains a good overview of this process. Pages 136-141 were particularly useful.
[7] Peter Gowan (1999) makes a strong case for the dominance of the US in the G-7 oligarchy in his book The Global Gamble: Washington’s Faustian Bid for World Dominance. London: Verso.
[8] Strange, S. (1997) Mad Money: When Markets Outgrow Governments. Ann Arbor: University of Michigan Press. p. 9.




Apple’s GUI and the Creation of Microsoft’s Excel Spreadsheet Application

Posted on | October 19, 2018 | No Comments

Microsoft’s famous spreadsheet application, Excel, was originally designed for Apple’s Macintosh personal computer. This post explores the early years of the personal computer and its transition to the more modern interface pioneered by Apple with its Macintosh computer. That transition opened the way for new software innovations, particularly Microsoft’s development of the Excel spreadsheet application. Excel has since become the mainstay spreadsheet application in organizations around the world.

The Apple Macintosh, or “Mac,” was based on the Graphical User Interface (GUI, pronounced “gooey”) that was primarily developed at Xerox PARC in Palo Alto, CA. It was sometimes called WIMP for its Windows, Icons, Mouse, and Pull-down menus technology. The Apple II and IBM PC were still based on a command line interface, a “black hole” next to a > prompt where commands had to be entered and executed. For example, to find a word or other string of characters in a file, you would type find “any string” FileName at the C:\> prompt.

This command line system required extensive prior knowledge and/or access to technical documentation. The user needed to know the commands or copy them from a manual. The GUI, on the other hand, allowed you to point to information already on the screen or to categories that contained subsets of commands. Eventually, menu categories such as File, Edit, View, Tools, and Help were standardized at the top of GUI screens.

A crucial issue for the Mac was good third-party software that could work in its GUI environment, especially a spreadsheet. Representatives from Jobs’ Macintosh team visited the fledgling companies that had previously supplied microcomputer software. Good software came from companies like Telos Software, which produced the picture-oriented FileVision database, and Living Videotext, whose ThinkTank used “dynamic outlines” to capture levels of thought and promote creative thinking. By April 1985, ThinkTank had sold 30,000 copies, reaching about 10% of all Mac owners. However, that number was still small compared to the potential of the business world.

For the Mac to be useful for business, Apple needed a new VisiCalc. Jobs’ longstanding relationship with VisiCorp was strained because the VisiCalc distributor was trying to develop its own “PARC-like system” for IBM PCs called Visi-On.[1] Lotus was the up-and-coming software producer and signed on with Apple to produce an ambitious application called “Jazz,” but the software soon ran into trouble. Steven Levy wrote:

    Apple was desperate for the Macintosh equivalent to VisiCalc, something so valuable that people would buy the computer solely to run it. It had high expectations for Lotus’s product—after all, Lotus 1-2-3 had been the IBM PC’s VisiCalc—but Lotus’s Jazz turned out to be a dud. Mitch Kapor’s charges had clumsily missed the point of Macintosh. In the Lotus view, Mac was a computer for beginners, for electronic dilettantes who still clung to a terror of technology. Jazz was the equivalent of a grade school primer, an ensemble of crippled little applications that worked well together but were minimally useful. No one bought it.[2]

Other companies had partial success in creating a spreadsheet for the Mac, but part of the problem was that the Mac’s screen was small and not conducive to spreadsheet work. Microsoft’s Multiplan for the Mac was an early offering. Ashton-Tate produced a spreadsheet for the Mac called Full Impact that contained much of the software used in an Apple in-house spreadsheet called Mystery House. Unlike its Apple II predecessor, the Macintosh was failing to make significant inroads into a business world that was enamored with the PC and Lotus 1-2-3.

Apple’s innovative Macintosh technology did not go unnoticed by Microsoft, which was shown the embryonic Mac near the end of 1981. Apple authorized Microsoft to develop software languages and applications for the Mac’s GUI-based system. Gates and company would go public in March of 1986, netting the co-founder’s 45 percent share in the company some $311 million in net worth. The young company was working secretly on its own GUI while continuing to develop software for the Mac. Meanwhile, Microsoft pressured Apple to give it a license for the GUI interface and threatened to stop work on the software it was producing for Apple.[4] In October 1985, Apple CEO John Sculley gave in and offered Microsoft a license for the Mac GUI. But much to the ire of Apple, Gates’ company had developed its own GUI that it layered on top of its DOS operating system. In November 1985, Microsoft began to ship Windows 1.0, a DOS operating system with an awkward but much friendlier face.

The Macintosh’s initial “killer app” turned out to be desktop publishing, which helped develop its WYSIWYG (what you see is what you get) technology. In mid-July 1985, a company called Aldus released a final version of PageMaker. For the producers of corporate newsletters and other small publishers, the Macintosh began to show great promise. For the first time, documents could be manipulated and shown on the screen exactly as they would be published. But the WYSIWYG appearance on the screen was only minimally effective without the capacity to print what was shown. The printing solution came from another computer scientist from Xerox PARC. John Warnock had created a technology called PostScript that allowed a laser printer to print exactly what was on the screen. In 1982, he created a company called Adobe that caught the attention of Apple. Jobs canceled work on other projects and bought nearly 20% of Adobe while carefully integrating its technology into the design of a new laser printer.[3]

Combined with applications like PageMaker and MacPaint, the new software-print combination inspired thousands of graphic artists and painters to try the Macintosh. While Apple lost market share in the financial sphere, it became the darling of graphic designers and publishers. One casualty, however, was Steve Jobs, who was ousted by CEO John Sculley and the Apple board in May 1985 after Apple recorded its first-ever quarterly loss. In September, Jobs sold his shares of Apple.

The new GUI-enhanced machines presented a major challenge for the software industry. Software Arts, the company that created the famous VisiCalc spreadsheet, was sued by its distributor VisiCorp, formerly Personal Software, in September 1983 for failing to keep the software current. A counter-suit was filed, and the legal hassles distracted software development. Consequently, VisiCalc never made an adequate jump to a GUI environment. Lotus Development bought out Software Arts after a chance meeting between Bricklin and Kapor on an airline flight, and soon after, Lotus discontinued VisiCalc. But Lotus also took a nosedive, as its failure with Jazz was a fateful mistake that provided a crucial opening for Microsoft. Gates and company released Excel for the “fat Mac” in 1985; when the first Windows version followed, it was numbered 2.0 to correspond with the Mac release. Excel soon became a popular spreadsheet for the Macintosh, especially after the release of Excel 2.2 for the Macintosh in 1989, which was nearly twice as fast as the original.

Microsoft went on to develop several versions of Windows, its new GUI operating system, and as the operating system improved, the Excel spreadsheet became more popular. Windows 2.0 was released in 1987, and Microsoft offered Excel for Windows that year, bundled with a runtime version of Windows so that it could also be launched from DOS. But both were awkward and did not become very popular. Apple sued Microsoft in 1988 after the release of Windows 2.03, claiming its interface design had been copied, but to no avail.

Lotus 1-2-3 had been the top-selling software product of 1989, but the GUI gained popular acceptance within the DOS world with the introduction of Windows 3.0 in 1990. The DOS-based Quattro Pro by Borland had been on the rise against the dominance of Lotus 1-2-3, but neither could resist the power of the user-friendly Excel 3.0 and the even better Excel 4.0, released in 1992. Meanwhile, Excel remained the only major spreadsheet available for Windows until Lotus countered with its own Lotus 1-2-3 for Windows in the early 1990s. Excel 5.0 signaled Microsoft’s rise to dominance, partially because of the inclusion of Visual Basic for Applications. Windows soon monopolized the PC desktop, and Excel became its flagship business tool.[5]

Notes

[1] Visi-On information from Steven Levy’s (1995) Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything. NY: Penguin Books. p. 160.
[2] Quote on the Jazz failure from Levy (1995), pp. 219-220.
[3] Information on John Warnock and PostScript from (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 38. Also see Levy (1995), pp. 212-213.
[4] Apple vs. Microsoft in Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. p. 361.
[5] Information on spreadsheet competition from Rob Clarke’s “A Formula for Success,” PC World, August 1993, pp. 15-16.


Potential Bill on Net Neutrality and Deep Pocket Inspection

Posted on | October 17, 2018 | No Comments

I just received this discussion draft from Thomas Dargan, a former colleague of mine at New York University and a former Verizon executive.[1] The draft comes from Eliot Lance Engel (D-NY), the U.S. Representative for New York’s 16th congressional district, which contains parts of the Bronx and Westchester County.[2] US telecommunications policy is based on the Communications Act of 1934, which created the Federal Communications Commission and established the importance of common carriage, a concept that carries into current understandings of net neutrality. For a concrete sense of what “deep packet inspection” means in practice, see the short sketch after the bill text below.

115TH CONGRESS 2D SESSION

[DISCUSSION DRAFT]

H. R. __

To amend the Communications Act of 1934 to prohibit broadband internet access service providers from engaging in deep packet inspection.

IN THE HOUSE OF REPRESENTATIVES
—— introduced the following bill; which was referred to the Committee on ______________

A BILL

To amend the Communications Act of 1934 to prohibit broadband internet access service providers from engaging in deep packet inspection.

Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. SHORT TITLE.
This Act may be cited as the “Deep Packet Privacy Protection Act of 2018”.

SEC. 2. PROHIBITION ON DEEP PACKET INSPECTION.

(a) IN GENERAL.—Title VII of the Communications Act of 1934 (47 U.S.C. 601 et seq.) is amended by adding at the end the following:

“SEC. 722. PROHIBITION ON DEEP PACKET INSPECTION.

“(a) IN GENERAL.—A broadband internet access service provider may not engage in deep packet inspection, except in conducting a reasonable network management practice.

“(b) RULE OF CONSTRUCTION.—Nothing in this section shall be construed to prohibit a broadband internet access service provider from engaging in deep packet inspection as required by law, including for purposes of criminal law enforcement, cybersecurity, or fraud prevention.

“(c) DEFINITIONS.—In this section:
“(1) BROADBAND INTERNET ACCESS SERVICE.—

“(A) IN GENERAL.—The term ‘broadband internet access service’ means a mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all internet endpoints, including any capabilities that are incidental to and enable the operation of the communications service, but excluding dial-up internet access service.

“(B) FUNCTIONAL EQUIVALENT; EVASION.—The term ‘broadband internet access service’ also includes any service that—

“(i) the Commission finds to be providing a functional equivalent of the service described in subparagraph (A); or

“(ii) is used to evade the prohibitions set forth in this section.

“(2) DEEP PACKET INSPECTION.—The term ‘deep packet inspection’ means the practice by which a broadband internet access service provider reads, records, or tabulates information or filters traffic based on the inspection of the content of packets as they are transmitted across their network in the provision of broadband internet access service.

“(3) NETWORK MANAGEMENT PRACTICE.—The term ‘network management practice’ means a practice that has a primarily technical network management justification, but does not include other business practices.

“(4) REASONABLE NETWORK MANAGEMENT PRACTICE.—The term ‘reasonable network management practice’ means a network management practice that is primarily used for and tailored to achieving a legitimate network management purpose, taking into account the particular network architecture and technology of the broadband internet access service, including—

“(A) delivering packets to their intended destination;

“(B) detecting or preventing transmission of malicious software, including viruses and malware; and

“(C) complying with data protection laws and laws designed to prohibit unsolicited commercial electronic messages, including the CAN-SPAM Act of 2003 (15 U.S.C. 7701 et seq.) and section 1037 of title 18, United States Code.”.

(b) DEADLINE FOR RULEMAKING.—Not later than 180 days after the date of the enactment of this Act, the Federal Communications Commission shall issue a rule to implement the amendment made by subsection (a).

(c) EFFECTIVE DATE.—The amendment made by this section shall apply beginning on the date that is 270 days after the date of the enactment of this Act.
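
The bill’s pivotal definition is the line between reading a packet’s headers, which a provider must do simply to deliver traffic, and reading its contents. A minimal Python sketch may make that distinction concrete. It is illustrative only: it assumes a raw IPv4 packet carrying TCP with no unusual options, the function names are invented for this post, and nothing here models any provider’s actual system.

    # Illustrative sketch of the header/content distinction in the draft bill.
    import struct

    def shallow_inspect(packet: bytes):
        """Read only the headers: everything needed just to deliver the packet."""
        ihl = (packet[0] & 0x0F) * 4                   # IPv4 header length in bytes
        src, dst = packet[12:16], packet[16:20]        # source/destination addresses
        sport, dport = struct.unpack("!HH", packet[ihl:ihl + 4])  # TCP ports
        return src, dst, sport, dport

    def deep_inspect(packet: bytes, keyword: bytes) -> bool:
        """Read past the headers into the payload, the 'content of packets'
        that the draft would put off-limits outside network management."""
        ihl = (packet[0] & 0x0F) * 4
        tcp_len = (packet[ihl + 12] >> 4) * 4          # TCP header length
        payload = packet[ihl + tcp_len:]               # the user's actual content
        return keyword in payload                      # filtering on content, not address

Note that the bill’s “reasonable network management” exception acknowledges that some protective practices, such as scanning payloads for malware signatures, cross the same technical line into packet contents, which is why the definitions section carves them out.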

Notes

[1] Thomas Dargan can be reached at US 914-582-8995.
[2] Eliot Lance Engel (D-NY), official website.


TIME Magazine’s “Machine of the Year”

Posted on | October 10, 2018 | No Comments

The Apple II was quite a success when it was introduced in 1977, with sales of US$770,000 in its first year. Its growth over the next few years, however, was tremendous. Revenues hit $7.9 million in its second year and $49 million in 1979. Its founders, Steve Jobs and Steve Wozniak, were soon multimillionaires while still in their twenties. Apple’s sales were initially concentrated in the hobbyist market, where recreational software such as games dominated. Another important market was education, with simulation games for teaching mathematics, music, and science. Most of these programs were poor in quality, though, and in both areas the software industry failed to develop significantly. It was not until the business market for microcomputer software matured that demand for the smaller machines solidified and ultimately ensured Apple’s success, though not without a challenge from a similar machine: the IBM Personal Computer (PC).

Although it was not apparent at first, three software categories (databases, spreadsheets, and word processing) created significant demand for the PC in the business world. In 1983, dBASE led the database market with 150,000 units sold, WordStar led word processing with 700,000 packages, and VisiCalc led the spreadsheet market with 800,000 units sold.[1] It was the spreadsheet, though, that had the most significant allure. When VisiCalc was released for the Apple II in late 1979, sales of both increased rapidly. Software Arts, whose VisiCalc sold for around $100, had sales of $11 million in its first year, while Apple’s sales also continued to grow, reaching $600 million in 1982.[2] With the success of these three software categories, microcomputers were proving to be more than toys.

Until the electronic spreadsheet, the Apple II was largely considered a hobbyist’s toy for “men with big beards” to control things like model train sets.[3] But VisiCalc combined the microcomputer and money in extraordinary new ways. It was the “killer app” that launched the PC revolution, and it brought powerful new techniques of numerical analysis to the individual. Countering the prevailing notion that accounting calculations were the domain of meek accountants and subordinate secretaries, electronic spreadsheets reached a whole new range of entrepreneurs, executives, traders, and students. Competition was growing, though, and in 1982 a small company called Microsoft received acclaim for its new spreadsheet, Multiplan. The category advanced further when a former employee of Personal Software (the company hired to market VisiCalc) took his earnings from the rights to software he had developed, started his own company, and created a new spreadsheet program that would dominate sales for the rest of the decade: Lotus 1-2-3.
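
What made the spreadsheet’s “what-if” analysis so powerful was automatic recalculation: change one assumption and every dependent figure updates. A toy sketch in Python captures the idea; it is purely illustrative, the cell names and the get() helper are invented for this post, and it is not how VisiCalc was actually implemented:

    # Toy model of spreadsheet recalculation (not VisiCalc's real design):
    # a cell holds either a value or a formula over other cells.
    def get(cells, name):
        """Return a cell's value, computing it if the cell holds a formula."""
        v = cells[name]
        return v(cells) if callable(v) else v

    sheet = {
        "revenue": 120_000,
        "costs": 85_000,
        "profit": lambda c: get(c, "revenue") - get(c, "costs"),
        "tax": lambda c: get(c, "profit") * 0.25,   # hypothetical 25% rate
    }

    print(get(sheet, "tax"))   # 8750.0
    sheet["costs"] = 95_000    # change one assumption...
    print(get(sheet, "tax"))   # 6250.0, everything downstream updates

That instant what-if loop, reworking an entire budget scenario in seconds rather than resubmitting it to an accounting department, is exactly what pulled executives and traders to the machine.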

Lotus 1-2-3 was the product of a new company started by Mitch Kapor. Like Jobs and Wozniak, Kapor came from a bastion of military investment in computer technology, though in his case it was Boston, not Silicon Valley. In the 1960s he had the chance to learn computer programming in high school, and he built a small computer/adding machine that used a telephone rotary dialer to input data. Also like Jobs, he was strongly influenced by the counter-cultural movement, primarily a reaction to the Vietnam War. After exploring a wide variety of life experiences, including teaching meditation for a while and earning a master’s degree in counseling psychology, Kapor returned to his computer roots. He went to work for Personal Software and designed VisiPlot/VisiTrend, a program for graphing and analyzing spreadsheet data.

After a management change, he left the company, but not before receiving $1.2 million for the rights to his software. Despite his counter-big-business sensibilities, he took the money and started his own company, Micro Finance Systems, whose first product was the Executive Briefing System. Before releasing it, he renamed the company Lotus Development Corporation in honor of a meditation practice. Soon he persuaded venture capitalist Ben Rosen to invest in a new product that would integrate a spreadsheet and a word processor with his graphics program. After an unsuccessful attempt to produce the word processor, the team substituted a database program and began to market Lotus 1-2-3.[4]

The success of the new spreadsheet software was tied intimately to the success of another new microcomputer: the IBM Personal Computer, or “PC.” On August 12, 1981, Big Blue introduced its personal computer based on a 4.77-MHz Intel 8088 chip with 16K (expandable to 256K) of RAM and an operating system, PC-DOS, licensed from an upstart company called Microsoft. The IBM PC came with one 5.25-inch floppy drive and room for a second. Working secretly as “Project Acorn” in Boca Raton, Florida, the renegade IBM group also created expansion cards, monochrome monitors, and printers for the new machine, as well as a host of new software programs. The IBM PC was attractive to traditional businesses and to mainstream computer users who had previously considered the small computer little more than a toy. The availability of the spreadsheet, however, turned the IBM PC into a practical business tool.

Just as VisiCalc helped the Apple II take off, Lotus 1-2-3 contributed greatly to the IBM PC’s success, and vice versa. The new package integrated its calculating abilities with graphics and database capabilities, hence the “1-2-3” in its name. Users could not only perform sophisticated calculations but also print the results for business meetings and store them as organized data. Within a year of the software’s release, Lotus was worth over $150 million. The relationship with the IBM PC was symbiotic, as Big Blue’s new little computer sold over 670,000 units in its first year.[5] Lotus, meanwhile, became the number-one software supplier as its sales revenues grew to $258 million in its third year.[6]

Lotus 1-2-3 also benefited greatly from what was arguably the deal of the century: Microsoft’s licensing of its operating system to IBM without granting exclusive rights. Microsoft’s 1980 deal with IBM allowed it to sell its DOS software to other PC manufacturers, which meant that Lotus 1-2-3 could run on any “IBM-compatible” microcomputer, including the semi-portable Compaq machine. Compaq was started in 1982 and immediately set out to reverse-engineer IBM’s BIOS (Basic Input/Output System) in order to produce its own IBM “clone.” Soon it had developed a microcomputer that would run the same software as the IBM PC, and it achieved remarkable sales of $111 million in its first year on its way to becoming a Fortune 500 company.[7] Lotus 1-2-3, meanwhile, became the most popular PC software program of the 1980s. Not unrelatedly, Bill Gates was on his way to becoming the richest man in the world, making money not only on the OS for the IBM PC but also on each clone that licensed Microsoft’s operating system.

The personal computer was fast becoming a popular icon. By 1982, sales of the Apple II were strong and the IBM PC was quickly taking a piece of the rapidly growing market. Other companies were competing as well: by the end of the year, over 5.5 million personal computers were in use as Atari, Commodore, Compaq, Osborne, Sinclair, Tandy Radio Shack, and Texas Instruments each offered models of their own.[8] Another important factor in the personal computer’s popularity was its new data communication capability. Hayes successfully marketed a 1200-bps modem, allowing computer users to access information services such as CompuServe, Lockheed’s DIALOG, and later The WELL.

The new PCs were so successful that TIME magazine decided to honor them. The magazine’s yearly acknowledgment generally goes to a person, and it had originally planned to name Apple’s Steve Jobs its “Man of the Year.” But the dramatic year-end sales of the IBM PC and its rivals changed the editors’ minds: in its January 1983 issue, TIME named the “Personal Computer” its 1982 “Machine of the Year.”

Notes

[1] Campbell-Kelly, M. and Aspray, W., Computer: A History of the Information Machine. p. 260.
[2] Apple statistics from Rose, Frank (1989) East of Eden: The End of Innocence at Apple Computer. NY: Viking Penguin Group. p. 47. VisiCalc information from the Jones Telecommunications & Multimedia Encyclopedia at http://www.digitalcentury.com/encyclo/update/bricklin.html, accessed on October 24, 2001.
[3] The term comes from the documentary Triumph of the Nerds, which aired on PBS in 1996.
[4] Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. pp. 338-344. This section contains biographical information on Mitch Kapor and the beginnings of Lotus 1-2-3.
[5] Information on the IBM PC release from Michael J. Miller’s editorial in the September 4, 2001 issue of PC MAGAZINE, dedicated to the PC’s 20th anniversary.
[6] Information on Kapor and Lotus from the (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 128.
[7] Information on Compaq from the (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 38.
[8] The number of PCs in use in 1982 was accessed on December 10, 2001 from http://www.digitalcentury.com/encyclo/update/comp_hd.html.




Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edward’s University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.