The Transformation of Telecom to Global IP, GATT to GATS
Posted on | October 19, 2014 | No Comments
In a previous post I wrote about how the U.S. Clinton-Gore administration used the notion of the Global Information Infrastructure (GII) to push for the adoption of Internet protocols through multilateral trade negotiations and telecom privatization. Below I address how the inclusion of services in trade talks helped facilitate the international spread of the Internet and dramatically reduce the costs of voice and video communications.
International trade negotiations had historically concentrated on physical goods; services were not seriously considered until the November 1982 GATT ministerial meeting. Calls by several OECD countries for a new multilateral trade round with services as a top agenda item followed in the mid-1980s. The inclusion of services in the Uruguay Round of trade negotiations, launched in 1986, led to the General Agreement on Trade in Services (GATS) as part of the mandate of the World Trade Organization (WTO).
The GATS extended the WTO into areas never previously recognized as coming under the scrutiny of trade policy. In the communications services area these included audiovisual services such as radio and television, educational services, entertainment services such as theatre and circus productions, news agencies, and of course telecommunications services such as email, packet-switching transmission services, and voice telephone services. Former WTO Director-General Renato Ruggiero remarked on the inclusion of services into the realm of international trade negotiations, “I suspect that neither governments nor industries have yet appreciated the full scope of these guarantees or the full value of existing commitments.”[1]
Sixty-nine nations party to the WTO, including the U.S., reached an agreement to open up their telecommunications markets as part of the GATS on February 15, 1997. The Agreement on Basic Telecommunications Services codified an accord to “negotiate on all telecommunications services,” particularly new rules for telecommunications deregulation. This included data communications, facsimile services, private leased lines, PCS, cellular telephony, and both fixed and mobile satellite services. These countries agreed to privatize and open their own telecommunications infrastructures to foreign penetration and to competition from other telcos, whether through resale or through facilities of their own. Active in these negotiations were the International Telecommunication Union (ITU) as well as the United States Trade Representative (USTR) for the Clinton-Gore administration.
The WTO GATS agreement allowed U.S. companies to compete for the estimated $600 billion global market in local, long-distance, and international services. Acting U.S. Trade Representative Charlene Barshefsky claimed that the agreement would lead to approximately 1 million new jobs in the US over the following 10 years. “From submarine cables to satellites, from wideband networks to cellular phones, from business intranets to fixed wireless for rural and under-served regions, the market access opportunities cover the entire spectrum of innovative communications technologies pioneered by American industry and workers,” Barshefsky said at a press conference. FCC Chairman Reed Hundt, who worked closely with Al Gore on developing the overall communications policy, predicted the treaty would reduce the price of international calls by 80% over the following ten years.[2]
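As a rough, hypothetical illustration of what Hundt's prediction implies (the starting price below is invented), an 80% drop over ten years works out to roughly a 15% decline per year:

```python
# Implied average annual decline if international call prices fall 80% over 10 years.
# Hypothetical illustration of Hundt's prediction; the starting price is arbitrary.
start_price = 1.00                               # e.g., $1.00 per minute
end_price = start_price * (1 - 0.80)             # 80% lower after 10 years
annual_decline = 1 - (end_price / start_price) ** (1 / 10)
print(f"Implied average annual decline: {annual_decline:.1%}")   # ~14.9% per year
```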
By the time the WTO agreement on basic telecommunications services went into effect in February 1998, the number of countries committing to it had reached 72, with 59 agreeing to further liberalization, including competition, foreign investment, interconnection guarantees, and an independent FCC-type regulator. Most consumers saw the benefits of the agreement in the form of lower prices for international phone calls. By 2001, the number of participating countries had reached 75.
The agreements came at a crucial technological moment. The World Wide Web (WWW) was a working technology, but it would not have lived up to its name if the WTO had not negotiated reduced tariffs on crucial networking and computer equipment as well as the liberalization of telecommunications services around the world. When we Skype with friends and relatives in other countries, or click Like or comment on a Facebook update, we can attribute some of that ease to the aggressive trade negotiations of the 1990s.[3]
Notes
[1] Director-General Renato Ruggiero’s quote from Christoph Strawe’s “GATS – Service to Whom?”
[2] Washington Post, “Telecom Pact Opens Up World Phone Markets” February 16, 1997. A good resource for the time period was Reed Hundt’s (2000) You Say You Want a Revolution? A Story of Information Age Politics.
[3] As a graduate student I participated in the conferences entitled TIDE 2000 (Telecommunications, Information and Interdependent Economies) organized by the Japanese Foreign Affairs Ministry and the East-West Center in Honolulu, Hawaii. It was a major forum for the early debates on trade negotiations on services and telecommunications. It was reported in Jussawalla, M. et al. (eds.) Information Technology and Global Interdependence. New York: Greenwood Press. 1989.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor of global media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. His first faculty position was at Victoria University in Wellington, New Zealand.
Tags: Al Gore > FCC Chairman Reed Hundt > GATS > General Agreement on Trade in Services (GATS) > TIDE 2000 (Telecommunications, Information and Interdependent Economies) > USTR > WTO agreement on basic telecommunication services
The Transformation of Telecom to Global IP – GII to WTO
Posted on | October 19, 2014 | No Comments
In the mid-1990s, the international telecommunications world experienced a fundamental transformation. With the introduction of the “Global Information Infrastructure” (GII) in 1994, Vice-President Gore fired a warning shot that was followed up with a series of reforms designed with the globalization of communications services and e-commerce in mind. By 1995, a powerful redefinition was settling over the industry. “In successfully shifting the locus of international regulation away from the International Telecommunications Union (ITU), a European and developing-country power base, to the World Trade Organization (WTO), where its power reflects its huge, high-income market, the United States has also fundamentally shifted the conceptualization of telecommunications away from the postwar public utility, security related, monopoly model, to that of a customer driven, trade-related, service industry.”[1] This post examines which international institutions had a bearing on the development and diffusion of the global Internet Protocols (IP) that have transformed communications and social media around the world, and how they did so.
The WTO met in Singapore in 1996 and quickly resolved to reduce tariffs on the flow of information technologies. The next year it met in Geneva and established rules for the continued privatization of national telecommunications operations. The telco environment moved from highly regulated, bureaucratic telecoms united under the umbrella of the ITU to less regulated, privatized telcos operating within an international trade regime. They shed their government PTT (Post, Telegraph, and Telephone) bureaucracies only to find themselves embroiled in a larger net cast by the international treaties of dominant countries. However, these multilateral arrangements could also break down another set of bureaucratic organizations, the broadcasters, and with that usher in a new age of television, characterized by a multiplicity of interactive services and new business models based more on e-commerce than on mass advertising.
Vice-President Gore had introduced the concept of the GII at the annual ITU meeting in Buenos Aires during March of 1994. The target was the national PTT monopolies, the ITU’s main clientele. “He described a new communications revolution driven by the export of three American ideas: competition instead of monopoly, the rule of law, and the connection of networks to existing networks at a fair price.”[2] Gore’s approach was to use the government to ensure competition and economic development. “He outlined the basic principles of a policy revolution: the Administration would repudiate the embrace of monopoly by government and instead use the power of law to open all markets to innovation, competition, new investment, new entrepreneurs.”[3]
The GII was not a practical, technical solution, and the Internet had not yet emerged as the obvious global telecommunications medium. Wireless, cable television, and direct-to-home satellite systems were all in competition to become the dominant “information superhighway.” Rather, the GII was a conceptual framework to further challenge the PTTs and pave the way for data communications and all the related services that had been promised by ISDN (Integrated Services Digital Network) and the proliferation of Internet Protocols (IP).
Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations which called for the creation of the World Trade Organization. It would be the WTO that would help facilitate the modernization of telecom networks around the world and break down the barriers to global IP. During that summer President Clinton, following Democratic tradition reaching back to the 1934 Reciprocal Trade Agreements Act that authorized the President to negotiate trade agreements with other countries, urged the development of an international information infrastructure at the G-7 meeting in Naples, Italy.
The G-7 had emerged since the breakdown of Bretton Woods in the 1970s as a powerful vehicle for coordinating international policy and pressuring multilateral organizations. The next year, after the WTO was formed, the G-7 nations (sometimes G-8) met in Brussels, Belgium for a Ministerial Conference on the Information Society. Britain, Canada, France, Germany, Italy, Japan, and the United States agreed in principle to develop the Global Information Infrastructure (GII) and funded a number of projects to test international broadband networking as well as special projects on emergency management and telemedicine. Despite rising opposition, Congressional Republicans supported the Clinton-Gore initiative and helped to ratify the international trade agreement.
The World Trade Organization was formed on January 1, 1995. The WTO was conceived as an organization designed to negotiate reductions in international tariffs and other trade barriers. The successor to the General Agreement on Tariffs and Trade (GATT), the WTO was created for the liberalization of international trade and economic cooperation across national boundaries. Given the growing complexity of international economic interdependence, the contracting parties of the GATT had launched the eighth major trade negotiation round at a ministerial meeting in Punta del Este, Uruguay, in September 1986, with more than one hundred nations participating. Over its first three years, the WTO tackled crucial issues that paved the way for the Internet and global e-commerce. Clinton’s re-election in November 1996 and the signing of the Telecommunications Act of 1996 the previous February gave the administration a powerful negotiating position, and it pressed ahead with Gore’s telecommunications plan.
In late 1996, the WTO met in Singapore and agreed to reduce tariffs on information technology trade, including personal computers. The Information Technology Agreement (ITA) was concluded at the December 1996 Ministerial Meeting in Singapore and took effect July 1, 1997. The ITA was a benchmark agreement that significantly reduced tariffs on a wide range of information technology products. It reduced customs duties on computing and telecommunications equipment and scheduled their elimination by 2000. This covered a whole range of products: computers, keyboards, printers, modems, switching equipment, semiconductors, software, and scientific equipment. In particular, it allowed American companies to sell their IT wares more competitively.
Cisco was particularly aggressive in advocating further liberalization of trade in information technology products through its membership in the American Electronics Association and other industry coalitions. Cisco’s CEO John Chambers was one of several high-tech leaders who served on President Clinton’s Advisory Committee on Trade Policy Negotiations (ACTPN) and helped develop a plan for addressing Internet commerce issues at the WTO.[4] The ITA significantly reduced tariffs on over 90% of information technology products traded globally. The agreement meant savings for U.S. exporters of some $5 billion each year.
The WTO met early the following year in Geneva, Switzerland, to address a new round of trade negotiations on information technologies and telecommunications services.[5] The February meeting sought a new Information Technology Agreement (ITA II) that was intended to further liberalize markets and benefit information technology manufacturers and consumers. In the end, trade negotiators failed to reach an agreement on ITA II because of continuing disputes about which products would be covered. However, an agreement on basic telecommunications services was reached that signaled good news for cheap telecommunications and the internationalization of Internet technologies and the World Wide Web.
Citation APA (7th Edition)
Pennings, A.J. (2014, Oct 19). The Transformation of Telecom to Global IP – GII to WTO. apennings.com. https://apennings.com/global-e-commerce/the-transformation-of-telecom-to-global-ip-gii-to-wto/
Notes
[1] Quote on the WTO from Jill Hills, “U.S. Rules. OK?”, in Robert W. McChesney and John Bellamy Foster (1998) Capitalism and the Information Age: The Political Economy of Global Communication Revolution. p. 101.
[2] Hundt, Reed. (2000) You Say You Want a Revolution? A Story of Information Age Politics. p. 45.
[3] ibid, p. 25.
[4] Cisco would become the world’s largest company by market capitalization in 2000.
[5] Early debate on services inclusion from Jonathan D. Aronson, “Trade Negotiations, Telecom Services, and Interdependence,” in Jussawalla, M. et al. (eds.) Information Technology and Global Interdependence. New York: Greenwood Press. 1989. This book consisted largely of contributions to three conferences entitled TIDE 2000 (Telecommunications, Information and Interdependent Economies) organized by the Japanese Foreign Affairs Ministry and the East-West Center in Honolulu, Hawaii.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor at the State University of New York, South Korea. SUNY Korea offers BS, MS, and PhD degrees from Stony Brook University. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. His first faculty position was at Victoria University in Wellington, New Zealand.
Tags: G-7 > General Agreement on Tariffs and Trade (GATT) > Information Technology Agreement > ISDN (Integrated Services Digital Network) > PTT > Reciprocal Trade Agreements Act > Telecommunications Act of 1996 > Uruguay Round of the GATT > World Trade Organization > WTO
Robin Williams, Dead Poets, and Symbolic Investments in the Virtual Classroom
Posted on | October 16, 2014 | No Comments
Like most of us, I was saddened by the loss of Emmy-, Grammy-, and Golden Globe-winning actor Robin Williams. Here is an excerpt from one of my PhD essays, “Dead Poets and the Lawnmower Man,” that drew on the movie The Dead Poets Society and his excellent performance to investigate virtual reality as an educational tool.[4]
In the movie The Dead Poets Society (1989), the stark contrasts between the closed moral community of the preparatory Welton Academy and the emotional and intellectual capers of John Keating, its new teacher played by Robin Williams, present an opportunity to question the processes of signification (meaning-making) and energetic investments in modern educational environments.
The repressed libidinous and spiritual “economies” of the all-male boarding school invite a reading of The Dead Poets Society that focuses on its sociosignifying practices – how meaning is fixed and organized through processes like language, dress, and action. It is of particular interest in that it refigures the role of the teacher as what Jean-Joseph Goux referred to as a “symbolic third”.
Goux, in his quest for a general economics (beyond money), gives us a strategy for viewing the teacher as a symbolically elevated figure operating as a type of currency. Using his Symbolic Economies, we can see the teacher as a condensation of values that raises his position to that of privileged mediator and arbitrator of intellectual values, meaning, and performance – as well as of the construction of facts and “truth.”
In the Dead Poets Society, the teacher, played by Robin Williams, is a “media event” in the sense that, by elaborating a series of emotionally and intellectually rich forms of signification, he disrupts the school’s anti-erotic sovereignties and traditional forms of educational homage. John Keating is a carefully constructed teacher-character who maintains a credible front to his peers while engaging his students in a series of revaluing exercises. His invoking of the philosophy of “carpe diem” for example, disrupts the ascetic delays of pleasure and self-gratification that serve to channel emotional and intellectual investments into the student subjectivities prescribed by the school’s bourgeois “govern-mentality.”
His unusual behavior and pedagogy invoke a curiosity in his students that addresses their subjugated desires and self-construction. His own past membership in an “ancient” secret society of self-proclaimed poets awakens their dormant dreams of social adventure and expressive identities. This secret knowledge, time-tested by the ancients of their alma mater, promises sexual conquest and alternative forms of imagination. “Spirits soared, women swooned, and gods were created.” By re-presenting the literary classics of Shakespeare and Milton in the voice of macho film star and arch-American John Wayne, he distorts the distinctions between “high” and “low” cultures and encourages the dissolution of aesthetic boundaries that work to solidify not only class distinctions but the socio-energetic rigidifications of emotional affect.
The members of the reincarnated club, the “Dead Poets Society,” organized their meetings in a cave in a nearby forest. There they read unauthorized poetry, smoked cigarettes, and mixed with women – all activities forbidden at the school. As Gebauer points out, the symbology of the cave has never been about the outside world, but about the inside one. “Our imagination remains captive in the cave. We do, in fact, repeatedly seek out the cave in a different form.” Our ontology has its commencement in the topography of the cave, and he points out: “In one way or another, all our notions of paradise are linked with situations of the cave.”[3]
Keating’s enthusiastic ideations soon come into conflict with other domains of symbolic control, however, including the potent Oedipal dynamics that hold too tight a grip on one of his students. In his quest to act in a community play, the student defies his father’s demand that he cut down on his extracurricular activities, forges a permission slip, and performs the lead role of Puck in A Midsummer Night’s Dream.
The father inadvertently discovers the disobedience and shows up at the play to observe. Despite the acclaim and the performance’s obvious success, he fiercely pulls his son away from the backstage party. After a confrontation at home, where, among other things, the mother’s disappointment is invoked to punish the son, the boy is forbidden to act again, at least until he finishes medical school. Faced with this paternal injunction, he takes his own life.
The death of a student presents a moral catastrophe that overpowers Keating’s privileged text of spontaneity and impunity. These are now recoded as degenerate improprieties, and their “unproductive” forms of expenditure are tallied against the teacher as infractions within the Calvinistic ledgers of the schoolmasters. The aggrieved father easily organizes the dismissal of the teacher.
The students respond at the resolution of the film by pledging their allegiance, standing on top of their classroom desks and reciting the title of Walt Whitman’s famous poem, “O Captain! My Captain!”, acknowledging and respecting Keating’s role as their navigator through the uncharted course of adolescent squanderings and discoveries.
The Dead Poets Society reflects the profound symbolic and historic investments structuring traditional education and how the currency of the teacher can facilitate new types of energetic and intellective exchanges. If educational space is to become cyberspace in a socially and politically responsive way, then it behooves us to mark its inception with at least one strategy that is sensitive to the “economies” which mediate and control its energetic and symbolic investments.
Notes
[1] He played a teacher at a private boarding school. The excerpt was part of a section called “Dead Poets and the Lawnmower Man” in Symbolic Economies and the Politics of Global Cyberspaces (1993), about the possibilities of eventually teaching in virtual reality classrooms.
[2] Économie et symbolique, Éd. du Seuil, 1973. (Translation of two of Goux’s books in one volume: Symbolic Economies, Cornell University Press, 1990.)
[3] Gebauer, G. (1989) “The Place of Beginning and End: Caves and Their Systems of Symbols,” In Kamper & Wulf (eds.) Looking Back on the End of the World. (NY: Semiotext(e) Foreign Agents Series). p. 28.
[4] Adding a late note about the use of language as prescribed by one of my favorite mentors, Michael J. Shapiro, who taught us that one objective of writing is to invoke “delirium” in the reader, as in taking them out of their normal channel (Latin: lira = furrow or track) by exploring the edges of intelligibility on a subject.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor of global media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. His first faculty position was at Victoria University in Wellington, New Zealand.
Tags: Dead Poets and the Lawnmower Man > Dead Poets Society > Robin Williams > Virtual classroom > virtual reality
Lotus Spreadsheets – The Killer App of the Reagan Revolution – Part 1
Posted on | September 23, 2014 | No Comments
The major feature of the “Reagan Revolution,” according to Peter Gowan’s Global Gamble, was to “put money-capital in the policy saddle for the first time in decades.”[1] From the time of his presidential inauguration in early 1981 and throughout his eight-year tenure, Reagan’s administration sought to propel the financial sector through widespread policy changes designed to “roll back” the containment of finance instituted during FDR’s reign and the early Cold War. Part of what made this political and economic movement consequential was the development of the electronic spreadsheet and its use on the newly invented personal computers such as the Apple II and IBM PC.
In this post I want to set the context for the success of the spreadsheet, particularly Lotus 1-2-3. In a future post, I will explore the formal aspects of the spreadsheet as a meaning-making application and why they are so effective in social and economic realms. I’m not taking a technological determinist position here, but rather arguing that spreadsheets and related financial technology facilitated the impact of what is sometimes called the “Reagan Revolution.” The major policy characteristics of this economic transformation included:
- the deregulation of banking and financial industries;
- relaxing the laws on anti-trust and corporate acquisitions;
- major tax cuts to privatize surplus wealth;
- securitization of student debt and other financial instruments;
- removing the caps on interest rates that banks could charge on credit cards and other loans;
- increasing US government debt to feed the bond industry and provide an additional hedge for financial risk-taking;
- selling off government agencies and assets;
- strengthening the dollar to help export production capital to low-cost countries;
- pressuring countries around the world to enhance the flow of US-produced news; and
- liberalizing global capital controls.[2]
Also important were the increased military spending and the commercialization of Cold War technology, which facilitated the globalization of capital through fiber optics, microprocessors, and packet-switched data communications. These technologies were primary drivers of financial innovation and economic activity in the 1980s, and their productive legacy continues to shape the global economy.
The ingeniously innovative “microcomputer” spreadsheet, VisiCalc, was created when Dan Bricklin teamed up with his friend Bob Frankston in 1978 on a homework assignment for his Harvard MBA degree. Not surprisingly, it was an assignment to do the calculations for one corporation to take over another that sparked Bricklin’s computing solution. Faced with doing the monotonous calculations on standard green ledger sheets, he fantasized about a calculating tool that combined the usability of a fighter plane’s “heads-up display” with the re-editing capability of a word processor. The two went to work with an Apple II, and the result was a new “visible calculator” technology that rocked the financial world.[3]
The use of the spreadsheet exploded after IBM introduced its own “Personal Computer” in August of 1981. Soon after, Lotus 1-2-3 became available for the “PC” and all the “IBM-compatible” clones such as Compaq and Dell. Lotus 1-2-3 was named for its spreadsheet, graphing, and database capabilities, which combined to produce an extraordinary new facility for both conceptually and textually organizing financial information. Although the earliest PCs were weaker than their bigger contemporaries – mainframes and even the relatively large minicomputers – they had several advantages that increased their usefulness.
The main advantage of the PC-based spreadsheet was its immediacy – it put computing power in the hands of a single user and bypassed the traditional authority structures of the data processing centers organized around mainframes and minicomputers. The microcomputer was characterized in part by its accessibility: it was small, relatively cheap, and available through a number of retail outlets. It used a keyboard for human input, a cathode ray tube monitor to view data, and the newly invented floppy disk for storage. Together they allowed users to input their own numbers and play with different combinations. The main benefit was the new flexibility in the speed and amount of information immediately available. Unlike a spreadsheet on a mainframe, which required trips to the EDP department for each data input change, the PC-based spreadsheet allowed new data to be entered easily via the keyboard and provided immediate results on the screen.
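A minimal sketch of that immediacy, written here in Python rather than in any period spreadsheet language, with invented cell names and figures: change one input and every dependent formula recalculates on the spot, with no round trip to an EDP department.

```python
# Minimal spreadsheet-style recalculation: formulas depend on input cells,
# and changing an input immediately updates every dependent value.
cells = {"revenue": 1_000_000, "cost_ratio": 0.65, "tax_rate": 0.30}

formulas = {
    "costs": lambda c: c["revenue"] * c["cost_ratio"],
    "pretax": lambda c: c["revenue"] - c["costs"],
    "net_income": lambda c: c["pretax"] * (1 - c["tax_rate"]),
}

def recalculate(cells, formulas):
    # Evaluate formulas in order; each one sees earlier results (a flat dependency chain).
    values = dict(cells)
    for name, formula in formulas.items():
        values[name] = formula(values)
    return values

print(recalculate(cells, formulas)["net_income"])   # base case: 245000.0
cells["cost_ratio"] = 0.55                          # the "what-if" edit
print(recalculate(cells, formulas)["net_income"])   # instantly updated scenario: 315000.0
```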
One implication was that frustrated accountants and financial analysts would go out and buy their own computers and software packages, often over the objections or indifference of the EDP department. People could do the calculations themselves, and ignore the bureaucracy.[4] Lotus 1-2-3, with its combination of graphics, spreadsheets, and data management caught the eye of many business entrepreneurs and corporate executives who saw the value of a computer program that simplified the monumental amount of numerical calculations and manipulation needed by the modern corporation. By October 1985, CFO magazine was reporting that “droves of middle managers and most financial executives are crunching numbers with spreadsheet programs such as Lotus 1-2-3.”[5]
Microcomputer-based spreadsheets became ubiquitous in the business world and a major productivity tool. In an era of incredible economic and financial flux, the electronic spreadsheet became the “killer app” that guaranteed the success of the PC industry and provided an incredible new utility for individuals in the financial sphere. They were empowered to create dramatic new numerical calculations and construct new financial “what-if” scenarios in unprecedentedly short timeframes. As the Reagan Revolution took hold, the spreadsheet was there to itemize and measure value, mobilize dormant resources, and place them into the transactional circuits of the global economy.
For example, spreadsheets were used around the world in a process called “privatization,” whereby national assets were minutely valued to produce collateral for international loans or, in the case of state-owned enterprises (SOEs), turned into shares that could then be listed on national or international stock markets and sold. Margaret Thatcher started this process with the sale of British Telecom, and soon after, New Zealand became the international model when it “corporatized” and sold off its telecommunications agency to retire one-third of the accumulated national debt.
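A toy sketch of the kind of tabulation involved in such privatizations, with entirely invented asset categories and figures (actual privatization models were far more elaborate): itemize the assets, net out liabilities, and divide the result into shares for listing.

```python
# Toy privatization worksheet: value a state-owned enterprise's assets and
# convert the total into a share offering. All figures are invented.
assets = {
    "switching_exchanges": 420_000_000,
    "copper_and_fiber_plant": 310_000_000,
    "buildings_and_land": 150_000_000,
    "licenses_and_goodwill": 120_000_000,
}
liabilities = 250_000_000
shares_to_issue = 500_000_000

enterprise_value = sum(assets.values()) - liabilities
offer_price = enterprise_value / shares_to_issue
print(f"Enterprise value: ${enterprise_value:,}")          # $750,000,000
print(f"Indicative offer price per share: ${offer_price:.2f}")  # $1.50
```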
Within a liberalized regulatory infrastructure and an interlinked chain of financial institutions, financial traders eager to become “masters of the universe” quickly adopted the new technology. “Spreadsheet knowledge” began to have an extraordinary ability to capture and fix value in monetary terms. Spreadsheets are not so much a reflective technology as a constitutive and productive one. They do not reveal the world so much as they create new financial meaning by creating and solidifying relationships between previously disparate resources. They were increasingly used by accountants and other financial magicians to construct value in such a way that it could be entered into the flows and accumulation processes of corporations and other organizations enmeshed in the monetary circuits of the global economy.
The PC-based spreadsheet created a new visualization process that combined financial calculation with interactive manipulation in such a way as to help create a new financial-based economic dynamism. It is this combination of financial deregulation and technological innovation that created the trajectory of digital money-capital and enshrined the legacy of the Reagan Revolution. That inheritance lives on in the disparities of debt and wealth so prevalent in today’s dystopic global economy.
In Part II, I will discuss some of the historical precedents that led up to the 1980s as a period of intensifying financialization that welcomed the use of the PC-based spreadsheets. The corporate environment was particularly vulnerable to a variety of financial raids enhanced by spreadsheet technologies like Lotus 1-2-3.
In later posts, I will examine the more formal aspects of how the spreadsheet works by using a combination of cultural and media analysis to explore its internal machinations and external implications. An important question in this inquiry concerns “spreadsheet capitalism”: the role of these calculative devices in organizing and evaluating the financial information that is central to modern organizations and the global political economy.
Citation APA (7th Edition)
Pennings, A.J. (2014, Sep 23). Lotus Spreadsheets – The Killer App of the Reagan Revolution – Part 1. apennings.com. https://apennings.com/how-it-came-to-rule-the-world/spreadsheets-the-killer-app-of-the-reagan-revolution-part-1/
Notes
[1] Peter Gowan’s (1999) Global Gamble: Washington’s Faustian Bid for World Dominance.
[2] US debt tripled under Reagan to over $2 trillion. Notable liberalization of global money flows occurred when Reagan addressed eurodollars by allowing onshore facilities. This list is compiled from my work on How IT Came to Rule the World which examined the Reagan legacy in such entries as From Sputnik Moment to Reagan Revolution and How Star Wars and Japan’s Artificial Intelligence Threat Led to the Internet.
[3] Bricklin quote from (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 30.
[4] Steven Levy’s “A Spreadsheet Way of Knowledge” was an early influence. So much so that I asked one of my NYU students to create the linked website. It was originally published as Chapter 10 in Tom Forester’s (ed.) Computers in the Human Context: Information Technology, Productivity and People. Basil Blackwell. 108 Cowley Road, Oxford OX4 1JF, UK.
[5] Quote from CFO on the impact of Lotus 1-2-3 in the corporate world from David M. Katz, “The Taking of Lotus 1-2-3? Blame Microsoft.” CFO.com. December 31, 2002.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a Professor in the Department of Technology and Society, State University of New York, Korea where he teaches digital economics. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Global Gamble > Lotus 1-2-3 > Reagan Revolution > spreadsheet > spreadsheet capitalism > VisiCalc
Management and the Abstraction of Workplace Knowledge into Big Data
Posted on | August 30, 2014 | No Comments
The factory of the future will have only two workers: a man and a dog. The human being’s job is to feed the dog, whose function is to keep the man away from the machine. – Warren Gamaliel Bennis
Understanding information technologies and the emergence of “big data” in the workplace requires some scrutiny of work processes, the relationship between labor and human bodies, and the historic role of management. In particular, how have a worker’s laboring activities been transformed into knowledge that can be collected, analyzed, and used by managers? What are the implications of this abstraction of labor and its transformation into abstract data and technology-assisted management?
This post looks at how industrial intelligence has been removed from the site of the working body and relocated to the electronic space of cybernetic analysis, control, and communication. It also discusses how this process has been transferred to other aspects of economic and social life and has become part of a new phenomenon called “big data.”
Shoshana Zuboff’s In the Age of the Smart Machine: The Future of Work and Power (1988) was one of the more interesting inquiries into the processes of computerization and electronic communications to emerge out of the 1980s. It was a significant contribution to the organizational and sociological discussion on the way information technologies were being used in manufacturing and service sectors. One of her main contributions, the verb “informating,” provided important insights into the key practice of the new technologies and the construction of digital data in the cybernetic age.
Analyzing pre-Internet computerized environments, she identified informating as the process of digitally registering a wide range of information related to computer tasks. She both connected and compared informating to the processes of automating. Computers have historically been involved in automating – the process of replacing human activities and work with machinery. Zuboff distinguished automating from informating because the latter “produces a voice that symbolically renders events, objects, and processes so that they become visible, knowable, and shareable in a new way.”[1] Consequently, informating is an effective concept for approaching that vast data gathering and analysis project that is currently consolidating a wide range of structured and unstructured data from throughout the cybersphere.
The data collection processes involved in computerization are significant. They lead to an accumulation of information that is intimately related to the individual and yet essential for the continuance of modern private and public bureaucracies. As they monitor the various activities of everyday life as well as industrial production, they also keep a record that can be accessed or fed into larger databases across the Internet. For example, in a supermarket, your groceries’ barcodes are read and fed into a computer. Not only does the system tabulate the price, but it also enters the information into database files for inventory, finance, and marketing that can later be analyzed, examined, graded, and shared. In conjunction with loyalty programs and recommendation engines, big data is used by supermarkets to identify customer attributes like parenting or gluten-free preferences and to tailor digital coupons and other promotions through email and social media.
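A minimal sketch of that double registration, with invented product codes, prices, and field names: the same scan event that prices the basket also “informates” inventory, finance, and marketing records.

```python
# One barcode scan both completes the sale and leaves an analyzable data trail.
# Product codes, prices, and table names are invented for illustration.
from datetime import datetime, timezone

catalog = {"0490001": ("oat milk", 3.49), "0490002": ("gluten-free bread", 5.99)}
inventory = {"0490001": 120, "0490002": 45}
sales_log = []            # later fed to finance and marketing analytics

def scan(barcode, loyalty_id=None):
    name, price = catalog[barcode]
    inventory[barcode] -= 1                      # informates stock control
    sales_log.append({                           # informates finance/marketing
        "ts": datetime.now(timezone.utc).isoformat(),
        "sku": barcode, "item": name, "price": price,
        "loyalty_id": loyalty_id,
    })
    return price                                 # what the register needed anyway

total = scan("0490002", loyalty_id="member-831") + scan("0490001")
print(total, inventory, sales_log[0]["item"])
```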
Zuboff’s concern with the codification of workplace intricacies into machine-compatible texts opened up a range of inquiry that is applicable to other facets of modern life. Drawing on what she terms the dual capacity of information technology (its ability to both automate and informate productive activities), she was able to analyze how technology changes the practices of work, managerial authority, and the supervision of employees. The “Internet of Things” (IoT), a connective network of electronic devices or “things” embedded with microelectronics, algorithmic software, and multi-faceted sensors, collects and exchanges data from dispersed objects in cloud-based data facilities for analysis. For example, “solar roads” that collect sunlight for electricity will be equipped with sensors that monitor highway conditions and alert oncoming cars as well as transportation authorities. If there is debris or snow on the road, sensors in the smart pavement will detect it and relay the data.
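A minimal sketch of that sensing-and-alerting loop, with hypothetical thresholds, topic names, and a stand-in publish function in place of a real cloud connection:

```python
# Hypothetical smart-pavement sensor loop: read local conditions, report them
# upstream, and flag hazards for oncoming traffic. Thresholds are invented.
import json, random, time

def read_sensor():
    # Stand-in for real hardware: returns surface temperature and a debris signal.
    return {"surface_temp_c": round(random.uniform(-5, 30), 1),
            "debris_detected": random.random() < 0.05}

def publish(topic, payload):
    # Placeholder for an MQTT/HTTP push to a cloud data facility.
    print(f"[{topic}] {json.dumps(payload)}")

for _ in range(3):                               # a few sample readings
    reading = read_sensor()
    publish("highway/segment-42/telemetry", reading)
    if reading["debris_detected"] or reading["surface_temp_c"] < 0:
        publish("highway/segment-42/alerts", {"hazard": True, **reading})
    time.sleep(0.1)
```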
The simultaneous growth of industry and bureaucracy at the beginning of the twentieth century created new demands for skills, machinery and control mechanisms that could be implemented in the workplace. Industrialism was maturing as was the consumer society, and manufacturing was gearing up for mass production. Work and workers became objects of intense study so that their skills and knowledge could be abstracted and translated into new work procedures and technologies. This process also created a growing class of managers whose job it was to study, refine and supervise these processes.
Frederick Taylor emerged as the leading figure in the trend towards observing, describing, and then systematizing workers’ skills so that they could be “re-engineered,” to use a modern buzzword. Taylor’s studies of minute worker activity led to “time studies” designed to refine muscular movement in manufacturing and other work activities. They were also meant to “provide the quantitative empirical basis for a more rationalized control of industrial production.”[4] In Zuboff’s terms: “Taylorism meant that the body as the source of skill was to be the object of inquiry in order that the body as a source of effort could become the object of more control.”
She elaborates on the use of the information:
Once explicated, the worker’s know-how was expropriated to the ranks of management, where it became management’s prerogative to reorganize that knowledge according to its own interests, needs, and motives. The growth of the management hierarchy depended in part upon this transfer of knowledge from the private sentience of the worker’s active body to the lists, flowcharts, and other systems of measurement in the planner’s office.[2]
Taylor’s work was published as The Principles of Scientific Management (1911). His ideas were a major inspiration for the efficiency movement that sought to identify and eliminate waste in all social and economic areas of what would be called the Progressive Era in the US.
Taylor’s “scientific management” ideas were never implemented by any one company without some modification. However, Henry Ford was able to simplify the process with his moving assembly line for automobile production. He implemented a series of conveyor belts, overhead rails, and planned sequences that would keep production in constant flow. Modeled on the Midwest’s great meat-packing “disassembly” lines, Ford aspired to the ease with which oil and other liquids and gases could be moved and processed.[4]
By further reducing the need for physical effort and skill, Ford was able to develop economies of scale and create the gigantic new automobile industry, one that could grow to include new unskilled immigrants and rural laborers. One of the costs involved, however, was the loss of skilled labor. Workers’ capabilities became “congealed” in the machinery, in the sense that their energies and skills were designed into the machinery of production, including robots. Also, one working body could easily be replaced by another. Often the benefit was an easier job for the worker in terms of physical toil, but it came at the price of autonomy and negotiating power.
Managers facilitated the movement of bodily effort and skill into the machines and industrial techniques and then expanded into the intellectual domain of the owner/executive. Workers and managers operate with different types of literacies. Workers have generally been body-oriented, utilizing the action-centered skills developed in physical labor. They develop implicit knowledges gained through performance and learned by observation and imitation. Zuboff called the activities in which laborers use their bodies to work on materials and tools “acting-on.” Whether stirring paper pulp, operating a forklift, or typing on computer keyboards, their major concern is with working with things rather than people.[5]
Conversely, white-collar workers use their bodies in significantly different ways. Although differences occur between top managers and middle managers, she uses the term “acting-with” to distinguish managers’ main responsibilities from the “acting-on” activities that monopolize workers’ time. Top managers are also very much engaged in bodily activities, but primarily those that call on their abilities to interact with other people. Bodily presence, manifested primarily through the voice but also through dress and non-verbal behaviors, is key to their success. Face-to-face verbal interchange, culling gossip, opinion, hearsay, and physical cues while transmitting in a way that heightens personal charisma and sociability, is a primary responsibility of top managers. Zuboff returns to the word “sentience” to describe the way top managers develop a “feel” for people and situations.
Zuboff’s study of working environments was conducted in the era of traditional databases that collected, sorted, and retrieved data according to prescripted formats and stored it on a mainframe’s magnetic tape. With the introduction of the Internet, cheap servers, data centers, and software solutions like Hadoop, a new system became possible. It was feasible to collect unstructured data from mobile devices, PCs, and the whole Internet of “things” in the workplace. Information from data sources such as environmental sensors, production schedules, and timesheets increasingly became fodder for analysis and innovative value creation.
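A miniature version of the map-and-reduce pattern popularized by Hadoop, applied to the kind of workplace records described here (the timesheet entries and field names are invented):

```python
# Map/reduce in miniature: scatter per-record key-value pairs, then aggregate.
# The timesheet records and field names are invented for illustration.
from collections import defaultdict

timesheets = [
    {"worker": "a07", "line": "pulp-3", "hours": 7.5},
    {"worker": "b12", "line": "pulp-3", "hours": 8.0},
    {"worker": "a07", "line": "pulp-1", "hours": 2.0},
]

def map_phase(record):
    yield (record["line"], record["hours"])      # key each record by production line

def reduce_phase(pairs):
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value                     # aggregate hours per line
    return dict(totals)

pairs = [kv for rec in timesheets for kv in map_phase(rec)]
print(reduce_phase(pairs))                       # {'pulp-3': 15.5, 'pulp-1': 2.0}
```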
She also drew on the politics of Michel Foucault, who focused, in part, on the “panopticon” and the procedures of examination and file-building that were crucial for the exercise of modern power. The examination works to hold its subjects of attention “in a mechanism of objectification.”[6] Examination turns the economies of surveillance and visibility into an operation of control. It proceeds by textualization, the informating of data according to a set of prescribed protocols and knowledges. The file has an agenda; it is not just a loose collection of random documents. Under this official gaze, individuals become blank slates to be evaluated, classified, and registered in the official system of files. Max Weber had already identified the file as crucial for the organization of bureaucracy. The examination that places individuals in a field of surveillance also situates them in a network of big data collection; it engages them in “a whole mass of documents that capture and fix them.”[7]
These “cybernetic identities” are characteristic of the information age where the proliferation of multimediated information is changing the way people operate in the arenas of their lives. Furthermore, since information technology is largely developed out of institutional requirements, it is inherently political. Cybernetic identities are connected to the great bureaucratic spaces of credit, education, and production. They are the result of types of observation, classification, and registration. They result from a penetrating gaze which codes, disciplines, and files under the appropriate heading. Actions lose their actuality, and bodies lose their corporeality.
Mark Poster used Foucault to think about the consequences of computer databases on subjectivity and its multiplication of selves to feed an extensive array of organizational files. He was less concerned with databases as “an invasion of privacy, as a threat to a centered individual, but as the multiplication of the individual, the constitution of an additional self, one that may be acted upon to the detriment of the ‘real’ self without that ‘real’ self ever being aware of what is happening.” The texture of postmodern subjectivity is dispersed among multiple sources of information production and storage. In The Mode of Information, he warned of the “destabilization of the subject,” a fixed self no more but rather one “multiplied by databases, dispersed by computer messaging and conferencing, decontextualized and re-identified by TV ads, dissolved and materialized continuously in the electronic transmission of symbols.”[8] In an age when Google wants to “organize the world’s information,” we are still trying to determine the implications of that multiplication of identity within the networks of institutional power.
Notes
[1] Zuboff, Shoshana. In the Age of the Smart Machine: the Future of Work and Power. New York: Basic, 1988. Print., p. 9.
[2] Beniger, James. The Control Revolution: Technological and Economic Origins of the Information Society. 1986; p. 294.
[3] Zuboff, S. In The Age Of The Smart Machine: The Future Of Work And Power. 1988; p. 43.
[4] Beniger, James. The Control Revolution: Technological and Economic Origins of the Information Society. 1986; p. 298-299.
[5] Distinctions between “acting-on” and “acting-with” from Zuboff, S. In The Age Of The Smart Machine: The Future Of Work And Power. 1988; p. ??.
[6] Rabinow, Paul, comp. The Foucault Reader. London: Penguin, 1991. Print., p. 200-201.
[7] Poster, Mark. The Mode of Information: Poststructuralism and Social Context. Chicago: University of Chicago, 1990. Print. p. 98
[8] Poster, Mark. The Mode of Information: Poststructuralism and Social Context. Chicago: University of Chicago, 1990. Print. p. 15.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor of global media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii.
Tags: Big Data > Frederick Taylor > Informating > scientific management > Shoshana Zuboff > Taylorism
Discerning Media Economics
Posted on | August 25, 2014 | No Comments
The area of media economics has made important contributions to understanding traditional media such as newspapers, television, and radio, but has only recently addressed the capabilities and consequences of digital media. This post investigates two interrelated areas, media economics and the political economy of communications/media, and then begins to apply them to the realm of new digital media and the web by reviewing some important contributions to the field.
When I arrived at New York University in the wake of the dot.com crash, one of the first courses we created was the Political Economy of Digital Media. The hubris of the “New Economy” met with a bitter setback in those first few years of the new millennium as hundreds of new companies in New York City’s “Silicon Alley” and around the country ran out of cash and gullible investors. So the course became a foundation for a digital media management program that was more attuned to the economic realities of New York City’s very dynamic but competitive media environment. The course combined microeconomic concerns about the management and operations of digital media firms with the larger macroeconomic issues of the emerging “new media” industry and its relationship to employment patterns, investment activities and international trade.
Gillian Doyle’s Understanding Media Economics (2008), for example, examined these different media industry sectors, including film and “new media,” but lacked a comprehensive understanding of the role of digitalization and its impact on the convergence of these industry sectors. A better approach was pursued in Media Economics: Applying Economics to New and Traditional Media by Colin Hoskins, Stuart McFadyen, and Adam Finn, which organized its inquiry into media activities by economic areas such as supply and demand, consumer behavior, production, and market structure. However, it still relied heavily on the analysis of traditional media, with little more than token references to digital media and the Internet.
The political economy of communication/media genre has admirably placed emphasis on the role of media in society, problems associated with monopoly, and tensions in the workplace; but it has also relied on the traditional mass media model and has failed to connect with significant audiences despite its major goal of mobilizing for social action and political intervention. Critical texts like The Business of Media: Corporate Media and the Public Interest (2006) by David Croteau and William Hoynes, while providing a very useful discussion of the role of citizen knowledge and the public sphere, failed to anticipate key aspects of digitalization, social media, and netcentric commerce that are radically changing the news industry and the organization of online knowledge.
Many of the political economy of media texts are written from a Marxist perspective, providing interesting social insights but organizing their critiques around claims to an internal validity that have not been sufficiently substantiated. Furthermore, they over-utilize insular language that reduces external validity – their applicability to contemporary issues, and thus their relevance to the activism needed to address and confront the social problems brought on by new media. Vincent Mosco’s The Political Economy of Communication (2009), for example, was presented as a major rewrite of his classic manuscript of the same name, but it has been criticized for its adherence to an economic reductionism forged in an era of durable-goods manufacturing and an insular debate with cultural studies. It neglects to apply its analysis to the web economy, where the “click” is a new form of laboring.
Robert McChesney’s The Political Economy of Media: Enduring Issues, Emerging Dilemmas (2008), while reminding us of the problems of a media-saturated society such as censorship, propaganda, commercialism, and the depoliticization of society, failed to address the relationship of media to economic sustainability and innovation, creative expression, and learning and education. As a result, his emphasis on the “critical” path of media scholarship, while dismissing what he disdainfully refers to as the “administrative” path of communications, hasn’t framed its arguments in a manner that reaches students confronting the economic issues of their lives, or practitioners in the field facing highly complex design and implementation problems.
He is such a major contributor to the area, though, that it is hard to be too critical of his stance, and I invite the reader to look at McChesney’s considerable body of work at Amazon. Likewise, take a look at Gillian Doyle’s newer (2013) Understanding Media Economics, as she made an interesting transition from examining separate media areas like film, print, and television to looking at the characteristics of a more converged digital environment. With more emphasis on network effects, technological disruption, and the economics of content distribution, her analysis transcends some of the traditional barriers between these various media.
This area of economic inquiry is very promising for the future as it now encompasses a wide realm of digital media activities, going beyond traditional media to incorporate e-commerce and a number of other digital applications from drone journalism to quantifying health technologies. What is particularly exciting is the possibility of combining the development of individual skills and productive capabilities with exposure to progressive, socially conscious media and a new dimension of overall economic analysis.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor of global media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. His first faculty position was at Victoria University in Wellington, New Zealand.
Tags: Media Economics > Political Economy of Communication > Political Economy of Media
Reviewing Castells’ Global Automaton
Posted on | August 3, 2014 | No Comments
In my long-term quest to find some answers as to what constitutes the techno-informational framework of the global financial system, I ran across Manuel Castells’ description of the “Automaton” a number of years ago. He wrote a chapter called “Information Technology and Global Capitalism” in Global Capitalism (2000) where he made some linkages between the global financial system and popular imagery about machinery and robots. He makes several incisive points about how trading in financial instruments, in conjunction with the surveying force of the informational/news infrastructure that supports it, has turned into a sort of entity for its own sake. The trading of bonds, currencies, derivatives, and stocks in digital markets around the world has become a disciplinary mechanism that holds governments, corporations, and other entities in its binds. In several posts I have used Walter Wriston’s “information standard” in a similar way, but I think it is useful to review Castells’ perspective:
The outcome of this process of financial globalization may be that we have created an Automaton, at the core of our economies, decisively conditioning our lives. Humankind’s nightmare of seeing our machines taking control of our world seems on the edge of becoming reality, not in form of robots eliminating our jobs or government computers that police our lives, but as an electronically based system of financial transactions.[1]
While that ending falls a bit flat, it is useful to pick up on that last word — transactions. The point Castells wants to make is that transactions do not equal markets. He says, “While capitalists, and capitalist managers, still exist, they are all determined by the Automaton. And this Automaton is not the market. It does not follow market rules — at least not the kind of rules based on supply and demand which we learned from our economics primers.”
Equilibrium-based market concepts have never worked exactly as basic economic theory predicts in the financial area, because speculative forces make rising prices attractive. Instead of reducing demand, as the “invisible hand” would dictate, rising prices increase demand. As prices continue to rise, they allow an asset bubble to form. And we know what happens to bubbles.
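To make that feedback loop concrete, here is a minimal toy sketch in Python. It is my own illustration, not a model from Castells or any economics text, and the parameter values are arbitrary assumptions; it only shows that when recent gains attract further buying strongly enough, the price compounds upward into a bubble instead of settling back toward a notional fundamental value.

```python
# Toy illustration of speculative feedback (not Castells' model; all
# parameter values are arbitrary assumptions chosen for the example).

def step(price, momentum, feedback, mean_reversion=0.02, anchor=100.0):
    """One period of a simple speculative price process.

    feedback       -- how strongly recent gains attract new demand
    mean_reversion -- weak pull back toward a 'fundamental' anchor value
    """
    change = feedback * momentum - mean_reversion * (price - anchor)
    return price + change, change

for feedback in (0.5, 1.1):          # weak vs. self-reinforcing demand
    price, momentum = 100.0, 1.0     # start with a small initial gain
    path = [price]
    for _ in range(30):
        price, momentum = step(price, momentum, feedback)
        path.append(round(price, 1))
    # With feedback < 1 the price drifts back toward the anchor;
    # with feedback > 1 each gain feeds the next and the path keeps climbing.
    print(f"feedback={feedback}: {path[::10]}")
```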
In a world where banks and hedge, mutual, and sovereign wealth funds transact in sums totaling tens of trillions of dollars a day, Castells contends that the motivations involved in financial transactions are complex. “Movements in financial markets are induced by a mixture of market rules, business and political strategies, crowd psychology, rational expectations, irrational behavior, speculative manoeuvres and information turbulences of all sorts.” For Castells, this “Automaton” creates a “collective capitalist” system that operates with its own set of conditions.[2]
For Castells, this Automaton increasingly controls the operation of global financial markets. Furthermore, he suggests that the growing dependence on computer systems in the financial world means that the global economy is, for all practical purposes, on the cusp of moving beyond the control of individuals, corporations, or governments. Felix Stalder has raised questions about Castells’ analysis of informational capitalism in his Manuel Castells and the Theory of the Network Society (2006). He questions the claim that the economy is beyond the control of anyone and specifically asks, “What do we win, and what do we lose, when we call the financial markets an ‘automaton’?”[3]
To address these concerns, it is useful to outline the ways in which Castells argues these financial markets have evolved over the years.
One is that electronic transaction systems allow for the fast movement of capital instruments between countries. With the advent of computer systems and a world networked via fiber-optic cables and artificial satellites, these transfers have become nearly instantaneous. I would add that this is not just a technological feat but one that required a dramatic transformation in the way telecommunications enterprises are organized, and also in the way they relate to their respective governments. The privatization of telecom operations around the world has allowed them to modernize with Internet-based technologies and largely transcend national borders. International institutions, such as the IMF and the World Trade Organization (WTO), have applied considerable pressure on countries to adopt policies that encourage global interconnectivity and allow for unfettered capital flows.
Second, these global systems permit investors to rapidly make trades and transfer capital from one country to another in search of optimal returns. Financial traders have always had incentives to invest in the newest and fastest technology. From carrier pigeons to the telegraph and stock ticker to modern computers and satellites, innovations in financial technology can provide a commercial advantage. Speed is a strategic priority for trading activities, and high-frequency trading (HFT) is the latest step in this historic trend. HFT uses sophisticated tools and proprietary computer algorithms to move in and out of financial positions in fractions of a second.
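To give a sense of what such an algorithmic decision rule looks like, here is a deliberately simplified Python sketch of a moving-average crossover strategy. It is my own toy illustration, not an actual HFT system; real high-frequency operations run on co-located hardware at microsecond timescales, and the tick prices below are made up.

```python
# Toy crossover rule: buy when the short-term average rises above the
# long-term average, sell when it falls back below. Illustrative only.
from collections import deque

def crossover_signals(prices, fast=3, slow=8):
    """Yield (price, action) pairs from a fast/slow moving-average crossover."""
    fast_win, slow_win = deque(maxlen=fast), deque(maxlen=slow)
    position = 0  # 0 = flat, 1 = long
    for p in prices:
        fast_win.append(p)
        slow_win.append(p)
        if len(slow_win) < slow:          # not enough history yet
            yield p, "wait"
            continue
        fast_avg = sum(fast_win) / len(fast_win)
        slow_avg = sum(slow_win) / len(slow_win)
        if fast_avg > slow_avg and position == 0:
            position = 1
            yield p, "buy"
        elif fast_avg < slow_avg and position == 1:
            position = 0
            yield p, "sell"
        else:
            yield p, "hold"

ticks = [100, 100.2, 100.1, 100.4, 100.6, 100.5, 100.9, 101.2,
         101.0, 100.7, 100.4, 100.1, 99.8, 99.9, 100.0]  # made-up tick data
for price, action in crossover_signals(ticks):
    if action in ("buy", "sell"):
        print(price, action)
```

An actual trading engine would layer order routing, risk limits, and latency optimization on top of a signal like this; the sketch only shows the shape of the automated move-in, move-out logic.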
Third, an important aspect of modern capitalism is that its representational systems and virtual markets allow for nearly instantaneous translation between types of financial instruments. Bonds can be sold quickly and the proceeds invested in gold, or oil positions can be liquidated to purchase shares of a company. Furthermore, different countries offer their own bonds and currencies, along with a range of derivatives as well as traditional shares of listed corporations.
The development of a wide array of new financial instruments, such as the collateralized debt obligations (CDOs) and credit default swaps that ruined the housing market and brought the global economy to its knees in the “Great Recession” of 2008, has added to global volatility. This instability further intensifies the need for portfolio diversification, which involves mixing investments across a range of different countries, as risk-management scenarios call for hedging financial bets across as many global markets and products as possible.
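The logic behind spreading bets across many markets can be shown with standard textbook portfolio arithmetic (my own illustration, not drawn from Castells); the volatility and correlation figures below are assumptions chosen for the example.

```python
# Volatility of an equally weighted portfolio of n identical assets,
# each with the same volatility and the same pairwise correlation.
# Standard portfolio-theory arithmetic; the numbers are illustrative.

def portfolio_volatility(n_markets, asset_vol, avg_correlation):
    variance = asset_vol ** 2 * (1.0 / n_markets +
                                 (1.0 - 1.0 / n_markets) * avg_correlation)
    return variance ** 0.5

for n in (1, 2, 5, 20):
    for rho in (0.2, 0.8):
        vol = portfolio_volatility(n, asset_vol=0.25, avg_correlation=rho)
        print(f"{n:>2} markets, correlation {rho}: portfolio volatility {vol:.3f}")

# With low correlation, adding markets cuts volatility sharply; when markets
# move together (high correlation), diversification buys far less protection,
# which is why global hedging reaches across so many products and countries.
```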
The system is facilitated by networks of global mass media that analyze and broadcast financial information that can instantaneously impact a wide range of financial decisions. I recently gave a presentation about the surveying aspects of financial media in Seoul, South Korea. I pointed out how news in general works to provide a type of surveillance of society and is becoming increasingly sophisticated with representational techniques that convey all sorts of statistical and graphical information relevant to financial transactions. The system provides a variety of general political and macroeconomic information as well as immediately actionable intelligence. Social media has now become part of the financial world, providing tweets and viral shares of news items that are potentially consequential to the pricing of financial instruments.
Another trend is the use of quantitative algorithms to discern patterns in the frenetic energies of the global markets. Wall Street has increasingly been influenced by “quants,” a new type of financial trader more reliant on computer modeling than on the gut-based decision-making built intuitively through years of “pit” experience. Castells sums it up: “All these elements are recombined in increasingly unpredictable patterns whose frantic modeling occupies would-be Nobel Prize recipients and addicted financial gamblers (sometimes embodied in the same persons).”
In short, Castells followed Walter Wriston in proposing that interconnected financial news and transaction networks, along with domestic and international deregulation, have created a radical and potentially unstable global system. They both argued that the increased flow of market-related media information across national borders and the torrent of financial transactions totaling trillions of dollars a day make the specter of financial machines taking control of our political economy an unnerving possibility.
Returning to Felix Stalder’s question above: we win a conceptualization of the relatively invisible global financial networks, an understanding that is more political than the traditional narrative of free enterprise and self-regulating markets. However, we lose a way of allocating responsibility for the system, and thus a focus for policy concerns. I plan to stick with the information standard as an integrative concept in this area because I like its linkage to the gold standard and feel it is more flexible in suggesting a path of analysis and reform.
Notes
[1] The “Automaton” was first named in a chapter entitled “Information Technology and Global Capitalism” in Global Capitalism (2000), a volume edited by Will Hutton and Anthony Giddens.
[2] Quote on what drives the “collective capitalist” system from M. Castells, “Information Technology and Global Capitalism,” in W. Hutton and A. Giddens (eds.), Global Capitalism. NY: The New Press, 2000, p. 57.
[3] Felix Stalder. Manuel Castells and the Theory of the Network Society. Polity Press, 2006.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor of global media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. His first faculty position was at Victoria University in Wellington, New Zealand.
Tags: Automaton > financial technology > high-frequency trading (HFT) > Information Standard > Manuel Castells > quantitative algorithms
New York City’s Emphasis on Global Media Management
Posted on | July 11, 2014 | No Comments
Several years ago I started exploring whether it was prudent to create a degree program in Global Media Management. The idea was reinvigorated by being here in South Korea, in a program focused on global issues and skills. So I’m going back to some of my initial research from when I was in New York City, when Mayor Bloomberg and others recognized the need for skills and additional emphasis in this area.
“New York City is the media capital of the world, but—with the industry undergoing profound changes—it’s incumbent on us to take steps now to capitalize on growth opportunities and ensure we remain an industry leader,” warned New York City Mayor Michael Bloomberg, himself the founder of a multi-billion-dollar media empire, while announcing several initiatives to help traditional media workers transition to the digital sector.
The initiatives came out of MediaNYC 2020, a program that gathered media and information technology executives along with government officials and university faculty to develop a NYC-based research lab, create digital media apprenticeships, offer a technology-equipment bond program to provide tax-exempt financing for media and technology companies, and award fellowships to 20 “rising star” digital media entrepreneurs every year.[1]
One very active participant in the Media NYC 2020 initiative was The Levin Institute, a free-standing institution of the SUNY system dedicated to researching global issues. It received a $300,000 grant from the Carnegie Institute to conduct a research and public engagement project on the dynamics of globalization called New York in the World. The Levin Institute and the Economic Development Corporation of New York City (NYCEDC) hosted a panel discussion in 2009 called Media NYC 2020: NYC as a Global Media Center. It was part of the MediaNYC 2020 initiative and laid out the history of the media industry in NYC and the challenges to its major legacy industries: print, television, and advertising.
The panel followed a previous study examining the specific implications for education in the area of global media management. The Levin Institute interviewed more than 25 corporate practitioners, professors, and students in the media industry to gain an understanding of the issues and challenges related to globalization facing the industry. These research strategies ranged from exploratory conversations with the leaders of global firms to in-depth critical incident assessments from leading analysts and additional input from consultants. Their conclusion: “The findings from this research confirmed and validated the urgent need and nuanced demand for a specialized, unique program in Global Media.” They elaborated:
SUMMARY OF FINDINGS
Across the board, our data gathering has revealed a critical deficit in global media talent. As the industry weathers the revolutionizing effects of consolidation and digitization, media companies both big and small must grow and innovate across borders and platforms in order to survive. This requires both region-specific and medium-specific knowledge; managers with a deep understanding of the dynamics of foreign markets and the singularity of multiple medias, who are capable of solving contextual issues and forging valuable partnerships. According to our conversations, these managers have been difficult to find.
[2]
So the challenge is to continually define and address the talent needs in the global media sector. I recently defined several digital media “archetypes” of skill sets needed, including:
- Design;
- Technology/Programming;
- Business Management;
- Communications;
- and Analytics.
I might add Global Acumen to that list. Understanding the challenges and opportunities for digital firms operating, at least in part, globally requires strong localization skills as well as the ability to scale operations and platforms across wide geographical and temporal spans. It’s a big world with a lot of different regions, countries, economies, and cultures. I usually start my students off with several (4-10) required viewings of The Commanding Heights so they have some understanding of the dynamics of global political economies.
I like the phrase “the singularity of multiple medias” mentioned above. Going back to New York City, I commend Cornell University’s new program in “Connective Media.” At NYU I created BS programs in both Digital Communications and Media and Information Systems and worked hard to integrate them. Cornell has partnered with the Technion-Israel Institute of Technology to offer an MS in Information Systems with a specialization in Connective Media. It looks to address the analytics component mentioned above by integrating the expertise of software engineers and data scientists with that of media content designers, production teams, and editorial staffs.[3]
As I mentioned in “Producing Digital Content Synergies,” media firms in this new digital environment are increasingly combining multiple sets of skills and expertise to cross-produce and cross-promote content concepts (think Harry Potter series or The Hunger Games) across the organization or in cooperation with other firms. This means utilizing a wide range of available production and post-production resources to develop, package, distribute and monetize cultural products and other digital properties and promote them in a number of global/local markets.
While this emphasis on global media management reminds me of the “Silicon Alley” phenomenon in NYC during the dot-com era, it also makes me wonder to what extent those companies had the idea right but simply had too much money and not enough time to develop the technology and business skills to make it work.
Notes
[1] This post “New Initiatives Will Help NYC Continue as the Global Media Capital in the Digital Age” on Bloomberg’s personal blog is an informative update on these initiatives.
[2] The summary of findings is from http://www.levin.suny.edu/global-media.cfm, accessed on 8/14/09; the link appears to be no longer available.
[3] More information on Cornell’s new presence in NYC and Connective Media.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor of global media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii.
Tags: cross-production > cross-promotion > Global Media > Media Management > MediaNYC 2020 > Silicon Alley