Origins of Currency Futures and other Digital Derivatives
Posted on January 13, 2016
Thus, it was fitting that Chicago emerged as the Risk Management Capital of the World—particularly since the 1972 introduction of financial futures at the International Monetary Market, the IMM, of the CME.
– Leo Melamed, address to the Council on Foreign Relations, June 2004
International currency exchange rates began to float in the post-Bretton Woods environment of the early 1970s. The resulting volatility stemmed from US President Richard Nixon’s decision to end the dollar’s convertibility to gold and from a new uncertainty in the international sphere, produced in particular by the two oil crises and the spread of “eurodollars.”
Multinational corporations and other enterprises grew increasingly uneasy as prices for the foreign monies they needed fluctuated significantly. Those requiring another country’s money at a future date were unnerved by news of geopolitical and economic events. Tensions in the Middle East, rising inflation, and the US military failure in Southeast Asia were among the major factors creating volatility and price swings in currency trading. This post introduces these new challenges to the global economy and shows how financial innovation tried to adapt to the changing conditions by creating a new system for trading currency derivatives.
Currency trading had long been a slow, unglamorous job in major banks around the world. Traders watched the telegraph, occasionally checked prices with another bank or two, and conducted relatively few trades each day. But with the end of the Bretton Woods controls on currency prices, a new class of foreign exchange techno-traders emerged in banks around the world. Armed with price and news information from their Reuters computer screens, and with trading capabilities enhanced by data networks and satellite-assisted telephone services, they began placing arbitrage bets on the price movements of currencies around the world. Currency trading was transformed into an exciting profit center and career track, but the resulting price volatility raised concerns about additional risks associated with the future cost and availability of foreign currencies.
It was a condition that did not go unnoticed in Chicago, the historical center of US commodity trading. Members of the Chicago Mercantile Exchange (CME) in particular wondered whether they could develop and trade contracts in currency futures. Enlisting the help of economist Milton Friedman, the traders at the CME lobbied Washington, DC to let them break away from their usual trading fare and transform the financial markets. It helped that George Shultz, another former University of Chicago professor, was then Secretary of the Treasury.
In early 1972, the CME created the International Monetary Market (IMM) to provide futures contracts on six foreign currencies and to spark an explosion of new financial products “derived” from underlying instruments like Eurodollars and Treasury bills.
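To make the hedging logic concrete, here is a minimal sketch of how a currency futures position can lock in an exchange rate for someone who must pay a foreign-currency bill later. The currency, rates, and contract size are hypothetical illustrations, not figures from the IMM.

```python
# Hypothetical illustration: a US importer owes 1,000,000 Deutsche marks in 90 days.
# Buying DM futures at an agreed rate fixes the dollar cost regardless of the spot rate.

def hedged_dollar_cost(notional_dm, futures_rate, spot_at_settlement):
    """Net dollar cost of the payable when hedged with a long futures position."""
    unhedged_cost = notional_dm * spot_at_settlement                    # cost if left unhedged
    futures_gain = notional_dm * (spot_at_settlement - futures_rate)    # gain/loss on the futures leg
    return unhedged_cost - futures_gain                                 # collapses to the locked-in rate

notional = 1_000_000      # DM owed in 90 days
locked_rate = 0.31        # USD per DM agreed today (illustrative)
for spot in (0.28, 0.31, 0.35):   # possible spot rates at settlement
    print(f"spot {spot:.2f}: net cost ${hedged_dollar_cost(notional, locked_rate, spot):,.0f}")
# Prints $310,000 in every scenario: the futures gain or loss offsets the spot-rate move.
```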
The end of Bretton Woods also created the opportunity for a new “open outcry” exchange for trading financial futures. The IMM was an offshoot of the famous exchange that initially preferred to distance itself from “pork belly” trading and offered seats at a cheaper rate to ensure that many brokers would be trading in the pit. Growing slowly at first, the IMM would soon be taken back under the wing of the CME and become the start of an explosion of derivative financial instruments that would grow into a multi-trillion-dollar business.
The computer revolution would eventually mean the end of “open outcry” trading in the “pits” of the CME and on the floors of other financial institutions like the NYSE. Open outcry is a system of trading whereby sellers and buyers of financial products aggressively make bids and offers face-to-face using hand signals. The system was preferred in trading pits because of the deafening noise and the speed at which trading could occur, and it overcame crowding because traders could signal to one another across a trading floor. But it lacked some of the flexibility and efficiencies that came with the computer.
In 1987, the members of the Chicago Mercantile Exchange voted to develop an all-electronic trading environment called GLOBEX, built by the CME in partnership with Reuters. GLOBEX was designed as an automated global transaction system meant eventually to replace open outcry with a way of trading futures contracts that would not be limited to the regular business hours of the American time zones. The exchange wanted an after-hours system that could be accessed from other cities around the world, which meant trading around the clock.
Today’s $5.3-trillion-a-day currency trading business is going through another transformation. Automation and electronic dealing are decimating foreign-exchange trading desks, and currency traders are being replaced by computer algorithms.
Notes
[1] Leo Melamed, “Chicago as a Financial Center in the Twenty-First Century,” speech to the Council on Foreign Relations, Chicago, Illinois, June 2004. From http://www.leomelamed.com/Speeches/04-council.htm, accessed November 14, 2004.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College, and Victoria University in New Zealand. During the 1990s he was also a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Bretton Woods > Chicago Mercantile Exchange > Chicago Mercantile Exchange (CME) > CME > eurodollars > IMM > International Monetary Market (IMM) > Milton Friedman
Statecraft and the First E-Commerce Administration
Posted on January 7, 2016
One of techno-economic history’s most fascinating questions concerns the stock market advances and technology developments of the 1990s. The eight years of the Clinton-Gore administration saw the rapid growth of the Internet and telecommunications sectors, and the Internet, a product of the Cold War, became a tool of global commerce.
The presidential election of 1992 was notable for the phrase “It’s the economy, stupid,” as Clinton attacked incumbent president George H. W. Bush for ignoring economic and social problems at home. The sitting president had won a dramatic military victory with the “Desert Storm” offensive that drove the Iraqis out of Kuwait, but years of federal budget deficits under the Republicans spelled his doom.
When the Clinton administration moved into the White House in early 1993, it was looking at budget deficits projected to approach half a trillion dollars a year by 2000. Military spending and the massive tax cuts of the 1980s had resulted in unprecedented government debt and yearly budget interest payments that exceeded US$185 billion in 1990, up substantially from the $52.5 billion a year when Ronald Reagan took office in 1981.[1] Reagan and Bush (like his son, George W.), while strong on national defense, never had the political courage to reduce government spending.
Clinton was forced to largely abandon his liberal social plans and create a new economic agenda that could operate more favorably within the dictates of global monetarism. This new trajectory meant crafting a program to convince the Federal Reserve and bond traders that the new administration would reduce the budget deficit and lessen the government’s demand for capital.[2] Clinton made the economy the administration’s number one concern, even molding a National Economic Council in the image of the famed National Security Council. Bob Rubin and others from Wall Street were brought in to lead the new administration’s economic policy. Clinton even impressed Federal Reserve Chairman Alan Greenspan, who had grown weary of the Reagan legacy and of what George H. W. Bush had once called “voodoo economics.”[3]
Although the potential of the communications revolution was becoming apparent, what technology would constitute the “information highway” was not clear. Cable TV was growing quickly and offering new information services, while the Bell telephone companies were pushing Integrated Services Digital Network (ISDN) and experimenting with ADSL and other copper-based transmission technologies. Wireless was also becoming a viable new communications option. But as the term “cyberspace” began to circulate as an index of the potential of the new technologies, it was “virtual reality” that still captured the imagination of the high-tech movement.
Enter the Web. Although email was starting to become popular, it was not until the mass distribution of the Mosaic browser that the Internet moved out of academia and into the realm of popular imagination and use. At that point, the Clinton administration took advantage of rapidly advancing technologies to help transform the Internet and its World Wide Web into a vibrant engine of economic growth.
President Clinton tapped his technology-savvy running mate to lead this transformation, at first as a social revolution, and then a commercial one. Soon after taking office in early 1993, Clinton assigned responsibility for the nation’s scientific and technology affairs to Vice-President Gore.[4] Gore had been a legislative leader in the Senate on technology issues, channeling nearly $3 billion into the creation of the World Wide Web with the High Performance Computing Act of 1991, also known as the “Gore Bill.” While the main purpose of the Act was to connect supercomputers, it resulted in the development of a high-bandwidth (at the time) network for carrying data, 3-D graphics, and simulations.[5] It also led to the development of the Mosaic browser, the precursor to the Netscape and Mozilla Firefox browsers. Perhaps more importantly, it provided the vision of a networked society open to all types of activities, including the eventual promise of electronic commerce.
Gore then shaped the Information Infrastructure and Technology Act of 1992 to ensure that Internet technology development would be applied to public education and services, healthcare, and industry. He drove the National Information Infrastructure Act of 1993, which passed the House in July of that year but fizzled out amid a new mood favoring the private sector for more direct infrastructure building. He then turned his attention to the NREN (National Research and Education Network), which attracted attention throughout the US academic, library, publishing, and scientific communities, and in 1996 he pressed the “Next Generation Internet” project. Gore had indeed “taken the initiative” to help create the Internet.
But the Internet was still not open to commercial activity. The National Science Foundation (NSF) had nurtured the Internet during most of the 1980s, yet its content remained strictly noncommercial by legislative decree, even though the NSF contracted its transmission out to private operators. Provisions in the NSF’s original legislation restricted commerce through the “acceptable use policy” required of its funded projects.
But pressures began to mount on the NSF as it became clear that the Internet had more commercial potential. Email use and file transfer were increasing dramatically, and the University of Minnesota’s release of Gopher, a point-and-click way of navigating ASCII files, made textual information readily accessible. Finally, Congressman Rick Boucher introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the NSFNET.[6] A few months later, while Arkansas Governor William Jefferson Clinton waited to take over the presidency, outgoing President George H. W. Bush signed the measure into law. The era of Internet-enabled e-commerce had begun.
Notes
[1] Debt information from Greider, W. (1997) One World, Ready or Not: The Manic Logic of Global Capitalism. New York: Simon & Schuster. p. 308.
[2] Bob Woodward’s The Agenda investigated the changes the Clinton-Gore administration implemented after being extensively briefed on the economic situation it inherited. Woodward, B. (1994) The Agenda: Inside the Clinton White House. NY: Simon & Schuster.
[3] Information on Greenspan’s relationship with Clinton from Woodward, B. (1994) The Agenda: Inside the Clinton White House. NY: Simon & Schuster.
[4] Information on Gore’s contribution as Vice-President from Kahin, B. (1993) “Information Technology and Information Infrastructure,” in Branscomb, L., ed., Empowering Technology. Cambridge, MA: The MIT Press.
[5] Breslau, K. (1999) “The Gorecard,” WIRED. December, p. 321. Gore’s accomplishments are listed.
[6] Segaller, S. (1998) Nerds 2.0.1: A Brief History of the Internet.
Tags: Al Gore > National Science Foundation (NSF) > NSFNET > Rick Boucher
America’s Financial Futures History
Posted on January 2, 2016
In his book Nature’s Metropolis (1991), William Cronon discussed the rise of Chicago as a central entrepôt in the formation of the American West. The city was strategically located between the western prairies and the northern timberlands, with access routes by river and the Great Lakes. As a dynamic supplier of the nation’s food and lumber resources, Chicago was ideally suited to become the port city for the circulation of the West’s great natural bounty. Located on the shore of Lake Michigan, “Chicago stood in the borderland between the western prairies and eastern oak-hickory forests, and the lake gave it access to the white pines and other coniferous trees of the north woods. Grasslands and hardwoods and softwood forests were all within reach.”[1] With the westward expansion of agricultural settlements throughout the 19th century, farmers started to look for markets to sell their non-subsistence beef, pork, and wheat supplies. Likewise, they were attracted to the big city to buy northeastern industrial goods at lower prices.
By the 1880s, Chicago enthusiasts were making comparisons with Rome, noting, however, that while “all roads lead to Rome,” Chicago would become “the Rome of the railroads.” The famous city got its start after 1833, when the local Indian tribes were forced to sign away the last of their legal rights to the area and the construction of the Erie Canal opened a new waterway to New York and the East Coast. It prospered through the Civil War, during which it played a key role in supplying the North with foodstuffs and other resources from the western frontier. Into the next century, Chicago would continue to grow into the central node of a vast trading network in nature’s bounty, in what would generically be called “commodities.”
The “commodity” emerged as an abstract term for items such as grains, meat products, and even metals that are bought and sold in a market environment. Commodities also came to be sold as “futures,” legal contracts specifying a change of ownership at a future time and at a set price. Bakeries in the East, for example, could lock in prices for future wheat deliveries. Key developments in the emergence of commodity trading were the notions of grading and interchangeability: to the extent that products could be sorted into different qualities, they could be grouped according to a set standard and then sold anonymously.
While technological innovations such as the railroad and the steamship would dramatically increase the efficiency of transporting goods, the development of Chicago’s famed exchanges would facilitate their allocation. The Chicago Board of Trade (CBOT) was started in 1848 as a private organization to boost the commercial opportunities of the city, but it would soon play a crucial part in the development of a centralized market for the Midwest’s earthly gifts. It was the Crimean War a few years later that made such a commodities market a necessity: as demand for wheat doubled and tripled, the CBOT prospered and its membership increased proportionally.
In 1856, the Chicago Board of Trade made the “momentous decision to designate three categories of wheat in the city – white winter wheat, red winter wheat, and spring wheat – and to set standards of quality for each.”[2] The significance of the action was that it separated ownership from any particular physical lot of grain. As farmers brought their produce to market, the elevator operator could mix it with similar grains and give the owner a certificate for an equal quantity of similarly graded grain. Instead of being loaded into individual sacks, grain could be shipped in railroad cars and stored in silos. The grain elevators were a major technological development that increased Chicago’s efficiency in coordinating the movement and sale of its grains.
The Board was still having problems with farmers selling damp, dirty, low-quality, and mixed grains, so it instituted additional grade distinctions. By 1860, the Chicago Board of Trade had more than ten distinctions for grain, and its right to impose such standards was written into Illinois law, making it a “quasi-judicial entity with substantial legal powers to regulate the city’s trade.”[3]
During this same period, the telegraph was spreading its metallic tentacles throughout the country, providing speedy access to news and price information. The western end of the transcontinental link was built in 1861, connecting Chicago with cities like Des Moines, Omaha, Kearney, Fort Laramie, Salt Lake City, Carson City, Sacramento, and on to San Francisco.[4] The western link quickly expanded to other cities and connected Chicago with farmers, lumberjacks, prospectors, and ranchers eager to bring their goods to market. News of western harvests, droughts, European battles, and grain shortages traveled rapidly between cities and brought nearly instant price changes in Chicago. Newspapers were a major beneficiary of the electric links as they printed major news stories as well as price information coming over the wire. The telegraph quickly made the Chicago Board of Trade a major world center for grain sales and linked it with a network of cities such as Buffalo, Montreal, New York, and Oswego that facilitated the trade of grains and other commodities.
The telegraph also helped institute the futures market in the US. As trust in the grading system sanctioned by the Chicago Board of Trade grew, so did confidence in the quality of the anonymous, interchangeable commodity. The telegraph allowed one of the earliest forms of electronic commerce to occur regularly: the “to arrive” contract, which specified the delivery of a set amount of grain to a buyer, most often in an eastern city. The railroad and the steamboat made it easier to guarantee such a delivery, and the guarantees provided a needed source of cash for western farmers and their agents, since the contracts could be used as collateral to borrow money from banks. While “to arrive” contracts had seen moderate use previously, the telegraph (along with the grading system) accelerated their use. Alongside the traditional market in grain elevator receipts, a new market in contracts for the future delivery of grain products emerged. Because they followed the basic rules of the Board, these contracts could be traded. “This meant that futures contracts–like the elevator receipts on which they depended—were essentially interchangeable, and could be bought and sold quite independently of the physical grain that might or might not be moving through the city.”[5]
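A minimal sketch may help make the interchangeability point concrete. The grades, quantities, and parties below are hypothetical; the point is only that once grain is reduced to a grade and a quantity, a futures contract refers to that standard rather than to any particular farmer’s grain, so the contract itself can change hands.

```python
from dataclasses import dataclass

# Hypothetical illustration of interchangeability: a futures contract names only a
# grade, a quantity, a price, and a delivery month -- not any particular lot of grain.

@dataclass(frozen=True)
class ElevatorReceipt:
    grade: str      # e.g. "No. 2 Spring Wheat"
    bushels: int

@dataclass
class FuturesContract:
    grade: str
    bushels: int
    price_per_bushel: float
    delivery_month: str
    holder: str     # current owner of the contract, not of any physical grain

    def can_settle_with(self, receipt: ElevatorReceipt) -> bool:
        # Any receipt of the same grade and sufficient quantity is acceptable delivery.
        return receipt.grade == self.grade and receipt.bushels >= self.bushels

contract = FuturesContract("No. 2 Spring Wheat", 5000, 0.95, "May", holder="Chicago broker")
contract.holder = "New York flour dealer"   # the contract is traded on, grain unseen
print(contract.can_settle_with(ElevatorReceipt("No. 2 Spring Wheat", 5000)))  # True
```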
During the 1860s, the futures market became institutionalized at the Chicago Board of Trade. The Civil War helped facilitate Chicago’s futures markets, as commodities such as oats and pork were in high demand by the Union Army. After the war, European immigration increased substantially in the eastern cities, further increasing the demand for western food and lumber commodities. Bakers and butchers needed a stable flow of ingredients and product. The new market for futures contracts transferred risk from those who could ill afford it, the bread-bakers and cookie-makers, to those who wanted to speculate on those risks.
In 1871, Chicago suffered a devastating fire, but the city came back stronger than ever. By 1875, the market for grain futures reached $2 billion, while the cash grain business was estimated at a lesser $200 million.[6] Abstract commodity exchange had surpassed the market in “real” goods. “To arrive” contracts, facilitated by telegraph and combined with elevator receipts, had made Chicago the futures capital of the world.
In 1874, the precursor to the Chicago Mercantile Exchange was formed. Called the Chicago Produce Exchange, it focused on farm goods such as butter, eggs, and poultry. In 1919, at the end of World War I, it was reorganized as the Chicago Mercantile Exchange (CME). In the 1960s, the CME expanded to trade frozen pork bellies as well as live cattle and live hogs, and it would later add lean hogs and fluid milk. But its most important innovations came a decade after the 1960s expansion, in the financial field.
Notes
[1] Quote on Chicago’s strategic location from William Cronon, Nature’s Metropolis (1991), p. 25.
[2] Chicago Board of Trade’s momentous decision from William Cronon, Nature’s Metropolis (1991), p. 116.
[3] Additional information on the Chicago Board of Trade from William Cronon, Nature’s Metropolis (1991), pp. 118-119.
[4] Transcontinental link cities from Lewis Coe (1993) The Telegraph. London: McFarland & Co., p. 44.
[5] Quote on futures interchangeability from William Cronon, Nature’s Metropolis (1991), p. 125.
[6] Estimates of the grain and futures markets from William Cronon, Nature’s Metropolis (1991), p. 126.
Tags: Chicago Board of Trade > commodity > Erie Canal > Nature's Metropolis > William Cronon
Computer Technology and Problem-Based Learning (PBL)
Posted on December 7, 2015
Many computer labs are now designed around the concept of problem-based learning (PBL), a student-centered approach in which participants work in groups to solve open-ended problems. Instead of teachers presenting relevant material first and then having students apply the knowledge to solve problems, PBL engages the students in the problem itself, inviting them to use higher-order thinking skills such as analysis, creativity, evaluation, and logic while working cooperatively in teams. Computer labs can facilitate problem-based learning as they are increasingly designed with an eye toward collaboration, critical thinking, communication, research, and project management.
Theory Based Pedagogy
PBL applies insights from social constructivist and cognitive learning theories to create educational environments that reflect real-world complexities and foster personal responsibility for problem-solving. It builds on prior knowledge and fosters multiple ways of understanding and solving problems. It is an active process that seeks to drive motivation and understanding through challenge, relevance, and accomplishment. PBL can be framed within the theories of John Dewey and the educational philosophy of pragmatism, which hold that human beings learn best through a practical, “hands-on” process.
Applications of PBL
PBL emerged primarily in medical schools as a curriculum development and delivery system to help students develop and hone problem-solving skills in an information-intensive environment. Medical professionals need life-long learning skills to keep up with new information and techniques in their field, so PBL was implemented to help students acquire the necessary knowledge and skills. PBL is increasingly used in corporate environments as well, since it promotes stronger working relationships and problem-solving habits. The case study method used in business schools is a classic example of problem-based learning.
Advantages of PBL
Don Woods has suggested that PBL is any learning environment in which the problem drives the learning. The problem is meant to be the vehicle that moves the group, and each student, toward an understanding of the situation and the construction of a set of solutions.
PBL mainly involves working in teams but can include independent and self-directed learning. Teams offer opportunities for leadership, stress communication and listening skills, and reward cooperation. Technologies enable asynchronous and synchronous collaboration schemes that challenge leadership skill-sets but translate into real world capabilities to manage projects and facilitate teamwork.
Central to team success is the self-awareness of individual capabilities and the evaluation of group processes. PBL stresses competencies in oral and written communication as well as the ability to explain, and often visualize, key concepts in schematic form for infographic slide presentations and short videos.
Whether working in a group or alone, critical thinking and analysis are combined with the ability to do additional in-depth online research across related disciplines. Independent work can build self-reliance and a sense of self, and self-directed learning lets students recognize their own cognitive strategies and understand concepts at their own pace. Major components are the application of the problem to course content and making sure it is situated in real-world examples.
What are the Steps Involved?
Rather than teaching relevant material and subsequently having students apply the knowledge to solve problems, the problem is presented first. The process generally involves these steps:
– Introduce the problem in a carefully constructed, open-ended set of questions
– Establish groups, including rules and etiquette
– Organize and define roles including a leader/facilitator and a record keeper
– Initiate discussion by examining and defining the problem
– Explore what is already known about the problem
– Clarify related issues and define variables
– Determine what additional information needs to be learned and who is to pursue it
– Determine where to acquire the information and tools necessary to solve the problem
– Pursue individual tasks
– Regroup, share, and evaluate possible ways to solve the problem
– Strive for consensus and decide on a strategy to solve the problem
– Report on their findings and proposed solutions
– Assess learning acquisition and skill development
What are the Challenges?
– Participants may need fundamental knowledge in key areas
– They must be willing to work within a group
– Time is needed to organize, engage fully, and report findings in a coherent way
– Assessment becomes more complicated and often less quantitative
– Teachers must learn to be facilitators of the learning process
– Group dynamics can compromise effectiveness
– Wrong decisions can influence real world solutions
– Results are better when working within flexible classroom spaces
PBL and Technology
This last risk raises questions of how technology and designed learning environments can assist the collaborative and research processes that are central to PBL. Educational technologies can help the learner and diminish many challenges faced by the instructor. Technology can streamline the instructional processes by structuring communicative interactions and knowledge finding. Group processes can proceed synchronously in a simultaneous face-to-face system or asynchronously online and across geographic divides. Web-based learning can offer personalized learning tools to students of different abilities in and out of the classroom.
Computers are increasingly being used in classrooms to provide interactive experiences. Students working around a table with one computer within reach and easy view can conduct collaborative, group decision-making exercises. These desk-based terminals provide many kinds of curricular resources: informative websites, games, manuals, online books, simulations, and other materials that present problems that can challenge the students and drive learning. Games, in particular, can engage students in relevant and rigorous “meaningful play” through structured activities that provide context and feedback in instructional settings.
Web-based tutoring systems have not consistently been designed to be challenging and to deliver interesting curricular knowledge with a tested instructional pedagogy. However, computer-based learning environments built on PBL and gamification have come a long way, offering online and packaged media with problem-based learning and tutoring, complete with graded homework assignments and feedback.
To what extent are educational pedagogies limited by the architecture, spaces, and technology available to us? This is a major challenge in education today and seems to be contingent on the willingness and creativity of educational professionals. While the most immediate responsibility rests on the shoulders of teachers and professors, it also falls to classroom designers and school administrators to construct spaces that support coherent strategies and technologies for implementing problem-based learning.
In summary, PBL is a curriculum development and delivery system that recognizes the need to develop problem-solving skills, including information literacy and research capabilities that reach across disciplines. Many medical and professional schools already use these practices as the training they impart has real-world consequences. It requires a structured approach to communication and group organization while remaining committed to real-world relevance as well as a willingness to let the students control their own learning.
Resources
Monahan, Torin. 2002. “Flexible Space & Built Pedagogy: Emerging IT Embodiments.” Inventio 4 (1): 1-19.
Pawson, E., Fournier, E., Haight, M., Muniz, O., Trafford, J., and Vajoczki, S. 2006. “Problem-based Learning in Geography: Towards a Critical Assessment of its Purposes, Benefits and Risks.” Journal of Geography in Higher Education 30 (1): 103–16.
“Problem-Based Learning.” Cornell University Center for Teaching Excellence, n.d. Web. 29 Nov. 2015.
Woods, Don. “Problem-Based Learning (PBL).” N.p., n.d. Web. 30 Nov. 2015.
YouTube: Alice’s Rabbithole or Hypertext to Hell?
Posted on November 24, 2015
Edited remarks from the conference on
YouTube: Ten Years After
November 20, 2015
Hannam University
Linton School of Global Business
I’m happy to be a part of this commemoration of YouTube’s tenth year anniversary and thank Dr. Youngshin Cho for being our keynote speaker and enlightening us with his very informative talk.[1]
Hopefully, you are familiar with the story of Alice’s Adventures in Wonderland. The young girl from Lewis Carroll’s (1865) classic follows a giant rabbit into a dark hole and falls into a world of strange experiences, exotic creatures, and unusual paradoxes…
The story has been much discussed over the years, and speculations on the meaning of Alice’s adventure range from experiences with psychedelic drugs and spoiled foods to wild dreams and even repressed sexual fantasies.
YouTube has all of these things, but my interest today is not to focus on content specifically but to point to, and for us to think about, the path, the journey, the discovery process inherent in the YouTube experience. I want to talk about the user experience, and specifically how it is guided by computer algorithms and a type of artificial intelligence.
While the meaning of Alice’s story will no doubt continue to be a topic of curious debate, there is a consensus that Carroll’s narrative is based on a sequence, a trajectory, a path that she follows. The term “rabbit hole” does not refer to a destination as much as it has come to mean a long, winding, and sometimes meandering journey with many connections and offshoots that often lead to serendipitous and surprising discoveries.
I’m not the first to connect Alice’s journey to online activities. Others have pointed to the Internet as essentially designed to function as a “rabbit hole,” primarily because of the way hyperlinks work, connecting one web page with another and taking the user on a voluntary “surfing” trip into new sites, strange ideas, and wonders along the way.
Interestingly, before we had the World Wide Web, we had the Xanadu project by Ted Nelson, who coined the terms “hypertext” and “hypermedia” in a publication way back in 1965. Xanadu was originally the name of Kublai Khan’s fabled summer palace, described by the enigmatic Marco Polo: “There is at this place a very fine marble palace, the rooms of which are all gold and painted with figures of men and beasts and birds, and with a variety of trees and flowers, all executed with such exquisite art that you regard them with delight and astonishment.” Nelson’s Xanadu strove to use computer technology to transform reading into an equally rich experience.
Let’s go to another mythical place, South Korea’s Gangnam district. Psy’s musical parody of the Seoul suburb’s delights and duplicities was an extraordinary YouTube hit.
With over 2,450,000,000 views, the Korean singer’s Gangnam Style leads YouTube with the most plays, but my immediate interest is the list of thumbnails of recommended, hyperlinked videos on the lower right side.
A key to understanding YouTube is its recommendation engine, a sophisticated computer algorithm and data collection system that finds content related to your search or to the content you are watching. These programs reduce what could become a complex decision process to just a few recommendations, immediately listing a series of videos based on metadata from the video currently playing and on information gathered about your past viewing.
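YouTube’s actual ranking system is proprietary and far more elaborate, so the following is only a toy sketch of the general idea: score candidate videos by how much their metadata tags overlap with the current video and the viewer’s history. The catalog and tags are invented for illustration.

```python
# Toy sketch of a metadata-based recommender (not YouTube's actual algorithm).
# Candidates are scored by tag overlap with the current video plus the viewing history.

videos = {   # hypothetical catalog: video id -> metadata tags
    "gangnam_style": {"k-pop", "psy", "parody", "dance"},
    "gentleman":     {"k-pop", "psy", "dance"},
    "opera_aria":    {"classical", "vocal"},
    "kpop_mix":      {"k-pop", "dance", "playlist"},
}

def recommend(current, history, top_n=3):
    profile = set(videos[current])
    for vid in history:                  # fold past viewing into the taste profile
        profile |= videos.get(vid, set())
    seen = {current, *history}
    scores = {vid: len(tags & profile)   # simple overlap score
              for vid, tags in videos.items() if vid not in seen}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("gangnam_style", history=["kpop_mix"]))
# -> ['gentleman', 'opera_aria']: the engine keeps serving more of what you just watched.
```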
In a business context, this is called customer captivity, an important competitive advantage and a barrier to entry for other firms. It’s a technique to engage people in your content and keep someone on a web site by offering related or similar choices. Amazon and other e-commerce firms are increasingly using it. Netflix’s recommendation engine is also noteworthy.
The list of YouTube’s recommended videos is likely to vary but will probably include other versions of Gangnam Style and the rather awful Gentleman, which brings me to my next topic.
Hypertext to Hell? Well, you may have heard the hit “Highway to Hell” by the Australian heavy metal band AC/DC, whose lead singer drank himself to death a year after its release. I wish him all the best and thank him for excusing my play on his title.
I think hell differs for each of us. There is an SNL skit in which a very famous musician, Paul Simon of Simon and Garfunkel, meets with the devil and sells his soul in order to become a famous musician. Years later, when the time comes to pay up, he discovers he is stuck in an elevator for all eternity, listening to Muzak, the soft “easy listening” instrumental versions of his classic songs such as “The Sound of Silence” and “Mrs. Robinson.”
Here I am referring not to any afterlife but rather to the earthly hell we put ourselves into on occasion. YouTube can offer that same experience, if you choose. I discovered one pathway to a personal hell in a series of videos arguing for the validity of the “flat earth.” Some people actually believe the idea that the earth is round is NASA propaganda. I guess that is my personal hell because I was always fascinated with space and interstellar travel and perhaps quaintly believe astronauts landed on the Moon.
Well, I don’t want to leave you in hell. So I’ll return you to Xanadu (1980) with this video from the movie of the same name, featuring the late Olivia Newton-John and the Electric Light Orchestra.
In closing, I’m going to make a slight jump and suggest that the human mind works very much like a recommendation engine. It gives you more of the thoughts on which you choose to focus. If you focus on positive thoughts, your mind will start to give you more of those, presenting you with similar ideas and emotional experiences. If you focus on topics you hate, your mind will give you more of that, driving you into a personal hell.
So be careful with your mental clicks, and avoid the highways to hell. Sometimes you go down that rabbit hole; sometimes you traverse the hypertext to hell. Make the right choices and get out if you find yourself going down the dark side (switching metaphors). Just do another search.
Notes
[1] Dr Youngshin Cho is a Senior Research Fellow at SK Research Institute in Seoul where he is conducting research on media trends and strategies of media and ICT companies, as well as looking at the impact of change in media policy. He obtained his PhD in Media and Telecommunication Policy from Pennsylvania State University in 2007 and his MA from Yonsei University in 1998. One of the country’s acknowledged authorities on these subjects, Dr Cho is also a consultant with the government on media policy.
Tags: Rabbit Hole > recommendation engines > Ted Nelson > Xanadu Project > YouTube > YouTube Tenth Anniversary
Those Media Products are Misbehaving Economic Goods
Posted on November 14, 2015
Bad, bad media. Or so economists would have us believe. Media and information products just don’t fit the mold, or should I say, model. Most economic thinking is based on models that make simple assumptions about the types of goods and services that are produced and purchased. Economists like their tight little models and the most relevant to their profession are the ones that bring together supply and demand curves and justify their belief in the power of the equilibrium price, that magical point where all suppliers and consumers are happy.
But this model works best when the product “behaves” – when it is purchased by one consumer and consumed in its entirety. The models break down when applied to media products like books, broadcast television, broadband, and streaming music. To understand what types of economic goods these are – products or services that can command a price when sold – we have to start the analysis with two important concepts. These concepts help us understand the dynamics of media goods and their ramifications for pricing as well as for public policy.
The following video makes distinctions between two types of economic goods: private goods and public goods. It also specifies the characteristics of common goods and club goods, topics that will be discussed in a future post.
Economists like to talk about two characteristics of goods and services: rivalry and excludability. Rivalry is when someone purchases a good and it is consumed entirely by that person. This phenomenon is sometimes called subtractability because, in a sense, the purchase subtracts the product from the economy. Economists usually use a physical good like a hamburger as an example: when a burger is purchased, it is used up, consumed, subtracted from the market. A book, on the other hand, is not used up. It can be read again and again. It can be given to another person whole, and its content read in part or in full without its consumption – without its subtraction. What about watching a movie on Netflix? Or watching it at the cinema? How about purchasing a DVD? How do these goods differ, and what are the economic and policy consequences?
Some media products can be consumed without paying, such as broadcast media and much of the content on the Internet. This raises the issue of excludability. Excludability is when consumption is limited to paying customers – the degree to which you can exclude non-paying customers. How can you exclude someone from using or consuming your media product if they do not pay? Broadcast media faced this problem: radio, and then television, transmitted signals that could be picked up by anyone with the appropriate receiving equipment. The digitization of sound into MP3s created a crisis in the music industry because tracks could easily be shared over the Internet with peer-to-peer applications like Napster. Broadcast media are nonrivalrous, meaning they are not used up, and also nonexcludable, meaning people can access them without paying.
Even newspapers that are bought and discarded, say on a train, can be used by someone else. Newspapers were the first to use advertising to alleviate this problem – in fact, the more people a paper reached, the more valuable an advertising expenditure became. Media goods became “dual product markets” in which content is sold to audiences, and audiences are packaged and sold to advertisers. It becomes an interesting pricing calculus for media producers: determine the right mix between charging for content, and consequently reducing the size of the audience, or focusing on a larger audience so that higher advertising rates can be charged.
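A stylized numerical sketch can illustrate that calculus. The demand curve and per-viewer ad rate below are invented for illustration only; the point is simply that raising the content price shrinks the audience that can be sold to advertisers.

```python
# Stylized dual-product-market model with invented numbers: the audience shrinks as
# the content price rises, while ad revenue grows with the size of the audience.

def audience(price, base=1_000_000, sensitivity=150_000):
    """Simple linear demand: higher price, smaller audience (never below zero)."""
    return max(0, base - sensitivity * price)

def total_revenue(price, ad_rate_per_viewer=0.50):
    viewers = audience(price)
    content_revenue = price * viewers            # content sold to the audience
    ad_revenue = ad_rate_per_viewer * viewers    # audience sold to advertisers
    return content_revenue + ad_revenue

for p in (0.0, 1.0, 2.0, 3.0, 4.0):
    print(f"price ${p:.2f}: audience {audience(p):,.0f} -> revenue ${total_revenue(p):,.0f}")
# Revenue peaks at an intermediate price: the 'pricing calculus' of balancing paying
# customers against the audience that can be packaged and sold to advertisers.
```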
Digital media goods need to be better understood in terms of their economic characteristics. Smartphones, books, games, game consoles, movies, television shows, music CDs, music streaming, broadband service: each displays varying degrees of rivalry and excludability.
In a future post, I want to outline four different types of economic goods: private goods, public goods, common goods, and club goods. Each points to the characteristics of a different class of products, and each displays a different combination of rivalry and excludability. Media and IT goods differ but fall along a continuum of these types.
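As a preview of that typology, here is a small sketch of the standard two-by-two mapping from rivalry and excludability to the four categories. The example goods are my own illustrative choices, not classifications made in the post.

```python
# Illustrative sketch of the standard 2x2 typology of economic goods,
# keyed on (rivalrous?, excludable?). Example goods are illustrative only.

GOOD_TYPES = {
    (True,  True):  "private good",   # e.g. a hamburger or a DVD
    (True,  False): "common good",    # e.g. ocean fish stocks
    (False, True):  "club good",      # e.g. a subscription streaming service
    (False, False): "public good",    # e.g. a broadcast television signal
}

def classify(rivalrous: bool, excludable: bool) -> str:
    return GOOD_TYPES[(rivalrous, excludable)]

examples = {
    "hamburger":           (True,  True),
    "broadcast TV signal": (False, False),
    "Netflix stream":      (False, True),
    "discarded newspaper": (True,  False),   # one physical copy, but hard to stop reuse
}

for good, traits in examples.items():
    print(f"{good}: {classify(*traits)}")
```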
Tags: economic models > excludability > Media Economics > nonrivalry > public goods > rivalry
TPP and the Role of Intellectual Property in the US Export Economy
Posted on November 7, 2015
With the Trans-Pacific Partnership (TPP) trade deal under discussion, it’s useful to look at some of the changes in the world economy, and specifically the US export economy and the increasing role of intellectual property (IP). As it stands, I’m not for the trade deal, but I feel it’s important to parse through the details of the deal to understand what kind of economy we have developed, who benefits, who struggles, and what we can do going forward. In this post, I look at the trade context that the TPP is being negotiated in, particularly the status of IP.
What exactly is the TPP? It is a trade, investment, and juridical pact covering roughly 40 percent of the world’s economy that was negotiated between the US and 11 other Pacific Rim nations:
Australia, Brunei, Canada, Chile, Japan, Malaysia, Mexico, New Zealand, Peru, Singapore, and Vietnam.
Taiwan, the Philippines, and South Korea have expressed some interest in joining the TPP. China is being polite but recognizes that it is a central target of the agreement, which is currently awaiting US Congressional approval. (Update: President Trump has since signed an executive order ending US involvement in the TPP.)
It’s unclear what kind of impact the TPP will have on very relevant issues like agriculture, labor rights, public health, and sustainable energy. The TPP’s provisions on Internet freedom and net neutrality are particularly important areas in need of public scrutiny, and its trade dispute settlement system poses some real threats to American national sovereignty. These issues raise important questions about the type of society Progressives want to create, including which aspects of the global economy should be enhanced and sustained.
US exports have nearly tripled since the World Trade Organization (WTO) was created, although the US continues to run trade deficits, primarily due to oil imports from a number of countries. From slightly less than $800 billion in 1995, US exports grew to about US$2.28 trillion in 2013.[1] Almost half of that was in intellectual property or “IP” industries, based on figures from the WTO.
We like to criticize China’s dominance in manufacturing exports, but its share of world manufacturing export revenue is about 18 percent, while the US share of intellectual property export revenue is nearly 40 percent. That is a significantly large share of the IP market. While Europe is a close second, its figures are somewhat misleading, as they include cross-border transactions between European countries. Imagine if all the transactions between US states were counted as exports.
Of the world’s reported US$329 billion in IP export revenues, some $129 billion is captured by US interests. Food and agriculture is still the largest US export category at $162 billion. Automotive is next at $125 billion, followed by finance and insurance at $110 billion, information technology goods at $108 billion, and aerospace at $105 billion.[2]
It’s useful to break the IP figure down. Perhaps surprisingly, the largest category is patents, accounting for US$45 billion; royalty payments for patent use have gone up substantially as commerce and production have globalized. A close second is the export of software at $43 billion, which includes about $3 billion in video games. A distant but significant third, at $17 billion, is the traffic in trademarks, the legal right to use a logo, name, phrase, song, or symbol.
Tied for third is the major copyright category of film, TV, music, and books at $17 billion. A note is in order here: the WTO counts movies and music as part of “audiovisual services,” but Progressive Economy has added them to the IP statistics, as copyrights are an important component of IP. Intellectual property issues are coordinated internationally under agreements with the World Intellectual Property Organization (WIPO), based in Geneva, Switzerland.
For better or worse, American trade dynamics have changed over the last 50 years, especially with its Pacific neighbors. While manufacturing is important, it has been difficult for the US to compete with low-cost labor in other parts of the world. On the other hand, the US has been going through a knowledge and technological revolution largely based on its ongoing use of militarization as a national system of innovation.
The modern era of globalization was a result of commercializing the fruits of the Cold War and the Middle East invasions. Computers, the Internet, space-based satellite systems, and fiber optic cables for communications came out of government-subsidized research and initial utilization by the military-industrial complex. These and other technologies, like big data processing, geolocation, mobile technologies, and remote sensing, have come together to create new global circuits of production, logistics, and strategic communications that have transformed creative, financial, and manufacturing industries.
Modern capitalism is optimized to provide shareholder returns, not employee benefits or national production. Labor and national interests need strong democratic will formation and participation to ensure a flow of resources to citizens as well as education, infrastructure, military and other social needs. Economists will argue that the US has a comparative advantage in intellectual property, but employment opportunities are largely in high tech/creative clusters like San Francisco, Los Angeles, Boston, Austin, and New York.
As the trade figures show, globalization is creating a “rentier” economy. Wealth is flowing more to the owners of resources than to the producers of goods and services. While landowners have long been the beneficiaries of rental income, owners of intellectual property have seen increasing profits from royalty and licensing fees. Global supply chains, an important cost factor in the products Americans like to buy, draw heavily on key intellectual properties to reduce costs and increase profits. Patents, copyrights, trademarks, and even business methods have become a new global force, shaping the world’s political economy for the rentiers’ benefit.
One of the policy issues to be addressed is that profits on intellectual property can be sheltered by selling the rights to a series of subsidiaries (“Double Irish” or “Dutch Sandwich”) in tax havens like the Cayman Islands.
Intellectual property can be a serious contributor to US economic welfare but is not a panacea for a country that is desperately struggling in the global economy that it created. It may be that the Democrats need TPP to secure funding and electoral votes from California, but the US, in general, needs to do a lot of homework on TPP. It especially needs to assess how it wants to structure its international trade so that it can help tackle long-term domestic issues like climate change, debt reduction, energy independence, and food security.
Notes
[1] Exports in 1995 were about $800 billion ($794,387 million) and grew to US$2.3 trillion in 2014, according to the US Bureau of Economic Analysis. Exports also grew from about 5% of GDP to nearly 15% of GDP.
[2] The Progressive Economy website is a project of the GlobalWorks Foundation.
Tags: intellectual property > IP > Trans-Pacific Partnership > Trans-Pacific Partnership (TPP) > US exports
Microsoft and the IBM PC Case Study: The Deal of the Century
Posted on November 1, 2015
The microcomputer was starting to become popular in the late 1970s and finally caught the attention of IBM. Initially resistant, “Big Blue” became concerned that it was “losing the hearts and minds” of its clients. When the Apple II started to show up in many corporate environments, IBM officials worried about the encroachment of these small “microcomputers” into their minicomputer and mainframe businesses.[1]
Although the Apple II’s capabilities were limited, it had exciting software such as the VisiCalc spreadsheet. IBM had attempted a small computer before: the 5100, introduced in September 1975, but at 55 lbs and a price of around $20,000, the “portable” machine failed to take off. IBM responded to the growing popularity of the personal computer by going outside the company and contracting with new hardware and software suppliers to produce its own “Personal Computer,” or PC, as it came to be called. By mid-1980, “Big Blue” was moving quickly to assemble and market its version of a microcomputer.
The key to the emerging microcomputer business was the microprocessor “chip.” IBM chose to build the PC around the cheaper Intel 8088, a 16-bit chip with an 8-bit external data bus, despite the availability of the more powerful, fully 16-bit 8086. Its decision was based on two reasons. The first was that IBM feared the microcomputer might compete with its mini and mainframe businesses and wanted the less powerful chip to keep the PC in a different class. The second, and probably more important, reason was that a small industry of hardware and software products had already been developed around the Intel 8088. Many more supporting chips were available for the older design, which meant a functioning system could be assembled within a shorter period.
One of the new suppliers that IBM contacted was Microsoft, which it initially approached for software and a computer operating system. The meeting was facilitated by the connection between the IBM CEO and Gates’ mother, Mary; both served on the board of the same United Way charity. Apparently, IBM’s briefing books listed Microsoft as a major producer of operating systems. Bill Gates was surprised by the request for an operating system and quickly referred the IBM representatives to Gary Kildall and his company, Digital Research (originally Intergalactic Digital Research). Kildall had written CP/M and was the predominant figure in microcomputer operating systems.
When the IBM representatives arrived, they were disappointed with their reception. The story varies, but apparently Dr. Kildall had plans to fly his airplane that morning and asked his wife, Dorothy, to meet with the visitors. She stalled them and asked her lawyer to go over the IBM nondisclosure agreement (NDA); together they decided not to sign it. Another version has Kildall returning in the afternoon and signing the NDA, but refusing IBM’s offer to buy CP/M outright for $250,000, holding out instead for licensing it at $10 a copy. In any case, IBM’s representatives left disappointed and returned to Microsoft. Afraid of losing the opportunity to sell BASIC and other languages with the new IBM microcomputer, Gates told them he could get them an operating system.
Their secret “ace-in-the-hole” was Paul Allen’s connection to a programmer who was working on a somewhat dubious variation of the CP/M operating system, exactly the kind of software IBM wanted. The programmer was Tim Paterson, who worked at a company called Seattle Computer Products (SCP). Paterson had spent much of 1980 reworking the operating system from a CP/M manual he had bought. When Allen approached the company, SCP was suspicious of Microsoft’s intentions, but it was very interested in the cash. Eventually, SCP agreed to turn over the software, called QDOS (the “Quick and Dirty Operating System”), for payments that totaled some $50,000. This was the software that would soon help make Gates and Allen two of the richest men in the world.
On August 12, 1981, Big Blue introduced its Personal Computer. The Intel 8088 chip ran at 4.77 MHz and contained some 29,000 transistors. It handled data internally in 16-bit words but was limited to 8 bits on its external bus. The machine used ASCII for representing text, and its characters showed up crisply on the 80-column monochrome display, a major improvement over the Apple II, especially for word processing and other office applications.[2] At its core was PC-DOS, licensed from Microsoft, though CP/M-86 and a Pascal-based operating system from the University of California at San Diego were also available.[3] The IBM PC initially sold for $2,880 with 64 KB of memory and a floppy disk drive.[4] It was a major success, with over 670,000 computers sold in the first year.[5]
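The cost of that 8-bit external bus is easy to put in rough numbers. The following back-of-the-envelope sketch (not from the original article) assumes the 8086 family’s standard four-clock bus cycle and the PC’s 4.77 MHz clock: the 8088 could move at most one byte per bus cycle, roughly half the peak memory bandwidth a full 16-bit 8086 bus would have offered.

/* Back-of-the-envelope comparison of peak bus throughput: 8088 (8-bit external
   bus) vs. 8086 (16-bit bus). Assumes the 8086 family's minimum four-clock bus
   cycle and the IBM PC's 4.77 MHz clock; real-world throughput was lower. */
#include <stdio.h>

int main(void) {
    const double clock_hz = 4770000.0;        /* IBM PC clock: 4.77 MHz      */
    const double clocks_per_bus_cycle = 4.0;  /* minimum 8086/8088 bus cycle */
    double bus_cycles_per_sec = clock_hz / clocks_per_bus_cycle;

    double mbps_8088 = bus_cycles_per_sec * 1.0 / 1e6;  /* 1 byte per cycle  */
    double mbps_8086 = bus_cycles_per_sec * 2.0 / 1e6;  /* 2 bytes per cycle */

    printf("Peak bus throughput at 4.77 MHz:\n");
    printf("  8088 (8-bit bus):  %.2f MB/s\n", mbps_8088);  /* about 1.19 */
    printf("  8086 (16-bit bus): %.2f MB/s\n", mbps_8086);  /* about 2.39 */
    return 0;
}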
Microsoft and the Clones
But, in one of the biggest business blunders of all time, IBM did not get an exclusive contract for PC-DOS. Gates pushed for an agreement that allowed Microsoft to license the operating system to other manufacturers. Lessons from the mainframe business had shown that companies could develop “plug-compatible” machines that worked like large IBM computers. The Amdahl Corporation, for example, developed a processor for its Model 470 V/6 that ran IBM 360 software. Japanese companies also made significant entries into the American computer market via the plug-compatible business: Fujitsu began building Amdahl computers in Japan, and Hitachi began selling similar machines in the USA under the name of National Advanced Systems, a division of the National Semiconductor Corporation. With the non-exclusive agreement in hand, Gates was free to make similar deals with companies that could “legally” replicate the IBM computer and run the “MS-DOS” software.
While the IBM PC became the industry standard, backed by an expensive marketing campaign featuring a Charlie Chaplin look-alike, Compaq led the way for other manufacturers looking to produce “IBM-compatible” machines. Compaq invested in the legal and technical expertise needed to “reverse engineer” a crucial part of the PC’s architecture that IBM had produced in-house: the BIOS, or Basic Input/Output System.
Originally developed by Gary Kildall, the BIOS concentrated the hardware-dependent aspects of his CP/M operating system into “a separate computer code module” stored on a read-only memory chip, hence the name ROM-BIOS. This arrangement allowed CP/M to be adapted to the many different computers being built around Intel chips.[6] IBM had developed and copyrighted its own BIOS, which competitors had to “reverse engineer,” recreating its behavior without copying the code, before they could build compatible machines.
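To see why isolating the hardware-dependent code mattered, here is a minimal, purely illustrative sketch in C. The function names are hypothetical, not the actual CP/M or IBM ROM-BIOS entry points; the point is only that the operating system above calls a small, fixed set of routines, and only that one module has to be rewritten (or burned into a different ROM) for each machine.

/* Illustrative sketch of the BIOS idea (hypothetical names, not the real
   CP/M or IBM ROM-BIOS entry points): hardware-dependent code is gathered
   into one small module behind a fixed interface, so the operating system
   above it never touches the hardware directly. */
#include <stdio.h>

/* The stable interface the operating system is written against. */
void bios_console_putc(char c);

/* One machine's implementation: the only part that changes (and the part
   that lives in ROM) when the OS is moved to different hardware. Here
   putchar() stands in for poking this machine's video hardware. */
void bios_console_putc(char c) {
    putchar(c);
}

/* OS-level code stays the same on every machine. */
void os_print(const char *s) {
    while (*s)
        bios_console_putc(*s++);
}

int main(void) {
    os_print("Hello from the portable side of the operating system\n");
    return 0;
}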
Compaq put 15 engineers on the job and invested over a million dollars in the project to produce a portable PC. Despite IBM’s strength, Compaq ultimately had lower overhead and could compete with a cheaper computer. What made these clones popular, though, was that they could run the same software as the IBM machine. Compaq and others such as Commodore, Tandy, Zenith, and, most impressively, Dell provided “IBM-compatible” machines that could run the most popular non-Apple programs. The result was a proliferation of PCs, or as they came to be known, “PC clones.” Compaq sold over US$100 million worth of its “portable” personal computers in its first full year of business.
In one of the most extraordinary business arrangements in modern history, Microsoft leveraged its knowledge of the Intel microprocessor environment to outmaneuver IBM and establish MS-DOS as the dominant operating system for the PC. In a strategy Microsoft executive Steve Ballmer called “Riding the Bear,” Microsoft worked with IBM until it was strong enough to go it alone, ultimately becoming one of the richest companies in the world by having its software on nearly every PC. The strategy began with developing programming languages for the fledgling Intel-based microcomputer industry and continued, in association with IBM, with standardizing an operating system for the non-Apple market.
Video: Robert X. Cringely’s Triumph of the Nerds on reverse engineering and the rise of Compaq.
The development of the microprocessor-based computer industry by Intel and Apple led to the wide-scale use of electronic spreadsheets, what some would call the “killer app” of the personal computing revolution. VisiCalc, Multiplan, SuperCalc, Lotus 1-2-3, and Quattro were among the first spreadsheets to become available. By the mid-1980s, electronic spreadsheets had made their way into the corporate world and become an integral tool for the progression of digital monetarism.
Citation APA (7th Edition)
Pennings, A.J. (2015, Nov 2) Microsoft and the IBM PC Case Study: The Deal of the Century. apennings.com https://apennings.com/how-it-came-to-rule-the-world/microsoft-and-the-ibm-pc-case-study-the-deal-of-the-century/
Notes
[1] Quote from Jack Sams, IBM PC project manager as interviewed by Robert X. Cringely in Triumph of the Nerds based on his book Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date. Cringely has a new book about the decline of IBM.
[2] The impact of ASCII on the IBM PC from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 268.
[3] The three operating systems from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 268.
[4] Campbell-Kelly and Aspray’s (1996) Computer: A History of the Information Machine. Basic Books. p. 257.
[5] Information on IBM PC release from Michael J. Miller’s editorial in the September 4, 2001 issue of PC MAGAZINE dedicated to the PC 20th anniversary.
[6] BIOS information from Cringely, p. 58.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: BIOS > CP/M > Gary Kildall > IBM clones > IBM PC > Intel 8086 > Intel 8088 > PC-DOS > ROM-BIOS > Seattle Computer Products (SCP) > Tim Paterson > VisiCalc