Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Emerging Areas of Digital Media Expertise, Part 3: Global Knowledge and Geopolitical Risk

Posted on | March 5, 2016 | No Comments

This is the third post in a discussion of what kinds of knowledge, skills, and abilities are needed for working in emerging digital media environments. Students tend to gravitate toward certain areas of expertise according to their interests and perceived aptitudes and strengths. In previous posts, I discussed Design, Technical, and Strategic Communication aspects and then later Analytics and Visualizations. Below I will examine an additional area:

Global knowledge is increasingly expected in work environments, especially those dealing with digital media and cultural industries that operate across ethnic, racial, and national borders. While this area is fluid due to rapid technological and political changes, knowledge of regions, individual countries, and subnational groupings is relevant. The globalization of media and cultural industries is very much a process of adjusting to local languages, aesthetic tastes, economic conditions (including currency exchange rates), and local consumer preferences.

The challenge of operating internationally presents significant risks of alienating customers, audiences, colleagues, and potential partners. Cultural differences in risk tolerance, emotional expression, punctuality, formality, and ethnic sensitivities can influence product acceptance, viewing habits, and workplace friction. The world of social media, in particular, contains significant potential risks for brand and reputation management. Unresolved cultural differences can lead to losses in efficiency, negotiations, and productivity when management strategy fails to account for cultural sensitivities in foreign contexts.

Assessing economic, health, and political risks is also key to understanding the dynamics of foreign markets and operations. Economic factors such as exchange-rate instability or currency inconvertibility, tax policy, government debt burden, interference by domestic politics, and labor union strikes can influence the success of foreign operations. Extreme climate, biohazards, pollution, and traffic dangers can threaten not only operations but personnel as well.
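
To make the idea of risk assessment concrete, here is a minimal sketch of a weighted country-risk scorecard in Python. The factor names, weights, and scores are illustrative assumptions for this post, not an established index or methodology.

```python
# Hypothetical weighted scorecard for assessing country and operational risk.
# Factor names, weights, and scores are illustrative assumptions only.

RISK_WEIGHTS = {
    "currency_instability": 0.25,    # exchange-rate volatility, inconvertibility
    "tax_and_debt_policy": 0.15,     # tax policy, government debt burden
    "political_interference": 0.20,  # domestic politics, labor strikes, unrest
    "regulatory_change": 0.20,       # e.g., data-localization requirements
    "health_and_environment": 0.20,  # extreme climate, biohazards, pollution, traffic
}

def country_risk_score(scores: dict) -> float:
    """Combine factor scores (0 = low risk, 10 = high risk) into a weighted index."""
    return sum(weight * scores.get(factor, 0.0)
               for factor, weight in RISK_WEIGHTS.items())

if __name__ == "__main__":
    market_a = {"currency_instability": 7, "tax_and_debt_policy": 4,
                "political_interference": 6, "regulatory_change": 8,
                "health_and_environment": 3}
    print(f"Weighted risk index for market A: {country_risk_score(market_a):.2f} / 10")
```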

Political risks from the violence of war or civil disturbances such as revolutions, insurrections, coups d'état, and terrorism should be assessed, as should corruption and kidnapping. Regulatory changes by host countries can also influence media operations, such as Russia's recent requirement to store all data on Russian citizens within the country. Copyright and other intellectual property infringements are of particular concern to cultural and media industries.

Global knowledge often involves understanding how digital media and ICT can contribute to national development goals. Sustainable development objectives regarding education, health, and sanitation, as well as energy and food production involving digital media, have become high priorities in many countries. Knowing how indigenous and local communities can use these technologies for their specific goals, and how performers and cultural practices can protect their intellectual property, are high priorities. Gender justice and the empowerment of women often rely on media to enhance social mobility and viability. Local governance involving infrastructure planning, social change, and ethnic harmony may also involve technological components.

Digital services operating globally face trade-offs between projecting standardized features and customizing for local concerns. The latter involves cultural and emotional sensitivities as well as the ability to apply rational analysis to assess dangers to property, operational performance and most importantly, the people involved in foreign operations.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Digital Content Flow and Life Cycle: Global E-Commerce

Posted on | February 25, 2016 | No Comments

The term “e-commerce” evokes images of computer users “surfing” the web and using their credit cards to make online purchases. While this has been the popular conception, e-commerce continues to transform. The traditional model will continue to drive strong e-commerce sales for the retail sector, but other technologies and business models will also be important, especially with the proliferation of mobile and social technologies. E-commerce is as dynamic as the technologies and creative impulses involved and can be expected to morph and expand in concert with new innovations.

In this post, I will start to examine how e-commerce is a crucial stage in adding value to digital content. The graphic below represents the key steps in the digital media production cycle, starting at one o’clock.

[Figure: Digital Content Life Cycle]

The introduction of e-commerce into the media content life cycle is relatively new. I gave a talk to the Asian Business Forum in Singapore during the summer of 2000 that suggested broadcasters adopt an e-commerce model that included digitally serving, customizing, transacting, monetizing, interacting, delivering, and personalizing content.[1] I received quite a few blank stares at the time, maybe because it was a conference on “Broadcasting in the Internet Age.” Coming from New York City, I was familiar with the Silicon Alley discussions about the Internet and the changing media industries.

An analysis of the flow of digital content production can help companies determine the steps of web content globalization. A chain analysis involves diagramming and then analyzing the various value-creating activities in the content production, storage, monetization, and distribution process. It also involves an analysis of the “informating” process, the stream of data that is produced in the content life cycle. Digital content passes through a series of value-adding steps that prepare it for global deployment via data distribution channels to HDTV, mobile devices, and websites.[4]

Through this analysis, the major value-creating steps of the content production and marketing process can be systematically identified and managed. In the graphic above, the core steps in digital production and the linkages between them are identified in the context of advanced digital technologies. These are general representations, but they provide a framework for analyzing the activities and flow of content production. Identifying these processes helps in understanding the equipment, logistics, and skill sets involved in modern media production, as well as the processes that lead to waste. In other posts I will discuss each of the following:

The Digital Value Chain
Content Creation and Production
Content Storage and Management
Content E-Commerce
Content Distribution and Delivery
Content Usability and Consumption
Content Analytics and Critique
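
As a rough illustration of how these stages could be tracked in software, the sketch below models a content asset moving through the life cycle as a simple pipeline, with a log that stands in for the “informating” data stream mentioned above. The class and field names are my own assumptions for illustration, not an industry schema.

```python
# Illustrative sketch: tracking a digital content asset through the life-cycle
# stages listed above. Class and field names are assumptions for illustration.
from dataclasses import dataclass, field

STAGES = [
    "creation_and_production",
    "storage_and_management",
    "e_commerce",
    "distribution_and_delivery",
    "usability_and_consumption",
    "analytics_and_critique",
]

@dataclass
class ContentAsset:
    title: str
    stage_index: int = 0
    history: list = field(default_factory=list)  # stands in for the "informating" stream

    @property
    def stage(self) -> str:
        return STAGES[self.stage_index]

    def advance(self, note: str = "") -> None:
        """Move the asset to the next value-adding stage and log the step."""
        if self.stage_index < len(STAGES) - 1:
            self.stage_index += 1
        self.history.append(f"{self.stage}: {note}")

if __name__ == "__main__":
    episode = ContentAsset("Pilot episode")
    episode.advance("archived in the media asset management system")
    episode.advance("priced and licensed for regional storefronts")
    print(episode.stage)    # -> e_commerce
    print(episode.history)
```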

Notes

[1] “Internet and Broadcasting: Friends or Foes?” Invited presentation to the Asian Business Forum, July 7, 2000. Conference on Broadcasting in the Internet Age. Sheraton Towers, Singapore.
[2] Porter, Michael E. (1985) Competitive Advantage. New York: The Free Press. Ch. 1, pp. 11-15.
[3] A version of this article appeared in the November–December 1995 issue of the Harvard Business Review.
[4] Singh, Nitish. Localization Strategies for Global E-Business.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

When Finance Went Digital

Posted on | February 5, 2016 | No Comments

By the end of the 1970s, a basic domestic and international data communications system had been created that allowed the regime of digital monetarism to expand around the world. Drawing on ARPANET technologies, a set of standards emerged that was conducive to the public switched telephone networks operated by national telecom authorities such as France Telecom and Japan’s NTT (Nippon Telegraph and Telephone).

Rather than building on the user-oriented TCP/IP protocols that would take hold later, early packet-switched networking emerged from a series of ITU technical recommendations. The move was indicative of major tensions emerging between the governments that controlled international communications and the transnational commercial and financial institutions that wanted to operate globally. National telecommunications systems wanted to organize and control the networks with financial firms as their customers. They developed packet-switching protocols such as X.25 and X.75 for national and globally interconnected networks. Financial firms found these networks slow and wanted much more robust service that could give them competitive advantages and more secure communications.

The increase in financial activity during the 1970s fed the demand for data communications and new computer technologies such as packet-switched networking, minicomputers, and “fail-safe” mainframes like those from Tandem Computers.

After the dissolution of the Bretton Woods Agreements, banks and other financial businesses made it clear that new data services were needed for this changing monetary environment. They wanted computerized “online” capabilities and advanced telecommunications for participating in a variety of global activities, including currency transactions, organizing syndicated loans, and communicating with remote branches, as well as the complex data processing involved in managing increasingly complex accounts and portfolios for clients.

For corporations, it meant computerized trading systems to manage their exposure in foreign currencies as they were forced, in effect, to become currency gamblers to protect their overseas revenues. Their activities required new information technologies to negotiate the complex new terrain brought on by the volatility in international currency markets and the exponential rise in lending capital due to the syndicated Eurodollar markets.

The flood of OPEC money coming from the Oil Shocks increased the need for international data communications, as money in the post-Bretton Woods era was becoming increasingly electronic, and the glut of OPEC petrodollars needed a more complex infrastructure to recirculate its bounty to industrializing countries around the world.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Origins of Currency Futures and other Digital Derivatives

Posted on | January 13, 2016 | No Comments

Thus, it was fitting that Chicago emerged as the Risk Management Capital of the World—particularly since the 1972 introduction of financial futures at the International Monetary Market, the IMM, of the CME.
Leo Melamed to the Council on Foreign Relations, June 2004


International currency exchange rates began to float in the post-Bretton Woods environment of the early 1970s. The volatility was created by US President Richard Nixon’s decision to end the dollar’s convertibility to gold and a new uncertainty produced in the international sphere, particularly by the two oil crises and the spread of “euro-dollars.”

Multinational corporations and other operations grew increasingly uneasy as prices for needed foreign monies fluctuated significantly. Those requiring another country’s money at a future date were unnerved by news of geopolitical and economic events. Tensions in the Middle East, rising inflation, and the U.S. military failure in Southeast Asia were some of the major factors creating volatility and price variations in currency trading. This post introduces these new challenges to the global economy and how financial innovation tried to adapt to the changing conditions by creating a new system for trading currency derivatives.

Currency trading had long been a slow, unglamorous job in major banks around the world. Traders watched the telegraph, occasionally checked prices with another bank or two, and conducted relatively few trades each day. But with the end of the Bretton Woods controls on currency prices, a new class of foreign exchange techno-traders emerged in banks around the world. Armed with price and news information from their Reuters computer screens, and with trading capabilities enhanced by data networks and satellite-assisted telephone services, they began placing arbitrage bets on the price movements of currencies around the world. Currency trading was transformed into an exciting profit center and career track. But the resulting price volatility raised concerns about additional risks associated with the future costs and availability of foreign currencies.
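
As a simple illustration of the kind of arbitrage logic those techno-traders pursued, the sketch below checks whether three quoted exchange rates allow a profitable “triangular” round trip. The quotes and currencies are invented for the example; real desks would account for fees, spreads, and settlement, and obviously did not run Python in the 1970s.

```python
# Illustrative triangular-arbitrage check with invented quotes.
# A real trading desk would use live feeds, fees, and settlement constraints.

def triangular_round_trip(usd_to_dem: float, dem_to_chf: float, chf_to_usd: float,
                          stake_usd: float = 1_000_000.0) -> float:
    """Return the profit (or loss) in USD from converting USD -> DEM -> CHF -> USD."""
    dem = stake_usd * usd_to_dem          # buy Deutsche marks with dollars
    chf = dem * dem_to_chf                # buy Swiss francs with marks
    usd_back = chf * chf_to_usd           # convert back to dollars
    return usd_back - stake_usd

if __name__ == "__main__":
    profit = triangular_round_trip(usd_to_dem=2.40, dem_to_chf=0.98, chf_to_usd=0.43)
    print(f"Round-trip result on $1,000,000: {profit:,.2f} USD")
```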

It was a condition that did not go unnoticed in Chicago, the historical center of US commodity trading. Members of the Chicago Mercantile Exchange (CME) in particular were curious whether they could develop and trade contracts in currency futures. Enlisting the help of economist Milton Friedman, the traders at the CME lobbied Washington, DC to allow them to break away from their usual trading fare and transform the financial markets. It helped that George Shultz, a former University of Chicago professor, was then Secretary of the Treasury.

In early 1972, the International Monetary Market (IMM) was created by the CME to provide futures contracts for six foreign currencies, setting off an explosion of new financial products “derived” from base instruments like Eurodollars and Treasury bills.

The end of Bretton Woods also created the opportunity for a new “open outcry” exchange for trading in financial futures. The IMM was an offshoot of the famous exchange that initially preferred to distance itself from “pork belly” trading and offered seats at a cheaper rate to ensure that many brokers would be trading in the pit. Growing slowly at first, the IMM would soon be brought back under the wing of the CME and become the start of an explosion of derivative financial instruments that would grow into a multi-trillion-dollar business.

The computer revolution would soon mean the end of “open outcry” trading in the “pits” of the CME and on the floors of other financial institutions like the NYSE. Open outcry is a system of trading whereby sellers and buyers of financial products aggressively make bids and offers in a face-to-face setting using hand signals. The system was preferable in the trading pits because of the deafening noise and the speed at which trading could occur. Hand signals also overcame crowding, as traders could interact across a trading floor. But the system lacked some of the flexibility and efficiencies that came with the computer.

In 1987, the members of the Chicago Mercantile Exchange voted to develop an all-electronic trading environment called GLOBEX, built by the CME in partnership with Reuters. GLOBEX was designed as an automated global transaction system meant eventually to replace open outcry with a system for trading futures contracts that would not be limited to the regular business hours of the American time zones. They wanted an after-hours trading system that could be accessed in other cities around the world, and that meant trading 24/7.
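
To suggest what replacing open outcry with an automated transaction system involves, here is a toy order-matching sketch in Python. It is not GLOBEX’s actual architecture, simply an illustration of the price-time matching that electronic exchanges perform instead of face-to-face bids and offers.

```python
# Toy illustration of electronic order matching (not the actual GLOBEX design).
# Buy and sell orders are matched on price first, then time of arrival.
import heapq

class MiniMatcher:
    def __init__(self) -> None:
        self._bids = []   # max-heap of buy orders (stored as negative prices)
        self._asks = []   # min-heap of sell orders
        self._seq = 0     # arrival counter for time priority

    def submit(self, side: str, price: float, trader: str) -> None:
        self._seq += 1
        if side == "buy":
            heapq.heappush(self._bids, (-price, self._seq, trader))
        else:
            heapq.heappush(self._asks, (price, self._seq, trader))
        self._match()

    def _match(self) -> None:
        # Trade whenever the best bid meets or exceeds the best ask.
        while self._bids and self._asks and -self._bids[0][0] >= self._asks[0][0]:
            bid = heapq.heappop(self._bids)
            ask = heapq.heappop(self._asks)
            print(f"TRADE: {bid[2]} buys from {ask[2]} at {ask[0]}")

if __name__ == "__main__":
    book = MiniMatcher()
    book.submit("sell", 99.50, "Tokyo desk")      # resting offer, after US hours
    book.submit("buy", 99.75, "Chicago desk")     # crosses the spread, trade prints
```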

Today’s $5.3-trillion-a-day currency trading business is going through another transformation. Automation and electronic dealing are decimating foreign-exchange trading desks, and currency traders are being replaced by computer algorithms.

Notes

[1] Leo Melamed, “Chicago as a Financial Center in the Twenty-First Century,” a speech to the Council on Foreign Relations, Chicago, Illinois, June 2004. From http://www.leomelamed.com/Speeches/04-council.htm, accessed on November 14, 2004.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College, and Victoria University in New Zealand. During the 1990s he was also a Fellow at the East-West Center in Honolulu, Hawaii.

Statecraft and the First E-Commerce Administration

Posted on | January 7, 2016 | No Comments

One of techno-economic history’s most fascinating questions will deal with the stock market advances and technology developments during the 1990s. The eight years of the Clinton-Gore administration saw the proliferation of the Internet and telecommunications sectors. The Internet, a product of the Cold War, became a tool of global commerce.

The presidential election of 1992 was notable for the phrase “It’s the economy, stupid,” as Clinton attacked incumbent president George H.W. Bush for ignoring economic and social problems at home. The incumbent had won a dramatic military victory with the “Desert Storm” offensive that drove the Iraqis out of Kuwait, but years of federal budget deficits under the Republicans spelled his doom.

When the Clinton Administration moved into the White House in early 1993, it was looking at budget deficits approaching half a trillion dollars a year by 2000. Military spending and the massive tax cuts of the 1980s had resulted in unprecedented government debt and yearly budget interest payments that exceeded US$185 billion in 1990, up substantially from the $52.5 billion a year when Ronald Reagan took office in 1981.[1] Reagan and Bush (like his son, George W.), while strong on national defense, never had the political courage to reduce government spending.

Clinton was forced to largely abandon his liberal social plans and create a new economic agenda that could operate more favorably within the dictates of global monetarism. This new trajectory meant creating a program to convince the Federal Reserve and bond traders that the new administration would reduce the budget deficit and lessen the government’s demand for capital.[2] Clinton made the economy the administration’s number one concern, even molding a National Economic Council in the image of the famed National Security Council. Bob Rubin and others from Wall Street were brought in to lead the new administration’s economic policy. Clinton even impressed Federal Reserve Chairman Alan Greenspan, who had grown weary of the Reagan legacy and what the elder Bush had once called “voodoo economics.”[3]

Although the potential of the communications revolution was becoming apparent, what technology would constitute the “information highway” was not clear. Cable TV was growing quickly and offering new information services, while the Bell telephone companies were pushing Integrated Services Digital Network (ISDN) and experimenting with ADSL and other copper-based transmission technologies. Wireless was also becoming a viable new communications option. But as the term “cyberspace” began to circulate as an index of the potential of the new technologies, it was “virtual reality” that still captured the imagination of the high-tech movement.

Enter the Web. Although email was starting to become popular, it was not until the mass distribution of the Mosaic browser that the Internet moved out of academia and into the realm of popular imagination and use. At that point, the Clinton Administration would take advantage of rapidly advancing technologies to help transform the Internet and its World Wide Web into a vibrant engine of economic growth.

President Clinton tapped his technology-savvy running mate to lead this transformation, at first as a social revolution, and then a commercial one. Soon after taking office in early 1993, Clinton assigned responsibility for the nation’s scientific and technology affairs to Vice-President Gore.[4] Gore had been a legislative leader in the Senate on technology issues, channeling nearly $3 billion into the creation of the World Wide Web with the High Performance Computing Act of 1991, also known as the “Gore Bill.” While the main purpose of the Act was to connect supercomputers, it resulted in the development of a high-bandwidth (at the time) network for carrying data, 3-D graphics, and simulations.[5] It also led to the development of the Mosaic browser, the precursor to the Netscape and Mozilla Firefox browsers. Perhaps more importantly, it provided the vision of a networked society open to all types of activities, including the eventual promise of electronic commerce.

Gore then shaped the Information Infrastructure and Technology Act of 1992 to ensure that Internet technology development would apply to public education and services, healthcare, and industry. Gore drove the National Information Infrastructure Act of 1993, which passed the House in July of that year but fizzled out as the mood shifted toward relying on the private sector for more direct infrastructure building. He turned his attention to the NREN (National Research and Education Network), which attracted attention throughout the US academic, library, publishing, and scientific communities. In 1996, Gore pressed the “Next Generation Internet” project. Gore had indeed “taken the initiative” to help create the Internet.

But the Internet was still not open to commercial activity. The National Science Foundation (NSF) nurtured the Internet during most of the 1980s, but its content remained strictly noncommercial by legislative decree, even though it contracted its transmission out to private operators. Provisions in the NSF’s original legislation restricted commerce through the “acceptable use policy” clause required in its funded projects.

But pressures began to mount on the NSF as it became clear that the Internet was showing more commercial potential. Email use and file transfer were increasing dramatically. Also, the University of Minnesota’s release of Gopher, a point-and-click way of navigating ASCII files, made textual information readily accessible. Finally, Congressman Rick Boucher introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the NSFNET.[6] A few months later, while waiting for Arkansas Governor William Jefferson Clinton to take over the Presidency, outgoing President George Bush, Sr. signed the bill into law. The era of Internet-enabled e-commerce had begun.


Notes

[1] Debt information from Greider, W. (1997) One World, Ready or Not: The Manic Logic of Global Capitalism. New York: Simon & Schuster. p. 308.
[2] Bob Woodward’s The Agenda investigated the changes the Clinton-Gore administration implemented after being extensively briefed on the economic situation it inherited. Woodward, B. (1994) The Agenda: Inside the Clinton White House. NY: Simon & Schuster.
[3] Information on Greenspan’s relationship with Clinton from Woodward, B. (1994) The Agenda: Inside the Clinton White House. NY: Simon & Schuster.
[4] Information on Gore’s contribution as Vice-President. Kahil, B. (1993) “Information Technology and Information Infrastructure,” in Branscomb, C. ed. Empowering Technology. Cambridge, MA: The MIT Press.
[5] Breslau, K. (1999) “The Gorecard,” WIRED. December, p. 321. Gore’s accomplishments are listed.
[6] Segaller, S. (1998) Nerds 2.0.1: A Brief History of the Internet.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College, and Victoria University in New Zealand. During the 1990s he was also a Fellow at the East-West Center in Honolulu, Hawaii.

America’s Financial Futures History

Posted on | January 2, 2016 | No Comments

In his book Nature’s Metropolis (1991), William Cronon discussed the rise of Chicago as a central entrepôt in the formation of the American West. The city was strategically located between the western prairies and northern timberlands, with access routes by river and the Great Lakes. As a dynamic supplier of the nation’s food and lumber resources, Chicago was ideally suited to become the port city for the circulation of the West’s great natural bounty. Located on the shore of Lake Michigan, “Chicago stood in the borderland between the western prairies and eastern oak-hickory forests, and the lake gave it access to the white pines and other coniferous trees of the north woods. Grasslands and hardwoods and softwood forests were all within reach.”[1] With the westward expansion of agricultural settlements throughout the 19th century, farmers started to look for markets to sell their non-subsistence beef, pork, and wheat supplies. Likewise, they were attracted to the big city to buy northeastern industrial goods at lower prices.

By the 1880s, Chicago enthusiasts were making comparisons with Rome, noting, however, that while “all roads lead to Rome,” Chicago would become “the Rome of the railroads.” The famous city got its start after 1833, when the local Indian tribes were forced to sign away the last of their legal rights to the area and the construction of the Erie Canal meant a new waterway to New York and the East Coast. It prospered through the Civil War, during which it played a key role in supplying the North with foodstuffs and other resources from the western frontier. Into the next century, Chicago would continue to grow into the central node of a vast trading network of nature’s bounty, what would generically be called “commodities.”

The “commodity” emerged as an abstract term to refer to items such as grains, meat products, and even metals that are bought and sold in a market environment. Commodities also came to be sold in “futures,” legal contracts specifying the change of ownership at a future time and at a set price. Bakeries in the East, for example, could lock in prices for future wheat deliveries. Key developments in the emergence of commodity trading were the notions of grading and interchangeability. To the extent that products could be graded into standard qualities, they could be grouped together according to a set standard and then sold anonymously.
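
The price-lock logic of a futures contract can be reduced to simple arithmetic: whatever the buyer loses (or gains) in the cash market at delivery time is offset by the gain (or loss) on the contract. The worked sketch below uses invented numbers for an eastern bakery; the figures are illustrative only.

```python
# Worked example with invented numbers: a bakery locks in a wheat price with a
# futures contract, offsetting a later rise in the cash ("spot") market.

contract_price = 1.10     # dollars per bushel agreed today for future delivery
bushels = 5_000           # illustrative contract size
spot_at_delivery = 1.30   # cash price per bushel when the wheat is actually needed

cost_in_cash_market = spot_at_delivery * bushels
futures_gain = (spot_at_delivery - contract_price) * bushels   # value of the contract at delivery
effective_cost = cost_in_cash_market - futures_gain            # what the bakery really pays

print(f"Cash-market cost:  ${cost_in_cash_market:,.2f}")
print(f"Gain on futures:   ${futures_gain:,.2f}")
print(f"Effective cost:    ${effective_cost:,.2f}  (= ${contract_price:.2f}/bushel locked in)")
```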

While technological innovations such as the railroad and the steamship would dramatically increase the efficiency of transporting goods, the development of Chicago’s famed exchanges would facilitate their allocation. The Chicago Board of Trade was started in 1848 as a private organization to boost the commercial opportunities of the city, but it would soon play a crucial part in the development of a centralized site for the Midwest’s earthly gifts. It was the Crimean War a few years later that created the need for such a commodities market. As the demand for wheat doubled and tripled, the CBOT prospered, and membership increased proportionally.

In 1856, the Chicago Board of Trade made the “momentous decision to designate three categories of wheat in the city – white winter wheat, red winter wheat, and spring wheat – and to set standards of quality for each.”[2] The significance of the action was that it separated ownership from a specific quantity of grain. As farmers brought their produce to the market, the elevator operator could mix it with similar grains and the owner could be given a certificate of ownership of an equal quantity of similarly graded grain. Instead of loading grain in individual sacks, it could be shipped in railroad cars and stored in silos. The grain elevators were a major technological development that increased Chicago’s efficiency in coordinating the movements and sales of its grains.

The Board was still having problems with farmers selling damp, dirty, low-quality, and mixed grains, so it instituted additional grading distinctions. By 1860, the Chicago Board of Trade had more than ten distinctions for grain, and its right to impose such standards was written into Illinois law, making it a “quasi-judicial entity with substantial legal powers to regulate the city’s trade.”[3]

During this same time period, the telegraph was spreading its metallic tentacles throughout the country, providing speedy access to news and price information. The western end of the transcontinental link was built in 1861, connecting Chicago with cities like Des Moines, Omaha, Kearney, Fort Laramie, Salt Lake City, Carson City, Sacramento, and on to San Francisco.[4] The western link quickly expanded to other cities and connected Chicago with farmers, lumberjacks, prospectors and ranchers eager to bring their goods to market. News of western harvests often triggered major price changes as it was transmitted rapidly between cities. News of droughts, European battles, and grain shortages brought nearly instant price changes in Chicago. Newspapers were a major beneficiary of the electric links as they printed major news stories as well as price information coming over the telegraph. The telegraph quickly made the Chicago Board of Trade a major world center for grain sales and linked it with a network of cities such as Buffalo, Montreal, New York, and Oswego that facilitated the trade of grains and other commodities.

The telegraph also instituted the futures market in the US. As trust in the grading system sanctioned by the Chicago Board of Trade grew, confidence in the quality of the anonymous, interchangeable commodity also increased. The telegraph allowed one of the earliest forms of e-commerce transactions to occur regularly. The “to arrive” contract specified the delivery of a set amount of grain to a buyer, most often in an eastern city. The railroad and the steamboat made it easier to guarantee such a delivery, and the guarantees also provided a needed source of cash for Western farmers and their agents. These contracts could be used as collateral to borrow money from banks. While the “to arrive” contracts had seen moderate use previously, the telegraph (along with the grading system) accelerated their use. Along with the traditional market in grain elevator receipts, a new market in contracts for the future delivery of grain products emerged. Because they followed the basic rules of the Board, they could be traded. “This meant that futures contracts–like the elevator receipts on which they depended—were essentially interchangeable, and could be bought and sold quite independently of the physical grain that might or might not be moving through the city.”[5]

During the 1860s, the futures market became institutionalized at the Chicago Board of Trade. The Civil War helped facilitate Chicago’s futures markets, as commodities such as oats and pork were in high demand by the Union Army. After the war, European immigration increased substantially in the eastern cities, further increasing the demand for western food and lumber commodities. Bakers and butchers needed a stable flow of ingredients and product. The new market for futures contracts transferred risk from those who could ill afford it, the bread-bakers and cookie-makers, to those who wanted to speculate on those risks.

In 1871, Chicago suffered a devastating fire, but the city came back stronger than ever. By 1875, the market for grain futures reached $2 billion, while the grain cash business was estimated at a lesser $200 million.[6] Abstract commodity exchange had surpassed the market for “real” goods. “To arrive” contracts facilitated by telegraph, in combination with elevator receipts, had made Chicago the futures capital of the world.

In 1874, the precursor to the Chicago Mercantile Exchange was formed. Called the Chicago Produce Exchange, it focused on farm goods such as butter, eggs, and poultry. In 1919, at the end of World War I, the Chicago Produce Exchange changed its name to the Chicago Mercantile Exchange (CME). It later expanded to trade frozen pork bellies as well as live cattle and live hogs in the 1960s. The CME expanded yet again to include lean hogs and fluid milk, but its most important innovations came a decade later in the financial field.

Notes

[1] Quote on Chicago’s strategic location from William Cronon, Nature’s Metropolis (1991), p. 25.
[2] Chicago Board of Trade’s momentous decision from William Cronon, Nature’s Metropolis (1991), p. 116.
[3] Additional information on the Chicago Board of Trade from William Cronon, Nature’s Metropolis (1991), pp. 118-119.
[4] Transcontinental link cities from Lewis Coe’s (1993) The Telegraph. London: McFarland & Co. p. 44.
[5] Quote on futures interchangeability from William Cronon, Nature’s Metropolis (1991), p. 125.
[6] Estimates of the grain and futures markets from William Cronon, Nature’s Metropolis (1991), p. 126.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College, and Victoria University in New Zealand. During the 1990s he was also a Fellow at the East-West Center in Honolulu, Hawaii.

Computer Technology and Problem-Based Learning (PBL)

Posted on | December 7, 2015 | No Comments

Many computer labs are now designed around the concept of problem-based learning (PBL), a student-centered approach in which participants work in groups to solve open-ended problems. Instead of teachers presenting relevant material first and then having students apply the knowledge to solve problems, PBL engages the students in the problem itself, inviting them to use higher-order thinking skills such as analysis, creativity, evaluation, and logic while working cooperatively in teams. Computer labs can facilitate problem-based learning as they are increasingly being designed with an eye towards collaboration, critical thinking, communication, research, and project management.

Theory Based Pedagogy

PBL applies insights from social constructivist and cognitive learning theories to create educational environments that reflect real-world complexities and foster personal responsibility for problem-solving. It builds on prior knowledge and fosters multiple ways of understanding and solving problems. It is an active process that seeks to drive motivation and understanding through challenge, relevance, and accomplishment. PBL can be framed within the theory of John Dewey and the educational philosophy of pragmatism, which holds that human beings learn best through a practical, “hands-on” process.

Applications of PBL

PBL emerged primarily in medical schools as a curriculum development and delivery system to help students develop and hone problem-solving skills in an information-intensive environment. Medical professionals need life-long learning skills to keep up with new information and techniques in their field, so PBL was implemented to help students acquire the necessary knowledge and skills. PBL is increasingly used in corporate environments as it promotes stronger working relationships and problem-solving tendencies. The case study method is a classic example of problem-based learning used in business schools.

Advantages of PBL

Don Woods has suggested that PBL is any learning environment in which the problem drives the learning. The problem is meant to be the vehicle that moves the group and each student towards the cognition of the situation and the construction of a set of solutions.

PBL mainly involves working in teams but can include independent and self-directed learning. Teams offer opportunities for leadership, stress communication and listening skills, and reward cooperation. Technologies enable asynchronous and synchronous collaboration schemes that challenge leadership skill-sets but translate into real world capabilities to manage projects and facilitate teamwork.

Central to team success is the self-awareness of individual capabilities and the evaluation of group processes. PBL stresses competences in oral and written communication as well as the ability to explain and often visualize key concepts in schematic forms for infographic slide presentations and short videos.

Whether working in a group or alone, critical thinking and analysis are combined with the ability to do additional in-depth online research across related disciplines. Independent work can facilitate self-reliance and a sense of self. Self-directed learning lets students recognize their own cognitive strategies and understand concepts at their own pace. Major components are connecting the problem to course content and making sure it is situated in real-world examples.

What are the Steps Involved?

Rather than teaching relevant material and subsequently having students apply the knowledge to solve problems, the problem is presented first. The process generally involves these steps:

– Introduce the problem in a carefully constructed, open-ended set of questions
– Establish groups, including rules and etiquette
– Organize and define roles including a leader/facilitator and a record keeper
– Initiate discussion by examining and defining the problem
– Explore what is already known about the problem
– Clarify related issues and define variables
– Determine what additional information needs to be learned and who is to pursue it
– Determine where to acquire the information and tools necessary to solve the problem
– Pursue individual tasks
– Regroup, share, and evaluate possible ways to solve the problem
– Strive for consensus and decide on a strategy to solve the problem
– Report on their findings and proposed solutions
– Assess learning acquisition and skill development

Limitations of PBL Include:

– Participants may need fundamental knowledge in key areas
– They must be willing to work within a group
– Time is needed to organize, engage fully, and report findings in a coherent way
– Assessment becomes more complicated and often less quantitative
– Teachers must learn to be facilitators of the learning process
– Group dynamics can compromise effectiveness
– Wrong decisions can influence real world solutions
– Results are better when working within flexible classroom spaces

PBL and Technology

This last limitation raises the question of how technology and designed learning environments can assist the collaborative and research processes that are central to PBL. Educational technologies can help the learner and diminish many challenges faced by the instructor. Technology can streamline instructional processes by structuring communicative interactions and knowledge finding. Group processes can proceed synchronously in a simultaneous face-to-face setting or asynchronously online and across geographic divides. Web-based learning can offer personalized learning tools to students of different abilities in and out of the classroom.

Computers are increasingly being used in classrooms to provide interactive experiences. Students working around a table with one computer within reach and easy viewing can conduct collaborative and group decision-making exercises. These desk-based terminals provide many kinds of curricular resources: informative websites, games, manuals, online books, simulations, and more that present problems that can challenge the students and drive learning. Games, in particular, can engage students in relevant and rigorous “meaningful play” through structured activities that provide context and feedback.

Web-based tutoring systems have not consistently been designed to challenge students or deliver interesting curricular knowledge with a tested instructional pedagogy. However, computer-based learning environments built on PBL and gamification have come a long way, offering online and packaged media with problem-based tutoring, complete with graded homework assignments and feedback.

To what extent are educational pedagogies limited by the architecture, spaces, and technology available to us? This is a major challenge in education today and seems to be contingent on the willingness and creativity of educational professionals. While the most immediate responsibility rests on the shoulders of teachers and professors, it is also contingent on classroom designers and school administrators to construct spaces that support coherent strategies and technologies to implement problem-based learning.

In summary, PBL is a curriculum development and delivery system that recognizes the need to develop problem-solving skills, including information literacy and research capabilities that reach across disciplines. Many medical and professional schools already use these practices as the training they impart has real-world consequences. It requires a structured approach to communication and group organization while remaining committed to real-world relevance as well as a willingness to let the students control their own learning.

Resources

Monahan, Torin. 2002. “Flexible Space & Built Pedagogy: Emerging IT Embodiments.” Inventio 4 (1): 1-19.
Pawson, E., Fournier, E., Haight, M., Muniz, O., Trafford, J., and Vajoczki, S. 2006. “Problem-based Learning in Geography: Towards a Critical Assessment of its Purposes, Benefits and Risks.” Journal of Geography in Higher Education 30 (1): 103–16.
“Problem-Based Learning.” Cornell University Center for Teaching Excellence, n.d. Web. 29 Nov. 2015.
Woods, Don. “Problem-Based Learning (PBL).” N.p., n.d. Web. 30 Nov. 2015.




Anthony J. Pennings, PhD is the Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

YouTube: Alice’s Rabbithole or Hypertext to Hell?

Posted on | November 24, 2015 | No Comments

Edited remarks from the conference on
YouTube: Ten Years After
November 20, 2015
Hannam University
Linton School of Global Business

I’m happy to be a part of this commemoration of YouTube’s tenth year anniversary and thank Dr. Youngshin Cho for being our keynote speaker and enlightening us with his very informative talk.[1]

Hopefully, you are familiar with the story of Alice’s Adventures in Wonderland. The young girl from Lewis Carroll’s (1865) classic follows a giant rabbit into a dark hole and falls into a world of strange experiences, exotic creatures, and unusual paradoxes…

The story has been much discussed over the years, and speculations on the meanings of Alice’s adventure range from experiences with psychedelic drugs and spoiled foods to wild dreams and even repressed sexual fantasies.

YouTube has all of these things, but my interest today is not to focus on content specifically but to point to, and for us to think about, the path, the journey, the discovery process inherent in the YouTube experience. I want to talk about the user experience, and specifically how it is guided by computer algorithms and a type of artificial intelligence.

While the meaning of Alice’s story will no doubt continue to be a topic of curious debate, there is a consensus that Carroll’s narrative is based on a sequence, a trajectory, a path that she follows. The term “rabbit hole” does not refer to a destination as much as it has come to mean a long, winding, and sometimes meandering journey with many connections and offshoots that often lead to serendipitous and surprising discoveries.

I’m not the first to connect Alice’s journey to online activities. Others have pointed to the Internet as essentially designed to function as a “rabbit hole,” primarily because of the way hyperlinks work, connecting one web page with another, taking the user down a voluntary “surfing” trip into new sites and visiting strange ideas and wonders along the way.

Interestingly, before we had the World Wide Web, we had the Xanadu project by Ted Nelson. He coined the terms “hypertext” and “hypermedia” in a publication way back in 1965. Xanadu was originally the name of Kublai Khan’s fabled summer palace, described by the enigmatic Marco Polo: “There is at this place a very fine marble palace, the rooms of which are all gold and painted with figures of men and beasts and birds, and with a variety of trees and flowers, all executed with such exquisite art that you regard them with delight and astonishment.” Nelson’s Xanadu strove to use computer technology to transform reading into an equally rich experience.

Let’s go to another mythical place, South Korea’s Gangnam district. Psy’s musical parody of the Seoul suburb’s delights and duplicities was an extraordinary YouTube hit.

With over 2,450,000,000 hits, the Korean singer’s Gangnam Style is leading YouTube with the most plays, but my immediate interest is the list of thumbnails of recommended hypertexted videos on the lower right side.

A key to understanding YouTube is its recommendation engine, a sophisticated computer algorithm and data collection system that finds content related to your search or to the content you are watching. These computer programs reduce what could become a complex decision process to just a few recommendations. The engine immediately lists a series of videos based on metadata from the video currently playing and on information gathered about your past viewing.
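
As a minimal sketch of the kind of content-based scoring such an engine might perform, the example below ranks candidate videos by tag overlap with the current video and with a viewer’s history. The tags, weights, and function names are my own assumptions for illustration; YouTube’s actual system is far more elaborate and is not public in this form.

```python
# Minimal content-based recommendation sketch. Tag overlap between the current
# video, each candidate, and the viewer's history drives the score. This is an
# illustrative assumption, not YouTube's actual (and far more complex) system.

def score(candidate_tags: set, current_tags: set, history_tags: set,
          w_current: float = 0.7, w_history: float = 0.3) -> float:
    """Weighted Jaccard similarity against the current video and viewing history."""
    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if (a or b) else 0.0
    return (w_current * jaccard(candidate_tags, current_tags)
            + w_history * jaccard(candidate_tags, history_tags))

def recommend(candidates: dict, current: set, history: set, k: int = 3) -> list:
    """Return the top-k candidate titles by score."""
    ranked = sorted(candidates,
                    key=lambda title: score(candidates[title], current, history),
                    reverse=True)
    return ranked[:k]

if __name__ == "__main__":
    current = {"k-pop", "parody", "dance"}
    history = {"music-video", "k-pop", "comedy"}
    candidates = {
        "Gentleman": {"k-pop", "parody", "dance"},
        "Flat-earth lecture": {"conspiracy", "space"},
        "ELO - Xanadu": {"music-video", "film", "pop"},
    }
    print(recommend(candidates, current, history))
```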

In a business context, this is called customer captivity, an important competitive advantage and a barrier to entry for other firms. It’s a technique to engage people in your content and keep someone on a web site by offering related or similar choices. Amazon and other e-commerce firms are increasingly using it. Netflix’s recommendation engine is also noteworthy.

The list of YouTube’s recommended videos is likely to vary, but it will probably include other versions of Gangnam Style and the rather awful Gentleman, which brings me to my next topic.

Hypertext to Hell? You may have heard the hit “Highway to Hell” by the Australian heavy metal band AC/DC, whose lead singer drank himself to death a year later. I wish him all the best and thank him for excusing my homonym.

I think hell differs for each of us. There is an SNL skit in which Paul Simon, of Simon and Garfunkel, meets with the devil and sells his soul in order to become a famous musician. Years later, when the time comes to pay up, he discovers he is stuck in an elevator, listening to Muzak, the soft “easy listening” instrumentals of his classic songs such as The Sound of Silence and Mrs. Robinson, for all eternity.

Here I am referring not to any afterlife but rather to the earthly hell we put ourselves into on occasion. YouTube can offer that same experience, if you choose. I discovered one pathway to a personal hell in a series of videos arguing for the validity of the “flat earth.” Some people actually believe the idea that the earth is round is NASA propaganda. I guess that is my personal hell because I was always fascinated with space and interstellar travel and perhaps quaintly believe astronauts landed on the Moon.

Well, I don’t want to leave you in hell. So I’ll return you to Xanadu (1980) with this video from the movie of the same name, featuring the late Olivia Newton-John and the Electric Light Orchestra.

In closing, I’m going to make a slight jump and suggest that the human mind works very much like a recommendation engine. It gives you more of the thoughts on which you choose to focus. If you focus on positive thoughts, your mind will start to give you more of those, presenting you with similar ideas and emotional experiences. If you focus on topics you hate, your mind will give you more of that, driving you into a personal hell.

So be careful with your mental clicks, and avoid the highways to hell. Sometimes you go down that rabbit hole; sometimes you traverse the hypertext to hell. Make the right choices and get out if you find yourself going down the dark side (switching metaphors). Just do another search.

Notes

[1] Dr Youngshin Cho is a Senior Research Fellow at SK Research Institute in Seoul where he is conducting research on media trends and strategies of media and ICT companies, as well as looking at the impact of change in media policy. He obtained his PhD in Media and Telecommunication Policy from Pennsylvania State University in 2007 and his MA from Yonsei University in 1998. One of the country’s acknowledged authorities on these subjects, Dr Cho is also a consultant with the government on media policy.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page for an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.