Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Mind Mapping in Higher Education

Posted on | October 22, 2015 | No Comments

An underutilized creative technology that enhances the educational experience is a writing process called mind mapping. Mind mapping is a visual form of contemplation and word association that can be used for note taking, journaling, and creative thinking. Connecting words with branching lines and adding colors and images creates a mental dynamism that facilitates getting ideas into concrete form on paper or an electronic document that can be readily reviewed and shared.

This post describes mind maps, how they are used, their value to students, and how they can be used for lectures, class exercises, and brainstorming activities.

What are mind maps?

Inspiration Software, Inc. defines a mind map as “a visual representation of hierarchical information that includes a central idea surrounded by connected branches of associated topics.” Mind maps can be either drawn by hand or designed in a software application like those offered by Inspiration, Mindjet, or MindMeister. The basic technique draws on the human tendency to make mental connections by word association and the power of writing to stimulate thoughts and creative ideas.

How do they work?

Mind maps start in the middle of a page or document with a central idea and expand outward using keywords on branches. They move from the general to the specific, with details becoming more defined as the map expands outward. Fewer words are better than phrases or sentences, but every distinct keyword, grouping of words, or image should be set on its own line. Be sure the lines are the same length as the word or image they support. It is important to economize on space, as reaching the perimeter of the document restricts your thinking and means you are literally running out of room for ideas.
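
For those who like to tinker, the radial hierarchy described above can be sketched as a simple tree of keyword nodes. The short Python sketch below is only an illustration; the class and field names are made up and do not come from any particular mind-mapping product.

# A minimal sketch of a mind map as a tree of keyword nodes.
# Class and field names are illustrative, not from any mind-mapping product.
class Node:
    def __init__(self, keyword):
        self.keyword = keyword   # one keyword (or short phrase) per node
        self.branches = []       # child branches radiating outward

    def add(self, keyword):
        branch = Node(keyword)
        self.branches.append(branch)
        return branch

    def show(self, depth=0):
        # Print the map as an indented outline, general to specific.
        print("  " * depth + self.keyword)
        for branch in self.branches:
            branch.show(depth + 1)

# Central idea in the middle, details branching outward.
center = Node("Mind Mapping")
uses = center.add("Uses")
uses.add("Note taking")
uses.add("Brainstorming")
design = center.add("Design")
design.add("Keywords, not sentences")
design.add("Colors and images")
center.show()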

How students can use mind maps

These diagrams can be used in a number of ways. At their best, they help classify ideas or generate new connections between ideas, the very definition of creativity. They can make note taking in classes more efficient, engaging, and fun. Notes taken in mind maps are also easier and faster to review, making studying for a test more efficient. Mind maps can help organize ideas for writing projects as they are basically non-linear outlines. Examine this mind map on Shakespeare.

Another way to use mind maps is to start off a class by asking students to map out the previous class. This facilitates learning by reinforcing key ideas, directs students’ attention to the class at hand and its topics, and gives them visible information to share in class. Because they are good for getting to details, mind maps are also useful for event planning, such as class trips, group presentations, or graduation parties.

Using mind maps for presentations

Mind maps can be used to present information in class lectures, although certain precautions should be taken. They are inherently personal, so other viewers should be guided through them step by step. Mind maps projected from a computer onto a screen should be accompanied by specific guidance. First, the principles of the hierarchical structure should be reviewed. The presentation should then proceed from the central idea to the details, showing both the big picture and the significance of related concepts.

It is efficient to organize the key ideas of the map in a clockwise fashion starting at 1 o’clock. This also makes the map more intelligible by providing a familiar structure.

Mind maps should be converted to JPEGs and then to PDFs for additional usability. With a PDF you can highlight the section of the map you want to focus on. Using the mouse, you can click and draw squares and rectangles around the words, phrases, or images that are relevant to the lecture.
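
For those who prefer to script that conversion, the step from an exported JPEG to a PDF takes only a couple of lines. This is a minimal sketch assuming the Pillow imaging library is installed; the file names are placeholders for an exported mind-map image.

# A minimal sketch of the JPEG-to-PDF step, assuming the Pillow library.
# The file names are placeholders for an exported mind-map image.
from PIL import Image

image = Image.open("mind_map.jpg").convert("RGB")  # PDF export requires RGB mode
image.save("mind_map.pdf")                         # Pillow writes a one-page PDF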

Group brainstorming

Mind maps drawn on a whiteboard, blackboard, or a large newsprint pad on an easel can be used with groups of students to brainstorm and exchange ideas. This means capturing each person’s thoughts while simultaneously stimulating the group’s best thinking. Words are again organized radially around the central idea. Prioritization can be deferred until subsequent stages, when more sequential or hierarchical structures can be arranged in a new mind map, a Gantt chart, or another form of visual organization. By presenting ideas visually, the process is easier for students to follow, and they can more readily contribute to the group process.

Summary

Mind mapping is a valuable tool for higher education activities. Students can use them for notetaking and reviewing previous lectures. They can also be used for class exercises that stimulate creative thinking. Mind maps have a magnetic quality where ideas attract similar thoughts. Therefore they can be used to increase student concentration and focus.

Faculty can use mind maps to develop class presentations and even deliver their lectures with them, drawing on a board or projecting from a computer onto a screen. Although these diagrams are not immediately useful for communicating ideas, with proper guidance they can be used very effectively. They are also good for reviewing previous lectures or reading assignments. Lastly, they can be used for brainstorming and facilitating student participation.





Anthony J. Pennings, PhD is the Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

Markets: Pros and Cons

Posted on | October 4, 2015 | No Comments

The term “market” has been widely used to refer to any arrangement, institution, or mechanism that facilitates contact between potential sellers and buyers and helps them negotiate the terms of a transaction. In other words, a market is any system that brings together the suppliers of goods and services with potential customers.

The term “market” evokes imagery of a medieval city center or town square filled with merchants peddling food or wares and haggling over prices with interested customers. A market depends on the conditions of voluntary exchange where buyers and sellers are free to accept or reject the terms offered by the other. Voluntary exchange assumes that the acts of trading between persons make both parties to the trade subjectively better off than they were before the trade.

Markets also assume that competition exists between sellers, and between buyers. Economic models of markets are based on the idea of perfect competition, where no one seller or buyer can control the price of an “economic good.” In this vision of a rather utopian economic system, the acts of individuals, working in their own self-interest, will operate collectively to produce a self-correcting system. Prices will move to an “equilibrium point” where producers of economic goods will supply an adequate amount to meet the demand of consumers willing to pay that price. Unless someone was cheated, both parties end the transaction satisfied because the exchange has gained them some advantage or benefit.
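
To make the idea of an equilibrium point concrete, here is a toy calculation with hypothetical linear demand and supply curves; the numbers are invented for illustration and do not come from the text.

# A toy illustration of a market-clearing price, with made-up linear curves:
# quantity demanded falls as price rises, quantity supplied rises as price rises.
def demand(price):
    return 100 - 4 * price   # quantity consumers are willing to buy

def supply(price):
    return 10 + 2 * price    # quantity producers are willing to sell

# Setting demand equal to supply: 100 - 4p = 10 + 2p  ->  p = 15
equilibrium_price = (100 - 10) / (4 + 2)
print(equilibrium_price, demand(equilibrium_price), supply(equilibrium_price))
# -> 15.0 40.0 40.0  (at a price of 15, quantity demanded equals quantity supplied)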

An important condition is the “effective demand” of consumers – do the buyers of economic goods have sufficient bargaining power, mainly money? Consumers must have the desire, plus the money to back it up. Central to any market is a mutually accepted means of payment. While parties may exchange goods and services by barter, most markets rely on sellers offering their goods or services (including labor) in exchange for currency from buyers. Any medium of exchange depends on trust in the monetary mechanism, as buyers and sellers must readily accept and part with it for a market to function effectively. Money has a long history of taking physical form, most notably gold. Gold has striking physical attributes: it doesn’t rust, it doesn’t chip, and it can be melted into a variety of shapes. Other metals such as silver and platinum have also served as money.

It is interesting that societies gravitate towards the use of some symbolic entity to facilitate these transactions. At its most basic level, money can be anything that a buyer and seller agree is money. At times, commodities such as rice, tobacco, and even alcohol have served the role of money. Market enthusiasts often overlook the importance of money, focusing instead on the behaviors of market participants.

The pros and cons of markets are hotly debated today. Some believe markets are an ideal system for organizing society. They often cite Adam Smith’s famous “invisible hand” as the God-given mechanism that organizes a harmonious society based on market activity. Others believe markets are prone to failure, give rise to unequal conditions, and undermine democratic participation.

One of the best explanations of the strengths and weaknesses of the market system comes from The Business of Media: Corporate Media and the Public Interest (2006) by David Croteau and William Hoynes. They point to the strengths of markets, such as efficiency, responsiveness, flexibility, and innovation. They also discuss the limitations of markets: enhancing inequality, amorality, the failure to meet social needs, and the failure to meet democratic needs. Below is a summary of some of their key ideas.

The market provides efficiency by forcing suppliers to compete with each other and into a relationship with consumers that requires their utmost attention. The suppliers of goods and services compete with one another to provide the best products, and the competition among them forces them to bring down prices and improve quality. Firms become organized around cutting costs and finding advantages over other companies. They have immediate incentives to produce efficiencies as sales and revenue numbers from market activities provide an important feedback mechanism.

Responsiveness is another feature of markets that draws on the dynamics of supply and demand. Companies strive to adapt to challenges in the marketplace. New technologies, as well as the changing expectations, incomes, tastes, and preferences of consumers, require companies to make changes in their products, delivery methods, and retail schedules. Likewise, consumers respond to new conditions by shopping for bargains, finding substitute goods, and adopting new trends.

Flexibility refers to the ability of companies to adapt to changing conditions. In the absence of a larger regulatory regime, companies are able to produce new products, new versions of products, or move in entirely new directions. In a market environment, companies can compete for consumers by making changes within their organizational structure, including adjustments in production, marketing, and finance.

Lastly, markets stimulate innovation in that they provide rewards for new ideas and products. The potential for rewards, and the necessity of gaining competitive advantages, drive companies to innovate. Rewards include market share, but also increased profits. They point out that without competition, firms avoid risk, an essential component of innovation, since many experiments fail.

Croteau and Hoynes also point out serious concerns about markets that economists do not generally address.

The tendency of markets to reproduce inequality is one important drawback to markets. While some inequality produces contrast and incentives to work hard or to be entrepreneurial, a society with a major divide between haves and have-nots will tend towards dystopia, a “sick” place. Thomas Piketty’s Capital addresses this issue head-on and warns that capital gravitates towards more inequality, and the trickle-down effects tend to lead to a slower and slower drip. Neo-elites benefiting from the rolling back of the estate tax have advantages that others don’t have. Croteau and Hoynes use the vote metaphor, “one dollar, one vote” to refer to the advantages the rich have over the poor, as they have many more dollars, and thus many more votes.

The second concern they have about markets is that they are amoral. Not necessarily immoral, but the market system only registers purchases and prices and doesn’t make moral distinctions between, for example, human trafficking, drug trafficking, and oil trafficking. Commerce in a drug to cure malaria does not register differently from commerce in a drug that provides a temporary physical stimulation. The market does not judge products except by registering changes in demand. It does not favor child care, healthy foods, or fuel-efficient cars unless customers make their claims in currency.

Can markets meet social needs? This has been a pressing question for the last thirty years, as privatization was often put forward by market enthusiasts as an effective strategy to replace government services – for some of the reasons listed above. But a number of services, and sometimes goods, should probably be provided by some level of government: defense, education, family care and planning, fire protection, food safety, law enforcement, traffic management, roads, and parks.

Can markets meet democratic needs? Aldous Huxley warned of a society with too many distractions and too much trivia, steeped in drugged numbness and pleasures. Because markets are amoral, they can become saturated with economic goods that service vices rather than public spirit. Competition, in this case, may result in a race to the lowest common denominator rather than higher social ideals. Rather than political dialogue that would enhance democratic participation, the competition among media businesses tends to drive content towards sensationalist entertainment.

Comedian Robin Williams once quipped, “Cocaine is God’s way of telling you that you are making too much money.” Markets are a powerful system of material production and creative engagement, but they create inequalities, often with products and services that are of dubious social value. How a society enhances and/or tempers market forces continues to be a major challenge for countries around the world.


The Telecom Crash of 2002

Posted on | September 27, 2015 | No Comments

One of economic history’s most fascinating questions concerns the stock market advances during the eight Clinton-Gore years, especially those in the Internet and telecommunications sectors. During their tenure in the 1990s, the stock markets boomed, and the investments created the lowest unemployment in years and contributed significantly to the budget surpluses. Even Alan Greenspan’s speech on December 5, 1996, in which he questioned the “irrational exuberance” of stock purchases, merely marked the beginning of the historic “bull run” that would end with the Telecom Crash of 2002.

Billions of electronic dollars moved through newly computerized stock markets into established and new technology companies as the economy roared forward at a record pace. ECNs (Electronic Communication Networks) such as Island were approved by the SEC for matching buyers and sellers and created a much-needed trading facility in the hectic new environment. While most people participated primarily through institutions providing mutual funds, 401Ks, and pension funds, day trading emerged through the Internet to provide a more direct interface to the financial markets. The American investment economy had not seen such universal participation since the 1920s, when radio companies such as RCA set the pace of speculative investing.

NASDAQ went from 675 in January 1993 to 2,796 at the end of January 2001 (more than four times its starting level), although it hit a high of 5,132 in March of 2000. The DJIA went from 3,370 to 10,887 (more than a threefold rise) during that same period, with a high of 11,497 in December 1999. The Dow dropped from 8,883 to 7,539 in August of 1998 after the Asian and Russian financial crises, but managed to recover quickly.[1]

Any “information superhighway” would need a legislative framework within which it could be built. A major element of this new regime was the rewriting of the basic telecommunications law. Central to the administration’s economic success was changing the Communications Act of 1934. Written and passed as part of the New Deal, its aim was to curb the abuses of monopoly while ensuring the development and availability of efficient and nationwide broadcasting, telegraph, and telephone services. The Act authorized separate monopolies in each of the communications sectors: radio broadcasting, telegraphy, and telephony (a list that would grow to include cable TV, satellites, and wireless), subject to FCC rulings on appropriate prices and services. A new framework was needed that would allow digital convergence among these separated sectors.[2]

The Clinton Administration and its appointed FCC Chairman, Reed Hundt, worked with the Republican-controlled Congress to pass the Telecommunications Act of 1996. It was an awkward piece of legislation, but one which allowed the FCC to create a new environment for the Internet and e-commerce to grow. For instance, it required telcos to allow other companies to lease their networks for interconnection purposes at regulated rates. It led to a large number of ISPs offering low-cost Internet access. The legislation had a lot of political pork; in particular, the broadcast networks were bought off by giving them free spectrum for future offerings of digital television.[3]

Part of the Act broke open the segmentation that had separated the different sectors. Subsequent to the Act, companies could invade each other’s turf. It removed barriers to entry in the once protected, monopoly-controlled sectors. For example, broadcasters could move into broadband and carriers could offer content. It also allowed consolidation of different media companies, creating a new frenzy of mergers. AT&T, for example, made major multi-billion-dollar purchases of MediaOne and TCI to become a cable behemoth after cable modems were developed to deliver broadband alongside television services. AOL, by far the largest ISP, merged with Time-Warner in an attempt to cross-pollinate between the “old economy” and the “new economy.”

In the wake of the Telecommunications Act of 1996, investors poured some $2 trillion into the telecommunications industry.[4] Enron, Global Crossing, Tyco, Winstar, and Worldcom were a few of the companies that attempted to position themselves on the forefront of the telecommunications revolution. Enron, primarily an energy company, bought and built a 15,000-mile fiber optic network throughout the US and set up a new division, Enron Broadband Services, to manage the network. Enron also attempted to create a spot market in bandwidth capacity for itself as well as for other carriers, ISPs, and major users of leased lines. Global Crossing, led by Michael Milken protégé Gary Winnick, built an extensive fiber optic network, including a substantial international portion. In addition, it took over Frontier, a Rochester, NY-based telephone company. A major conglomerate, Tyco, also made major investments in international optical IP networks, including a major hub on Oahu, Hawaii.[5] Worldcom incorporated telecom competition pioneer MCI and became the largest Internet backbone service in the world through its purchase of Internet provider UUNET. The investments in the telecommunications sector dramatically increased the speed and scope of its digital transmission capacity, transforming it into the major conduit for worldwide Internet traffic.

The newly deregulated environment heated up the battle for broadband access to the home. The FCC’s longstanding distinction between basic and enhanced services allowed a new breed of ISPs to connect home computer users through local dial-up sites for unlimited time periods. But the Bell companies retaliated with services including ISDN and then DSL, and would eventually work the system to gain control over most local ISPs.

Competition from cable television and the satellite industry was looming. New modems and switching capabilities allowed the cable companies to use their high-bandwidth coaxial and fiber optic facilities to provide high-speed access to the Internet. Satellite television services developed new compression techniques that allowed many more channels to be broadcast, along with encryption techniques that allowed for individual subscriber services. But satellites still suffered from the signal delay that made it difficult to provide interactive services from a spacecraft located over 36,000 kilometers above the earth. Although all three were moving towards the use of IP networking, cable television had the advantage of moving into broadband Internet service with its legacy infrastructure, which would soon include voice-over-IP (VoIP) telephone sets.

After Al Gore had wrapped up the Democratic nomination on Super Tuesday, March 7, 2000, the NASDAQ reached a high of over 5,000. But in April prices began to fall, although the Dow, composed of the top 30 oligopolies, continued to rise throughout the summer. November’s election was extremely close, with Gore winning the nation’s popular vote but Bush narrowly winning the Presidency in the Electoral College, taking Florida’s 25 electoral votes by a mere 537 popular votes after a controversial Supreme Court decision.[6] From the NASDAQ’s high of over 5,000 in March of 2000, it declined over 3,000 points by the time the Bush administration settled into their new offices less than a year later.

With the new presidency, tech stocks fell to earth. The “new economy” was over, and the “old economy” had retaken Washington DC. President George Bush and most of his immediate staff (Cheney, Evans, Rice) were veterans of the oil business. The new administration represented a new set of economic values and geopolitical priorities. The NASDAQ continued to fall while the Dow-Jones Index of 30 established corporations climbed even higher, surpassing the 11,000 mark again in February 2001.

The telecommunications industry went into steep decline. The first revelation came with the meltdown of Enron, a company that was consistently listed in the top 10 of the Fortune 500. Heavily burdened by its junk bond debt, Enron desperately sought new ways to raise cash, including putting together an 18,000-mile fiber optic network.[7] One of its schemes embraced the Internet in an interesting but futile attempt to create a trading environment for bandwidth as it had for natural gas spot contracts. But it was too little, too late. Enron Online made its first trade in December 1999, as the markets were at their peak and Enron was trading near $90. It would soon wind up as the largest bankruptcy in history and destroy accounting giant Arthur Andersen in the process. But not before FORTUNE would tout Enron as “The Most Innovative Company in America” for the fifth year in a row.[8]

As the Worldcom case would show, many companies started to engage in illegal accounting techniques after the markets faltered. In late June 2002, CNBC reported that Worldcom had been discovered to have accounting irregularities dating back to early 2001. Nearly US$4 billion had been illegally documented as capital expenditures. Worldcom had registered $14.73 billion in 2001 “line cost” expenses instead of the $17.79 billion it should have reported. The result was that it reported US$2.393 billion in 2001 profits instead of showing what should have been a $662 million loss. Shares dropped quickly. Although the stock had already fallen from its 1999 high of $64 a share to just over $2, it soon dropped to mere pennies before trading was halted.[9] Other companies such as Qwest and Tyco further eroded what remained of the general public’s confidence in the stock market, and particularly in its telecommunications companies.
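
The arithmetic can be checked against the figures cited above: the roughly $3 billion in line costs shifted off the income statement is what turned a sizable loss into a reported profit. A quick back-of-the-envelope check, in billions of US dollars:

# Back-of-the-envelope check of the Worldcom figures cited above (US$ billions).
reported_line_costs = 14.73   # line costs Worldcom reported for 2001
actual_line_costs = 17.79     # line costs it should have reported
reported_profit = 2.393       # 2001 profit as reported

understated_costs = actual_line_costs - reported_line_costs   # about 3.06
restated_result = reported_profit - understated_costs         # about -0.67

print(f"Costs moved off the income statement: ${understated_costs:.2f}B")
print(f"Restated 2001 result: ${restated_result:.2f}B")
# The restated result is roughly a $0.66-0.67 billion loss, in line with the
# $662 million loss cited above.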

The stock markets continued to decline as new corporate scandals were revealed, finally reaching into the DJIA during mid-2002. The “Dow,” representing mainly the stalwarts of the old economy, maintained its strength during the new administration’s early years. It would dip below 10,000 but return intermittently to highs above that mark until the summer of 2002, when the corporate scandals were exposed. Bush’s SEC chief, Harvey Pitt, failed to gain the confidence of investors and eventually resigned. The Wilshire Total Market Index fell from $17.25 trillion on March 24, 2000 to $10.03 trillion on July 18, 2002. By August 2002, over $7 trillion of stock value had dissipated. Particularly hard hit were the tech sectors.

Telecommunications services, which had accounted for 7.54% of the Wilshire Total Market Index at the end of March 2000, saw their share fall to only 3.63%. Information technology fell from 36.2% to 15.01%, and even Microsoft saw its market capitalization fall from $581.3 billion to $276.8 billion. Finally, Congress passed the Sarbanes-Oxley Act in August, requiring all corporate CEOs to sign off on their company’s books. The fall abated, but at a cost of trillions of investor dollars from IRAs, mutual funds, individual investments, and pension funds.[10]

Notes

[1] Stock figures for both the NASDAQ and DJIA are from Yahoo! Finance, the “poor man’s Bloomberg”. They indicate monthly figures and may represent the daily or weekly highs. Greenspan “irrational exuberance” information from Daniel Gross’ (2000) Bull Run: Wall Street, the Democrats, and the New Politics of Personal Finance. New York: Public Affairs. p. 19.
[2] Information on Communications Act of 1934. Martin, J. (1976) Telecommunications and the Computer. (2nd Edition) NJ: Prentice Hall. p. 31.
[3] Republican Bob Dole was a particularly vocal critic of the spectrum giveaway.
[4] BUSINESS WEEK, (2002) “Inside the Telecom Game”. Cover Story, August 5, 2002. Pp. 34-40.
[5] An article by John Duchemin on the Honolulu Advertiser’s website chronicled the travails of the Tyco telecommunications hub in Wai’anae. The 38,000 square foot center went unused when the telecom market collapsed. August 13, 2002.
[6] The USA operates its Presidential elections with an Electoral College system. People don’t elect the President directly; when a candidate wins a state, a set number of electoral votes goes to that candidate. The system keeps any one section of the country from dominating national elections.
[7] We see the mark of Michael Milken, who was a consultant to Lay and helped him fend off Irwin Jacobs and other potential corporate raiders, but at the price of incurring huge debt. See Peter C. Fusaro and Ross M. Miller’s (2002) What Went Wrong at Enron. NJ: John Wiley & Sons.
[8] Fusaro, P.C. and Miller, R.M. (2002) What Went Wrong at Enron. NJ: John Wiley & Sons. p. 172.
[9] Worldcom figures from THE NEW YORK TIMES, Friday, June 28, 2002. pp. C1-C6. Articles by Michael Wilson and Leslie Wayne.
[10] Feaster, S.W. (2002) “The Incredible Shrinking Stock Market: More Than $7 Trillion Gone,” NEW YORK TIMES. July 21, 2002. p. 14 WK.


Hootsuite and the Social Media Curriculum

Posted on | September 21, 2015 | No Comments

I’ve been training people in Hootsuite for the last three years. The desktop and mobile dashboard system was designed to integrate and manage social media applications like Facebook, Google+, LinkedIn and Twitter. Some others you might recognize are Foursquare, MySpace, WordPress, TrendSpottr, and Mixi. While it has some competition, it is still the best social media management tool available.

Hootsuite is promoted as a brand management or communications management tool that is handy for day-to-day social networking management and/or campaign management. It can be used at a personal level, for social media consultants working for several clients, and for media management at large-scale enterprises.

I was originally exposed to Hootsuite at my first SXSW in Austin, TX in the spring of 2012. I was invited by my colleague David Altounian, who taught it in his classes at St. Edward’s University and was on a pre-conference panel put on by Hootsuite University about social media and education. The image below features Hootsuite moderator Kirsten Bailey, Dr. William Ward from the Newhouse School of Public Communications at Syracuse, Lea Lashley of Southern Methodist University, and David Altounian of St. Edward’s University.

Hootsuite University at SXSW

They discussed the value of social media education, problems in getting academia to recognize or respond to the social media environment, teaching skills versus principles, and whether social media changes the marketing discipline.

A few years ago I posted my thoughts on important curriculum issues for social media education. That curriculum became an NYU course that I have taught online, and I’m currently teaching Hootsuite in a Political Communication class at Hannam University in the Republic of Korea. These are the general areas that I proposed should be considered in a program on social media.

– New developments in social media technologies and techniques;
– Key communication and economic attributes that power this medium, including important metrics;
– How social media can be used as part of an organization’s communications strategy;
– Key skill sets and knowledge students can acquire for entrepreneurial innovation and employment in this area;
– Legal, privacy, and other unfolding social concerns that accompany this dynamic new medium;
– Issues of social change, citizen engagement, and democratic prospects;
– Research implications of social media and the theorization and methodological skills needed to conceptualize research projects.

I guess I would add the importance of learning a social media management tool like Hootsuite, Spredfast, or Sprout Social to that list, even though it is suggested in the first topic.


Fed Watcher’s Handbook is on Amazon

Posted on | August 24, 2015 | No Comments

The world is abuzz with talk about the US Federal Reserve and whether it will raise interest rates in September. Interest rates have hovered near 0 percent since 2008, and a rate hike to .25% or .5% is possible. Analysts are asking basic questions: Is the economy strong enough? What do we make of economic problems in China? Is the housing market heating up? Do declining oil prices mean a global slowdown? Are employment rates sufficient? Have we seen the end of the Greek crisis? How would an interest rate hike affect the volatile financial and stock markets?

My book, just out on Amazon, doesn’t predict what the Fed will do, but provides background on the Fed and how it works. It also introduces a process that classes and organizations can use to make their own analysis of the situation.

The Fed Watcher’s Handbook: Simulating the Federal Reserve in Classrooms and Organizations is available on Amazon either as a paperback or electronic Kindle version.

The book introduces the Federal Reserve System, the Federal Open Market Committee (FOMC), and a simulation I developed at New York University to make economics and monetary policy more relevant and easier to understand. I later used it at the MBA level, and I’m currently exploring its use as a team-building exercise for organizations.

In the simulation, participants become members of the US central bank, either a district president or someone on the Board of Governors. The “Fed” is made up of 12 districts, such as Chicago, Boston, Atlanta, New York, and Dallas. Participants have to study the bios of the Presidents or Governors and then report on their district.

Then we meet a day or two before the actual Federal Reserve meets and simulate their discussions and voting process to determine interest rates. We even collectively write the press release after the group decides. These activities allow them to compare the process and results of the simulation with what occurred at the actual Fed FOMC meeting.
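
For instructors who want to automate the tally at the end of the simulated meeting, a few lines of code can count the votes and suggest the headline decision. The sketch below is purely illustrative and is not part of the book’s materials; the roles and votes are made up.

# Illustrative vote tally for the simulated FOMC meeting.
# The roles and votes below are made up; this is not from the book.
from collections import Counter

votes = {
    "Chair": "hold",
    "Vice Chair": "hold",
    "New York": "raise 0.25",
    "Chicago": "hold",
    "Boston": "raise 0.25",
    "Dallas": "hold",
    "Atlanta": "hold",
}

tally = Counter(votes.values())
decision, count = tally.most_common(1)[0]
print(f"Decision: {decision} ({count} of {len(votes)} votes)")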



Emerging Areas of Digital Media Expertise, Part 2, Data Analytics and Visualization

Posted on | August 11, 2015 | No Comments

This post is the second part of a discussion on what kind of knowledge, skills, and abilities are needed for working in emerging digital media environments. Previously I pointed out that students gravitate towards certain areas of expertise according to their interests and perceived aptitudes and strengths. I discussed Design, Technical, and Strategic Communication aspects of new digital environments. Below I examine Data Analytics and Visualization starting with the importance of metrics in media and other industries.

Understanding metrics is an important aspect of the global media and culture industries. Measurements of magazine and newspaper circulation, book readership, as well as television and radio audiences, have each had a distinguished history in the media field. The Nielsen ratings, for example, have been a crucial part of the television industry and its ability to attract advertisers. The introduction of digital technologies has made them even more important, and Nielsen has expanded its monitoring activities to PCs and mobile devices.

Search engines, social media platforms, and a myriad of new media applications on the web and mobile devices have increased the production of useful data, and new systems of analysis have augmented its utility. Analytics is directly involved in monitoring ongoing content delivery, registering user choices, and communicating with fans and other users. It also connects incoming flows of information to key organizational concerns about financial sustainability, legal risks, and brand management involved in digital media activities. This type of information is increasingly important for the management of private and public organizations.

Organizations are beginning to acknowledge the importance of social media metrics across the range of corporate and non-profit objectives, especially those that involve legal, human resources, and advertising and marketing activities. These new metrics can be roughly categorized into three areas. At a basic level, there are granular metrics that quantify activities like the number of retweets, check-ins, likes, and subscribers.

Strategically, metrics can also be useful for designing new products and services, facilitating support and promoting advocacy for an organization, fostering dialogues about products and services, and monitoring marketing and social media campaigns. Lastly, metrics are of particular concern to upper management, as they can provide information on the sustainability of an organization. Those in the “C-suites” (CIOs, CFOs, COOs, and CEOs) can use information on an organization’s financial status, technical performance, and legal risks to assist management decision-making. Metrics present connections from social media investments to key concerns such as product development, service innovation, policy changes, market share, and stock market value. Recognizing the increasing utility of metrics, dashboards have grown in importance to management as a way to collect and display data in a visually comprehensible way.
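
As a concrete illustration of how granular counts roll up into a dashboard-style summary, the sketch below computes a simple engagement rate from made-up numbers of likes, retweets, and replies; the figures and the formula are illustrative, not an industry standard.

# A minimal sketch of rolling granular social media metrics into a summary.
# The sample numbers and the engagement-rate formula are illustrative only.
posts = [
    {"likes": 120, "retweets": 30, "replies": 8},
    {"likes": 45, "retweets": 5, "replies": 2},
    {"likes": 310, "retweets": 90, "replies": 25},
]
followers = 5200

interactions = sum(p["likes"] + p["retweets"] + p["replies"] for p in posts)
engagement_rate = interactions / (len(posts) * followers) * 100

print(f"Total interactions: {interactions}")
print(f"Average engagement rate per post: {engagement_rate:.2f}% of followers")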

The increased attention to metrics suggests that an era of “big data” analytics has emerged in the digital media sphere. The collection of unstructured information from around the web (as opposed to pre-defined, structured data from traditional databases) presents unprecedented opportunities to conceptualize and capture empirical information for various parts of an organization. Techniques such as pattern and phrase matching use new types of algorithms to capture and organize information from throughout the Internet, including the rapidly growing “Internet of Things” (IoT). The result is a transformation in the way commerce, politics, and news are organized and managed.
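
Pattern and phrase matching at web scale involves far more sophisticated pipelines, but the underlying idea can be shown with the Python standard library alone; the sample posts and target phrases below are invented for illustration.

# A toy illustration of phrase matching on unstructured text.
# The sample posts and target phrases are invented for illustration.
import re

posts = [
    "Loving the new smart thermostat - setup took five minutes!",
    "Customer service kept me on hold for an hour. Not impressed.",
    "Is the smart thermostat compatible with older furnaces?",
]

phrases = {
    "smart thermostat": "product mention",
    "customer service": "service issue",
}

for post in posts:
    for phrase, label in phrases.items():
        if re.search(re.escape(phrase), post, re.IGNORECASE):
            print(f"[{label}] {post}")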

Combined with artificial intelligence (AI) and natural language processing (NLP), for instance, cognitive systems like IBM’s Watson are disrupting industries like healthcare and finance. Watson made a spectacular debut by beating two human contestants on the TV game show Jeopardy, a challenging combination of cultural, historical, and scientific questions. While Watson is struggling to catch on, AI is emerging in autonomous vehicles, voice recognition, game systems, and educational technology. These technologies could have a major impact on cultural and experience industries as well.

Project managers and media producers can use the data to see connections between content and cultural products, audience participation, customer satisfaction, and market share. Those in the “C-suites” can use the information on financial status, technical performance, and legal risks. Besides assisting management decision-making, analytics can provide useful performance information and improve the development of new content products, cultural expressions, and experience-based services while targeting them to individual customers. While new developments like AI-assisted genre innovation and other incursions into the creative process are a justifiable cause for concern, cognitive and logistical support for cultural event planners, film surveyors, and other creative content producers could be a welcome provision in the cultural and media industries.

This move to big data requires the ability to conceptualize and enact strategies for data collection, analysis, and visualization. Students should develop an appreciation for research and statistical thinking, as well as visual literacies and competencies for representing information and designing graphics that display data results in interesting and aesthetically appealing formats. The ability to identify and capture data from spreadsheets, the web and other online sources is important, as well as the ability to formulate pertinent questions and hypotheses that shape data research and models that can help explain or predict human or system behaviors.
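
A first exercise along these lines might look like the sketch below, which assumes the pandas and matplotlib libraries are installed and that a hypothetical spreadsheet export named engagement.csv exists with date and likes columns.

# A minimal collection-to-visualization sketch, assuming pandas and matplotlib
# are installed; the file name and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("engagement.csv", parse_dates=["date"])  # load a spreadsheet export

# Summarize: average likes per week.
weekly = df.set_index("date")["likes"].resample("W").mean()
print(weekly.describe())

# Visualize the trend in a simple, readable chart.
weekly.plot(title="Average weekly likes")
plt.xlabel("Week")
plt.ylabel("Likes")
plt.tight_layout()
plt.savefig("weekly_likes.png")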

Not everyone will be comfortable with these new types of data collection and intelligence-producing systems, but like them or not, AI and big data are encroaching rapidly on modern life.


The FCC Helps Business Go “Online”

Posted on | July 16, 2015 | No Comments

The computer was becoming an important tool for businesses by the mid-1960s, and with the introduction of timesharing, a communications component added value and enhanced productivity. Factories began using data processing to control chemical flows and machine tools, and warehouses used it to monitor inventories. Bank branch offices started to use minicomputers to process the day’s transactions and transmit the information to bank headquarters. These new applications of computer communications were highly valued by the financial industries, which were about to apply them in globally revolutionary ways.

The term “online” emerged as a way to avoid the FCC’s purview over all communications.[1] While the nascent computer industry was experimenting with data transfer over telephone lines, it was coming to the attention of the FCC, whose responsibility, according to the Communications Act of 1934, was to regulate “all communication by air or wire.” The agency initiated a series of “Computer Inquiries” to determine what, if any, stance it should take regarding data communications.[2]

The First Computer Inquiry, initiated during the 1960s, investigated whether data communications should be deregulated. Just as important, it provided an early voice for computer users to initiate change in the telecommunications network structure. It was, after all, a time in which the only thing attached to the telephone network was a black rotary phone sanctioned by the Bell System. Computer One’s verdict in the early 1970s was to grant more power to corporate users to design and deploy a data communications infrastructure that would best suit their needs. The FCC subsequently created a distinction between unregulated computer processing and regulated telecommunications.

Such a differentiation did not, however, ensure the successful growth and change of network services for eager corporate computer users. Computer Two was initiated in 1976 amid widespread adoption of computer technologies by the Fortune 500. But these companies needed to use the basic telecommunications infrastructure that had been largely built by AT&T. Although AT&T’s Bell Labs had invented the transistor and connected SAGE’s radars over long distances to their central computers, the company was not moving fast enough for corporate users. The Bell telephone network was preoccupied with offering universal telephone service and did not, at first, see connecting large mainframes as a major market.

This hesitancy was also the result of previous regulation. The Consent Decree of 1956 had restricted AT&T from entering the computer business as well as from engaging in any international activities. The FCC’s decision at the conclusion of the Second Computer Inquiry allowed AT&T to move into the data communications area through an unregulated subsidiary. However, the ultimate fate of domestic data communications would require the resolution of a 1974 antitrust suit against AT&T. In 1982, the Justice Department settled the case against the domestic blue-chip monopoly with a Consent Decree that broke up the company. This action had a dramatic influence on the shaping of data communications and the Internet until the Telecommunications Act of 1996.

In retrospect, Computer One and Computer Two determined that the FCC would continue to work in the interests of the corporate users and the development of data communications, even if that meant ruling against the dominant communications carrier.

Notes

[1] See Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation for background on the word “online”. p. 22.
[2] The Communications Act of 1934 was one of the last of the New Deal reforms.


SAGE, SABRE and the Airline Industry

Posted on | June 17, 2015 | No Comments

Military funding led to the invention of the modem and other data communications technologies for a North American defense system. This grid of radar and other sensors connected to central computers with over a million miles of telephone line. Its headquarters would later be located deep within Colorado’s Cheyenne Mountain and be the model for the WOPR, the errant machine in the movie WarGames (1983). The first commercial application of that technology was adopted by the airline industry.

In 1952, IBM had been chosen along with MIT to build 56 large computers (at nearly $30 million each) for the SAGE defense project.[1] In addition to access to MIT’s considerable knowledge base in electronics and the billions of dollars it earned, IBM developed very valuable engineering and production know-how for designing and mass producing printed circuit boards and magnetic core memories. As SAGE was effectively the first wide-area network, the technology translated over the next few years into a new project combining data processing and communications called the Semi-Automatic Business-Research Environment (SABRE).

The airlines, in particular, had an interest in coordinating their activities over long distances, much like the railroads. Pan Am, for example, was growing rapidly as a commercial airline and needed computer and communication facilities to integrate its passenger reservations and cargo control. The company would later implement this “Panamac” computer communication system, but in the early 1950s, this was still an impractical ideal. Both computers and telecommunications still needed to undergo major technological improvements before this vision could be implemented.

SABRE was installed by American Airlines in 1964 and used the SAGE technology to track and coordinate airline seats. Some 1,200 terminals were connected to the large mainframe computer over 12,000 miles of telephone wires.[2] During the 1960s, Pan Am developed a network of computer hookups from Honolulu to Europe. Messages extending beyond that network had to be relayed through telegrams, telex, and leased telegraph lines. United Airlines also had a communications network running at this time that would incorporate 105 cities.

The computerized system reduced the need for reservation processing staff and improved the load factor on flights. Computers helped democratize air travel so that others besides the “Jet Set” would be able to fly. In an era of rapidly increasing business, airlines were able to manage and allocate passenger seating and cargo much more efficiently.[3]

The SABRE network was developed for the airline business, but it also became the first real-time transaction processing system used for hotel reservations, industrial process control, and automated financial transactions.[4] SABRE would remain a crucial airline reservations system for travel agents around the world until the commercialization of the Internet, when it began to offer Travelocity, one of the premier e-commerce sites on the World Wide Web.

SAGE and its SABRE offspring provided the model of command, control, and communication for industry using computers. The strategic importance of communications had long been recognized in posts and telegraphy; telegraphs had been crucial for modern businesses for nearly a century. Electronic voice communications had been recognized for connecting business as early as 1877, when J. Lloyd Haigh ordered the first subscriber line over the unfinished Brooklyn Bridge to connect his office at 81 John St. in Manhattan to his factory in South Brooklyn.[5] SABRE was the first commercial wide area network for computer control and coordination.

Computer communications presented a new opportunity to operate business at a distance. While data could be punched on paper cards or transferred to magnetic tape and then moved to remote locations, electronic transmission offered speed and immediacy. Computers were connected “in-plant,” within and/or between buildings without the use of a common carrier, or “out-plant,” connecting separate premises through lines bought or leased from a telecommunications company. In the 1960s, corporations started to piece together data links with leased alphanumeric and voice lines.

Notes

[1] Edwards, Paul, N. (1997) The Closed World: Computers and the Politics of Discourse in Cold War America. p. 101.
[2] Burg, Urs. Von. (2001) The Triumph of Ethernet: Technological Communities and the Battle for the LAN Standard. Stanford, CA: Stanford University Press. P. 56.
[3] Martin, J. (1976) Telecommunications and the Computer. 2nd Edition. Englewood, N.J: Prentice-Hall.
[4] Flamm, K. (1988) Creating the Computer: Government, Industry and High Technology. Washington D.C: The Brookings Institute. p. 89.
[5] My gratitude to Professor Jan Mainzer at Marist College for providing this reference. She provided photocopies of The City of New York (1915) by Henry Collins Brown, NY: Old Colony Press.

