Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Geopolitical Risk and the Information Standard

Posted on | April 1, 2012 | No Comments

I’ve often used Walter Wriston’s “Information Standard” as part of an analytical framework for understanding the global economy and the implications of emerging digital financial practices. Perhaps not with all the connotations the Citicorp CEO gave it, but the basic idea holds: the gold-backed international currency framework that shaped the global economy after World War II has long been replaced by a new techno-structural system of computerized transactions, news flows, risk management, and technical analysis based on the US dollar and the collateral strength of US Treasuries.[1]

Now gold is another dataset in the complex algorithm that shapes the prices and trades of the global financial system and consequently influences the policy decisions of nations and the flows of investment capital. In this post, I examine some of the recent historical threads of risk analysis and their relationship to computerized trading and the formation of a global “Information Standard” [2].

Risk has always been a factor in investing and lending, whether investing cattle in a sailing expedition to the Spice Islands or British pounds in a transcontinental railroad in the Americas. Captains and merchant ship owners knew that Edward Lloyd’s coffeehouse was the place to go for marine insurance and shipping news. That was the start of Lloyd’s of London, one of the most storied insurance companies in risk history. Certainly the “tulip mania” of 1630s Amsterdam is part of our lexicon of financial bubbles and risk, even getting a short cameo and description by Gordon Gekko in Oliver Stone’s Wall Street: Money Never Sleeps. Risk emerged as a more prominent factor in the 1970s, when a series of events and innovations occurred in the interconnected areas of international finance, news, and telecommunications.

The New Deal started an era of financial containment, but it couldn’t withstand the stresses of globalization and the rise of new technologies. In addition to legislation like the Glass-Steagall Act of 1933, which kept investment banks from gambling with depositors’ money, FDR confiscated all gold used for currency and built Fort Knox in Kentucky during the mid-1930s to keep it out of the hands of the Nazis. The Allies used these bullion stores at the end of World War II as the basis of a new international economy by tying the dollar to gold at $35 per troy ounce (31.1 grams) and the monies of other Allies to the dollar. This experiment, organized in July 1944 when the United Nations Monetary and Financial Conference was held in Bretton Woods, New Hampshire, ran counter to US policy and was destined to fail. When Nixon ended the arrangement in August of 1971, it introduced unprecedented levels of volatility and volume into the global economy. The end of the Bretton Woods agreement was partly to blame for the two oil crises of the 1970s and the raging inflation that plagued the decade.

This new era of floating exchange rates was characterized by a new virtualization of foreign exchange (F/X) trading through interbank spot markets, facilitated largely by Reuters’ new Monitor Money Rates service that displayed currency prices from banks around the world. Recognizing the need for new F/X risk management techniques, an offshoot of the Chicago Mercantile Exchange (CME) introduced currency futures. On May 16, 1972, the International Monetary Market (IMM) opened to provide future delivery of currencies at fixed prices. It initially traded contracts in seven currencies: the British pound, the Canadian dollar, the German deutsche mark, the French franc, the Japanese yen, the Swiss franc, and the Mexican peso. Both currency markets received a financial boost in October of 1973 when several Arab states launched an attack on Israel, setting off further volatility, especially as oil production was cut back in response to international support for the Jewish nation-state.
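
To make the hedging mechanics concrete, here is a minimal sketch of how an importer might use a currency futures contract to lock in an exchange rate. The rates, amounts, and scenario below are hypothetical illustrations rather than historical IMM quotes or contract specifications.

```python
# Hedging currency risk with a futures contract: a hypothetical illustration.
# An American importer must pay 1,000,000 deutsche marks in three months and
# fears the dollar will weaken. Buying DM futures today locks in the rate.

DM_OWED = 1_000_000          # deutsche marks payable in three months
FUTURES_RATE = 0.40          # hypothetical USD per DM agreed today
SPOT_AT_SETTLEMENT = 0.44    # hypothetical USD per DM three months later

# Unhedged: buy the marks at whatever the spot rate turns out to be.
unhedged_cost = DM_OWED * SPOT_AT_SETTLEMENT

# Hedged: the gain on the futures position offsets the higher spot price,
# so the effective cost stays at the locked-in rate.
futures_gain = DM_OWED * (SPOT_AT_SETTLEMENT - FUTURES_RATE)
hedged_cost = unhedged_cost - futures_gain

print(f"Unhedged cost: ${unhedged_cost:,.0f}")   # $440,000
print(f"Hedged cost:   ${hedged_cost:,.0f}")     # $400,000, the locked-in rate
```

The same arithmetic runs in reverse for exporters or for speculators taking the other side, which is why the IMM’s contracts found buyers and sellers as soon as exchange rates began to float.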

A related process was the recirculation of eurodollars (US currency held outside the country) from OPEC countries to developing and Soviet-bloc countries. As almost all oil sales are denominated in US dollars, the oil shocks created a surplus of devalued “petrodollars,” which Western banks lent out to non-OPEC countries like Argentina, Bolivia, New Zealand, Poland, and South Korea. To reduce their exposure, several banks would come together to make a syndicated loan to a country like Brazil, with a lead bank taking a major position and several others sharing the risk. Propelled by the rhetoric that sovereign countries don’t go bankrupt, this financial bubble would have a dramatic effect on nations around the world.

The resultant “Third World Debt Crisis” changed the global landscape, forcing countries to sell off government assets and state-owned enterprises (SOEs) and pressuring them to open their borders to unrestricted and unexamined flows of data and news. Many of these countries were transformed from developing countries into “emerging markets,” open to international flows of direct and portfolio investment and subject to increasing amounts of country risk analysis and examination. The term “Washington Consensus” gained circulation as a set of policy prescriptions involving fiscal discipline, liberalization of trade and inflows of capital, privatization, tax reform, and the deregulation of a wide range of industries to allow competition and foreign ownership. Government PTT (Post, Telephone, and Telegraph) operations were liberalized and sold off, opening the way for modernization, including the World Wide Web. Debt was transformed into equity and traded on new electronic platforms around the world, while US$3 trillion flowed daily through the global currency markets, creating a new global sovereign power that could influence the policies of individual countries.

As this global environment emerged in the late 1980s and early 1990s, it gave rise to a new type of trading organization intent on utilizing computerized risk management and trading techniques. Called “hedge funds,” they aimed to combine high leverage with various types of arbitrage methods and pairs trading. The general strategy was based on the thesis that multiple risky positions taken together can effectively eliminate risk itself. One company in particular, Long-Term Capital Management (LTCM), gathered together some of the best financial minds to create the ultimate hedge fund. The payroll included Nobel laureates Myron Scholes, half of the team that came up with the Black-Scholes model for pricing options, and Robert C. Merton, who received his prize for work on determining the value of derivatives. LTCM strove to take advantage of this new trading environment by using “dynamic hedging” to trade continuously and globally. They raised money from large investors and developed electronic trading systems that drew on a type of rocket science called “Ito calculus,” developed to guide a missile microsecond by microsecond. Their trading strategy was to use these computerized systems to continuously monitor and trade a combination of financial derivatives and securities globally, based on probability theories and risk management techniques.
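
As a rough illustration of the kind of mathematics LTCM’s partners were famous for, here is a minimal sketch of the standard Black-Scholes formula for a European call option and its delta, the hedge ratio at the heart of dynamic hedging. The parameter values are arbitrary examples; LTCM’s actual models were far more elaborate than this textbook case.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Price a European call and return its delta (hedge ratio).
    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualized volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    delta = norm_cdf(d1)   # units of the underlying to hold per option written
    return price, delta

# Hypothetical example values; in dynamic hedging the delta is recomputed
# and the position rebalanced as prices, volatility, and time change.
price, delta = black_scholes_call(S=100, K=105, T=0.5, r=0.05, sigma=0.20)
print(f"Call price: {price:.2f}, hedge ratio (delta): {delta:.2f}")
```

Because delta changes with every tick, a dynamic hedge has to be rebalanced continuously, which is why LTCM needed computerized systems monitoring markets around the clock and around the globe.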

While hugely successful at first, LTCM ran into a series of geopolitical events. The fund returned over US$2.5 billion to its investors in 1997. While this signaled success, it also reduced the firm’s capital base and, consequently, its ability to deal with volatility. LTCM ran into trouble during the Asian financial crisis in 1997 and especially in August 1998, when Russia defaulted on its government bond payments; just days later, the fund lost over half a billion dollars in a single 24-hour period. Over-leveraged and under-capitalized, it lost over $4.5 billion in the course of a few months. With the entire financial system at stake, the New York Fed and several major financial institutions initiated a bailout. LTCM operated for a few more years before it was quietly disbanded.

The reputation of mathematical risk models took a hit with the fall of LTCM, but they were by no means abandoned, especially after the Commodity Futures Modernization Act of 2000 further deregulated derivatives trading. The extraordinary surplus wealth that accumulated over the 20th century, along with the new digital calculation and transaction methods recruited to make money in the financial markets, has intensified the complex of global data and trading activities that Wriston argued make up the global “Information Standard.” He was particularly intrigued by the power of computerized F/X trading, which now exceeds $3 trillion a day. But the environment has since become increasingly complex as “quants” entered the equation with sophisticated algorithmic techniques to manage risk while trading a wide range of financial instruments for profit. Debt instruments in particular have been a focus of speculation, and the sovereign bond market has become an additional gauge of national risk. This has become particularly evident in Europe, where the euro has become a multinational currency, superseding the national economic policies of the so-called “PIIGS” countries (Portugal, Italy, Ireland, Greece, and Spain). Having lost control of their currencies, they have seen their sovereign debt become a focus of trading scrutiny.

Like the traditional Gold Standard, the Information Standard is no panacea for the management of the global economy. Both impose restrictions on national political structures and policy decision-making. Politicians like Republican Ron Paul want to go back to the Gold Standard, but powerful forces are invested in the current global system.[3] Also, currency regimes are usually organized by dominant creditors, and that status is moving from the US to China.

Risks are inherent in both value regimes, but the Information Standard likely reflects a complexity that is more inclusive of a wider group of participants and moves away from the government/big-banker nexus that characterized the economic organization around gold. But the trillions of dollars in play in the system continue to create a chaotic turbulence of bubbles and innovation that is highly disruptive to established, more stationary and routinized lives and political structures. That is why it will be interesting to see whether China has the capability to impose a new order of managed exchange rates and flows of capital.

Notes

[1] Walter Wriston went to work for Citicorp after World War II and eventually became CEO in 1969. The son of a university president, Wriston was often noted for his knowledge of international relations and diplomacy, and he championed the recirculation of OPEC dollars to developing countries as well as other innovations such as the negotiable certificate of deposit (CD) and the ATM. The Tufts Digital Library contains the Walter Wriston Archives, which holds many of his speeches and published articles.
[2] Much of the discussion of the Information Standard has been about its disciplining of nation-states, as Wriston discussed in The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World. Wriston’s interpretation of the Information Standard is organized around a rhetoric of assurance, not a critical analysis. In his telling, the power of multinational corporations, nation-state dictatorships, and any aggregation of power antithetical to democratic prospects will fall to the sovereign power of the Information Standard.
[3] The size of banks would have to be considered in evaluating whether the Information Standard has explanatory or analytical power. The credit crisis of 2007 and its remedies significantly increased the power of major banks.


Music’s Year of the Cloud

Posted on | January 12, 2012 | No Comments

I’m heading to Hawaii this January to talk about how cloud computing services and mobile technologies are influencing the music industry at The Pacific Telecommunications Council’s annual conference (PTC ’12: Harnessing Disruption: Global, Mobile, Social, Local).

I’ve had some help from my wife who, when she is not marketing green tea for Ito-en NA, is a semi-professional hula dancer. She has been listening to Hawaiian music on her iPhone using a service called Pandora, which streams personalized music on your computer or mobile phone. When you type in the name of an artist, band, composer, or just a song, it begins to build a playlist of the music you chose along with more music like it.


The last year has seen some interesting changes in the music industry with the maturation of cloud and mobile services. Established companies like Amazon, Apple, and Google continued to develop their music “digital locker” services that allow people to store their purchased music in a cloud server that they can access from multiple devices.[1] More intriguing are a number of new companies that have sprung up like MOG, Pandora, and Spotify that stream music like a personalized radio station.

The “cloud” is a metaphor for the massive data storage, processing, and transmission capabilities connected to the Internet and wireless services that can be accessed on mobile devices. It gets its name from the cloud-like symbol that is often used to represent the Internet in network schematics and flowcharts. These are the cloud computing technologies that power Google Docs and other “recombinant” software-based services and digital platforms for global e-commerce.

So we can point to two distinct cloud offerings for the music industry at this time. The first is storage and streaming services such as Amazon’s Cloud Drive and Player, Apple’s iCloud, and Google Music. These services sell standardized MP3 or 256Kbps AAC DRM-free copies and allow you to store albums and songs on their servers and access them through a variety of consumer devices (iPods, iPads, Droid phones, Samsung Blu-ray players, etc.) and computers. They have yet to reach licensing agreements with the major music labels that would allow them to offer streaming of unpurchased music. Apple’s iCloud specifically advertises itself as “Your content, on all your devices.”

The other type of cloud service having a major impact on the music industry is the streaming subscription service, such as MOG, Pandora, Rdio, and Spotify. These allow consumers to sign up to listen to streamed music they haven’t personally bought. In general, they work off a “freemium” model, where you can get free music with some advertising or pay for a premium service with no advertisements.

These services generally use recommendation engines, software applications that build models of a user’s preferences in order to suggest the types of songs they would like. These are used in a number of e-commerce platforms such as Amazon and Netflix. Recommendation engines, for example, are constantly updating their algorithms and increasingly utilizing social data points to develop referral systems that connect grids of personal contacts. A simple sketch of the underlying idea appears below.
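
As a loose sketch of the idea, not how Pandora or MOG actually implement it, a recommendation engine can score unheard songs by how much their audiences overlap with the songs a user already likes. The song titles and listeners below are made up for illustration; real services layer on audio analysis, editorial metadata, and social signals.

```python
# A toy recommendation engine: score songs by listener overlap (Jaccard similarity).
# All data here is hypothetical.

listens = {
    "Hawaiian Slack Key":  {"ana", "ben", "chris", "dana"},
    "Ukulele Medley":      {"ana", "ben", "dana"},
    "Island Reggae":       {"ben", "erin", "frank"},
    "Classic Rock Anthem": {"erin", "frank", "gil"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap between two listener sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(liked_song: str, catalog: dict, top_n: int = 2):
    """Rank other songs by how similar their audiences are to the liked song's."""
    base = catalog[liked_song]
    scores = {song: jaccard(base, fans)
              for song, fans in catalog.items() if song != liked_song}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(recommend("Hawaiian Slack Key", listens))
# -> [('Ukulele Medley', 0.75), ('Island Reggae', 0.166...)]
```

Adding social data, as the post describes, amounts to widening those listener sets to include a user’s own contacts and weighting their preferences more heavily.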

Both strategies will rely heavily on social commerce and its enhanced marketing, search, and transaction capabilities. Logging into MOG, for example, you will immediately be asked if you want to sign in using Facebook. That immediate access to some 800 million potential subscribers has resulted in extraordinary growth for MOG and its 13 million song catalog. MOG collects the information you share not only to personalize music offerings but also to help you discover music through your contacts. Reducing search costs for reliable goods and services is an important part of social commerce’s appeal. Music has always had a subjective dimension, where choices are based in part on the preferences of others. Artists also often operate as a kind of currency, paying the admission price for social acceptance. But how will artists and the music industry monetize music in this new cloud environment? Will consumers pay for the digital “lockers” that store their purchased music? Will they put up with the advertisements that crashed FM radio? Will they purchase subscriptions to stream personalized music? These are complex questions that will require careful scrutiny of all the new offerings cloud services make possible and of the dynamics of the music industry.

Notes

[1] “Digital Music’s Cloud Revolution”, by Steve Knopper in Rolling Stone. Dec 22-Jan 5, 2012, p.16.


Healthcare IT and the American Recovery and Reinvestment Act of 2009

Posted on | December 11, 2011 | No Comments

Citation APA (7th Edition)

Pennings, A.J. (2011, Dec 11) Healthcare IT and the American Recovery and Reinvestment Act of 2009. apennings.com https://apennings.com/enterprise-systems/healthcare-it-and-the-recovery-act-of-2009/

Within weeks of his inauguration, President Obama signed the American Recovery and Reinvestment Act of 2009 (ARRA), a $787 billion stimulus package meant to help revive the ailing economy suffering from the Great Financial Crisis (GFC) of 2007.[1]

In 2008, the nation was teetering on the edge of economic ruin. Oil peaked at around $147 a barrel in July, the highly leveraged global securitization and credit default scheme collapsed, and with it went the inflated housing market. The Dow Jones Industrial Average (DJIA) fell from its closing high of 14,164 on Oct. 9, 2007 to a low of 7,882 on Oct. 10, 2008, just a few weeks before the election.

The Act has been criticized by the Left as too small and by the Right as a gift basket for US Congress members. However, the legislation did push a number of technological initiatives that would change the future of IT in healthcare.

One of the most important initiatives designated $19.2 billion towards an interoperable, standards-based infrastructure for the secure exchange of electronic health care information and medical records (EHR or EMR) among doctors, hospitals, laboratories, pharmacies, and healthcare research facilities.

The initiative got its start in the Bush Administration with a 2004 executive order creating the Office of the National Coordinator for Health Information Technology. It became part of the Department of Health and Human Services (DHHS), headed by then-Secretary Michael O. Leavitt. The Office was only funded at $60 million a year, however, and the legislation never received congressional approval. The program engaged in only preliminary planning with its report, The Decade of Health Information Technology: Delivering Consumer-centric and Information-rich Health Care, which called for a ten-year plan to develop a Nationwide Health Information Network (NHIN) of health care providers.

This network would connect regional health information organizations (RHIOs) with regional health information exchanges (RHIEs), both of which would integrate clinical and public health data via electronic health record systems (EHR-S), with the goal of improving patient safety and delivering quality health care. It specified four objectives:

  • Bringing information tools to the point of care, especially by investing in EHR systems in physician offices and hospitals.
  • Building an interoperable health information infrastructure so that records follow the patient and clinicians have access to critical health care information when treatment decisions are being made.
  • Using health information technology to give consumers more access and involvement in health decisions.
  • Expanding capacity for public health monitoring, quality of care measurement, and bringing research advances more quickly into medical practice.[2]

With the Obama administration’s ARRA stimulus program, the diffusion of health information technology and the protection of medical records’ privacy and security were legislatively codified. Notably, the Health Information Technology for Economic and Clinical Health Act (HITECH) secured the national coordinator position and office. HITECH also provided some $2 billion for discretionary spending, primarily for grants and loans to implement health-related information and communications technologies.

To rush money into the economy, it established two related federal advisory committees to address healthcare ICT. One committee addressed standards to design a system of networked and interoperable electronic health records. The other committee developed policies to protect patient privacy and security. Together they worked with the private sector and consumer groups to develop the specifics of a health information network. The network permitted the ready exchange of certified electronic health records and other data while protecting patient privacy.[3] Much of the support was offered through “immediate funding” via federal agencies and grants to states, including a loan program to help providers purchase EHR systems as well as related training and technical support.

Most significantly, the HITECH Act allocated $17.2 billion in Medicare and Medicaid financial incentives for physicians and hospitals to implement Electronic Health Record (EHR) systems. Participating physicians could earn between $44,000 and $64,000 over the following five years for utilizing an electronic record system, provided they made “meaningful use” of the EHR installation.

Meaningful use included implementing certified EHR technologies with electronic prescribing capability that met Department of Health and Human Services (HHS) guidelines; connectivity to other healthcare providers, giving interoperable access to a patient’s health history; and the capability to report to HHS on how the technology was being used and its effectiveness, including fewer errors, clinical decision-making support, alerts, and other reminders.

Postscript

Whether the “stimulus package” saved the country from the economic abyss of the “Great Recession” will require a historical accounting of its positive economic impact on employment, GDP, and inflation. The economic stimulus package was often derided as a failure, even as unemployment fell from 8% to 5% by the end of the Obama administration.

Likewise, its influence on health care will require examining costs, mortality counts, patient care, and the populace’s overall well-being. As the 2012 elections for US President heated up, “Obamacare,” the Patient Protection and Affordable Care Act (PPACA), got most of the attention. The American Recovery and Reinvestment Act of 2009 propelled several technological initiatives, and its impact on the deployment of health information and communications technologies deserves a separate analysis.

Citation APA (7th Edition)

Pennings, A.J. (2011, Dec 11) Healthcare IT and the American Recovery and Reinvestment Act of 2009. apennings.com https://apennings.com/enterprise-systems/healthcare-it-and-the-recovery-act-of-2009/

Notes

[1] The American Recovery and Reinvestment Act of 2009. H.R.1.
[2] Quoted from “DHHS Office of National Coordinator for Health Information Technology (ONC).” Public Health Data Standards Consortium PHDSC – Promoting Standards Through Partnerships. Web. 11 Dec. 2011.
[3] Steinbrook, Robert, M.D. “Health Care and the American Recovery and Reinvestment Act.” N Engl J Med 360 (2009): 1057-1060. 12 Mar. 2009. Web. 11 Dec. 2011.


The New Frontier of “Big Data”

Posted on | December 4, 2011 | No Comments

The uses of database technology have been going through some dramatic shifts due to new storage, processing, representation, and transmission capabilities. Innovations have made data continuously cheaper to store, quicker to organize, easier to visualize, and faster to transmit. As a result, various private and public sector organizations are developing new data-driven analytic strategies to capture value from a wide variety of information sources and the large volumes of data that are now routinely collected and stored.

The Internet has added a whole new dimension to the “mining” of data from outside the organization, while user-generated content has added tremendously to the “infoverse” of obtainable and usable data. Terms like “terabytes,” “petabytes,” “exabytes,” and “zettabytes” are being used to describe the large data volumes that are in play now. Mobile and satellite technologies are adding yet more new realms of information, leading overall to what some are calling the era of “Big Data.”

A recent report by the McKinsey Global Institute (MGI) and McKinsey’s Business Technology Office entitled Big Data: The Next Frontier for Innovation, Competition, and Productivity studied the implications of the “Big Data” phenomenon for a variety of private and public sector organizations, specifically healthcare and retail in the US and government in Europe, as well as personal-location data globally, which they estimate could itself become a $600 billion industry. They concluded that the ability to capture, analyze, and visualize large data sets would be key to “new waves of productivity growth, innovation, and consumer surplus.” The research offered seven major points:

1. Data resources are now what economists call a factor of production. Along with land, labor, and capital, they are integral to the production of economic goods and services that are key to the future viability of the US economy.

2. Big data can create value in a number of ways. The collection and organization of digital data can make information transparent and usable at much higher frequencies. Transactional data can provide useful performance information, assist management decision-making, and improve the development of new products and services while targeting them to individual customers.

3. Distilling value from mountains of data will become key for individual firms. Established firms, competitors, and new entrants will continually compete to capture value from big data and develop data-driven strategies that leverage the scale and scope of companies’ access to data to drive innovation and competitive advantage. Data consultant Larry Mantrone summarized the relevance of these trends: “Big Data tools offer organizations the ability to analyze information ranging from sales patterns and internal business processes to public perceptions of the organization itself. Current tools can even process data that isn’t stored in relational databases, such as application log files, plain text or even Twitter feeds.”

4. The use of big data offers benefits to consumers, although the amount of information that can be collected is rather disturbing. Personalization capabilities reduce search and transaction costs for consumers and reduce marketing costs for producers. On the other hand, data collectors promise they can deliver anonymity by stripping identifying characteristics, such as a name or physical description, from the data collected. Terence Craig, co-author of Privacy and Big Data, thinks otherwise: “We’ve seen the Netflix de-anonymization, the AOL search release, and others. There’s been several cases where medical data has been released for laudatory goals, but that data has been de-anonymized rather quickly.” This includes the medical records of former Massachusetts governor William Weld, which were identified by Dr. Latanya Sweeney, the Director and Founder of the Data Privacy Lab at Harvard University, by combining the State of Massachusetts voter list with supposedly anonymized healthcare records. In her work on k-anonymity and the 1990 US census, she revealed that some 87% of the US population could be identified based on just three attributes: a 5-digit zip code, a birth date, and a person’s gender. A minimal sketch of this kind of re-identification check appears after this list.

5. Some economic sectors, like computers, finance, government, insurance, and manufacturing, are likely to experience greater gains in productivity and value from big data than other areas. Arts, construction, and education, not so much.

6. Talent is a major issue. A shortage of some 140,000 to 190,000 people with “deep analytical skills” is expected in the US economy over the next 6 years. Also needed are another 1.5 million managers who can use the analysis of quantitative and qualitative data to make strategically effective decisions.

7. A number of ethical and social issues will continue to require scrutiny and policy responses. Protection of trade secrets and individual privacy will be crucial, as will the intellectual property and liability issues that arise from drawing information from multiple sources. The situation gets more complex when you consider the use of third parties.
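
The re-identification risk Sweeney described can be illustrated with a simple k-anonymity check: group records by their quasi-identifiers (ZIP code, birth date, gender) and see how small the groups get. The records below are invented for illustration; this is a toy sketch of the concept, not her actual method or data.

```python
from collections import Counter

# Hypothetical "anonymized" records: names removed, quasi-identifiers kept.
records = [
    {"zip": "02138", "birth_date": "1945-07-31", "gender": "M", "diagnosis": "flu"},
    {"zip": "02138", "birth_date": "1945-07-31", "gender": "M", "diagnosis": "asthma"},
    {"zip": "02139", "birth_date": "1982-03-14", "gender": "F", "diagnosis": "flu"},
    {"zip": "02139", "birth_date": "1990-11-02", "gender": "F", "diagnosis": "injury"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "gender")

def k_anonymity(rows, keys):
    """Return k, the size of the smallest group sharing the same quasi-identifiers.
    k = 1 means at least one record is unique and potentially re-identifiable
    by joining with an outside source such as a voter list."""
    groups = Counter(tuple(row[key] for key in keys) for row in rows)
    return min(groups.values())

print("k =", k_anonymity(records, QUASI_IDENTIFIERS))  # k = 1 -> re-identification risk
```

A dataset is k-anonymous only if every such group contains at least k records, which is why seemingly harmless attributes like ZIP code and birth date matter so much.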

A practical example that provides interesting insights is the case of Google. The search and advertising behemoth is by most accounts the leader in the use of big data. A recent article in Wired magazine entitled “Googlenomics” discusses the company’s strategies for gathering and monetizing the vast amounts of data it collects.


Visiting the Future of the Panama Canal

Posted on | November 2, 2011 | No Comments

I recently went to Panama with a group of our students in New York University’s MS in Management and Systems degree program, which I chair, and two faculty members from the Stern School of Business, Kristen Sosulski and Harry Chernoff, who were developing a course called Operations in Panama examining global supply chain management, logistics, and real estate development. We met with business leaders and government officials and visited some small and large businesses. Panama is a study in contrasts, with some of the largest rain forests in the world surrounding a capital city that impressed me as one of the tallest cities I have ever seen (and I live in NYC!).

As one might expect, visiting the Panama Canal was the trip’s highlight. It had special relevance for me because when my parents immigrated to the US from the Netherlands, they went first through the Canal and on to California, where they caught a train to New York and settled to raise our family.

The Canal was being upgraded to handle more and larger ships. Panama charges ships by their size, with the largest vessels paying over half a million US dollars for the transit. As 60% of the world’s shipping is conducted through containers, it is worth noting that the planned upgrades will allow ships carrying nearly 12,000 containers, more than twice the current limit of 4,800 containers. Dr. Sosulski wrote an interesting blog post about different ways to understand the Panama Canal through various modes of visualization.

We happened to be at the Canal the morning President Obama signed the trade agreement with Panama. The Panama–United States Trade Promotion Agreement, or Tratado de Libre Comercio (TLC) entre Panamá y Estados Unidos, is a bilateral agreement that has been in the works for the last ten years. The deal is part of Panama’s plan to leverage the Canal to become a major commercial hub. Panama is striving to go beyond just providing passage between the great oceans to adding value by building infrastructure such as a major international airport nearby and fiber optic networks. It is also working to improve its educational facilities and entice companies to set up operations centers. It hopes to become the Singapore of the Americas.

The agreement is controversial, but it opens Panama to many international financial, insurance, and IT services. As the country continues to become a logistical hub for trade in the region, it will continue to need increasingly sophisticated services. These will help better coordinate its shipping traffic, financial activities, and distribution operations. Hopefully, it will also enhance safe passage for goods and services.

Postscript 2022: Shortages and Inflation

The expanded canal began commercial operation in June of 2016. The project built two new locks, one on the Atlantic side and another on the Pacific side. It dredged new channels to the enlarged locks and deepened and widened existing waterways. As a result, bulk shipments such as Liquefied Natural Gas (LNG) increased immediately, and container shipments also grew steadily.

A relevant question for 2022 is whether shipping ports have prepared sufficiently for increased container traffic. Have ports such as Baltimore, Charleston, Los Angeles, Miami, New York, Philadelphia, Portland, and Virginia modernized their facilities to become “big ship ready”? Ships anchored offshore, containers piling up on docks, and long-haul trucks queued for hours to get to the containers suggest they have not.

East Coast ports, in particular, have not been prepared for shipments from the largest and busiest ports: Shanghai, China; Singapore; Hong Kong; and Busan, South Korea. Massive ships coming in from Asia through the Panama Canal may need deeper waterways, wider turning basins, longer berths, bigger docks, and taller cranes that could stretch across 20 or more containers.

The supply crisis contributing to the inflation surge of 2022 has shed light on many of these insufficiencies, resulting in increased public support for modernizing these shipping infrastructures.


Popular Programming Languages

Posted on | October 12, 2011 | No Comments

It has been a while since I reviewed the latest statistics about the most popular programming languages.

The top 10 most popular programming languages according to the TIOBE Index in September 2011 are:

  1. Java
  2. C
  3. C++
  4. C#
  5. PHP
  6. Objective-C
  7. (Visual) Basic
  8. Python
  9. Perl
  10. JavaScript

The list is a bit misleading. According to the Language Popularity Index, C and Java together account for just over 50% of usage among compiled, general-purpose languages, with C having a slight edge. C was developed at AT&T’s Bell Labs in the early 1970s in conjunction with its work on Unix. Java was developed in the mid-1990s by Sun Microsystems (now an Oracle subsidiary) for interactive TV and mobile devices but found a more immediate home on the emerging World Wide Web. Java is currently a source of contention between Oracle and Google due to its use in the Android operating system.

Among non-compiled scripting languages, PHP is the most popular with a 27.476% share, followed by Perl with 16.415% and Python with about 13%. JavaScript returns to the top ten overall, although it is only the number 5 scripting language (after Ruby) with a 5.5% share.


The Price of Neglect

Posted on | October 8, 2011 | No Comments

I don’t always agree with Tom Friedman, but the guy thinks about important questions and is an active contributor to major economic debates. Lately he has been promoting his new book with Michael Mandelbaum on the American predicament, That Used to Be Us: How America Fell Behind in the World It Invented and How We Can Come Back. They list the main challenges for America as they see them:

  • Globalization
  • Information technologies
  • National debt
  • Patterns of energy usage

I like this interview on Charlie Rose.

Their basic contention is that America is neglecting the basic formula that made it great: a focus on four aspects of American society that require some collective action:

  • Educate people up to the latest developments in science and technology
  • Rebuild American infrastructure such as airports, energy, roads, and telecommunications
  • Open up immigration to allow the brightest and hardest working to come to America
  • Restructure investment capabilities to support innovation and production capabilities

We’ll see if I have more to say on the book after I get some more time to read it. I like their focus on collective action but unfortunately too many Americans believe government investments go to someone besides them. Collective action for them means bailing out Wall Street, providing healthcare benefits for illegal aliens, overpaying government bureaucrats, or providing welfare for people who don’t want to work. And while a case can be made for each of these arguments, we are throwing the baby out with the bathwater, so to speak. This is the real tragedy of our times.


The Walt Disney of Japan

Posted on | July 29, 2011 | No Comments

with Justin Restivo
Hayao Miyazaki is one of the most renowned animation creators in the world, known primarily for his classic Japanese feature-length films such as My Neighbor Totoro, Princess Mononoke, Spirited Away and the recent release, Ponyo. He founded the Japanese animation venture, Studio Ghibli. “Ghibli” is the Arabic name for the Mediterranean wind, and refers to Miyazaki’s intention for a wind of change to blow through the Japanese animation industry. Many have come to consider him to be the Walt Disney of Japan because of the memorable animated characters he created and the impact of his films.

Miyazaki was born during World War II in a suburb of Tokyo. He was fascinated with aviation as a child and loved to draw airplanes as well as trains and automobiles. In high school, he saw Japan’s first feature-length anime film, The Tale of the White Serpent, and became interested in the field of animation. Although Miyazaki obtained his college degree in economics, he went to work at the Toei Animation Studio as an artist and assistant animator, maintaining a strong interest in the industry and becoming the leader of its labor union.

By 1968, Miyazaki had become a chief animator and concept artist for Hols: Prince of the Sun, directed by Isao Takahata, with whom he continued to work for the next few decades. Miyazaki left Toei Animation Studio to direct several episodes of the television show Lupin III, an action comedy, and also worked on several films at A-Pro and Nippon studios. His signature film of this period was the feature-length Lupin III: The Castle of Cagliostro, which he directed.

He was developing his own oeuvre, elements of which were incorporated in his next film, Nausicaä of the Valley of the Wind. He was beginning to explore issues of pacifism, feminism, a fascination with flight, and the impact of humans on the environment. Miyazaki was also concerned with exploring the morality of his characters, and particularly his villains, which he infused with ambiguity. He would pursue these themes successfully after he started Studio Ghibli in 1985, with films such as Castle in the Sky, My Neighbor Totoro, Kiki’s Delivery Service, and Porco Rosso. A major achievement came in 1997 with Princess Mononoke, which became Japan’s highest-grossing film of all time and was the first animated film to win Japan’s equivalent of the “Best Picture” award.

A distribution partnership with Disney brought global fame to Studio Ghibli as Miyazaki’s films developed a fan base in the West. Spirited Away won “Best Animated Feature” at the Academy Awards in 2003 and eventually surpassed Princess Mononoke’s box office totals, while Ponyo is reaching even higher levels of acclaim and revenue as the partnership with Disney continues.

