TOP SOCIAL MEDIA AND FORUM SITES IN THE US
Posted on | October 18, 2012 | No Comments
Here is a list of the top social media websites for October 2012, as compiled by eBizMBA.[1] Next to each name is an estimate of its Unique Monthly Visitors (EUMVs).
I have added links in case you are not familiar with the individual sites, but I also highly recommend visiting the eBizMBA site for its extraordinary lists and articles, such as the Top 15 Most Popular File Sharing Websites and the Top 15 Most Popular Music Websites.[2]
1. Facebook – 750,000,000 EUMVs
2. Twitter – 250,000,000
3. LinkedIn – 110,000,000
4. MySpace – 70,500,000
5. Google+ – 65,000,000
6. DeviantArt – 25,500,000
7. LiveJournal – 20,500,000
8. Tagged – 19,500,000
9. Orkut – 17,500,000
10. Pinterest – 15,500,000
11. CafeMom – 12,500,000
12. Ning – 12,000,000
13. Meetup – 7,500,000
14. myLife – 5,400,000
15. Badoo – 2,500,000
I'm not quite sure why YouTube is not on this list. I do see it on their top video websites with 450,000,000 EUMVs, which would put it second on this list. Likewise, Wikipedia, with 350,000,000 Estimated Unique Monthly Visitors, would rank very high as well. It is listed under the Top 15 Most Popular Reference Websites.
I like this infographic by Fred Cavazza for providing a taxonomy of social media sites.[3]
Notes
[1] The eBizMBA Rank gathers information from the Alexa Global Traffic Rank and the U.S. Traffic Rank to compile an average for each website.
[2] Statistics from Compete and Quantcast.
[3] Fred lives in Paris but also keeps a blog for Forbes called Welcome to the 21st Century.
Anthony J. Pennings, PhD recently joined the Digital Media Management program at St. Edward's University in Austin, TX, after ten years on the faculty of New York University. © ALL RIGHTS RESERVED
WSJ in the Ether about Inventing the Internet
Posted on | July 31, 2012 | No Comments
The privatization and commercialization of Cold War technology, a central part of the Reagan Revolution, entered the limelight during the 2012 presidential election with the attention given to the role of government in the economy.
Controversy emerged recently with Gordon Crovitz’s opinion piece in the Wall Street Journal entitled “Who Really Invented the Internet?” Crovitz’s article is part of the backlash to President Obama’s somewhat poorly phrased, but nonetheless accurate assertion, “The Internet didn’t get invented on its own. Government research created the Internet so that all companies could make money off the Internet.”
Crovitz covers a number of points about the Internet but manages to avoid then-Vice President Al Gore's often-cited CNN quote, "During my service in the United States Congress, I took the initiative in creating the Internet."
Instead, his major focus is on Xerox and the creation of Ethernet networking technologies at its research and development facilities in Silicon Valley. The Palo Alto Research Center (PARC) was made famous when its technology was picked over by Steve Jobs for the Macintosh computer. Crovitz's focus on the importance of Ethernet is an interesting contribution to the study of the evolving Internet, but a more in-depth look at this technology does more to illuminate the government's startup role in the Internet than to disprove its participation.
PARC was set up by Xerox in 1970 to establish leadership in the "architecture of information," a sufficiently vague but enticing term coined by Xerox CEO Peter McColough. To do that, Xerox itself picked over the expertise and technology created and funded by the government's Advanced Research Projects Agency (ARPA). ARPA was created by President Eisenhower in reaction to the USSR's Sputnik satellite success. One of its major accomplishments was that it effectively created the computer science field by funding departments throughout the US university system. It also guided the development of the ARPANET, the first implementation of packet-switched data communications technology and generally considered the original precursor to the Internet.
Drawing on Xerox's great wealth surplus, harvested during the 1960s in part from the Great Society's new bureaucracies, PARC recruited the brightest of ARPA's brain trust. Starting with Robert Taylor, the director of ARPA's main computer division, PARC recruited some of the best people in the emerging computer science field, including those with expertise in personal computing and timesharing. Taylor had worked under the famous J.C.R. Licklider, the first director of information processing at ARPA, and had himself funded many seminal projects, including Doug Engelbart's NLS project, which invented the mouse. While at ARPA, Taylor had also hired Larry Roberts, who coordinated the ARPANET project and went on to create the ITU's X.25 protocols that allowed government telcos around the world to offer packet-switching networks for banking and other commercial organizations.
By recruiting some of computer science's top researchers, Xerox developed the Alto and the Star, two of the first personal computers with a graphical user interface (GUI), hypertext, and a mouse. But it didn't have a way of networking them.
So Xerox PARC hired Robert Metcalfe, a Harvard PhD who had worked closely with the ARPANET, to develop networking technology to connect its personal computers. Harvard had initially rejected Metcalfe's proposal to write his dissertation on the ARPANET, so he left to spend some time at the University of Hawaii studying the ALOHANET project's innovative work on data networking. The ALOHANET was a government-funded project that experimented with using radio broadcasts between the Hawaiian islands to transfer data. There, Metcalfe picked up crucial ideas on packet-switching and collision detection from Engineering Professor Norm Abramson that would provide the basis of his PhD dissertation and, later, of the networking innovations at PARC.
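Abramson's random-access insight is simple enough to sketch numerically. The short Python simulation below is my own illustration (not anything from the original ALOHANET work) of the slotted-ALOHA variant of the idea: stations transmit into shared time slots at random, a slot succeeds only when exactly one station uses it, and throughput therefore peaks near the theoretical 1/e, or roughly 37 percent.

import random

def slotted_aloha_throughput(num_stations=50, p_transmit=0.02, num_slots=100_000, seed=1):
    """Fraction of time slots that carry exactly one frame (a 'success').
    Two or more simultaneous transmissions collide and are lost."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(num_slots):
        transmitters = sum(1 for _ in range(num_stations) if rng.random() < p_transmit)
        if transmitters == 1:
            successes += 1
    return successes / num_slots

# Offered load G = num_stations * p_transmit; throughput tracks G * e^(-G),
# peaking near 1/e (~0.368) when G is close to 1.
for p in (0.005, 0.02, 0.05):
    print(f"p={p:.3f}  throughput={slotted_aloha_throughput(p_transmit=p):.3f}")

Ethernet's CSMA/CD refined this approach by listening to the shared cable before transmitting and backing off after a collision, which is what made a shared office-wide medium practical.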
Robert X. Cringely interviews Abramson and Metcalfe on how Ethernet was created as part of this hour-long documentary.
Concepts emerging from the ALOHA project led to Xerox's Ethernet local area networking products. While the ARPANET had solved certain issues related to long-distance data communications, Ethernet tackled the short-range communications needed in an office or campus environment. It is now not only widely used in wired local area networks, but its concepts also underpin the IEEE 802 family of wireless "WiFi" standards used today, as well as emergent 4G mobile services for smartphones and tablets such as the iPad.
Anyone examining the state of modern computer and web technologies has to marvel at the role of human ingenuity, capital markets, and organizational capabilities. Start-ups and the modern corporation have done an amazing job refining the basic technologies developed by the military and government-funded contractors into the modern Internet. Unfortunately, trying to write government's role out of its history is not only misleading but obfuscates important components of the national system of innovation. World War II, the Cold War, and the Space Race were all crucial for the development of the satellites, computers, and microprocessors that help make up the global Internet. Capital markets and enterprise are crucial for important refinements in energy, health, and further developments in information technology and media, but it would be foolish to ignore the basic research and practical applications that emanate from public monies.
PARC was an important intermediary that helped transform years of government investment into workable products. It was part of the inventive complex that captured the imagination of consumers and showed companies how to become more productive. It spawned people like Bob Metcalfe, who went on to start 3Com (now part of HP). Several other researchers from PARC, such as Charles Simonyi, went over to Microsoft to help create Windows and the suite of Office products based on the ARPA/PARC/Apple graphical user interface (GUI). And, of course, Apple, currently the world's most valuable company, has become a pivotal economic engine.
Metcalfe and Jobs, along with Bill Gates and Larry Ellison, are a few of the main "supply-siders" who drew on the technological and scientific legacy of the Cold War and the associated "Space Race," as well as other government investments, to create the products and services that propel the global economy and create our modern digital world. The term "Reagan Revolution" has been applied to an extraordinary transfer of wealth from the "public" sector to the private sector, but one that paid off substantially as microprocessors, satellites, and data networks, among many other innovations developed by the government, were refined for economic consumption and utilization.
Full Disclosure: I took a Satellite Communications class with Norm Abramson at the University of Hawaii when I was working on my PhD with the East-West Center. If one follows Crovitz's contention that Ethernet was the seminal technology creating the Internet, I guess I would have to argue that Abramson and the University of Hawaii invented the Internet. 🙂
Citation APA (7th Edition)
Pennings, A.J. (2012, July 31). WSJ in the Ether about Inventing the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/wsj-in-the-ether-about-the-internet/
© ALL RIGHTS RESERVED

Tags: Advanced Research Projects Agency (ARPA) > Aloha System > ALOHANET > Commercialization of Cold War technology > Ethernet > Norm Abramson > Reagan Revolution > Robert Metcalfe > Space Race > WiFi > Xerox > Xerox CEO Peter McColough > Xerox PARC
Arthur C. Clarke’s Three Laws of Innovation
Posted on | July 1, 2012 | No Comments
During World War II, Arthur C. Clarke was an electronics engineer at a top-secret radar installation outside London. It was a scary time, when German V-1 and V-2 rockets were raining terror on England. The experience did give him some time to experiment with radio waves and think about the future of technological innovation. This, of course, is the same Arthur C. Clarke who went on to become a famous science fiction writer and the author of 2001: A Space Odyssey.
Clarke made his mark first in non-fiction when he published a seminal article on the possibilities of satellites in the October 1945 edition of a British journal, Wireless World.[1] He conceived the idea of putting satellites, what he initially called "rocket stations," in orbit around the world to act as radio relays providing global communications. He proposed that three satellites placed in a high enough orbit could blanket the Earth with radio broadcasts. He reasoned that a satellite circling the Earth at about 6,870 miles per hour and placed 22,300 miles above the equator would keep pace with the revolving Earth below and provide a stationary target for bouncing radio waves back to the ground. This "geostationary orbit," often known as the "Clarke Belt," provided the basic science and rationale for the future of communications satellites.
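Clarke's figures follow directly from Kepler's third law. The back-of-the-envelope Python sketch below (mine, not Clarke's, using standard values for Earth's gravitational parameter and radius) recovers roughly the 22,300-mile altitude and 6,870-mph speed he described for an orbit whose period matches one rotation of the Earth.

import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6_371_000  # mean Earth radius, m
SIDEREAL_DAY_S = 86_164     # one rotation of the Earth, seconds

# Kepler's third law: T = 2*pi*sqrt(a^3 / mu)  ->  a = (mu * (T/(2*pi))^2)^(1/3)
semi_major_axis_m = (MU_EARTH * (SIDEREAL_DAY_S / (2 * math.pi)) ** 2) ** (1 / 3)
altitude_miles = (semi_major_axis_m - EARTH_RADIUS_M) / 1609.344
speed_mph = (2 * math.pi * semi_major_axis_m / SIDEREAL_DAY_S) / 1609.344 * 3600

print(f"Geostationary altitude: {altitude_miles:,.0f} miles")   # roughly 22,200 miles
print(f"Orbital speed:          {speed_mph:,.0f} mph")          # roughly 6,900 mph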
As Clarke became famous, he reflected on the processes of predicting change and prophesying the future. Beginning in the early 1960s, he proposed his three laws for thinking about innovation (not to be confused with Asimov's three laws of robotics).
In an essay entitled “Hazards of Prophecy: The Failure of Imagination”, he proposed his first law:
1) When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
(OK, he didn't get gender equality back then, but he is noted for the quip, "I don't believe in God, but I'm interested in her.")
With encouragement from colleagues and readers, he soon developed a second law, which he had hinted at in his essay:
2) The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
One of his ideas was the space elevator, which would replace rocketing stuff into space. The idea is still being actively researched, and much of the underlying science has been developed.
And surrendering to the notion of three laws such as Isaac Newton’s laws of motion, he suggested his most famous:
3) Any sufficiently advanced technology is indistinguishable from magic.
Could one even imagine the possibility of satellite communications, say, during the Civil War, about 100 years before Clarke's vision was achieved?
Notes
[1] Arthur C. Clarke, "Extra-Terrestrial Relays," Wireless World, October 1945.
[2] Clarke's Law, later Clarke's First Law, can be found in the essay "Hazards of Prophecy: The Failure of Imagination," in the collection Profiles of the Future (Harper & Row, 1962; revised 1973).
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global e-commerce. © ALL RIGHTS RESERVED
The Meaning Makers: Omnicom Group
Posted on | June 27, 2012 | No Comments
Update: On July 28, 2013 it was announced that Publicis Groupe and Omnicom Group were merging to form Publicis Omnicom Group.
Meaning-making in our media-saturated world through advertising and public relations has traditionally been dominated by several major holding companies, each of which in turn owns a large number of individual advertising, design, public relations, and media companies. These include the Omnicom Group and Interpublic, both headquartered in New York, as well as WPP in London and Publicis of Paris. Here I'm going to first talk about Omnicom.
The Omnicom Group consists of three major advertising "brands" (BBDO Worldwide, DDB Worldwide, and TBWA\Worldwide) and three major public relations firms (Fleishman-Hillard, Ketchum, and Porter Novelli). Marketing services are coordinated through Diversified Agency Services (DAS), a division of Omnicom Group Inc. with more than 190 companies operating internationally and locally. This includes customer relationship management (CRM) and B2B advertising services, public relations, and specialty communications in fields such as finance, healthcare, recruitment, and multicultural marketing. The Omnicom Group also includes media services companies such as OMD, PHD, and Prometheus. Specialized services include Icon International, a leading asset barter company, Novus, OMG Outdoor Media Group, and Resolution Media.
Omnicom's use of IBM Cognos Business Analytics is discussed in this video. The IBM system is being used to coordinate its global network of agencies and companies.
Omnicom is involved in PR for nations, such as Ketchum's work with Russia to develop briefing points for interviews, press releases, fact sheets, and the like. Omnicom companies have also promoted Russia's oil companies and facilitated meetings between Russian government officials and international "experts" who are often interviewed in the media. Of particular interest to the Russian government has been guidance for its acceptance into the World Trade Organization (WTO), a mission accomplished in December of 2011.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global e-commerce. He is currently a visiting professor at Hannam University in South Korea.
© ALL RIGHTS RESERVED
Tags: Interpublic > OMD > Omnicom Group > PHD > Prometheus > Publicis
Four Generations of Wireless Tech
Posted on | April 30, 2012 | No Comments
The ubiquity, ease, and sophistication of mobile services have made them an extraordinarily popular addition to modern social and productive life. The term "generations" has been applied to wireless technology classifications as a way to refer to the major disruptions and innovations in the state of mobile technology and associated services. These innovations include the move to data and the Internet protocols associated with the convergence of multiple forms of communications media (cable, mobile, wireline) and the wide array of services that are becoming increasingly available on portable devices. We are now on the cusp of the fourth-generation rollout of wireless services, with intriguing implications for enterprise mobility, "m-commerce," and a wide array of new entertainment and personal productivity services.
By 1982, the Federal Communications Commission (FCC) had recognized the importance of the wireless communications market and began defining Cellular Market Areas (CMAs) and assigning area-based radio licenses. It split the 40 MHz of radio spectrum it had allocated to cellular into two market segments: half would go to the local telephone companies in each geographical area and the other half to interested non-telephone companies by lottery. Although AT&T's Bell Labs had effectively begun the cellular market, it had estimated the market in 2000 at slightly less than a million subscribers and consequently abandoned it during its divestiture of the regional phone companies. Meanwhile, financier Michael Milken began a process of helping the McCaw family buy up the other licenses, making them multibillionaires when they sold out to AT&T in the mid-1990s.
The first generation (1G) of wireless phones were large analog voice machines, and their data transmission capability was virtually nonexistent. This initial generation was developed in the 1980s through a combination of lotteries and the rollout of cellular sites and integrated networks. It used multiple base stations, each providing service to small adjoining cell areas. Its most popular phone was the Motorola DynaTAC, sometimes known as "the brick," now immortalized by financier Gordon Gekko's early morning beach stroll in Wall Street (1987). 1G was hampered by a multitude of standards, such as AMPS, TACS, and NMT, that competed for acceptance. The Advanced Mobile Phone System (AMPS) was the first standardized cellular service in the world and was used mainly in the US.
The second generation (2G) of wireless technology was the first to provide data services of any significance. By the early 1990s, GSM (Global System for Mobile Communications) had been introduced, first in Europe, then in other countries around the world, and in the U.S. by T-Mobile. GSM standards were developed by the Groupe Spécial Mobile committee, formed in 1982 as an offshoot of the European Conference of Postal and Telecommunications Administrations (CEPT); it became the standard that would allow national telecoms around the world to provide services. Although voice services improved significantly, the top data speed was only 14.4 Kbps. AT&T utilized Time-Division Multiple Access (TDMA)-based systems, while Bell Atlantic Mobile introduced CDMA in 1996. This second-generation digital technology reduced power consumption and carried more traffic, while voice quality improved and security became more robust. The Motorola StarTAC phone was originally developed for AMPS but was sold for both TDMA and CDMA systems.
New innovations sparked the development of the 2.5G standards that provided faster data speeds. The additional half a generation referred to the use of data packets. Known as the General Packet Radio Service (GPRS), the new standards could provide 56-171 Kbps of service. GPRS has been used for Short Message Service (SMS) and Multimedia Messaging Service (MMS), WAP (Wireless Application Protocol), and Internet access. An advanced form of GPRS called EDGE (Enhanced Data Rates for Global Evolution) was used for the first Apple iPhone, sometimes considered an early step toward 3G technology.
Third generation (3G) network technology was introduced by Japan's NTT DoCoMo in 1998 but was adopted slowly in other countries, mainly because of the difficulties of obtaining the additional electromagnetic spectrum needed for the new towers and services. 3G technologies provided a range of new services, including better voice quality and faster speeds. Multimedia services like Internet access, mobile TV, and video calls became available. Telecom and application services, such as file downloads and file sharing, made it easy to retrieve, install, and share apps. 3G radio standards have been largely specified by the International Mobile Telecommunications-2000 (IMT-2000) program of the International Telecommunication Union, but the major carriers continued to evolve their own systems, such as Sprint and Verizon's CDMA2000 and AT&T and T-Mobile's Universal Mobile Telecommunications System (UMTS), an upgrade of GSM based on the ITU's IMT-2000 standard set, but an expensive one, as it required new base stations and frequency allocations.
A 3.5 generation became available with the introduction of High Speed Packet Access (HSPA), with promises of 14.4 Mbps, although 3.5-7.2 Mbps was more typical.
Fourth generation wireless technology is still a work in progress but seeks to provide mobile all-IP communications and high-speed Internet access to laptops with USB wireless modems, smartphones, and other mobile devices. Sprint introduced the first 4G phone in March of 2010 at the communication industry's annual CTIA event in Las Vegas. With a 4.3-inch screen, two cameras, and the Android 2.1 OS, the new phone was able to tap into the new IP environment. Fourth generation (4G) technology is being rolled out in various forms with a dedication to broadband data and Internet protocols, with services such as VoIP, IPTV, live video streams, online gaming, and multimedia applications for mobile users.
While 3G was based on two parallel infrastructures, using both circuit-switched and packet-switched networking, 4G will rely on packet-switching protocols only. 4G LTE (Long Term Evolution) refers to wireless broadband IP technology developed by the Third Generation Partnership Project (3GPP). "Long Term Evolution" meant the progression from 2G GSM, to 3G UMTS, and into the future with LTE. The 3GPP, an industry trade group, designed the technology with the potential for 100 Mbps downstream and 30 Mbps upstream. Always subject to various environmental influences, data rates could reach 1 Gbps in the next ten years.[2]
4G phones are also being developed to access WiMax (Worldwide Interoperability for Microwave Access), which uses the IEEE 802.16 standard, with a range of some 30 miles and transmission speeds of 75 Mbps to 200 Mbps. 4G WiMax provides data rates similar to the 802.11 Wi-Fi standards with the range and quality of cellular networks. What has made the difference in the technology has been the softer handoffs between base stations that allow for more effective mobility over longer distances. Going to IP will allow mobile technology to integrate into the all-IP next-generation network (NGN) that is forming to offer services across broadband, cable, and satellite communication mediums.
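To put the generational jump in perspective, here is a rough, illustrative calculation of my own, using the nominal peak rates cited above (which real networks rarely deliver), of how long a 5 MB music file would take to download at each stage:

# Nominal peak data rates mentioned above (bits per second)
rates_bps = {
    "2G (GSM, 14.4 Kbps)":   14_400,
    "2.5G (GPRS, 171 Kbps)": 171_000,
    "3.5G (HSPA, 7.2 Mbps)": 7_200_000,
    "4G (LTE, 100 Mbps)":    100_000_000,
}

FILE_SIZE_BITS = 5 * 8 * 1_000_000  # a 5 MB file, expressed in bits

for label, bps in rates_bps.items():
    seconds = FILE_SIZE_BITS / bps
    print(f"{label:24s} ~{seconds:10.1f} s")

Even at these idealized rates, the move from 2G to 4G turns a forty-five-minute download into a fraction of a second, which is what makes the streaming, gaming, and multimedia services described above feasible on mobile devices.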
Notes
[1] For a history of wireless communications.
[2] This is a great review of the 4 generations of wireless technologies.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global e-commerce.
Geopolitical Risk and the Information Standard
Posted on | April 1, 2012 | No Comments
I've often used Walter Wriston's "Information Standard" as part of an analytical framework for understanding the global economy and the implications of emerging digital financial practices. Maybe not with all the connotations that Citicorp CEO Walter Wriston gave it, but the basic idea is that the gold-backed international currency framework that shaped the global economy after World War II has long been replaced by a new techno-structural system of computerized transactions, news flow, risk management, and technical analysis based on the US dollar and the collateral strength of US Treasuries.[1]
Now gold is another dataset in the complex algorithm that shapes the prices and trades of the global financial system and consequently influences the policy decisions of nations and the flows of investment capital. In this post, I examine some of the recent historical threads of risk analysis and their relationship to computerized trading and the formation of a global “Information Standard” [2].
Risk has always been a factor in investing and lending, whether investing cattle in a sailing expedition to the Spice Islands or British pounds in a transcontinental railroad in the Americas. Captains and merchant ship owners knew that Edward Lloyd's coffeehouse was the place to go for marine insurance and shipping news; that was the start of Lloyd's of London, one of the most storied insurance companies in risk history. Certainly the "tulip mania" of 1630s Amsterdam is part of our lexicon of financial bubbles and risk, even getting a short cameo and description by Gordon Gekko in Oliver Stone's Wall Street: Money Never Sleeps. Risk emerged as a more prominent factor in the 1970s, when a series of events and innovations occurred in the interconnected areas of international finance, news, and telecommunications.
The New Deal started an era of financial containment, but it couldn't withstand the stresses of globalization and the rise of new technologies. In addition to legislation like the Glass-Steagall Act of 1933, which kept investment banks from gambling with depositors' money, FDR confiscated all gold used for currency and built Fort Knox in Kentucky during the mid-1930s to keep it out of the hands of the Nazis. The Allies used these bullion stores at the end of World War II as the basis of a new international economy by tying the dollar to gold at $35/oz (28.35 grams) and the monies of other Allies to the dollar. This experiment, organized in July 1944 when the United Nations Monetary and Financial Conference was held in Bretton Woods, New Hampshire, eventually ran counter to US policy and was destined to fail. When Nixon ended the arrangement in August of 1971, it introduced unprecedented levels of volatility and volume into the global economy. The end of the Bretton Woods agreement was in part to blame for the two oil crises of the 1970s and the raging inflation that plagued the decade.
This new era of floating exchange rates was characterized by a new virtualization of foreign exchange (F/X) trading through interbank spot markets, facilitated largely by Reuters' new Monitor Money Rates service, which displayed prices of currencies from banks around the world. Recognizing the need for new F/X risk management techniques, currency futures were introduced at an offshoot of the Chicago Mercantile Exchange (CME). On May 16, 1972, the International Monetary Market (IMM) was opened to provide future delivery of currencies at fixed prices. It initially traded contracts in seven currencies: the British pound, the Canadian dollar, the German deutsche mark, the French franc, the Japanese yen, the Swiss franc, and the Mexican peso. Both currency markets received a financial boost in October of 1973 when several Arab states launched an attack on Israel, setting off further volatility in the currency markets, especially as oil production was cut back in response to international support for Israel.
A related process was the recirculation of eurodollars (US currency held outside the country) from OPEC countries to developing and Soviet-bloc countries. As almost all oil sales are denominated in US dollars, this created a surplus of devalued "petrodollars," which western banks lent out to countries like Argentina, Bolivia, New Zealand, Poland, and South Korea. To reduce their exposure, several banks would come together to make a syndicated loan to a country like Brazil, with a leading bank taking a major position and several others sharing the risk. Propelled by the rhetoric that sovereign countries don't go bankrupt, this financial bubble would have a dramatic effect on nations around the world.
The resultant "Third World Debt Crisis" changed the global landscape, forcing countries to sell off government assets and state-owned enterprises (SOEs) and pressuring them to open their borders to unrestricted and unexamined flows of data and news. Many of these countries were transformed from developing countries into "emerging markets," open to international flows of direct and portfolio investment and subject to increasing amounts of country risk analysis and examination. The term "Washington Consensus" gained circulation as a set of policy prescriptions involving fiscal discipline, liberalization of trade and inflows of capital, privatization, tax reform, and the deregulation of a wide range of industries to allow competition and foreign ownership. Government PTT (Post, Telephone, and Telegraph) operations were liberalized and sold off, opening the way for modernization, including the World Wide Web. Debt was transformed into equity and traded on new electronic platforms around the world, while US$3 trillion flowed daily through the global currency markets, creating a new global sovereign power that could influence the policies of individual countries.
As this global environment emerged in the late 1980s and early 1990s, it manifested a new type of trading organization intent on utilizing computerized risk management and trading techniques. Called "hedge funds," they aimed to combine high leverage with various types of arbitrage methods and pairs trading. The general strategy was based on the thesis that multiple risky positions taken together can effectively eliminate risk itself. One company in particular, Long Term Capital Management (LTCM), gathered together some of the best financial minds to create the ultimate hedge fund. The payroll included Nobel Prize laureates Myron Scholes, half of the team that came up with the Black-Scholes formula for pricing options, and Robert C. Merton, who also received his prize for work on determining the value of derivatives. LTCM strove to take advantage of this new trading environment by using "dynamic hedging" to trade continuously and globally. They raised money from large investors and developed electronic trading systems that drew on a type of rocket science called "Ito calculus," developed to guide a missile microsecond by microsecond. Their trading strategy was to use these computerized systems to continuously monitor and trade a combination of financial derivatives and securities globally, based on probability theories and risk management techniques.
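For readers unfamiliar with the model that earned Scholes and Merton their Nobel Prize, below is a minimal, textbook-style Python implementation of the closed-form Black-Scholes price for a European call option (an illustrative sketch of mine, not LTCM's proprietary trading systems):

import math

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(spot: float, strike: float, rate: float,
                       volatility: float, maturity_years: float) -> float:
    """Closed-form Black-Scholes price of a European call option."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * volatility ** 2) * maturity_years) \
         / (volatility * math.sqrt(maturity_years))
    d2 = d1 - volatility * math.sqrt(maturity_years)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * maturity_years) * norm_cdf(d2)

# Example: a one-year at-the-money call on a $100 asset, 20% volatility, 5% risk-free rate.
print(round(black_scholes_call(spot=100, strike=100, rate=0.05,
                               volatility=0.20, maturity_years=1.0), 2))  # about 10.45

The formula prices an option from five inputs: the underlying asset's current price, the strike price, the risk-free rate, the asset's volatility, and the time to maturity. Dynamic hedging of the kind LTCM practiced involves continuously recomputing such values and rebalancing positions as markets move.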
While hugely successful at first, they ran into a series of geopolitical events. LTCM returned over US$2.5 billion to its investors in 1997. While indicating success, this also reduced their capital base and, consequently, their ability to deal with volatility. LTCM ran into trouble during the Asian financial crisis in 1997 and especially in August 1998, when Russia defaulted on its government bond payments; just three days later, LTCM lost over half a billion dollars in a single 24-hour period. Over-leveraged and under-capitalized, it lost over $4.5 billion in the course of a few months. With the entire financial system at stake, the New York Fed and several major financial institutions initiated a bailout. LTCM operated for a few more years before it was quietly disbanded.
The reputation of mathematical risk models took a hit with the fall of LTCM, but they were by no means abandoned, especially after the Commodity Futures Modernization Act of 2000 further deregulated derivatives trading. The extraordinary surplus wealth that had been accumulating in the 20th century, along with the new digital calculation and transaction methods recruited to make money in the financial markets, has intensified the complex of global data and trading activities that Wriston argued make up the global "Information Standard." He was particularly intrigued with the power of computerized F/X trading, which now exceeds $3 trillion a day. But the environment has since become increasingly complex as "quants" entered the equation with sophisticated algorithmic techniques to manage risk while trading a wide range of financial instruments for profit. Debt instruments in particular have been a focus of speculation, and the sovereign bond market has become an additional gauge of national risk. This has become particularly evident in Europe, where the euro has become a multinational currency, superseding the national economic policies of the so-called "PIIGS" countries (Portugal, Italy, Ireland, Greece, and Spain). Having lost control of their currencies, these countries have seen their sovereign debt become a focus of trading scrutiny.
Like the traditional Gold Standard, the Information Standard is no panacea for the management of the global economy. Both impose restrictions on national political structures and policy decision-making. Politicians like Republican Ron Paul want to go back to the Gold Standard, but powerful forces are invested in the current global system.[3] Also, currency regimes are usually organized by dominant creditors, and that status is moving from the US to China.
Risks are inherent in both value regimes, but the Information Standard likely reflects a complexity that is more inclusive of a wider group of participants and further from the government/big-banker nexus that characterized the economic organization around gold. But the trillions of dollars in play in the system continue to create a chaotic turbulence of both bubbles and innovation that is highly disruptive to established, routinized lives and political structures. That is why it will be interesting to see if China has the capability to impose a new order of managed exchange rates and flows of capital.
Notes
[1] Walter Wriston went to work for Citicorp after World War II and eventually became CEO in 1969. The son of a university president, Wriston was often noted for his knowledge of international relations and diplomacy, and he championed the recirculation of OPEC dollars to developing countries as well as technological innovations such as certificates of deposit (CDs) and the ATM. The Tufts Digital Library contains the Walter Wriston Archives, which holds many of his speeches and published articles.
[2] Much of the discussion of the Information Standard has been about its disciplining of nation-states, as Wriston discussed in The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World. Wriston's interpretation of the Information Standard is organized around a rhetoric of assurance, not a critical analysis: the power of multinational corporations, nation-state dictatorships, and any aggregation of power antithetical to democratic prospects will fall to the sovereign power of the Information Standard.
[3] The size of banks would have to be considered in evaluating whether the Information Standard has explanatory or analytical power. The credit crisis of 2007 and its remedies significantly increased the power of major banks.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications. © ALL RIGHTS RESERVED
Tags: Bretton Woods > eurodollars > Glass-Steagall Act > gold > Information Standard > petrodollars > Third World Debt Crisis > Walter Wriston
Music’s Year of the Cloud
Posted on | January 12, 2012 | No Comments
I’m heading to Hawaii this January to talk about how cloud computing services and mobile technologies are influencing the music industry at The Pacific Telecommunications Council’s annual conference (PTC ’12: Harnessing Disruption: Global, Mobile, Social, Local).
I've had some help from my wife, who, when she is not marketing green tea for Ito-en NA, is a semi-professional hula dancer. She has been listening to Hawaiian music on her iPhone and uses a program called Pandora. This service streams personalized music to your computer or mobile phone. When you type in the name of an artist, band, composer, or just a song, it begins to build a playlist of the music you chose and even more music like it.
The last year has seen some interesting changes in the music industry with the maturation of cloud and mobile services. Established companies like Amazon, Apple, and Google continued to develop their music “digital locker” services that allow people to store their purchased music in a cloud server that they can access from multiple devices.[1] More intriguing are a number of new companies that have sprung up like MOG, Pandora, and Spotify that stream music like a personalized radio station.
The "cloud" is a metaphor for the massive data storage, processing, and transmission capabilities connected to the Internet and wireless services that can be accessed on mobile devices. It gets its name from the cloud-like symbol often used to represent the Internet in network schematics and flowcharts. These are the cloud computing technologies that power Google Docs and other "recombinant" software-based services and digital platforms for global e-commerce.
So we can point to two distinct cloud offerings for the music industry at this time. The first is the storage and streaming services such as Amazon's Cloud Drive and Player, Apple's iCloud, and Google Music. These services sell standardized MP3s or 256 Kbps AAC DRM-free copies and allow you to store albums and songs on their servers and access them through a variety of consumer devices (iPods, iPads, Droid phones, Samsung Blu-ray players, etc.) and computers. They have yet to reach licensing agreements with the major music labels to allow them to offer streaming of unpurchased music. Apple's iCloud specifically advertises itself as "Your content, on all your devices."
The other cloud services having a major impact on the music industry are the streaming subscription services such as MOG, Pandora, Rdio, and Spotify. They allow consumers to sign up to listen to streamed music they haven't personally bought. In general, they work off a "freemium" model where you can get free music with some advertising or a premium paid service with no advertisements.
These services generally use recommendation engines, software applications that build models of a user's preferences in order to suggest the types of songs they would like. These are used in a number of e-commerce platforms such as Amazon and Netflix. Recommendation engines are also updating their algorithms and increasingly utilizing social data points to develop referral systems connecting grids of personal contacts.
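As a rough illustration of how a recommendation engine works (a toy collaborative-filtering sketch with made-up data, not how MOG, Pandora, or Spotify actually implement their systems), the Python snippet below scores unheard tracks for a listener by weighting other listeners' play counts by their similarity:

import math

# Toy play-count data: listener -> {track: plays}  (entirely made up for illustration)
plays = {
    "ana":  {"Aloha Oe": 12, "Hi'ilawe": 7, "White Sandy Beach": 3},
    "ben":  {"Aloha Oe": 10, "Hi'ilawe": 5, "Ku'u Home O Kahalu'u": 8},
    "cory": {"White Sandy Beach": 9, "Ku'u Home O Kahalu'u": 2},
}

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two sparse play-count vectors."""
    shared = set(u) & set(v)
    dot = sum(u[t] * v[t] for t in shared)
    norm = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def recommend(user: str, k: int = 3) -> list:
    """Score tracks the user hasn't heard by the play counts of similar listeners."""
    scores = {}
    for other, their_plays in plays.items():
        if other == user:
            continue
        sim = cosine(plays[user], their_plays)
        for track, count in their_plays.items():
            if track not in plays[user]:
                scores[track] = scores.get(track, 0.0) + sim * count
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana"))  # e.g. ["Ku'u Home O Kahalu'u"]

Real systems layer far more data (skips, thumbs, playlists, social graph signals) on top of this basic similarity idea, but the principle of inferring taste from the behavior of similar listeners is the same.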
Both strategies will rely heavily on social commerce and its enhanced marketing, search, and transaction capabilities. Logging into MOG, for example, you will immediately be asked if you want to sign in using Facebook. That immediate access to some 800 million potential subscribers has resulted in extraordinary growth for MOG and its 13-million-song catalog. MOG collects the information you share not only to personalize music offerings but also to help you discover music through your contacts. Reducing search costs for reliable goods and services is an important part of social commerce's appeal. Music has always had a subjective dimension, where choices are based in part on the preferences of others. Artists also often operate as a kind of currency, paying the admission price for social acceptance. But how will artists and the music industry monetize music in this new cloud environment? Will consumers pay for the digital "lockers" that store their purchased music? Will they put up with the advertisements that crashed FM radio? Will they purchase subscriptions to stream personalized music? These are complex questions which will require careful scrutiny of all the new cloud offerings and the dynamics of the music industry.
Notes
[1] “Digital Music’s Cloud Revolution”, by Steve Knopper in Rolling Stone. Dec 22-Jan 5, 2012, p.16.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital economics, information systems management, and global communications. © ALL RIGHTS RESERVED
Healthcare IT and the American Recovery and Reinvestment Act of 2009
Posted on | December 11, 2011 | No Comments
Within weeks of his inauguration, President Obama signed the American Recovery and Reinvestment Act of 2009, abbreviated ARRA, a $787 billion stimulus package to help revive the ailing economy suffering from the Great Financial Crisis (GFC) that began in 2007.[1]
In 2008, the nation was teetering on the edge of economic ruin. Oil peaked at $147 a barrel in July, the highly leveraged global securitization and credit default scheme collapsed, and with it went the inflated housing market. The Dow Jones Industrial Average (DJIA) fell from its closing high of 14,164 on Oct. 9, 2007 to a low of 7,882 on Oct. 10, 2008, just a few weeks before the election.
The Act has been criticized by the Left as too small and by the Right as a gift basket for US Congress members. However, the legislation did push a number of technological initiatives that would change the future of IT in healthcare.
One of the most important initiatives designated $19.2 billion towards an interoperable, standards-based infrastructure for the secure exchange of electronic health care information and medical records (EHR or EMR) among doctors, hospitals, laboratories, pharmacies, and healthcare research facilities.
The initiative got its start in the Bush Administration with a 2004 executive order creating the Office of the National Coordinator for Health Information Technology. It became part of the Department of Health and Human Services (DHHS), headed by then-Secretary Michael O. Leavitt (see video above). The Office was only funded at $60 million a year, though, and enabling legislation never received congressional approval. The program engaged in only preliminary planning with its report, The Decade of Health Information Technology: Delivering Consumer-centric and Information-rich Health Care, which called for a ten-year plan to develop a Nationwide Health Information Network (NHIN) of health care providers.
This network would connect regional health information organizations (RHIOs) with regional health information exchanges (RHIEs), both of which would integrate clinical and public health data via electronic health record systems (EHR-S) with the goal of improving patient safety and delivering quality health care. It specified four objectives:
- Bringing information tools to the point of care, especially by investing in EHR systems in physician offices and hospitals.
- Building an interoperable health information infrastructure so that records follow the patient and clinicians have access to critical health care information when treatment decisions are being made.
- Using health information technology to give consumers more access and involvement in health decisions.
- Expanding capacity for public health monitoring, quality of care measurement, and bringing research advances more quickly into medical practice.[2]
With the Obama administration’s ARRA stimulus program, the diffusion of health information technology and the protection of medical records’ privacy and security were legislatively codified. Notably, the Health Information Technology for Economic and Clinical Health Act (HITECH) secured the national coordinator position and office. HITECH also provided some $2 billion for discretionary spending, primarily for grants and loans to implement health-related information and communications technologies.
In addition to rushing money into the economy, the Act established two related federal advisory committees to address healthcare ICT. One committee addressed standards to design a system of networked and interoperable electronic health records. The other committee developed policies to protect patient privacy and security. Together they worked with the private sector and consumer groups to develop the specifics of a health information network. The network permitted the ready exchange of certified electronic health records and other data while protecting patient privacy.[3] Much of the support was offered through "immediate funding" via federal agencies and grants to states, including a loan program to help providers purchase EHR systems as well as related training and technical support.
Most significantly, the HITECH Act allocated $17.2 billion in Medicare and Medicaid financial incentives for physicians and hospitals to implement Electronic Health Record (EHR) systems. Participating physicians could earn between $44,000 and $64,000 over the following five years for utilizing an electronic record system, provided they made "meaningful use" of the EHR installation.
Meaningful use included implementing certified EHR technologies with electronic prescribing capability that met Department of Health and Human Services (HHS) guidelines; connectivity to other healthcare providers, giving interoperable access to a patient's health history; and the capability to report to HHS on how the technology was being used and its effectiveness, including fewer errors, clinical decision-making support, alerts, and other reminders.
Postscript
Whether the “stimulus package” saved the country from the economic abyss of the “Great Recession” will require a historical accounting of its positive economic impact on employment, GDP, and inflation. The economic stimulus package was often derided as a failure, even as unemployment fell from 8% to 5% by the end of the Obama administration.
Likewise, its influence on health care will require examining costs, mortality counts, patient care, and the populace’s overall well-being. As the 2012 elections for US President heated up, “Obamacare,” the Patient Protection and Affordable Care Act (PPACA) got most of the attention. The American Recovery and Reinvestment Act of 2009 propelled several technological initiatives, and its impact on the deployment of health information and communications technologies deserves a separate analysis.
Citation APA (7th Edition)
Pennings, A.J. (2011, Dec 11) Healthcare IT and the American Recovery and Reinvestment Act of 2009. apennings.com https://apennings.com/enterprise-systems/healthcare-it-and-the-recovery-act-of-2009/
Notes
[1] The American Recovery and Reinvestment Act of 2009. H.R.1.
[2] Quoted from “DHHS Office of National Coordinator for Health Information Technology (ONC).” Public Health Data Standards Consortium PHDSC – Promoting Standards Through Partnerships. Web. 11 Dec. 2011.
[3] Steinbrook, Robert, M.D. "Health Care and the American Recovery and Reinvestment Act." N Engl J Med 360 (2009): 1057-1060. 12 Mar. 2009. Web. 11 Dec. 2011.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, media economics, and global communications. © ALL RIGHTS RESERVED
Tags: ARRA > Electronic Health Records (EHR) > Health Information Technology for Economic and Clinical Health Act > Healthcare IT > Obamacare > RHIO