Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

How IT Came to Rule the World, Digital Monetarism

Posted on | April 22, 2010 | No Comments

This is the 14th post in the mini-series How IT Came to Rule the World

    I have directed Secretary Connally to suspend temporarily the convertibility of the dollar into gold or other reserve assets except in amounts and conditions determined to be in the interests of monetary stability and in the best interests of the United States.
    – President Richard Nixon on national TV, August 15, 1971

This morning, as I was coming home from my daughter’s Kindergarten class, I had to traverse the security for President Obama’s talk at Cooper Union on financial reform. With all the talk of derivatives and financial technologies, I thought I would move on to Regime Two of this series. Almost as soon as the combination of computer technologies, satellites, and data communications became available, they began to be integrated into the financial field.

The regime of digital monetarism emerged in reaction to containment capitalism and involved releasing the powers of finance capital and creating transnational flows of electronic information, news, and money. The containment regime was state-centric and involved elaborate regulatory structures to suppress domestic and international flows of money and to channel excessive economic surpluses into government activities.

While the idea of a regime is meant to be fluid, digital monetarism can be given a start date: On August 15, 1971, President Nixon shocked the world with his surprise television announcement that he was “closing the gold window” and no longer honoring the redemption of dollars for gold. Too many dollars had been created by the international financial system, and they threatened the gold stockpiles at Fort Knox. Meanwhile, Nixon was preoccupied with economic battles with Japan and West Germany while also wanting to “print” more money to pay for the Vietnam War. After convening a secret meeting at Camp David among his top advisors, Nixon decided to renege on the country’s international agreements on dollar convertibility created at the Bretton Woods conference and sought to free up the US so it could create a more fluid regime of currency transactions and capital flows.

Nixon’s decision came at a unique time in computer and communications history. Silicon “chips” had recently become commercially available, with integrated circuits being used in minicomputers and the revolutionary “microprocessor” on the verge of widespread availability in “microcomputers” such as the Altair and the Apple II. The FCC had been holding hearings on computer communications since 1966 in order to facilitate public-switched data networks, and timesharing systems were in regular use. ARPANET was primitive, but online, and about to become the Internet with the development of the TCP protocols. An international satellite communications system had recently become available, and Nixon himself had made the longest phone call in history to the first astronauts on the Moon in 1969. Furthermore, the Nixon administration created the Office of Telecommunications Policy (OTP) in 1970, which was quite active in a number of telecommunications deregulatory measures that opened up AT&T’s network and allowed companies such as MCI to compete. The time was ripe for an explosion of new technologies in the financial field.


Four Ways to Think about Democracy and Media

Posted on | April 18, 2010 | No Comments

I was lucky to have a great mentor during my graduate school days who provided a strong intellectual foundation for further study and research. Majid Tehranian, a professor of the political economy of communications at the University of Hawaii, was the chair of my Master’s thesis committee and also assisted me by serving on my PhD committee. Although he did his PhD at Harvard on OPEC issues, he turned his focus towards communications and information technologies during the 1980s and later towards peace studies.

Professor Tehranian (everyone called him “Majid”) was working on a book called Technologies of Power: Information Machines and Democratic Prospects during my graduate days and I even had the pleasure of doing the index for it – the old fashioned way (by hand).

As a framework to develop our understanding of communication and media issues related to concepts of democracy and development, he laid out four different ways people were writing and theorizing about them. He used a recognizable color scheme to distinguish their different polarities: Blue – Liberal Democracy, Red – Communist Democracy, Green – Ecological Democracy, and Black – Counter-Democratic tendencies. These areas are contentious, dynamic, and open to a lot of debate, but they served effectively as intellectual “hangers” on which struggling graduate students could organize and “hang” their knowledge.

When our Department of Digital Communications and Media Studies at NYU decided to put on an event exploring engagement and the media, I decided to review the book and use it for my presentation as a way of starting a conversation about democracy and possibilities presented by social media.

Notes from my presentation for the ENGAGED MEDIA Colloquium at NYU on April 17, 2010

FOUR Ways of Thinking about Democracy and Social Media
Anthony J. Pennings, PhD
Dept of Digital Communications and Media Studies
New York University

Blue Democracy

  • Liberal capitalism
  • Views democracy in terms of the expansion of individual liberty through the processes of pluralization of the economy, society and polity
  • Representative government
  • Fears bureaucratization and tyranny of the majority
  • Citizens stay informed through the Fourth Estate – the objective mass media – so that they can vote effectively and make their voices known to their representatives
  • Key Words
    pluralism, markets, free enterprise, balance of power, liberty, separation of powers, property rights, division of labor, interest groups, freedom of speech, assembly, and association; Fourth Estate; modernization, two-party system, privacy, ego, high accumulation

Red Democracy

  • Communist democracy
  • Views democracy in terms of the mobilization of collective will in the service of equality
  • Origins stem from the critique of capitalism and its social implications
  • Fears the tyranny of the minority
  • Media is the instrument of the government to facilitate class struggle

  • Key Words
    mobilization, collective will, labor, class struggle, modernization, State, progress, alter-ego, intelligentsia, lumpen-proletariat, Marxism, single-channel flows, censorship, party control, critical, equality

Green Democracy

  • Ecological or communitarian democracy
  • Views democracy in terms of local, non-violent, grassroots involvement in relationships among people and with the environment
  • Fears the destruction of the natural environment
  • Community, small media, direct democracy, MMP parliamentary system
  • Key Words
    grassroots participation, direct democracy, social justice, ecological wisdom, nonviolence, super-ego, decentralization, community-based economics, gender equality, fraternity, networking flows, diversity, think globally, act locally, future focus, sustainability, environment

Black* Counter-Democracy

  • Fascist dictatorship
  • Views democracy as a threat to the rightful order and/or identity
  • Generally involves: radical change, myths of ethnic or national renewal, and a conception of crisis
  • Tactics involve radical transformation, violent coups, putsch
  • Media flows are monolithic and ritualized, including theatrical propaganda
  • Key Words
    racial and/or ideological purity, social transformation, corporatist, purification, ethnic renewal, genetic traits, militarism, conspiracies, heroic past, political repression, “survival of the fittest”, anti-modernization, rebirth through power, acts of sabotage, marginalized social classes, ethnocentrism, marginals, fascist, totalitarian, order, identity, coups, putsch, pseudo-science, reactionary, authoritarian, monolithic media flows

Conclusion
    These are dynamic times and the stakes are high: globalization, climate change, dwindling resources (oil, water, fish, etc.), national debt. Yet, we have this extraordinary technological convergence. Mass media is our base, but what we call “social media” is giving us an unprecedented ability to converse, cooperate, collaborate, and engage. The challenge for the next few decades, and perhaps more immediately than you think, is to configure a democratic system that understands the strengths and weaknesses of the social media infrastructure and its communicative possibilities.

    * The color Black is, unfortunately, linked to Fascist tendencies, as well as more positive and progressive ethnic movements.

    © ALL RIGHTS RESERVED


    Anthony J. Pennings, PhD is a visiting professor at Hannam University in South Korea. He recently joined the Digital Media Management program at St. Edwards University in Austin TX, after ten years on the faculty of New York University.

    How IT Came to Rule the World, 1.8: Bell Labs and the Transistor

    Posted on | April 10, 2010 | No Comments

    This is the 13th post in the mini-series How IT Came to Rule the World


    The transistor provided an extraordinary capability to control an electrical current, which was initially used for amplifying electromagnetic frequencies and then for the switching of 1s and 0s needed for digital computing. An unlikely scenario unfolded in the 1950s when AT&T’s fear of government anti-trust action and regulation sparked the sharing of this seminal technology. This led to the solid-state electronics revolution and then to the silicon semiconductor innovations that enabled the rapid development of computerized information technology.

    The transistor emerged from the research efforts of AT&T, the corporate behemoth that was formed by JP Morgan and guided by US policy to become the nation’s primary telecommunications provider. In 1913, AT&T settled its first federal anti-trust suit with the US government. The agreement established the company, started with Alexander Graham Bell’s technology, as an officially sanctioned monopoly. A document known as the Kingsbury Commitment spelled out the new structure and rules of interconnection in return for AT&T divesting its controlling interest in telegraphy powerhouse Western Union.

    Both companies had a history of consolidating their market domination through patent creation or purchase. For example, AT&T purchased the patents for the De Forest vacuum tube amplifier in 1915, giving it control over newly emerging “wireless” technologies such as radio and transatlantic radiotelephony, as well as any other technology that used the innovation to amplify electrical signals. Patents, as government sanctioned barriers to entry, created huge obstacles for other competitors and effectively barred them from producing and using anything close to the restricted technology. As AT&T grew more powerful, it established Bell Telephone Laboratories Inc. (Bell Labs) in 1925 as a research and development subsidiary. Fed by AT&T’s monopoly profits, Bell Labs became a virtual “patent factory”, producing thousands of technical innovations and patents a year by the 1930s. One of its major challenges was to find a more efficient successor to the vacuum tube.

    After World War II, the US Justice Department filed another anti-trust lawsuit against AT&T. In 1949 it sought the divestiture of Western Electric, AT&T’s equipment-manufacturing arm. The action came after, although not necessarily because of, the telephone company’s invention of the transistor, an electronic device that regulated the flow of electricity through a small cylinder. It operated much like the vacuum tube, but the transistor was “solid-state”: easier to use, more reliable, and much smaller as well. It worked by reducing the voltage while maintaining a strong current, ideal for a wide variety of electronic devices.

    The transistor’s inception dates to December 23, 1947 at Bell Labs’ facilities in Murray Hill, New Jersey. At the time, AT&T’s famed research facility employed nearly 6,000 people, with 2,000 being engineering and research professionals. The development of the transistor was not a result of just basic research; it was the result of an all-out attempt to find something to replace the vacuum tube. In any case, the government’s lawsuit meant that AT&T would tread lightly with this new invention lest it raise additional concerns about Ma Bell’s monopoly power.

    Unlike its previous history of zealously controlling or acquiring any patents (including the vacuum tube) dealing with its telephone network, AT&T decided to liberally license out the new technology. It did not want to antagonize the Justice Department over a technology it did not fully understand or know how to implement commercially. But some Bell Labs employees were already jumping ship with the technology, and the anti-trust action was an indication that any patent infringement cases would be hard to defend in court.

    So in 1951 and 1952, Bell Labs put on two symposia revealing all of its information on the transistor. The first was for government and military officials only, while twenty-five American companies and ten foreign companies attended the second. All were required to put up $25,000 as “a down-payment on a license.” Sensing the potential of the new device, the Department of Defense awarded a number of multi-million-dollar contracts for transistor research. General Electric, Raytheon, RCA, and Sylvania, all major vacuum tube makers, began working with their transistor licenses on military applications. AT&T’s Western Electric, for example, found in the Department of Defense an immediate market for nearly all its transistors. AT&T’s fear of the government’s anti-trust threat resulted in an extraordinary diffusion of the century’s most important technology.

    In the mid-1950s the US government made a fateful decision regarding the future of the semiconductor industry when it ruled on Western Electric’s fate. In 1956, the Justice Department decided to let AT&T hold on to its manufacturing subsidiary under two conditions. First, it restricted the telephone company from computer-related activities except for sales to the military and for its own internal purposes, such as in telephone switching equipment. Second, AT&T was also required to give up its remaining transistor patents. As a consequence of the government’s pressure, the nascent semiconductor industry was released from the control of the monolithic telephone company.

    Three licensees in particular, Motorola, Texas Instruments, and Fairchild, took advantage of AT&T’s transistor technology. Each procured valuable government contracts to refine the electronic switching technology and increase its reliability. The government contracts also helped them develop sophisticated manufacturing techniques so they could mass-produce transistors. In particular, two political developments, the nuclear arms race with the USSR and the goal of landing on the Moon, became important for advancing the transistor technology that would propel an electronics revolution and lead to major advances in computer technologies.



    Anthony J. Pennings, PhD is the Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii in the 1990s.

    Some Economics of Social Media

    Posted on | April 8, 2010 | No Comments


    There is a dynamism to the Internet and its World Wide Web that is truly extraordinary. While the reasons for their success are complex, we can start by looking at some economic factors, both market and nonmarket. Both seem to be highly relevant to the latest series of technologies and applications known as social media.

    It can be argued that social networking is one of the original “killer apps” of the Internet. Email and instant messaging were early examples, and the combination of capabilities such as searching, linking, tagging, commenting, voting, and authoring has allowed new platforms like Facebook, Twitter, and Wikipedia to emerge, while empowering a wide number of other websites to enhance their communication infrastructure. I say communication because while social media often involves media content, it adds a new dimension by giving users a voice, either through user-generated content and/or the ability to critique, review, and comment on that content.

    So what are the economic factors driving the social media phenomenon?

    • Declining computing and storage costs are giving people access to the “means of production”. Computers, smartphones, and video cameras, as well as audio, video, and word editing software, are now available at relatively affordable costs. Cloud computing services, which make possible online services such as Flickr and Gmail, have also aided the social media movement. Within the context of the Internet, these technologies have become more than consumer items; they are the capital goods of the social media economy. They are the goods that make other goods.
    • The resources that are used are primarily informational – cultural, political, artistic. These need to be “mined and processed” in ways that are still best done by human labor. Social and cultural production is a delicate operation requiring unique human sensitivities and creativity.
    • The products of social media have an unusual cost structure: once produced, the marginal cost of additional copies is minimal. A tweet can go to four or four thousand people without incurring much additional cost.
    • The “link economy” of the Internet’s hypertext environment connects like ideas and makes them searchable and accountable. Combined with the “network effects” of the web (the network becomes more valuable as each person joins), the link economy has had a major influence on the information economics of the web by making it extraordinarily easy and historically cheap to find the information you are searching for.
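    Two of the claims in the bullets above can be sketched numerically: the collapsing average cost per copy, and network value growing faster than membership. The sketch below is a toy illustration, not anything from the post itself; the cost figures are invented, and Metcalfe’s law (value proportional to the number of possible pairwise connections) is just one common formalization of network effects.

    ```python
    def avg_cost_per_copy(copies: int, fixed_cost: float, per_copy: float = 0.0001) -> float:
        """Fixed production cost spread over all copies, plus a tiny
        per-copy distribution cost (both figures hypothetical)."""
        return fixed_cost / copies + per_copy

    def metcalfe_value(n: int) -> int:
        """Potential pairwise connections among n users: n*(n-1)/2."""
        return n * (n - 1) // 2

    # Spreading a hypothetical $1,000 production cost: the average cost
    # per copy collapses as the audience grows.
    for copies in (10, 10_000, 10_000_000):
        print(f"{copies:>10,} copies -> avg cost ${avg_cost_per_copy(copies, 1000):.6f}")

    # Value grows faster than membership: 10x the users, roughly 100x the connections.
    print(metcalfe_value(100))   # 4950
    print(metcalfe_value(1000))  # 499500
    ```

    The contrast between the two functions is the point: costs divide by the audience while potential connections multiply with it, which is the combination the bullets describe.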

    The convergence of these economic trends has resulted in social media being embraced for both e-commerce and non-market activities. It is a powerful combination that is not only enhancing e-commerce operations but giving support to all manner of social movements.


    How IT Came to Rule the World, 1.7

    Posted on | April 3, 2010 | No Comments

    This is the 12th post in the mini-series How IT Came to Rule the World

    Within weeks of the first landing on the Moon, the foundation of the Internet was created. Government-sponsored projects implemented the theories of data communications and created the first packet-switching and packet-broadcasting network, called the ARPANET.

    ARPA subcontracted the design and creation of the network to a small company called BBN, an important part of the emerging “revolving door” for engineers and scientists between academia, government, and industry. Then the University of Hawaii’s Aloha System provided fascinating new possibilities for wireless data communications between mobile units and for satellite packet communications (and soon led to Ethernet LANs).

    The problems encountered in reconciling these different data transmission systems operating in different networks led to the Internetting Project and the development of a new data communications protocol that would link different computers operating on different computer networks.

    Vint Cerf talks about his role in the creation of the TCP protocol and its implications for the global Internet.





    Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

    How IT Came to Rule the World, 1.6

    Posted on | April 2, 2010 | No Comments

    This is the 11th post in the mini-series How IT Came to Rule the World

    1.6 Further Cold War tensions in the 1960s sparked additional innovation in the microelectronics industry as the MAD (Mutually Assured Destruction) defense policy transformed California’s “Silicon Valley” into the center of the military’s miniaturization revolution. A combination of Congressional politics and industrial economics led to the shift of electronics research and production from the US East Coast to the West Coast. Minuteman missiles utilized transistors developed by Bell Labs and then commercialized by Western start-ups who created the small silicon-based computing “chips” for their guidance systems. Combined with NASA’s Gemini and Apollo projects, these programs created the first major markets for integrated circuits, or ICs, a crucial innovation for computing. NASA’s Apollo Guidance Computer (AGC) was the first computer to use the new innovation. ICs combined several transistors on a single silicon chip and required extensive oversight and support from the government to ensure the high levels of quality needed for manned space flights and the guidance of intercontinental thermonuclear missiles.


    Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.

    How IT Came to Rule the World, 1.5: ARPA and NASA

    Posted on | March 31, 2010 | No Comments

    This is the tenth post in the mini-series How IT Came to Rule the World

    After the USSR shocked the world in 1957 with its Sputnik satellite, the US took two major actions that would converge later in the modern Internet as well as a wide range of other technologies, including the microprocessor, the personal computer and eventually the smartphone.

    First, it formed the Advanced Research Projects Agency (ARPA) within the Department of Defense (DoD) to establish a US lead in science and technology applicable to the military. ARPA drew first on the legacy of computer and data communications development at MIT and other locations. It contracted with a small company, Bolt, Beranek and Newman (BBN), to build the first packet-switching network, called the ARPANET. ARPA also supported the Aloha System at the University of Hawaii, which created a wireless packet broadcasting system that was used to connect computers among the different islands and then via satellites. Later, ARPA seeded the formation of computer science departments and research institutes around the country that led to the development of the graphic user interface (GUI) and other technological innovations, including the Internet.

    The other, more popularly known government action was the creation of the space program. The US and USSR, both of which plundered the Nazis’ V-2 rocket program at the end of World War II, engaged each other in a “space race” to control the skies. Facing a new type of global warfare, President Eisenhower desperately wanted to establish the “high ground”, and his New Look policy identified aerospace as the military’s highest priority. In 1958, he created the National Aeronautics and Space Administration (NASA), and within a week, Project Mercury was approved to place a human into orbit. In the early 1960s, newly elected President John F. Kennedy energized NASA by calling for a “man on the Moon by the end of the decade”; in the same set of speeches, he called for America’s leadership in international communications. One of the primary goals of the space program was to place communications satellites into orbit. Additionally, one of the most important by-products of the space race was the miniaturization of electronic circuitry into the technology that would integrate transistors into the microprocessor.

    Together these actions would form the institutional foundation for the ARPANET, the precursor of the modern Internet. Reacting to the threat of the USSR’s space endeavors, the US would mobilize its technological and human resource capabilities for a new chapter of the Cold War. Later, the “Star Wars” Strategic Defense Initiative (SDI) would help the ARPANET transform into the NSFNET to connect the “artificially intelligent” computers needed for the space defense system. It was the NSFNET that was privatized to create the commercial Internet.





    Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, St. Edwards University in Austin, Texas and Victoria University in Wellington, New Zealand. From 2002-2012 he was on the faculty of New York University. During the 1990s he was also a Fellow at the East-West Center in Honolulu, Hawaii.

    How IT Came to Rule the World, 1.4: SAGE and Early Electronic Computing

    Posted on | March 28, 2010 | No Comments

    This is the ninth post in the mini-series How IT Came to Rule the World

    The expansion of Communism and a set of US policies to contain it strengthened the regime of containment capitalism. Growing fears of nuclear war provided the motivation and rationalization for massive investments in US defense initiatives. The National Security Act of 1947, signed by President Truman on July 26, 1947, provided the legislation for an era of permanent war mobilization, extensive investment in new arms, and the growth of highly funded covert and overt activities to keep Communism in check and under constant surveillance.

    The revolution in China and the detonation of the hydrogen bomb by Russia strengthened the resolve of the US to maintain its military might and capitalize on advances made in jet propulsion, rocketry, radar, and computing. Billions of US government dollars subsidized the creation of a computer and data communications infrastructure by funding the development of a hemispheric defense system to protect North America from Russian bombers.

    The SAGE (Semi-Automatic Ground Environment) system, conceived at MIT and built at IBM’s Poughkeepsie, New York facilities, helped transform the computer from a bulky, slow, vacuum-tube-switched numerical processor into a generalized, software-driven, transistor-enabled, media-enhanced computer with an accompanying communications system able to send digital data over telephone lines. The IBM AN/FSQ-7 computer combined data from remote radar sites into a rough, real-time representation of the horizon’s airspace. In cooperation with Canada, the US built a Distant Early Warning (DEW) Line from Alaska through the Canadian Arctic down to Long Island, New York (including an installation at Stewart Airport in Newburgh). Built primarily by the Bell System and combined with the IBM computers, the SAGE system was the precursor to the NORAD computer defense system built deep in Colorado’s Cheyenne Mountain.

    But more importantly, SAGE was the foundation for the modern computer industry. It helped establish MIT’s expertise in computing, helped IBM’s transition to electronic computers, and allowed AT&T to set up long-distance data communications lines. It also helped create many new companies that would make important innovations. By the 1960s, IBM and the others, such as Burroughs and Honeywell, would establish electronic computing as a viable business and financial tool.



    Anthony J. Pennings, PhD is a professor at the State University of New York, South Korea offering degrees from Stony Brook University and the Fashion Institute of Technology (FIT). Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. His PhD from the University of Hawaii was supported by the East-West Center.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.