Anthony J. Pennings, PhD

WRITINGS ON AI POLICY, DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL E-COMMERCE

The FCC’s First Computer Inquiry

Posted on | March 6, 2011 | No Comments

Any use of a communications carrier to connect remote terminals to computers brought the system into the realm of the Federal Communications Commission (FCC). By virtue of the Communications Act of 1934, all telecommunications traffic needed to be regulated by the FCC. As early computers were being developed, this task of regulating computer communications was neither welcomed nor clearly understood by the FCC, but the number of computer linkages was increasing rapidly as businesses discovered that the computer could not only process information but could also transfer it to other computers.

The number of “online” systems increased from about 30 in 1960 to over 2,300 by 1966. The number of data terminals was estimated to have climbed from 520 in 1956 to some 45,663 in 1966. Innovations such as time-sharing allowed many people to use the same computer via telecommunications. “Computer applications, in other words, had come to depend more upon communication links, including privately owned microwave hookups, dedicated leased lines, and regular common carrier circuits.”[1] Consequently, the FCC invited computer users to describe their current utilization of computers and telecommunications and how they expected these systems to be used in the future. The following is a list of some of the corporations and trade associations that responded:

Aetna Life and Casualty Co.
American Bankers Association
American Business Press, Inc.
American Petroleum Institute
Credit Data Corp.
Eastern Airlines
Lockheed Aircraft Corp.
McGraw-Hill, Inc.
Société Internationale de Télécommunications Aéronautiques (SITA)
United Airlines.

This initial study became known as the FCC’s First Computer Inquiry. It marked the first attempt by a government regulatory agency in the U.S. to address the policy implications of the new computer systems utilizing telecommunications and, specifically, the emerging needs and concerns of their users.[2] By the time the First Computer Inquiry got under way in 1966, over half the installed computer systems were located in corporate industries such as airlines, banking, computer service bureaus, and manufacturing.[3]

Comments to the First Computer Inquiry from the American Bankers Association indicated that automated banking services and electronic funds transfer systems were going to grow very rapidly. The physical transfer of checks was being replaced by information transport over wire and radio. Already over 1,000 banks throughout the country had computers on their premises, and another 2,000 used commercial data processing bureaus. Banks were keenly interested in computerizing functions such as account reconciliation, correspondent bank services, internal accounting, labor distribution, and professional billing.

It became clear that users were developing sophisticated computer systems that were going to be increasingly dependent on data communication services. Aetna Life and Casualty stated that telecommunications services for businesses were going to be “the limiting factor in the planning of new business systems.” The First Computer Inquiry was probably the most important reevaluation of telecommunications policy since the New Deal. While this period also included the AT&T antitrust case, which regulated the giant company out of the computer business, the First Computer Inquiry addressed new concerns. For the first time, corporate needs for telecommunications systems to facilitate their computerization processes were addressed.

Much of the debate focused on the capabilities of the status quo telecommunications network and its tariff structure. Business users charged that the telephone industry and its networks were simply not up to the task of providing the sophisticated technical services they required. They complained to the FCC that they needed new types of equipment and services from the common carriers to maximize the use of their computers. The technical standards that had emerged in the regulated telegraph and telephone systems differed from those required by computer systems. Corporate users wanted to interconnect their facilities, whether they owned them or just leased them, with common carrier facilities.

Of particular concern were the prohibitions against the use of foreign attachments such as modems on the telephone network. Users were also pressing the FCC to allow the shared use and resale of common carrier equipment and services. As computers were still quite slow, full capacity was not always reached on a circuit. Users wanted to attach their own private equipment to the AT&T network in order to combine the data and send it over a single line. For example, a bank might want to purchase multiplexing equipment that could draw in the data from a number of computers and retransmit the information through a single telecommunications line leased from a carrier to a receiving facility. They also wanted the carriers to provide telecommunications circuits conditioned to sufficient quality so that they could transmit information with a high degree of accuracy and efficiency. The First Computer Inquiry began to reveal the future importance of new communications services and the limitations of the current network infrastructure.[4]
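The multiplexing arrangement described above is easy to illustrate in modern terms. The following is a minimal sketch in Python, my own illustration rather than period hardware or any actual tariffed offering: several slow terminal streams are interleaved, round-robin, onto one shared line.

    # Illustrative sketch only (modern Python, not 1960s hardware): a simple
    # round-robin time-division multiplexer that interleaves chunks from
    # several slow terminal streams onto one shared line.
    from collections import deque

    def multiplex(streams, frame_size=8):
        """Yield (source_id, chunk) pairs in round-robin order.

        streams maps a source id to the bytes that source has queued. On each
        turn, every source that still has data contributes one frame_size
        chunk, roughly how a multiplexer shared a single leased circuit.
        """
        queues = {
            src: deque(data[i:i + frame_size] for i in range(0, len(data), frame_size))
            for src, data in streams.items()
        }
        while any(queues.values()):
            for src, q in queues.items():
                if q:
                    yield src, q.popleft()

    # Hypothetical example: three bank branches sharing one leased line.
    for source, chunk in multiplex({"branch_a": b"account reconciliation data",
                                    "branch_b": b"correspondent balances",
                                    "branch_c": b"payroll records"}):
        print(source, chunk)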

That AT&T became one of the biggest obstacles to the development of data communications can be attributed to at least four factors. First, the number of computers in operation was quite small, and they were not yet perceived as popular devices, even for businesses. Computers were used almost exclusively by government and large corporate users, and the need to connect them via telephone lines was uncertain. Second, the demand for voice communications was quite large. The government’s concern for universal service meant that regulated telephone companies had to address the issues involved in wiring wide geographic areas for domestic telephone use. Third, the Bell System was adamant that customers be restricted from attaching non-telephone terminal equipment to the network. The telephone grid and its electrical signals were fairly fragile, and their safety and clarity were seen to be at risk if unsanctioned equipment could be connected. Fourth, the Consent Decree of 1956 had restricted AT&T’s entry into the computer field, so it had very little incentive to address issues affecting this market. Despite the growing need, AT&T’s reluctance to upgrade its technical facilities and tariff structure came largely from the fact that data communication traffic volumes simply did not compare with voice. AT&T was structured by investment and regulation to provide voice services, which were growing at post-war rates faster than previously anticipated. So not only was the market for computer communications small and difficult to forecast, but AT&T had its hands full just handling voice communications.

The huge monopoly was very defensive at the time about user demands to liberalize the equipment that could be connected to the telecommunications network. Initially, only the standard black rotary telephone could be attached, and it was illegal to connect any non-Bell equipment to the network, including computer terminals and data communications devices such as modems. Responding to corporate requests, AT&T did make “data sets,” or modems, available after 1962, allowing data transmission rates of 1,200 to 2,400 bits per second over special communication lines acquired by dialing. By the mid-1960s, rates up to 250,000 bits per second were available.[5] Still, users were uneasy with the situation.
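A back-of-the-envelope calculation (my own illustration; the payload size is hypothetical) shows what those raw line rates meant in practice:

    # Rough illustration of what those line rates implied, ignoring framing,
    # parity, and retransmission overhead.
    def transfer_seconds(payload_bytes, bits_per_second):
        return payload_bytes * 8 / bits_per_second

    payload = 100_000  # a hypothetical 100 KB batch of records
    for rate in (1_200, 2_400, 250_000):
        print(f"{rate:>7} bit/s: about {transfer_seconds(payload, rate):,.0f} seconds")
    # Roughly 667 s at 1,200 bit/s, 333 s at 2,400 bit/s, and 3 s at 250,000 bit/s.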

It was the burgeoning mobile telephone industry that broke the network’s monopoly on connecting outside equipment. With the FCC’s Carterfone decision of 1968, equipment other than that designed by the Bell Telephone Company could finally be attached to the network. The ruling allowed a small radiotelephone company, Carter Electronics Corporation, to link its mobile radio system to the AT&T public switched network. It set a precedent, as the FCC concluded that any customer who desired to use an interconnecting device should be able to, as long as it did not “adversely affect the telephone company’s operations or the telephone system’s utility for others….”[6] The FCC’s decision allowed a large industry to grow as computers, decorator phones, modems, network monitoring equipment, and entire telephone exchanges began to be connected to the network. This action would allow individual corporations to design and implement an entire telecommunications facility outside the network and would set the legal foundation for long-distance interconnect companies to be able to link to local telephone companies.[7]

Another aspect of the phone company’s reluctance to provide telecommunications for computers was that, despite the fact that Bell Labs had designed the transistor and other seminal computer technology, it was effectively restricted from entering that field. The Consent Decree obtained by the Department of Justice in 1956 barred AT&T from competing in unregulated areas, including computers and international data services. AT&T was directed to focus exclusively on domestic common carriage, which meant not selling computers and not building telecommunications systems in other countries. Despite extraordinary computer developments at Bell Labs, including the invention of the transistor, data communication capabilities did not register as a top priority for AT&T.

These issues did not go unaddressed when the FCC released its First Computer Inquiry conclusions. In its final decision, the FCC opted not to regulate data communication and processing services. While legally the FCC could regulate any communication, it decided to retreat from major intervention in the face of such strong corporate pressure. While computers were not as sexy as they would be in the age of the Internet, they nonetheless were proving too useful for corporations to allow the enactment of dubious regulations. What the FCC did do in its final First Computer Inquiry decision was to make distinctions between communications networks and “online” time-sharing computer systems. The computer users had argued that, despite the Communications Act of 1934, the FCC could only regulate the communications business and should not be regulating, in effect, the business of other industries. The Aerospace Industries Association even argued that “they were no more subject to regulation under the Act than the voices of telephone users.” As a result, the term “online” subsequently achieved a high rate of circulation, as it was used to linguistically protect computer networks from possible FCC regulation.[8]

While the distinctions proved to be arbitrary and did not last long, the FCC’s position can be clearly discerned. Corporations were beginning to need computer communications as an integral part of financial, organizational, and productive coordination. While AT&T was a crucial part of the nation’s telecommunications infrastructure, it was becoming evident that the long-term development of the nation’s corporate growth environment would require an efficient computer-communications system. Criticism of the telecommunications structure was increasing, and the dominant role of AT&T was being challenged. “Owing to the rapidly escalating dependence upon communications by U.S. industry and finance, a single company–AT&T–was in practice materially affecting (some said dictating) the pace and direction of corporate strategy in areas far removed from its traditional communications offerings.”[9] Regulation, or the lack of it, was to be used to promote the growth and stability of U.S. economic development, despite the antagonistic struggle with its largest corporation.
Notes

[1] Dan Schiller’s work on early computerization policy is one of the most authoritative. See his (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 22.
[2] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 23.
[3] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 28.
[4] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, pp. 29-30.
[5] Phister, M. (1979) Data Processing Technology and Economics. Santa Monica, CA: Digital Press. p.76.
[6] Martin, J. Telecommunications and the Computer. p. 37.
[7] Pool, I. (1983) Technologies of Freedom. Cambridge, MA: Harvard University Press. p. 247.
[8] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 38.
[9] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 42.

Why AT&T Invented and Shared the Transistor that Started the Digital Revolution

Posted on | March 5, 2011 | No Comments

The invention of the transistor in 1947 provided an extraordinary capability to control an electrical current. It was initially used to amplify electronic signals and then to switch the 1s and 0s needed for digital computing. An unlikely scenario unfolded in the 1950s when AT&T’s fear of government anti-trust action and regulation prompted it to share this seminal technology with other companies. This self-serving altruism led to the “solid-state” electronics revolution and many silicon semiconductor innovations that rapidly developed computerized information technology.

The transistor emerged from the research efforts of AT&T, the corporate behemoth formed by JP Morgan and guided by US policy to become the nation’s primary telecommunications provider. In 1913, AT&T settled its first federal anti-trust suit with the US government. The agreement established the company, starting with Alexander Graham Bell’s technology, as an officially sanctioned monopoly. A document known as the Kingsbury Commitment spelled out the new structure and interconnection rules in return for AT&T divesting its controlling interest in telegraphy powerhouse Western Union.

Both companies had a history of consolidating their market domination through patent creation or purchase. For example, AT&T purchased the patents for the De Forest vacuum tube amplifier in 1915, giving it control over newly emerging “wireless” technologies such as radio and transatlantic radiotelephony, as well as any other technology that used the innovation to amplify electrical signals. Patents, as government-sanctioned barriers to entry, created huge obstacles for other competitors and effectively barred them from producing and using anything close to the restricted technology.

As AT&T grew more powerful, it established Bell Telephone Laboratories Inc. (Bell Labs) in 1925 as a research and development subsidiary. Fed by AT&T’s monopoly profits, Bell Labs became a virtual “patent factory,” producing thousands of technical innovations and patents a year by the 1930s. One of its major challenges was to find a more efficient successor to the vacuum tube.

A breakthrough occurred when William Shockley, PhD, the director of transistor research at Bell Labs, worked with fellow PhDs John Bardeen and Walter Brattain to create the “Semiconductor amplifier; Three-electrode circuit element utilizing semiconductive materials.” The transistor’s inception dates to December 23, 1947, at Bell Labs’ facilities in Murray Hill, New Jersey.

At the time, AT&T’s famed research facility employed nearly 6,000 people, 2,000 of whom were engineering and research professionals.[1] The development of the transistor was not a result of just basic research; it was the result of an all-out attempt to find something to replace the vacuum tube. In any case, the government’s lawsuit meant that AT&T would tread lightly with this new invention lest it raise additional concerns about Ma Bell’s monopoly power.[2]

After World War II, the US Justice Department filed another anti-trust lawsuit against AT&T. In 1949, it sought the divestiture of Western Electric, AT&T’s equipment-manufacturing arm. The action came after, although not necessarily because of, the telephone company’s invention of the transistor, an electronic device that regulated the flow of electricity through a small cylindrical device. It operated much like the vacuum tube, but the transistor was “solid-state”: easier to use, more reliable, and much smaller. Faster to react, less fragile, less power-hungry, and cooler-running than glass vacuum tubes (which had to “warm up” to operating temperature), it was ideal for a wide variety of electronic devices.

Unlike its previous history of zealously controlling or acquiring patents (including the vacuum tube) dealing with its telephone network, AT&T decided to license the new technology liberally. It did not want to antagonize the Justice Department over a technology it did not fully understand and did not yet know how to implement commercially. Moreover, some Bell Labs employees were already jumping ship with the technology, and the anti-trust action was an indication that any patent infringement cases would be difficult to pursue in court.

So in 1951 and 1952, Bell Labs put on two symposia revealing all of its information on the transistor. The first was for government and military officials only, while twenty-five American companies and ten foreign companies attended the second. All were required to put up $25,000 as “a down-payment on a license.” Sensing the potential of the new device, the Department of Defense awarded a number of multi-million dollar contracts for transistor research. General Electric, Raytheon, RCA, and Sylvania, all major vacuum tube makers, began working with their transistor licenses on military applications. AT&T’s Western Electric, for example, found in the Department of Defense an immediate market for nearly all of its transistors.[3] AT&T’s fear of the government’s anti-trust threat resulted in an extraordinary diffusion of the century’s most important technology.

In the mid-1950s, the US government made a fateful decision regarding the semiconductor industry’s future when it ruled on Western Electric’s status. In 1956, the Justice Department let AT&T hold on to its manufacturing subsidiary under two conditions. First, it restricted the telephone company from computer-related activities except for sales to the military and for its own internal purposes, such as telephone switching equipment. Second, AT&T was required to give up its remaining transistor patents.[4] As a consequence of the government’s pressure, the nascent semiconductor industry was released from the control of the monolithic telephone company.

Three licensees in particular, Motorola, Texas Instruments, and Fairchild, took advantage of AT&T’s transistor technology. Each procured valuable government contracts to refine the electronic switching technology and increase its reliability. The government contracts also helped them develop sophisticated manufacturing techniques to mass-produce transistors. Two political developments, the nuclear arms race with the USSR and the goal of landing on the Moon, became essential for advancing the transistor technology that would propel an electronics revolution and lead to significant advances in computer technologies.

In 1956, William Shockley, John Bardeen, and Walter Brattain were awarded the Nobel Prize in Physics for their discovery of the transistor.

Citation APA (7th Edition)

Pennings, A.J. (2011, Mar 05). Why AT&T Invented and Shared the Transistor that Started the Digital Revolution. apennings.com https://apennings.com/how-it-came-to-rule-the-world/why-att-invented-and-shared-the-transistor-that-started-the-digital-revolution/

Notes

[1] Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 33.
[2] Herbert Kleiman quoted on AT&T and basic research in Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 36.
[3] Dirk Hanson’s The New Alchemists (NY: Avon Books) provides a good introduction to the beginnings of the transistor market, pp. 80-82.
[4] Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 34.


Seeing from Space: Cold War Origins to Google Earth

Posted on | March 2, 2011 | No Comments

President Eisenhower had been secretly coordinating the space program as part of the Cold War since the early 1950s. He had become accustomed to the valuable photographic information obtained from spy planes and considered satellites a crucial new Cold War technology. The D-Day invasion of Europe, which he had managed as the head of the Allied Forces, had been meticulously reconnoitered with low- and high-altitude photography from a variety of reconnaissance aircraft.

When his administration took office in early 1953, tensions with Communist countries were increasing rapidly, and his “New Look” policy identified aerospace as a decisive component of future US military strategy. Given the growing nuclear capacity of the USSR, he particularly wanted satellites that could assess how rapidly the Communists were producing their long-range bombers and nuclear ballistic missiles, as well as where they were being stationed.[1]

Rocketing into the “High Ground”

Consequently, after the Sputnik debacle, Wernher von Braun, the German captive/immigrant who led the US Army’s rocket program, was cleared to launch the Jupiter-C, a proven military rocket modified to carry a satellite designed by the Jet Propulsion Laboratory (JPL). On January 31, 1958, America’s first satellite launcher lifted off, carrying the 10.5-pound Explorer 1 into orbit. The Soviet Sputnik satellite had established the precedent of overflight, freeing up orbital paths above sovereign national spaces.

Later that year, the National Aeronautics and Space Administration (NASA) was created, and within a week, Project Mercury was approved to place a human into orbit.[2] While the motivations for human cargo were numerous, one significant reason was that rocket thrust capability was still limited and more was needed to place heavier payloads into space. Initially this meant cameras and other sensing devices, but of course thermonuclear devices were still very heavy. In order to develop intercontinental ballistic missiles, much work was needed on increasing the transport and guidance capabilities of the rocket launcher. Initially, however, the Eisenhower administration was concerned with the surveillance possibilities that the “high ground” offered.

Upon the successful flight of America’s first satellite launcher, the Corona spy satellite program was initiated. Operating under the name Discoverer, the highly covert program was started to put a series of satellites designated Keyhole (KH) into low earth orbits (LEO). Outfitted with 70mm cameras, KH-1 and its early successors failed to achieve orbit or suffered other technical failures.[3] By the late summer of 1960, however, a capsule containing the first film stock was retrieved in mid-air by an Air Force cargo plane as it parachuted back down to earth.

The Keyhole satellites could see where the US spy planes could not. The U-2 high-altitude spy plane could fly at altitudes of up to 70,000 feet and photograph military installations and movements, but it was in increasing danger as Soviet radar and surface-to-air missiles improved. The satellites could cover a wider area from much safer altitudes.

The limits of the U-2 were highlighted infamously on May Day 1960, when U-2 pilot Gary Powers was shot down over Soviet territory and captured. Eisenhower, thinking Powers was dead and the plane destroyed, downplayed the incident as a probable technical failure on a weather plane. The incident proved highly embarrassing and provided a major international public relations boost for Khrushchev when the largely intact remains of the spy plane were paraded by the Soviets in front of the international press, along with images of Powers.

The Powers incident provided a strong motivation for the Corona satellite surveillance program, whose film capsules were snatched from the air by US aircraft. The retrieved film contained images from locations deep inside the USSR where the U-2 couldn’t reach. These new pictures of the USSR, while not immediately as clear as the U-2 pictures, relied on techniques for taking and interpreting images that were constantly improving, and they proved the worth of the satellites.

Contemporary Developments

The National Reconnaissance Office (NRO) emerged as one of the most secret intelligence services in the US and is part of the Department of Defense. It now has its own website and offers overhead intelligence services to a number of government agencies, warning of potential environmental and political trouble spots around the world and helping coordinate military operations.

In 2004 Google acquired Keyhole, Inc. for its Google Earth, Google Maps and Google Mobile operations. Keyhole had been formed in 2001 as a software development company specializing in geospatial data visualization applications to commercialize some of the Keyhole satellite data. They developed the Keyhole Markup Language (KML) for geographical annotation and visualization with Internet browsers.
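As a rough illustration of the kind of markup KML defines (this example is mine, not Keyhole’s, and the coordinates are approximate), a minimal document with a single placemark can be generated like this:

    # Minimal sketch of a KML document with one placemark. The structure
    # (kml/Placemark/Point/coordinates, longitude first) follows the KML
    # convention; the location below is approximate and purely illustrative.
    def placemark_kml(name, lon, lat):
        """Return a minimal KML document containing one placemark."""
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            "  <Placemark>\n"
            f"    <name>{name}</name>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>\n"
            "</kml>\n"
        )

    # Approximate site of the Explorer 1 launch at Cape Canaveral.
    print(placemark_kml("Launch Complex 26", -80.57, 28.44))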

Notes

[1] Recently declassified documents show a much more active President Eisenhower than the public had believed at the time.
[2] Winter, F. (1990) Rockets into Space. Cambridge, MA: Harvard University Press. pp. 70-80.
NASA was established on October 1, 1958, with the passage of the National Aeronautics and Space Act.
[3] The mishaps are said to be the inspiration behind the spy novel Ice Station Zebra, which was later made into a motion picture released in 1968.


My Top Social Media Books

Posted on | February 27, 2011 | No Comments

In general, I’m unhappy with the focus and range of books on social media. Still, some interesting works are worth mentioning. They are not in any particular order, as I haven’t really developed a useful taxonomy. I also don’t have extensive comments, since I link each book to its Amazon page, where you can read the reviews.

  • The Wealth of Networks: How Social Production Transforms Markets and Freedom, by Yochai Benkler. Opaque and intellectual, this book gets better with time. Its strengths are its discussions of democracy and the emergence of a networked public sphere.

  • Socialnomics: How Social Media Transforms the Way We Live and Do Business, by Erik Qualman. Readable and likable, it provides a good introduction to social media. A bit too anecdotal for my taste, but like I said, it’s a good start.

  • Web 2.0: A Strategy Guide: Business thinking and strategies behind successful Web 2.0 implementations, by Amy Shuen. I liked this book but found it a bit too cryptic for the undergraduate class I ordered it for. Still, it’s the book I go back to when I want more serious ideas about social media. Looking forward to the second edition.

  • Here Comes Everybody: The Power of Organizing Without Organizations, by Clay Shirky. If you want to know what Wael Ghonim meant when he said “Ask Facebook.”

  • Everything Is Miscellaneous: The Power of the New Digital Disorder, by David Weinberger. I read this traveling on the trains in Japan and use it in one of my Digital Media Management classes. A really good foundation on the different ways of organizing information and how digital technologies transform the whole area of information management.

  • The Social Media Bible: Tactics, Tools, and Strategies for Business Success, by Lon Safko. Buy the Kindle version; this book is big.

  • Social Media Metrics: How to Measure and Optimize Your Marketing Investment, by Jim Sterne. Numbers, deal with them.

  • PR 2.0: New Media, New Tools, New Audiences, by Deirdre Breakenridge. I plan to use this in my New Technologies for Advertising and Public Relations course at NYU this summer.

  • What Would Google Do?, by Jeff Jarvis. Google has a mixed record when it comes to social media hits, but with $28 billion in 2010 revenues, it is a force in itself.

  • The Long Tail: Why the Future of Business is Selling Less of More, by Chris Anderson. This is a classic in conceiving some of the economics of social media and blogging.


Media Content as a “Public Good”

Posted on | February 23, 2011 | No Comments

Media products are sometimes referred to as “public goods” because these products are not significantly “used up” in the process of consumption, and each additional user incurs little additional (marginal) cost. Unlike traditional products like a hamburger that is bitten into, chewed, swallowed, and digested, media content is generally not destroyed while being consumed. Although the timeliness of media content is often an issue, the consumption of media does not deplete it.

With traditional public goods like a road or a park, the costs associated with an additional driver on a highway or your child enjoying the playground are minimal. With media content as well, additional consumers are served without incurring much expense. An additional radio listener, for example, would impose only a negligible cost on the radio station. Likewise, one more click on a website would not cost the owner very much.

Economists like to talk about two characteristics of goods and services, called rivalry and excludability. Rivalry is sometimes called subtractability. When someone purchases a good that is highly rivalrous, it is consumed entirely by that person. These are also called private goods in that they are removed from collective or shared use. These are the ideal goods for economists because they are “well-behaved”: they fit nicely into economic models and are easily priced. A mobile phone almost always has a single owner until it breaks or becomes outdated.

Other goods, such as a visit to a water park, are less rivalrous. Labor is utilized and water evaporates, but for the most part the park remains essentially the same after the family packs up and goes home. Radio stations emit their programming with a signal that is not used up by a commuter listening to music (and ads) on the way home.

The other issue is excludability. A water park usually has enough fencing and security to keep those who do not pay away from its pools and rides. A beach would be more difficult, and many communities make sure that rich owners of beachfront properties do not restrict surfers, swimmers, and other beach users from accessing the beach. These “common goods” can be subject to congestion as more people use them.

The “nonexcludability” criterion of a public good takes into account the costs associated with keeping nonpayers out of the system. It is very difficult to keep a car off a road or a kid out of a park, just as it is not feasible to keep a viewer from watching a show on broadcast TV. An interesting case is the economics of knowledge, which is not consumed, but is it excludable? And what conditions would restrict others from acquiring it?
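Putting rivalry and excludability together yields the familiar two-by-two classification that economists use. Here is a minimal sketch; the category names follow common usage, and the examples echo the ones above:

    # Sketch of the standard rivalry-by-excludability classification; the
    # category names follow common economics usage and the examples echo
    # those used in the post.
    def classify_good(rivalrous, excludable):
        if rivalrous and excludable:
            return "private good"      # a hamburger, a mobile phone
        if rivalrous and not excludable:
            return "common-pool good"  # a beach that can get congested
        if not rivalrous and excludable:
            return "club (toll) good"  # a fenced water park
        return "public good"           # a broadcast TV signal

    for good, (riv, exc) in {
        "hamburger": (True, True),
        "public beach": (True, False),
        "water park": (False, True),
        "broadcast TV show": (False, False),
    }.items():
        print(f"{good}: {classify_good(riv, exc)}")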


Obama “Hustling” E-Commerce Exports

Posted on | February 19, 2011 | No Comments

President Obama recently spoke to the US Chamber of Commerce about the importance his administration was putting on exports and trade agreements. The event coincided with the release of a document on global e-commerce that discusses topics such as harmonizing commodity descriptions, export controls and regulation, free trade agreements, international payment systems, fraud warnings, and shipping methods.

      We know what it will take for America to win the future. We need to out-innovate, we need to out-educate, we need to out-build our competitors. We need an economy that’s based not on what we consume and borrow from other nations, but what we make and what we sell around the world. We need to make America the best place on Earth to do business.

      And this is a job for all of us. As a government, we will help lay the foundation for you to grow and innovate and succeed. We will upgrade our transportation and communication networks so you can move goods and information more quickly and more cheaply. We’ll invest in education so that you can hire the most skilled, talented workers in the world. And we’ll work to knock down barriers that make it harder for you to compete, from the tax code to the regulatory system.

The booklet, Preparing Your Business for Global E-Commerce: A Guide for Online Retailers to Manage Operations, Inventory, and Payment Issues, was published by export.gov, the administration’s portal for assisting international trade. Export.gov is managed by the International Trade Administration and helps coordinate the different agencies involved in global commerce, such as the Departments of Agriculture, Commerce, Energy, State, and Treasury, as well as the Ex-Im Bank and the Small Business Administration (SBA). The booklet gives detailed support on exporting goods and services.

The booklet’s author, Ken Walsh, an International Trade Specialist with the U.S. Commercial Service, an agency of the Department of Commerce, gives an audio interview on global e-commerce.

This is an example of how third parties can assist an e-commerce site in expanding to other countries.


Social Network Seminar – Translating Virtual Engagement into Political Reality

Posted on | February 17, 2011 | No Comments

CrossRoads Spring 2011:
Social Networks – Translating Virtual Engagement into Political Reality
Friday, February 18, 6-8:30pm
NYU Kimmel Center, 60 Washington Square South, Room 405

Speaker:
Andrew Noyes, Public Policy Communications Manager, Facebook (Washington, D.C.)
Interviewed by: Dr. Anthony Pennings, NYU-McGhee-DCoM

Andrew Noyes joined Facebook in 2009 after covering Capitol Hill, the White House, federal agencies, nonprofits, and think tanks as a member of the Washington press corps. In his current role, he nurtures relationships with policymakers, the press, and the public and explains how the social networking giant helps its more than 500 million users share in a more trusted environment; helps make the world more connected; and drives economic growth.

His areas of focus include expanding digital privacy protection through user control of data; enhancing cybersecurity and online safety; and protecting free speech. Previously, Andrew wrote for CongressDaily, Technology Daily, Communications Daily, and Washingtonian Internet Daily, where he specialized in intellectual property; Internet governance; antitrust and competition; and privacy and data security. He also launched and authored Tech Daily Dose, a popular blog on NationalJournal.com, and was a contributor to National Journal and Government Executive magazines. Over the course of more than a decade, Andrew also wrote news, business, and human-interest stories for a range of other publications, including Washington, Capitol File, DC Magazine, the Baltimore Sun, The Advocate, and more. Andrew has also served on the adjunct faculty at American University and has provided commentary for MSNBC, CBS, C-SPAN, National Public Radio, Federal News Radio, and other media outlets.

Dr. Anthony Pennings has been at the McGhee Division since 2002, where he started the BS in Digital Communications and Media and teaches a variety of courses dealing with the management and politics of digital media and information systems. He has a PhD in Political Science and wrote his dissertation on cyberpunk fiction and electronic money. He also has an MA in Communications, both from the University of Hawaii, where he was a Fellow at the East-West Center’s Institute of Culture and Communications doing research on computerization, media, and telecommunications issues in Asia. He got his first teaching position at Victoria University in Wellington, New Zealand, where he taught television production, but returned to his New York home in the mid-1990s after he heard about the World Wide Web. At Marist College he taught multimedia and web production, feeding the New York area with hundreds of skilled web producers during the dot-com era. Last month he was awarded a Fulbright Fellowship to teach digital media economics in Japan and revise his book on computerization in Asia.

Mechthild Schmidt joined DCoM in Fall 2003 after a career as an animator and art director in broadcast, advertising, and interactive media. A Clinical Associate Professor, she has initiated and run the CrossRoads series since 2004.


Multimedia and Multiple Intelligences

Posted on | February 13, 2011 | No Comments

When I was teaching at Victoria University in New Zealand, I was invited to give a keynote address for a distance learning conference at Massey University. I chose to draw on Howard Gardner’s theories of multiple intelligences and connect them to multimedia. I always thought that the Harvard psychology professor had developed a framework that was useful (and admittedly quite obvious) for guiding multimedia innovations while recognizing the diversity of learning styles among different people. Gardner, who is the John H. and Elisabeth A. Hobbs Professor of Cognition and Education at the Harvard Graduate School of Education, developed the following list of intelligences: linguistic, musical, spatial, logical-mathematical, interpersonal, intrapersonal, and bodily-kinesthetic.

Here is a short video introducing the multiple intelligences by Chris Warren.

I want to elaborate on his ideas with my own thoughts in some future postings and, more importantly, start to make connections between specific tools and the types of intelligences they can help develop. Here is a preliminary list of my thoughts.

Linguistic – Certainly one of the implications of the web was the return to reading, and it seems as if the emergence of the iPad is a remediation of the magazine with an enhancement of its interactive aspects. Recently I put a crossword puzzle on my website, although mainly as an antidote to any TV watching. But we can expect that multimedia will continue to develop programs, especially for kids, to help understand and use words, to develop story-telling and poetic capabilities, and to develop other word and reading games.

Visual-Spatial – It was in New Zealand that I played id Software’s Doom for the first time. Its first-person capability was an extraordinary new experience for navigating a virtual space that simulated a real-world environment, and I remember playing it for 12 hours straight (my memory was enhanced because I had to walk past a graveyard at 4 am). Spatial understanding has been augmented by a number of multimedia tools, including charts, maps, and other types of 3-D modeling, although it remains to be tested what impact they have on visual-spatial intelligence.

Bodily-Kinesthetic – Nintendo’s Wii and Microsoft’s Kinect are two of the newer consumer products that expand the visual-spatial dimensions of multimedia with an enhanced sense of body awareness and coordination. Wii games can involve waist control with a virtual hula hoop and develop eye-hand coordination by playing tennis or baseball in your living room. One of the games we have at home is The Beatles: Rock Band on my Xbox, which is teaching my 6-year-old how to drum like Ringo Starr.

Musical – While the early trend was to intellectualize music and “coagulate” music skills into keyboards and computer programs, games on the Xbox console and other devices are creating ways to enhance musical appreciation and skill acquisition.

Interpersonal – What is social media if not, hopefully, the development of better interpersonal skills? Does texting enhance interpersonal skills? Chatting online? Skyping? Collaboration tools enhance group learning and cooperative work. Perhaps it is no wonder that Howard Gardner’s foray into the digital world has been through an exploration of ethics. The proliferation of the smartphone lately has raised concerns that people are withdrawing from face-to-face interactions.

Intrapersonal – Media have always been used for self-exploration through tools like diaries, blogs, and biofeedback devices. Also, self-directed learning can be a tool for intrapersonal discovery. The trend in online education continues to be towards using digital multimedia because of its opportunities for unique and personalized experiences.

Logical-Mathematical – Although this still tends to be the preferred mental mode of modern life, it is challenged by the entertainment culture that media themselves have helped to promote. Economic competition from Asia and other parts of the world has renewed calls for strengthening the educational curriculum to enhance this type of intelligence, and multimedia can play a part.

My interest in computers was first stimulated by a game on a system called PLATO that simulated an automobile racing track: each student could propel their car forward by answering math questions such as 49 + 59. A correct answer would propel the car forward a certain distance. I was amazed by the enjoyment and enhanced attention given by the students as each tried to win the race by answering the questions correctly and quickly.
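A toy reconstruction of that racing-quiz mechanic (my own sketch, not the original PLATO program) might look like this:

    # Toy reconstruction of the racing-quiz mechanic described above; each
    # correct sum moves the player's car one step down the track.
    import random

    def race(track_length=10):
        position = 0
        while position < track_length:
            a, b = random.randint(10, 99), random.randint(10, 99)
            try:
                answer = int(input(f"{a} + {b} = ? "))
            except ValueError:
                answer = None
            if answer == a + b:
                position += 1
                print(f"Correct! Your car advances to {position}/{track_length}.")
            else:
                print(f"Not quite; it was {a + b}. The car stays put.")
        print("You win the race!")

    if __name__ == "__main__":
        race()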

