Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Viral Marketing and Network Effects

Posted on | April 4, 2011 | No Comments

Hotmail was one of the first companies to capitalize on network effects when founders Sabeer Bhatia and Jack Smith began to offer free Web-based email accounts in the summer of 1996. Previously, people accessed their email by logging in through a PC or mainframe terminal at a university, corporation, or ISP. Built with Hypertext Markup Language, or HTML (hence the capitalization "HoTMaiL"), the service let a person access their email account from any web browser connected to the Internet. What was also extraordinary about Hotmail was the growth strategy it adopted to sign up new users.

The idea came from Tim Draper, a venture capitalist who first wanted “P.S. I love you. Get your free Web-based email at Hotmail” at the bottom of every email message. After a rigorous debate, the company dropped the “P.S. I love you”, but they added Hotmail’s URL so the email message’s recipient could click on the link and go directly to the website where they could sign up for a free account with 2 MB of storage. In six months, they had a million registered users. When it was sold to Microsoft for $400 million in December of 1997, it boasted 8.5 million subscribers and as part of the MSN grew to 30 million by mid-1999. Something extraordinary had happened.
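
The mechanism itself was almost trivially simple, as this toy sketch illustrates (the tagline text paraphrases the one described above and is illustrative only): the service appended a signup pitch and link to every outgoing message.

```python
# Toy sketch of Hotmail-style signature marketing.
TAGLINE = "Get your free Web-based email at Hotmail: http://www.hotmail.com"

def add_promo_footer(body: str) -> str:
    """Append the promotional signup pitch to an outgoing message."""
    return body + "\n\n--\n" + TAGLINE

print(add_promo_footer("See you at lunch on Tuesday."))
```

Every message a user sends becomes an advertisement carrying the sender's implicit endorsement, at essentially zero marginal cost to the company.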

It was around the time of the Hotmail sale that Steve Jurvetson began to write about some of the unique characteristics of Internet companies. Netscape had asked him to profile companies for its corporate newsletter, "The M-Files," and had a particular interest in companies that were using its browser in unique ways. Brainstorming with Draper, Jurvetson came up with the term "viral marketing" after rejecting alternatives like "geometric marketing" and "tornado marketing." As Eric Ransdell wrote, Jurvetson began to peruse the medical books of his wife Karla, a psychiatrist, and was drawn to the sneeze as a way to examine the dynamics of viral activity: a sneeze can spew out millions of droplets, and infectious particles like viruses can spread to many people if it happens in a crowd. The Internet provides the crowd, and the right viral message can "infect" millions of people. Ransdell writes:

    Suddenly, the principle behind viral marketing seemed so easy to understand. In this new world, companies don’t sell to their customers. Current customers sell to future customers. In exchange for a free service, customers agree to proselytize the service. Because recipients of Hotmail messages are almost always friends, relatives, or business acquaintances of the sender, the marketing message is that much more powerful. Each email carries an implied endorsement by someone who the recipient knows.
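
The epidemic analogy can be made concrete with a toy compounding model (all numbers here are hypothetical): if each user sends some messages per cycle and a fraction of recipients sign up, the user base multiplies every cycle.

```python
def viral_users(initial_users, messages_per_cycle, conversion_rate, cycles):
    """Toy viral-growth model: every cycle, each user's outgoing
    messages convert some recipients into new users."""
    users = float(initial_users)
    for _ in range(cycles):
        users += users * messages_per_cycle * conversion_rate
    return int(users)

# Hypothetical numbers: 100 seed users, 10 messages per user per
# cycle, 25% of recipients sign up -> the base grows 3.5x per cycle.
print(viral_users(100, 10, 0.25, 6))  # → 183826
```

Even modest per-message conversion compounds quickly, which is how a million registered users in six months becomes plausible.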

Google drew on the Hotmail experience for Gmail but took a somewhat different tack. Starting in 2004, it began beta-testing its free, advertising-supported, web-based email service. Its strategy also drew on network effects, although it added an aura of exclusivity, partly because of the beta nature of the product: instead of simply signing up for a Gmail account, new subscribers had to be invited.

Statistics on email are hard to find and somewhat unreliable, but starting with the 1,000 or so initial invitees in March of 2004, Gmail has become one of the top four web-based email clients, alongside Windows Live Hotmail and AOL Mail and behind leader Yahoo! Mail, which has over 270 million subscribers. Microsoft's Outlook (renamed Windows Mail) was the most used email client until the rise of Apple's iPhone and iPad email capabilities. While Windows Live Hotmail probably still has more subscribers, Gmail has proved its value in the US based on Internet usage.

While the viral phenomenon is about rapid diffusion, network effects are about value: a network becomes more valuable to each user as more users join it. Network effects have been another main driver of web-based services and businesses, and the ubiquity of the Internet has made it extremely valuable for all involved. With that type of connectivity, new viral strategies are being implemented to capture the value of network effects on the web and other digital devices. I'm looking forward to Igor Shoifot's new book on viral marketing. He promises us at least 101 reasons to buy it.
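
The value side is often summarized by Metcalfe's heuristic: the number of possible pairwise connections in a network grows roughly with the square of the number of users, so each new signup makes the service more valuable to everyone already on it. A minimal illustration:

```python
def possible_connections(n_users):
    """Metcalfe-style count of distinct user pairs: n*(n-1)/2."""
    return n_users * (n_users - 1) // 2

# Doubling the user base roughly quadruples the potential connections:
print(possible_connections(1_000_000))  # 499999500000
print(possible_connections(2_000_000))  # 1999999000000
```

The linear cost of adding a user against this superlinear growth in potential value is why free accounts, like Hotmail's and Gmail's, can be a rational business strategy.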



© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society at SUNY Korea. Previously, he taught at St. Edward's University in Austin, Texas and was on the faculty of New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Honolulu, Hawaii during the 1990s.

Developing Apps for Apple’s Mobile Devices

Posted on | March 25, 2011 | No Comments

While Android mobile devices like my Droid X are certainly gaining ground, Apple products still lead the pack in popularity, if not user satisfaction, among mobile products. The key to Apple's mobile devices is the iOS operating system, originally developed for the iPhone but now the standard for an array of Apple products including Apple TV, the iPhone, the iPad, and the iPod Touch. You can see a wide variety of applications developed for these devices at the Apple App Store.

From idea to application

iOS is notable for its user interface. Apple made history by commercializing the graphical user interface developed at Xerox PARC for the Apple Macintosh, and it continues to push development in this area with the haptic, or touch, user interface. Apple likes to call this the "Human Interface," and it outlines several Human Interface Principles that potential developers should take to heart.

  • Aesthetic Integrity: Does the design integrate well with the desired functionality?

  • Consistency: Does a design follow the iOS guidelines so that it fits the user's expectations based on other Apple applications?

  • Direct Manipulation: Do users operate directly on onscreen objects and get immediate visual or haptic feedback?

  • Feedback: Do users get immediate acknowledgment of their actions and assurances that a process is occurring or has occurred?

  • Metaphors: Does the interface communicate directly and assuredly with the user through metaphoric symbols?

  • User Control: Does decision-making stay with the user as much as possible? Can they terminate an operation effectively and quickly?

Here Apple co-founder Steve Wozniak talks about the importance of the human in the user interface.

The iOS Human Interface Guidelines are a useful way to evaluate user interfaces and can suggest ways to shape your ideas into a well-designed application. The next step is to start reading the iOS Application Programming Guide, which will help you prepare for coding and actually building the application.

Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications. © ALL RIGHTS RESERVED

Social Media: Some Thoughts on Curriculum

Posted on | March 22, 2011 | No Comments

Social media are sets of Internet and mobile platforms and tools that facilitate meaningful exchanges and value creation among individuals, groups, and organizations in both the commercial and public spheres. While the earliest tools included blogs, bookmark sharing, forums, podcasts, tagging, and wikis, new applications available through platforms like Facebook, Hulu, Second Life, Twine, and Xbox Live suggest that the creative capacities of social media are only beginning to be explored.

Social media has been embraced by users and organizations, both commercial and non-commercial. It is used around the world to connect with friends and families, share information about concerns and interests, and mobilize others in activities. Enterprises are increasingly using social media techniques to engage people in their discussion forums about services and products and to get customers to be active agents in promoting their brand and offerings. Non-profits and grassroots campaigns have embraced social networking to influence election results, interest people in public policy discussions, and facilitate social change through democratic involvement.

Social media has been part of our curriculum for the last several years. I redesigned the foundation courses for the BS in Digital Communications and Media degree in 2005, including two courses that cover social media: the Digital Media Management I and II series, taught by Igor Shoifot of Fotki.com (he now lives near Silicon Valley and teaches for UC Berkeley), and Collaboration Technologies, which has been taught online by Kristen Sosulski. Last year I developed some ideas for an MS in Social Media and, more recently, a 2-credit course that would provide an introduction to the promises and perils of social media. The MS didn't quite fit into the mix here yet, but the small course, which seemed almost more difficult to conceptualize, will be taught in the summer of 2012. In any case, I thought I would share some ideas I had for developing a curriculum framework for teaching about social media.

These are the general areas that I think should be considered in a program on social media.

  1. New developments in social media technologies and techniques;
  2. Key communication and economic attributes that power this medium, including important metrics;
  3. How social media can be used as part of an organization’s communications strategy;
  4. Key skill sets and knowledge students can acquire for entrepreneurial innovation and employment in this area;
  5. Legal, privacy, and other unfolding social concerns that accompany this dynamic new medium;
  6. Issues of social change, citizen engagement and democratic prospects;
  7. Research implications of social media and the theorization and methodological skills needed to conceptualize research projects.


Google’s Competitive Advantages – Fixed Costs

Posted on | March 17, 2011 | No Comments

We've been reading The Curse of the Mogul: What's Wrong with the World's Leading Media Companies by Jonathan A. Knee, Bruce C. Greenwald, and Ava Seave in my Digital Media Management II class at NYU. The book challenges some of the key assumptions regarding the management of media companies, including the importance of brands, talent, and deep pockets. But its main point is to drive home the importance of a few key competitive advantages for media companies, which it treats primarily as barriers to other companies competing in or entering a firm's markets. These barriers include fixed costs, network effects, customer captivity, proprietary technologies, and government protection.

In this post, I want to focus on Google's fixed costs as a competitive advantage. Fixed costs are those that do not vary with production or sales levels, or in Google's case, with the amount of advertising sold. While fixed costs can certainly be a liability to a firm, they make it difficult for any challenger to compete without matching those expenditures or finding a way to be more efficient or transformative. (Google itself transformed the advertising business.)
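
The barrier works through simple arithmetic (the figures below are hypothetical, not Google's actual numbers): fixed costs are spread over volume, so the incumbent with the most traffic has the lowest average cost per unit.

```python
def average_cost(fixed_costs, variable_cost_per_unit, units):
    """Average cost per unit = fixed costs amortized over volume
    plus the per-unit variable cost."""
    return fixed_costs / units + variable_cost_per_unit

# Hypothetical figures: $1B of infrastructure spend and $0.001 of
# variable cost per query. An incumbent serving 100B queries a year
# pays about a cent per query; an entrant serving 1B pays about $1.
print(average_cost(1e9, 0.001, 100e9))  # incumbent
print(average_cost(1e9, 0.001, 1e9))    # entrant
```

An entrant must either match the fixed-cost outlay up front or accept a much higher cost per query, which is exactly the barrier-to-entry logic the book describes.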

If you are not very familiar with Google's business strategy, I would recommend reading Steven Levy's "Secret of Googlenomics: Data-Fueled Recipe Brews Profitability" on the Wired website. It provides an excellent introduction to the search behemoth's business model, primarily built around its AdWords and AdSense advertising businesses. Preliminary estimates for Google's 2013 revenues look to be around $58 billion.

Google has a number of advantages, perhaps foremost being the massive investments in its built infrastructure. Google’s mission of “organizing the world’s information” requires more than the most sophisticated “big data” software. It also necessitates huge investments in physical plant, particularly data centers, power systems, cooling technologies, and high-speed fiber optic networks.

Google has built up a significant global infrastructure of data centers (increasingly located close to cheap, green power), and connecting its storage systems, servers, and routers is a network of fiber optic switches. For example, the Eemshaven data center in the Groningen region of the Netherlands is the termination point for a transatlantic fiber optic cable. The US$770 million data center is also being built near a power plant, with contracts for other green energy providing an estimated 120 megawatts of cheap electricity. For the most part, the details on fixed costs are not readily available, as they are proprietary and represent trade secrets. However, aggregate figures for Google's fixed costs are informative.

Microsoft, of course, has the financial wherewithal to compete with Google. It has been investing in data centers for a number of cloud services, including Windows Live SkyDrive. With the expansion of Bing's challenge to Google in search, you can expect significant investment in a global infrastructure of server farms and communications links integral to a wide range of new advertising and e-commerce services. Whether Microsoft can catch up with Google remains to be seen, but a whole industry focused on data centers is emerging that will make them more efficient to run and more economical as investments.


The University of Hawaii and the History of Non-Linear Editing

Posted on | March 16, 2011 | No Comments

When I was working on my PhD at the University of Hawaii (UH), we became the first academic institution in the world to obtain an Avid non-linear editing (NLE) suite. The Avid NLE was purchased in early 1990 thanks to the vision and determination of Stan Harms and Dan Wedemeyer, professors in the Communication Department (now School of Communications), and Patricia Amaral Buskirk, who directed the department's Media Lab.

The Avid/1 Media Composer system, with a Macintosh IIfx, arrived in the spring of 1990 and was set up in the Media Lab. Patricia Amaral Buskirk put the system together and made it work. That was no easy task, as it required a good understanding of the Macintosh computer and the newest developments in digital technology, not to mention a lot of concentration and patience. We tried it out that summer with a program I used to run for the Hawaii Department of Education and the Departments of Communication and Journalism at UH called the SPEBE Center for Modern Media. It brought some of the brightest high school students from around the Hawaiian islands to the University of Hawaii for six weeks to give them an experience of college life. (Pictured on left with Jade Moon in red, a TV news broadcaster who taught for us.) Patricia Amaral Buskirk and Susan Gautsch trained the students on the Avid and helped them produce short videos on U.S. Presidents. That fall semester we used it in an International Communication class I was teaching to produce a short documentary on satellites.

The founder of Avid, Bill Warner, introduces a similar NLE suite:

Nonlinear video editing uses the power of the computer, adding a whole new range of editing power and flexibility to the visual story-making process (much like word processing changed the writing process). Still, that early system would be almost unrecognizable by modern standards. It displayed very low-resolution clips on the computer screen; the editor would create the narrative and, in the process, compile an edit decision list (EDL). The desktop computer, which was connected to traditional video cassette recorders, would then "auto assemble" the final video and record it back onto videotape.

According to Patricia Amaral Buskirk, “What that meant was we had the playback and record decks (with time code) controlled by the AVID and then we would pop in the original Source Tape when asked, and the system would edit (auto assemble) the final version (that was digitally edited on the computer) onto the tape in the Record deck.” Despite the state-of-the-art microprocessors and hard drives that came with the system, they couldn’t handle all the large high-resolution clips that contemporary NLE systems use.
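
The workflow Buskirk describes can be sketched as a data structure (a simplified illustration, not Avid's actual EDL file format): the offline edit produces an ordered list of source tapes with timecode in/out points, which then drives the tape-to-tape assembly.

```python
from dataclasses import dataclass

@dataclass
class Edit:
    source_tape: str
    source_in: str   # timecode "HH:MM:SS:FF"
    source_out: str

def auto_assemble(edl):
    """Simulate auto-assembly: for each edit, prompt for the source
    tape and 'record' the clip onto the master in order."""
    return [f"Load {e.source_tape}: record {e.source_in}-{e.source_out}"
            for e in edl]

# A two-cut edit decision list (hypothetical tape names and timecodes):
edl = [Edit("TAPE_01", "00:01:10:00", "00:01:22:15"),
       Edit("TAPE_03", "00:04:02:05", "00:04:09:00")]
for step in auto_assemble(edl):
    print(step)
```

The key insight of the design was that the creative decisions lived in this small list, while the heavy high-resolution video stayed on tape until the final pass.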

Successive upgrades and a fair amount of money over the years allowed for better and better resolution until the external card sets were not needed. Eventually, other professors started to use it, and of course, NLEs are now standard equipment for video post-production throughout higher education and in the film and television industries. But as far as we know, UH was the first university to obtain and teach non-linear editing.

Patricia Amaral Buskirk, who received an MFA in Film and Video Production at the Savannah College of Art & Design (SCAD), ran AB Productions, Inc., a full-service digital audio and video production company in Georgia, but has returned to the University of Hawaii faculty as an Associate Professor.


Social Media Entering Phoenix Stage?

Posted on | March 10, 2011 | No Comments

I thought this guy was interesting when I saw him on MSNBC's Morning Joe. He argues passionately that social media is in a stage much like the end of the dot-com era, when we didn't really know how to get returns on our investments in the Internet. He says we won't really see the true face of social media for several more years.

Just like 2000? Well, that assumes a current overinvestment in social media, and some evidence points that way, particularly the money pouring into Facebook. We are approaching the eleventh anniversary of the famous dot-com crash, when the investments in so many Internet companies did not produce revenue streams significant enough to justify their stock prices. In the spring of 2000, the tech market began to falter on the NASDAQ stock exchange. A shakeout occurred, and many companies like Pets.com fell by the wayside. Others, like Amazon.com, survived, and e-commerce has grown steadily since the crash. Now social media is in the spotlight, but can it continue on its current trajectory?

I ordered the Kindle version of his book The Thank You Economy yesterday and had a quick look. As expected, it is a business book. That's not bad in itself, but in the era of the Middle East uprisings, the technology is showing itself to be more multidimensional than ROI can ever capture.


On a related note, my students were quite complimentary about the transformation Domino's has gone through. Apparently the outgoing CEO didn't want his legacy to be the worst pizza ever. Now, I live in New York City, so good pizza is not scarce. But everyone should have access to good pizza. Give them a look.


The FCC’s First Computer Inquiry

Posted on | March 6, 2011 | No Comments

Any use of a communications carrier to connect remote terminals to computers brought the system into the realm of the Federal Communications Commission (FCC). By virtue of the Communications Act of 1934, all telecommunications traffic needed to be regulated by the FCC. As early computers were being developed, this task of regulating computer communications was neither welcomed nor clearly understood by the FCC, but the number of computer linkages was increasing rapidly as businesses discovered that the computer could not only process information but could transfer it to other computers.

The number of “online” systems increased from about 30 systems in 1960 to over 2300 by 1966. The number of data terminals was estimated to have climbed from 520 in 1956 to some 45,663 in 1966. New innovations such as time-sharing allowed many people to use the same computer via telecommunications. “Computer applications, in other words, had come to depend more upon communication links, including privately owned microwave hookups, dedicated leased lines, and regular common carrier circuits.”[1] Consequently, the FCC sent out an invitation to computer users to describe their current utilization of computers and telecommunications and how they expected these systems to be used in the future. The following is a list of some of the corporations and trade associations that responded:

Aetna Life and Casualty Co.
American Bankers Association
American Business Press, Inc.
American Petroleum Institute
Credit Data Corp.
Eastern Airlines
Lockheed Aircraft Corp.
McGraw-Hill, Inc.
Societe Internationale de Telecommunications Aeronautique (SITA)
United Airlines

This initial study became known as the FCC's First Computer Inquiry. It marked the initial attempt by a government regulatory agency in the U.S. to address the policy implications of the new computer systems utilizing telecommunications and, specifically, the emerging needs and concerns of their users.[2] By the time the First Computer Inquiry got under way in 1966, over half the installed computer systems were located in corporate industries such as airlines, banking, computer service bureaus, and manufacturing.[3]

Comments to the First Computer Inquiry from the American Bankers Association indicated that automated banking services and electronic funds transfer systems were going to grow very rapidly. The physical transfer of checks was being replaced by information transport over wire and radio. Already over 1000 banks throughout the country had computers on their premises, and another 2000 used commercial data processing bureaus. Banks were very concerned about computerizing bank functions such as account reconciliation, correspondent bank services, internal accounting, labor distribution, and professional billing.

It became clear that users were developing sophisticated computer systems that were going to be increasingly dependent on data communication services. Aetna Life and Casualty stated that telecommunications services for businesses were going to be "the limiting factor in the planning of new business systems." The First Computer Inquiry was probably the most important reevaluation of telecommunications policy since the New Deal. While this period also included the AT&T antitrust case, which regulated the giant company out of the computer business, the First Computer Inquiry addressed new concerns. For the first time, corporate needs for telecommunications systems to facilitate their computerization processes were addressed.

Much of the debate focused on the capabilities of the status quo telecommunications network and its tariff structure. Business users charged that the telephone industry and its networks were simply not up to the task of providing the sophisticated technical services they required. They complained to the FCC that they needed new types of equipment and services from the common carriers to maximize the use of their computers. The technical standards that had emerged in the regulated telegraph and telephone systems differed from those required by computer systems, and corporate users wanted to interconnect their facilities, whether owned or leased, with common carrier facilities.

Of particular concern were the prohibitions against attaching "foreign" equipment such as modems to the telephone network. Users were also pressing the FCC to allow the shared use and resale of common carrier equipment and services. Because computers were still quite slow, a circuit's full capacity was not always used, and users wanted to attach their own private equipment to the AT&T network in order to combine data streams and send them over a single line. For example, a bank might want to purchase multiplexing equipment that could draw in the data from a number of computers and retransmit the information through a single telecommunications line leased from a carrier to a receiving facility. Users also wanted the carriers to provide telecommunications circuits conditioned to sufficient quality that information could be transmitted through them with a high degree of accuracy and efficiency. The First Computer Inquiry began to reveal the future importance of new communications services and the limitations of the existing network infrastructure.[4]
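
The bank's request amounts to time-division multiplexing, which can be sketched as follows (a simplified illustration; real multiplexers of the era worked on bit or character streams in hardware): chunks from several slow data sources are interleaved onto one leased line and separated again at the receiving end.

```python
def multiplex(streams, chunk_size=4):
    """Round-robin fixed-size chunks from each stream onto one 'line',
    tagging each chunk with its channel number."""
    line, offset = [], 0
    while any(offset < len(s) for s in streams):
        for channel, s in enumerate(streams):
            line.append((channel, s[offset:offset + chunk_size]))
        offset += chunk_size
    return line

def demultiplex(line, n_channels):
    """Reassemble the original streams from the tagged chunks."""
    streams = [""] * n_channels
    for channel, data in line:
        streams[channel] += data
    return streams

# Three 'computers' share one leased line and their data arrives intact:
inputs = ["AAAAAAAA", "BBBBBBBB", "CCCCCCCC"]
assert demultiplex(multiplex(inputs), 3) == inputs
```

The economic point is the same one the corporate users made to the FCC: one leased line at full utilization is far cheaper than several idle ones.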

That AT&T became one of the biggest obstacles to the development of data communications can be attributed to at least four factors. First, the number of computers in operation was quite small, and they were not yet perceived as popular devices, even for businesses. Computers were used almost exclusively by government and large corporate users, and the need to connect them via telephone lines was uncertain. Second, the demand for voice communications was quite large. The government's concern for universal service meant that regulated telephone companies had to address the issues involved in wiring wide geographic areas for domestic telephone use. Third, the Bell system was adamant that customers be restricted from attaching non-telephone terminal equipment to the network. The telephone grid and its electrical signals were fairly fragile, and their safety and clarity were seen to be at risk if unsanctioned equipment could be connected. Fourth, the Consent Decree of 1956 had restricted AT&T's entry into the computer field, so it had very little incentive to address issues affecting this market. Despite the growing need, AT&T's reluctance to upgrade its technical facilities and tariff structure came largely from the fact that data communication traffic volumes simply did not compare with voice. AT&T was structured by investment and regulation to provide voice services, which were growing at post-war rates faster than previously anticipated. So not only was the market for computer communications small and difficult to forecast, but AT&T had its hands full just handling voice communications.

The huge monopoly was very defensive at the time about user demands to liberalize the equipment that could be connected to the telecommunications network. Initially, only the standard black rotary telephone could be attached; it was illegal to connect any non-Bell equipment, such as computer terminals and data communications devices like modems, to the network. Responding to corporate requests, AT&T did make "data sets," or modems, available after 1962, allowing data transmission at 1,200 to 2,400 bits per second over special dial-up communication lines. By the mid-1960s, rates up to 250,000 bits per second were available.[5] Still, users were uneasy with the situation.
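
Those line speeds translate into modest throughput. Assuming 8 bits per character and ignoring framing overhead, the arithmetic looks like this:

```python
def chars_per_second(bits_per_second, bits_per_char=8):
    """Rough character throughput of a serial data line."""
    return bits_per_second // bits_per_char

print(chars_per_second(1_200))    # 150
print(chars_per_second(2_400))    # 300
print(chars_per_second(250_000))  # 31250
```

Even the fastest service of the mid-1960s moved only a few tens of kilobytes per second, which helps explain why users fought so hard to share and condition the circuits they could get.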

It was the burgeoning mobile telephone industry that broke the network's monopoly on connecting outside equipment. In the FCC's Carterfone Decision of 1968, equipment other than that designed by the Bell Telephone Company could finally be attached to the network. The ruling allowed a small radiotelephone company, Carter Electronics Corporation, to link its mobile radio system to the AT&T public switched network. It set a precedent, as the FCC concluded that any customer who desires to use an interconnecting device should be able to, as long as it did not "adversely affect the telephone company's operations or the telephone system's utility for others...."[6] The decision allowed a large industry to grow as computers, decorator phones, modems, network monitoring equipment, and entire telephone exchanges began to be connected to the network. It would let individual corporations design and implement entire telecommunications facilities outside the network and would set the legal foundation for long-distance interconnect companies to link to local telephone companies.[7]

Another aspect of the phone company's reluctance to provide telecommunications for computers was that, despite the fact that Bell Labs had designed the transistor and other seminal computer technology, AT&T was effectively restricted from entering that field. The Consent Decree obtained by the Department of Justice in 1956 barred AT&T from competing in unregulated areas, including computers and international data service. AT&T was directed to focus exclusively on domestic common carriage, which meant not selling computers and not building telecommunications systems in other countries. As a result, data communication capabilities did not register as a top priority for AT&T.

These issues did not go unaddressed when the FCC released its First Computer Inquiry conclusions. In its final decision, the FCC opted not to regulate data communication and processing services. While legally the FCC could regulate any communication, it decided to retreat from major intervention in the face of such strong corporate pressure. While computers were not as sexy as they would become in the age of the Internet, they were nonetheless proving too useful for corporations to allow the enactment of dubious regulations. What the FCC did do in its final First Computer Inquiry decision was to make distinctions between communications networks and "online" time-sharing computer systems. The computer users had argued that, despite the Communications Act of 1934, the FCC could only regulate the communications business and should not be regulating, in effect, the business of other industries. The Aerospace Industries Association even argued that "they were no more subject to regulation under the Act than the voices of telephone users." As a result, the term "online" subsequently achieved a high rate of circulation, as it was used to linguistically protect computer networks from possible FCC regulation.[8]

While the distinctions proved to be arbitrary and did not last long, the FCC's position can be clearly discerned. Corporations were beginning to need computer communications as an integral part of financial, organizational, and productive coordination. While AT&T was a crucial part of the nation's telecommunications infrastructure, it was becoming evident that the long-term development of the nation's corporate growth environment would require an efficient computer-communications system. Criticism of the telecommunications structure was increasing, and the dominant role of AT&T was being challenged. "Owing to the rapidly escalating dependence upon communications by U.S. industry and finance, a single company–AT&T was in practice materially affecting (some said dictating) the pace and direction of corporate strategy in areas far removed from its traditional communications offerings."[9] Regulation, or the lack of it, was to be used for promoting the growth and stability of U.S. economic development, despite the antagonistic struggle with its largest corporation.
Notes

[1] Dan Schiller’s work on early computerization policy is one of the most authoritative. See his (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 22.
[2] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 23.
[3] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 28.
[4] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, pp. 29-30.
[5] Phister, M. (1979) Data Processing Technology and Economics. Santa Monica, CA: Digital Press. p. 76.
[6] Martin, J. Telecommunications and the Computer. p. 37.
[7] Pool, I. (1983) Technologies of Freedom. Cambridge, MA: Harvard University Press. p. 247.
[8] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 38.
[9] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 42.

Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.

Why AT&T Invented and Shared the Transistor that Started the Digital Revolution

Posted on | March 5, 2011 | No Comments

The invention of the transistor in 1947 provided an extraordinary new capability to control an electrical current. It was initially used to amplify electromagnetic signals and later to switch the 1s and 0s needed for digital computing. An unlikely scenario unfolded in the 1950s, when AT&T’s fear of government anti-trust action and regulation prompted the company to share this seminal technology with others. This self-serving altruism launched the “solid-state” electronics revolution and the many silicon semiconductor innovations that rapidly advanced computerized information technology.

The transistor emerged from the research efforts of AT&T, the corporate behemoth formed by JP Morgan and guided by US policy to become the nation’s primary telecommunications provider. In 1913, AT&T settled its first federal anti-trust suit with the US government. The agreement established the company, starting with Alexander Graham Bell’s technology, as an officially sanctioned monopoly. A document known as the Kingsbury Commitment spelled out the new structure and interconnection rules in return for AT&T divesting its controlling interest in telegraphy powerhouse Western Union.

Both companies had a history of consolidating their market domination through patent creation or purchase. For example, AT&T purchased the patents for the De Forest vacuum tube amplifier in 1915, giving it control over newly emerging “wireless” technologies such as radio and transatlantic radiotelephony, as well as any other technology that used the innovation to amplify electrical signals. Patents, as government-sanctioned barriers to entry, created huge obstacles for competitors and effectively barred them from producing or using anything close to the restricted technology.

As AT&T grew more powerful, it established Bell Telephone Laboratories Inc. (Bell Labs) in 1925 as a research and development subsidiary. Fed by AT&T’s monopoly profits, Bell Labs became a virtual “patent factory,” producing thousands of technical innovations and patents a year by the 1930s. One of its major challenges was to find a more efficient successor to the vacuum tube.

A breakthrough occurred when William Shockley, the director of transistor research at Bell Labs, worked with fellow PhDs John Bardeen and Walter Brattain to create the “Semiconductor amplifier; Three-electrode circuit element utilizing semiconductive materials.” The transistor’s inception dates to December 23, 1947, at Bell Labs’ facilities in Murray Hill, New Jersey.

At the time, AT&T’s famed research facility employed nearly 6,000 people, some 2,000 of them engineering and research professionals.[1] The development of the transistor was not the result of basic research alone; it was the result of an all-out attempt to find a replacement for the vacuum tube. In any case, the government’s lawsuit meant that AT&T would tread lightly with the new invention lest it raise additional concerns about Ma Bell’s monopoly power.[2]

After World War II, the US Justice Department filed another anti-trust lawsuit against AT&T. In 1949, it sought the divestiture of Western Electric, AT&T’s equipment-manufacturing arm. The action came after, though not necessarily because of, the telephone company’s invention of the transistor, an electronic device that regulated the flow of electricity through a small solid cylinder. It operated much like the vacuum tube, but the transistor was “solid-state”: easier to use, more reliable, and much smaller. Faster to react, less fragile, less power-hungry, and cooler-running than glass vacuum tubes (which had to “warm up” to operating temperature), it was ideal for a wide variety of electronic devices.

Unlike its previous practice of zealously controlling or acquiring patents (including the vacuum tube) related to its telephone network, AT&T decided to license the new technology liberally. It did not want to antagonize the Justice Department over a technology it did not fully understand and did not yet know how to commercialize. Moreover, some Bell Labs employees were already jumping ship with the technology, and the anti-trust action was an indication that any patent-infringement cases would be difficult to defend in court.

So in 1951 and 1952, Bell Labs held two symposia revealing all of its information on the transistor. The first was for government and military officials only, while twenty-five American companies and ten foreign companies attended the second. All were required to pay $25,000 as “a down-payment on a license.” Sensing the potential of the new device, the Department of Defense awarded a number of multi-million-dollar contracts for transistor research. General Electric, Raytheon, RCA, and Sylvania, all major vacuum tube makers, began working with their transistor licenses on military applications. AT&T’s Western Electric, for example, found in the Department of Defense an immediate market for nearly all of its transistors.[3] AT&T’s fear of the government’s anti-trust threat thus resulted in an extraordinary diffusion of the century’s most important technology.

In the mid-1950s, the US government made a fateful decision regarding the semiconductor industry’s future when it ruled on Western Electric’s fate. In 1956, the Justice Department let AT&T hold on to its manufacturing subsidiary under two conditions. First, it restricted the telephone company from computer-related activities except for sales to the military and for its own internal purposes, such as telephone switching equipment. Second, AT&T was required to give up its remaining transistor patents.[4] As a consequence of the government’s pressure, the nascent semiconductor industry was released from the control of the monolithic telephone company.

Three licensees, Motorola, Texas Instruments, and Fairchild, in particular, took advantage of AT&T’s transistor technology. Each procured valuable government contracts to refine the electronic switching technology and increase reliability. The government contracts also helped them develop sophisticated manufacturing techniques to mass-produce the transistors. In particular, two political developments, the nuclear arms race with the USSR and the goal to land on the Moon, became essential for advancing the transistor technology that would propel an electronics revolution and lead to significant advances in computer technologies.

In 1956, William Shockley, John Bardeen, and Walter Brattain were awarded the Nobel Prize for their discovery of the transistor.

Citation APA (7th Edition)

Pennings, A.J. (2011, Mar 05). Why AT&T Invented and Shared the Transistor that Started the Digital Revolution. apennings.com https://apennings.com/how-it-came-to-rule-the-world/why-att-invented-and-shared-the-transistor-that-started-the-digital-revolution/

Notes

[1] Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 33.
[2] Herbert Kleiman quoted on AT&T and basic research in Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 36.
[3] Hansen, D. The New Alchemists. NY: Avon Books. pp. 80-82. Provides a good introduction to the beginnings of the transistor market.
[4] Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 34.


  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.