Google’s Competitive Advantages – Fixed Costs
Posted on | March 17, 2011 | No Comments
We’ve been reading The Curse of the Mogul: What’s Wrong with the World’s Leading Media Companies by Jonathan A. Knee, Bruce C. Greenwald, and Ava Seave in my Digital Media Management II class at NYU. The book challenges some of the key assumptions about managing media companies, including the importance of brands, talent, and deep pockets. Its main point, however, is to drive home the importance of a few key competitive advantages for media companies, which it treats primarily as barriers that keep other firms from competing in, or entering, a company’s markets. These barriers include fixed costs, network effects, customer captivity, proprietary technologies, and government protection.
In this post, I want to focus on Google’s fixed costs as a competitive advantage. Fixed costs are those that do not vary with production or sales levels, or in Google’s case, with the amount of advertising sold. While fixed costs can certainly be a liability for a firm, they make it difficult for any challenger to compete without matching those expenditures or finding a way to be more efficient or transformative. (Google itself transformed the advertising business.)
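A back-of-the-envelope sketch (with made-up numbers, not Google’s actual figures) shows why heavy fixed costs work as a barrier: once the infrastructure is paid for, the average cost of each additional ad impression falls toward a tiny marginal cost, a level a smaller challenger with the same fixed bill but far less volume cannot match.

```python
# Illustrative only: hypothetical numbers, not Google's actual cost structure.
fixed_cost = 3_000_000_000   # annual infrastructure spend (data centers, fiber), in dollars
marginal_cost = 0.0001       # incremental cost of serving one more ad impression, in dollars

def average_cost_per_impression(impressions: int) -> float:
    """Average cost falls toward the marginal cost as volume grows."""
    return fixed_cost / impressions + marginal_cost

for impressions in (10**9, 10**11, 10**13):
    print(f"{impressions:>16,d} impressions -> ${average_cost_per_impression(impressions):.6f} each")
```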
If you are not familiar with Google’s business strategy, I recommend reading Steven Levy’s “Secret of Googlenomics: Data-Fueled Recipe Brews Profitability” on the Wired website. It provides an excellent introduction to the search behemoth’s business model, built primarily around its AdWords and AdSense advertising businesses. Preliminary estimates put Google’s 2013 revenues at around $58 billion.
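For readers who want a feel for the mechanics Levy describes, here is a deliberately simplified sketch of a generalized second-price auction with quality scores, the general type of mechanism behind AdWords. The bids, quality scores, and the one-cent increment below are illustrative assumptions, not Google’s actual parameters or code.

```python
# Simplified generalized second-price (GSP) auction with quality scores.
# All numbers are invented for illustration; this is not Google's implementation.

def rank_and_price(bidders):
    """bidders: list of (name, bid_per_click, quality_score) tuples."""
    # Rank ads by bid x quality score ("ad rank"), highest first.
    ranked = sorted(bidders, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i, (name, bid, quality) in enumerate(ranked):
        if i + 1 < len(ranked):
            _, next_bid, next_quality = ranked[i + 1]
            # Pay just enough to beat the next ad's rank, given your own quality score.
            price = (next_bid * next_quality) / quality + 0.01
        else:
            price = 0.01  # lowest-ranked ad pays a nominal minimum in this toy model
        results.append((name, round(min(price, bid), 2)))
    return results

print(rank_and_price([("Ad A", 2.00, 0.9), ("Ad B", 3.00, 0.4), ("Ad C", 1.50, 0.8)]))
# -> [('Ad A', 1.35), ('Ad B', 3.0), ('Ad C', 0.01)]
```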
Google has a number of advantages, perhaps foremost being the massive investments in its built infrastructure. Google’s mission of “organizing the world’s information” requires more than the most sophisticated “big data” software. It also necessitates huge investments in physical plant, particularly data centers, power systems, cooling technologies, and high-speed fiber optic networks.
Google has built a significant global infrastructure of data centers (increasingly located close to cheap, green energy), with its storage systems, servers, and routers connected by a network of fiber-optic switches. For example, the Eemshaven data center in the Groningen region of the Netherlands sits at the landing point of a transatlantic fiber-optic cable. The US$770 million facility is also being built near a power plant, with contracts for additional green energy providing an estimated 120 megawatts of cheap electricity. For the most part, detailed fixed-cost figures are not readily available, as they are proprietary and treated as trade secrets; however, aggregate numbers for Google’s fixed costs are still informative.
Microsoft, of course, has the financial wherewithal to compete with Google. It has been investing in data centers for a number of cloud services, including Windows Live SkyDrive. As Bing expands its search challenge to Google, you can expect significant investment in the global infrastructure of server farms and communications links integral to a wide range of new advertising and e-commerce services. Whether Microsoft can catch up with Google remains to be seen, but an entire industry focused on data centers is emerging that will make them more efficient to run and more economical to invest in.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.
Tags: Adsense > Adwords > Competitive Advantage > Eemshaven data center > The Curse of the Mogul
The University of Hawaii and the History of Non-Linear Editing
Posted on | March 16, 2011 | No Comments
When I was working on my PhD at the University of Hawaii (UH), we became the first academic institution in the world to obtain an Avid non-linear digital editing (NLE) suite. The Avid was purchased in early 1990 thanks to the vision and determination of Stan Harms and Dan Wedemeyer, professors in the Communication Department (now the School of Communications), and Patricia Amaral Buskirk, who directed the department’s Media Lab.
The Avid/1 Media Composer system, running on a Macintosh IIfx, arrived in the spring of 1990 and was set up in the Media Lab. Patricia Amaral Buskirk put the system together and made it work. That was no easy task, as it required a good understanding of the Macintosh and the newest developments in digital technology, not to mention a lot of concentration and patience. We tried it out that summer with a program I ran for the Hawaii Department of Education and the UH Departments of Communication and Journalism called the SPEBE Center for Modern Media, which brought some of the brightest high school students from around the Hawaiian islands to the University of Hawaii for six weeks to give them an experience of college life. (Pictured on the left, with Jade Moon in red, a TV news broadcaster who taught for us.) Patricia Amaral Buskirk and Susan Gautsch trained the students on the Avid and helped them produce short videos on U.S. Presidents. That fall semester we used it in an International Communication class I was teaching to produce a short documentary on satellites.
The founder of Avid, Bill Warner, introduces a similar NLE suite:
Nonlinear video editing uses the power of the computer. It added a whole new range of flexibility to the visual story-making process, much as word processing changed the writing process. Still, the system would be almost unrecognizable by modern standards. It displayed very low-resolution clips on the computer screen, and the editor would build the narrative and, in the process, compile an edit decision list (EDL). The desktop computer, which was connected to traditional video cassette recorders, would then “auto assemble” the final video and record it back onto videotape.
According to Patricia Amaral Buskirk, “What that meant was we had the playback and record decks (with time code) controlled by the AVID and then we would pop in the original Source Tape when asked, and the system would edit (auto assemble) the final version (that was digitally edited on the computer) onto the tape in the Record deck.” Despite the state-of-the-art microprocessors and hard drives that came with the system, it couldn’t handle the large high-resolution clips that contemporary NLE systems use.
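To make the edit decision list and “auto assemble” idea concrete, here is a rough sketch of the bookkeeping involved: the offline editor’s in/out points on the source tapes become a list of events with sequential record-deck timecodes. This is my own simplification for illustration, not Avid’s actual EDL format.

```python
# Rough sketch of the EDL / auto-assemble bookkeeping: offline edit decisions become
# source in/out points plus record-deck timecodes. Simplified; not Avid's real format.

FPS = 30  # a non-drop 30 fps timecode is assumed for simplicity

def tc_to_frames(tc: str) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(frames: int) -> str:
    seconds, f = divmod(frames, FPS)
    return f"{seconds // 3600:02d}:{seconds % 3600 // 60:02d}:{seconds % 60:02d}:{f:02d}"

def auto_assemble(events, record_start="01:00:00:00"):
    """events: list of (source_tape, source_in, source_out); returns EDL-like rows."""
    rec = tc_to_frames(record_start)
    edl = []
    for n, (tape, src_in, src_out) in enumerate(events, start=1):
        length = tc_to_frames(src_out) - tc_to_frames(src_in)
        edl.append((n, tape, src_in, src_out, frames_to_tc(rec), frames_to_tc(rec + length)))
        rec += length  # the record deck picks up where the last edit ended
    return edl

for row in auto_assemble([("TAPE1", "00:01:10:00", "00:01:14:15"),
                          ("TAPE2", "00:05:00:00", "00:05:02:00")]):
    print(row)
```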
Successive upgrades and a fair amount of money over the years allowed for better and better resolution until the external card sets were not needed. Eventually, other professors started to use it, and of course, NLEs are now standard equipment for video post-production throughout higher education and in the film and television industries. But as far as we know, UH was the first university to obtain and teach non-linear editing.
Patricia Amaral Buskirk, who received an MFA in Film and Video Production at the Savannah College of Art & Design (SCAD), ran AB Productions, Inc., a full-service digital audio and video production company in Georgia, before returning to the University of Hawaii faculty as an Associate Professor.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.
Tags: AB Productions, Inc. > Avid > Avid FX > Bill Warner > Dan Wedemeyer > non-linear editing (NLE) > Patricia Amaral Buskirk > SPEBE > Stan Harms > Susan Gautsch > UH School of Communications
Social Media Entering Phoenix Stage?
Posted on | March 10, 2011 | No Comments
I thought this guy was interesting when I saw him on MSNBC’s Morning Joe. He argues passionately that social media is at a stage much like the end of the dot-com era, when we didn’t really know how to get returns on our investments in the Internet. He says we won’t really see the true face of social media for several more years.
Just like 2000? Well, that assumes a current overinvestment in social media, and some evidence points that way, particularly the money pouring into Facebook. We are approaching the eleventh anniversary of the famous dot-com crash, when the investments in so many Internet companies did not produce revenue streams significant enough to justify their stock prices. In the spring of 2000, the tech market began to falter on the NASDAQ, a shakeout occurred, and many companies like Pets.com fell by the wayside. Some, like Amazon.com, survived, and e-commerce has grown steadily since the crash. Now social media is in the spotlight, but can it continue on its current trajectory?
I ordered the Kindle version of his book The Thank You Economy yesterday and had a quick look. As expected, it is a business book. Nothing wrong with that, but in the era of the Middle East uprisings, the technology is proving to be a bit more multidimensional than ROI will ever capture.
On a related note, my students were quite complimentary about the transformation Domino’s has gone through. Apparently the outgoing CEO didn’t want his legacy to be the worst pizza ever. Now, I live in New York City, so good pizza is not scarce. But everyone should have access to good pizza. Give them a look.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.
The FCC’s First Computer Inquiry
Posted on | March 6, 2011 | No Comments
Any use of a communications carrier to connect remote terminals to computers brought the system into the realm of the Federal Communications Commission (FCC). By virtue of the Communications Act of 1934, all telecommunications traffic was to be regulated by the FCC. As early computers were being developed, this task of regulating computer communications was neither welcomed nor clearly understood by the FCC, but the number of computer linkages was increasing rapidly as businesses discovered that the computer could not only process information but also transfer it to other computers.
The number of “online” systems increased from about 30 systems in 1960 to over 2300 by 1966. The number of data terminals was estimated to have climbed from 520 in 1956 to some 45,663 in 1966. New innovations such as time-sharing allowed many people to use the same computer via telecommunications. “Computer applications, in other words, had come to depend more upon communication links, including privately owned microwave hookups, dedicated leased lines, and regular common carrier circuits.”[1] Consequently, the FCC sent out an invitation to computer users to describe their current utilization of computers and telecommunications and how they expected these systems to be used in the future. The following is a list of some of the corporations and trade associations that responded:
Aetna Life and Casualty Co.
American Bankers Association
American Business Press, Inc.
American Petroleum Institute
Credit Data Corp.
Eastern Airlines
Lockheed Aircraft Corp.
McGraw-Hill, Inc.
Societe Internationale de Telecommunications Aeronautique (SITA)
United Airlines.
This initial study became known as the FCC’s First Computer Inquiry. It marked the first attempt by a U.S. government regulatory agency to address the policy implications of the new computer systems utilizing telecommunications, and specifically the emerging needs and concerns of their users.[2] By the time the First Computer Inquiry got under way in 1966, over half the installed computer systems were located in corporate sectors such as airlines, banking, computer service bureaus, and manufacturing.[3]
Comments to the First Computer Inquiry from the American Bankers Association indicated that automated banking services and electronic funds transfer systems were going to grow very rapidly. The physical transfer of checks was being replaced by information transport over wire and radio. Already over 1,000 banks throughout the country had computers on their premises, and another 2,000 used commercial data processing bureaus. Banks were particularly concerned with computerizing bank functions such as account reconciliation, correspondent bank services, internal accounting, labor distribution, and professional billing.
It became clear that users were developing sophisticated computer systems that were going to be increasingly dependent on data communication services. Aetna Life and Casualty stated that telecommunications services for businesses were going to be “the limiting factor in the planning of new business systems.” The First Computer Inquiry was probably the most important reevaluation of telecommunications policy since the New Deal. While this period also included the AT&T antitrust case, which regulated the giant company out of the computer business, the First Computer Inquiry addressed new concerns. For the first time, corporate needs for telecommunications systems to support their computerization were addressed.
Much of the debate focused on the capabilities of the status quo telecommunications network and its tariff structure. Business users charged that the telephone industry and its networks were simply not up to the task of providing the sophisticated technical services they required. They complained to the FCC that they needed new types of equipment and services from the common carriers to maximize the use of their computers. The technical standards that had emerged in the regulated telegraph and telephone systems differed from those required by computer systems. Corporate users wanted to interconnect their facilities, whether they owned them or just leased them, with common carrier facilities. Of particular concern were the prohibitions against attaching “foreign” equipment such as modems to the telephone network. They were also pressing the FCC to allow the shared use and resale of common carrier equipment and services. As computers were still quite slow, the full capacity of a circuit was not always used. Users wanted to attach their own private equipment to the AT&T network in order to combine data streams and send them over a single line. For example, a bank might want to purchase multiplexing equipment that could draw in data from a number of computers and retransmit the information through a single telecommunications line leased from a carrier to a receiving facility. They also wanted the carriers to provide circuits conditioned to sufficient quality that information could be transmitted with a high degree of accuracy and efficiency. The First Computer Inquiry began to reveal the future importance of new communications services and the limitations of the existing network infrastructure.[4]
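As a rough illustration of the multiplexing idea (a toy sketch, not any carrier’s actual equipment), time-division multiplexing interleaves data from several slow terminals into one faster leased-line stream, and the receiving end separates it again:

```python
# Toy time-division multiplexing: interleave characters from several slow terminals
# onto one leased line, then recover each stream at the far end. Illustrative only.
from itertools import zip_longest

def multiplex(channels, pad=" "):
    """Send one character from each channel per time slot over a single stream."""
    return "".join("".join(slot) for slot in zip_longest(*channels, fillvalue=pad))

def demultiplex(stream, n_channels, pad=" "):
    """Recover each terminal's data by taking every n-th character."""
    return [stream[i::n_channels].rstrip(pad) for i in range(n_channels)]

terminals = ["DEPOSIT 100", "BALANCE?", "PAYROLL RUN"]
line = multiplex(terminals)          # one combined stream over one leased circuit
print(line)
print(demultiplex(line, len(terminals)))
```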
That AT&T became one of the biggest obstacles to the development of data communications can be attributed to at least four factors. First, the number of computers in operation was quite small, and they were not yet perceived as popular devices, even for businesses. Computers were used almost exclusively by government and large corporate users, and the need to connect them via telephone lines was uncertain. Second, the demand for voice communications was quite large. The government’s concern for universal service meant that regulated telephone companies had to address the issues involved in wiring wide geographic areas for domestic telephone use. Third, the Bell System was adamant that customers be restricted from attaching non-telephone terminal equipment to the network. The telephone grid and its electrical signals were fairly fragile, and their safety and clarity were seen to be at risk if unsanctioned equipment could be connected. Fourth, the Consent Decree of 1956 had restricted AT&T’s entry into the computer field, so it had very little incentive to address issues affecting this market. Thus, despite the growing need, AT&T’s reluctance to upgrade its technical facilities and tariff structure came largely from the fact that data communication traffic volumes simply did not compare with voice. AT&T was structured by investment and regulation to provide voice services, which were growing at post-war rates faster than previously anticipated. So not only was the market for computer communications small and difficult to forecast, but AT&T also had its hands full just handling voice communications.
The huge monopoly was very defensive at the time about user demands to liberalize the equipment that could be connected to the telecommunications network. Initially, only the standard black rotary telephone could be attached; it was illegal to connect any non-Bell System equipment to the network, including computer terminals and other data communications devices such as modems. Responding to corporate requests, AT&T did make “data sets,” or modems, available after 1962, allowing data transmission at 1200 to 2400 bits per second over special communication lines acquired by dialing. By the mid-1960s, rates up to 250,000 bits per second were available.[5] Still, users were uneasy with the situation.
It was the burgeoning mobile telephone industry that broke the network’s monopoly on connecting outside equipment. With the FCC’s Carterfone decision of 1968, equipment other than that designed by the Bell Telephone Company could finally be attached to the network. The ruling allowed a small radiotelephone company, Carter Electronics Corporation, to link its mobile radio system to the AT&T public switched network. It set a precedent, as the FCC concluded that any customer who desired to use an interconnecting device should be able to, as long as it did not “adversely affect the telephone company’s operations or the telephone system’s utility for others….”[6] The decision allowed a large industry to grow as computers, decorator phones, modems, network monitoring equipment, and entire telephone exchanges began to be connected to the network. It would allow individual corporations to design and implement entire telecommunications facilities outside the network, and it set the legal foundation for long-distance interconnect companies to link to local telephone companies.[7]
Another aspect of the phone company’s reluctance to provide telecommunications for computers was that, despite the fact that Bell Labs had invented the transistor and other seminal computer technology, AT&T was effectively restricted from entering that field. The 1956 Consent Decree with the Department of Justice barred AT&T from competing in unregulated areas, including computers and international data services. AT&T was directed to focus exclusively on domestic common carriage, which meant not selling computers and not building telecommunications systems in other countries. Despite the extraordinary computer developments at Bell Labs, data communication capabilities did not register as a top priority for AT&T.
These issues did not go unaddressed when the FCC released its First Computer Inquiry conclusions. In its final decision, the FCC opted not to regulate data communication and processing services. While legally the FCC could regulate any communication, it decided to retreat from major intervention in the face of such strong corporate pressure. While computers were not as sexy as they would be in the age of the Internet, they were nonetheless proving too useful for corporations to allow the enactment of dubious regulations. What the FCC did do in its final First Computer Inquiry decision was to draw distinctions between communications networks and “online” timesharing computer systems. The computer users had argued that, despite the Communications Act of 1934, the FCC could only regulate the communications business and should not be regulating, in effect, the business of other industries. The Aerospace Industries Association even argued that “they were no more subject to regulation under the Act than the voices of telephone users.” As a result, the term “online” subsequently achieved a high rate of circulation, as it was used to linguistically shield computer networks from possible FCC regulation.[8]
While the distinctions proved to be arbitrary and did not last long, the FCC’s position can be clearly discerned. Corporations were beginning to need computer communications as an integral part of financial, organizational, and productive coordination. While AT&T was a crucial part of the nation’s telecommunications infrastructure, it was becoming evident that the long-term development of the nation’s corporate growth environment would require an efficient computer-communications system. Criticism of the telecommunications structure was increasing, and the dominant role of AT&T was being challenged. “Owing to the rapidly escalating dependence upon communications by U.S. industry and finance, a single company–AT&T was in practice materially affecting (some said dictating) the pace and direction of corporate strategy in areas far removed from its traditional communications offerings.”[9] Regulation, or the lack of it, was to be used for promoting the growth and stability of U.S. economic development, despite the antagonistic struggle with the country’s largest corporation.
Notes
[1] Dan Schiller’s work on early computerization policy is one of the most authoritative. See his (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 22.
[2] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 23.
[3] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 28.
[4] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 29-30.
[5] Phister, M. (1979) Data Processing Technology and Economics. Santa Monica, CA: Digital Press. p.76.
[6] Martin, J. Telecommunications and the Computer. p. 37.
[7] Pool, I. (1983) Technologies of Freedom. Cambridge, MA: Harvard University Press. p. 247.
[8] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 38.
[9] Schiller, D. (1982) Telematics and Government. Norwood, NJ: Ablex Publishing Corporation, p. 42.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.
Tags: FCC data policy > First Computer Inquiry
Why AT&T Invented and Shared the Transistor that Started the Digital Revolution
Posted on | March 5, 2011 | No Comments
The invention of the transistor in 1947 provided an extraordinary capability to control an electrical current. Transistors were initially used to amplify electromagnetic signals and later to switch the 1s and 0s needed for digital computing. An unlikely scenario unfolded in the 1950s when AT&T’s fear of government anti-trust action and regulation prompted it to share this seminal technology with other companies. This self-serving altruism led to the “solid-state” electronics revolution and the many silicon semiconductor innovations that rapidly developed computerized information technology.
The transistor emerged from the research efforts of AT&T, the corporate behemoth formed by JP Morgan and guided by US policy to become the nation’s primary telecommunications provider. In 1913, AT&T settled its first federal anti-trust suit with the US government. The agreement established the company, starting with Alexander Graham Bell’s technology, as an officially sanctioned monopoly. A document known as the Kingsbury Commitment spelled out the new structure and interconnection rules in return for AT&T divesting its controlling interest in telegraphy powerhouse Western Union.
Both companies had a history of consolidating their market domination through patent creation or purchase. For example, AT&T purchased the patents for the De Forest vacuum tube amplifier in 1915, giving it control over newly emerging “wireless” technologies such as radio and transatlantic radiotelephony, as well as any other technology that used the innovation to amplify electrical signals. Patents, as government-sanctioned barriers to entry, created huge obstacles for other competitors and effectively barred them from producing and using anything close to the restricted technology.
As AT&T grew more powerful, it established Bell Telephone Laboratories Inc. (Bell Labs) in 1925 as a research and development subsidiary. Fed by AT&T’s monopoly profits, Bell Labs became a virtual “patent factory,” producing thousands of technical innovations and patents a year by the 1930s. One of its major challenges was to find a more efficient successor to the vacuum tube.
A breakthrough occurred when William Shockley, PhD, the director of transistor research at Bell Labs, worked with fellow PhDs John Bardeen and Walter Brattain to create the “Semiconductor amplifier; Three-electrode circuit element utilizing semiconductive materials.” The transistor’s inception dates to December 23, 1947, at Bell Labs’ facilities in Murray Hill, New Jersey.
At the time, AT&T’s famed research facility employed nearly 6,000 people, with 2,000 being engineering and research professionals.[1] The development of the transistor was not a result of just basic research; it was the result of an all-out attempt to find something to replace the vacuum tube. In any case, the government’s lawsuit meant that AT&T would tread lightly with this new invention lest it raise additional concerns about Ma Bell’s monopoly power.[2]
After World War II, the US Justice Department filed another anti-trust lawsuit against AT&T. In 1949, it sought the divestiture of Western Electric, AT&T’s equipment-manufacturing arm. The action came after, although not necessarily because of, the telephone company’s invention of the transistor, an electronic device that regulated the flow of electricity through a small cylindrical device. It operated much like the vacuum tube, but the transistor was “solid-state”: easier to use, more reliable, and much smaller. Faster to react, less fragile, less power-hungry, and cooler-running than glass vacuum tubes (which had to “warm up” to operating temperatures), it was ideal for a wide variety of electronic devices.
Unlike its previous history of zealously controlling or acquiring patents (including the vacuum tube) dealing with its telephone network, AT&T decided to liberally license the new technology. It did not want to antagonize the Justice Department over a technology it did not fully understand nor knew how to implement commercially. However, some of the Bell Labs employees were already jumping ship with the technology, and the anti-trust action was an indication that any patent infringement cases would be complex to defend in court.
So in 1951 and 1952, Bell Labs put on two symposia revealing all of its information on the transistor. The first was for government and military officials only, while twenty-five American companies and ten foreign companies attended the second. All were required to put up $25,000 as “a down-payment on a license.” Sensing the potential of the new device, the Department of Defense awarded a number of multi-million-dollar contracts for transistor research. General Electric, Raytheon, RCA, and Sylvania, all major vacuum tube makers, began working with their transistor licenses on military applications. AT&T’s Western Electric, for example, found in the Department of Defense an immediate market for nearly all its transistors.[3] AT&T’s fear of the government’s anti-trust threat resulted in an extraordinary diffusion of the century’s most important technology.
In the mid-1950s, the US government made a fateful decision regarding the semiconductor industry’s future when it ruled on Western Electric’s fate. In 1956, the Justice Department let AT&T hold on to its manufacturing subsidiary under two conditions. First, it restricted the telephone company from computer-related activities except for sales to the military and for their own internal purposes, such as in telephone switching equipment. Second, AT&T was also required to give up its remaining transistor patents.[4] As a consequence of the government’s pressure, the nascent semiconductor industry was released from the control of the monolithic telephone company.
Three licensees, Motorola, Texas Instruments, and Fairchild, in particular, took advantage of AT&T’s transistor technology. Each procured valuable government contracts to refine the electronic switching technology and increase reliability. The government contracts also helped them develop sophisticated manufacturing techniques to mass-produce the transistors. In particular, two political developments, the nuclear arms race with the USSR and the goal to land on the Moon, became essential for advancing the transistor technology that would propel an electronics revolution and lead to significant advances in computer technologies.
In 1956, William Shockley, John Bardeen, and Walter Brattain were awarded the Nobel Prize in Physics for their discovery of the transistor effect.
Citation APA (7th Edition)
Pennings, A.J. (2011, Mar 05). Why AT&T Invented and Shared the Transistor that Started the Digital Revolution. apennings.com https://apennings.com/how-it-came-to-rule-the-world/why-att-invented-and-shared-the-transistor-that-started-the-digital-revolution/
Notes
[1] Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 33.
[2] Herbert Kleiman quoted on AT&T and basic research in Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 36.
[3] Dirk Hansen’s The New Alchemists (NY: Avon Books) provides a good introduction to the beginnings of the transistor market. pp. 80-82.
[4] Braun, E., and MacDonald, S. (1982) Revolution in Miniature: The History and Impact of Semiconductor Electronics. Cambridge, UK, Cambridge University Press. p. 34.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.
Tags: Alexander Graham Bell > AT&T > Bell Labs > De Forest vacuum tube > Fairchild > Kingsbury Commitment > Transistor > William Shockley
Seeing from Space: Cold War Origins to Google Earth
Posted on | March 2, 2011 | No Comments
President Eisenhower had been secretly coordinating the space program as part of the Cold War since the early 1950s. He had become accustomed to the valuable photographic information obtained from spy planes and considered satellites a crucial new Cold War technology. The D-Day invasion of Europe, which he had managed as the head of the Allied Forces, had been meticulously reconnoitered with low and high altitude photography from a variety of reconnaissance aircraft.
When his administration took office in early 1953, tensions with Communist countries were increasing rapidly, and his “New Look” policy identified aerospace as a decisive component of future US military strategy. Given the growing nuclear capacity of the USSR, he particularly wanted satellites that could assess how rapidly the Communists were producing their long-range bombers and nuclear ballistic missiles, as well as where they were being stationed.[1]
Rocketing into the “High Ground”
Consequently, after the Sputnik debacle, Wernher von Braun, the German captive/immigrant who headed the US space program, was cleared to launch the Jupiter-C, a proven military rocket modified to carry a satellite designed by the Jet Propulsion Laboratory (JPL). On January 31, 1958, America’s first satellite launcher lifted off, carrying the 10.5-pound Explorer 1 into orbit. The Soviet Sputnik satellite had already established the precedent of overflight, freeing up orbital paths above sovereign national spaces.
Later that year, the National Aeronautics and Space Administration (NASA) was created, and within a week, Project Mercury was approved to place a human into orbit.[2] While the motivations for human cargo were numerous, one significant reason was that rocket thrust capability was still limited and more was needed to place heavier payloads into space. Initially this meant cameras and other sensing devices, but thermonuclear devices, of course, were still very heavy. In order to develop intercontinental ballistic missiles, much work was needed on increasing the transport and guidance capabilities of the rocket launcher. Initially, however, the Eisenhower administration was concerned with the surveillance possibilities that the “high ground” offered.
Upon the successful flight of America’s first rocket launcher, the Corona spy satellite program was initiated. Operating under the name Discoverer, the highly covert program was started to put a series of satellites designated Keyhole (KH) into low earth orbit (LEO). Outfitted with 70mm cameras, KH-1 and its early successors failed to achieve orbit or suffered other technical failures.[3] By the late summer of 1960, however, a capsule containing the first film stock was retrieved in mid-air by an Air Force cargo plane as it parachuted back down to earth.
The Keyhole satellites could see where the US spy planes could not. The U-2 high-altitude spy plane could fly at altitudes of up to 70,000 feet and photograph military installations and movements, but it was in increasing danger as Soviet radar and surface-to-air missiles improved. The satellites could cover a wider area from much safer altitudes.
The limits of the U-2 were highlighted infamously on May 1, 1960, when U-2 pilot Gary Powers was shot down over Soviet territory and captured. Eisenhower, thinking Powers was dead and the plane destroyed, downplayed the incident as a probable technical failure on a weather plane. The incident proved highly embarrassing and provided a major international public relations boost for Khrushchev when the largely intact remains of the spy plane were paraded by the Soviets in front of the international press, along with images of Powers.
The Powers incident provided a strong motivation for the Corona satellite surveillance program, whose film capsules were snatched from the air by US aircraft. The retrieved film contained images from locations deep inside the USSR where the U-2 couldn’t reach. These new pictures of the USSR, while not immediately as clear as the U-2 photographs, relied on techniques for taking and interpreting images that were constantly improving and proving the worth of the satellites.
Contemporary Developments
The National Reconnaissance Office (NRO) emerged as one of the most secret intelligence services in the US and is part of the Department of Defense. It now has its own website and offers overhead intelligence services to a number of government agencies, warning of potential environmental and political trouble spots around the world and helping coordinate military operations.
In 2004, Google acquired Keyhole, Inc. for its Google Earth, Google Maps, and Google Mobile operations. Keyhole had been formed in 2001 as a software development company specializing in geospatial data visualization applications and named after the Keyhole satellites. It developed the Keyhole Markup Language (KML) for geographical annotation and visualization in Internet browsers.
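As a small illustration of what KML looks like in practice, the snippet below generates a minimal placemark document; the coordinates and name are arbitrary examples, not anything tied to the history above.

```python
# Generate a minimal KML placemark. KML uses longitude,latitude,altitude ordering.
# The name and coordinates here are arbitrary examples.

def placemark_kml(name: str, lon: float, lat: float) -> str:
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Point>
      <coordinates>{lon},{lat},0</coordinates>
    </Point>
  </Placemark>
</kml>"""

print(placemark_kml("Example site", -122.0840, 37.4220))
```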
Notes
[1] Recently declassified documents show a much more active President Eisenhower than the public had believed at the time.
[2] Winter, F. (1990) Rockets into Space. Cambridge, MA: Harvard University Press. pp. 70-80. NASA was established on October 1, 1958, with the passage of the National Aeronautics and Space Act.
[3] The mishaps are said to have been the inspiration behind the spy novel Ice Station Zebra, which was later made into a motion picture released in 1968.
© ALL RIGHTS RESERVED

Tags: geospatial data visualization > Google Earth > Google Maps and Google Mobile > Jet Propulsion Laboratory (JPL) > Keyhole Markup Language > Keyhole satellites > KH-1 > President Eisenhower > Sputnik > U-2 > “New Look” policy
My Top Social Media Books
Posted on | February 27, 2011 | No Comments
In general, I’m unhappy with the focus and range of books on social media. Still, some interesting works are worth mentioning. They aren’t in any particular order, as I haven’t really developed a useful taxonomy. I also don’t have extensive comments, since I link each one to its Amazon page, where you can read the reviews.
Yochai Benkler, The Wealth of Networks
– Opaque and intellectual, this book gets better with time. Its strengths are its discussions of democracy and the emergence of a networked public sphere.
Erik Qualman, Socialnomics
– Readable and likable, it provides a good introduction to social media. A bit too anecdotal for my taste, but like I said, it’s a good start.
Amy Shuen
– I liked this book but found it a bit too cryptic for the undergraduate class I ordered it for. Still it’s the book I go back to when I want more serious ideas about social media. Looking forward to the second edition.
Clay Shirky, Here Comes Everybody
– If you want to know what Wael Ghonim meant when he said “Ask Facebook,” this is the book.
David Weinberger
– I read this while traveling on the trains in Japan and use it in one of my Digital Media Management classes. A really good foundation for understanding the different ways of organizing information and how digital technologies transform the whole area of information management.
Lon Safko
– Buy the Kindle version; this book is big.
Jim Sterne, Social Media Metrics
– Numbers, deal with them.
Deirdre Breakenridge
– I plan to use this in my New Technologies for Advertising and Public Relations course at NYU this summer.
Jeff Jarvis
– Google has a mixed record when it comes to social media hits, but with $28 billion in 2010 revenues, it is a force in itself.
Chris Anderson, The Long Tail
– This is a classic in conceiving some of the economics of social media and blogging.
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.
Tags: Here Comes Everybody > New Technologies for Advertising and Public Relations > Social Media Metrics > Socialnomics > The Long Tail > The Wealth of Networks
Media Content as a “Public Good”
Posted on | February 23, 2011 | No Comments
Media products are sometimes referred to as “public goods” because they are not significantly “used up” in the process of consumption, and each additional user incurs little additional (marginal) cost. Unlike a traditional product such as a hamburger, which is bitten into, chewed, swallowed, and digested, media content is generally not destroyed while being consumed. Although the timeliness of media content is often an issue, consuming it does not deplete it.
With traditional public goods like a road or a park, the costs associated with an additional driver on a highway or your child enjoying the playground are minimal. With media content as well, additional consumers are served without incurring much expense. An additional radio listener, for example, imposes only a negligible cost on the radio station. Likewise, one more click on a website does not cost the owner very much.
Economists like to talk about two characteristics of goods and services: rivalry and excludability. Rivalry is sometimes called subtractability. When someone purchases a good that is highly rivalrous, it is consumed entirely by that person. Such goods are also called private goods, in that they are removed from collective or shared use. They are the ideal goods for economists because they are “well-behaved”: they fit nicely into economic models and are easily priced. A mobile phone, for example, almost always has a single owner until it breaks or becomes outdated.
Other goods are less rivalrous, such as a visit to a water park. Labor is utilized and water evaporates, but for the most part the park remains essentially the same after the family packs up and goes home. Radio stations emit programming with a signal that is not used up by a commuter listening to music (and ads) on the way home.
The other issue is excludability. A water park usually has enough fencing and security to keep those who do not pay away from its pools and rides. A beach would be more difficult, and many communities make sure that rich owners of beachfront properties do not restrict surfers, swimmers, and other beach users from accessing the beach. These “common goods” can be subject to congestion as more people use them.
The “nonexcludability” criterion of a public good takes into account the costs associated with keeping nonpayers out of the system. It is very difficult to keep a car off a road or a kid out of a park, just as it is not feasible to keep a viewer from watching a show on broadcast TV. An interesting case is the economics of knowledge, which is not consumed by use, but is it excludable? And what conditions would restrict others from acquiring it?
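The two dimensions can be pulled together in a small classification sketch; the example goods are my own illustrative picks, echoing the ones above.

```python
# Classify goods along the two dimensions discussed above: rivalry and excludability.
# The example goods are illustrative, not an exhaustive taxonomy.

def classify(rivalrous: bool, excludable: bool) -> str:
    if rivalrous and excludable:
        return "private good"            # e.g., a hamburger or a mobile phone
    if rivalrous and not excludable:
        return "common good"             # e.g., an open beach, subject to congestion
    if not rivalrous and excludable:
        return "club (toll) good"        # e.g., a fenced water park
    return "public good"                 # e.g., a broadcast TV show

examples = {
    "hamburger": (True, True),
    "public beach": (True, False),
    "water park": (False, True),
    "broadcast TV show": (False, False),
}
for good, (rivalrous, excludable) in examples.items():
    print(f"{good:>18}: {classify(rivalrous, excludable)}")
```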
Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global communications.