Anthony J. Pennings, PhD

WRITINGS ON DIGITAL STRATEGIES, ICT ECONOMICS, AND GLOBAL COMMUNICATIONS

YouTube Meaning-Creating (and Money-Making) Practices

Posted on | February 2, 2020 | No Comments

Note: This is required reading for my Visual Rhetoric and IT class.

YouTube has emerged as the primary global televisual medium, attracting some 1.3 billion viewers from countries around the world, with over 5 billion videos watched every day. People consume some 3.25 billion hours of YouTube video each month, and more than ten thousand YouTube videos have each generated over 1 billion views since they were posted. YouTube content ranges from homemade DIY videos to professional high-definition television productions.

Alphabet recently announced that YouTube made $15 billion in advertising revenues in 2019, growing 36% over 2018. That is a lot of money to spread around.

YouTube provides opportunities for new publishers or “vloggers” covering a wide range of topics. Every minute, some 400 hours of video are uploaded to YouTube from all around the world. Not many of those creators get rich, but some have done extraordinarily well. Together, the world’s 10 highest-paid YouTube stars made $180 million in the year between June 1, 2017, and June 1, 2018, almost double the year before.

One big star to emerge on YouTube is Daniel Middleton (DanTDM), who made US$18.5 million in 2018. Middleton is a British professional gamer, and his videos primarily cover Minecraft, Plants vs. Zombies, and other games that his primary audience of young kids enjoys. Here he reviews the massive hit Fortnite.

What makes DanTDM’s YouTube videos successful? What does he do to keep viewers interested in his content, and what keeps his audience coming back for more? How does he create entertainment and meaning for those who watch his show?

Even more extraordinary is Ryan ToysReview (now called Ryan’s World!). Ryan, the channel’s 7-year-old host, topped the list of YouTube earners at $22.5 million for the year up to June 1, 2018.

What knowledge can we gather about Ryan’s World!? What observations can we make about his show and other popular channels?

This series of posts will set out to explore a crucial relationship in (digital) media studies – between cultural/technical production practices and the meanings, emotions, and feelings that are produced by those practices. Media production involves a combination of equipment and processes to capture and construct various images, edit sequences, and integrate audio and sound effects to produce specific results. These are the meaning-making techniques that construct our blockbuster movies, our Netflix binge favorites, our local newscasts, and also the YouTube channels that educate and entertain us. Can we use some of the same analytical techniques to “interrogate” YouTube channels?

A good deal of related work has been done on film and television. By exploring camera shots (close-ups, zooms, pans, shot composition) as well as montage (cutting rates, parallel editing, reaction shots, wipes), film studies and television studies have given us a literacy for understanding the power of these media. These important meaning-making practices can also be discerned in the realm of YouTube videos.

Social media apps like YouTube present significant new complications in understanding the power of the global mediasphere. One area of concern is the metrics associated with YouTube. Ratings were always a significant part of how television services determined the value of programming. YouTube measures “views” and adds likes, dislikes, shares, play-listing, and subscribers to gauge the credibility and commercial viability of a channel. But vulnerabilities in the system allow many of these numbers to be tweaked by the “fake-view” ecosystem that has grown up around YouTube.
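To make those metrics concrete, here is a minimal sketch of the kind of ratio an analyst might compute from a channel’s public counts. The function name and the simple sum of interactions are my own illustration, not YouTube’s actual ranking formula, and fake views would distort exactly these inputs.

```python
# Hypothetical engagement ratio -- an illustration, not YouTube's formula.
def engagement_ratio(views, likes, dislikes, shares, comments):
    """Interactions per view; a higher value suggests a more active audience."""
    if views == 0:
        return 0.0
    interactions = likes + dislikes + shares + comments
    return interactions / views

# Two videos with the same view count but very different audiences:
print(engagement_ratio(views=1_000_000, likes=45_000, dislikes=1_200,
                       shares=8_000, comments=5_500))   # about 0.06
print(engagement_ratio(views=1_000_000, likes=4_000, dislikes=300,
                       shares=500, comments=700))       # about 0.0055
```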

YouTube has become a new frontier for media studies. The opportunity exists now to pioneer strategies for understanding this intriguing visual medium and the sets of meanings it creates. What techniques are used in YouTube “channels”? What types of persuasive techniques are effective on YouTube channels? How do they differ from techniques used in film and television? Who is driving the narration of the video, and what voices are they using?

But there are broader issues to address as well. What are the cultural, economic, and social implications of YouTube? What new ideas and cultural forms diffuse via YouTube? What economic activities and opportunities are made available through the platform? What impact will YouTube have on existing institutions?

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

FROM NEW DEAL TO GREEN NEW DEAL, Part 1: Roosevelt Saves Capitalism

Posted on | February 2, 2020 | No Comments

“I pledge you, I pledge myself, to a new deal for the American people.” – FDR in the summer of 1932.

Recent discussions about a proposed Green New Deal encouraged me to review some of my notes on the original New Deal and to consider whether it could provide relevant insights into our current situation. The New Deal began in the early 1930s as a response to the economic crash of the late 1920s and built up momentum during the “Great Depression” of the following decade. It ended, arguably, just before the end of the millennium with the repeal of the Glass-Steagall Act on November 12, 1999, when President Clinton signed the Financial Services Modernization Act. The New Deal was one of the most influential sets of legislation in American history, setting the course of the modern United States.

On October 1, 1928, the Dow Industrial Average closed at 240.07. That year, “the Dow” was expanded to thirty stocks, up from the twelve in the original index. The “Roaring Twenties” had been a good decade for many investors, but that was about to end for most.

The Dow continued to rise over the next year, peaking at 381 before beginning its dramatic descent. In October 1929, investors hastened to liquidate their positions, and from Thursday, October 24 to Tuesday, October 29, the stock market crashed. The index would eventually bottom out near 41 in mid-1932.

Capitalism had run into deep trouble. Never before had public confidence been shaken so thoroughly. Unemployment was estimated to have risen to 25% in the US and England. It was even worse in Germany, which had been saddled with war reparations under the Treaty of Versailles. Political unrest was brewing in democratic political economies around the world. Little more than a decade earlier, a Communist revolution had occurred in Russia, and many believed that the problems of the industrial economy needed a communist or socialist solution.

In the US, protest marches became frequent. Rent riots broke out as people organized, often physically, to prevent home evictions and farm foreclosures. The country was in such dire straits that many believed it could have gone in any political direction. This confusion carried over into the term of the next President, Franklin D. Roosevelt, who beat Republican incumbent Herbert Hoover and was inaugurated on March 4, 1933.[1]

Roosevelt moved immediately to save the banking system, which had been experiencing significant drains on deposits. He closed the banks until federal auditors could review their books. Although many were deemed insolvent, he decided to save the remaining banks by signing a bank moratorium that suspended acknowledgment of their demise.

After deliberating with Treasury Department officials, former Hoover advisers, and several leading bankers to come up with intervention measures and reform practices, Roosevelt reopened many banks. He decided to leave them in the hands of their original owners instead of nationalizing them as many thought he might.

The Emergency Banking Act of March 9 allowed the Federal Reserve to make loans to businesses and nonmember banks against assets it was allowed to define very broadly. The Reconstruction Finance Corporation, Hoover’s singular response to the economic collapse, was authorized to buy stock in banks and thus provide them with working capital. Three days later, Roosevelt gave his first “fireside chat,” in which he made a plea for citizens to redeposit their money. The legislation, plus his encouragement, was a success, and by the end of the month he had saved the banking system.

Immediately after the inauguration, Roosevelt took controversial measures to stop the hoarding of gold. In April, he took the US off the gold standard and called on all Americans to turn in their gold coins for paper currency. On January 31, 1934, the US devalued the dollar, raising the official price of gold from $20.67 to $35 an ounce, a devaluation of roughly 40 percent.

In 1935, amidst concerns about the growing threat of fascism in Europe, Roosevelt decided to move US gold reserves from New York to Fort Knox in Kentucky. Moving the gold inland past the Appalachian Mountains, next to an army tank battalion and training facility, reduced the risk that an attack on Manhattan could capture significant amounts of bullion. Central bank vaults were primary targets of the Nazis in their early invasions of Czechoslovakia and Poland, as gold was essential to Germany’s efforts to procure oil and other critical supplies internationally for its war effort.

Despite the boldness and swiftness with which FDR carried out his plan, it was basically a very conservative response to the national disaster. The plan he embarked on was essentially to save capitalism. At the time, Russia was firmly in the grip of its Communist leaders, and Hitler and the National Socialists (Nazis) had strengthened their hold on Germany. Many business leaders, ensconced in the rhetoric of laissez-faire and free enterprise economics, were slow to recognize the conservativeness of the New Deal. They chastised Roosevelt in the newspapers and on the radio. But he was very popular with the people, including Ronald Reagan, the future US president who would later become one of the New Deal’s fiercest critics.

Over the next few years, the administration put into place a reformed capitalist system based on the rationalization of business, finance, and labor practices and focused on long-term stability. Commercial banks were separated from other activities such as investment banking and stock brokerage through the Glass-Steagall Act of 1933. The Securities Exchange Act, passed the next year, created the Securities and Exchange Commission (SEC) to oversee the stock markets and prevent manipulation and rigging. The Federal Reserve Board in Washington was also given greater powers to oversee the regional Reserve Banks, and the Federal Deposit Insurance Corporation (FDIC) was instituted to prevent further bank panics and restore depositor confidence.[2]

Hopefully, the transition to a green economy will not require such a dramatic event as the Great Depression. Tragically, we might be facing a bigger threat with climate change and environmental pollution. The New Deal was central to the transition to a carbon-based industrial model still heavily reliant on manual labor. Coal and oil were central to the New Deal, as was the process of electrification.

A very crucial component of the new economic transition will be “green finance” – how will the Green New Deal be paid for, and what are the ideological implications of the process? Will the current banking system suffice? Can pension funds and other wealth management funds be sufficiently incentivized to make the crucial investments? Will the political will be developed to stop subsidizing fossil fuels?

Finally, will the Green New Deal be “socialist,” or can capitalism be harnessed for the transition? The 2020 elections will be an important forum for discussion of these matters.

Part 2 of this series will discuss an early failure of the New Deal: the National Recovery Administration (NRA), created under the National Industrial Recovery Act (NIRA), and its attempts to administer the economy through a wide range of codes and practices for business and industry.

Notes

[1] The Three Roosevelts. (n.d.). Retrieved February 16, 2020.
[2] McQuail, K. (1982) Big Business and Presidential Power. NY: William Morrow and Company.


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D., is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

Factors Supporting Early Computerization and Data Communications

Posted on | January 15, 2020 | No Comments

Several factors contributed to the development of computers and data networking in the early post-World War II era. This post looks at major influences that created the modern realm of computerization and networking of data and information that has transformed the world.

During the war, the first computers were created to calculate tables for artillery ballistics and to help decode encrypted messages. The British developed pioneering electronic code-breaking machines to crack German ciphers. Eventually, a giant vacuum-tube computer, the ENIAC (Electronic Numerical Integrator and Computer), emerged as the first completely electronic, general-purpose computer. Although it was completed too late to significantly impact the outcome of the war, the promise of its potential made it a celebrity in the mid-1940s.

One major factor that supported post-war economic development was the availability of an electronics infrastructure on the East Coast. Wartime funding, primarily for the development of radar, helped build a complex of industrial organizations and expertise that provided a foundation for the computer and electronics industries. Centered in the Boston area, this complex stretched to IBM’s Hudson River facilities, down through Manhattan and Bell Laboratories in New Jersey, and on to the University of Pennsylvania in Philadelphia. MIT emerged as the primary center of innovation with the development of early real-time computers, monitors, modems, and time-sharing technology.

A second factor was the invention and sharing of the transistor by Bell Labs. This technology provided the seminal means for the miniaturization of processing power, leading to the integrated circuit and later the microprocessor. What helped the process was that AT&T was facing anti-trust action and the possible divestiture of its manufacturing arm, Western Electric. It consequently decided to license the technology to other companies to avoid serious government intervention in its affairs. Texas Instruments, Fairchild, and Motorola were among the new licensees that set out to capitalize on the invention. The result was a wide variety of products, starting with missile guidance systems and computers and later consumer goods like transistor radios and calculators.

The escalation of tensions with Communist China and the USSR was a third factor. The “Cold War” provided a permanent stream of funding for the development and maturation of information technologies and created the impetus to institutionalize a trajectory that President Eisenhower called the “Military-Industrial Complex.” The support for the industry was extensive, especially from the newly created Central Intelligence Agency (CIA) and the ultra-secretive National Security Agency (NSA). They filled acres of floor space with big mainframe computers, as did the Office of Naval Research and government organizations such as the Atomic Energy Commission at Los Alamos.

A fourth factor, related but deserving of separate mention, was the desire to create an early warning defense system linking computers via telecommunications to an extensive grid of radars across the US and Canada. Designed at MIT in the 1950s and built at IBM’s Poughkeepsie facilities, the system would later feed into NORAD, headquartered deep within Cheyenne Mountain in Colorado. The project cost the US government billions of dollars at a time when that was considered big money. Called the Semi-Automatic Ground Environment, or SAGE, it created the foundation of the computer industry by supporting IBM’s AN/FSQ-7. Burroughs, DEC, and Honeywell also built on this work to become viable suppliers of business computers. The SAGE effort was instrumental in the development of the IBM System/360 mainframe, released in the mid-1960s to become the re-programmable business computer of choice. It also helped develop the data communications modem and the “survivable” communication concepts that would later be crucial for the Internet.

Fifth, the formation of ARPA in reaction to the USSR’s Sputnik satellite helped seed computer science departments throughout the US and directly funded the ARPANET, the precursor to the Internet. Specifically, its Information Processing Techniques Office (IPTO) spearheaded an aggressive effort to develop interactive computer technologies. At first, time-sharing technologies were developed that shared a mainframe among numerous terminals. Later, resource sharing allowed a terminal to access many different computers in different locations. Add TCP/IP and hypertext protocols over the next two decades, and we would have the modern Internet.

Sixth, the formation of the US space program, leading to NASA, propelled the development of US rockets and the capability to launch satellites into high orbits. Drawing on Nazi Germany’s V-2 rocketry, the US overcame a weak start to become the leader in the “space race.” The advent of human-crewed space flights served the dual purpose of developing rockets capable of carrying heavy payloads for nuclear warheads and satellites, and of harnessing the popular imagination for the future funding of the space program. A global network of communications satellites in the geosynchronous “Clarke Belt” orbit around the Earth and the “New Look” reconnaissance program with remote sensing and surveillance spacecraft effectively utilized space for military and commercial purposes.

The Communications Satellite Act of 1962 committed the US to the establishment of Intelsat, an international consortium for satellite communications. Intelsat mobilized national telecommunications organizations from around the world to invest in Arthur C. Clarke’s vision of a global communications system based on satellites placed in orbit 22,300 miles above the Earth. Intelsat was quickly transformed into a workable commercial system for voice, video, and later data communications.
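As an aside on the 22,300-mile figure, a quick Kepler's-third-law calculation (my own sketch, not part of the original post) shows why geosynchronous satellites sit at that particular altitude:

```python
import math

# Why the "Clarke Belt" sits roughly 22,300 miles up: a satellite whose orbital
# period matches one sidereal day appears fixed over a point on the equator.
GM_EARTH = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.1         # seconds
EARTH_RADIUS_KM = 6378.1       # equatorial radius, km

# Kepler's third law solved for orbital radius: r = (GM * T^2 / (4*pi^2))^(1/3)
radius_m = (GM_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = radius_m / 1000 - EARTH_RADIUS_KM
print(f"Altitude: {altitude_km:,.0f} km ≈ {altitude_km / 1.609344:,.0f} miles")
# Prints roughly 35,786 km, or about 22,236 miles -- close to the figure above.
```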

Seventh, the refinement of transistors into new “integrated circuits” that put several transistors on a single “chip” furthered the miniaturization of information processing. At first, the process was heavily subsidized by government and military projects. The goal of putting humans on the Moon created the need for a “fourth crewmember,” the Apollo Guidance Computer (AGC), a new set of “miniaturized” guidance technologies utilizing advanced ICs that could control major functions on the Command and Lunar Modules. The implementation of a new defense policy that came to be called MAD (Mutually Assured Destruction) also required a buildup of Minuteman intercontinental missiles, which gave a further boost to the subsidization and refinement of new ICs and the eventual development of semiconductor microprocessors with transistors etched into the material.

Eighth, the bureaucracies of the “Great Society” created new needs for information processing technologies. Just as the New Deal’s Social Security Act helped IBM survive the Great Depression, the growth of the civilian government allowed new companies to prosper. Ross Perot’s Electronic Data Systems (EDS), for example, earned billions of dollars from Medicaid contracts. Xerox was another company that profited extensively from the Great Society, placing its photocopying machines in a wide variety of bureaucracies. It took part of its profits and invested them in a crucial technological incubation center, the Palo Alto Research Center (PARC).

PARC hired many of the best ARPA-sponsored computer scientists from around the US. It produced many of the seminal technologies crucial for graphical user interfaces, laser printers, local area networking, and object-oriented programming. In exchange for the opportunity to buy Apple shares, Xerox gave Apple Computer much of the technology that went into the Lisa and the Macintosh. Other PARC employees went to Microsoft and 3Com, helping to produce Windows and commercialize Ethernet.

These factors reflect the extraordinary investments made by the US government during the Cold War and the Space Race to build a new infrastructure based on information and data communications technologies. Later, global finance made significant investments to advance these technologies for its own purposes. With the introduction of the Internet and the World Wide Web, the infrastructure was developed enough to attract widespread commercial investment.


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D., is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

Four Futures: One Humanity

Posted on | January 2, 2020 | No Comments

I’ve had a long-term interest in an area of research called “Futures Studies.” I read Alvin Toffler’s Future Shock in high school and The Third Wave in college, and reading science fiction actually led to my undergraduate studies in biochemistry and other sciences. But the future is hard to predict, and it was coming on so fast that by the 1990s one of my favorite authors, William Gibson, could appropriately quip, “the best science fiction is on CNN.” So I wrote my PhD dissertation on dystopias and electric money.

This post looks at a few of my favorite futurists and a book I recently found intriguing that presented four visions of the future.

One of my favorite futurists was Buckminster Fuller. He mixed science with politics and had a unique view of economics. Comprehensive Anticipatory Design Science was the name he gave to his approach to the future. It was based, in part, on his “Synergetics,” a science that would eventually lead to a significant discovery in chemistry called fullerenes, or “Bucky Balls.” These small soccer-ball-like molecules are leading to new materials and medicines. One of his first inventions based on Synergetics was the geodesic dome, which provided durable protection for radar installations starting in the Cold War and is still used in the design of many buildings.

Fuller challenged the status quo with a peculiar use of language that seemed to counter common sense and yet was surprisingly insightful. He often used the metaphor of a bow and arrow to describe the importance of history in the study of the future. The further back you can pull the bow’s string, he explained, the farther the arrow will travel. Likewise, he believed, the further back you take your historical analysis, the better you can project into the future. Fuller promoted a utopian future based on technologies, provided those technologies followed synergetic design principles and did more with less.

Another major influence was Jim Dator, a professor at the University of Hawaii while I was working on my PhD. He dissuaded his students from the idea of one true future whose probability could be calculated with positivistic certainty, and suggested we use a futures visioning process to envision and develop several alternative scenarios.

Dator has been working on the analysis of four generic future scenarios: Continued Growth, Decline and Collapse, Limits and Discipline, and Transformation. It’s possible to use a standard S-shaped growth curve to visualize these potential paths, as shown below.

Continued Growth extends the current emphasis on economic development, along with its social and environmental implications.

Decline and Collapse suggests a catastrophic turnaround due to natural or human-made disasters. Pollution and changes associated with massive carbon dioxide and methane releases are current concerns as they are linked with massive weather changes influencing droughts, floods, and wildfires.

Limits and Discipline appeals to a society that values precious places, processes, or values that are threatened by the existing economic and social trajectory. In this scenario, it is believed that life should be “disciplined” around a set of fundamental cultural, ideological, scientific, or religious values, including “green” solutions such as recycling or social distancing and mask-wearing in pandemic times.

Finally, a Transformative society anticipates a radical makeover of society based on technological or biological revolutions. A “singularity” of network-connected humans and AI is one projected scenario. The creation of new genetically reconfigured “posthuman” bodies is another vision, perhaps arising from the viral innovations of COVID-19 research. This scenario posits entirely redesigned sets of global economic and political structures.

[Figure: The four alternative futures visualized as branches off an S-shaped growth curve]
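The figure above is an image in the original post; as a stand-in, here is a minimal Python sketch (my own illustration, not Dator’s diagram) of how the four generic futures can be drawn as branches off a standard logistic S-curve:

```python
import numpy as np

# Baseline logistic "S" growth curve, with the four futures branching after t = 7.
t = np.linspace(0, 10, 200)
baseline = 1 / (1 + np.exp(-(t - 5)))

branch = t > 7                                   # point where the futures diverge
continued_growth = np.where(branch, baseline + 0.08 * (t - 7), baseline)
decline_collapse = np.where(branch, baseline - 0.25 * (t - 7), baseline)
limits_discipline = np.where(branch, baseline[t <= 7].max(), baseline)       # plateau
transformation = np.where(branch, baseline + 0.05 * (t - 7) ** 2, baseline)  # new curve

for name, curve in [("Continued Growth", continued_growth),
                    ("Decline and Collapse", decline_collapse),
                    ("Limits and Discipline", limits_discipline),
                    ("Transformation", transformation)]:
    print(f"{name:22s} end value: {curve[-1]:.2f}")
```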

So I was immediately drawn to Four Futures: Life After Capitalism (2016) by Peter Frase when it was recommended by a former classmate. It is an extremely interesting read. His publisher, Verso, advertises it as “an exhilarating exploration into the utopias and dystopias that could develop from present society.”

For Frase, something new is coming, and it’s based on two main drivers: climate change and automation. These issues bring both problems and promises for humanity and will likely result in one of four scenarios:

  • a society of equality and abundance (Communism);
  • a society of hierarchy and abundance (Rentism);
  • a society of equality and scarcity (Socialism); and
  • a society of hierarchy and scarcity (Exterminism).

Or at least we can use these four “ideal types” to think about the future and plan strategies for approximating a preferred one.[1] They set up contrasting visions of the future and provide an analytical structure that yields provocative ideas and insights. Granted, some of these terms are quite charged in contemporary society: Communism is the boogeyman of the right; Exterminism is the great fear of the left.

But this is not your father’s Communism. This is a highly automated future with increased leisure time and a new abundance of resources based on renewable energy. While some think this only comes with great political mobilization and struggle, Frase believes the process will be facilitated by technological change and the institutional responses that come with regulatory adjustments. Those technological changes will also be the catalyst for a stronger democracy.

Rentism is a future that is produced when strong intellectual property (IP) laws persist and dominate over a new era of manufactured commodities produced by 3-D printers and Star Trek-like replicators. We already live in a world economy dominated by supply chains that produce major international flows of royalty payments. Copyrights and patents bestow rents to the owner of these intellectual properties, making them the newly rich and also creating a new era of scarcity. Legalism will proliferate to keep track of all the IP uses, but AI will largely replace lawyers.

Socialism is a future that has not kicked its carbon habits. It thus operates within the limits of hydrocarbon access on the one hand, and the extreme ecological damage it creates on the other. Addressing the resultant environmental issues may create new conditions for democratic governance and distribution.

Exterminism is a future of both scarcity and inequality. Automation has made workers redundant, and environmental damage has made them, in the eyes of the wealthy, dangerous. The rich hide in “enclave societies” behind gates, or perhaps “off-world,” and become increasingly desensitized to the conditions of the poor. Just as tabulating machines, punched cards, and tattooed prisoners enabled the Nazis’ Final Solution, social media and big data technologies are available for identifying those classified as unwanted by a society. Immigrants, refugees, gender deviants, as well as poor people in general, could be easily targeted.

Frase grounds his work in The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies (2014) by Erik Brynjolfsson and Andrew McAfee. The first machine age was made possible by the application of steam power to industrial processes, which led to subsequent innovations in energy and other technologies that changed work, society, and the economy. Brynjolfsson and McAfee argue that the second machine age is based on digital technologies: they are exponential, with processing power continuing to double roughly every two years; digital, producing information with little to no marginal cost when reproduced and shared; and combinatorial, building readily on previous innovations.

Music, for example, can be reproduced and distributed at little cost to smartphones with incredible abilities to provide high-quality sound, produce playlists, and supply lyrics and other artist information. That same device records data, takes and stores pictures, and makes phone calls. The authors call this “bounty”: massive benefits that allow us to do Bucky Fuller’s more with less, like talking or videoconferencing overseas for hours at virtually no cost.

Lastly, the term “spread” refers to the increasing inequality that also results from the widespread adoption of new technology. Automation will continue to eliminate routine jobs and keep wages stagnant in certain areas. Furthermore, networked technologies tend to create winner-take-all markets, and the globally linked stock markets have dramatically increased the wealth of investors.

These digital technologies produce more: more education, more entertainment, more health care, more travel, etc. Still, the future of the social and political institutions that they will produce is yet to be determined. Futures studies is an interesting exercise in thinking about available directions and choices to be made.

Notes

[1] The notion of ideal types comes primarily from Max Weber.


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D., is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

The CDA’s Section 230: How Facebook and other ISPs became Exempt from Third Party Content Liabilities

Posted on | November 26, 2019 | No Comments

“The rewrite of the communications law that emerged by early 1996 was driven by the appetite of the Bell legatees to position themselves as central providers of both content and conduit for the information age.” – Patricia Aufderheide [1]

Facebook and other Internet Service Providers (ISPs) are facing criticism over the legitimacy of the third-party content they carry and over their attempts to manage it. Political advertising has been a major issue, and more recently, so has the handling of President Trump’s tweets.

According to Title V of the Telecommunications Act of 1996, online intermediaries such as ISPs and telcos were legally protected from what users and publishers might do or say. The legislation was passed to enhance these service providers’ ability to monitor and even delete content without becoming publishers. In an emerging age of user-curated and user-generated content, the legislation has specific implications for the provision of news and social media in general.

Specifically, Section 230 of the Communications Decency Act (CDA) stated that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[2] Section 230 also offered protection to bulletin boards and later to bloggers who host comments; bloggers and later vloggers such as YouTube channels were not liable for comments left by readers or tips left via email.[3] Although the CDA’s indecency provisions were struck down by the Supreme Court as too restrictive of free speech, Section 230 survived and continued to shape Internet services.

The Clinton Administration and its appointed Federal Communications Commission (FCC) Chairman worked with the Republican-controlled Congress to pass the Telecom Act of 1996 and its associated Section 230 of the CDA. President William Jefferson Clinton, Vice President Al Gore, and the new FCC Chair, Reed Hundt, drove the policy process intending to enact a new telecommunications re-regulation to help revive the economy with a strategy based on the centrality of information technology. An agreement depended on getting some support from the new Republican Congress.

Republicans grouped around Newt Gingrich, a history professor from Georgia who had been elected to the US House of Representatives in 1978. “Newt” proposed a strategy of extreme partisanship, encouraging a total lack of cooperation with Democrats. He attacked House Speaker Jim Wright on ethics charges involving bribery and unreported book receipts, eventually driving him from office. Often invoking the memory of Ronald Reagan, Gingrich helped move the political landscape significantly to the right.

Gingrich became the chief architect of the infamous “Contract with America,” an attempt to revive the Reagan Revolution. The Contract attempted to legislate the radical agenda of the Republican right, such as cutting back on welfare, forcing a balanced budget, eliminating public television, and phasing out government regulations on the media and telecommunications industries. Newt also wanted to get rid of the FCC, and relax accounting and securities rules on corporations.

On November 8, 1994, the Republicans won the House majority with a flood of new freshman Congressmen, including Sonny Bono, the former Mayor of Palm Springs and the slightly less talented side of the “Sonny and Cher” act. Another new Congressman, Joe Scarborough, later became the host of “Morning Joe,” an MSNBC morning cable news show. With the Republican victory, Gingrich became Speaker of the House, coordinating the legislative agenda and stepping up his divisive efforts.

On the legislative docket was a major reform of the Communications Act of 1934, eagerly pursued by both Democrats and Republicans. Central for Gingrich was abolishing the FCC and rolling back anti-monopoly regulation. Tom DeLay (R-Tex.) worked with some 350 industry lobbyists drafting deregulation bills. Called Project Relief, the secretive group organized campaign contributions for the legislation’s supporters while charting the course for a new era of oligopoly-controlled distribution of media content.

It was the Democrats who took the more moralistic course. Despite Gore’s objection, Senator J. James Exon, a Democrat from Nebraska, inserted the Communications Decency Act, which criminalized offensive content. Section 230 was not part of the original Senate legislation but took shape in negotiations with the House of Representatives, where it had been separately introduced by Congressmen Christopher Cox (R-CA) and Ron Wyden (D-OR). Called the Internet Freedom and Family Empowerment Act, it passed by a near-unanimous vote.

One of the key issues guiding Section 230 goes back to a lawsuit called Stratton Oakmont v. Prodigy. Stratton Oakmont was a brokerage firm, and Prodigy was an online service that ran a bulletin board and offered a number of other services such as news and weather. Stratton Oakmont sued Prodigy because an anonymous participant had tarnished the financial company’s name, and the person who posted the information could not be found. In 1995, the New York Supreme Court held that online service providers could be held liable for the speech of their users; it considered Prodigy a publisher because it had been filtering offensive content.

Congress, with encouragement from the telcos, did not agree, and Title V was introduced to the Senate Committee on Commerce, Science, and Transportation by Senators James Exon (D-NE) and Slade Gorton (R-WA). Title V attempted to regulate both children’s exposure to indecency and obscenity online. It was added to the Telecommunications Act in the Senate on June 15, 1995, by a vote of 81–18.

It effectively immunized both ISPs and Internet users from torts committed by others using their online services. A tort is a civil wrong that unfairly causes someone else to suffer harm or loss, creating legal liability for the person who commits it. The exemption was designed to protect the service provider even if it failed to take action after receiving complaints about harmful or offensive content.

Postscript

Recently, President Trump issued an executive order to limit the power of social media platforms. The action came after Twitter placed fact-checking warnings on two of his tweets about election fraud, in which he claimed, without credible evidence, that voting with mail-in ballots would result in election fraud.

Will the Internet change? One major reform of Section 230 occurred in 2018, when Congress came down on websites that promoted sex trafficking. While the original concerns dealt with TV broadcast and cable companies, Internet platforms like Google, Facebook, and Twitter have become the more powerful media. Issues like privacy and surveillance have also become major concerns. But it is not likely that President Trump can muster the political support in Congress to make significant regulatory changes.

Notes

[1] Aufderheide, Patricia (1998) Communications Policy and the Public Interest: The Telecommunications Act of 1996. NY: Guilford Publications, Inc. p. 37.

[2] (47 U.S.C. § 230). U.S. Code Title 47. TELECOMMUNICATIONS Chapter 5. WIRE OR RADIO COMMUNICATION Subchapter II. COMMON CARRIERS Part I. Common Carrier Regulation Section 230.

[3] Mackey, Aaron, et al. “Section 230 of the Communications Decency Act.” Section 230 of the Communications Decency Act, Electronic Frontier Foundation, www.eff.org/issues/cda230. Accessed November 25, 2019.


© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D., is Professor and Vice Chair of the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

The Spreadsheet that Fueled the Telecom Boom – and Bust

Posted on | October 24, 2019 | No Comments

“I had built a model in an Excel spreadsheet that translated what our sales forecast was into how much traffic we would expect to see,” he says. “And so I just assigned variables for those various parameters, and then said we can set those variables to whatever we think is appropriate.” – Tom Stluka, WorldCom, 1997 [1]

By the mid-1990s, telecommunications infrastructure was at the center of the world’s attention. The Internet and its World Wide Web were taking off. Cable TV began to offer broadband services. Satellite signal power shrank receiving dishes to a few meters. Mobile telephony also showed promise. In the wake of the Telecommunications Act of 1996 and the 1997 World Trade Organization (WTO) agreement on basic telecommunications, investors poured some US$2 trillion into the telecommunications industry.[2]

This post explores the story of how an employee at WorldCom, a major telecommunications company that later became part of Verizon, formulated and propagated a spreadsheet projecting a major growth period for the Internet. The spreadsheet created a media conversation that heavily influenced the flow of investment capital into the telecommunications sector.

I’ve written previously about the impact of the digital spreadsheet on modern society. It has become what I call a techno-epistemological tool that creates meaning and cognitive trajectories of analysis and action. These worksheets combine words, numbers, lists, and tables with quantitative tools and formulas that structure information and suggest decision paths and scenarios. In this case, the spreadsheet that changed the telecommunications environment of the 1990s operated initially within WorldCom; its results then diffused throughout the telecommunications/Internet industry and investment community. The story became a bit of an urban myth, but that only points to its rhetorical value as it circulated through the technologically driven economy of the 1990s “Bull Run” era.

The Bull Run

Interest in telecommunications intensified in the late 1980s with the emergence of contending “information superhighways”. Fiber optic cabling, multi-protocol routers, and ADSL broadband connections promised new services for both traditional cable and telephone companies. Mobile telephony and some data services like Gopher also started to become viable.

The privatization of the Internet in the early 1990s and the spread of the World Wide Web’s hypertext protocols made “dot-com” companies feasible. The 1995 IPO of Netscape, famous for its radical web browser, marked the start of the dramatic “dot-com” investment boom of the bull run. People bought PCs or Macs, hooked them to a modem, dialed into a local ISP, and “surfed the web.”

WorldCom was at the center of that investment boom, but many telecommunications firms benefited. Money also flowed into new companies like Enron, Global Crossing, Tyco, and Winstar, as well as traditional telecommunications companies like AT&T and the “Baby Bells” of the time (Ameritech, Bell Atlantic, BellSouth, NYNEX, Pacific Telesis, Southwestern Bell, US West). WorldCom emerged in the 1990s as a significant growth company as it expanded from a long-distance provider to a major Internet Service Provider (ISP).

WorldCom started with long-distance telephone services after the breakup of AT&T and continued to expand through mergers and acquisitions. It acquired telecom competition pioneer MCI and became the largest ISP in the world after its purchase of backbone provider UUNET. Although WorldCom would end in accounting scandals, bankruptcy, and ruin, with its CEO sent to prison, it inadvertently (or not) sparked the dramatic investment boom in telecommunications.

The Spreadsheet Model/Meme

In 1997, while an employee of WorldCom, Tom Stluka created a “best-case scenario” for the Internet’s growth in an Excel spreadsheet. Stluka was an engineer at UUNET, a major Internet service provider (ISP) taken over by WorldCom in 1996, and he regularly developed estimates of data traffic based on a spreadsheet model he had built.

Stluka’s boss, Kevin Boyne, would often encourage him to increase his forecast. Boyne wanted suppliers of fiber optics and other new telecom equipment to ramp up production so that supplies of glass conduits and routers would be sufficient and prices would be driven lower by abundant supply. Boyne contended that the Internet was doubling in size every 100 days, so Stluka created a spreadsheet that validated these best-case scenarios for the Internet’s growth.
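The arithmetic behind that claim is worth making explicit. Doubling every 100 days compounds to more than a twelvefold increase per year, as the short sketch below shows (my own illustration of the compounding, not Stluka’s actual worksheet):

```python
# What "doubling every 100 days" implies when compounded over a year.
DOUBLING_PERIOD_DAYS = 100

annual_growth_factor = 2 ** (365 / DOUBLING_PERIOD_DAYS)
print(f"Implied annual growth: {annual_growth_factor:.1f}x "
      f"(≈ {100 * (annual_growth_factor - 1):,.0f}% per year)")
# Implied annual growth: about 12.6x, or roughly 1,150% per year

# Projecting traffic forward from a normalized starting level of 1.0:
traffic = 1.0
for year in range(1, 4):
    traffic *= annual_growth_factor
    print(f"Year {year}: {traffic:,.0f}x the starting traffic")
# Roughly 13x, 158x, and 1,979x -- versus the 70-150% annual growth that
# Andrew Odlyzko later estimated for the period after 1996 (see below).
```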

The spreadsheet story was revealed in a CNBC television documentary, “The Big Lie: Inside the Rise and Fraud of WorldCom,” by its news analyst David Faber. Edward Romar and Martin Calkins explained:

    The so-called “big lie” was promoted by citing an internally developed spreadsheet developed by Tom Stluka, a capacity planner at WorldCom, that modeled in Excel format the amount of traffic WorldCom could expect in a best-case scenario of Internet growth. In essence, “Stluka’s model suggested that in the best of all possible worlds Internet traffic would double every 100 days” (Faber, 2003). In working with the model, Stluka simply assigned variables with various parameters to “whatever we think is appropriate.” (David Faber, 2003)[3]

The “doubling meme” became popular in the telecommunications industry to the point where it began to drive investment. In the wake of Alan Greenspan’s “irrational exuberance” comment, the telecommunications industry began forecasting its growth according to this spreadsheet model. Bernie Ebbers at WorldCom soon echoed the forecast, as did AT&T’s new CEO, Michael Armstrong. The proliferation of tech-related magazines such as Red Herring and Wired inspired enthusiasm for the latest tech environment and the Holy Grail of Internet growth. Business news channels such as CNBC and the ill-fated CNNfn also promoted telecom stocks.

The Bubbles Burst

The spring of 2000 saw the end of the “new economy.” A lot of investment money had gone into companies offering dot-com services on the Internet. Every IPO, it seemed, such as Pets.com’s, was met with hordes of cash. The NASDAQ, the online trading environment for technology stocks, reached a high of over 5000 in March, but the next month prices began to fall. By the time the Bush administration settled into its new offices at the White House less than a year later, the index had declined by over 3000 points. The NASDAQ continued to fall while the Dow Jones index of 30 established corporations climbed even higher, surpassing the 11,000 mark again in February 2001.

Research by Andrew Odlyzko, a mathematician who joined the University of Minnesota’s Digital Technology Center and the Minnesota Supercomputing Institute after working at AT&T, challenged the meme.[4]

    To be fair, says Mr Odlyzko, Internet traffic did grow this quickly in 1995 and 1996, when the Internet first went mainstream. But since then, he estimates, annual growth has settled down at around 70-150%, a far cry from the 700-1,500% trumpeted by WorldCom. The myth of 100-day doubling, however, refused to die.[5]

Rival telecom companies such as Global Crossing and Qwest tried to keep up with the contrived projections, leading to the “bad apple” accounting scandals and the telecom crash that rocked the US economy in the years immediately after 9/11. Many companies believed the meme, or at least thought they had to respond accordingly. They soon resorted to “capacity swaps” and other accounting tricks to inflate their reported sales and earnings to compete with what WorldCom was reporting. Capacity swaps are exchanges of telecommunications bandwidth between carriers that were booked as revenue even though no cash changed hands.

The Fall of WorldCom

The telecommunications industry soon went into steep decline. The first revelation came with the meltdown of Enron, an energy company that had embraced the Internet and bought and built 18,000 miles of fiber optic network. One of its schemes was an interesting but futile attempt to create a trading environment for bandwidth, as it had done for natural gas spot contracts. But it was too little, too late, and Enron would soon wind up as the largest bankruptcy in history to that point.

As the WorldCom case would show, many companies started to engage in illegal accounting techniques after the markets faltered. In late June 2002, CNBC reported that WorldCom had been discovered to have accounting irregularities dating back to early 2001. Nearly US$4 billion had been illegally documented as capital expenditures. WorldCom had reported $14.73 billion in 2001 “line cost” expenses instead of the $17.79 billion it should have reported. The result was that it posted US$2.393 billion in 2001 profits instead of what should have been a $662 million loss.
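Those figures hang together arithmetically, as a quick check shows (a sketch of my own, using only the numbers cited above):

```python
# Consistency check: the understated "line cost" expenses roughly equal the
# swing from the reported 2001 profit to the loss WorldCom should have shown.
actual_line_costs = 17.79        # US$ billions, what should have been expensed
reported_line_costs = 14.73      # US$ billions, what was actually reported
reported_profit = 2.393          # US$ billions, reported 2001 result
restated_result = -0.662         # US$ billions, i.e., a $662 million loss

understated_expenses = actual_line_costs - reported_line_costs
profit_swing = reported_profit - restated_result
print(f"Expenses kept off the income statement: ${understated_expenses:.2f}B")
print(f"Swing from reported profit to restated loss: ${profit_swing:.2f}B")
# Both come out near $3.06 billion; the "nearly US$4 billion" total cited above
# presumably includes additional misstated quarters beyond 2001.
```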

Shares dropped quickly. The stock had already fallen from its 1999 high of $64 a share to just over $2, and it soon dropped to mere pennies before trading was halted. Other companies such as Qwest and Tyco further eroded what remained of the general public’s confidence in the stock market, and particularly in its telecommunications companies.[6]

The Telecom Crash

The stock markets continued to decline as new corporate scandals were revealed. The “Dow,” representing mainly the stalwarts of the old economy, maintained its strength during the Bush administration’s early years. It would dip below 10,000 but return intermittently to highs above that mark until the summer of 2002, when the corporate scandals were exposed. Bush’s SEC chief, Harvey Pitt, failed to gain the confidence of investors and eventually resigned.

By July 22, 2002, over $7 trillion of stock value had dissipated. The Wilshire Total Market Index fell from $17.25 trillion on March 24, 2000 to $10.03 trillion on July 18, 2002. Telecommunications services, which had accounted for 7.54% of the Wilshire Total Market Index at the end of March 2000, saw its share fall to only 3.63%. Information technology fell from 36.2% to 15.01%, and even Microsoft saw its market capitalization fall from $581.3 billion to $276.8 billion.
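Translating those percentages into dollar terms makes the scale of the decline clearer (a sketch of my own, using only the figures cited above):

```python
# Convert the sector shares of the Wilshire Total Market Index into dollar values.
total_2000 = 17.25   # US$ trillions, March 24, 2000
total_2002 = 10.03   # US$ trillions, July 18, 2002

sectors = {                        # (share of index in 2000, share in 2002)
    "Telecommunications services": (0.0754, 0.0363),
    "Information technology":      (0.3620, 0.1501),
}
for name, (share_2000, share_2002) in sectors.items():
    before = share_2000 * total_2000
    after = share_2002 * total_2002
    change = 100 * (after - before) / before
    print(f"{name}: ${before:.2f}T -> ${after:.2f}T ({change:.0f}%)")
# Telecommunications services: roughly $1.30T -> $0.36T (about -72%)
# Information technology:      roughly $6.24T -> $1.51T (about -76%)
```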

Finally, Congress passed the Sarbanes-Oxley Act in August 2002. The legislation enacted strict new accounting rules for publicly traded corporations. Accountants, auditors, and corporate officers were required to follow stringent record-keeping requirements, and CEOs had to sign off on their company’s books. The fall in stock prices abated, but only after trillions of investor dollars had been lost from IRAs, mutual funds, individual investments, and pension funds.

Conclusion

My research on spreadsheets mainly focuses on the productive aspects of this transformative technology in the financial and administrative spheres, and not so much on the problems that may occur with its misuse, whether on purpose or by mistake. But the CNBC exposé on WorldCom was extraordinarily interesting in that it showed how the authority bestowed on quantitative and forecasting methods, in this case applied through spreadsheets, can circulate throughout an industry, gather media attention, and take on a life of its own. The “doubling meme,” quantified and justified by the WorldCom spreadsheet, accelerated over-investment in the telecommunications capacity needed for the Internet.[7]

Notes

[1] Quote from Faber, David. “The Rise and Fraud of WorldCom.” NBCNews.com. NBCUniversal News Group, 09 Sept. 2003. Web. 22 June 2017.
[2] BUSINESS WEEK, (2002) “Inside the Telecom Game”. Cover Story, August 5, 2002. Pp. 34-40.
[3] Quote from Romar, Edward J., and Martin Calkins. “WorldCom Case Study Update.” Markkula Center for Applied Ethics, Santa Clara University, www.scu.edu/ethics/focus-areas/business-ethics/resources/worldcom-case-study-update/. The references to Faber,2003 are from the CNBC television expose The Rise and Fraud of WorldCom. 8 September 2003.
[4] Coffman, Kerry, and Andrew Odlyzko. “The Size and Growth Rate of the Internet.” First Monday, A Great Cities Initiative of the University of Illinois at Chicago University Library., 5 Oct. 1998, firstmonday.org/ojs/index.php/fm/article/view/620/541.
[5] Quote from “The Power of WorldCom’s Puff.” The Economist, The Economist Newspaper, 18 July 2002. www.economist.com/special-report/2002/07/18/the-power-of-worldcoms-puff.
[6] An article by John Duchemin about Tyco on the Honolulu Advertiser’s website. It chronicled the travails of the Tyco telecommunications hub in Wai’anae. The 38,000 square foot center went unused when the telecom market collapsed. August 13, 2002.
[7] A good discussion of the doubling meme can be found in The Great Telecom Meltdown by Fred R. Goldstein, p. 72.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

HOW THE US MOBILE INDUSTRY CAME TOGETHER, PART III: JUNK BONDS CONSOLIDATE THE NETWORK

Posted on | October 6, 2019 | No Comments

Prosperity is the sum of financial technology times the sum of human capital plus social capital plus real assets. P = ΣFT (ΣHC + ΣSC + ΣRA)
– Michael Milken

This post is the third on how the mobile industry emerged in the US. The first installment discussed how AT&T ceded the wireless opportunity to the local telephone companies when it was broken up in the early 1980s. The second post discussed how the regulatory dynamics of that period produced a bifurcated market structure, with the FCC dedicating equal radio spectrum to landline RBOCs and to non-wireline bidders. The post below looks at how an alternative funding system emerged to help partially consolidate the industry and eventually return AT&T to telephony dominance via the radio spectrum dedicated to non-wireline mobile.

In 1986, McCaw Cellular approached financier Michael Milken for capital to compete against AT&T with a wireless network. The McCaw family wanted the money to purchase cellular licenses and to buy MCI’s wireless operations. McCaw Cellular Communications was later sold to AT&T in 1994, becoming AT&T Wireless Services and combining the nation’s biggest cellular carrier with the largest long-distance telephone company.

Milken, formerly of the now-defunct investment firm Drexel Burnham Lambert, was one of the most controversial financiers in modern history. He was convicted and jailed in the early 1990s on charges brought by Rudy Giuliani, then US Attorney for the Southern District of New York and later Mayor of New York City. Milken became the poster boy for the financial greed of the Reagan era due to his work with high-yield “junk” bonds. He started with the oil industry and then began to shift his focus toward media and the building of “information highways.”

He raised funds for a variety of new media companies that were taking advantage of new technologies, such as CNN’s satellite news network, TCI’s cable television network, and MCI’s alternative fiber-optic telecommunications system. He and his colleagues piped some $26 billion into the emerging information industries and leading companies such as Cablevision Systems, CNN, MCI, Metromedia, News Corporation, Time Warner Cable, and Viacom.

McCaw Cellular started in 1981, when Craig McCaw came across an AT&T projection about the future of cellular telephony. While it predicted fewer than a million subscribers by the start of the 21st century, McCaw was intrigued, as he knew the value and dynamics of subscribers from his success with cable television. Radio licenses for the cellular spectrum were being sold at less than $5 per “POP” (a member of the population covered by a license). He decided to purchase licenses in some of the largest markets.

By 1982, the Federal Communications Commission (FCC) had begun to recognize the potential of wireless technology. The agency defined Cellular Market Areas (CMAs) and assigned area-based radio licenses, designating 306 metropolitan statistical areas (MSAs) and 428 rural service areas (RSAs) to which radio frequencies could be assigned. The FCC wanted to promote competition, so it gave 20 MHz of the spectrum it had allocated to cellular in each area to each of two market segments. The FCC’s 1981 Report and Order specified that one license would go to the local landline telephone company in each geographical area and the other to interested “non-wireline” companies by lottery. [1]

In January 1985, Pacific Telesis, a West Coast landline RBOC, announced that it wanted to expand its cellphone business; its target was CellNet’s interest in a San Francisco license. A month later, McCaw asked the FCC to block Pacific Telesis, arguing that the big RBOC had no incentive to provide good cellular service since it would compete with its landline services. McCaw also filed suit in the California Supreme Court to block the Pacific Telesis purchase. The lawsuit caused widespread uncertainty about the wireless industry.

One of the first big companies to abandon the cellular industry was MCI. After losing the Los Angeles license, a setback for its plan for a national network, MCI decided to sell its cellular interests. McCaw would lose the lawsuit the next year, but the attempt kept several nonwireline companies out of the wireless business, and in the meantime the uncertainty kept POP prices low.

MCI shunned a McCaw deal at first, thinking the McCaws would not have the money. In August, MCI nearly completed an agreement with American Cellular Communications Corporation (ACCC), but after discovering that BellSouth heavily financed the company, it ended the negotiations. McCaw was back in and soon inked a deal with MCI, but it needed significant financing. In the fall of 1985, Craig McCaw took a health sabbatical while the company searched for capital to complete the MCI deal.

McCaw needed some $225 million to buy parts of MCI's wireless and paging businesses and gain a foothold in the cellular business. In the spring of 1986, Salomon Brothers approached McCaw with a promise to provide the funding but came up painfully short as the MCI deadline approached. After several months of preparatory research, the famed Wall Street firm could raise only $4.5 million.[2] The McCaw executives were beginning to worry that MCI might back out if they failed to deliver payment in time. With POP prices rising again, McCaw needed to secure the deal with MCI and pursue other acquisitions as well. Desperate for the needed capital, McCaw executives visited Michael Milken.

The visit to Drexel was famous for Milken's opening statement: "You guys needed brain surgery and went to a bunch of veterinarians."[3] The king of junk bonds was prepared. He proceeded to recount for the McCaw executives (Craig McCaw was still on his health sabbatical) why Salomon Brothers had approached them for funding and why it had not been successful. He then asked how much they needed. To their reply of $225 million, he responded, "The first thing we need to do is increase the size of the deal. We'll go for $250 million."[4] Hearing of the Milken meeting, Craig McCaw endorsed the new direction wholeheartedly, preferring to take on the debt load rather than give up control and equity in a joint venture. But the funding had to be in place before the July 3rd deadline, just weeks away. Otherwise, MCI could renegotiate the price or, worse, change its mind.

The summer of 1986 was a dynamic one for the wireless industry. Emboldened by the potential Drexel funding, the McCaw team launched a campaign to dramatically expand its cellular holdings, including the purchase of 9 million POPs throughout the Southern states, in cities like Jacksonville and Memphis.
The FCC had also given another 10 MHz to each market segment.[5] Just days before the Drexel bond closing, Southwestern Bell and Metromedia announced a deal at $45 a POP, two and a half times more than McCaw's current contracts. If Milken could not deliver and the MCI deal went into default, MCI's Bill McGowan and company would certainly restructure the deal beyond McCaw's financial capabilities.
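As a rough check on that comparison (reading "two and a half times more" as 2.5x; the implied figure below is my inference, not a number stated in the sources):

\[
\$45 \div 2.5 = \$18 \text{ per POP implied for McCaw's existing contracts}
\]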

On the crucial day, McCaw executives went to MCI headquarters with Milken's assurance that the bond sale had closed successfully the day before. But they still needed to deliver the money, an act that depended on multiple electronic wire transfers. As 3 PM approached, there was still no word from the MCI treasurer. Then, at 3:25 PM, word came that the money had arrived and the deal had closed.

McCaw soon became the number one cellular company in the US. Armed with the Drexel war chest, the McCaw team scoured the country for additional deals. Returning from his sabbatical in September 1986, Craig McCaw put the cable business up for sale and began to focus exclusively on cellular. Taking advantage of their experience with the Counter-Alliance and the Big Monopoly Game, the McCaw team contacted the licensees they had been working with to see if they could buy them out. "By the summer of 1987, just before the IPO, McCaw owned licenses covering 35 million POPs in 94 markets—nearly twice as many POPs as the second biggest nonwireless company, LIN Broadcasting, which had 18 million."[6]

When McCaw Cellular Communications went public in late 1987, the family made a fortune. The decision to use junk bonds was estimated to have saved the owners nearly US$1.2 billion. By taking on debt instead of giving up equity, the McCaws retained not only their ownership but also their control, flexibility, and independence.

Drexel raised nearly $2 billion for McCaw Cellular and in return made approximately $45 million for itself.[7] McCaw was bought by AT&T in September 1994, and the McCaw family wound up as AT&T Wireless Services, Inc.'s biggest owners, with over $2 billion in the company's stock.[8] Craig McCaw and his brothers had amassed a fortune of $6.4 billion by the summer of 1998.[9]

Notes

[1] Hell, R. (1992) Competition in the Cellular Telephone Service Industry. Darby, PA: Diane Publishing Co.
[2] McCaw's choice of Drexel over Salomon Brothers from O. Casey Corr (2000) Money from Thin Air, p. 138. The figure for how much Salomon Brothers raised has been quoted as low as $2 million by James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing, p. 200.
[3] Brain surgery quote from James B. Murray, Jr. (2000) Wireless Nation, p. 201.
[4] $250 million quote from James B. Murray, Jr. (2000) Wireless Nation, p. 201.
[5] Hell, R. (1992) Competition in the Cellular Telephone Service Industry. Darby, PA: Diane Publishing Co.
[6] 35 million POPs quote from James B. Murray, Jr. (2000) Wireless Nation, p. 205.
[7] McCaw's raising of money at Drexel from O. Casey Corr (2000) Money from Thin Air, p. 140.
[8] McCaw family's ownership stake in AT&T from O. Casey Corr (2000) Money from Thin Air, p. 226.
[9] McCaw fortune from Forbes.com, "Craig McCaw – The Wireless Wizard of Oz," 6/22/98. Accessed February 12, 2004. Figures are for 1998, when stock prices were quite high.




Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he taught in the Digital Media MBA program at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

Early Internationalization of the Internet

Posted on | September 18, 2019 | No Comments

A conference was organized in 1972 to bring network engineers and computer scientists from around the world together to discuss the future of data communications. It was held in Washington, DC, and primarily provided a showcase for the ARPANET, the first packet-switched data network. Funded by the Pentagon's Advanced Research Projects Agency (ARPA) and built by BBN in the late 1960s, the ARPANET was struggling with operational costs and becoming something of an albatross for its handlers. Meanwhile, it was attracting the attention of the research community and some telecommunications operators, primarily from Europe, who saw the potential of connecting computers.

In October 1972, the IEEE's First International Conference on Computer Communication (ICCC) began at the Hilton Hotel. Organized by Bob Kahn of BBN and supported by Larry Roberts at ARPA, the conference sparked a major discussion of what the ARPANET could do and where it was heading. A number of ideas were discussed concerning future uses and implementations of the ARPANET, including its integration with other networks around the world. Its objectives were to show off the ARPANET's capabilities and perhaps unload the network onto a research institute or the private sector.

Researchers from many countries eagerly attended the conference. One of the major concerns was voiced by representatives of nations that wanted to implement their own packet-switching networks. French representatives, for example, were planning a packet-switching network called CYCLADES, and the British had their own network, independently designed by the National Physical Laboratory (NPL) in 1971. Even in the US, a group of disgruntled employees had left BBN in July 1972 and formed Packet Communications Incorporated, expressing concerns that BBN was commercializing the technology too slowly.

As at most conferences, graduate students were crucial to its success. Bob Metcalfe, then working on his PhD at Harvard (and the future inventor of Ethernet and founder of 3Com), was assigned the task of compiling a list of uses for the ARPANET. He queried the administrators of the ARPANET, many of whom he knew through his participation in the project. He then wrote a manuscript called Scenarios, which listed 19 things to do with the ARPANET. The list included activities such as Remote Job Entry (RJE) as well as games and the symbolic manipulation of mathematical formulas. Many of these would be demonstrated at the conference.

The ICCC of 1972 was the first major public demonstration of the ARPANET, and Metcalfe was an obvious choice to demonstrate the fledgling computer network at the conference. An IMP (Interface Message Processor) was set up in the Georgetown Ballroom of the Hilton Hotel, and terminals were placed around the room. Kahn had requested participation from the various nodes of the network and the universities that ARPA was funding. Together they included some thirty universities, such as Carnegie Mellon, Harvard, Hawaii, Illinois, MIT, New York University, USC, and Utah, as well as AMES, BBN, MITRE, and RAND. One major objective of the conference was to shop the network to interested private concerns and/or unload the operational aspects of the facilities. ARPA saw its potential as a commercial operation, licensed by the FCC as a specialized common carrier and providing packet-switched data communications to corporate and other clients.

An obvious candidate for taking over the ARPANET was AT&T. Ten executives from AT&T scheduled a meeting with Metcalfe, one that he still recounts with visible anger. Partway into the demonstration, the IMP crashed. The AT&T executives appeared visibly pleased and laughed, reassured that this new technology would be no threat to the largest network in the world. Bob Metcalfe never forgave them. He went on to Hawaii to study the AlohaNet radio packet broadcasting system and later incorporated those ideas into Ethernet at Xerox PARC.

It would be the International Telecommunications Union (ITU) that would play the next important role in the adoption of packet-switching technologies.

To get some perspective on what the Internet has transformed into, view this video by whoishostingthis.com.




Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from http://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    An MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. See the Writing Criteria link at the top of this page for an online APA reference manual.

  • About Me

    Professor and Associate Chair at State University of New York (SUNY) Korea. Recently taught at Hannam University in Daejeon, South Korea. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, media economics, and strategic communications.

    You can reach me at:

    anthony.pennings@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.