Anthony J. Pennings, PhD

WRITINGS ON DIGITAL STRATEGIES, ICT ECONOMICS, AND GLOBAL COMMUNICATIONS

The Spreadsheet that Fueled the Telecom Boom – and Bust

Posted on | October 24, 2019 | No Comments

“I had built a model in an Excel spreadsheet that translated what our sales forecast was into how much traffic we would expect to see,” he says. “And so I just assigned variables for those various parameters, and then said we can set those variables to whatever we think is appropriate.” – Tom Stluka, WorldCom, 1997 [1]

By the mid 1990s, telecommunications infrastructure was at the center of the world’s attention. The Internet and its World Wide Web were taking off. Cable TV began to offer broadband services. Satellite signal power shrank dishes to a few meters. Mobile telephony also showed promise. In the wake of the Telecommunications Act of 1996 and the 1997 meeting of the World Trade Organization (WTO), investors poured some US$2 trillion into the telecommunications industry.[2]

This post explores the story of how an employee at WorldCom, a major telecommunications company that later became part of Verizon, formulated and propagated a spreadsheet that projected a major growth period for the Internet. The projection created a media conversation that heavily influenced the flow of investment capital into the telecommunications sector.

I’ve written previously about the impact of the digital spreadsheet on modern society. It has become what I call a techno-epistemological tool that creates meaning and cognitive trajectories of analysis and action. These worksheets combine words, numbers, lists, and tables with quantitative tools and formulas that structure information and suggest decision paths and scenarios. The spreadsheet that changed the telecommunications environment of the 1990s operated initially within WorldCom. Then it produced results that diffused throughout the telecommunications/Internet industry and investment community. The story became a bit of an urban myth, but that only points to its rhetorical value as it circulated through the technologically driven economy of the 1990s “Bull Run” era.

The Bull Run

Interest in telecommunications intensified in the late 1980s with the emergence of contending “information superhighways”. Fiber optic cabling, multi-protocol routers, and ADSL broadband connections promised new services for both traditional cable and telephone companies. Mobile telephony and some data services like Gopher also started to become viable.

The invention of the World Wide Web’s hypertext protocols and the privatization of the Internet in the early 1990s made “dot.com” companies feasible. The IPO of Netscape, famous for its radical web browser, marked the start of the dramatic “dotcom” investment boom of the bull run. People bought PCs or Macs, hooked up a modem, dialed into a local ISP, and “surfed the web.”

WorldCom was at the center of that investment boom, but many telecommunications firms benefited. Money also flowed into new companies like Enron, Global Crossing, Tyco, and Winstar, as well as traditional telecommunications companies like AT&T and the “Baby Bells” of the time (Ameritech, Bell Atlantic, BellSouth, NYNEX, Pacific Telesis, Southwestern Bell, US West). WorldCom emerged in the 1990s as a significant growth company as it expanded from a long-distance provider to a major Internet Service Provider (ISP).

WorldCom started with long-distance telephone services after the breakup of AT&T but continued to expand through mergers and acquisitions. It acquired telecom competition pioneer MCI and became the largest ISP in the world after its purchase of backbone provider UUNET. Although WorldCom would end in accounting scandals, bankruptcy, and ruin, with its CEO sent to prison, it inadvertently (or not) sparked the dramatic investment boom in telecommunications.

The Spreadsheet Model/Meme

In 1997, Tom Stluka, an engineer at UUNET, a popular Internet service provider (ISP) that had been taken over by WorldCom in 1996, created a “best-case scenario” for the Internet’s growth in an Excel spreadsheet. As a capacity planner, he regularly developed estimates for data traffic based on a spreadsheet model he had built.

Stluka’s boss, Kevin Boyne, would often encourage him to increase his forecasts. Boyne wanted suppliers of fiber optics and other new telecom equipment to increase their production so that supplies of the glass conduits and routers would be sufficient, and prices driven lower by an abundance of supply. Boyne contended that the Internet was doubling in size every 100 days, so Stluka created a spreadsheet that validated these best-case scenarios for the Internet’s growth.

The spreadsheet story was revealed in a CNBC television news show, “The Big Lie: Inside the Rise and Fraud of WorldCom,” by its news analyst David Faber. Edward Romar and Martin Calkins explained:

    The so-called “big lie” was promoted by citing an internally developed spreadsheet developed by Tom Stluka, a capacity planner at WorldCom, that modeled in Excel format the amount of traffic WorldCom could expect in a best-case scenario of Internet growth. In essence, “Stluka’s model suggested that in the best of all possible worlds Internet traffic would double every 100 days” (Faber, 2003). In working with the model, Stluka simply assigned variables with various parameters to “whatever we think is appropriate.” (David Faber, 2003)[3]

The “doubling meme” became popular in the telecommunications industry to the point where it began to drive investment. In the wake of Alan Greenspan’s “irrational exuberance” comment, the telecommunications industry began forecasting its growth according to this spreadsheet model. Bernie Ebbers at WorldCom soon echoed the forecast, as did AT&T’s new CEO, Michael Armstrong. The proliferation of tech-related magazines such as Red Herring and Wired inspired enthusiasm for the latest tech environment and the Holy Grail of Internet growth. Business news channels such as CNBC and the ill-fated CNNfn also promoted telecom stocks.

The Bubbles Burst

The spring of 2000 saw the end of the “new economy.” A lot of investment money had gone into companies offering dot.com services on the Internet. Every IPO, it seemed, such as Pets.com’s, was met with hordes of cash. The NASDAQ, the online trading environment for technology stocks, reached a high of over 5000 in March, but the next month prices began to fall. By the time the Bush administration settled into its new offices at the White House less than a year later, the index had declined by over 3000 points. The NASDAQ continued to fall even as the Dow Jones index of 30 established corporations climbed higher, surpassing the 11,000 mark again in February 2001.

Research by Andrew Odlyzko, a mathematician who moved to the University of Minnesota’s Digital Technology Center and the Minnesota Supercomputing Institute after working at AT&T, challenged the meme.[4]

    To be fair, says Mr Odlyzko, Internet traffic did grow this quickly in 1995 and 1996, when the Internet first went mainstream. But since then, he estimates, annual growth has settled down at around 70-150%, a far cry from the 700-1,500% trumpeted by WorldCom. The myth of 100-day doubling, however, refused to die.[5]
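
The gap between the meme and Odlyzko’s estimate comes down to compound-growth arithmetic, which is easy to check. A minimal sketch (the function and figures here are illustrative, not taken from Stluka’s actual worksheet):

```python
import math

# Compound growth: the traffic multiple after `days` days,
# given a fixed doubling period.
def growth_multiple(days: float, doubling_days: float) -> float:
    return 2 ** (days / doubling_days)

# "Doubling every 100 days" compounds to roughly a 12.6x annual multiple,
# i.e. about 1,150% growth -- squarely within the "700-1,500%" range the
# Economist attributes to WorldCom's claim.
annual = growth_multiple(365, 100)
print(f"Annual multiple at 100-day doubling: {annual:.1f}x "
      f"({(annual - 1) * 100:.0f}% growth)")

# Odlyzko's 70-150% annual growth implies a much slower doubling period.
for pct in (70, 150):
    doubling = 365 * math.log(2) / math.log(1 + pct / 100)
    print(f"{pct}% annual growth doubles traffic every {doubling:.0f} days")
```

At 70-150% annual growth, traffic doubles roughly every nine to sixteen months rather than every hundred days, which is why the two forecasts diverged so dramatically once compounded over several years.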

Rival telecom companies such as Global Crossing and Qwest tried to adjust to the contrived projections, leading to the “Bad Apple” accounting scandals and the telecom crash that rocked the US economy in the years immediately after 9/11. Many companies believed the meme, or at least thought that they had to respond accordingly. They soon resorted to “capacity swaps” and other accounting tricks to boost their sales numbers and inflate earnings to compete with what WorldCom was reporting. Capacity swaps are exchanges of telecommunications bandwidth between carriers that were booked as revenue even though no money changed hands.

The Fall of WorldCom

The telecommunications industry soon went into steep decline. The first revelation came with the meltdown of Enron, an energy company that had embraced the Internet and bought and built 18,000 miles of fiber optic network. One of its schemes was an interesting but futile attempt to create a trading environment for bandwidth as it had for natural gas spot contracts. But it was too little, too late, and Enron would soon wind up as the largest bankruptcy in US history to that point.

As the WorldCom case would show, many companies started to engage in illegal accounting techniques after the markets faltered. In late June 2002, CNBC reported that WorldCom had been discovered to have accounting irregularities dating back to early 2001. Nearly US$4 billion had been illegally documented as capital expenditures. WorldCom had registered $14.73 billion in 2001 “line cost” expenses instead of the $17.79 billion it should have reported. The result was that it reported US$2.393 billion in 2001 profits instead of showing what should have been a $662 million loss.

Shares dropped quickly. Although the stock had already fallen from its 1999 high of $64 a share to just over $2, it soon dropped to mere pennies before trading was halted. Other companies such as Qwest and Tyco further eroded what remained of the general public’s confidence in the stock market, and particularly in its telecommunications companies.[6]

The Telecom Crash

The stock markets continued to decline as new corporate scandals were revealed. The “Dow,” representing mainly the stalwarts of the old economy, maintained its strength during the Bush administration’s early years. It would dip below 10,000 but return intermittently to highs above that mark until the summer of 2002, when the corporate scandals were exposed. Bush’s SEC chief, Harvey Pitt, failed to gain the confidence of investors and eventually resigned.

By July 22, 2002, over $7 trillion of stock value had dissipated. The Wilshire Total Market Index fell from $17.25 trillion on March 24, 2000 to $10.03 trillion on July 18, 2002. Telecommunications services, which had accounted for 7.54% of the Wilshire Total Market Index at the end of March 2000, saw its share fall to only 3.63%. Information technology fell from 36.2% to 15.01%, and even Microsoft saw its market capitalization fall from $581.3 billion to $276.8 billion.

Finally, Congress passed the Sarbanes-Oxley Act in August 2002. The legislation enacted strict new accounting rules for publicly traded corporations. Accountants, auditors, and corporate officers were required to follow stringent record-keeping requirements, and CEOs had to sign off on their company’s books. The fall in stock prices abated, but at a cost of trillions of investor dollars from IRAs, mutual funds, individual investments, and pension funds.

Conclusion

My research on spreadsheets mainly focuses on the productive aspects of this transformative technology in the financial and administrative spheres, and not so much on the problems that may occur with their misuse, whether on purpose or by mistake. But the CNBC exposé on WorldCom was extraordinarily interesting in that it showed how quantitative forecasts, in this case produced with spreadsheets, can circulate throughout an industry, gather media attention, and take on a life of their own. The “doubling meme,” quantified and justified by the WorldCom spreadsheet, accelerated over-investment in the telecommunications capacity needed for the Internet.[7]

Notes

[1] Quote from Faber, David. “The Rise and Fraud of WorldCom.” NBCNews.com. NBCUniversal News Group, 09 Sept. 2003. Web. 22 June 2017.
[2] Business Week (2002) “Inside the Telecom Game.” Cover story, August 5, 2002, pp. 34-40.
[3] Quote from Romar, Edward J., and Martin Calkins. “WorldCom Case Study Update.” Markkula Center for Applied Ethics, Santa Clara University, www.scu.edu/ethics/focus-areas/business-ethics/resources/worldcom-case-study-update/. The references to Faber, 2003 are from the CNBC television exposé The Rise and Fraud of WorldCom, 8 September 2003.
[4] Coffman, Kerry, and Andrew Odlyzko. “The Size and Growth Rate of the Internet.” First Monday, A Great Cities Initiative of the University of Illinois at Chicago University Library., 5 Oct. 1998, firstmonday.org/ojs/index.php/fm/article/view/620/541.
[5] Quote from “The Power of WorldCom’s Puff.” The Economist, The Economist Newspaper, 18 July 2002. www.economist.com/special-report/2002/07/18/the-power-of-worldcoms-puff.
[6] An article by John Duchemin about Tyco on the Honolulu Advertiser’s website chronicled the travails of the Tyco telecommunications hub in Wai’anae. The 38,000 square foot center went unused when the telecom market collapsed. August 13, 2002.
[7] A good discussion of the doubling meme can be found in The Great Telecom Meltdown by Fred R. Goldstein, p. 72.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

HOW THE US MOBILE INDUSTRY CAME TOGETHER, PART III: JUNK BONDS CONSOLIDATE THE NETWORK

Posted on | October 6, 2019 | No Comments

Prosperity is the sum of financial technology times the sum of human capital plus social capital plus real assets. P=ΣEFT (DHC+ESC+ERA)
– Michael Milken

This post is the third on how the mobile industry emerged in the US. The first installment discussed how AT&T ceded the wireless opportunity to the local telephone companies when it was broken up in the early 1980s. The second post discussed how the regulatory dynamics of that time resulted in a bifurcated market structure when the FCC dedicated equal radio spectrum to landline RBOCs and non-landline applicants. The post below looks at how an alternative funding system emerged to help partially consolidate the industry and return AT&T to telephony dominance via the radio spectrum dedicated to non-wireline mobile.

In 1986, McCaw Cellular approached financier Michael Milken for capital to compete against AT&T with a wireless network. The McCaw family wanted the money to purchase cellular licenses and to buy MCI Wireless for nearly $2 billion. McCaw Cellular Communications was later sold to AT&T Wireless Services in 1994, combining the nation’s biggest cellular carrier with the largest long-distance telephone company.

Milken, formerly of the defunct investment firm Drexel Burnham Lambert, was one of the most controversial financiers in modern history. He was convicted and jailed in the early 1990s on charges brought by Rudy Giuliani, then US Attorney for the Southern District of New York and later Mayor of New York City. Milken became the poster boy for the financial greed of the Reagan era due to his work with high-yield “junk” bonds. Milken started with the oil industry, then began to shift his focus towards media and the building of “information highways.”

Milken raised funds for a variety of new media companies that were taking advantage of new technologies, such as CNN’s satellite news network, TCI’s cable television network, and MCI’s alternative fiber optics-based telecommunications system. He and his colleagues piped some $26 billion into the emerging information industries and their leading companies, such as Cablevision Systems, CNN, MCI, Metromedia, News Corporation, Time Warner Cable, and Viacom.

McCaw Cellular started in 1981 when Craig McCaw came across an AT&T projection about the future of cellular telephony. While it predicted less than a million subscribers by the start of the 21st century, McCaw was intrigued, as he knew the value and dynamics of subscribers from his success with cable television. Radio licenses for the cellular spectrum were being sold at less than $5 per “POP,” a measure of the population covered by a license. He decided to purchase licenses in some of the largest markets.

By 1982, the Federal Communications Commission (FCC) began to recognize the potential of wireless technology. The federal agency began to define Cellular Market Areas (CMAs) and assign area-based radio licenses. It determined 306 metropolitan statistical areas (MSAs) and 428 rural service areas (RSAs) that could be assigned radio frequencies. The FCC wanted to promote competition, so it gave 20 MHz of the radio spectrum it had allocated to cellular in each area to each of two market segments. The FCC’s 1981 Report and Order specified that half would go to the local landline telephone company in each geographical area and the other half to interested “nonwireline” licensee companies by lottery. [1]

In January of 1985, Pacific Telesis, a West Coast landline RBOC, announced that it wanted to expand its cellphone business. Its target was CellNet’s interest in a San Francisco license. A month later, McCaw asked the FCC to block Pacific Telesis, arguing that the big RBOC had no incentive to provide good cellular service since it would compete with its landline services. It also filed suit in the California Supreme Court to block the Pacific Telesis purchase. The lawsuit caused widespread uncertainty about the wireless industry.

One of the first big companies to abandon the cellular industry was MCI. After losing the Los Angeles license, and consequently seeing its plan for a national network set back, MCI decided to sell its cellular interests. McCaw would lose the lawsuit the next year, but the attempt would keep several nonwireless companies out of the wireless business, and in the meantime the uncertainty kept POP prices cheap.

MCI shunned a McCaw deal at first, thinking McCaw would not have the money. In August, MCI nearly completed an agreement with American Cellular Communications Corporation (ACCC), but after discovering that BellSouth heavily financed the company, it ended the negotiations. McCaw was back in and soon inked a deal with MCI, but it needed significant financing. In the fall of 1985, Craig McCaw took a health sabbatical while the company searched for capital to complete the MCI deal.

McCaw needed some $225 million to buy parts of MCI’s wireless and paging businesses and gain a stronghold in the cellular business. In the spring of 1986, Salomon Brothers approached McCaw with the promise of providing funding, but came up painfully short as the MCI deadline approached. After several months of preparatory research, the famed Wall Street financial operator could only raise $4.5 million.[2] The McCaw executives were beginning to worry that MCI might back out if they failed to deliver payment in time. With POP prices rising again, McCaw needed to secure the deal with MCI and pursue other acquisitions as well. Desperate for the needed capital, McCaw executives visited Michael Milken.

The visit to Drexel was famous for Milken’s opening statement, “You guys needed brain surgery and went to a bunch of veterinarians.”[3] The king of junk bonds was prepared. He proceeded to recount for the McCaw executives (Craig McCaw was on a health sabbatical) why Salomon Brothers had approached them for funding and why they had not been successful. He then asked how much they needed. To their reply of $225 million, he responded, “The first thing we need to do is increase the size of the deal. We’ll go for $250 million.”[4] Hearing of the Milken meeting, Craig McCaw endorsed the new direction wholeheartedly, preferring to take on the debt load rather than giving up control and equity in a joint venture. But the funding had to be in place before the July 3rd deadline, just weeks away. Otherwise, MCI could renegotiate the price or, worse, change its mind.

The summer of 1986 was a dynamic one for the wireless industry. Emboldened by the potential Drexel funding, the McCaw team launched a campaign to expand their cellular holdings dramatically. This included buying 9 million POPs throughout the Southern states in cities like Jacksonville and Memphis.
The FCC had also given another 10 MHz to each market segment.[5] Just days before the Drexel bond closing, Southwestern Bell and Metromedia announced a deal at $45 a POP, two and a half times more than McCaw’s current contracts. If Milken could not deliver and the MCI deal went into default, McGowan and company would certainly restructure the deal beyond McCaw’s financial capabilities.

On the crucial day, McCaw executives went to MCI headquarters with Milken’s assurance that the bond sale had closed successfully the day before. But they still needed to deliver the money, an act that depended on multiple electronic wire transfers. As 3 PM approached, there was still no word from the MCI treasurer. Then at 3:25 PM the word came: the money had arrived, and the deal had closed.

McCaw was soon propelled into position as the number one cellular company in the US. Armed with the Drexel war chest, the McCaw team scoured the country for additional deals. Returning from his sabbatical in September 1986, Craig McCaw put the cable business up for sale and began to focus exclusively on cellular. Taking advantage of their experience with the Counter-Alliance and the Big Monopoly Game, they contacted the licensees they had been working with to see if they could buy them out. “By the summer of 1987, just before the IPO, McCaw owned licenses covering 35 million POPs in 94 markets—nearly twice as many POPs as the second biggest nonwireless company, LIN Broadcasting, which had 18 million.”[6]

When McCaw Cellular Communications went public in late 1987, the owners made a fortune. The decision to use junk bonds was estimated to have saved them nearly US$1.2 billion. By going with debt instead of dispersing equity, the McCaws retained not only their ownership but also their control, flexibility, and independence.

Drexel raised nearly $2 billion for McCaw Cellular, and in return made approximately $45 million for itself.[7] McCaw was bought by AT&T in September 1994, and the McCaw family wound up as AT&T Wireless Services, Inc.’s biggest owners, with over $2 billion in the company’s stock.[8] Craig McCaw and his brothers amassed a fortune of $6.4 billion by the summer of 1998.[9]

Notes

[1] Hell, R. (1992) Competition in the Cellular Telephone Service Industry. Diane Publishing Co, Darby, PA.
[2] McCaw’s choice of Drexel over Salomon Brothers from O. Casey Corr’s (2000) Money from Thin Air, p. 138. The figure for how much Salomon Brothers raised has been quoted as low as $2 million by James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing, p. 200.
[3] Brain surgery quote from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. p.201.
[4] $250 million quote from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. p.201.
[5] Hell, R. (1992) Competition in the Cellular Telephone Service Industry. Diane Publishing Co, Darby, PA.
[6] 35 million POPs quote from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. p.205.
[7] McCaw raising of money at Drexel from O. Casey Corr’s (2000) Money from Thin Air. p. 140.
[8] McCaw ownership of ATT from O. Casey Corr’s (2000) Money from Thin Air. p. 226.
[9] McCaw fortune from FORBES.COM, “Craig McCaw – The Wireless Wizard of Oz”. 6/22/98. Accessed on February 12, 2004. Figures are for 1998 when the prices of stocks were quite high.





Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012 he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

Early Internationalization of the Internet

Posted on | September 18, 2019 | No Comments

A conference was organized in 1972 to bring network engineers and computer scientists from around the world together to discuss the future of data communications. It was held in Washington, DC, and primarily provided a showcase for the ARPANET, the first packet-switched data network. Funded by the Pentagon’s Advanced Research Projects Agency (ARPA) and built by BBN in the late 1960s, the ARPANET was struggling with operational costs and was becoming somewhat of an albatross for its handlers. Meanwhile, it was attracting the attention of the research community and some telecommunications operators, primarily from Europe, who saw the potential of connecting computers.

In October 1972, the first International Conference on Computer Communication (ICCC) began at the Hilton Hotel. Organized by Bob Kahn of BBN and supported by Larry Roberts at ARPA, the conference sparked a major discussion of what the ARPANET could do and where it was heading. A number of ideas were discussed concerning future uses and implementations of the ARPANET, including its integration with other networks around the world. Its objectives were to show off the ARPANET’s capabilities and perhaps unload the network onto a research institute or the private sector.

Researchers from many countries eagerly attended the conference. One of the major concerns was voiced by representatives of nations that wanted to implement their own packet-switching networks. French representatives, for example, were planning a packet-switching network called CYCLADES, and the British had their own network, independently designed by the National Physical Laboratory (NPL) in 1971. Even in the US, a group of disgruntled employees had left BBN in July 1972 and formed Packet Communications Incorporated, expressing concerns that BBN was commercializing too slowly.

As at most conferences, graduate students were crucial to its success. Bob Metcalfe, then working on his PhD at Harvard (and the future inventor of Ethernet and founder of 3Com), was assigned the task of compiling a list of uses for the ARPANET. He queried the administrators of ARPANET, many of whom he knew because of his participation in the project. He then wrote a manuscript called Scenarios, which listed 19 things to do with the ARPANET. The list included activities such as Remote Job Entry (RJE) as well as games and the symbolic manipulation of mathematical formulas. Many of these would be demonstrated at the conference.

The ICCC of 1972 was the first major demonstration of the ARPANET, and Metcalfe was an obvious choice to demonstrate the fledgling computer network at the conference. An IMP was set up in the Georgetown Ballroom of the Hilton Hotel, and terminals were set up around the room. Kahn had requested participation from the various nodes of the network and the universities ARPA was funding. Together they included some thirty institutions such as Carnegie Mellon, Harvard, Hawaii, Illinois, MIT, New York University, USC, and Utah, as well as AMES, BBN, MITRE, and RAND. One major objective of the conference was to shop the network to interested private concerns and/or unload the operational aspects of the facilities. They saw its potential as a commercial operation licensed by the FCC as a specialized common carrier, providing packet-switched data communications to corporate and other clients.

An obvious candidate for taking over the ARPANET was AT&T. Ten executives from AT&T scheduled a meeting with Metcalfe that he recounts with visible anger. Partway into the demonstration, the IMP crashed. The AT&T executives appeared visibly pleased and laughed, reassured that this new technology would be no threat to the largest network in the world. Bob Metcalfe never forgave them. He went on to Hawaii to study the AlohaNet radio packet broadcasting system and then incorporated those ideas into Ethernet at Xerox PARC.

It was the International Telecommunication Union (ITU) that would play the next important role in the adoption of packet-switching technologies.

To get some perspective on what the Internet has transformed into, view this video by whoishostingthis.com.






How the US Mobile Industry Came Together, Part II: The Grand Alliance

Posted on | July 21, 2019 | No Comments

Much like J.P. Morgan’s creation of telephone industry giant AT&T, the wireless industry was one of the information industries (along with cable TV, fiber optics and satellite TV) created mainly through the efforts of financier Michael Milken of the defunct investment firm Drexel Burnham Lambert. Milken helped the McCaw family assemble the wireless industry during the 1980s by raising money to purchase cellular licenses and to buy MCI Wireless for nearly $2 billion. McCaw Cellular Communications was later sold to AT&T Wireless Services in 1994, combining the nation’s biggest cellular carrier with the largest long-distance telephone company.

In a previous post, I explained how the wireless “mobile phone” business emerged about the same time as the breakup of AT&T, which had been under new anti-trust scrutiny since the Department of Justice (DOJ) filed suit in 1974. A year before, Motorola made the first mobile phone call and later in 1977, the FCC permitted them to build experimental systems in two major cities. In 1980 an AT&T report estimated that the market for mobile phones in 2000 would be 900,000 subscribers (The actual number of mobile phone subscribers in 2000 was 109.5 million). Consequently, when AT&T was forced to break up in 1982, it ceded the wireless business to the local phone companies. It was a significant strategic mistake that would cost them billions of dollars in the future.

Despite the limited market forecast, the FCC felt obliged to assign spectrum to the new mobile service. In May of 1981, it divided the 40 MHz allocated to cellular into two segments in an attempt to create duopolistic competition. One segment would go to the local telephone company in each market, and the other would be open to non-telephone companies that wanted it.[1] This way, the phone companies could improve the service while the market could be “democratized.” If more than one company applied for a license, an administrative judge would review each company’s application, judging the merit of its plans and its technical capability to provide the service coverage. On June 7, 1982, some 190 applicants applied for FCC licenses to provide cellular service in America’s 30 largest cities. The applicants included a wide range of companies looking to make a play in the mobile industry, from the upstart McCaw Company to the former telegraph monolith Western Union. One of the most dramatic technological roll-outs of the 20th century was beginning, but not without an additional twist in the method.

The year 1984 was a significant one for the development of the technological infrastructure in the US. Apple introduced the Macintosh computer; IBM bought Rolm, a pioneer of digital office technology; and AT&T was divided into a long-distance company, a manufacturing entity, and an assortment of regional local phone companies. 1984 was also the time the mobile phone industry began a series of dramatic changes. The telephone companies were starting to roll out their cellular services, and on April 10th Bell Atlantic made its first mobile call to comedian Bob Hope.

Progress also continued for the non-wire companies when, in February of 1984, the "Grand Alliance" was born. This group consisted of the non-RBOC companies vying for the alternative cellular market. CellNet, Cox Cable, LIN, MCI, Metromedia, Metro Mobile, The Washington Post Company, and Western Union signed several agreements to share or merge markets. They also made arrangements for rounds II and III of the FCC spectrum distribution.[2] The rationale for the Alliance was the idea of "cumulative chances": the companies agreed to merge applications in a given market to increase the odds of one company gaining control of a specific area and competing successfully against the local phone company.

In April, however, the FCC decided to allocate markets 31 through 90 via lottery, even digging up the same Ping-Pong drum used to select draftees during the Vietnam War.[3] Sown into the decision, though, were the seeds of its own destruction. The FCC included three rules that would ultimately create chaos and lead to the elevation of McCaw. The government ruled that: 1) applicants could file duplicates of their applications; 2) they would not have to tie up capital; and 3) the second- through tenth-place winners would be drawn at the same time. This last decision was made so the lottery would not have to be redone if the first choice became ineligible for some reason.[4] As a result, a new strategy began to form based on the notion of buying cumulative chances, and McCaw got the process started.

McCaw's business was spread quite evenly among cable, cellular, and paging, but by the summer of 1984, the company began aggressively going after mobile. It took a risk and bought Knight-Ridder's cellular applications for $1 million, with no assurance from the FCC that the agency wouldn't declare the previous applications ineligible and nix the idea of buying license applications in general. But when Round IV for the 91st to 120th largest markets began, over 5,100 applications arrived. Richard L. Vega and Associates had begun producing applications for $1,500 each, whereas applications for Round I had cost about $300,000. The applications flooded the FCC's headquarters and threw it into chaos. As a result, the FCC announced in late August that it would speed up the process and hold its Round II lottery on October 3. Shocked by the announcement, Telocator, the DC-based trade association for the cellular industry, formed the Counter-Alliance. Within a month, some 150 different businesses representing 700 various applications agreed to merge into 60 partnerships and take the process out of the FCC's hands. McCaw aide John Stanton headed the Counter-Alliance and skillfully brought together the smallest of the FCC applicants with the promise of working out their differences and combining their "cumulative chances" into controlling interests.[5]

What would transpire in a New York City conference room is probably the most unheralded major telecommunications event of the 20th century. The Counter-Alliance put the smaller players back in the game, but it also allowed McCaw to develop relationships that proved useful later for acquisitions. With Stanton reaching out to small license holders across the nation, McCaw was doing valuable research for future acquisitions. But that future was still unclear. In September, a "trading room" was formed at the Rubin Baum law offices in NYC to work out what was called "Le Grand Deal."[6] Representatives from Rounds II and III met in a conference room on Fifth Avenue to trade license applications. Since the lottery would draw the first ten applications, a company could trade away its applications in markets it did not want in order to accumulate chances in a single market it did want to control, say Austin, Texas.

One crucial issue was equivalence. What would be the metric of value? How would the traders determine the value of each market? Their answer was the "POP," a value derived from the 1980 population census. The traders worked out a formula that divided the population of an area by the number of license applicants. For example, Orlando, Florida, had a population of 700,000 people, and thirteen companies had applied for the FCC license in that area, so each application was allocated roughly 55,000 POPs. A value of $3 per POP was used, based on the McCaw-Knight-Ridder deal. Each company could then trade POPs for POPs.[7]
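The traders' POP arithmetic can be sketched in a few lines of Python. This is just an illustration using the Orlando figures above; note that 700,000 divided by 13 is closer to 53,846, which the traders evidently rounded up to about 55,000.

```python
def pop_value(population, applicants, price_per_pop=3.0):
    """Value a single license application in a market using the traders'
    POP formula: split the market's census population evenly among the
    applicants, then price each POP at the agreed benchmark."""
    pops_per_application = population / applicants
    dollar_value = pops_per_application * price_per_pop
    return pops_per_application, dollar_value

# Orlando, Florida: 700,000 people, 13 applicants, $3 per POP
pops, value = pop_value(700_000, 13)
# pops comes to about 53,846 POPs per application (~55,000 as rounded
# in practice), worth roughly $161,500 at $3 per POP
```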

In the next post, I will discuss how McCaw Cellular became the dominant alternate cellular network by utilizing a new financial tool, the junk bond.

Notes

[1] While AT&T was the dominant phone company, GTE had a significant share and small companies such as the Warwick Phone Company in Warwick, New York also provided service.
[2] Alliance formation from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. p. 77.
[3] Sale of McCaw Cable from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. pp. 146-147.
[4] FCC three rules from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. pp. 81-82.
[5] Counter Alliance from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. p. 92.
[6] from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. p. 92.
[7] Pop figures from James B. Murray, Jr. (2000) Wireless Nation: The Frenzied Launch of the Cellular Revolution in America. Cambridge, MA: Perseus Publishing. pp. 200-205.

© ALL RIGHTS RESERVED



Anthony J. Pennings, Ph.D. is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas, where he has taught in the Digital Media MBA program at St. Edward's University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.

How the US Mobile Industry Came Together, Part I: Transformation of the Network

Posted on | June 27, 2019 | No Comments

The ubiquity, ease, and sophistication of mobile "smartphone" services have proven to be an extraordinarily popular addition to modern social and productive life. We are now on the cusp of the fifth-generation (5G) rollout of wireless services. It is difficult to gauge the characteristics of social change in the midst of such dramatic technological change, but mobile services have arguably extended our sense of self, allowed us to reintegrate caring (and some annoying) relationships, and provided new ways to create and to learn.

This post addresses the transformation of the telecommunications industry and its expansion into mobile services. The transformation was initiated by a series of actions that led to the breakup of the AT&T monopoly and opened the environment for the major business, regulatory, and technological disruptions that led to the proliferation of smartphones and other portable devices. The next post will examine the emergence of the wireless industry, which grew rather independently of the telcos, until the 1990s anyway. I delved into my notes for the explanation below of the "breakup" of AT&T and how it opened up telecommunications to new innovations, including wireless.

In 1980, IBM officially went into competition with AT&T when it offered its Satellite Business Systems (SBS) service, a partnership with Aetna Life & Casualty and Comsat. SBS was set up to provide "intranet" telecommunications services for banks and other businesses after the FCC deregulated satellite services in the early 1970s. Using new Ku-band technology, SBS services such as data, facsimile, electronic mail, telephony, and video conferencing could be transmitted directly to smaller antennas at the user's premises. This meant that they could effectively bypass AT&T's network and provide a type of direct broadcasting service, except with two-way capabilities. As Jill Hills observed: "It seems to have been the entry of IBM via its SBS consortium into direct competition that altered AT&T's attitude from formal toleration of competition to outright aggression in defense of its monopoly."[1] The Consent Decree of 1956 had restricted AT&T from computer activities, and now it saw IBM entering a potentially very lucrative market. With some US$1.26 billion invested in a competing satellite communications system that could circumvent Ma Bell's existing facilities, SBS got the full attention of AT&T.[2]

Consequently, in the early 1980s, AT&T looked to the political realm for protection. "Action switched from the FCC to Congress as AT&T under its chairman, John deButts, set about raising political consciousness of the FCC decisions."[3] However, members of Congress did not rush to support AT&T, a potent contributor, as a number of criticisms were being raised. These primarily concerned:

    1) competing equipment suppliers who wanted to break into the telecommunications and customer premise equipment (CPE) markets that were dominated by AT&T’s Western Electric;
    2) business users, especially banks, who wanted more sophisticated services, including those offered by SBS, and;
    3) competing carriers who found it difficult to compete against AT&T’s dominance of long distance transmission facilities and “local loop” telephony services. In fact, the government was looking to create a more competitive, privatized data-networking environment free from the dominance of AT&T. [4]

Deregulatory actions during the 1960s and 1970s cut into AT&T's legislatively mandated monopoly on all telecommunications, but it was still by far the major service and equipment provider. The Justice Department had filed an anti-trust suit against Ma Bell in 1974, challenging its competitive practices and seeking to divest Western Electric, its manufacturing arm, in addition to its Bell Operating Companies. Then, in 1980, the FCC issued its Computer II decision, which was specifically focused on the future role of AT&T. It deregulated all data processing services and customer premises equipment, allowing AT&T to enter other markets, although through a separate subsidiary. Despite some restrictions, this officially allowed AT&T to compete outside its traditional purview for the first time.[5]

Despite his ideological convictions, Ronald Reagan was noncommittal about the breakup, mainly because his Secretary of Commerce, Malcolm Baldrige, raised the threat of foreign equipment manufacturers entering the US market, and because his Secretary of Defense, Caspar Weinberger, argued forcefully "that an integrated AT&T was desirable for national security." However, his Justice Department was ideologically motivated enough to carry on with the divestiture, particularly William Baxter, the Assistant Attorney General for Anti-Trust.[6] Baxter argued in 1981 that the radical restructuring of the world's largest company would forward the new administration's promise of reducing regulation and promoting competition. In the face of opposition from the Departments of Commerce and Defense, the DOJ continued with its anti-trust case.

The result was a three-pronged approach. The Department of Justice continued its investigation of AT&T while the FCC sought, through Computer II, to distinguish between regulated basic service (both long-distance and local loop services) and unregulated enhanced services such as data communications. The Republican Congress, led on this issue by Senator Packwood, chair of the Senate Commerce Committee, proceeded with a legislative solution along the basic/enhanced distinctions desired by the FCC, but with awkward accounting and regulatory requirements. Although passed by the Senate and supported by AT&T, the Packwood bill failed to make it through the House and never became law. It was the Justice Department that won out: facing the DOJ's persuasive case, AT&T decided to settle, agreeing to a Consent Decree in January of 1982. The result was a political decision that bypassed the FCC and other traditional forms of telecommunications policy. Ultimately called the Modified Final Judgment (MFJ), it split AT&T into a competitive company consisting of a long-distance carrier and a manufacturing arm, and a set of regulated Bell Operating Companies (BOCs) that would provide local exchange services.

The MFJ created 22 Bell Operating Companies, 7 Regional Bell Operating Companies (RBOCs or “Baby Bells”) and AT&T.

While the MFJ did not provide the total datacom solution wanted by the corporate world, it initiated the opening up of what was previously a significantly closed telecommunications system. The next year, in a coffee shop in Hattiesburg, Mississippi, Bernard J. Ebbers, Bill Fields, David Singleton, and Murray Waldron worked out the details for starting a long-distance company, LDDS (Long Distance Discount Service), the precursor to WorldCom. In 1984, President Reagan approved international competition for the INTELSAT international satellite system after a petition from Orion to provide transatlantic telecommunications services for corporate users. But while significant liberalization was underway in the United States, countries around the world still held on to government-owned and -controlled telecommunications systems. For the new internationalization of financial services and digital commerce to be enacted, telecommunications changes needed to occur worldwide.

As a result of the Department of Justice's 1982 Consent Decree, AT&T divested its Regional Bell Operating Companies (RBOCs), which retained control over the basic telecommunications exchanges and local lines in their respective territories. A consent decree means that all parties agree to the remedy. The RBOCs, or "Baby Bells" as they were sometimes called, were restricted from entering the long-distance interstate communications markets and also from manufacturing telephone equipment. The parent company, AT&T, retained its continental Long Lines Division, its research center Bell Telephone Laboratories, and Western Electric, its manufacturing arm. New subsidiaries were created to allow the company to expand into previously restricted areas. AT&T Information Systems was an unregulated subsidiary offering what the Second Computer Inquiry termed "enhanced" services. These included information services and customer premises equipment, including computers.

AT&T had been involved with the creation of the mobile phone through its Bell Labs division but lost the initiative to Motorola in the 1970s. It later effectively ceded the mobile business to the RBOCs after an AT&T report estimated that the market for mobile phones in the year 2000 would be 900,000 subscribers. In January of 1982, AT&T indicated that cellular would be given to the local companies, the RBOCs, after the FCC divided the 40 MHz it had allocated to cellular into two segments. One segment would go to the local telephone company in each market, and the other would be open to non-telephone companies that wanted it.

As a result of this bifurcated market, the mobile telephone industry emerged largely in this "other" market, thanks largely to the controversial financing methods of Michael Milken and the now-defunct investment firm Drexel Burnham Lambert. Milken helped the McCaw family assemble the wireless industry during the 1980s, raising money for the formation of McCaw Cellular, which it sold to AT&T in 1994.

Notes

[1] Hills, J. (1986) Deregulating Telecoms: Competition and Control in the United States, Japan and Britain. p. 66. Quote on IBM’s threat to AT&T.
[2] Hills, J. (1986) p. 63. Investment figures on SBS.
[3] Hills, J. (1986) p. 66. Raising concerns about FCC to members of Congress.
[4] Dan Schiller. Telematics and Government. p. 92. Stance of US government towards AT&T.
[5] For example, the subsidiary could not use Bell Labs software. Hills, J. (1986) Deregulating Telecoms. p. 66.
[6] Brock, G. (1981) The Telecommunications Industry: The Dynamics of Market Structure. Harvard University Press. p. 156-157.
[7] Schiller, D. (1982), p. 163.


Metrics, Analytics, and Visualization

Posted on | May 21, 2019 | No Comments

Understanding metrics is a significant component in evaluating global media, social media, and culture industries. Measurements of magazine and newspaper circulation, book readership, as well as television and radio audiences, have each had a distinguished history in the media field. The Nielsen ratings, for example, have been a crucial part of the television industry and its ability to attract advertisers. The introduction of digital technologies has made them even more important, and Nielsen has expanded its monitoring activities beyond TV to PCs, mobile devices, and perhaps the automobile dashboard if the “driverless car” takes off.[1]

Search engines, social media platforms, and a myriad of new media applications on the web and mobile devices have increased the production of useful data, and new systems of analysis have augmented their utility. Analytics are directly involved in monitoring ongoing content delivery, registering user choices, and communicating with fans and other users. They also connect incoming flows of information to key organizational concerns regarding financial sustainability, legal risks, and brand management in digital media activities. This type of information is increasingly important for the management of private and public organizations.

Organizations are beginning to acknowledge the importance of social media metrics across the range of corporate and non-profit objectives, especially those involving legal, human resources, and advertising and marketing activities. These new metrics can be roughly categorized into three areas. At a basic level, there are granular metrics that quantify activities like the number of retweets, check-ins, likes, and subscribers. Strategically, metrics can also be useful for designing new products and services, facilitating support and promoting advocacy for an organization, fostering dialogues about products and services, and monitoring marketing/social media campaigns. Lastly, metrics are of particular concern to upper management, as they can provide information on the sustainability of an organization.

Those in the "C-suites" (CIOs, CFOs, COOs, and CEOs) can use information on an organization's financial status, technical performance, and legal risks to assist management decision-making. Metrics present connections from social media investments to key concerns such as product development, service innovation, policy changes, market share, and stock market value. Recognizing this increasing utility, management has come to appreciate digital dashboards as a way to collect and display data in a visually comprehensive way.

The increased attention on metrics suggests that an era of "big data" analytics has emerged in the digital media sphere. The collection of unstructured information from around the web (as opposed to pre-defined, structured data from traditional databases) presents unprecedented opportunities to conceptualize and capture empirical information from networked environments for various parts of an organization. Techniques such as pattern and phrase matching use new types of algorithms to capture and organize information from throughout the Internet, including the rapidly growing "Internet of Things" (IoT). The result is a transformation in the way commerce, politics, and news are organized and managed.

Combined with artificial intelligence (AI) and natural language processing (NLP), for instance, cognitive systems like IBM's Watson are disrupting industries like healthcare and finance. Watson made a spectacular debut by beating two human contestants on the TV game show Jeopardy!, a challenging combination of cultural, historical, and scientific questions. While Watson has struggled to catch on, AI is emerging in autonomous vehicles, voice recognition, game systems, and educational technology. Apple's Siri and Samsung's Bixby are central to modern uses of smartphones. Amazon's Alexa is becoming popular on kitchen counters around the US, shared by all members of the family. AI is likely to be a major influence on a wide range of cultural and experience industries.

Project managers and media producers can use the metrics to see connections between content/cultural products, audience participation, customer satisfaction, and market share. C-suite executives utilize the information on financial status, technical performance, and legal risks. Besides assisting management decision-making, analytics can provide useful performance information and improve the development of new content products, cultural expressions, and experience-based services while targeting them to individual customers. While new developments like AI-assisted genre innovation and other machine incursions into the creative process are a justifiable cause for concern, cognitive-logistical support for cultural event planners, film surveyors, and other creative content producers could be a welcome provision in the cultural/media industries.

This move to “big data” science requires the ability to conceptualize and enact strategies for data collection, analysis, and visualization. Employees and management should develop an appreciation for research and statistical thinking, as well as visual literacies and competencies for representing information and designing graphics that display data results in interesting and aesthetically appealing formats. The ability to identify and capture data from spreadsheets, the web and other online sources is important, as well as the ability to formulate pertinent questions and hypotheses that shape data research and models that can help explain or predict human or system behaviors.

Not everyone will be comfortable with these new types of data collection and intelligence-producing systems, but like them or not, AI and big data are encroaching rapidly on modern life.

Notes

[1] This is from a section in Pennings, A. (2017, October). Emerging Areas of Creative Expertise in the Global Digital Economy. [Electronic version] GDM Quarterly 1 (2), 1-11.


Lasswell and Hall – Power and Meaning in Media

Posted on | March 6, 2019 | No Comments

Harold Dwight Lasswell was one of the founding influences on the study of communication, media studies, and sociology. Stuart Hall was a Rhodes Scholar from Jamaica who helped pioneer an area of research at the University of Birmingham that was later coined British Cultural Studies. Both contributed significantly to media studies during their tenures, and some of their contributions can be discussed by examining the phrase, first coined by Lasswell:

“Who, says what, in which channel, to whom, with what effect?”

Lasswell Model of Communication

Published in his 1948 essay, "The Structure and Function of Communication in Society," the Lasswell model was easy to understand, and it works for a wide variety of communication and media activities.[1] Noted for its emphasis on effects (how a message influences an audience member), it was picked up by psychology-oriented scholars to examine the media's influence on human behaviors, including consumption, violence, and voting. It was coined just after the Second World War, during an era when radio was still dominant and social scientists were interested in the dramatic impacts of propaganda and advertising. Radio had played an important part in mobilizing both Allied and Axis populations during the 1930s.

The Lasswell model was often criticized for stressing linear, one-way flows of information. Feedback and message-disturbing "noise" were not stressed, as they would be in the new areas of study called cybernetics (Wiener, Forrester) and information theory (Shannon and Weaver). Hall offered an additional criticism: the Lasswell model minimized the role of power in the communication process.

Hall reworked Lasswell's formula into "Who has the power, in what channels, to circulate which meanings, to whom (with what effect)?" He wanted us to examine the meanings of images in a way that shows how different interests work to hold a preferred interpretation in place. What organizations (news media, industry boards, advertising, government, etc.) and areas of specialization (journalism, medicine, finance, etc.) have the power to enforce and police such meanings? Drawing on the area of semiotics, Hall emphasized that because the meanings of images are fundamentally flexible and fluid, "power" works to arrest or fix the meanings associated with an image. Brand management, political communication, and public relations are primarily about establishing a set understanding of media images and continually policing their interpretations.

Here is a portion of a lecture by Stuart Hall in which he discusses how culture gives meaning to things, the human tendency to jointly build maps of intelligibility and to create systems of classification, and how signifying practices in media production, such as image composition, narration, and editing, work to make meaning. At the end of the clip, he challenges the Lasswell model. (See the Media Education Foundation (MEF) for the video's transcript.)[2]

One way power operates to secure preferred meanings is through systems of classification. Organizing the world into categories helps maintain order by discerning distinct differences in things and making sure they stay in those boxes. It is important to realize that culture works to create and maintain these categories through processes of differentiation and control. These would include anecdotes, jokes, memes, metaphors, stories, etc.

For Hall, classification is generative. Once a system of classification is created, other things fall into place and serve to maintain the order of the overall structure. In the US, the category of the presidency has recently been challenged by a series of elected candidates who upset traditional notions of who is eligible. George W. Bush was the son of a previous president, raising issues of nepotism. Barack Obama, being black, challenged many Americans' sense of who was racially eligible to be in the "Oval Office." More recently, Donald Trump, who never held elected office or ran a large organization, was considered by many to be unfit and unqualified to hold the office. These category fits are important to people and a major source of social strife in modern society.

Hall's primary interest was cultural studies. He helped shift the study of culture from high culture to popular culture. "Culture, he argued, does not consist of what the educated élites happen to fancy, such as classical music or the fine arts." The value of such studies is that they can tell us about parts of society that have been marginalized. They can tell us how race, gender, and economic classes are rendered in modern society.[3]

His work on Race: The Floating Signifier is a classic on the fluidity of meanings associated with representations of race. Here he expands his work on classification, drawing on the area of semiotics. Semiotics is based on the study of signs, divided into the signifier and the signified: the form and its meaning. He used this approach to examine how culture influences the way we see race as part of a system of classification that is used to order society and the types of people within it.

For Hall, race has physical characteristics, primarily hair and skin color. But a range of meanings and values are associated with races. He uses a concept from Mary Douglas, "matter out of place," to describe the implications of those classifications. In her Purity and Danger: An Analysis of Concepts of Pollution and Taboo (1966), she argued that we constantly construct symbolic orders that evaluate and rank items and events. Hall recounts her example of dirt in the bedroom and dirt in the garden: one is "dirty" and problematic, needing to be addressed at once; the other is natural and invisible. A related example is the "back of the bus" that was allocated to black people during the height of segregation in the southern states of America.

For Hall, this is part of the Enlightenment's project to bring all experience into observation and understanding. Hall's panoptic view reiterates Michel Foucault's admonition in Power/Knowledge (1980) that it is not so much that information is power but, rather, that power shapes information.[4][5]

Stuart Hall died on February 10, 2014. His legacy is that of a major contributor to the creation of cultural studies and, specifically, the Birmingham School of Cultural Studies. He engaged the study of signs (semiotics) and warned that power works to secure preferred meanings, keeping the understanding of images from sliding into other interpretations that might empower other groups or individuals who would benefit from another set of meanings.[6]

Notes

[1] Lasswell, H. D. (1948). “The Structure and Function of Communication in Society.” In L. Bryson (Ed.), The Communication of Ideas. (pp. 37-51). New York: Harper and Row.
[2] The transcript and the video Representation & the Media Produced & Directed by Dr. Sut Jhally. Edited by: Sanjay Talreja, Sut Jhally & Mary Patierno. Featuring a lecture by Stuart Hall Professor, The Open University and introduced by Sut Jhally University of Massachusetts at Amherst. Distributed by Media Education Foundation (MEF) that produces and “distributes documentary films and other educational resources to inspire critical thinking about the social, political, and cultural impact of American mass media.”
[3] Hsu, Hua. “Stuart Hall and the Rise of Cultural Studies.” The New Yorker, The New Yorker, 17 July 2017, www.newyorker.com/books/page-turner/stuart-hall-and-the-rise-of-cultural-studies.
[4] Foucault, M. (1980) Power/Knowledge: Selected Interviews and Other Writings, 1972-1977. Trans. Colin Gordon et al. New York: Pantheon.
[5] Mason, Moya K. Foucault and His Panopticon. 2019. Accessed March 6, 2019.
[6] Chandler, D. (2017) Semiotics for Beginners. Routledge.




Anthony J. Pennings, PhD, is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY Korea, he taught at New York University (NYU) for 15 years, and at Marist College in New York before his move to Manhattan. His first academic position was at Victoria University in New Zealand. He has also spent ten years at the East-West Center in Honolulu, Hawaii. Originally from New York, he now makes his US home in Austin, Texas, where he also taught in the Digital MBA program at St. Edward's University.

Spreadsheets and the Rhetoric of Ratios

Posted on | March 2, 2019 | No Comments

In this post, I examine the figuring of ratios as a conceptual technique for constructing systems of understanding in the modern political economy. The ratio is an important mathematical device for reality construction in a wide range of activities, but its role in financial and management environments is especially notable. These ratios result from dividing one account balance or financial measurement by another. Examples are debt-to-equity, return on assets (ROA), and return on investment (ROI).

This post continues my series on meaning-making practices and the power of digital spreadsheets. I previously wrote about the historical components of spreadsheets – the lists, rows, numbers, words, and tables that combine to give users of spreadsheets their visual and analytical proficiencies. Accounting, in particular, used the lists and tables of spreadsheets to create “time-space” power that propels organizations across months and years and over long distances.

More recently, I’ve been examining the various formulas that give the spreadsheet additional calculative and analytical capabilities. Excel offers almost 500 different functions, ranging from simple aggregates like AVERAGE, SUM, and MIN/MAX to more complex formulas such as CHOOSE, CONCAT/CONCATENATE, HLOOKUP, INDEX MATCH, PMT and IPMT, and XNPV and XIRR.
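To give a sense of what the more complex financial functions calculate, here is a minimal Python sketch of the logic behind Excel’s PMT: the fixed periodic payment on a loan, given a periodic rate, a number of periods, and a present value. The loan figures below are illustrative assumptions, not from the post.

```python
def pmt(rate, nper, pv):
    """Periodic payment for a loan, following Excel PMT's sign convention
    (payments are negative because they are cash outflows)."""
    if rate == 0:
        return -pv / nper
    return -pv * rate / (1 - (1 + rate) ** -nper)

# Example: a $200,000 mortgage at 6% annual interest, paid monthly for 30 years.
payment = pmt(0.06 / 12, 30 * 12, 200_000)
print(round(payment, 2))  # ≈ -1199.10, matching Excel's =PMT(0.005, 360, 200000)
```

The same annuity formula underlies IPMT as well, which isolates the interest portion of a given period’s payment.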

Below, I explore the communicative usage of ratios to construct an understanding of relationships, in this case, a corporation’s productivity.

Productivity ratios provide one evaluative structure for indicating the efficiency of a company. A productivity ratio is a fraction of output over input, where output is the quantity of goods or services produced by an industry, company, person, or machine, and input is the amount of resources consumed in producing it.

Ratios are used mainly to establish a quantitative relation between two numbers, showing how many times the value of one amount is contained within the other – such as total revenue divided by the number of employees. For example, in 2015, Apple had revenue of $182,800,000,000 and just under 98,000 employees. This meant that Apple made $1,865,306 per employee.

$182,800,000,000 / 98,000 = $1,865,306

Google was next with $1,154,896 of revenue per employee. Japan’s Softbank made $918,449 per employee for third place, while Microsoft made $732,224 per employee. Measuring revenue per employee (R/E) gives a sense of a company’s productivity and possibly how efficiently it runs. It also provides a metric for comparing it with other companies.
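The R/E calculation above can be sketched in a few lines of Python, using the Apple figures quoted in the post (the other companies’ raw revenue and headcount are not given here, so only Apple is computed):

```python
def revenue_per_employee(revenue, employees):
    """R/E ratio: total revenue divided by number of employees."""
    return revenue / employees

# Apple's figures as cited above: $182.8B revenue, just under 98,000 employees.
apple_re = revenue_per_employee(182_800_000_000, 98_000)
print(f"Apple: ${apple_re:,.0f} per employee")  # Apple: $1,865,306 per employee
```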

Displaying ratios in Excel typically involves the GCD function (greatest common divisor), or alternatively a combination of the TEXT and SUBSTITUTE functions.
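The GCD approach reduces both amounts by their greatest common divisor and joins them as “a:b” – the same reduction an Excel formula along the lines of =A1/GCD(A1,B1)&":"&B1/GCD(A1,B1) performs. A minimal Python sketch of that logic:

```python
from math import gcd

def as_ratio(a: int, b: int) -> str:
    """Reduce two whole amounts to lowest terms and format as 'a:b'."""
    d = gcd(a, b)
    return f"{a // d}:{b // d}"

print(as_ratio(40, 10))  # 4:1
print(as_ratio(35, 14))  # 5:2
```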

Spreadsheets vary in complexity and purpose, but they primarily organize and categorize data into logical formats by using rows and columns that intersect in active cells. They can store information, perform calculations, and reorganize data based on models. They display information in tabular form to show relationships and can help produce elaborate visualizations from the data. Consequently, they make it easier to leverage organizational data to make relationships apparent and answer what-if questions. With the use of ratios, they can also identify high- and low-performing assets, track employee performance, and evaluate profitability.

A ratio denotes a relationship, usually between two numbers, but in any case, between amounts. It indicates how many times the first amount is “contained” in the second. Ratios can be a valuable technique for comparison and for measuring progress against set goals, such as market share versus a key competitor. They can also be used for tracking trends over time or identifying potential problems such as a looming bankruptcy.

Ratios are a technique that “fixes” or freezes a relationship in order to construct a moment of reality. While they attempt to apprehend truth, they are instrumental in solidifying, at least temporarily, the appearance of concrete realities. Ratios have analytic capacity, such as how much productivity comes from the average individual worker.


  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from http://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor and Associate Chair at State University of New York (SUNY) Korea. Recently taught at Hannam University in Daejeon, South Korea. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, media economics, and strategic communications.

    You can reach me at:

    anthony.pennings@gmail.com
    anthony.pennings@sunykorea.ac.kr

    Follow apennings on Twitter

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.