Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Spreadsheets and the Rhetoric of Ratios

Posted on | March 2, 2019 | No Comments

In this post, I examine the figuring of ratios as a conceptual technique for constructing systems of understanding in the modern political economy. The ratio is an important mathematical device for reality construction in a wide range of activities, but its role in financial and management environments is especially notable. These types of ratios result from dividing one account balance or financial measurement by another. Examples include debt-to-equity, return on assets (ROA), and return on investment (ROI).

This post continues my series on meaning-making practices and the power of digital spreadsheets. I previously wrote about the historical components of spreadsheets – the lists, rows, numbers, words, and tables that combine to give spreadsheet users their visual and analytical proficiencies. Accounting, in particular, used the lists and tables of spreadsheets to create “time-space” power that propels organizations across months and years and over long distances.

More recently, I’ve been examining the various formulas that provide additional calculative and analytical capabilities for the spreadsheet. Excel offers almost 500 functions, from simple aggregations like AVERAGE, SUM, and MIN/MAX to more complex formulas such as CHOOSE, CONCAT/CONCATENATE, HLOOKUP, INDEX MATCH, PMT and IPMT, and XNPV and XIRR.
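
To give a flavor of how these functions work, consider PMT, which returns the periodic payment on a loan. The figures below are hypothetical, chosen only for illustration:

    =PMT(5%/12, 360, -300000)

This asks for the monthly payment on a $300,000 loan amortized over 360 months (30 years) at a 5 percent annual rate and returns roughly $1,610. Changing any one of the three arguments immediately recalculates the payment, a small instance of the what-if capability discussed below.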

Below, I explore the communicative usage of ratios to construct an understanding of relationships, in this case, a corporation’s productivity.

Productivity ratios provide one evaluative structure for indicating the efficiency of a company. A productivity ratio is a fraction of output over input, where output is the quantity of goods or services produced by an industry, company, person, or machine, and input is the amount of resources consumed to produce it.

Ratios are used mainly to establish a quantitative relation between two numbers, showing how many times the value of one amount is contained within the other, such as total revenue divided by the number of employees. For example, in 2015, Apple reported revenue of $182,800,000,000 with just under 98,000 employees. That works out to $1,865,306 in revenue per employee.

$182,800,000,000 / 98,000 = $1,865,306

Google was next with $1,154,896 of revenue per employee. Japan’s SoftBank made $918,449 per employee for third place, while Microsoft made $732,224 per employee. Measuring revenue per employee (R/E) provides an understanding of the productivity of a company and possibly how efficiently it runs. It also provides a metric for comparison with other companies.
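
In a spreadsheet, this comparison is a single division formula copied down a column. A minimal sketch, assuming revenue sits in column B and employee counts in column C (the cell layout is illustrative):

    =B2/C2

For the Apple figures above, =182800000000/98000 returns about 1,865,306; entering each company’s revenue and headcount in its own row makes the ranking of revenue per employee immediately visible.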

Displaying a ratio in Excel in the familiar x:y format typically requires either the GCD (Greatest Common Divisor) function or a combination of the TEXT and SUBSTITUTE functions.
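
Both approaches can be sketched in a couple of formulas, assuming the two amounts being compared sit in cells A2 and B2 (the cell references and number format are illustrative, and this is one common recipe rather than the only one):

    =A2/GCD(A2,B2) & ":" & B2/GCD(A2,B2)
    =SUBSTITUTE(TEXT(A2/B2,"#####/#####"),"/",":")

The first divides both amounts by their greatest common divisor and joins them with a colon; the second formats the quotient as a fraction with the TEXT function and then uses SUBSTITUTE to swap the slash for a colon.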

Spreadsheets vary in complexity and purpose, but they primarily organize and categorize data into logical formats by using rows and columns that intersect in active cells. They store information, perform calculations, and reorganize data based on models. They display information in tabular form to show relationships and can help make elaborate visualizations with the data. Consequently, they make it easier to leverage organizational data to make relationships apparent and answer what-if questions. With the use of ratios, they can also identify high- and low-performing assets, track employee performance, and evaluate profitability.

A ratio denotes a relationship, usually between two numbers, but in any case between amounts. It indicates how many times the first amount is “contained” in the second. Ratios can be a valuable technique for comparison and for measuring progress against set goals, such as market share versus a key competitor. They can also be used for tracking trends over time or identifying potential problems such as a looming bankruptcy.

Ratios are a technique that “fixes” or freezes a relationship in order to construct a moment of reality. While they attempt to apprehend truth, they are instrumental in solidifying, at least temporarily, the appearance of concrete realities. Ratios have analytic capacity, indicating, for example, how much productivity comes from the average individual worker.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY Korea, he taught at New York University (NYU) for 15 years and at Marist College in New York before his move to Manhattan. His first academic position was at Victoria University in New Zealand. He has also spent ten years at the East-West Center in Honolulu, Hawaii. Originally from New York, his US home is now in Austin, Texas where he also taught in the Digital MBA program at St. Edward’s University.

From Gold to G-20: Flexible Currency Rates and Global Power

Posted on | November 11, 2018 | No Comments

When British author Ian Fleming published the James Bond spy novel Goldfinger in 1959, the world was locking into a new framework for managing global trade and foreign exchange transactions. The New Deal’s Bretton Woods agreements tied participating currencies to a fixed exchange rate with the US dollar, which was itself tied to gold at $35 an ounce. The Goldfinger movie (1964) centered on the stock of gold nestled away at the Fort Knox Bullion Depository in Kentucky. The plot (spoiler alert) suggested Auric Goldfinger was conspiring with the mafia to rob the famous depository, but his actual, more sinister plan was to work with the Chinese to nuke Fort Knox so that the rest of the world’s gold supplies (which he had been buying up) would increase dramatically in value. Below is a clip of the conclusion of the movie.

This post discusses how the major industrialized nation-states organized to manage the transition from fixed exchange rates to the global, floating, digital trading environments that emerged in the 1970s and shaped the modern world.

President Franklin Delano Roosevelt (FDR) created the Fort Knox Bullion Depository during the New Deal after outlawing the use of gold for private wealth holdings and as common currency. The Public Works Administration began construction of the giant vault in 1935 as fears of spreading fascism in Europe convinced him to move US-held gold from mints and treasuries along the Eastern seaboard and store it far inland, next to Fort Knox, a US military training facility that had pioneered the use of tank tactics during World War I.

As the Nazis began to conquer neighboring countries to steal their gold, much of the durable metal escaped and flowed into Fort Knox. Britain bought US military equipment and supplies, and other countries sought refuge for their metallic wealth. By the end of the war, most of the world’s gold was tucked away at Fort Knox and formed the basis for a new international currency arrangement.

As the war was coming to an end, the Allies met in Bretton Woods, New Hampshire, to iron out a new international framework for currency control and trade. The resulting 1944 agreement created the International Monetary Fund (IMF), the World Bank, and the International Trade Organization (ITO), a precursor to the World Trade Organization (WTO). The ITO was defeated by the US Congress at the time, but the main result of the conference was an agreement to stabilize world currencies by tying the US dollar to gold at $35 an ounce and requiring trading partners to keep their currencies within a narrow band (about 1 percent) of their declared parity with the dollar. This arrangement was successful but started to fall apart as dollars spread around the world and circulated back to the US as claims on the gold at Fort Knox.

When Richard Nixon shocked the world by ending the Bretton Woods gold-dollar standard in 1971, he initiated what would become a new global economy based on flexible currency exchange rates with the US dollar as the principal currency. The USD became “fiat” money, no longer tied to the promise of gold convertibility. His New Economic Policy (NEP) was partially responsible for the transition to digital currency trading as commercial financial institutions intensified their interest in the F/X markets. As the international financial system became more electronically based and transactions escalated in scale and scope, a new regime of power emerged. However, the process was a long and often painful one.

The announcement that Nixon had merely “closed the gold window,” made to a national television audience on August 15th of that year, was a rhetorical understatement meant to lessen the startlingly dramatic changes he was imposing on the world economy. Nixon was stating that the US would no longer intervene in the foreign exchange markets to manage the price of the dollar or continue to deplete US gold reserves. To give himself a bargaining chip in future international negotiations, he also added a 10 percent surcharge on imports in his NEP. The move destabilized the world of fixed currency rates and ended Bretton Woods control over international finance. It concluded what had in effect been a gold standard.[1]

The Nixon moves were strongly opposed by Japan and Western Europe, which were fast becoming major economic competitors due to automobiles and other cheap and attractive imports. The “Nixon Shokku” drove the value of the US dollar down and made imports into the US more expensive, increasing price inflation. He responded with charges that his economic counterparts were trading unfairly with mercantile tactics and freeloading off the US by failing to contribute to the NATO defense effort. The US dollar lost nearly a third of its value during the first six years after its link to gold was severed.

Nixon was also working with the OPEC countries to raise oil prices and cripple his economic foes, who were much more reliant on external petroleum production than the US. OPEC was essentially a cartel designed to manage the price of oil, which was priced in US dollars. The floating exchange rates were leading to a dramatic fall in the value of the US dollar, and nations had to buy dollars first to purchase OPEC crude. Consequently, OPEC countries started to increase the price of oil.

Nixon had strong allies in the banking community who were intrigued by the profit possibilities in arbitrage, consulting, and hedging. The policy signaled a significant shift away from government control to a new, freewheeling environment in which US banks could break out of their highly regulated and geographically limited confines. Walter Wriston, who would later coin the term “information standard,” argued that in this new environment, his Citicorp enterprise “ought to have a price-to-earnings (P/E) ratio of 15 to 20, typical of growth stocks, rather than a ratio of 5 to 6, which was typical of public utility companies.” In the next few years, Citicorp stock reached a P/E ratio of 25. The stock moves invigorated the banking industry as banks began to move into international activities such as Eurodollar lending.[2]

To buy time to sort out the implications of the broken Bretton Woods system, the US initiated a conference via the IMF on world monetary reform. The IMF was organized around its Articles of Agreement, consented to by its participating countries, although the US held a privileged place due to its overall power and the dollar’s strength. The IMF set up the Committee of 20 (C-20) in 1972 to prepare proposals for a new financial regime in the wake of Nixon’s closing of the gold window. The committee was made up of representatives of the twenty countries most involved in non-communist international trade and finance.

The C-20 met between 1972 and 1974 to develop a new international system that would subordinate currencies to Special Drawing Rights (SDRs), a global reserve asset created in 1969. Although SDR allocations provided some liquidity and supplemented member countries’ official reserves, they proved inadequate to stabilize the monetary system. The C-20 broke down in 1974 over concerns about countries being forced to inflate their currencies to buy up excessive US dollars.

Concerns were raised by the Group of Ten (G-10), which had created the infamous Interim Committee responsible for the ill-fated Smithsonian Agreement of December 1971. Those accords had established a new dollar-gold standard, in which the currencies of many industrialized nations were again pegged to the US dollar, but with gold revalued at an inflated $38 an ounce. A crisis ensued, and the US dollar continued to drop in value.[3]

The committee reconvened on March 25, 1973, at the White House to discuss the international monetary crisis at the invitation of US Secretary of the Treasury George Shultz and his undersecretary Paul Volcker (later appointed Federal Reserve Chairman). The invitation list included finance ministers from England, France, and Germany. Although they resolved to address the issue of destabilized currency rates, late that year, overwhelmed by the first Middle East “Oil Crisis,” the Group of Ten decided by default that exchange rates would float.[4]

The Oil Crisis, sparked by OPEC’s embargo of petroleum shipments to the US for its support of Israel, increased volatility in the foreign exchange markets. A beneficiary was a startup project by Reuters called Money Monitor Rates. A well-established news agency, Reuters created a two-sided electronic market for currencies: it charged selling banks to list currency prices on computer monitors, while also charging buying banks to access those prices. The crisis helped the Reuters endeavor become profitable as currency traders valued the electronic information.

Shultz and Volcker next agreed to invite the Japanese, and the news media labeled the group the “G-5” and began to refer to the “Group of Five” meetings. The next year, two of the finance ministers became heads of government. When Valery Giscard d’Estaing of France and Helmut Schmidt of Germany became leaders of their respective countries, they attempted to elevate the meetings to include not only finance ministers but heads of government as well.

The new group met for the first time on November 15-17, 1975, at the Chateau de Rambouillet, France.[5] At that meeting, the heads of state and finance ministers of France, Germany, Japan, the United States, and Great Britain (the G-5) prepared, some reluctantly, the Declaration of Rambouillet. They forsook a system of stable monetary rates for a “stable system of exchange rates.”[6] Instead of governments agreeing to peg currencies according to political agreements, they would manage the macro-environment in which foreign exchange rates would be set by market forces.

The Second Amendment to the IMF’s Articles of Agreement officially eliminated the unique role of gold and legitimized floating exchange rates in 1978. The governments of the major currencies decided to let the “markets” decide the proper exchange rates for their currencies. In the future, it would be infeasible for sovereign governments, except perhaps the US, to dictate the value of their money.

The next year Canada attended, and after the fall of the USSR, Russia began to participate. In the wake of the demise of Bretton Woods, the G-# summits played an important role in providing overall stability for the foreign exchange markets and the world economy as a whole. Although America clearly led the group, and volatility was rampant, the group produced global coordination and symbolic power.[7] One of its most important roles has been to solidify the notion of national sovereignty. In the age of transnational money markets, the individual nation-state needed to be reified, propped up in part to maintain the system of different currencies. Notable examples were the debt crises of the 1980s and the Asian currency crisis in 1997.

As Susan Strange observed:[8]

    The Group of Ten developed countries, by contributing to rescue funds for troubled Asian economies, reiterated more forcefully than ever the conviction that in the modern international financial system, bankruptcy was not an option, at least, not bankruptcy in the sense that a failed business closes down, gets sold off to the best bidder or taken over lock, stock, and barrel. The appearance of an immortal, sovereign state was to be preserved—not for its own sake, but for the greater security of the world system.

The financial disruptions of the 1990s, particularly in Asia and Russia, led to the formation of the G-20 in 1999 at the G-7 Cologne Summit. The members saw the need to include “systemically important countries” in the discussions about the global economy. Initially, it was the finance ministers who met every year, until George W. Bush invited the G-20 leaders to Washington DC in 2008 to discuss the unfolding economic crisis.

After the inaugural leaders’ summit in 2008, the G-20 announced in September 2009 that it would replace the G-8 as the main economic council of wealthy nations. The Obama administration pushed for the change in the midst of the Great Recession in recognition of the influence of many other countries on the global economy and the need to coordinate policy among the major countries.

The G-# summits have been held annually since that time to address a number of macroeconomic issues affecting the global political economy as well as crime, drugs, the environment, and human rights. During the 1990s, they synchronized the “global information infrastructure” and its transition to the global Internet.

Meanwhile, the financial system’s “information standard” has solidified its power as the replacement for gold with the extraordinary increase in digital money and flows of currency. In conjunction with the G-#’s political power, a modicum of systemic stability and coordination keeps the world turning, as James Bond would have it.

Citation APA (7th Edition)

Pennings, A.J. (2018, Nov 11). From Gold to G-20: Flexible Currency Rates and Global Power. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/from-gold-to-g-5-flexible-currency-rates-and-global-power/


Notes

[1] Greider, W. (1987) Secrets of the Temple. How the Federal Reserve Runs the Country. New York: Simon and Schuster. pp. 337-338. A very extensive treatise on the economic conditions of the 1970s.
[2] Smith, R. (1989) The Global Bankers. New York: Truman Talley Books/ Plume. p. 34.
[3] The G-10 consisted of governments of eight International Monetary Fund (IMF) members—Belgium, Canada, France, Italy, Japan, the Netherlands, the United Kingdom, and the United States—and the central banks of Germany and Sweden. This group was, by default, in charge of maintaining the economic growth and stability of international currencies. Although its powers were, in effect, limited, it still presented an important image of national sovereignty.
[4] Gowan, P. (1999) The Global Gamble: Washington’s Faustian Bid for World Dominance. London: Verso. pp. 20-21.
[5] G-7/G-8 Summits: History and Purpose. Fact sheet released by the Bureau of European and Canadian Affairs, U.S. Department of State, April 30, 1998 accessed on May 01, 2001 from: http://www.state.gov/www/issues/economic/summit/fs_980430_g8_sumhistory.html.
[6] Eichengreen, B. (1996) Globalizing Capital: A History of the International Monetary System. Princeton, NJ: Princeton University Press. I am indebted to his fifth chapter, “From Floating to Monetary Unification,” which contains a good overview of this process. Pages 136-141 were particularly useful.
[7] Peter Gowan (1999) makes a strong case for the dominance of the US in the G-7 oligarchy in his book The Global Gamble: Washington’s Faustian Bid for World Dominance. London: Verso.
[8] Strange, S. (1997) Mad Money: When Markets Outgrow Governments. Ann Arbor: University of Michigan Press. p. 9.



© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Apple’s GUI and the Creation of Microsoft’s Excel Spreadsheet Application

Posted on | October 19, 2018 | No Comments

Microsoft’s famous spreadsheet application, Excel, was originally designed for Apple’s Macintosh personal computer. This post explores the early years of the personal computer and its transition to the more modern interface pioneered by Apple’s Macintosh. This transition opened the way for new software innovations, particularly Microsoft’s development of the Excel spreadsheet application. Excel has since become the mainstay spreadsheet application in organizations around the world.

The Apple Macintosh, or “Mac,” was based on the Graphical User Interface (GUI, pronounced “gooey”) that was primarily developed at Xerox PARC in Palo Alto, CA. It was sometimes called WIMP for its Windows, Icons, Mouse, and Pull-down menus technology. The Apple II and IBM PC were still based on a command-line interface, a “black hole” next to a > prompt where commands had to be typed and executed. For example, to find a word or other string of characters in a file, you would type find "any string" FileName at the C:\> prompt.

This command-line system required extensive prior knowledge and/or access to readily available technical documentation. The user needed to know the commands or copy them from a manual. The GUI, on the other hand, allowed users to point to information already on the screen or to categories that contained subsets of commands. Eventually, menu categories such as File, Edit, View, Tools, and Help were standardized at the top of GUI screens.

A crucial issue for the Mac was good third-party software that could work in its GUI environment, especially a spreadsheet. Representatives from Jobs’ Macintosh team visited the fledgling companies that had previously supplied microcomputer software. Good software came from companies like Telos Software, which produced the picture-oriented FileVision database, and Living Videotext, whose ThinkTank used “dynamic outlines” to capture levels of thought and promote creative thinking. By April 1985, ThinkTank had sold 30,000 copies, reaching about 10% of all Mac owners. However, that number was still small compared to the potential of the business world.

For the Mac to be useful for a business, Apple needed a new VisiCalc. Jobs’ longstanding relationship with VisiCorp was strained because the VisiCalc distributor was trying to develop its own “PARC-like system” for IBM PCs called Visi-On.[1] Lotus was the up-and-coming software producer and signed on with Apple to produce an ambitious spreadsheet application called “Jazz,” but the software soon ran into trouble. Steven Levy wrote:

    Apple was desperate for the Macintosh equivalent to VisiCalc, something so valuable that people would buy the computer solely to run it. It had high expectations for Lotus’s product—after all, Lotus 1-2-3 had been the IBM PC’s VisiCalc—but Lotus’s Jazz turned out to be a dud. Mitch Kapor’s charges had clumsily missed the point of Macintosh. In the Lotus view, Mac was a computer for beginners, for electronic dilettantes who still clung to a terror of technology. Jazz was the equivalent of a grade school primer, an ensemble of crippled little applications that worked well together but were minimally useful. No one bought it.[2]

Other companies had partial success in creating a spreadsheet for the Mac, but part of the problem was that the Mac’s screen was small and not conducive to spreadsheet work. Microsoft’s Multiplan for the Mac was an early offering. Ashton-Tate produced a spreadsheet for the Mac called Full Impact that contained much of the software used in an Apple in-house spreadsheet called Mystery House. Unlike its Apple II predecessor, the Macintosh was failing to make significant inroads into a business world that was enamored with the PC and Lotus 1-2-3.

Apple’s innovative Macintosh technology did not go unnoticed by Microsoft, which was shown the embryonic Mac near the end of 1981. Apple authorized Microsoft to develop software languages and applications for the Mac GUI-based system. Gates and company would go public in March 1986, making the co-founder’s 45 percent stake in the company worth some $311 million. The young company was secretly working on its own GUI while continuing to develop software for the Mac. Meanwhile, Microsoft pressured Apple to give it a license for the GUI interface and threatened to stop work on the software it was producing for Apple.[4] In October 1985, Apple CEO John Sculley gave in and offered Microsoft a license for the Mac GUI. But much to the ire of Apple, Gates’ company had developed its own GUI that it layered on top of its DOS operating system. In November 1985, Microsoft began to ship Windows 1.0, a GUI shell on top of DOS with an awkward but much friendlier face.

The Macintosh’s initial “killer app” turned out to be desktop publishing, which helped develop its WYSIWYG (what you see is what you get) technology. In mid-July 1985, a company called Aldus released the final version of PageMaker. For the producers of corporate newsletters and other small publishers, the Macintosh began to show great promise. For the first time, documents could be manipulated and shown on the screen exactly as they would be published. But the WYSIWYG appearance on the screen was only minimally effective without the capacity to print what was shown. The printing solution came from another computer scientist from Xerox PARC. John Warnock had created a technology called PostScript that allowed a laser printer to print exactly what was on the screen. In 1982 he co-founded a company called Adobe that caught the attention of Apple.[3] Jobs canceled work on other projects and bought nearly 20% of Adobe while carefully integrating its technology into the design of a new laser printer.

Combined with applications like PageMaker and MacPaint, the new software-print combination inspired thousands of graphic artists and painters to try the Macintosh. While Apple lost market share in the financial sphere, it became the darling of graphic designers and publishers. One casualty, however, was Steve Jobs, who was ousted by CEO John Sculley and the Apple board in May 1985 as Apple headed toward its first-ever quarterly loss. In September, Jobs sold his shares of Apple.

The new GUI-enhanced machines presented a major challenge for the software industry. Software Arts, the company that created the famous VisiCalc spreadsheet, was sued by its distributor VisiCorp (formerly Personal Software) in September 1983 for failing to keep the software current. A counter-suit was filed, and the legal hassles distracted from software development. Consequently, the spreadsheet application never made an adequate jump to a GUI environment. Lotus Development bought out Software Arts after a chance meeting between Bricklin and Kapor on an airline flight. Soon after, Lotus discontinued VisiCalc. But Lotus also took a nosedive as its failure with Jazz proved a fateful mistake, providing a crucial opening for Microsoft. Gates and company released the first version of Excel for the “fat Mac” in 1985; the later Windows version would start its numbering at 2.0 to keep pace with the Mac releases. Excel soon became a popular spreadsheet for the Macintosh, especially after the release of Excel 2.2 for the Macintosh in 1989, which was nearly twice as fast as the original.

Microsoft went on to develop several versions of Windows, its new GUI operating system, and as these improved, the Excel spreadsheet became more popular. Windows 2.0 was released in 1987, and Microsoft offered versions of Excel for Windows and for DOS that year. But both applications were awkward and did not become very popular. Apple sued Microsoft in 1988 after the release of Windows 2.01, claiming its interface design had been copied, but to no avail.

Lotus 1-2-3 had been the top-selling software product of 1989, but the GUI soon gained popular acceptance within the DOS world with the introduction of Windows 3.0 in 1990. The DOS-based Quattro Pro by Borland had been on the rise against the dominance of Lotus 1-2-3, but neither could resist the power of the user-friendly Excel 3.0 and the even better Excel 4.0, released in 1992. Meanwhile, Excel remained the only spreadsheet available for Windows until 1992, when Lotus countered with its Lotus 1-2-3 for Windows. But Excel 5.0 signaled Microsoft’s rise to dominance, partially because of the inclusion of the Visual Basic programming system. Windows soon monopolized the PC desktop, and Excel became its flagship business tool.[5]

Notes

[1] Visi-On information from Steven Levy’s Insanely Great. The Life and Times of Macintosh, The Computer that Changed Everything. NY: Penguin Books. p. 160.
[2] Quote on the Jazz failure from Steven Levy’s (1995) Insanely Great. The Life and Times of Macintosh, The Computer that Changed Everything. NY: Penguin Books. p. 219-20.
[3] Information on John Warnock and Postscript from (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 38. Also see Levy (1995) p. 212-213.
[4] Apple vs Microsoft in Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. p. 361.
[5] Information on spreadsheet competition from Rob Clarke’s “A Formula for Success,” PC WORLD, August 1993, pp. 15-16.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Potential Bill on Net Neutrality and Deep Pocket Inspection

Posted on | October 17, 2018 | No Comments

I just got this discussion draft from one of my former colleagues at New York University, Thomas Dargan, a former Verizon executive.[1] The draft is by Eliot Lance Engel (D-NY), the U.S. Representative for New York’s 16th congressional district, which contains parts of the Bronx and Westchester County.[2] US telecommunications policy is based on the Communications Act of 1934, which created the Federal Communications Commission and established the importance of common carriage, a concept included in current understandings of net neutrality.

115TH CONGRESS 2D SESSION

[DISCUSSION DRAFT]

H. R. __

To amend the Communications Act of 1934 to prohibit broadband internet access service providers from engaging in deep packet inspection.

IN THE HOUSE OF REPRESENTATIVES
—— introduced the following bill; which was referred to the Committee on ______________

A BILL

To amend the Communications Act of 1934 to prohibit broadband internet access service providers from engaging in deep packet inspection.

Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. SHORT TITLE.
This Act may be cited as the “Deep Packet Privacy Protection Act of 2018”.

SEC. 2. PROHIBITION ON DEEP PACKET INSPECTION.

(a) IN GENERAL.—Title VII of the Communications Act of 1934 (47 U.S.C. 601 et seq.) is amended by adding at the end the following:

“SEC. 722. PROHIBITION ON DEEP PACKET INSPECTION.

“(a) IN GENERAL.—A broadband internet access service provider may not engage in deep packet inspection, except in conducting a reasonable network management practice.

“(b) RULE OF CONSTRUCTION.—Nothing in this section shall be construed to prohibit a broadband internet access service provider from engaging in deep packet inspection as required by law, including for purposes of criminal law enforcement, cybersecurity, or fraud prevention.

“(c) DEFINITIONS.—In this section:
“(1) BROADBAND INTERNET ACCESS SERVICE.—

“(A) IN GENERAL.—The term ‘broadband internet access service’ means a mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all internet endpoints, including any capabilities that are incidental to and enable the operation of the communications service, but excluding dial-up internet access service.

“(B) FUNCTIONAL EQUIVALENT; EVASION.—The term ‘broadband internet access service’ also includes any service that—

“(i) the Commission finds to be providing a functional equivalent of the service described in subparagraph (A); or

“(ii) is used to evade the prohibitions set forth in this section.

“(2) DEEP PACKET INSPECTION.—The term ‘deep packet inspection’ means the practice by which a broadband internet access service provider reads, records, or tabulates information or filters traffic based on the inspection of the content of packets as they are transmitted across their network in the provision of broadband internet access service.

“(3) NETWORK MANAGEMENT PRACTICE.—The term ‘network management practice’ means a practice that has a primarily technical network management justification, but does not include other business practices.

“(4) REASONABLE NETWORK MANAGEMENT PRACTICE.—The term ‘reasonable network management practice’ means a network management practice that is primarily used for and tailored to achieving a legitimate network management purpose, taking into account the particular network architecture and technology of the broadband internet access service, including—

“(A) delivering packets to their intended destination;

“(B) detecting or preventing transmission of malicious software, including viruses and malware; and

“(C) complying with data protection laws and laws designed to prohibit unsolicited commercial electronic messages, including the CAN-SPAM Act of 2003 (15 U.S.C. 7701 et seq.) and section 1037 of title 18, United States Code.”.

(b) DEADLINE FOR RULEMAKING.—Not later than 180 days after the date of the enactment of this Act, the Federal Communications Commission shall issue a rule to implement the amendment made by subsection (a).

(c) EFFECTIVE DATE.—The amendment made by this section shall apply beginning on the date that is 270 days after the date of the enactment of this Act.

Notes

[1] Tom Dargan can be reached at US 914-582-8995
[2] Eliot Lance Engel (D-NY) official website.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

TIME Magazine’s “Machine of the Year”

Posted on | October 10, 2018 | No Comments

The Apple II was quite a success when it was introduced in 1977, with sales of US$770,000 in its first year. Its growth over the next few years, however, was tremendous. Revenues hit $7.9 million in its second year of operation and $49 million in 1979. Its founders, Steve Jobs and Steve Wozniak, were soon multimillionaires while still in their twenties. Apple’s sales were initially concentrated in the hobbyist market, with recreational software such as games dominating. Another important market was education, with simulation games for teaching mathematics, music, and science. Most of these programs were poor in quality, though, and in both of these areas the software industries failed to develop significantly. It was not until the business market for microcomputer software matured that demand for the smaller machines solidified and ultimately ensured Apple’s success, but not without a challenge from a similar machine – the IBM Personal Computer (PC).

Although it was not apparent at first, three types of software packages (databases, spreadsheets, and word processing) created significant demand for the PC in the business world. In 1983, dBase emerged as the database leader with 150,000 units sold, WordStar sold 700,000 packages to take the lead among word processing software, while VisiCalc led the spreadsheet market with 800,000 units sold.[1] It was the spreadsheet, though, that had the most significant allure. When VisiCalc was created for the Apple II in late 1979, sales of both increased rapidly. Software Arts, which marketed and priced VisiCalc at around $100, had sales of $11 million in its first year, while Apple’s sales also continued to grow, reaching $600 million in 1982.[2] With the success of these three software areas, microcomputers were proving to be more than toys.

Until the electronic spreadsheet, the Apple II was largely considered a hobbyist’s toy for “men with big beards” to control things like model train sets.[3] But VisiCalc combined the microcomputer and money in extraordinary new ways. It was the “killer app” that launched the PC revolution, but it also brought powerful new techniques of numerical analysis to the individual. Countering the prevailing notion that accounting calculations were the domain of meekish accountants and subordinate secretaries, electronic spreadsheets reached a whole new range of entrepreneurs, executives, traders, and students. Competition was growing, though, and in 1982 a small company called Microsoft received acclaim for its new spreadsheet, Multiplan. The software market continued to advance, however, especially when a former employee of Personal Software (the company hired to market VisiCalc) took his earnings from the rights to software he had developed to start his own company and create a new spreadsheet program that would dominate sales for the rest of the decade. The new software package was called Lotus 1-2-3.

Lotus 1-2-3 was the product of a new company started by Mitch Kapor. Like Steve Jobs and Wozniak, Kapor came from a bastion of military investment in computer technology, but in this case it was Boston, not Silicon Valley. In the 1960s, he had the chance to learn computer programming in his high school and built a small computer/adding machine using a telephone rotary dialer for inputting data. But also like Jobs, he was highly influenced by the counter-cultural movement, primarily a reaction to the Vietnam War. After exploring a wide variety of life experiences, including teaching meditation for a while and getting a master’s degree in counseling psychology, Kapor returned to his computer roots. He went to work for Personal Software and designed a program called VisiPlot/VisiTrend to increase the readability of the spreadsheet.

But after a management change, he left the company. Before he left, though, he received $1.2 million for the rights to his software program. Despite his counter-big-business sensibilities, he took the money and started his own company called Micro Finance Systems. Its first product was the Executive Briefing System. But before he released it, he changed the name of the company to Lotus Development Corporation in honor of a meditation practice. Soon he got venture capitalist Ben Rosen to invest in a new product that would integrate a spreadsheet and a word processor into his graphics program. After an unsuccessful attempt to produce the word processor, they added a database program and began to market Lotus 1-2-3.[4]

The success of this new spreadsheet software was tied intimately to the success of another new microcomputer, the IBM Personal Computer, or “PC.” On August 12, 1981, Big Blue introduced its personal computer based on a 4.77 MHz Intel 8088 chip with 16K (expandable to 256K) of RAM and an operating system, PC-DOS, licensed from an upstart company called Microsoft. The IBM PC came with a 5.25-inch floppy disk drive and room for another. Working secretly as “Project Acorn” in Boca Raton, Florida, the renegade IBM group also created expansion cards, monochrome monitors, and printers for the new machine, as well as a host of new software programs. The IBM PC was attractive to traditional businesses and to mainstream computer users who had previously considered the small computer little more than a toy. The availability of the spreadsheet, however, turned the IBM PC into a practical business tool.

Just as VisiCalc helped the Apple II take off, Lotus 1-2-3 contributed greatly to the IBM PC’s success and vice-versa. The new spreadsheet package integrated its calculative abilities with graphics and database capabilities, hence the numerical suffix in its name. For the user, it meant that not only could they do sophisticated types of calculating, they could also print the results for business meetings and store them as data in an organized manner. Within a year of its release, the Lotus Corporation was worth over $150 million. The relationship with the IBM PC was symbiotic, as Big Blue’s new little computer sold over 670,000 units in its first year.[5] Lotus Corp, meanwhile, became the number one supplier of software as its sales revenues grew to $258 million in its third year.[6]

Lotus 1-2-3 also benefited greatly from what was arguably the deal of the century: Microsoft’s ability to license its operating system to IBM without granting it exclusive rights. Microsoft’s 1980 deal with IBM, which allowed it to sell its DOS software to other PC manufacturers, meant that Lotus 1-2-3 could be used on any “IBM-compatible” microcomputer, including the semi-portable Compaq machine. Compaq was started in 1982 and set out immediately to reverse engineer IBM’s BIOS (Basic Input/Output System) in order to produce its own IBM “clone.” Soon it had developed its own microcomputer that would run the same software as the IBM PC. Compaq achieved remarkable sales of $111 million in its first year and went on to become a Fortune 500 company.[7] Meanwhile, Lotus 1-2-3 became the most popular PC software program sold throughout the 1980s. Not unrelated, Bill Gates was on his way to becoming the richest man in the world, as he made money off both the OS for the IBM PC and each of the clones that used Microsoft’s operating system.

The personal computer was fast becoming a popular icon. By 1982, sales of the Apple II were strong and the IBM PC was quickly gaining a piece of the rapidly growing market. Furthermore, other companies were looking to compete, and by the end of the year over 5.5 million personal computers were in use as Atari, Commodore, Compaq, Osborne, Sinclair, Tandy Radio Shack, and Texas Instruments each offered their own models for sale.[8] Another important factor in the personal computer’s popularity was its new data communication capability. Hayes successfully marketed a 1200 bps modem, allowing computer users to access information services like CompuServe, Lockheed, and The Well.

The new PCs were so successful that TIME magazine decided to honor them. The magazine’s yearly acknowledgment generally goes to a person, and it had originally planned to name Apple’s Steve Jobs its “Man of the Year.” But the dramatic sales of the IBM PC and other machines at the end of the year convinced the editors to change their minds. Instead, in a January 1983 issue, TIME named the “Personal Computer” its “Machine of the Year.”

Notes

[1] Campbell-Kelly, M. and Aspray, W. Computer: A History of the Information Machine. p. 260.
[2] Apple statistics from Rose, Frank (1989) East of Eden: The End of Innocence at Apple Computer. NY: Viking Penguin Group. p.47. VisiCalc information from Jones Telecommunications & Multimedia Encyclopedia at http://www.digitalcentury.com/encyclo/update/bricklin.html, accessed on October 24, 2001.
[3] This is a term from the documentary, Triumph of the Nerds that played on PBS during 1996.
[4] Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. p. 338-344. This section contained biographical information on Mitch Kapor and the beginnings of Lotus 1-2-3
[5] Information on IBM PC release from Michael J. Miller’s editorial in the September 4, 2001 issue of PC MAGAZINE dedicated to the PC 20th anniversary.
[6] Information on Kapor and Lotus from the (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 128.
[7] Information on Compaq from (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 38.
[8] Number of PCs in 1982 was accessed on December 10, 2001 from: http://www.digitalcentury.com/encyclo/update/comp_hd.html.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Digital Spreadsheets – Techno-Epistemological Power over People and Resources

Posted on | September 27, 2018 | No Comments

In previous posts, I wrote that digital spreadsheets had emerged as a constitutive technology that can shape perceptions, organize resources, and empower control over the lived experiences of people and the dynamics of social organizations. In this post, I look at how communicative, command, and cultural dynamics provide an important context for the use of spreadsheets and the production of power within various organizations. Spreadsheets are used in many ways in an organization and by many people. Who can use the spreadsheet? Who can enter information? Who can make decisions based on that information?

Understanding spreadsheets helps us see how they work in organizations and how they are implicated in the reproduction of their information practices and institutional memories over time. I previously described the different media components of the spreadsheet that come together to create the gridmatic framework that registers, classifies, and identifies new conceptual understandings of organizational dynamics. These institutions or collectivities can be a neighborhood coffee shop or a global corporation; they can be a local Girl Scout Chapter or an international NGO.

Spreadsheet use is a techno-epistemological practice that alters the structural reality of the organization and operates in the enabling and constraining aspects of its operations. They combine media and computational capabilities in ways that conceptualize organizational realities by inventorying and tracking resources, providing comprehensive schematic views, and facilitating managerial decision-making by modeling situations and providing “what-if” scenarios. Techno-epistemological practice is the production of knowledge or justified belief. What are the necessary and sufficient conditions for a person to know something? What gives spreadsheet knowledge its validity?

Spreadsheets are noted for their ease of use and a familiar tabular visual format for organizing and presenting information. Their central technology is the list, which has a long history of being integral to the management of palaces, temples, and armies.[1] The table structure adds additional dimensions by combining columns and rows of lists that intersect at individual cells. The tabular grid of cells enhances the viewing and structuring of data values by using labels and annotations. Additionally, the computational capabilities of spreadsheets connecting groups of cells, and the low levels of competency needed for formulaic programming, enhance their organizational effectiveness.[2]

For my analysis of spreadsheet power, I have often drawn on the work of Anthony Giddens, particularly his theory of “time-space power,” which has information management and communication at its core as they “stretch” social institutions over durational time and geographic space. He identified three structural properties that work together to provide the cohesion institutions need to maintain themselves and grow over time. These are signification (meaning), domination (power), and legitimation (sanction).[3] An organizational agent utilizes these structures, called modalities, for social and operational interactions: communication and interpretive schemes; facilitation and provisioning; and norms, shared values, and proscriptions. Giddens sometimes uses the term “discipline,” which resonates better with what I’m trying to argue than “domination,” so I will often use that term instead.

Giddens’ “duality of structure” describes some of the limits and possibilities of human action in a social context. The structure defines both rules and resources for the human operative as well as constraints and enabling factors. It acknowledges the knowledgeability of the agent as well as the limits of rationality.

These structures simultaneously enable systems of comprehension and action for organizational agents. Together these structures often provide overlapping systems of cognition that form the communicative, command, and cultural dynamics of modern organizations. When spreadsheets are integrated into the organizations, they become implicated in the complex workings of these structural properties and, subsequently, they propel social organizations through time and across spatial dimensions, or what Giddens calls “time-space power.”

Signification

For the most part, my analysis of the spreadsheet has focused on signification. Words, list-making, table construction, and algorithmic formulations create points and grids of cognitive significance that produce the intelligibility of the spreadsheet. Each representation is structured by its own set of rules and dynamics. Writing uses phonographic lettering systems (or ideographic ones, in the case of Chinese and Japanese Kanji) with words and sentences organized by grammar and syntax.[4] The list is simple but profound – it is a non-syntactic ordering system that can be combined with columns to organize classification systems of major consequence. Tables create flexible grids of meaning that can show patterns, relationships, and connections.

Likewise, the placement system of numbers and the role of zero in a base-10 positional system help organize accounting and financial systems. Indo-Arabic numerals standardized a bookkeeping and calculative system that structured organizational dynamics and propelled global capitalism.

Discipline

How does the spreadsheet work within an organizational context? How are spreadsheets connected to the power dynamics of a modern organization? The notion of power is complex, but as Giddens argues, it is key to structuring and stretching organizations over time and across spatial distances. Power operates to ensure the repetition and expansion of institutional practices and/or to intervene to create changes and disrupt an organization. It has a transformative capacity, sometimes enabling, and sometimes dominating. What conditions provoke which transformations? Budgets, in particular, work to organize resources in an organization, and the PC-based spreadsheet made it easier to enter data and change information to suit different goals.

Giddens emphasizes that control over resources is one key to power in an organization. Power can be authoritative – control over social actors such as employees, volunteers, inmates, students, soldiers, etc. With a spreadsheet, each person is identified, registered, classified, and associated directly with responsibilities, eligibilities, and accountability. Power can also be allocative – control over the distribution of material resources such as computer equipment, vehicles, office supplies, etc. Control may be a strong term, depending on the institution; administering, coordinating, or leading are some other terms that may be useful to understand how spreadsheets help manage authoritative and allocative resources.

Authoritative power defines the capability of agents to manage the social environment of the organization through a combination of disciplinary and motivational practices. Disciplinary power is enhanced by the spreadsheet in that information-keeping is simplified and visually expressive. Spreadsheet information is usually abbreviated (as opposed to the file), situationally limited, and organized with comparison to other personnel in mind. For example, as I coordinate teaching schedules, the spreadsheet lists courses, times, days, and instructors. Take this satirical quote from Colm O’Regan, an Irish stand-up comedian and writer:

    As much as oil and water, our lives are governed by Excel. As you read these lines somewhere in the world, your name is being dragged from cell C25 to D14 on a roster. Such a simple action, yet now you’ll be asked to work on your day off. It is useless to protest. The spreadsheet has been printed – the word made mesh.

Spreadsheets can provide a surveillance function when tracking detailed information on performance, and they can be used to compare different workers, students, patients, etc. Spreadsheets can also “organize the time-space sequencing” of events and actions when arranged as timetables. Conversely, spreadsheets can be organized to monitor accomplishments and assign monetary or other awards.

The other category of resource power, allocative, involves control over material objects and goods. Allocation has to do with the distribution of resources and provides a key nexus of power in organizations when only certain individuals are empowered to use or apportion resources. Think of a military structure where the chain of command signifies the power to assign duties to subordinates or allocate provisions such as food, water, and ammunition to different units. The development of different types of barcodes and radio-frequency identification (RFID) technologies shows how modern information systems are used to track resources, and their output can be integrated directly into spreadsheet formulas.

It is no accident that the privatization era emerged concurrently with the spreadsheet. While a number of historical forces converged to facilitate the mass transfer of public wealth into private hands, the spreadsheet became the enabler – listing, commodifying, and valuing resources. The transition of government-owned telecommunications systems or Post, Telephone and Telegraph organizations (PTTs) into state-owned enterprises and finally into publicly-listed corporations required the identification and inventorying of assets such as copper cable lines, telephone poles, and maintenance trucks.

Spreadsheets provided an extraordinary new tool to cognize and help control the resources of an organization, including its people. It is useful to include an analysis of power when examining the spreadsheet and its use in organizations as it is involved with both the control of authoritative and allocative resources and their implication in the reproduction or transformation of organizational routines.

Legitimation

The third structural property for social interaction, legitimation, deals with the norms or sanctions that operate within an institution. Giddens emphasizes that human action is crucial in the enactment of organizational structures. Actors’ social identities and organizational statuses emerge out of the interplay between signification, domination, and legitimation in a process he calls “positioning.” Legitimation deals with the moral constitution of the organization: its rights, its values, its standards, its obligations. It defines codes of conduct, such as appropriate dress and the way people are addressed.

Human actors negotiate their situation with their own knowledge and skill sets and the organizational contexts that provide the “rules” for appropriate actions and behaviors. Agents draw on stocks of knowledge gathered over time via memory, social cues, and signified regulations to inform themselves about what is acceptable action. They anticipate the rewards of that action by considering the external context, conditions, and potential results of that action and its time-space ramifications. They learn to work within the guidelines of the organization, how to do the jobs they are assigned, and how to read the political dynamics.

Different organizations have varying criteria for success and sanction. Success generally relies on some measure of competence while sanction refers to both the constraining and enabling aspects of authoritative power and involves permissions and penalties. What behaviors will be encouraged or penalized? What sets of values are rewarded? Who will be held accountable for certain actions and outcomes?

Those in the organization who know how to use spreadsheets for various tabulation, optimization, and simulation purposes in support of decision making have a decided advantage. Spreadsheets have been acknowledged for their support in managerial success, primarily because of their ability to model situations and provide “what-if” scenarios. The spreadsheet table combines cells that hold assumptions, cells that contain tentative values, and a formulaic framework that produces a prediction.
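To make this concrete, here is a minimal sketch of such a what-if model, written in Python rather than in a spreadsheet and using entirely hypothetical figures, showing how assumption cells and a formulaic framework combine to produce a prediction:

    # Formula 'cell': the prediction produced from the assumption cells.
    def projected_profit(units_sold, price_per_unit, cost_per_unit, fixed_costs):
        revenue = units_sold * price_per_unit
        variable_costs = units_sold * cost_per_unit
        return revenue - variable_costs - fixed_costs

    # Assumption 'cells': tentative values a manager might adjust.
    base_assumptions = {"units_sold": 10_000, "price_per_unit": 25.0,
                        "cost_per_unit": 14.0, "fixed_costs": 60_000}

    # What-if scenarios: vary one assumption and recompute the prediction.
    for units in (8_000, 10_000, 12_000):
        scenario = dict(base_assumptions, units_sold=units)
        print(f"Units sold {units:,}: projected profit ${projected_profit(**scenario):,.0f}")

Changing a single assumption and watching the prediction update is precisely the kind of rapid scenario-testing that gives spreadsheet-literate managers their advantage.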

In this post, I attempted to connect how spreadsheets work with some of the communicative, cultural, and political processes that occur in institutions to enable control over people and material resources. In particular, I showed how a combination of resources, rules, and roles works to structure relations in institutions and convey important messages about the degree of power held by different people and positions. Although often criticized on safety and usability grounds, spreadsheets are part of the organizational information system that propels it through time and across space. More ethnographic research is needed to better understand the role of spreadsheets in the organizational context.

Notes

[1] Jack Goody's The Logic of Writing and the Organization of Society (1986) is noted for its historical research on the power of the list.
[2] Nardi, Bonnie A. and James R. Miller (1990) "The Spreadsheet Interface: A Basis for End-user Programming," in D. Diaper et al. (Eds.), Human-Computer Interaction: INTERACT '90. Amsterdam: North-Holland.
[3] Macintosh, N.B. and Scapens, R.W., "Structuration Theory in Management and Accounting," in Anthony Giddens: Critical Assessments, Volume 4, edited by Christopher G. A. Bryant and David Jary.
[4] "Differential processing of phonographic and logographic single-digit numbers by the two hemispheres," available at http://www.haskins.yale.edu/sr/sr081/SR081_14.pdf




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

The Cyberpunk Genre as Social and Technological Analysis

Posted on | August 13, 2018 | No Comments

I once taught a Freshman seminar in Information System Management (ISM) at New York University. The course was introductory and only two credits, so I felt we needed a focused, fun, yet comprehensive set of analytical concepts to shape our discussions and assignments about ISM in the modern world. I decided to use the "cyberpunk" genre (a subgenre of science fiction) to look at the relationship between emerging digital technologies and the types of societies they were creating. Can the analysis of genres also be used to analyze attributes of modern technology and their impact on the lives we live?

Frances Bonner's "Separate Development: Cyberpunk in Film and TV" in Fiction 2000: Cyberpunk and the Future of Narrative (1992) provided a framework concentrating on "…computers, corporations, crime, and corporeality–the four C's of cyberpunk film plotting."[1] The four "C's" were used by Bonner to analyze whether various films and television shows could be categorized as cyberpunk.

Would cyberpunk include such "Sci-Fi" literary classics as Philip K. Dick's Do Androids Dream of Electric Sheep? (1968) and William Gibson's Neuromancer (1984)? How about films such as Blade Runner (1982), The Matrix (1999), and the Terminator series? What about the relatively more recent Ready Player One (2018)? The 4 C's can be used to evaluate each of these for their cyberpunk "qualifications." Bonner considered the TV show Max Headroom to be probably the best embodiment of the cyberpunk genre based on her 4 C's.

Interestingly, cyberpunk appears to have gone mainstream more recently with major blockbuster movies, which often reflect the 4 C's. Tony Stark, in the Iron Man series, for example, embodies corporeality with the use of the Iron Man exoskeleton, the corporation with Stark Industries, and computers with networked augmented reality. Its criminality draws on several sources, including corrupt corporate executives, disgruntled Russians, and alien hordes – not standard cyberpunk icons, but an indication of the expansion of the genre towards "cy-fi" – cyberfictions.

More recently, Ghost in the Shell (2017), starring Scarlett Johansson, reprised the anime classic of the same name. Created as a manga by Masamune Shirow, it became an animated film in 1995. The movie examines whether memory or action defines identity, but it does so through technology and cyber villainy, with the CEO of Hanka Robotics as its major antagonist.

While the 4 C's are useful for genre analysis, they can also be helpful categories for socio-technical analysis. The typologies provide classification systems based on structural features that assist in making distinctions and interpretations. They have been used to examine the iconography of cyberpunk media, such as character types in graphic novels or set designs in films, to determine adherence to the genre. But they can also help analyze the socio-technical aspects of manufactured products and processes, including digitally-based services such as search engines or AI. The 4 C's provide convenient analytical categories for examining modern societies by using Bonner's conceptual tools: Computers/Cyberspace, Corporations, Criminality, and Corporeality.

The 4 “Cs” in Socio-Technical Analysis

Computers can easily be replaced with "cyberspace," as the combination of digital processing and networked communications provides a convenient point of departure for an analysis of contemporary cybersocieties. Technologies such as AI and robots have been staples of cyberpunk, as are networked flying cars.

Computers initially appeared in literary productions as large, dominant "brains," such as the giant computer in Colossus: The Forbin Project (1970) or HAL 9000 in 2001: A Space Odyssey (1968). These were no doubt based on the SAGE computers built by IBM and MIT as part of a North American hemispheric defense system that relied on radar stations located along the Distant Early Warning (DEW) line, ranging from Alaska, along the northern borders of Canada, to the tip of Long Island in New York.

By the 1980s, networking capabilities added new dimensions and thus new plot devices. WarGames (1983) drew on the history of the large mainframe computer (the WOPR) used for nuclear defense purposes but also introduced home terminals and a networked environment. Cyberspace soon competed with science fiction's interstellar rocket ship as the dominant literary icon.

Cyberspace originally referred to virtual environments that simulate physical spaces, objects, and interactions in a digital context – data stored in large computers or networks and represented as a three-dimensional model through which a virtual-reality user can move. It is represented in media through graphics, keyboards, text boxes, and human-computer interfaces.

Cyberspace is still often used to refer to the realm of digital communication, especially when it comes to security. Cybersecurity has become an essential discipline for safeguarding digital assets, preserving privacy, maintaining business continuity, and protecting individuals, organizations, and society from the growing threats posed by cybercriminals, hackers, and other malicious actors in cyberspace.

Corporations are organizations with limited liability and strong incentives to maximize profits. Investors are liable only up to the amount of their investments and are not personally responsible for negligence or criminal conduct on the part of the organization. Corporations are designed to raise capital by selling shares to the public.

Corporations often have a legal status as "artificial persons," which gives them rights comparable to those of human citizens. This peculiar status emerged from interpretations of a legal decision, Santa Clara County v. Southern Pacific Railroad (1886), that applied the 14th Amendment to corporations. This amendment to the United States Constitution was originally designed to secure rights for recently freed slaves, but corporate lawyers were able to use it to ensure that corporate entities could enter into legally binding contracts, own property, and sue other companies and people.

Corporations are prevalent icons in the cyberfiction genres. Intelligent buildings such as Network XXIII's headquarters in Max Headroom or Die Hard's Nakatomi Plaza represent the phallic connotation of corporate vitality. In the age of ethereal digital money, the marble and steel high-rise is the material representation of modern power. In a theological context, where power is arranged hierarchically, height attains a spiritual significance. In an example from "real" life, the corporate Majestic Tower in Wellington, New Zealand, was built next to St. Mary's Catholic church and given a mocking halo of lights as the country's elite embraced a new corporate mentality. Corporations are often represented through icons such as skyscrapers, boardrooms, logos, AIs, stock prices, ticker tapes, and executives.

Criminality is a standard literary device that has been successfully applied to the cyberpunk genre. It refers to transgressions of the law and addresses issues of ethics, and it is known historically from crime fiction, especially the gangster genre. The gangster, as a product of the new urban civilization, confronted the contradictions of liberal capitalism with its promise of a classless, democratic society. The genre pitted desire against constraint, where the gangster violates the system of rules and bureaucracy in the name of tragic individualism. The gangster character-type, with its propensity for dramatic action and individualistic profiteering, has long been a vehicle for politicizing capitalism's perennial problems – alienation, debt, greed, poverty, and unemployment. While most cyberpunk reifies the individual neo-liberal hacker and "his" struggle against officialdom, its more politicized forms point to the skill base and capital investment required for high-tech corporate espionage. Criminality in fiction is often represented by icons such as dress, weapons, language, violence, bling, computer hacking, and mug shots.

Corporeality is one of the most intriguing areas of the cyberpunk domain. What is the relationship between human bodies and technologies? What is human consciousness – the ghost in the machine? How do technological developments augment or replace the human body? How can the body be bio-engineered? A central issue is the commodification of the body. Drugs, implant devices, and external aids such as eyeglasses and hearing aids are some of the technologies sold to augment or control the human body. Cybernetic organisms – Donna Haraway's "cyborgs" and Tim Luke's "humachines" – constantly test the boundaries of what we consider human and what we consider machine. Corporeality is often represented by icons such as mind-body and other interfaces, drugs, and interchangeable body parts.

Bonner suggested that narratives can be categorized as "cyberpunk" when they include some combination of computers, corporations, crime, and corporeality.[2] The 4 C's of cyberpunk genre analysis also provide categories for examining the technological, economic, medical, and legal issues facing modern societies. They can help analyze the visual, textual, and auditory techniques that shape our stories of imagined futures. They can also serve as exploratory categories for understanding current socio-technical trajectories.

Notes

[1] Bonner, F. (1992) “Separate Development: Cyberpunk in Film and Television,” In (1992) Fiction 2000: Cyberpunk and the Future of Narrative. George Slusser and Tom Shippey (eds.) Athens, GA: The University of Georgia Press.
[2] ibid, p. 191.

Citation APA (7th Edition)

Pennings, A.J. (2018, Aug 13) The Cyberpunk Genre as Social and Technological Analysis. apennings.com https://apennings.com/dystopian-economies/the-cyberpunk-genre-as-social-analysis/




Anthony J. Pennings, PhD is Professor in the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at New York University and St. Edwards University in Austin, Texas. He wrote his PhD dissertation on Symbolic Economies and the Politics of Global Cyberspaces (1993) while teaching at Victoria University in Wellington, New Zealand, and serving as a Fellow at the East-West Center in Honolulu, Hawaii.

Java Continues to be the Most Popular Programming Language

Posted on | May 31, 2018 | No Comments

It has been a while since I reviewed the most popular programming languages. The most popular programming languages according to the statistics gathered for the TIOBE Index for May 2018 are:

  1. Java
  2. C
  3. C++
  4. Python
  5. C#
  6. Visual Basic .Net
  7. PHP
  8. Javascript
  9. SQL
  10. Ruby
  11. R

The TIOBE Index uses queries on several popular search engines to estimate the relative popularity of programming languages each month; it measures the volume of web pages, courses, and skilled engineers associated with each language rather than the lines of code actually written. In first place is Java, a language developed by Sun Microsystems (acquired by Oracle in 2010) in the mid-1990s.

Java was developed for interactive TV and mobile devices but found a more immediate home on the emerging World Wide Web. Sun open-sourced the Java language under the GNU General Public License (GPLv2) in November 2006, so anyone else could copy and use its code. Java has consistently been in the top five programming languages for the last 15 years, as have C and C++.

Java has also been a source of contention between Oracle and Google because of its use in the Android operating system. Oracle claimed Google infringed its Java copyrights by using roughly 11,500 lines of its code in Android. In 2016, a jury ruled in Google's favor, finding that its use of the Java APIs (application programming interfaces) constituted "fair use." The news was welcomed by developers who rely on access to open APIs to develop various services.

Java is valuable for developing Android apps and is also popular in the financial field for electronic trading, confirmation, and settlement systems. Big Data applications like Hadoop, ElasticSearch, and Apache's HBase are also written largely in Java. It is also preferred for artificial intelligence (AI), expert systems, natural language processing, and neural network applications, mainly because of the availability of Java code bases and the Java Virtual Machine (JVM) as a computing environment. It is also used for developing driverless car technology. Java tends to be safer, more portable, and easier to maintain than C or C++.

Large organizations tend to use Java more than smaller start-up companies. If you want to work in start-up hubs like San Francisco or Austin, Texas, you might want to learn Python or a variant of JavaScript. Seriously consider Java if you want to be employed in major cities with a high concentration of corporations, government agencies, or research institutes.

Having said this, programming languages like C++ and Python continue to be popular. Python is probably the easiest to learn and is used extensively at Google, including for YouTube. Several other indexes also monitor the use and popularity of computer programming languages.




Anthony J. Pennings, Ph.D. is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, Victoria University in New Zealand, and St. Edwards University in Austin, Texas, where he keeps his American home. He spent nine years as a Fellow at the East-West Center in Honolulu, Hawaii.

