Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

Potential Bill on Net Neutrality and Deep Packet Inspection

Posted on | October 17, 2018 | No Comments

Just received this discussion draft from one of my former colleagues at New York University, Thomas Dargan, a former Verizon executive.[1] Eliot Lance Engel (D-NY) is the U.S. Representative for New York’s 16th congressional district, which includes parts of the Bronx and Westchester County.[2] US telecommunications policy is grounded in the Communications Act of 1934, which created the Federal Communications Commission and established the importance of common carriage, a concept carried into current understandings of net neutrality.

115TH CONGRESS 2D SESSION

[DISCUSSION DRAFT]

H. R. __

To amend the Communications Act of 1934 to prohibit broadband internet access service providers from engaging in deep packet inspection.

IN THE HOUSE OF REPRESENTATIVES
—— introduced the following bill; which was referred to the Committee on ______________

A BILL

To amend the Communications Act of 1934 to prohibit broadband internet access service providers from engaging in deep packet inspection.

Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. SHORT TITLE.
This Act may be cited as the “Deep Packet Privacy Protection Act of 2018”.

SEC. 2. PROHIBITION ON DEEP PACKET INSPECTION.

(a) IN GENERAL.—Title VII of the Communications Act of 1934 (47 U.S.C. 601 et seq.) is amended by adding at the end the following:

“SEC. 722. PROHIBITION ON DEEP PACKET INSPECTION.

“(a) IN GENERAL.—A broadband internet access service provider may not engage in deep packet inspection, except in conducting a reasonable network management practice.

“(b) RULE OF CONSTRUCTION.—Nothing in this section shall be construed to prohibit a broadband internet access service provider from engaging in deep packet inspection as required by law, including for purposes of criminal law enforcement, cybersecurity, or fraud prevention.

“(c) DEFINITIONS.—In this section:
“(1) BROADBAND INTERNET ACCESS SERVICE.—

“(A) IN GENERAL.—The term ‘broadband internet access service’ means a mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all internet endpoints, including any capabilities that are incidental to and enable the operation of the communications service, but excluding dial-up internet access service.

“(B) FUNCTIONAL EQUIVALENT; EVASION.—The term ‘broadband internet access service’ also includes any service that—

“(i) the Commission finds to be providing a functional equivalent of the service described in subparagraph (A); or

“(ii) is used to evade the prohibitions set forth in this section.

“(2) DEEP PACKET INSPECTION.—The term ‘deep packet inspection’ means the practice by which a broadband internet access service provider reads, records, or tabulates information or filters traffic based on the inspection of the content of packets as they are transmitted across their network in the provision of broadband internet access service.

“(3) NETWORK MANAGEMENT PRACTICE.—The term ‘network management practice’ means a practice that has a primarily technical network management justification, but does not include other business practices.

“(4) REASONABLE NETWORK MANAGEMENT PRACTICE.—The term ‘reasonable network management practice’ means a network management practice that is primarily used for and tailored to achieving a legitimate network management purpose, taking into account the particular network architecture and technology of the broadband internet access service, including—

“(A) delivering packets to their intended destination;

“(B) detecting or preventing transmission of malicious software, including viruses and malware; and

“(C) complying with data protection laws and laws designed to prohibit unsolicited commercial electronic messages, including the CAN-SPAM Act of 2003 (15 U.S.C. 7701 et seq.) and section 1037 of title 18, United States Code.”.

(b) DEADLINE FOR RULEMAKING.—Not later than 180 days after the date of the enactment of this Act, the Federal Communications Commission shall issue a rule to implement the amendment made by subsection (a).

(c) EFFECTIVE DATE.—The amendment made by this section shall apply beginning on the date that is 270 days after the date of the enactment of this Act.
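The distinction the bill draws between routine forwarding and deep packet inspection can be made concrete with a short sketch: routing a packet requires only its header, while DPI reads the payload itself. The following Python is a toy model, not any real network stack; the Packet structure and the blocked-terms filter are invented for illustration:

```python
# Toy model of the header/payload distinction at issue in the draft bill.
# The Packet structure and keyword filter are illustrative, not a real protocol.
from dataclasses import dataclass

@dataclass
class Packet:
    src: str        # header fields: enough to forward the packet
    dst: str
    payload: bytes  # the content a DPI system would read

def route(packet: Packet) -> str:
    """Shallow inspection: forwarding needs only the header."""
    return packet.dst

def deep_inspect(packet: Packet, blocked_terms: list[bytes]) -> bool:
    """Deep packet inspection: reads the payload itself -- the practice
    the draft would prohibit outside reasonable network management."""
    return any(term in packet.payload for term in blocked_terms)

p = Packet(src="10.0.0.1", dst="93.184.216.34", payload=b"GET /video HTTP/1.1")
assert route(p) == "93.184.216.34"           # routing never touches the payload
assert deep_inspect(p, [b"/video"]) is True  # DPI can filter on content
```

The point of the sketch is that a provider can deliver packets to their intended destination (the bill's first "reasonable network management" example) without ever examining their content.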

Notes

[1] Tom Dargan can be reached at US 914-582-8995
[2] Eliot Lance Engel (D-NY) official website.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

TIME Magazine’s “Machine of the Year”

Posted on | October 10, 2018 | No Comments

The Apple II was quite a success when it was introduced in 1977, with sales of US$770,000 in its first year. Its growth over the next few years, however, was tremendous. Revenues hit $7.9 million in its second year and $49 million in 1979. Its founders, Steve Jobs and Steve Wozniak, were soon multimillionaires despite still being in their early 20s. Apple’s sales were initially concentrated in the hobbyist market, where recreational software such as games dominated. Another important market was education, with simulation games for teaching mathematics, music, and science. Most of these programs were of poor quality, though, and in both of these areas the software industries failed to develop significantly. It was not until the business market for microcomputer software matured that demand for the smaller machines solidified and ultimately ensured Apple’s success, but not without a challenge from a similar machine – the IBM Personal Computer (PC).

Although it was not apparent at first, three kinds of software packages – databases, spreadsheets, and word processing – created significant demand for the PC in the business world. In 1983, dBase emerged as the database leader with 150,000 units sold, WordStar sold 700,000 packages to take the lead among word processors, and VisiCalc led the spreadsheet market with 800,000 units sold.[1] It was the spreadsheet, though, that had the most significant allure. When VisiCalc was created for the Apple II in late 1979, sales of both increased rapidly. Software Arts, which marketed and priced VisiCalc at around $100, had sales of $11 million in its first year, while Apple’s sales also continued to grow, reaching $600 million in 1982.[2] With the success of these three software areas, microcomputers were proving to be more than toys.

Until the electronic spreadsheet, the Apple II was largely considered a hobbyist’s toy for “men with big beards” to control things like model train sets.[3] But VisiCalc combined the microcomputer and money in extraordinary new ways. It was the “killer app” that launched the PC revolution, and it also brought powerful new techniques of numerical analysis to the individual. Countering the prevailing notion that accounting calculations were the domain of meekish accountants and subordinate secretaries, electronic spreadsheets reached a whole new range of entrepreneurs, executives, traders, and students. Competition was growing, though, and in 1982 a small company called Microsoft received acclaim for its new spreadsheet, Multiplan. The software continued to advance, especially when a former employee of Personal Software (the company hired to market VisiCalc) took his earnings from the rights to software he had developed to start his own company and create a new spreadsheet program that would dominate sales for the rest of the decade. The new software package was called Lotus 1-2-3.

Lotus 1-2-3 was the product of a new company started by Mitch Kapor. Like Jobs and Wozniak, Kapor came from a bastion of military investment in computer technology, but in this case it was Boston, not Silicon Valley. In the 1960s, he had the chance to learn computer programming in high school and built a small computer/adding machine that used a telephone rotary dialer for inputting data. But also like Jobs, he was highly influenced by the counter-cultural movement, primarily a reaction to the Vietnam War. After exploring a wide variety of life experiences, including teaching meditation for a while and earning a master’s degree in counseling psychology, Kapor returned to his computer roots. He went to work for Personal Software and designed a program called VisiPlot/VisiTrend to increase the readability of the spreadsheet.

But after a management change he left the company, though not before receiving $1.2 million for the rights to his software. Despite his counter-big-business sensibilities, he took the money and started his own company, called Micro Finance Systems. Its first product was the Executive Briefing System, but before releasing it, he changed the company’s name to Lotus Development Corporation in honor of a meditation practice. Soon he got venture capitalist Ben Rosen to invest in a new product that would integrate a spreadsheet and a word processor into his graphics program. After an unsuccessful attempt to produce the word processor, the team added a database program and began to market Lotus 1-2-3.[4]

The success of this new spreadsheet software was tied intimately to the success of another new microcomputer, the IBM Personal Computer, or “PC.” On August 12, 1981, Big Blue introduced its personal computer based on a 4.77 MHz Intel 8088 chip with 16K (expandable to 256K) of RAM and an operating system, PC-DOS, licensed from an upstart company called Microsoft. The IBM PC came with a 5.25-inch floppy disk drive and room for another. Working secretly as “Project Acorn” in Boca Raton, Florida, the renegade IBM group also created expansion cards, monochrome monitors, and printers for the new machine, as well as a host of new software programs. The IBM PC was attractive to traditional businesses and to mainstream computer users who had previously considered the small computer little more than a toy. The availability of the spreadsheet, however, turned the IBM PC into a practical business tool.

Just as VisiCalc helped the Apple II take off, Lotus 1-2-3 contributed greatly to the IBM PC’s success, and vice versa. The new spreadsheet package integrated its calculative abilities with graphics and database capabilities, hence the numerical suffix in its name. For the user, it meant that not only could they do sophisticated types of calculation, they could also print the results out for business meetings and store them as organized data. Within a year of the software’s release, the Lotus Corporation was worth over $150 million. The relationship with the IBM PC was symbiotic, as Big Blue’s new little computer sold over 670,000 units in its first year.[5] Lotus Corp, meanwhile, became the number one supplier of software as its sales revenues grew to $258 million in its third year.[6]

Lotus 1-2-3 also benefited greatly from what was arguably the deal of the century: Microsoft’s ability to license its operating system to IBM without granting exclusive rights. Microsoft’s 1980 deal with IBM, which allowed it to sell its DOS software to other PC manufacturers, meant that Lotus 1-2-3 could be used on any “IBM-compatible” microcomputer, including the semi-portable Compaq machine. Compaq was started in 1982 and set out immediately to reverse engineer IBM’s BIOS (Basic Input/Output System) in order to produce its own IBM “clone.” Soon it had developed its own microcomputer that would run the same software as the IBM PC. Compaq achieved remarkable sales of $111 million in its first year and went on to become a Fortune 500 company.[7] Meanwhile, Lotus 1-2-3 became the most popular PC software program sold throughout the 1980s. Not unrelated, Bill Gates was on his way to becoming the richest man in the world, as he made money not only on the OS for the IBM PC but also on each of the clones that used Microsoft’s operating system.

The personal computer was fast becoming a popular icon. By 1982, sales of the Apple II were strong and the IBM PC was quickly gaining a share of the rapidly growing market. Furthermore, other companies were looking to compete, and by the end of the year over 5.5 million personal computers were in use as Atari, Commodore, Compaq, Osborne, Sinclair, Tandy Radio Shack, and Texas Instruments each offered their own models for sale.[8] Another important factor in the personal computer’s popularity was its new data communication capability. Hayes successfully marketed a 1200 bps modem, allowing computer users to access information services like CompuServe, Lockheed, and The WELL.

The new PCs were so successful that TIME magazine decided to honor them. The magazine’s yearly acknowledgment generally goes to a person, and it originally planned to name Apple’s Steve Jobs its “Man of the Year.” But the dramatic sales of the IBM PC and other machines at the end of the year convinced the editors to change their minds. Instead, in a January 1983 issue, TIME named the “Personal Computer” its “Machine of the Year.”

Notes

[1] Computer: A History of the Information Machine. p. 260.
[2] Apple statistics from Rose, Frank (1989) East of Eden: The End of Innocence at Apple Computer. NY: Viking Penguin Group. p.47. VisiCalc information from Jones Telecommunications & Multimedia Encyclopedia at http://www.digitalcentury.com/encyclo/update/bricklin.html, accessed on October 24, 2001.
[3] This is a term from the documentary, Triumph of the Nerds that played on PBS during 1996.
[4] Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. pp. 338-344. This section contains biographical information on Mitch Kapor and the beginnings of Lotus 1-2-3.
[5] Information on IBM PC release from Michael J. Miller’s editorial in the September 4, 2001 issue of PC MAGAZINE dedicated to the PC 20th anniversary.
[6] Information on Kapor and Lotus from the (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 128.
[7] Information on Compaq from (2002) Computing Encyclopedia. Volume 5: People. Smart Computing Reference Series. p. 38.
[8] Number of PCs in 1982 was accessed on December 10, 2001 from: http://www.digitalcentury.com/encyclo/update/comp_hd.html.





Digital Spreadsheets – Techno-Epistemological Power over People and Resources

Posted on | September 27, 2018 | No Comments

In previous posts, I wrote that digital spreadsheets had emerged as a constitutive technology that can shape perceptions, organize resources, and empower control over the lived experiences of people and the dynamics of social organizations. In this post, I look at how communicative, command, and cultural dynamics provide an important context for the use of spreadsheets and the production of power within various organizations. Spreadsheets are used in many ways in an organization and by many people. Who can use the spreadsheet? Who can enter information? Who can make decisions based on that information?

Understanding spreadsheets helps us see how they work in organizations and how they are implicated in the reproduction of their information practices and institutional memories over time. I previously described the different media components of the spreadsheet that come together to create the gridmatic framework that registers, classifies, and identifies new conceptual understandings of organizational dynamics. These institutions or collectivities can be a neighborhood coffee shop or a global corporation; they can be a local Girl Scout Chapter or an international NGO.

Spreadsheet use is a techno-epistemological practice that alters the structural reality of an organization and operates in both the enabling and constraining aspects of its operations. Spreadsheets combine media and computational capabilities in ways that conceptualize organizational realities by inventorying and tracking resources, providing comprehensive schematic views, and facilitating managerial decision-making by modeling situations and providing “what-if” scenarios. Techno-epistemology concerns the production of knowledge or justified belief: What are the necessary and sufficient conditions for a person to know something? What gives spreadsheet knowledge its validity?

Spreadsheets are noted for their ease of use and a familiar tabular visual format for organizing and presenting information. The spreadsheet’s central technology is the list, which has a long history of being integral to the management of palaces, temples, and armies.[1] Its table structure adds dimensions by combining columns and rows of lists that intersect at individual cells. The tabular grid of cells enhances the viewing and structuring of data values through labels and annotations. Additionally, the computational capabilities of the spreadsheet, which connect groups of cells, and the low level of competency needed for formulaic programming enhance its organizational effectiveness.[2]
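The grid-plus-formula combination described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not any real spreadsheet engine; the cell names and formulas are invented:

```python
# Minimal sketch of a spreadsheet's grid-plus-formula model.
# Cells hold either literal values or formulas over other cells;
# evaluation follows references recursively (no cycle detection, for brevity).
sheet = {
    "A1": 120,                                # literal data values
    "A2": 80,
    "B1": lambda get: get("A1") + get("A2"),  # =A1+A2
    "B2": lambda get: get("B1") * 0.1,        # =B1*0.1
}

def get(ref):
    """Resolve a cell: return literals directly, evaluate formulas."""
    cell = sheet[ref]
    return cell(get) if callable(cell) else cell

assert get("B1") == 200
assert get("B2") == 20.0
```

The low barrier to this kind of "formulaic programming" is visible here: a user writes only the short formula expressions, while the grid machinery handles the dependencies.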

For my analysis of spreadsheet power, I have often drawn on the work of Anthony Giddens, particularly his theory of “time-space power,” which has information management and communication at its core as they “stretch” social institutions over durational time and geographic space. He identified three structural properties that work together to provide the cohesion institutions need to maintain themselves and grow over time: signification (meaning), domination (power), and legitimation (sanction).[3] An organizational agent utilizes these structures, called modalities, for social and operational interactions: communication and interpretive schemes; facilitation and provisioning; and norms, shared values, and proscriptions. Giddens sometimes uses the term “discipline,” which resonates better with what I am trying to argue than “domination,” so I will often use that term instead.

Giddens’s “duality of structure” describes some of the limits and possibilities of human action in a social context. Structure provides both rules and resources for the human operative, constraining some actions and enabling others. It acknowledges the knowledgeability of the agent as well as the limits of rationality.

These structures simultaneously enable systems of comprehension and action for organizational agents. Together these structures often provide overlapping systems of cognition that form the communicative, command, and cultural dynamics of modern organizations. When spreadsheets are integrated into the organizations, they become implicated in the complex workings of these structural properties and, subsequently, they propel social organizations through time and across spatial dimensions, or what Giddens calls “time-space power.”

Signification

For the most part, my analysis of the spreadsheet has focused on signification. Words, list-making, table construction, and algorithmic formulations create points and grids of cognitive significance that produce the intelligibility of the spreadsheet. Each representation is structured by its own set of rules and dynamics. Writing uses phonographic lettering systems (or ideographic ones, in the case of Chinese and Japanese kanji) with words and sentences organized by grammar and syntax.[4] The list is simple but profound: a non-syntactic ordering system that can be combined with columns to organize classification systems of major consequence. Tables create flexible grids of meaning that can show patterns, relationships, and connections.

Likewise, the place-value system of numbers and the role of zero in a base-10 positional system help organize accounting and financial systems. Indo-Arabic numerals standardized a bookkeeping and calculative system that structured organizational dynamics and propelled global capitalism.
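Why position matters can be shown in a quick sketch: each digit's contribution depends on its place, and zero serves as a placeholder that keeps the other digits in theirs. The digits below are invented for illustration:

```python
# Base-10 positional notation: a digit's value depends on its place.
def positional_value(digits):
    """Interpret a list of base-10 digits, most significant first."""
    total = 0
    for d in digits:
        total = total * 10 + d  # shift previous digits one place left
    return total

assert positional_value([4, 0, 7]) == 407  # zero holds the tens place
assert positional_value([4, 7]) == 47      # without it, the value changes
```

It is this compact, calculable encoding that made columnar bookkeeping and arithmetic on paper (and later in spreadsheet cells) practical.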

Discipline

How does the spreadsheet work within an organizational context? How are spreadsheets connected to the power dynamics of a modern organization? The notion of power is complex, but as Giddens argues, it is key to structuring and stretching organizations over time and across spatial distances. Power operates to ensure the repetition and expansion of institutional practices and/or to intervene to create changes and disrupt an organization. It has a transformative capacity, sometimes enabling, and sometimes dominating. What conditions provoke which transformations? Budgets, in particular, work to organize resources in an organization, and the PC-based spreadsheet made it easier to enter data and change information to suit different goals.

Giddens emphasizes that control over resources is one key to power in an organization. Power can be authoritative – control over social actors such as employees, volunteers, inmates, students, soldiers, etc. With a spreadsheet, each person is identified, registered, classified, and associated directly with responsibilities, eligibilities, and accountability. Power can also be allocative – control over the distribution of material resources such as computer equipment, vehicles, office supplies, etc. Control may be a strong term, depending on the institution; administering, coordinating, or leading are some other terms that may be useful to understand how spreadsheets help manage authoritative and allocative resources.

Authoritative power defines the capability of agents to manage the social environment of the organization through a combination of disciplinary and motivational practices. Disciplinary power is enhanced by the spreadsheet in that information-keeping is simplified and visually expressive. Spreadsheet information is usually abbreviated (as opposed to a full personnel file), situationally limited, and organized with comparison to other personnel in mind. For example, as I coordinate teaching schedules, the spreadsheet lists courses, times, days, and instructors. Take this satirical quote from Colm O’Regan, an Irish stand-up comedian and writer:

    As much as oil and water, our lives are governed by Excel. As you read these lines somewhere in the world, your name is being dragged from cell C25 to D14 on a roster. Such a simple action, yet now you’ll be asked to work on your day off. It is useless to protest. The spreadsheet has been printed – the word made mesh.

Spreadsheets can provide a surveillance function when tracking detailed information on performance and can be used to compare different workers, students, patients, etc. Spreadsheets can also “organize the time-space sequencing” of events and actions when arranged as timetables. Conversely, spreadsheets can be organized to monitor accomplishments and assign monetary or other awards.

The other category of resource power, allocative, involves control over material objects and goods. Allocation has to do with the distribution of resources and provides a key nexus of power in organizations when only certain individuals are empowered to use or apportion resources. Think of a military structure, where the chain of command signifies the power to assign duties to subordinates or allocate provisions such as food, water, and ammunition to different units. Barcodes and radio-frequency identification (RFID) technologies are ways modern information systems track resources, with the resulting data integrated directly into spreadsheet formulas.

It is no accident that the privatization era emerged concurrently with the spreadsheet. While a number of historical forces converged to facilitate the mass transfer of public wealth into private hands, the spreadsheet became the enabler – listing, commodifying, and valuing resources. The transition of government-owned telecommunications systems or Post, Telephone and Telegraph organizations (PTTs) into state-owned enterprises and finally into publicly-listed corporations required the identification and inventorying of assets such as copper cable lines, telephone poles, and maintenance trucks.

Spreadsheets provided an extraordinary new tool to cognize and help control the resources of an organization, including its people. It is useful to include an analysis of power when examining the spreadsheet and its use in organizations, as it is involved in the control of both authoritative and allocative resources and in the reproduction or transformation of organizational routines.

Legitimation

The third structural property for social interaction, legitimation, deals with the norms and sanctions that operate within an institution. Giddens emphasizes that human action is crucial in the enactment of organizational structures. Actors’ social identities and organizational status emerge out of the interplay between signification, domination, and legitimation in a process he calls “positioning.” Legitimation deals with the moral constitution of the organization: its rights, its values, its standards, its obligations. It defines codes of conduct, such as appropriate dress and the way people are addressed.

Human actors negotiate their situation with their own knowledge and skill sets and the organizational contexts that provide the “rules” for appropriate actions and behaviors. Agents draw on stocks of knowledge gathered over time via memory, social cues, and signified regulations to inform themselves about what is acceptable action. They anticipate the rewards of that action by considering the external context, conditions, and potential results of that action and its time-space ramifications. They learn to work within the guidelines of the organization, how to do the jobs they are assigned, and how to read the political dynamics.

Different organizations have varying criteria for success and sanction. Success generally relies on some measure of competence while sanction refers to both the constraining and enabling aspects of authoritative power and involves permissions and penalties. What behaviors will be encouraged or penalized? What sets of values are rewarded? Who will be held accountable for certain actions and outcomes?

Those in the organization who know how to use spreadsheets for various tabulation, optimization, and simulation purposes in support of decision making have a decided advantage. Spreadsheets have been acknowledged for their support in managerial success, primarily because of their ability to model situations and provide “what-if” scenarios. The spreadsheet table combines cells that hold assumptions, cells that contain tentative values, and a formulaic framework that produces a prediction.
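The assumption-cells-plus-formula pattern described above can be sketched as a tiny "what-if" model in Python. The budget figures and parameter names are invented for illustration:

```python
# "What-if" pattern: assumption cells feed a formulaic framework that
# produces a prediction; changing one assumption re-derives the outcome.
def forecast(units_sold, unit_price, unit_cost, overhead):
    """Predict profit from a set of tentative assumption values."""
    revenue = units_sold * unit_price
    costs = units_sold * unit_cost + overhead
    return revenue - costs  # the prediction cell

baseline = forecast(units_sold=1000, unit_price=50, unit_cost=30, overhead=12000)
optimistic = forecast(units_sold=1200, unit_price=50, unit_cost=30, overhead=12000)

assert baseline == 8000      # 50000 - 42000
assert optimistic == 12000   # 60000 - 48000
```

The managerial advantage lies in the second call: one changed assumption instantly yields a new scenario, which is exactly what the spreadsheet's recalculating grid automates.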

In this post, I attempted to connect how spreadsheets work with some of the communicative, cultural, and political processes that occur in institutions to enable control over people and material resources. In particular, I show how a combination of resources, rules, and roles works to structure relations in institutions and convey important messages about the degree of power held by different people and positions. Although often criticized on safety and usability grounds, spreadsheets are part of the organization’s information system that propels it through time and across space. More ethnographic research is needed to better understand the role of spreadsheets in the organizational context.

Notes

[1] Jack Goody’s (1984) Writing and the Organization of Society is noted for its historical research on the power of the list.
[2] Bonnie A. Nardi and James R. Miller, “The Spreadsheet Interface: A Basis for End-user Programming,” in D. Diaper et al. (Eds.), Human-Computer Interaction: INTERACT ’90. Amsterdam: North-Holland, 1990.
[3] N.B. Macintosh and R.W. Scapens, “Structuration Theory in Management and Accounting,” in Anthony Giddens: Critical Assessments, Volume 4, edited by Christopher G. A. Bryant and David Jary.
[4] “Differential processing of phonographic and logographic single-digit numbers by the two hemispheres,” available at http://www.haskins.yale.edu/sr/sr081/SR081_14.pdf





The Cyberpunk Genre as Social and Technological Analysis

Posted on | August 13, 2018 | No Comments

I once taught a Freshman seminar at New York University in Information System Management (ISM). The course was introductory and only two credits, so I felt we needed a focused, fun, yet comprehensive set of analytical concepts to shape our discussions and assignments about ISM in the modern world. I decided to use the “cyberpunk” genre (a subgenre of science fiction) to look at the relationship between emerging digital technologies and the types of societies they were creating.

Frances Bonner’s “Separate Development: Cyberpunk in Film and TV” in Fiction 2000: Cyberpunk and the Future of the Narrative (1992) provided a framework concentrating on “…computers, corporations, crime, and corporeality–the four C’s of cyberpunk film plotting.”[1] The four “C’s” were used by Bonner to analyze whether various films and television shows could be categorized as cyberpunk.

Would cyberpunk include such “Sci-Fi” literary classics as Philip K. Dick’s Do Androids Dream of Electric Sheep? (1968) and William Gibson’s Neuromancer (1984)? How about films such as Blade Runner (1982), The Matrix (1999), and the Terminator series? What about the relatively more recent Ready Player One (2018)? The 4Cs can be used to evaluate each of these for their cyberpunk “qualifications.” Bonner considered the TV show Max Headroom to be probably the best embodiment of the cyberpunk genre based on her 4 C’s.

Interestingly, cyberpunk looks to have gone mainstream more recently with major blockbuster movies, which often reflect the 4Cs. Tony Stark, in the Iron Man series, for example, embodies corporeality with the use of the Iron Man exoskeleton, the corporation with Stark Industries, and computers with networked augmented reality. Its criminality draws on several sources, including corrupt corporate executives, disgruntled Russians, and alien hordes – not standard cyberpunk icons, but an indication of the expansion of the genre towards “cy-fi” – cyberfictions.

More recently, Ghost in the Shell (2017), starring Scarlett Johansson, reprised the anime classic of the same name. Created by Masamune Shirow, the manga became an animated movie in 1995. The movie examines whether memory or action defines identity, using technology and cyber villainy, with the CEO of Hanka Robotics as its major antagonist.

While the 4 C’s are useful for genre analysis, they can also be helpful categories for socio-technical analysis. Typologies provide classification systems based on structural features that assist distinctions and interpretations. The 4 C’s have been used to examine the iconography of cyberpunk media, such as character types in graphic novels or set designs in films, to determine a work’s adherence to the genre. But they can also help analyze the socio-technical aspects of manufactured products and processes, including digitally based services such as search engines or AI. The 4Cs provide convenient analytical categories for examining modern societies by providing conceptual tools on Computers/Cyberspace, Corporations, Criminality, and Corporeality.
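To make the rubric concrete, the 4 C’s can be treated as a simple checklist. The sketch below is my own illustration, not part of Bonner’s analysis; the works and the traits assigned to them are invented examples, not definitive readings.

```python
# Illustrative only: scoring works against Bonner's four Cs.
FOUR_CS = {"computers", "corporations", "crime", "corporeality"}

def cyberpunk_score(traits):
    """Count how many of the 4 Cs a work exhibits."""
    return len(FOUR_CS & set(traits))

# Invented trait assignments for illustration.
works = {
    "Neuromancer (1984)": {"computers", "corporations", "crime", "corporeality"},
    "Max Headroom": {"computers", "corporations", "crime", "corporeality"},
    "Iron Man": {"computers", "corporations", "corporeality"},
}

for title, traits in works.items():
    print(f"{title}: {cyberpunk_score(traits)} of 4 Cs")
```

A higher count suggests a stronger claim to the cyberpunk label, which is essentially how Bonner weighs candidate films and shows.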

The 4 “Cs” in Socio-Technical Analysis

Computers can easily be replaced with “cyberspace,” as the combination of digital processing and networked communications provides a convenient point of departure for an analysis of contemporary cybersocieties. Technologies such as AI and robots have been staples of cyberpunk, as have networked flying cars.

Computers initially appeared in literary productions as large, dominant “brains,” such as the giant computer in Colossus: The Forbin Project (1970) or HAL 9000 in 2001: A Space Odyssey (1968). These were no doubt based on the SAGE computers built by IBM and MIT as part of a North American hemispheric defense system based on radar stations located along the Distant Early Warning (DEW) Line ranging from Alaska, along the northern borders of Canada, to the tip of Long Island in New York.

By the 1980s, networking capabilities added new dimensions and thus new plot devices. WarGames (1983) drew on the history of the large mainframe computer (the WOPR) used for nuclear defense purposes but also introduced home terminals and a networked environment. Cyberspace soon competed with science fiction’s interstellar rocket ship as the dominant literary icon.

Cyberspace originally meant virtual environments that simulate physical spaces, objects, and interactions in a digital context. It referred to data stored in large computers or a network, represented as a three-dimensional model through which a virtual-reality user can move. It is represented in media through graphics, keyboards, text boxes, and human-computer interfaces.

Cyberspace is still often used to refer to the realm of digital communication, especially when it comes to security. Cybersecurity has become an essential discipline for safeguarding digital assets, preserving privacy, maintaining business continuity, and protecting individuals, organizations, and society from the growing threats posed by cybercriminals, hackers, and other malicious actors in cyberspace.

Corporations are organizations with limited liability and strong incentives to maximize profits. Investors are protected to the amount of their investments and are not liable for negligence or criminal conduct on the part of the organization. Corporations are designed to raise capital by selling shares to the public.

Corporations often have a legal status as “artificial persons,” which gives them rights comparable to human citizens. This peculiar status emerged from an interpretation of a legal decision, Santa Clara County v. Southern Pacific Railroad, that applied the 14th Amendment to corporations. This amendment to the United States Constitution was originally designed to secure rights for the recently freed slaves, but corporate lawyers were able to use it to ensure corporate entities could enter into legally binding contracts, own property, and sue other companies and people.

Corporations are prevalent icons in the cyberfiction genres. Intelligent buildings such as Network 23’s headquarters in Max Headroom or Die Hard‘s Nakatomi Plaza represent the phallic connotation of corporate vitality. In the age of ethereal digital money, the marble and steel high-rise is the material representation of modern power. In the theological context, where power is arranged hierarchically, height attains a spiritual significance. In an example from “real” life, the corporate Majestic Tower in Wellington, New Zealand, was built next to St. Mary’s Catholic church and given a mocking halo of lights as the country’s elite embraced a new corporate mentality. Corporations are often represented through icons such as skyscrapers, board rooms, logos, AIs, stock prices, ticker tapes, and executives.

Criminality is a standard literary device that was successfully applied to the cyberpunk genre. It refers to transgressions of law and addresses issues of ethics. It is known historically in crime fiction, especially in the gangster genre. The gangster, as a product of the new urban civilization, confronted the contradictions of liberal capitalism with its promise of a classless, democratic society. The genre pitted desire against constraint, where the gangster violates the system of rules and bureaucracy in the name of tragic individualism. The gangster character-type, with its propensity towards dramatic action and individualistic profiteering, has long been a vehicle for politicizing capitalism’s perennial problems – alienation, debt, greed, poverty, and unemployment. While most cyberpunk reifies the individual neo-liberal hacker and “his” struggle against officialdom, its more politicized forms point to the skill base and capital investment required for high-tech corporate espionage. Criminality in fiction is often represented by icons such as dress, weapons, language, violence, bling, computer hacking, and mug shots.

Corporeality is one of the most intriguing areas of the cyberpunk domain. What is the relationship between human bodies and technologies? What is human consciousness? The ghost in the machine? How do technological developments augment or replace the human body? How can the body be bio-engineered? A central issue is the commodification of the body. Drugs, implant devices, and external aids such as eyeglasses and hearing aids are some of the technologies sold to augment or control the human body. Cybernetic organisms, Donna Haraway’s “cyborgs” and Tim Luke’s “humachines,” constantly test the boundaries of what we consider human and what we consider machine. Corporeality is often represented by icons such as mind-body and other interfaces, drugs, and interchangeable body parts.

Bonner suggested that narratives can be categorized as “cyberpunk” when they include some combination of computers, corporations, crime, and corporeality.[2] The 4 Cs of cyberpunk genre analysis also provide categories to examine the technological, economic, medical, and legal issues facing modern societies. They can help analyze the types of visual, textual, and auditory techniques that shape our stories of imagined futures. They can also serve as exploratory categories for understanding current socio-technical trajectories.

Notes

[1] Bonner, F. (1992) “Separate Development: Cyberpunk in Film and Television,” in Fiction 2000: Cyberpunk and the Future of Narrative, George Slusser and Tom Shippey (eds.). Athens, GA: The University of Georgia Press.
[2] Ibid., p. 191.

Citation APA (7th Edition)

Pennings, A.J. (2018, Aug 13) The Cyberpunk Genre as Social and Technological Analysis. apennings.com https://apennings.com/dystopian-economies/the-cyberpunk-genre-as-social-analysis/


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at New York University and St. Edwards University in Austin, Texas. He wrote his PhD dissertation on Symbolic Economies and the Politics of Global Cyberspaces (1993) while teaching at Victoria University in Wellington, New Zealand, and while a Fellow at the East-West Center in Honolulu, Hawaii.

Java Continues to be the Most Popular Programming Language

Posted on | May 31, 2018 | No Comments

It has been a while since I reviewed the most popular programming languages. The top programming languages according to the statistics gathered for the TIOBE Index for May 2018 are:

  1. Java
  2. C
  3. C++
  4. Python
  5. C#
  6. Visual Basic .NET
  7. PHP
  8. JavaScript
  9. SQL
  10. Ruby
  11. R

The TIOBE Index uses hits from several search engines to estimate, month by month, which programming languages are most popular; it measures search interest in each language rather than the amount of code written in it. In first place is Java, a language developed by Sun Microsystems (later acquired by Oracle) in the mid-1990s.
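As a minimal illustration, the May 2018 rankings listed above can be treated as ordered data. This is my own sketch, not anything published by TIOBE:

```python
# The May 2018 TIOBE top languages as listed in this post, in rank order.
TIOBE_MAY_2018 = [
    "Java", "C", "C++", "Python", "C#", "Visual Basic .NET",
    "PHP", "JavaScript", "SQL", "Ruby", "R",
]

def rank_of(language):
    """Return the 1-based TIOBE rank, or None if the language is unlisted."""
    try:
        return TIOBE_MAY_2018.index(language) + 1
    except ValueError:
        return None

print(rank_of("Python"))  # → 4
```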

Java was originally developed for interactive TV and mobile devices but found a more immediate home on the emerging World Wide Web. Sun open-sourced the Java language under the GNU General Public License (GPLv2) in November 2006, so anyone could copy and use its code. Java has consistently been in the top 5 programming languages for the last 15 years, as have C and C++.

Java was also a source of contention between Oracle and Google over its use in the Android operating system. Oracle claimed Google had infringed its Java copyright by using 11,500 lines of its code in Android. In 2016, Google won the case, a verdict that protected the idea of “fair use” for APIs (application programming interfaces). The news was welcomed by developers who rely on access to open APIs to develop various services.

Java is valuable for developing Android apps and is also popular in the financial field for electronic trading, confirmation, and settlement systems. Big Data applications like Hadoop, ElasticSearch, and Apache’s Java-based HBase also tend to use Java. It is also preferred for artificial intelligence (AI), expert systems, natural language processing, and neural network applications, mainly because of the availability of Java code bases and the Java Virtual Machine (JVM) as a computing environment. It is also used for developing driverless car technology. Java tends to be safer, more portable, and easier to maintain than the C family of languages.

Large organizations tend to use Java more than smaller start-up companies. If you want to work in start-up hubs like San Francisco or Austin, Texas, you might want to learn Python or a variant of JavaScript. Seriously consider Java if you want to be employed in major cities with a high concentration of corporations, government agencies, or research institutes.

Having said this, programming languages like C++ and Python continue to be popular. Python is probably the easiest to learn and is widely used at Google and YouTube. Other indexes also monitor the use and popularity of computer programming languages.





Anthony J. Pennings, Ph.D. is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, Victoria University in New Zealand, and St. Edwards University in Austin, Texas, where he keeps his American home. He spent 9 years as a Fellow at the East-West Center in Honolulu, Hawaii.

Anchoring Television News

Posted on | May 8, 2018 | No Comments

“The news is privileged discourse, invested with a special relation to the Real.” [1]

The news anchor is a finely tuned instrument of television performance. Unlike print journalism, where disembodied letters of information suggest an objective third person, the televisual anchor is intimate and direct.

The news broadcaster leads the viewer through the news while “anchoring” their attention to specific topics. The anchor anchors meaning. The anchor fixes meaning, in the sense that connections are made and reinforced through the authority and credibility of the speaker. The anchor signals what is important and what is to be dismissed or ignored.[2]

He or she, or both, believe in the news, and that makes all the difference. Groomed and conditioned into the voice of authority, the anchor trades in the currency of assurance and credibility.

As the anchor is a guest in the homes and offices of the viewer, they must be trustworthy, well groomed, appropriately dressed, and display the manners appropriate to such an intrusion. But as they make themselves at home, anchors engage in light banter, laughing and joking with each other and including the viewer, albeit vicariously, in their community.

The anchor pulls the viewer into the hyper-real globe of television news and establishes the link between the world and its representation. As surveillance of the world is one of the key aspects of mass media, the viewer is transported around the world, peeking in on floods and coups, hurricanes and elections, earthquakes and ethnic cleansings. The viewer is included in the sphere of politics and economics.

When the anchor reads the news, computer graphics are often used. In particular, charts give a dynamic, historical validity to the news. A three-month chart of a company’s stock price, for example, lends an empirical rhetoric that reconfirms the anchor’s argument about the relative strength or weakness of that company.

Or now, the anchor can be designed as a computer graphic. Examining the news anchor from the perspective of AI is useful because it raises the question, “What makes the news anchor?” Addressing this question allows for the denotative analysis of the news anchor and the reconstruction of the anchor with digital components, like constructing an avatar in a metaverse environment.

This post introduced some aspects of a formalistic analysis of television news. By examining the “anchor” of TV news, it suggests that television news has rhetorical dimensions that influence business decisions, government policies, and personal worldviews.


Notes

[1] Morse, M. (1986) “The Television News Personality and Credibility: Reflections on the News in Transition,” in Studies in Entertainment: Critical Approaches to Mass Culture, Tania Modleski (ed.).
[2] A ship uses a heavy object called an anchor, attached to a rope or chain, to moor the vessel to the bottom of a lake or sea. Metaphorically, the anchor anchors meaning.

Citation APA (7th Edition)

Pennings, A.J. (2018, May 8). Anchoring Television News. apennings.com https://apennings.com/media-strategies/show-biz-anchoring-the-financial-imagination/






GLOBAL INNOVATION INDEX

Posted on | March 8, 2018 | No Comments

The Global Innovation Index (GII) signifies the key role of innovation in economic growth, competitiveness, and sustainability.

Co-published by Cornell University, INSEAD, and the World Intellectual Property Organization (WIPO), the GII attempts to identify and measure key innovation drivers that assist countries in developing policies to increase employment, improve productivity, and support long-term output growth.

The index is based on data from several sources, including the International Telecommunication Union (ITU), the World Bank and the World Economic Forum. It provides key insights on a wide range of national metrics that help policy-makers develop legislation and regulations that can facilitate economic activity. It currently assesses data in 127 national economies covering over 92% of the world’s population and 98% of global GDP.

The GII Report ranks world economies in terms of their innovation capabilities and results, recognizing the need for indicators that go beyond traditional measures of innovation such as research and development (R&D).

The GII publishes its data in seven major categories called “pillars.” Five input pillars comprise the Innovation Input Index and capture elements of the national economy that enable or enhance innovative activities: Institutions, Human Capital and Research, Infrastructure, Market Sophistication, and Business Sophistication. Two pillars, called the Innovation Output Index, capture actual evidence of successful innovation outputs: Knowledge and Technology Outputs, and Creative Outputs.

INSTITUTIONS

The Institutions pillar captures the political economy framework of a country. This includes the political environment, political stability and absence of violence/terrorism, government effectiveness, and the regulatory environment. Business confidence and flexibility are important too, including regulatory quality, rule of law, cost of redundancy dismissal, business environment, ease of starting a business, ease of resolving insolvency, and ease of paying taxes.

HUMAN CAPITAL AND RESEARCH

This pillar gauges the human capital of countries and includes education levels and expenditures on education. This includes assessment in reading, mathematics, and science as well as pupil-teacher ratios in secondary and tertiary education and rankings of universities. Also considered are graduates in science and engineering, gross expenditure on R&D, and global R&D companies.

INFRASTRUCTURE

The third pillar measures information and communication technologies (ICTs), general infrastructure, and ecological sustainability. ICT includes ICT access, ICT use, government’s online services, and online e-participation. General infrastructure includes electricity output, logistics performance, and gross capital formation. Ecological sustainability measures GDP per unit of energy use, and environmental sustainability performance such as ISO 14001 environmental certificates.

MARKET SOPHISTICATION

The Market Sophistication pillar has three sub-pillars structured around credit; investment; and trade, competition, and market conditions. Areas include micro-finance and venture capital as well as the total level of transactions.

BUSINESS SOPHISTICATION

The fifth enabler pillar tries to capture how conducive firms are to innovation activity. Indicators include the number of knowledge workers: employment in knowledge-intensive services, firms offering formal training, and females employed with advanced degrees. Innovation linkages include university/industry research collaboration and cluster development. Intellectual property and royalty payments have become prime indicators of innovation, as have high-tech imports, ICT services imports, and research talent in business enterprises.

KNOWLEDGE AND TECHNOLOGY OUTPUTS

This pillar covers the variables that are traditionally thought to be the fruits of inventions and innovations. These include knowledge creation, patent applications by origin, scientific and technical publications, and the growth rate of GDP per person engaged. Technology outputs include total computer software spending, high-tech and medium-high-tech output, knowledge diffusion, intellectual property receipts, high-tech exports, and ICT services exports.

CREATIVE OUTPUTS

The last pillar on creative outputs measures the role of creativity for innovation. Areas include: intangible assets, trademark applications by origin, industrial designs by origin, ICTs and business model creation, ICTs and organizational model creation. Creative goods and services include cultural and creative services exports, national feature films produced, global entertainment and media market, printing and publishing output, and creative goods exports. Another area is online creativity such as generic top-level domains (gTLDs), Country-code top-level domains (ccTLDs), Wikipedia yearly edits, and Video uploads on YouTube.

These two sub-indices, the Innovation Input Index and the Innovation Output Index, are averaged to compute the GII. The first combines the five enabler pillars and the second the two output pillars, with each score calculated by a weighted-average method. The overall GII score is the average of the Input and Output Sub-Indices. Below are the scores tallied for the 2017 Report.
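The averaging described above can be sketched in a few lines. The pillar scores below are invented for illustration; the real GII also applies weights to the individual indicators within each pillar.

```python
# Hedged sketch of the GII arithmetic: each sub-index averages its pillar
# scores, and the overall GII averages the two sub-indices.

def sub_index(pillar_scores):
    """Average a list of pillar scores (0-100 scale)."""
    return sum(pillar_scores) / len(pillar_scores)

# Institutions, Human Capital & Research, Infrastructure,
# Market Sophistication, Business Sophistication (invented values)
input_pillars = [88.0, 62.5, 65.0, 70.0, 60.5]
# Knowledge & Technology Outputs, Creative Outputs (invented values)
output_pillars = [55.0, 48.0]

innovation_input = sub_index(input_pillars)    # 69.2
innovation_output = sub_index(output_pillars)  # 51.5
gii = (innovation_input + innovation_output) / 2
print(round(gii, 2))  # → 60.35
```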

Global Innovation Index 2017 Rankings [Top 15]
Rank Country
1 Switzerland
2 Sweden
3 Netherlands
4 United States
5 United Kingdom
6 Denmark
7 Singapore
8 Finland
9 Germany
10 Ireland
11 South Korea
12 Luxembourg
13 Iceland
14 Japan
15 France





Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

THE EXPERIMENT, Part I: NEW ZEALAND AS THE WORLD MODEL FOR DIGITAL MONETARISM

Posted on | February 11, 2018 | No Comments


Starting “Down Under”
One of the first “guinea pigs” for the global system of digital monetarism was New Zealand. A one-time leader in developing the “welfare state,” the small two-island nation-state in the deep Pacific had run into economic problems by the early 1980s. It had borrowed heavily during the previous decade, and its agricultural products were increasingly excluded from the rich United Kingdom market due to Britain’s deepening participation in the European Community. New Zealand’s industrializing attempts also ran up against rapidly escalating costs, especially for oil. As a result, the economy struggled, and a financial crisis ensued that would turn the country’s tide.[1]

In 1984, a new Labour government was voted in under Prime Minister David Lange. It was also a time when an active environmentalist and pacifist movement was growing in the small country. Subsequently, the new government voted to restrict nuclear vessels from coming into their ports. The decision represented a major diplomatic problem as the country was party to the ANZUS Treaty. This treaty brought the nation along with Australia under the defensive protection of the United States. As US policy was never to confirm nor deny the existence of nuclear weapons on any of its ships, it effectively meant that no US ships could port in New Zealand.

Consequently, Reagan’s Secretary of State George Shultz traveled deep into the Pacific to meet with the leaders of the new Labour Government. Shultz had been Secretary of the Treasury under Richard Nixon and was one of the architects of the Reagan economic changes. The contents of the meeting are sketchy, but the result was that New Zealand could keep its non-nuclear status but needed to undergo major economic restructuring in line with what was going on in the US and in Britain under Margaret Thatcher.

Under the direction of New Zealand’s Treasury and Ministry of Finance, a new strategy for the country was developed. Their Economic Management (1984) report contained the seeds of the intended transformation from a “Welfare State” to a new kind of “Enterprise Society” lubricated by digital financial activities. The new government instituted radical reform measures to cut government spending, implement a neo-liberal regulatory regime, “reinvent” the civil service, and privatize many government organizations, including the Post Office. The intention was to monetarize the national political economy in conjunction with emerging global financial and trade practices.

“Rogernomics,” as it came to be called, was a strategy for reviving the sluggish and debt-ridden economy by refocusing on private exchanges, or what are aggregately called “markets.” Named after Labour’s Minister of Finance, Roger Douglas, the national program offered a host of measures designed to dismantle the welfare apparatus. Drawing on its strong export trade in animal and natural-resource products, New Zealand had long attempted to provide its citizens with free education, healthcare, unemployment insurance, and social security. The new Labour-led government would focus instead on cutting fiscal expenditures, streamlining bureaucracy, selling off state assets, liberalizing trade, and controlling inflation.

In 1985, the Labour Party government launched a review of the Post Office. Its final report recommended transforming the postal service into three state-owned enterprises. The government in 1986 passed through parliament the State-Owned Enterprises Act that corporatized several government agencies into state-owned enterprises (SOE).

The New Zealand Post Office’s corporatization was completed with the 1987 passage of the Postal Services Act. Along with the SOE Act, the legislation broke up the New Zealand Post Office into three corporations: the postal service New Zealand Post Limited, the savings bank Post Office Bank Limited, and the telecommunications company Telecom New Zealand Limited. Within a few years, PostBank and Telecom were privatized, and only New Zealand Post remained a state-owned enterprise. [3]

Central to the new strategy was the deregulation and privatization of the telecommunications sector. Previously, the sector was under the purview of the New Zealand Post Office (along with the national bank system) and operated like a traditional PTT. But under this new system, the telecommunications company was first valued and corporatized as a state-owned enterprise (SOE) and then sold off.

Throughout the world, the telecommunications infrastructure would be the regime of digital monetarism’s first target. The reason was twofold. First, telecommunications was identified as the main conduit for both domestic and transnational business. Digital monetarism needed the fluid movement of information and electronic money within and through national borders. The national telecommunications system, while mainly bureaucratic and voice-based, still presented the best opportunity to create a modernized data communications system. The second reason was that, because of the high level of investment needed for a modern telecommunications system, a privatized “telco” would be a major listing on a domestic stock market and was a high priority for investment bankers.

Telecommunications companies became almost universally the largest companies by market capitalization (current share price times the number of shares outstanding) by the end of the millennium. After a period of deregulation and modernization, New Zealand sold its Telecom SOE to Bell Atlantic and Ameritech, two American “Baby Bells.” The company’s shares were also partially floated on public stock markets, and it soon became the largest listing on the New Zealand sharemarket. When the selloff occurred in 1989, it was announced with the expectation that it would retire one-third of the government debt.
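The market-capitalization formula mentioned above is simple arithmetic, and a quick sketch makes it concrete. The figures below are hypothetical, not Telecom New Zealand’s actual numbers.

```python
# Market capitalization = share price x shares outstanding.
# Hypothetical figures for illustration only.

def market_cap(share_price, shares_outstanding):
    """Return market capitalization in the share price's currency."""
    return share_price * shares_outstanding

print(market_cap(4.50, 1_000_000_000))  # → 4500000000.0, i.e., $4.5 billion
```

A newly privatized telco with hundreds of millions of shares outstanding could thus dwarf other domestic listings, which is why investment bankers prioritized these sales.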

The experiment was a move towards a newly liberalized market economy centered around digital financial transactions and telecommunications. It was led ideologically by attacks on the Keynesian system of economic management but was driven by the global debt crisis of the 1980s. New monetary liquidity emerged after Nixon dismantled the Bretton Woods system of currency regulation, and technological innovations were once again applied for financial gain. Companies like Reuters developed new computerized systems for currency trading and global news, and a global “information standard” emerged that replaced the gold standard as the system for ordering the global economy.

Notes

[1] Britain had resisted joining the EEC because of its existing trading obligations with the Commonwealth, primarily former colonies including New Zealand. Also, continental interests, primarily France, were suspicious of British ties to the US. But as the post-WWII economic boom continued in Europe, membership became too attractive, and on January 1, 1973, Britain was admitted into the EEC. France agreed, partly because Britain represented a balance to Germany’s power.
[3] Patrick G. McCabe (1994) “New Zealand: The Unique Experiment in Deregulation,” in Telecommunications in the Pacific Basin: An Evolutionary Approach, Eli Noam, Seisuke Komatsuzaki, and Douglas A. Conn (eds.). New York: Oxford University Press. Originally presented at the Pacific Telecommunications Conference.





  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.