Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

How IT Came to Rule the World, 2.7: The Origins of Microsoft

Posted on | September 22, 2010

The Intel 8080, the Altair, and the Formation of Microsoft

As kids, Bill Gates and Paul Allen dreamed of having their own Fortune 500 company.[1] The two became friends (and sometimes adversaries) when both attended the prestigious Lakeside School in Seattle in the early 1970s. But their friendship was mediated by a third entity, a Teletype ASR-33 connected remotely to a computer. As video terminals were rare and expensive, this popular teleprinter was often recruited to provide an interface to computers. Lakeside used the ASR-33 to connect to a timeshare service offered by General Electric (GE) in the Seattle area.

However, the service, based on sharing the resources of a DEC PDP-10, soon proved to be expensive. Despite an infusion of $3,000 into the Lakeside computer account by the Lakeside Mothers Club, the boys (Lakeside was an all-male school) soon ran out of computer time. Luckily, the mother of one Lakeside student was a co-founder of a new company called the Computer Center Corporation, which offered the students time on its PDP-10 in exchange for help debugging its software.

Allen and Gates became quite proficient with the machine, and even after the arrangement ended, they continued to “hack” into it. Although they were eventually caught, they gained a notorious but ultimately beneficial reputation for their exploits. Rather than continue hacking, however, they went into business, forming the Lakeside Programmers Group with a few other students.[2]

The Lakeside Programmers Group mostly traded programming services for computer time, but it laid the foundation for their next enterprise, Traf-O-Data. This new company was formed in 1972 to sell traffic-analysis systems to municipalities. They planned to string rubber tubes across roads and highways and use a microprocessor to develop statistics on traffic flow.

Their technology was based on the Intel 8008, a chip that had enamored Allen, who subsequently talked Gates into helping him develop a BASIC interpreter for it. By this time, Allen was enrolled at Washington State University, and his dorm became their headquarters. The 8008 was the first 8-bit microprocessor, but working with it directly was awkward, so they used an IBM System/360 on campus to simulate the 8008.

At the same time, both were hired by defense contractor TRW to develop a simulator for the 8008 chip.[3] The two labored and eventually came up with a workable system for the car counter, though it had persistent difficulties with its paper-tape mechanism. Traf-O-Data was never very profitable, as it soon found itself competing against free services from the State of Washington. The final Traf-O-Data product was not really a computer, but it provided valuable experience for the two young programmers.[4]

After Gates graduated from Lakeside, the two found themselves in Boston, where they took a bold step toward their vision of creating a major computer company. In the winter of 1975, Paul Allen, who was then working for Honeywell, picked up the January issue of Popular Electronics at Cambridge’s Harvard Square. Excited, he took it to the dorm room of his friend Bill Gates, who was enrolled nearby at Harvard University.

The magazine issue sparked their quest to enter and shape the new computer era. They saw an opportunity to leverage their experience with BASIC and gain a foothold in the emerging microcomputer industry. The cover featured a low-cost microcomputer built around the Intel 8080 chip, a machine in desperate need of a programming language. The Altair, as it was called, was actually a kit that had to be assembled by the purchaser. It was marketed by a company called MITS (Micro Instrumentation and Telemetry Systems).

Gates and Allen had been following the development of the 8080 chip. Using Traf-O-Data stationery as their institutional identity, they contacted the developer of the MITS machine to offer their services. The two had some experience with Intel chips from their Traf-O-Data days, so they built a simulation of the Altair’s 8080 chip on a Harvard PDP-10 and used it to write a version of BASIC that would run on the Altair. Their detailed questions about how the Altair would use a Teletype machine to read and input data convinced the MITS people that they were serious. Just a few months later, Allen flew to New Mexico to present their software. The memory in the Altair was so small that they were unsure their version of BASIC would work. But after a day’s delay and some last-minute adjustments by Allen, the software worked, and the demonstration was a success. It even allowed them to play a game of Lunar Lander on the Teletype printer.
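The cross-development trick here, running a program that imitates one processor on a very different machine, is straightforward in outline. Below is a minimal sketch in modern Python of the idea behind such a simulator: a fetch-decode-execute loop that interprets 8080 opcodes on a host machine. Allen’s actual simulator ran on a PDP-10 and covered the full instruction set; this toy handles just three real 8080 opcodes.

    # Interpret a tiny 8080 program held in the host machine's memory.
    def run_8080(memory):
        a = 0       # the 8080's accumulator register
        pc = 0      # program counter
        while True:
            opcode = memory[pc]
            if opcode == 0x00:        # NOP: do nothing
                pc += 1
            elif opcode == 0x3E:      # MVI A, d8: load the next byte into A
                a = memory[pc + 1]
                pc += 2
            elif opcode == 0x76:      # HLT: stop the simulated processor
                return a
            else:
                raise NotImplementedError(f"opcode {opcode:#04x}")

    # Tiny test program: MVI A,0x2A ; HLT -- leaves 42 in the accumulator.
    print(run_8080([0x3E, 0x2A, 0x76]))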

Allen was offered a job on the spot and became Vice-President of Software at MITS. Gates flew down in the summer and helped out while they worked on their new company at night. On July 23, 1975, the two companies signed a contract giving MITS exclusive rights to their BASIC, with royalties going to their new company, Micro-Soft. The relationship between MITS and Micro-Soft soon soured. Micro-Soft was not making much money from the deal. MITS sold BASIC for $75 with the kit while charging $500 for it separately, so it became extremely attractive for Altair users to trade bootleg copies of BASIC rather than buy it from MITS. Although the Altair depended on BASIC to do anything useful, MITS saw its business as selling the hardware, not the software. Consequently, marketing the software was not a priority.

On February 3, 1976, Gates sent his infamous letter accusing most Altair owners of stealing the BASIC software by duplicating the paper tape. He claimed that only 10% of them had bought BASIC. Gates later got his father, a successful lawyer in Seattle, involved in a legal fight to get BASIC back from Pertec, the company that had bought MITS in the meantime.

The year 1977 was a decent one for Micro-Soft, with $381,715 in revenues.[5] They got BASIC back and in August began negotiations to license their BASIC to Apple for $21,000. Micro-Soft produced versions of BASIC for new processors as they came out, including the 6502 that Wozniak was using in the Apple II, and for new machines such as Radio Shack’s TRS-80. The latter’s marketing capabilities made an extraordinary impact on the popularity of the microcomputer, and Micro-Soft’s BASIC was on most of them. When the Commodore PET was designed, Micro-Soft also got the call to provide its BASIC.

Allen, Gates, and crew worked on the project despite an agreement that royalties would not be forthcoming until the machine shipped the next year. Ironically, it was support from Apple that provided a basic level of financial backing for Micro-Soft. Although Wozniak had designed a BASIC early on for the Apple II, it was not the “floating point” version that many users were requesting. Micro-Soft soon developed such a version for the Apple II and received a check for $10,500 as the initial part of a 10-year license fee. It was one of the rare times that Gates allowed software to be licensed on a flat-fee basis rather than requiring a royalty payment on every copy sold.[6] The new version, called Applesoft BASIC, was released in November 1977 and was improved and rereleased the following year.

Micro-Soft was very important to Apple in its early days. The Apple II’s bus architecture made expansion possible, and Micro-Soft came up with the SoftCard to allow the Apple computer to run CP/M. But it was BASIC that was crucial to the success of the Apple II, and Steve Jobs later encouraged Microsoft to create a version for the Macintosh. Unfortunately, that project was a disaster, so Gates strong-armed Jobs into accepting it or losing the license for the Apple II’s BASIC. Since Apple Computer was still highly dependent on sales of the Apple II, and BASIC was a strong complementary component of the microcomputer, Jobs killed the in-house MacBASIC project and used the Microsoft version, at least until the Apple II was discontinued.[7]

In 1978, Gates agreed to move the company back to their hometown of Seattle and change the name to Microsoft. Allen, in particular, had grown tired of the desert and convinced Gates to make the move. The three years in Albuquerque were very challenging, as the small company (16 employees in 1978) worked day and night keeping up with the new computers and chips coming onto the market. CP/M was becoming the most popular cross-platform operating system, thanks to Kildall’s BIOS, which allowed the OS to be quickly adapted for new machines.
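The portability trick behind Kildall’s BIOS (Basic Input/Output System) was to quarantine all hardware-specific I/O in one small, replaceable layer, so that moving the operating system to a new machine meant rewriting only that layer. Here is a rough sketch of the idea in Python; the class and method names are illustrative inventions, not CP/M’s actual interfaces.

    class BIOS:
        """Hardware-specific layer: rewritten for each new machine."""
        def console_out(self, ch): raise NotImplementedError
        def read_sector(self, track, sector): raise NotImplementedError

    class AltairBIOS(BIOS):
        """One machine's implementation of the same small interface."""
        def console_out(self, ch):
            print(ch, end="")          # stand-in for a serial-port driver
        def read_sector(self, track, sector):
            return b"A" * 128          # stand-in for a floppy-controller read

    class OS:
        """Hardware-independent part: identical on every machine."""
        def __init__(self, bios):
            self.bios = bios
        def type_file(self, sectors):
            for track, sector in sectors:
                for byte in self.bios.read_sector(track, sector):
                    self.bios.console_out(chr(byte))

    # "Porting" the OS is just supplying a different BIOS object.
    OS(AltairBIOS()).type_file([(0, 1)])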

At the start of 1979, they moved to Bellevue, Washington, a Seattle suburb, and started focusing on 16-bit machines built around Intel’s 8086. By March 1979, the newly named Microsoft had 48 OEM (Original Equipment Manufacturer) customers for its 8080 BASIC programming language, 29 for FORTRAN, and 12 for COBOL. During that summer, they developed Pascal and APL languages as well. The Intel-based microcomputer industry was starting to take off, and Gates and Allen had positioned themselves at the center of it.

Notes

[1] The Fortune 500 dream story is told by Paul Allen in an interview in Robert X. Cringely’s Triumph of the Nerds.
[2] The information on Allen and Gates’ early years is from Laura Rich’s The Accidental Zillionaire. John Wiley & Sons, Inc.
[3] Allen and Gates became TRW employees during Gates’ senior year, earning a salary of some $30,000.
[4] Traf-O-Data and TRW employment information from Fire in the Valley, pp. 30-32.
[5] Micro-Soft’s 1977 revenues from Laura Rich’s The Accidental Zillionaire.
[6] Licensing of Microsoft’s BASIC to Apple from apple2history.com. Accessed September 19, 2003.
[7] Microsoft’s early reliance on Apple from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 265.


Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World, 2.6: The PC and the Floppy Disk

Posted on | September 21, 2010

The development of the floppy disk was a crucial factor in the success of the personal computer. Originally developed by David Noble at IBM in 1971 as a backup storage mechanism for the System/360’s magnetic core memories, the floppy disk was soon put to use for other purposes. Alan Shugart, an employee at IBM, saw the implications of the new device for smaller computers. He realized that the floppy disk could provide a storage medium that was faster, offered random access (meaning that one did not have to rewind an entire tape to find desired information), and could be portable. He started a company called Shugart Associates to build and market the drives. However, for the microcomputer to be effective, it would need to combine the power of the microprocessor with the new storage mechanism. For a microcomputer to use the floppy disk, it required a new software package to run it, what IBM had already called a “Disk Operating System.”[1]

It was Gary Kildall who pioneered the first effective disk operating systems for microcomputers. Kildall earned his doctoral degree through the military’s ROTC program and so had a choice between going to Vietnam and teaching computer science at the Naval Postgraduate School in Monterey, California. While teaching, he was also hired as a consultant by Intel to write software for its microprocessors, including a compiler later called PL/M. The compiler allowed programs written in languages like FORTRAN on larger computers to be used with the microprocessor. It was used on Intel’s Intellec-4 and Intellec-8, small computers that could lay some claim to the title of first microcomputers but were never marketed as such. Later, Kildall began emulating Intel’s new 8080 microprocessor on an IBM computer (Microsoft founders Gates and Allen would soon do something similar to write software for the Altair) and developed a new version of PL/M for it. He also decided at that time to write a program to control the mainframe’s disk drive. Using commands developed by DEC to access data from its DECtape, he began to write the code for the new operating system. DEC’s OS/8 and later its RT-11 had been important developments for the PDP minicomputer series and showed that smaller computers could compete with the mainframes.[2] Pulling the pieces together, Kildall created his new operating system, called CP/M, short for Control Program/Monitor.

Intel didn’t really want Kildall’s OS, but the software soon became the standard for a number of new microcomputers. CP/M was announced as a commercial product in April 1976. Kildall soon quit teaching to form a new company with his wife, called Intergalactic Digital Research (later just Digital Research), to market the operating system. CP/M was soon used by a number of small computers, including the Osborne, the first portable microcomputer, and the Kaypro.

As CP/M became the standard for microcomputer operating systems, it inspired imitation. It was the model for another important operating system called QDOS (the “Quick and Dirty Operating System”), which was bought by a small company called Microsoft as the basis for its own microcomputer operating system. MS-DOS became the software foundation of Microsoft’s empire and of the successful early run of the IBM Personal Computer.[3]

Notes

[1] The beginnings of the floppy disk from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 236-237.
[2] Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press was the source for Kildall’s development of CP/M from DEC sources. p. 238.
[3] Gary Kildall’s early story also from Robert X. Cringely’s Accidental Empires. New York: HarperBusiness, A division of HarperCollins Publishers. pp. 55-59.


Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World, 2.5: Intel and the PC

Posted on | September 14, 2010

The computers used for the Moon landing were already out of date when Neil Armstrong walked on the lunar surface in 1969, but the decision to use integrated circuits or “chips” by the Apollo project paved the way for the microprocessor revolution and one of its main offspring, the personal computer.

NASA decided early on to standardize its Apollo flight technology on integrated circuits (ICs) and nurtured them into reliable, relatively high-performance digital processors. Reliability, cost, and ease of manufacturing these “chips” had been sufficiently subsidized by the space program (and by the missile programs of the “Mutually Assured Destruction” nuclear deterrence strategy) to the point where integrated circuits, and their next stage, the microprocessor, could be used in business-related computers. In what would become a common geek term, the PC was the “killer app” for the microchip.

While Jack Kilby at Texas Instruments is generally credited as the first to construct an integrated circuit, his contemporaries at Fairchild conceived of a production process that could mass-produce the small chips.

While Kilby’s ICs required their combination of transistors, resistors, and capacitors to be connected with gold wires and assembled manually, Robert Noyce and others at Fairchild were developing a literal printing process to construct the ICs. The “planar process” printed thin metal lines on top of an insulating silicon oxide layer to connect the integral components of the chip. At first they could only connect a few components, but as they refined their “photolithography” method, hundreds, then thousands of connections could be made. By the time the Internet became a household word in the 1990s, millions of transistors could be placed on a single chip.

In 1968, with the Apollo project well established, integrated circuit co-inventor Robert Noyce and fellow Fairchild “traitor” Gordon Moore left the company to form a new semiconductor company. What emerged was Silicon Valley stalwart Intel, the future producer of some of the industry’s most sophisticated microprocessors and the latter half of the infamous “Wintel” combination (Windows OS/Intel microprocessor) that would dominate PC sales throughout the rest of the century.

After twenty years of government backing, the microprocessor industry was about to crawl out on its own. And it was the microcomputer that would give the semiconductor industry the legs to become viable in the commercial arena.




Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea, and from 2002 to 2012 he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

The Smith Effect II: “State-istics,” Calculating Practices, and the Rise of IT

Posted on | September 6, 2010

This is the second in a three-part exploration of Adam Smith and how his ideas laid the foundation for information technology (IT). Part one discussed Adam Smith’s reconceptualization of wealth and its importance to the role of populations in political economy.


“’Tis not a tale I tell to many.
The Government’s Engines have long memories.”

– William Gibson and Bruce Sterling, The Difference Engine (1990).

In Smith Effect I, the argument is made that Adam Smith’s writings contributed to a set of intellectual movements that located a nation’s wealth in its population rather than the familial structure of the monarch and its treasury.

Drawing on Michael J. Shapiro‘s Reading “Adam Smith” (2002), I argue that this reconceptualization created a trajectory for the development of new information practices and technologies. Not only did Smith’s writings contribute to an understanding of “market forces” and the importance of labor, but the new emphasis on the population as the source of a nation’s wealth provided the intellectual foundation for a transformation of the census, an ancient political tool, into a wide field of measurements that came to be called statistics (“state-istics”), the science of numbers in service of governing the nation-state. This turn led directly to the creation of information machines and, ultimately, electronic digital computers.

The Smith Effect therefore provides a unique opportunity for analyzing the roots of modern society’s reliance on information technology (IT) and how these tools and practices have been integrated into a wide array of corporate and governmental bureaucracies.

Written around the time of the American Revolution, Smith’s ideas became the foundation of economic thought in the West and a major contributor to the characterization of the modern state and its role in the liberalization of the political economy. Reflecting the preoccupation of the time, Smith’s point of departure for his analysis of the economy was sovereignty. But rather than negate it, he reconfigured it. The authority of the state was no longer seen as simply concerned with the maintenance of a ruling power, but also with its productive capabilities and with the wealth of a realm’s inhabitants.

Because of Smith, sovereign power was increasingly seen as a steering mechanism that could guide the flows and collaborations of social activities toward increasing the overall wealth of the nation. The energies of the population could be mobilized in a way that took advantage of the social propensities to barter, exchange, and accumulate. In Smith’s wake, governmental bureaucracy expanded and took on increasing demands in terms of textualizing, aggregating, calculating, and interpreting information about the economy and the population, which was beginning to be considered an integral component of a nation’s wealth.

The origins of this new field of governmental calculation were first articulated in a chapter of the German Baron J. F. Bielfeld‘s Elements of Erudition in 1787. Entitled “Statistics,” it announced the endeavor as the “science that teaches us what is the political arrangement of all the modern states of the known world.” Bielfeld discussed the emergence of the field in Germany and its objective to chronicle the “noteworthy characteristics of the state” and to analyze the major powers in the world, including their citizens, industries, and governmental decisions.[1]

While initially concerned with categorization and verbal descriptions, by the beginning of the next century these descriptions were largely replaced by numerical data and calculation. This new enthusiasm for calculation led to the publication of a number of books and the participation of many “societies” in the task of producing lists of numbers.

By the time of Charles Babbage, generally considered the father of computing, numbers had become a major preoccupation in some social circles. In 1832, Babbage published On the Economy of Machinery and Manufactures, which established his credentials as a political economist in the lineage from Adam Smith to John Stuart Mill and Karl Marx. His analysis of factories drew on Smith’s analysis of pin manufacturing and the role of the division of labor and specialization. It was also crucial for Marx’s predictions about the “means of production.”

More relevant to this thread, Babbage was a strong advocate of the value of numbers, calculation, and tables, and he urged the publication of more books of numerical constants. As Ian Hacking notes: “Babbage had twenty kinds of numbers to be listed. They begin with familiar enough material; astronomy, atomic weights, specific heats and so forth. They quickly pass to the number of feet of oak a man can saw in an hour, the productive powers of men, horses, camels, and steam engines compared, the relative weights of the bones of various species, the relative frequency of occurrence of letters in various languages.” With this new fascination with numerical calculation, statistics went beyond the tallying of government revenues and assets to a wider range of societal and political computations.[2]

The “avalanche” of printed numbers diffused calculative and listing capabilities throughout a number of social domains. As bureaucracy was in its infancy, the trend was initially more pronounced in the universities and learned societies, in areas such as epidemiology, genetics, and political economy. Statistical organizations convened, such as the Manchester Statistical Society and the Statistical Society of London, whose members included Thomas Malthus, generally considered the one who put the “dismal” in the “dismal science” of economics due to his dire prognostications about population and agriculture.

Dialogues on statistics were, from their earliest beginnings, politically charged discourses. Of great importance to the members of these societies was “the condition of the people,” as statistics became an important part of the social reform movements that accompanied the trials of industrialization.[3]

Calculations took a new turn in the wake of Smith’s contribution. The use of “political arithmetic” by the mercantile state had a long history in the areas of taxation and finance, concerning itself with the internal affairs of the state and the monitoring of wealth extracted from its empire. During the nineteenth and twentieth centuries, however, states turned toward registering information on their populations. This included the “centralized collation of materials registering births, marriages and deaths, statistics pertaining to residence, ethnic background and occupation; and what came to be called by Quetelet and others ‘moral statistics,’ relating to suicide, delinquency, divorce and so on.”[4] It also shows in the Belgian census of the 1840s, which would go on to become the international model as countries learned from each other the techniques of constructing numerical representations of their populations and also used these numbers to compete for national status.[5]

The new focus on population in the late 18th century led directly to the emergence of a series of knowledges for constructing, reading, and acting upon this problem. Numbering for the state, or “statistics,” emerged as a formative knowledge in the construction of government practices. Bureaucracies expanded, and officials were put to use collecting the new information. The comprehension of the population as the new source of national wealth and as the focus of administrative activity called forth new languages, many utilizing alphanumeric figuring.

Based on the elevation of the alphanumeric notation system as the quantitative rhetoric of reality (from the old French term real, the space controlled by royalty), statistics developed as a state instrument for social surveying and had consequences both for the constitution of the population and, according to Foucault, for new forms of “governmentality,” including the elaboration of political economy as a new discipline for measuring a nation’s wealth.

Statistics provided a new view of the economy and society. No longer could the family serve as a viable model of economic accumulation and social governance. The family instead becomes part of the demographic realm to be studied and calculated. Foucault elaborates:

    Whereas statistics had previously worked within the administrative frame and thus in terms of the functioning of sovereignty, it now gradually reveals that population has its own regularities, its own rate of deaths and diseases, its cycle of scarcity, etc.; statistics shows also that the domain of population involves a range of intrinsic, aggregate effects, phenomena that are irreducible to those of the family, such as epidemics, endemic levels of mortality, ascending spirals of labour and wealth; lastly it shows that, through its shifts, customs, activities, etc., population has specific economic effects: statistics, by making it possible to quantify these specific phenomena of population, also shows that this phenomenon is irreducible to the dimension of the family.[6]

By the late nineteenth century, frustration with manual methods of compiling statistics was rising. Despite increasing loads and classification projects, pencils, pens, and rulers were still the main tools for classifying, calculating, and summarizing work sheets into journals and ledgers. The United States, required by its Constitution to keep a register of the population, started to look for alternatives after running into difficulties tabulating the 1880 census. The Census Bureau could not keep accurate track of the growth driven by the large-scale immigration of the late 19th century, and by 1887 it was desperately soliciting ideas to help it complete the count. The solution would be mechanical, and it would lead to the formation of International Business Machines (IBM).

Preview of the Smith Effect III: The Census and the Rise of IBM

From the eighteenth century, changes started to occur in the way state sovereignty was generally construed. Mercantilism, which did much to apply the new calculating rationality and arts of government to the welfare of the monarchical state, began to give way to an even broader application of political practices involving the collection and calculation of information. Whereas the instruments of the state under mercantilism worked to increase the wealth of the ruler, the new practices and knowledges of the state paid increasing heed to the newly conceived problem of population. The rigid framework of sovereignty that had previously been modeled on the patriarchal family began to confront an increasing money supply, new numerical techniques allowing demographic accounting, and the expansion of agricultural and goods production.

But it was the American census that provided the most immediate relief for the problems associated with aggregating large amounts of statistical data. Mandated by the US Constitution to be held every ten years, its calculation ran into difficulties as immigration soared in the late 19th century. By 1880, the task was nearly impossible to complete before the next count was due. The solution was the “census machine,” and it changed the trajectory of both corporate and government bureaucracies as well as the information practices that run them.

Statistics: The Calculating Governmentality

This section follows the thesis that Adam Smith’s new conception of the wealth of nations created the impetus for the development of statistics and other information-collecting and calculating technologies. This redirection of political economy, from one based on governmental wealth to one based on the vitality and enterprise of the population, led to continual interest and innovation in information practices. The United States Constitution, written in 1787 and ratified in 1788, was influential as it required a census every ten years. This led to Herman Hollerith’s tabulating machines, created for the 1890 US Census.

Hollerith’s company later merged with several others to form International Business Machines, or IBM. IBM began to sell its tabulating machines and customized census services to countries like Russia and then, later, Nazi Germany. The punch-card tabulating systems were then generalized for a wide range of commercial and government purposes, including monitoring racial politics as well as parts management for the Luftwaffe, Germany’s air force.

Notes

[1] Notes and quotes from Walter Theodore Federer, Statistics and Society: Data Collection and Interpretation (1991). CRC Press.
[2] Hacking, I. (1991) “How should we do a history of statistics?” in Burchell, G., Gordon, C. and Miller, P. (eds.) The Foucault Effect: Studies in Governmentality. (Chicago: University of Chicago Press). p. 186. Babbage, Charles. On the Economy of Machinery and Manufactures. London: Charles Knight, 1832.

[3] Manicas, P. (1987) A History and Philosophy of the Social Sciences. (Oxford: Basil Blackwell). pp. 196-197. See also Poovey, M. (1993) “Figures of Arithmetic, Figures of Speech: The Discourse of Statistics in the 1830’s,” CRITICAL INQUIRY. Winter, Vol. 19, No. 2.
[4] Hacking, I. (1991) “How should we do a history of statistics?” in Burchell, G., Gordon, C. and Miller, P. (eds.) The Foucault Effect: Studies in Governmentality. (Chicago: University of Chicago Press). p. 182.
[5] See also Bruce Curtis’ The Politics of Population: State Formation, Statistics, and the Census of Canada, 1840-1875. (2002) University of Toronto Press.
[6] Foucault, M. (1991) “Governmentality,” in Burchell, G., Gordon, C. and Miller, P. (eds.) The Foucault Effect: Studies in Governmentality. (Chicago: University of Chicago Press). p. 99.

Citation APA (7th Edition)

Pennings, A.J. (2010, Sept 06). The Smith Effect II: “State-istics,” Calculating Practices, and the Rise of IT. apennings.com https://apennings.com/dystopian-economies/the-smith-effect-ii-from-political-arithmetic-to-state-istics-to-it/




Anthony J. Pennings, PhD is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012 he was on the faculty of New York University, where he taught comparative political economy, digital economics, and traditional macroeconomics. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.



The Smith Effect I: Markets, Governments, and the Rise of Information Technologies

Posted on | August 30, 2010

This is the first in a three-part exploration of Adam Smith and how his ideas laid the foundation for modern information technology (IT).

In his book, Reading “Adam Smith” (2002), Michael Shapiro states that because of Adam Smith’s contributions, “he is always at least implicitly present when one negotiates modernity’s codes and arrangements.” What he means is that not only have Smith’s writings become richly woven into the fabric of modern thought, its economics, its philosophies, but also its government and economic institutions.[1] Shapiro’s “Smith Effect” can be used to understand the influence Adam Smith has had on both the modern emphasis on economic markets as well as the role and the growth of government. Integral to both these processes has been a wide range of information management imperatives that have spurred the development of computers and communications technologies.

Smith wrote his classic text, An Inquiry into the Nature and Causes of the Wealth of Nations (1776), commonly known as The Wealth of Nations, in a time when much of modern thought was being introduced and discussed. Intellectuals in the 18th Century were obsessed with challenging the reigning political authority and the basis from which it derived its political legitimacy, namely a vertically oriented, religion-based ruling system. For example, while the emerging United States of America would remain one nation “under God,” it instituted a set of horizontally oriented checks and balances based on the machine metaphors of the nascent industrialization age.

Smith took aim at the hierarchically oriented mercantile structure that was characterized by a governing sovereignty that derived its authority from the rhetoric of divine origins. His book became a major force in the liberalization of this ruling structure. In the process, he helped create the impetus for a new rationality that would change modern government and how the economy was viewed. This new rationality would both liberalize the economy and draw nations into a new type of governance, the management of populations.

From Treasure to Laboring Energies

European monarchies maintained a strong grip on the land and their subjects by invoking a privileged place in society. By claiming divine legitimacy, they justified their positions in a stationary order. Consequently, early imperial mercantilism conceived of wealth as static hoarding for the monarch and feudal lords, not as a process of dynamic practices involving the enterprise of labor and capital investment, as Smith would later argue. In challenging this organization, Smith’s writings helped “recast divine will as a set of dynamic mechanisms regulating the process of production.”[2]

His argument was based on the Protestant ideas of the time, which challenged the Catholic Church’s role and its teaching that God could be petitioned to intervene in worldly affairs. For Smith, divinity was infused in earthly nature and demonstrated by the harmonious order of economic markets based on human desiring. Smith was particularly comfortable using musical metaphors to talk about the workings of the economy. Markets ensured the harmonious working of the world as long as those involved worked to their own benefit. In making this philosophical and political leap, Smith helped shape the moral, political, and social dimensions that have become integral to modern capitalist political economies.

Smith’s often-quoted “invisible hand” was mentioned only once in the Wealth of Nations. Still, it has continued to function as powerful currency for his train of thought that relocated the problem of wealth from the realm of the sovereign purse to the processes involved in economic production. For Smith, the nation’s wealth was not what could be accumulated for the monarch but rather the sum of goods and services that were available per capita for the population.

Hailing from Scotland, on the periphery of the British kingdom, Smith objected to the hoarding mercantile practices of the English monarchy because they made the people around him poorer. Smith’s writings were significant because he located value in the act of production and in the creative and energetic expenditures of laboring people, not in the amassing of treasure.

Adam Smith set the stage for Karl Marx, not one of his direct students, but certainly one of his most dedicated. What intrigued Marx was that Smith conceptualized the plight of laboring bodies. This meant that Smith, “shifted the focus from national rivalries to the conditions of work, thereby helping to enfranchise a neglected constituency, the working poor, and to draw them into a new conversation on problems of inequity, a conversation which could not be held with the old mercantilist conversation on value.”[3]

In doing so, Smith intellectually recognized the working population and helped transgress certain prejudices against working activities and those engaged in them. As Zuboff argued, “The Middle Ages produced a conception of labor infused with a loathing drawn from three traditions: (1) the Greco-Roman legacy that associated labor with slavery, (2) the barbarian heritage that disdained those who worked the land and extolled the warrior who gained his livelihood in bloody booty, and (3) the Judeo-Christian theology that admired contemplation over action.” Smith’s new intellectual negotiations set the stage for empowering labor and a historic wealth boom based on new types of industrial production and the distribution of affluence among the classes.[4]

The “Invisible Hand” and the Escape from History

To better understand the “invisible hand,” one of modernity’s most prevalent truisms, it is vital to examine Smith’s philosophy of religion and nature. Smith conceived of God as the “Author” of the world, but one that had retreated from the day-to-day world and left behind a structural guarantee that the self and the social order would remain attuned. This guarantee is a form of human desiring we call “enterprise” at its best and “greed” at its worst.

Smith’s upbeat version of the world was one that resulted in an aggregated “harmony” when individuals followed their passions. Shapiro explains, “Among what Smith’s ‘Author’ had left behind as “nature” was the regulative mechanism of a socially felicitous tendency in individual human desiring, which along with some inevitable tendencies in collective arrangements, would eventuate an order that progressed toward general prosperity and broadly distributed human contentment.”[5] What has become commonly known as the “market” was Smith’s divine mechanism that was “infused in nature” to produce a harmonious aggregate outcome when individuals act on their passions.[6]

The “Smith Effect” provides an opportunity for new ways to analyze the social field and the overlap between economic, social, and political spheres. Smith was a crucial critical theorist in his rejection of mercantile thought, and his writings were a forerunner of modern political economy. Two major bodies of economic analysis would emerge from Smith’s writings.

One was the classical liberal tradition that combined Smith’s anti-mercantile stance in favor of “markets” with an increasing emphasis on empirical and quantitative calculation. This stance focused largely on theoretically reconceptualizing Smith’s divinely gifted “invisible hand” in terms of more naturalistic and physics-derived metaphors.

One such metaphor was “equilibrium,” which allowed political economy “to escape from history.”[7] By focusing on the tendency of prices to converge among many competitors, economics was able to devise a theoretical system that could marginalize historical events and focus exclusively on modeling the forces of supply and demand. Economic, or market, equilibrium was conceived as a condition in which economic forces become balanced. This philosophical foundation informs mainstream liberal economics in both the Keynesian and Monetarist traditions.

The other body of analysis was the Marxist tradition, which drew its investigation from Smith’s concern for the worker and the processes of valuing commodity forms and accumulating capital. This latter posture organized the history of the economy around labor and its place in the “mode of production,” with class struggle between laboring forces and capitalist owners driving the ebb and flow of history. Marx combined Smith’s political economy with German dialectical philosophy and French utopian theory to create a unique and radical analytical perspective.

These two ways of viewing the political economy helped shape the next few centuries. This included the classic division during the Cold War as the West developed a form of capitalist industrialization based on the New Deal’s negotiated relationship between government and business. In the “East,” forms of state-controlled communism emerged in China, Cuba, North Korea and the USSR. In the 1970s, liberalism re-emerged in reaction to Keynesian/New Deal macromanagement of the economy as neo-liberalism and once again stressed market forces, particularly in the financial sphere.

While Smith opened up new areas for investigating the economy, it would be a mistake to equate his writings with the anti-state rhetoric of modern neoliberalism. His writings marked not so much the end of the state, or even a way of curbing its power, but rather a way of theorizing the economy so that it could be even more productive for the nation. More than ever, the intervention of the state in the economy was legitimized.

This came about largely because the theoretical conversations he set in motion changed the connotations associated with both the state and the economy. For the state, it meant a powerful new system of calculable observations that could be used to guide its actions. To help with these observations, “political arithmetic,” or “statistics,” emerged as a focus of knowledge that could record a wide range of demographic and epidemiological events and processes. These new economic and social surveillance techniques provided new ways to scrutinize social activities and, consequently, new ways to intervene in them.

Conclusion

Smith wrote extensively on the proper expenditures of the state, including defense, education, justice, public works, the dignity of the sovereign, and institutions for facilitating the commerce of society. For the economy, the new conversations meant a redefinition from something modeled on a household to a larger set of social activities covering a wide range of laboring activities and entrepreneurial production. While no doubt a significant contributor to modern liberalism and its commitment to international comparative advantage and “free enterprise,” Smith’s writings helped conceive a more prominent role for government in managing the economy, not a smaller one. In the process, calculating tools and information management practices would be created to facilitate this control.

Preview to The Smith Effect II: Calculating Practices and the Rise of IT

Smith’s writings legitimized the state’s intervention in the economy by identifying the crucial role of managing populations for national prosperity. His contention that the natural order channels individual desires into positive collective outcomes helped to open up flows of capital in service of manufactures. It set in motion conversations that changed the connotations associated with the state and the economy.

A body of knowledge known as statistics emerged as a set of techniques for amassing data on a wide range of demographic matters important to the state and to the emergence of democratic governance. It provided new economic and social surveillance techniques and new ways to analyze and scrutinize the political economy. Consequently, these techniques and the indicators they enabled suggested ways to intervene in social activities to improve the conditions and productivity of national populations.

These techniques eventually found mechanical enhancement in the tabulating machine and in telegraphy, which were used to collect data from a wide geographical area in service of the wealth of the nation. Herman Hollerith invented the census machine, used to tabulate the US population in accordance with the specifications of the US Constitution. He went on to form companies that were merged into International Business Machines (IBM), the dominant information machine company of the big computer era, whose personal computer (PC) would later solidify the market for “microcomputers.”

Notes

[1] Shapiro, M.J. (1993) Reading “Adam Smith”: Desire, History, and Value. p. xxix.
[2] Shapiro, M.J. (1992) Reading the Postmodern Polity: Political Theory as Textual Practice.
[3] Quote on Smith and his perspective on labor. Shapiro in Reading “Adam Smith”. p. xxxi.
[4] Three traditions that influenced the contempt for labor from Shoshana Zuboff’s In the Age of the Smart Machine: The Future of Work and Power. (1988) pp. 25-26.
[5] Smith’s regulative mechanism left behind by the “Author.” Shapiro, Reading “Adam Smith,” p. xxxii.
[6] Shapiro on Smith’s “market”. Ibid, p. 79.
[7] Peter Manicas on equilibrium as the vehicle for political economy’s escape from history. A History and Philosophy of the Social Sciences. Oxford: Basil Blackwell. p. 40.

Citation APA (7th Edition)

Pennings, A.J. (2010, Aug 30). The Smith Effect: Markets, Governments, and the Rise of Information Technologies. apennings.com https://apennings.com/dystopian-economies/the-smith-effect-markets-and-bureaucracy/


Anthony J. Pennings, PhD is a professor at the State University of New York (SUNY) in South Korea. Previously, he taught in the MBA program at St. Edwards University in Austin, Texas, and comparative and digital economies at New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World, 1.9: Xerox PARC

Posted on | August 26, 2010

A major but transitional step for personalized computing and data networking occurred at Xerox, the paper-copier megalith. Xerox appropriated much of the research work done at the Augmentation Research Center (ARC) and through the Advanced Research Projects Agency (ARPA) for its new research center in northern California. The Palo Alto Research Center (PARC) was set up by Xerox in 1970 to establish leadership in the “architecture of information,” a vague but enticing term coined by Xerox CEO Peter McColough.(1) Drawing on Xerox’s great wealth, PARC harvested the fruits of ARPA’s continuous funding by hiring one of its former directors and by recruiting some of computer science’s top researchers. At PARC, Xerox developed the Alto and the Star, early personalized computers with a GUI, a mouse, and even Ethernet data networking. These PARC innovations inspired companies like Apple, Cisco, and 3Com to develop new technologies such as the Macintosh and data routers.

Deeply implicated in the development of the Great Society’s new bureaucracies, Xerox achieved extraordinary growth and profitability during the 1960s. Formerly the Haloid Company, the firm became Haloid-Xerox in 1958. The next year it launched the Xerox 914, the first automatic plain-paper office copier, considered one of the most popular industrial products of all time. The company subsequently saw its annual revenues increase from $32 million in 1959 to over $1.1 billion in 1968.(2)

When Xerox went public in 1961, it became one of the hottest stocks of the year. The Xerox name was so strongly tied to the copying process that the term “Xeroxing” became synonymous with paper reproduction. Just as people “Google” information on the web, they would “Xerox” a paper copy of a document. During the 1960s, Xerox maintained a sales force of some 15,000 people to stay in contact with the new bureaucracies and attempt to solve their customers’ every information need. Xerox was aiming at revenues of $10 billion by 1980 and saw computer-related technologies as a key ingredient in the recipe for reaching that goal, especially after it began to encounter serious competition in its major product lines from the Japanese.

Xerox set out to develop a strong presence in the computer field and recruited key talent from ARPA and ARPA-funded projects throughout the country. ARPA had been virtually creating the field of computer science during that time by seeding programs throughout US universities, and its “talent” was prized and sought after. Xerox also sought to enter the emerging field through acquisitions. In one of its first attempts, it bought Scientific Data Systems (SDS) in 1969. SDS was a small computer company, but with impressive sales. The new purchase was successful with batch processing for scientific and engineering applications but proved slow to capitalize on timesharing developments. Ultimately, SDS personnel ran afoul of Xerox’s new computer elite from ARPA, who were much more committed to developing an interactive “timesharing” computing environment based on apportioning a computer’s processing time among remotely located stations. The “dumb terminals” connected a user to a major computer via data communications and allowed a single mainframe to service many people at the same time.

A key person in helping to achieve the Xerox plan was a former director of ARPA’s IPTO. Robert Taylor had resigned as the director of ARPA’s main computer division over concerns about Vietnam and ended up at PARC to help recruit the brightest people in the computing area. Taylor had worked under Licklider at ARPA and had funded many projects, including Douglas Engelbart’s NLS project. He had even hired Larry Roberts, who coordinated the ARPANET project. Taylor’s background at ARPA proved instrumental in Xerox’s new plan; he was in a key position to reap the rewards of ARPA’s widespread funding. Subsequently, he hired a number of computer stars from the ARPA universe, including those from MIT, Harvard, Carnegie-Mellon, and the University of Utah, as well as BBN. He was particularly interested in those who had worked on Project Genie, a project at the University of California at Berkeley that had converted an SDS batch-processing computer into a time-sharing utility. Taylor clashed with Xerox management after refusing to use SDS computers in PARC’s research environment; PARC wanted DEC’s PDP-10 so it could run ARPA’s time-sharing software or develop its own. In the end, the SDS acquisition was a failure.

Drawing on ARPA’s network of computer expertise, PARC contributed significantly to the future of data communications. Robert Metcalfe left his studies at Harvard to spend some time with the ALOHANET project at the University of Hawaii so he could study data networks. There Metcalfe picked up crucial ideas on packet switching and collision detection from engineering professor Norm Abramson that would prove useful in his PhD dissertation and later for innovations at PARC. Concepts emerging from the ALOHA project were extremely important for the future of data networking technology. They led ultimately to Xerox’s local area networking product, Ethernet (and, even later, to a company called 3Com). While the ARPANET had solved certain issues related to long-distance data communications, Ethernet tackled the short-range communications needed in an office or campus environment.
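The core collision-handling idea that traveled from ALOHANET to Ethernet can be sketched simply: stations transmit on a shared channel, detect collisions, and retry after a randomized delay, a scheme Ethernet later refined into binary exponential backoff. Below is a toy simulation in Python; the collision probability and slot time are illustrative parameters, not values from the ALOHA or Ethernet specifications.

    import random, time

    def transmit(collision_prob=0.3, slot_time=0.001, max_attempts=16):
        """Try to send one frame on a shared channel, retrying on collisions."""
        for attempt in range(1, max_attempts + 1):
            collided = random.random() < collision_prob  # simulate other stations
            if not collided:
                return attempt                           # frame got through
            # Binary exponential backoff: after the nth collision, wait a
            # random number of slot times drawn from [0, 2^n - 1].
            slots = random.randint(0, 2 ** min(attempt, 10) - 1)
            time.sleep(slots * slot_time)
        raise RuntimeError("too many collisions; giving up")

    print("delivered after", transmit(), "attempt(s)")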

Here is a later blog post responding to a Wall Street Journal article that overemphasized PARC’s influence.

Notes

(1) Xerox CEO Peter McColough’s speech went: “The basic purpose of the Xerox Corporation is to find the best means to bring greater order and discipline to information. Thus our fundamental thrust, our common denominator, has evolved towards establishing leadership in what we call “the architecture of information.” From: Fumbling the Future: How Xerox Invented, Then Ignored, The First Personal Computer, by Douglas K. Smith and Robert C. Alexander. NY: toExcel Press. p. 50.
(2) Segaller, S. (1999) Nerds 2.0.1: A Brief History of the Internet. New York: TV Books. p. 158.
(3) Ibid, p. 130.


Anthony J. Pennings, PhD recently joined the Digital Media Management program at St. Edwards University in Austin, Texas, after ten years on the faculty of New York University.

Is Cyberpunk Making a Comeback?

Posted on | August 23, 2010

    . . . I concocted cyberspace by sitting at a manual typewriter with a blank sheet of paper and writing a bunch of words with I think. . . double-spaced capital letters. . hyperspace. . . other there. . . . you know horrible, like horrible things that would never stick and then I typed cyberspace, and I thought oh, you know.. that’s kind of sexy. . . . [1]
    – William Gibson interviewed in the Canadian documentary Cyberscribe (1993)


The word is that Splice (2009) director Vincenzo Natali is set to direct the classic cyberpunk novel Neuromancer (1984) by William Gibson. Known also for the 1997 movie Cube, Natali has picked up the rights to Neuromancer and is working with Gibson to bring it to the screen.

[Image: Vincenzo Natali to direct Neuromancer.][2]

Before the Web, We had Cyberspace

By the late 1980s, the notion of “cyberspace” began to circulate in discussions about the future of the world’s telecommunications networks. Its meaning was in some contention, but it referred broadly to the combination of new network technologies and the accelerating computing power of electronic microprocessors that was forming the world’s new telecommunications grid: a dynamic, multitrillion-dollar infrastructure opening up an electronic “frontier” much as the railroads and telegraph had opened up the American West.

Cyberspace was often connected with the new “virtual reality” technologies of the time, especially as they came to support multiple participants sharing a computer-generated environment over the new networks. This conception arose because author William Gibson coined the term to describe the electronic “consensual hallucination” in which the characters of his award-winning novel Neuromancer immersed themselves. In his fictional narrative, “console cowboys” connect to the network by “jacking in,” linking into the electronic telecommunications “matrix” via velcro-held “trodes” attached to their heads. Somewhat as in a flight simulator, the user experiences a vast simulated space scattered with geometric shapes representing institutional databanks, such as the “green cubes of Mitsubishi Bank of America.”

Admittedly, that sounds quite weak given the “virtual” reality of recent games like Halo or Call of Duty: Modern Warfare 2, not to mention Second Life or the military simulations used these days. But it helped spark imaginations at the time and changed the culture of telecommunications from one dominated by telephone company engineers and Washington, DC lawyers to one animated by the promise of the web and the creative imaginations of tech-savvy multimedia designers and entrepreneurs of the 1990s and the zeroes.

It will be interesting to see if Vincenzo Natali can pull off Neuromancer and if it will come out a bit better than Gibson’s other ideation, the Keanu Reeves vehicle, Johnny Mnemonic.

—-

Notes

[1] Quote from Cyberscribe (1991) Canadian Broadcasting Corporation Production. By Producer/Director Frances-Mary Morrison, Editor Jacques Milette.
[2] Image from http://www.collider.com/2010/05/07/neuromancer-vincenzo-natali-splice/


Anthony J. Pennings, PhD recently joined the Digital Media Management program at St. Edwards University in Austin, Texas, after ten years on the faculty of New York University.

Flash: Multimedia Embraces HTML 5

Posted on | July 28, 2010

Apple’s rejection of Adobe’s Flash for its iPhones, iPods, and iPads helped to highlight the utility of HTML5 for multimedia development on the web and in mobile devices. Steve Jobs, in an open letter last April, criticized the legacy media platform. He said it was power-hungry, proprietary, lacking in security, unfriendly to mobile applications, non-touch, and just wrong for the future of multimedia application development.

[Image: Steve Jobs, by Robert Gaal.]

Jobs recommended other established standards such as CSS, JavaScript, H.264 for video, and HTML5. See HTML Cheatsheet.

Granted, one always has to consider the “reality distortion field (RDF)” in any Jobs pronouncement, but HTML 5 is picking up momentum. It wasn’t long before Microsoft announced it would support HTML5 in its new IE 9 browser, joining the ranks of Apple’s Safari, Google’s Chrome, Firefox, and Opera. The code is rather useless without the cooperation of the dominant browsers.

One controversial issue is the choice of H.264 video encoding, which is problematic for Mozilla because of the patent royalties it carries in countries throughout the world. The open-source browser maker doesn’t have the deep pockets of Apple, Google, and Microsoft and has pushed the Theora video encoding technology instead. Is this an attempt to destroy the Mozilla Firefox browser?

Jobs disputed Adobe’s claim that much of the web’s video and games would not be accessible without Flash. He cited the adoption of HTML5 video players by major media websites like ABC, CBS, CNN, MSNBC, ESPN, Facebook, Flickr, Fox News, National Geographic, Netflix, NPR, People, Time, Sports Illustrated, The New York Times, The Wall Street Journal, and even YouTube. Developers like the elimination of the awkward ‘object’ tag, which has been replaced with more focused and robust tags such as ‘video’ and ‘audio’ that let them set specific attributes for multimedia applications within web browsers. For example, the ‘onended’ media event attribute specifies a script to be run when playback of an audio or video file finishes.
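As a minimal sketch of what that looks like in markup (the file names and the alert script are placeholders, not anything from Jobs’ letter), an HTML5 page can embed a clip with codec fallback and a media event attribute:

    <!DOCTYPE html>
    <html>
      <body>
        <!-- The browser plays the first source it supports: H.264 (MP4)
             in Safari, Theora (Ogg) in Firefox. File names are placeholders. -->
        <video width="640" height="360" controls
               onended="alert('Playback finished')">
          <source src="clip.mp4" type="video/mp4">
          <source src="clip.ogv" type="video/ogg">
          Your browser does not support the video element.
        </video>
      </body>
    </html>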

So a debate is on about the merits of Flash and the potential of HTML5. Flash is striving to stay relevant: it has become H.264-compatible, for example, and works in some applications with touch interfaces. HTML5, however, has encroached on Flash’s primary domain by incorporating scalable vector graphics and is challenging Flash’s strength in animation. Many developers are heavily invested in Flash technology and will continue to use and defend it, but HTML5 is likely a game-changer.

Anthony J. Pennings, PhD has been on the New York University faculty since 2001 teaching digital media, information systems management, and global political economy.
