Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

How IT Came to Rule the World: The Information Standard and Other Sovereignties

Posted on October 3, 2010

President Nixon’s decision to close the gold window in 1971 signaled a dramatic shift in US international financial policy and in the future of the world political economy. The move largely meant the end of the containment of international finance set up at the end of World War II. No longer was the US constrained by the financial rules it had agreed to at the United Nations Monetary and Financial Conference at Bretton Woods, New Hampshire. At that gathering, known more commonly as the Bretton Woods conference, leaders from the major Allied countries tied the US dollar to gold and the major currencies of the world to the dollar. With the end of this de facto gold standard, a new type of global power emerged based on news flows, financial information, and computer and communications technologies.[1]

Empowered by new developments in computer services and data communications, currency markets turned electronic in the wake of Nixon’s decision. Banks throughout the world started to trade the US dollar, the British pound, the German mark, the French franc, the Swiss franc, the Italian lira, the Japanese yen, and other convertible currencies. Traders bought and sold currencies, betting on the direction of price movements that had previously been pegged under the Bretton Woods rules. Through new electronic conduits and financial news services, they monitored economic information, military movements, political crises, and weather forecasts, all for the purpose of making instantaneous decisions about the viability of a country’s money and other financial instruments.

This decision-making capability propelled financial traders, whom Tom Wolfe called “the Masters of the Universe” in his 1987 novel The Bonfire of the Vanities, into a major power. Through the vast funds they accumulated in their portfolios, they effectively began to discipline countries around the world through the force of their trading decisions. With the end of the gold standard’s discipline, a new power emerged based on the utility of global communications and information technologies, what Walter Wriston, the notorious and visionary Citibank CEO, called the “information standard” in his controversial book, The Twilight of Sovereignty.[2]

This global information standard became a sovereign power in itself. Nation-states and organizations were caught up in the opportunities and consequences of the new financial trading system that began to structure modern life along the dictates of a new techno-economic imperative. When Reuters offered international price information over data communication lines, it initiated a global panoptic market system that read and interpreted the world according to the regime of electronic finance. This system expanded rapidly, globally, and comprehensively, reaching into the policies and practices of nearly every nation and organization, both private and public. Reuters caught a break when the Arab-Israeli War broke out in October of 1973, sending the newly freed currency markets into a frenzy and panicked dealers to their computer monitors. Reuters’ “Money Monitor Rates” service became the news agency’s major source of revenue and a pioneer of the electronic marketplace.

While the mechanics of the information standard were based on its capability to develop virtual markets using the Reuters electronic news and trading system, the “energy” of the system was provided by the infusion of debt taken on by countries around the world. Ironically, it was the oil crises that created the surpluses of dollars that were lent to nation-states around the world. Addiction to oil drove the growth of “eurodollars” – US dollars held outside the country’s geographical boundaries – which funded that debt. Banks pressured countries around the world to take loans for a variety of projects. Growing national debt during the 1970s led to the so-called “Third World Debt Crisis” that blew up in the early 1980s and gave financial traders substantial leverage within this global system of discipline.

The information standard began disciplining the world political economy and its nation-states into adopting an agenda that included: 1) privatizing government assets and agencies while capitalizing domestic industries on newly electronic stock markets; 2) deregulating domestic industries and taking down barriers to flows of capital and investment; 3) reducing government expenditures and increasing taxes to pay off debt; and 4) disciplining labor forces into lower-cost workers or innovating entrepreneurs. This new global political economy combined a new “free enterprise” fundamentalism, led by Margaret Thatcher and Ronald Reagan, with a system of unprecedented capital mobility.

Empowered by the calculative and organizing powers of the spreadsheet, global finance targeted debt-ridden governments and began a process of “privatizing” public assets such as airlines, broadcasting, electricity, transportation, oil fields, and telecommunications: valuing assets, creating state-owned enterprises (SOEs), and then finally selling them off as corporate entities to global institutional investors such as pension and sovereign wealth funds. Most significantly, government-owned telecommunications systems were sold off and listed on domestic and international share markets. These former PTTs (Post, Telephone and Telegraph administrations) eventually incorporated the Internet Protocol (IP) and began opening up to the World Wide Web and its flows of capital, news, global e-commerce, and social media.

Public and private institutions began to succumb to the new logic of digital finance and a system of hyper-real representational strategies. Both types of organizations fell under the discipline of the financial markets, the former particularly susceptible to bond and currency traders, while the latter remained under constant surveillance by the stock markets and lenders. Central to this emerging regime of “digital monetarism” were the knowledge disciplines of accounting and finance, which congealed their techniques into a new tool, the electronic spreadsheet. The original “killer app” of the personal computer revolution, this versatile program allowed the widespread calculation of financial formulas and “what-if” scenarios, enabling the plotting of a wide variety of corporate acquisitions, initial public offerings (IPOs), leveraged buyouts (LBOs), and mergers.
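The “what-if” mechanics the spreadsheet made routine can be sketched in a few lines of code. The scenario below is entirely hypothetical – the purchase price, equity share, and earnings figures are invented for illustration – but it shows the kind of sensitivity analysis an analyst would run in VisiCalc or Lotus 1-2-3: change one input cell (here, the interest rate on buyout debt) and watch the dependent figures recalculate.

```python
# A minimal, hypothetical "what-if" model of the spreadsheet era:
# a leveraged buyout's annual debt service under different interest
# rates. All figures are invented for illustration only.

def annual_debt_service(principal: float, rate: float, years: int) -> float:
    """Level payment on an amortizing loan (standard annuity formula)."""
    return principal * rate / (1 - (1 + rate) ** -years)

purchase_price = 500.0   # $ millions, hypothetical
equity_share = 0.2       # buyers put in 20% equity, borrow the rest
debt = purchase_price * (1 - equity_share)
ebitda = 80.0            # target company's annual earnings, hypothetical

# The "what-if": recompute debt coverage as the borrowing rate changes,
# just as a spreadsheet recalculates when one cell is edited.
for rate in (0.06, 0.09, 0.12):
    payment = annual_debt_service(debt, rate, years=10)
    coverage = ebitda / payment
    print(f"rate {rate:.0%}: payment {payment:.1f}M, coverage {coverage:.2f}x")
```

Each pass of the loop corresponds to retyping one assumption cell; the point of the tool was that an entire acquisition model re-derived itself instantly from that single change.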

The use of the electronic spreadsheet exploded after IBM introduced its “Personal Computer” in August of 1981. Soon after, Lotus 1-2-3 became available for the “PC” and the “IBM-compatible” clones from companies such as Compaq. Lotus 1-2-3 was named for its spreadsheet, graphing, and database capabilities, which combined to produce an extraordinary new facility for organizing financial information both conceptually and textually. But Lotus 1-2-3 did not make the transition to the graphical user interface, and Microsoft’s Excel, originally developed for the Apple Macintosh, became the dominant spreadsheet application.

In a new era of spreadsheet capitalism, countries were forced to succumb to a new logic of numerical, graphical and textual representations – a realm of computerized hyper-mediated information organized around the techno-economic imperative. Money emerged as a leveraging factor and VisiCalc, Lotus 1-2-3 and Excel became new tools of control and discipline. Facilitated by high-speed network technologies and powerful trading workstations, the information standard began to subject organizations, both public and private, to a new international discipline. Combined with innovations in mathematical algorithms, global money morphed into a variety of financial instruments traded in electronic “dark pools” and on interconnected financial exchanges.

The world economy began to undergo what David Harvey called a “time-space compression” due to new permissive technologies such as jet airplanes and IP-based telecommunications.[3] This has meant a shift from vertically organized firms to networked organizations that privilege inter-organizational ties through such means as outsourcing and subcontracting. Spatial and temporal dimensions of the economy are being reorganized by the need to reduce turnover times for flexible production and marketing strategies on a global scale. For example, coordinating the logistics of containerization, inventory control, and packaging needed to compete in the new marketplace requires contact with a wide array of competing services.

Notes

[1] A classic source on Bretton Woods is Moffitt, M. (1983). The World’s Money. New York: Simon & Schuster.
[2] Wriston’s interpretation of the information standard in The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World was organized around a rhetoric of assurance, not a critical analysis. He argued that the power of multinational corporations, nation-state dictatorships, and any aggregation of power antithetical to democratic prospects would fall to the sovereign power of the information standard.
[3] Harvey, D. (1989) The Condition of Post-Modernity. Oxford: Basil Blackwell. p. 147.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002 to 2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World: Transformation of the Internet

Posted on October 1, 2010

A remarkable transformation took shape in the few short years between the Internet’s 25th anniversary in 1994 and the end of the millennium. During these six years, the Internet transformed from a very novel but quaint system for sending ASCII email messages, transferring files, and linking homepages into a global system of electronic communications and commerce. What was once a tiny network linking a handful of university and research institute mainframes soon expanded to include small workstations, personal computers, laptops, and mobile wireless devices. The Internet was no longer confined to campuses, but could be accessed from homes, offices, cars, and even on the move.

The Promise of E-Commerce

What sparked this transition in such a short time? The Internet’s new uses were varied, but it was clear that its potential for global e-commerce was attracting investment capital of staggering proportions. The Clinton Administration followed a set of national policies that supported the development of a “national information infrastructure,” wired schools, and e-commerce. These included a moratorium on Internet taxes, privatization of radio spectrum, the creation of a domain-name registration administration, and a new pro-competitive policy framework, the Telecommunications Act of 1996.

Global Flows of Cash

Consequently, venture capitalists and the financial markets began to fund technology and “dot.com” companies dramatically. Individual investors also got involved through mutual funds, workplace pension accounts, 401(k)s, and new online trading brokerages such as Ameritrade and E*Trade. Capital fleeing the “Emerging Markets” of Asia and the former USSR during the international financial crises of 1997 and 1998 streamed into the coffers of Internet startups, leading to fierce competition in the race to develop the most effective and profitable e-commerce sites. Some called it “The New Economy.”

Companies such as Amazon.com, eBay, RealNetworks, and Yahoo!, as well as associated “tech” stocks such as Cisco, Dell, and Intel, became the darlings of investors. Even America Online, with a somewhat ancillary history, was able to take advantage of the share markets to raise the money to buy Netscape and Time-Warner and stake out a considerable presence on the World Wide Web. Billions of dollars poured into the development of new companies for business-to-consumer (B2C) and business-to-business (B2B) commerce. With this cash they were able to integrate and use a wide variety of traditional computer, media, and telecommunications technologies such as legacy databases, cable networks, and satellite systems.

The Crash of 2000

By April 2000, the “dot.com” market started to dissipate. A combination of factors, including new reporting requirements from the US Securities and Exchange Commission (SEC), brought a new discipline to the “dot.coms” and other technology stocks. The SEC required all public companies (except foreign companies and companies with less than $10 million in assets and fewer than 500 shareholders) to file periodic reports, registration statements, and other forms electronically through EDGAR, its Electronic Data Gathering, Analysis, and Retrieval system.

The investment bubble collapsed with the preponderance of poor “earnings” reports, but for the most part, the technology and a substantial number of skilled employees emerged to continue the process of globalizing electronic commerce.

Citation APA (7th Edition)

Pennings, A.J. (2010, Oct 1) How IT Came to Rule the World: Transformation of the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/transformation-of-the-internet/


Anthony J. Pennings, PhD was on the New York University faculty from 2002 to 2012, teaching digital media, information systems management, and comparative political economy. He is currently a professor at the State University of New York in South Korea (SUNY Korea).

Apple, Silicon Valley and the Counter-Cultural Impulse

Posted on September 24, 2010

Power to the People

A curious subculture had been developing around the use of computers during the post-Vietnam War years of the 1970s. Working off the counter-cultural energy of the 1960s, it challenged “the establishment,” the political-corporate interests that had entangled America in the Indochina war. Then, in 1974, Ted Nelson, the originator of the hypertext concept, wrote a book called Computer Lib about the importance of decentralizing these machines. Subsequently, this book became a manifesto urging people to claim the power of computers for themselves and not leave it in the hands of the military and corporate elite.

Nelson’s agenda did not go unnoticed by the youth of Silicon Valley, particularly the future founders of Apple Computers, Steve Jobs and Steve Wozniak. Coming of age in the early 1970s, they were heavily influenced by the emerging currents of music, poetry, and political philosophy. Sentiments against the Vietnam War and “The Establishment” were strong at the time and contributed to their view that the individual should be empowered by technological means. In 1976, they formed Apple and tapped the swelling counter-cultural rejection of the militarization and corporatization of computing.

From the crumbs of MAD (Mutually Assured Destruction) and the “Space Race,” these two impertinent young men started a small company that changed the face of computing. Steve Jobs and Steve Wozniak were not from Jackson, Tennessee; Albany, Georgia; or Burlington, Vermont; but from a suburban complex of industries and small towns amalgamated by the arms and space races. Both grew up in a culture inundated with electronics – Silicon Valley. They had ready access to computer clubs, electronic surplus stores, trade shows, technical libraries, and an economic system of electronic production.

They did start Apple in a garage. Still, the car it displaced drove through the streets of technologically sophisticated suburbs bred and nurtured by the demands of the Cold War. Within this electronics-rich environment, they concocted a relatively primitive computer from equipment readily available in the Silicon Valley area. The first Apple computer was extraordinary for its size, price, and basic microprocessing abilities, which together promised a future of individual empowerment through computing power.

Long before Vice-President Al Gore’s “e-rate” and the National Information Infrastructure (NII) would democratize computer access for schoolchildren, Woz had access to a PDP-8 computer in high school.

    “The Woz was not the only student in Silicon Valley with such a dream. Actually, in some ways, he was fairly typical. Many students at Homestead High had parents in the electronics industry. Having grown up with the new technology around them, these kids were not intimidated by it. They were accustomed to watching their parents mess around with oscilloscopes and soldering irons. Homestead High also had teachers who encouraged their students’ interest in technology.”[1]


Apple was an offspring of the military-industrial complex, but it was rebellious. Wozniak and Jobs achieved “hacker” status early on because of their blue box business. This device allowed them to make unlimited long-distance telephone calls at a time when such calls were expensive. Moreover, the telephone system was arguably the largest electrical machine in the world. Dominated by AT&T, the network had increasingly been attacked by “phone phreaks” who used various gadgets to mimic the audio signaling tones the telephone system used to route calls.

As Jobs told it, they created their blue box after finding an AT&T technical journal at the Stanford Linear Accelerator Center, home of a massive device for basic research in particle physics. In one of the most famous cases, Wozniak, who would later design the original Apple computer, called the Vatican in Rome to talk to the Pope. Impersonating Henry Kissinger, he never reached the Pope, but he tells the story gleefully, and it stands as an interesting example of the impertinence and resourcefulness of a young subculture trying to turn technology to its counter-cultural purposes.[2]
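The blue box worked because AT&T used in-band signaling: the switches listened for control tones on the same audio channel that carried the call, so anyone who could synthesize those tones could command the network. A short sketch of the idea – purely an illustration of summed sine waves at the historical multi-frequency (MF) signaling pairs, not a working phreaking tool:

```python
# In-band signaling sketch: a 2600 Hz tone marked an idle trunk, and
# digits were then sent as pairs of audio tones (multi-frequency, or
# MF, signaling). A blue box simply synthesized these tone pairs.

import math

SAMPLE_RATE = 8000  # Hz, telephone-grade audio

def tone_pair(f1: float, f2: float, seconds: float) -> list:
    """Samples of two summed sine waves, as a blue box would emit."""
    n = int(SAMPLE_RATE * seconds)
    return [0.5 * (math.sin(2 * math.pi * f1 * t / SAMPLE_RATE)
                   + math.sin(2 * math.pi * f2 * t / SAMPLE_RATE))
            for t in range(n)]

# In the Bell MF scheme, the digit "1" was the pair 700 Hz + 900 Hz.
samples = tone_pair(700.0, 900.0, 0.1)
print(len(samples))  # 800 samples = 100 ms at 8 kHz
```

The vulnerability was architectural: because control and voice shared one channel, a teenager with an oscillator had the same “voice” as a switching office – a flaw later closed by moving signaling out of band.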

The Altair microcomputer caught the attention of the Homebrew Computer Club, an informal group that met near Stanford University starting in 1975, the last year of the Vietnam War. This club was a strange brew of well-educated hobbyists and hippies living amid the military-industrial complex while heavily influenced by the counterculture movement. Members of this group once got the Altair to play music by placing a transistor radio next to the microcomputer to pick up the electrical interference emitted by the Altair. They programmed it to play the Beatles’ “Fool on the Hill.”

Two regulars at the Homebrew Computer Club were young locals who worked nearby in the computer industry. Steve Jobs and Steve Wozniak also viewed the Altair, but they began to receive a lot of attention when they brought in their own microcomputer. The scene is dramatized in the movie Pirates of Silicon Valley, in which Jobs says, “I never had a problem with the Altair, until I tried to use it.” Of course, it was the Altair that inspired Bill Gates to leave Harvard and get into the software business.

On April 1, 1976, they started Apple Computer. The name reportedly came from Jobs’ belief that the apple was the perfect fruit: it has high nutritional value, tastes good, and comes in an attractive, protective package.[3] But Jobs’ genuine concern was a type of spiritual nutrition that he thought was lacking in modern society and certainly in the dominant computer industry. His dream was to sell “the first real packaged computer” that could empower and help enlighten the individual.

Jobs and Wozniak decided not to go with the Intel 8080 chip that fascinated the Homebrew crowd and chose instead a variation of a Motorola chip. Motorola was one of the original licensees of Bell Labs’ transistor and continued to play an important role in chip design and production. Its 6800 family of microchips, in particular, had a substantial impact on the emergence of the microcomputer. Several of its employees left to form a company called MOS Technology, which released an imitative chip called the 6502. MOS was soon sold to Commodore, which used the 6502 in its well-regarded computers. It was the 6502 that became a workhorse for the Acorn, Atari, and Commodore microcomputers. Perhaps most importantly, it was readily and cheaply available by 1976 and was used by the new company called Apple Computer.[4]

Silicon Valley’s established firms were not really interested in microcomputers; they saw their markets as military systems and other industrial products, such as manufacturing sensors. The two kids from Apple, however, managed to find enough venture capital to produce a small number of computers. Jobs, a fruitarian at the time, went to Arthur Rock, a venture capitalist, for money to get the company off the ground. Rock had originally funded Intel and referred them to a recent Intel retiree, Mike Markkula, who became an important part of Apple’s management team. With the new funding, they contracted with a local company to build 1,000 computers. The Apple II was launched at the West Coast Computer Faire in 1977. It was a big hit, and they signed up distributors. Initially, the market was hobbyists, but Apple soon began to market to education and business.

Apple was also an early adopter of the smaller 5¼-inch disk drive. Wozniak redesigned the controller for the disk drive into a more elegant configuration, reducing the number of chips needed from around fifty to five and considerably shrinking the space required inside the Apple II. Shugart Associates, a leader in developing the 5¼-inch floppy, sold the drives to Apple. The new floppy drive system allowed many third-party software developers to produce and market software that could be easily installed and used on the microcomputer.[6] Apple sold the new device, which could hold 113 kilobytes of information, for $495. The drives were important not just for storing data; they would prove crucial for distributing software applications.

While Woz earned his title as the “Mozart of digital design” through his design of the Apple II, Jobs helped conceive the computer as a democratizing tool with the motto “one person – one computer.”[5] The microcomputer was sold as a tool that would balance the unequal relationship between institutions and the individual. It would empower individuals and allow their inner artists to emerge. The Apple II became the darling of the counter-cultural crowd. It would remain a symbol of resistance against the corporate forces of IBM and, later, the predatory practices of Microsoft.

The message here is not that computer creativity could only emerge in the midst of the military-industrial complex, but rather that context matters. Enabling infrastructures matter. Later, the Internet would spread these opportunities to youth through PCs and dial-up connections to ISPs, and through webpages that could be easily built with HTML, GIFs, and Adobe Photoshop JPEGs.

Notes

[1] Quote about Steve Wozniak “Woz” from a chapter called “The Prankster” in Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. p. 255.
[2] Triumph of the Nerds video.
[3] The Apple name from http://apple2history.org/history/ah02.html accessed July 6, 2005.
[4] Useful information on Zilog and Motorola chips came from George Gilder’s Microcosm. p. 108-112.
[5] Wozniak as the Mozart of digital design from Triumph of the Nerds video, Part One.
[6] Apple’s new floppy disk proved decisive. From Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 266-267.

Citation APA (7th Edition)

Pennings, A.J. (2010, Sept 24). Apple, Silicon Valley, and the Counter-Cultural Impulse. apennings.com https://apennings.com/dystopian-economies/apple-and-the-counter-cultural-impulse/



Anthony J. Pennings, PhD has been on the NYU faculty since 2001, teaching digital media, information systems management, and global political economy.

How IT Came to Rule the World, 2.7: The Origins of Microsoft

Posted on September 22, 2010

The Intel 8080, the Altair, and the Formation of Microsoft

As kids, Bill Gates and Paul Allen dreamed of having their own Fortune 500 company.[1] The two became friends (and sometimes adversaries) when both attended the prestigious Lakeside School in Seattle in the early 1970s. But their friendship was mediated by a third entity, a Teletype ASR-33 connected remotely to a computer. As video terminals were rare and expensive, this popular teleprinter was often recruited to provide an interface to computers. Lakeside used the ASR-33 to connect to a timeshare service offered by General Electric (GE) in the Seattle area.

The service, based on sharing the resources of a DEC PDP-10, soon proved to be expensive. Despite an infusion of $3,000 into the Lakeside computer account by the Lakeside Mothers Club, the boys (Lakeside was an all-male school) soon ran out of computer time. But luckily, one of the mothers of a Lakeside student was a co-founder of a new company, the Computer Center Corporation, which offered students computer time on its PDP-10 in exchange for helping debug its software.

Allen and Gates became quite proficient with the machine, and even after the project ended, they continued to “hack” into it. Although they were eventually caught, they gained a notorious but beneficial reputation for their hacking. Rather than continue hacking, however, they went into business, forming the Lakeside Programmers Group with a few other students.[2]

The Lakeside Programmers Group primarily provided programming services in exchange for computer time, but it laid the foundation for their next enterprise, Traf-O-Data. This new company was formed in 1972 to sell computerized traffic-analysis systems to municipalities. They planned to string rubber cables across roads and highways and use a microprocessor to develop statistics on traffic flow.

Their technology was based on the Intel 8008, a chip that had enamored Allen, who subsequently drew Gates into helping him develop a BASIC interpreter for it. By this time, Allen was enrolled at Washington State University, and his dorm became their headquarters. The 8008 was the first 8-bit microprocessor, but working with it was a bit awkward, so they used an IBM System/360 on campus to simulate the 8008.

At the same time, both were hired by defense contractor TRW to develop a simulator for the 8008 chip.[3] The two labored and eventually came up with a workable system for the car counter, though it had difficulties with its paper-based printer. Traf-O-Data was not very profitable, as it soon found itself competing against free services from the State of Washington. The final Traf-O-Data product was not really a computer, but it provided valuable experience for the two young programmers.[4]

After Gates’ graduation, the two found themselves in Boston, where they made a bold move towards their vision of creating a major computer company. In the winter of 1975, Paul Allen, who was then working for Honeywell, picked up the January issue of Popular Electronics at Cambridge’s Harvard Square. Excitedly, he took it to the dorm room of his friend Bill Gates, who was enrolled nearby at Harvard University.

The magazine issue sparked their quest to enter and shape the new computer era. They saw their opportunity to leverage their experience with BASIC and gain a foothold in the emerging microcomputer industry. The magazine showed a low-cost microcomputer that was built around the Intel 8080 chip and in desperate need of a programming language. The Altair, as it was called, was actually a kit that had to be assembled by the purchaser. It was marketed by a company called MITS (Micro Instrumentation and Telemetry Systems).

Gates and Allen had been following the development of the 8080 chip. Using Traf-O-Data stationery as their institutional identity, they contacted the developer of the MITS machine to offer their services. The two had some experience with Intel chips from their Traf-O-Data days, so they designed a simulation of the Altair’s 8080 chip on a Harvard PDP-10 and used it to build a version of BASIC that would run on the Altair. Their detailed questions about how to use a Teletype machine to read and input data convinced the MITS people that they were serious. Just a few months later, Allen flew to New Mexico to present their software. The memory in the Altair was so small that they were unsure their version of BASIC would work. But after a day’s delay and some last-minute software adjustments by Allen, the software ran, and the demonstration was a success. It even allowed them to play a game of Lunar Lander on the Teletype printer.

Allen was offered a job on the spot and became Vice-President of Software at MITS. Gates flew down in the summer and helped out while the two worked on their new company at night. On July 23, 1975, the two companies signed a contract giving MITS exclusive rights to their BASIC, with royalties going to their new company, Micro-Soft. The relationship between MITS and Micro-Soft soon soured, as Micro-Soft was not making much money from the deal. MITS sold BASIC for $75 with the kit while charging $500 for it separately. It became extremely attractive for Altair users to trade bootleg copies of BASIC rather than buy it from MITS. Although the Altair depended on BASIC to do anything useful, MITS saw its business as selling the hardware, not the software. Consequently, marketing the software was not a priority.

On February 3, 1976, Gates sent his infamous “Open Letter to Hobbyists,” accusing most Altair owners of stealing the BASIC software by duplicating the paper tape. He claimed that only 10% of them had bought BASIC. Gates later got his father, a successful lawyer in Seattle, involved in a lawsuit to get BASIC back from Pertec, the company that had bought MITS in the meantime.

The year 1977 was a decent one for Micro-Soft, with $381,715 in revenues.[5] They got BASIC back and in August began negotiations with Apple to license their programming language for $21,000. Micro-Soft produced versions of BASIC for new processors as they came out, including the 6502 that Wozniak was using in the Apple II and the Z80 in Radio Shack’s TRS-80. The latter’s marketing capabilities made an extraordinary impact on the popularity of the microcomputer, and Micro-Soft’s BASIC was on most of them. When the Commodore PET was designed, Micro-Soft also got the call to provide its BASIC.

Allen, Gates, and crew worked on the Apple project despite an agreement that royalties would not be forthcoming until it shipped the next year. Ironically, it was support from Apple that provided a basic level of financial backing for Microsoft. Although Wozniak had designed a BASIC early on for the Apple II, it was not the “floating point” version that many users were requesting. Micro-Soft soon developed a version of BASIC for the Apple II and received a check for $10,500 as the initial part of a 10-year license fee. It was one of the rare times that Gates allowed software to be licensed on a flat-fee basis rather than requiring a royalty payment on every copy sold.[6] The new version, called Applesoft BASIC, was released in November 1977 and improved and re-released the following year.

Micro-Soft was very important to Apple in its early days. The Apple II’s bus architecture made expansion possible, and Micro-Soft came up with the SoftCard to allow the Apple computer to run CP/M. But it was BASIC that was crucial to the success of the Apple II, and Steve Jobs later encouraged Microsoft to create a version for the Macintosh. When that project turned into a disaster, Gates strong-armed Jobs into accepting it anyway or losing the license for the Apple II’s BASIC. Since Apple was still highly dependent on sales of the Apple II, and BASIC was a strong complementary component of the microcomputer, Jobs killed the in-house MacBASIC project and used the Microsoft version, at least until the Apple II was discontinued.[7]

In 1978, Gates agreed to move the company back to their hometown of Seattle and change the name to Microsoft. Allen, in particular, had grown tired of the desert and convinced Gates to make the move. The three years in Albuquerque were very challenging, as the small company (16 employees in 1978) worked day and night to keep up with the new computers and chips coming onto the market. CP/M was becoming the most popular cross-platform operating system due to Kildall's BIOS, which allowed the OS to be quickly adapted for new machines.

In early 1979, they moved to Bellevue, a suburb of Seattle, and started focusing on Intel-based 8086 16-bit machines. By March 1979, the newly named Microsoft had 48 OEM (Original Equipment Manufacturer) customers for its 8080 BASIC programming language, 29 for FORTRAN, and 12 for COBOL. During that summer, they developed Pascal and APL languages as well. The Intel-based microcomputer industry was starting to take off, and Gates and Allen had positioned themselves at the center of it.

Notes

[1] Fortune 500 dream story told by Paul Allen in an interview on Robert X. Cringely’s Triumph of the Nerds.
[2] This information on Allen and Gates' early years is from Laura Rich's The Accidental Zillionaire. John Wiley & Sons, Inc.
[3] Allen and Gates became employees of TRW during Gates' senior year, earning salaries of some $30,000.
[4] Traf-O-Data and TRW employment information from Fire in the Valley, pp. 30-32.
[5] Micro-Soft’s 1977 revenues from Laura Rich’s The Accidental Zillionaire.
[6] Licensing of Microsoft's BASIC to Apple from apple2history.com. Accessed September 19, 2003.
[7] Microsoft’s early reliance on Apple from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 265.


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is the Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World, 2.6: The PC and the Floppy Disk

Posted on | September 21, 2010 | No Comments

The development of the floppy disk drive was a crucial factor in the success of the personal computer. Originally developed by David Noble at IBM in 1971 as a backup storage mechanism for the System/360's magnetic core memories, the floppy disk was soon put to use for other purposes. Alan Shugart, an employee at IBM, saw the implications of the new device for smaller computers. He realized that the floppy disk could provide a storage device that was faster, offered random access (meaning that one did not have to rewind an entire tape to find desired information), and could be portable. He started a company called Shugart Associates to build and market them. However, for the microcomputer to be effective, it would need to combine the power of the microprocessor with the new storage mechanism. For a microcomputer to use the floppy disk, it required a new software package to run it, what IBM had already called a "Disk Operating System."[1]
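The difference between tape-style sequential access and disk-style random access can be sketched in a few lines. This is a toy illustration of the concept only, not anything from IBM or Shugart; the record size and data are invented:

```python
# Simulate a storage medium holding 1000 fixed-size 8-byte records.
import io

RECORD_SIZE = 8
medium = io.BytesIO(b"".join(i.to_bytes(RECORD_SIZE, "big") for i in range(1000)))

def read_record_sequential(stream, n):
    """Tape-style access: rewind, then read past every record before n."""
    stream.seek(0)
    for _ in range(n):
        stream.read(RECORD_SIZE)
    return stream.read(RECORD_SIZE)

def read_record_random(stream, n):
    """Disk-style access: jump directly to the record's byte offset."""
    stream.seek(n * RECORD_SIZE)
    return stream.read(RECORD_SIZE)

# Both find the same record, but the random read does one seek
# instead of touching all 742 records that precede it.
assert read_record_sequential(medium, 742) == read_record_random(medium, 742)
```

The cost gap between the two access patterns, trivial here, was enormous on real tape hardware, which is why the floppy made small interactive computers practical.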

It was Gary Kildall who pioneered the first effective disk operating systems for microcomputers. Kildall earned his doctoral degree through the military's ROTC program and so had a choice of going to Vietnam or teaching computer science at the Naval Postgraduate School in Monterey, California. While teaching, he was also hired as a consultant by Intel to write software for its microprocessors, including a compiler later called PL/M. The compiler would allow programs written in languages like FORTRAN on larger computers to be used with the microprocessor. It was used on Intel's Intellec-4 and Intellec-8, small computers that could lay some claim to the title of the first microcomputers but were never marketed as such. Later, Kildall began emulating Intel's new 8080 microprocessor on an IBM computer (Microsoft founders Gates and Allen would soon do something similar to write software for the Altair computer) and developed a new version of PL/M for it. He also decided at that time to write a program to control the mainframe's disk drive. Using commands developed by DEC to access data from its DECtape, he began to write the code for the new operating system. DEC's OS/8 and later its RT-11 had been important developments for the PDP minicomputer series and showed that smaller computers could compete with the mainframes.[2] Pulling the pieces together, Kildall created his new operating system, called CP/M, short for Control Program/Monitor.

Intel didn't really want Kildall's OS, but the software soon became the standard for a number of new microcomputers. CP/M was announced as a commercial product in April 1976. Kildall soon quit teaching to form a new company with his wife called Intergalactic Digital Research (later just Digital Research) to market the operating system. CP/M was soon used by a number of small computers, including the Osborne, the first portable microcomputer, and the Kaypro.

As CP/M became the standard for microcomputer operating systems, it inspired imitation. It was the foundation for another important operating system, Q-DOS (Quick and Dirty Operating System), which was bought by a small company called Microsoft as the basis for its own microcomputer operating system. MS-DOS became the software foundation of Microsoft's empire and of the successful early run of the IBM Personal Computer.[3]

Notes

[1] The beginnings of the floppy disk from Paul E. Ceruzzi's A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. pp. 236-237.
[2] Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press was the source for Kildall’s development of CP/M from DEC sources. p. 238.
[3] Gary Kildall’s early story also from Robert X. Cringely’s Accidental Empires. New York: HarperBusiness, A division of HarperCollins Publishers. pp. 55-59.


How IT Came to Rule the World, 2.5: Intel and the PC

Posted on | September 14, 2010 | No Comments

The computers used for the Moon landing were already out of date when Neil Armstrong walked on the lunar surface in 1969, but the decision to use integrated circuits or “chips” by the Apollo project paved the way for the microprocessor revolution and one of its main offspring, the personal computer.

NASA decided early on to standardize its Apollo flight technology on integrated circuits (ICs) and nurtured them into reliable, relatively high-performance digital processors. Reliability, cost, and the ease of manufacturing these "chips" had been sufficiently subsidized by the space program (and by the missile programs of the "Mutually Assured Destruction" nuclear deterrence strategy) to the point where integrated circuits and their next stage, the microprocessor, could be used in business-related computers. In what would become a common geek term, the PC was the "killer app" for the microchip.

While Jack Kilby at Texas Instruments is generally credited as the first to construct an integrated circuit, his contemporaries at Fairchild conceived of a production process that could mass-produce the small chips.

While Kilby's ICs required their combination of transistors, resistors, and capacitors to be connected with gold wires and assembled manually, Robert Noyce and others at Fairchild were developing a literal printing process to construct the ICs. Noyce's "planar process" printed thin metal lines on top of an insulating silicon oxide layer to connect the integral components of the chip. At first, they could only connect a few components, but as they refined their "photolithography" method, hundreds, then thousands of connections could be made. By the time the Internet became a household word in the 1990s, millions of transistors were being placed on a single chip.

In 1968, with the Apollo project well established, integrated circuit co-inventor Robert Noyce and fellow Fairchild "traitor" Gordon Moore left the company to form a new semiconductor firm. What emerged was Silicon Valley stalwart Intel, the future producer of some of the industry's most sophisticated microprocessors and the latter half of the famous "Wintel" combination (Windows OS/Intel microprocessor) that would dominate PC sales throughout the rest of the century.

After twenty years of government backing, the microprocessor industry was about to crawl out on its own. And it was the microcomputer that would give the semiconductor industry the legs to become viable in the commercial arena.


The Smith Effect II: “State-istics,” Calculating Practices, and the Rise of IT

Posted on | September 6, 2010 | No Comments

This is the second in a three-part exploration of Adam Smith and how his ideas laid the foundation for information technology (IT). Part one discussed Adam Smith's reconceptualization of wealth and its importance to the role of populations in political economy.


“’Tis not a tale I tell to many.
The Government’s Engines have long memories.”

– William Gibson and Bruce Sterling, The Difference Engine (1990)

In Smith Effect I, the argument is made that Adam Smith’s writings contributed to a set of intellectual movements that located a nation’s wealth in its population rather than the familial structure of the monarch and its treasury.

Drawing on Michael J. Shapiro‘s Reading “Adam Smith” (2002), I argue that this reconceptualization created a trajectory for the development of new information practices and technologies. Not only did Smith’s writings contribute to an understanding of “market forces” and the importance of labor, but the new emphasis on the population for a nation’s wealth provided the intellectual foundation for a transformation of the census, an ancient political tool, into a wide field of measurements that came to be called statistics (“state-istics”), the science of numbers in service of governing the nation-state. This turn led directly to the creation of information machines and, ultimately, electronic digital computers.

The Smith Effect therefore provides a unique opportunity for analyzing the roots of modern society’s reliance on information technology (IT) and how these tools and practices have been integrated into a wide array of corporate and governmental bureaucracies.

Written around the time of the American Revolution, Smith's ideas became the foundation of economic thought in the West and a major contributor to the characterization of the modern state and its role in the liberalization of the political economy. Reflecting the preoccupation of the time, Smith's point of departure for his analysis of the economy was sovereignty. But rather than negate it, he reconfigured it. The authority of the state was no longer seen as simply concerned with the maintenance of a ruling power, but also with its productive capabilities and with the wealth of a realm's inhabitants.

Because of Smith, sovereign power was increasingly seen as a steering mechanism that could guide the flows and collaborations of social activities toward increasing the overall wealth of the nation. The energies of the population could be mobilized in a way that took advantage of the social propensities to barter, exchange, and accumulate. In Smith's wake, governmental bureaucracy expanded and took on increasing demands in terms of textualizing, aggregating, calculating, and interpreting information about the economy and the population, which was beginning to be considered an integral component of a nation's wealth.

The origins of this new field of governmental calculation were first articulated in a chapter of the German Baron J. F. Bielfeld‘s Elements of Erudition in 1787. Entitled “Statistics,” it announced the endeavor as the “science that teaches us what is the political arrangement of all the modern states of the known world.” Bielfeld discussed the emergence of the field in Germany and its objective to chronicle the “noteworthy characteristics of the state” and to analyze the major powers in the world, including their citizens, industries, and governmental decisions.[1]

While initially concerned with categorization and verbal descriptions, by the beginning of the next century these descriptions were largely replaced by numerical data and calculation. This new enthusiasm for calculation led to the publication of a number of books and the participation of many “societies” in the task of producing lists of numbers.

By the time of Charles Babbage, generally considered the father of computing, numbers had become a major preoccupation in some social circles. Babbage published On the Economy of Machinery and Manufactures in 1832, which established his credentials as a political economist in the lineage from Adam Smith to John Stuart Mill and Karl Marx. His analysis of factories drew on Smith's analysis of pin manufacturing and the role of the division of labor and specialization. It was also crucial for Marx's predictions on the "means of production."

More relevant to this thread, Babbage was a strong advocate of the value of numbers, calculation, and tables. He urged the publication of more books of numerical constants. As Hacking recounts: "Babbage had twenty kinds of numbers to be listed. They begin with familiar enough material; astronomy, atomic weights, specific heats and so forth. They quickly pass to the number of feet of oak a man can saw in an hour, the productive powers of men, horses, camels, and steam engines compared, the relative weights of the bones of various species, the relative frequency of occurrence of letters in various languages." With this new fascination with numerical calculation, statistics went beyond the calculation of government revenues and assets to be used in a wider range of societal and political computations.[2]

The "avalanche" of printed numbers diffused calculative and listing capabilities throughout a number of social domains. As bureaucracy was still in its infancy, the trend was initially more pronounced in the universities and learned societies, in areas such as epidemiology, genetics, and political economy. Statistical organizations convened, such as the Manchester Statistical Society and the Statistical Society of London, which included such members as Thomas Malthus, generally considered the one who put the "dismal" in the "dismal science" of economics due to his dire prognostications about population and agriculture.

Dialogues on statistics were, from their earliest beginnings, politically charged discourses. Of great importance to the members of these societies was "the condition of the people," as statistics became an important part of the social reform movements that accompanied the trials of industrialization.[3]

Calculations took a new turn in the wake of Smith's contribution. The use of "political arithmetic" for the mercantile state had a long history in the areas of taxation and finance, concerned with the internal affairs of the state and the monitoring of wealth extracted from its empire. During the nineteenth and twentieth centuries, however, states turned toward registering information on their populations. This included the "centralized collation of materials registering births, marriages and deaths, statistics pertaining to residence, ethnic background and occupation; and what came to be called by Quetelet and others 'moral statistics,' relating to suicide, delinquency, divorce and so on."[4] It also shows in the Belgian census of the 1840s, which would go on to become the international model as countries learned from each other the techniques of constructing a numerical representation of their populations and also used these numbers to compete for national status.[5]

The new focus on population in the late 18th century led directly to the emergence of a series of knowledges for constructing, reading, and acting upon this problem. Numbering for the state, or “statistics,” emerged as a formative knowledge in the construction of government practices. Bureaucracies expand and officials are put to use in collecting the new information. The comprehension of the population as the new source of national wealth and as the focus of administrative activity called forth new languages, many utilizing alphanumeric figuring.

Based on the elevation of the alphanumeric notation system as the quantitative rhetoric of reality (from the old French term real, the space controlled by royalty), statistics developed as a state instrument for social surveying, with consequences both for the constitution of the population and, according to Foucault, for new forms of "governmentality," including the elaboration of political economy as a new discipline for measuring a nation's wealth.

Statistics provided a new view of the economy and society. No longer could the family serve as a viable model of economic accumulation and social governance. The family instead becomes part of the demographic realm to be studied and calculated. Foucault elaborates:

    Whereas statistics had previously worked within the administrative frame and thus in terms of the functioning of sovereignty, it now gradually reveals that population has its own regularities, its own rate of deaths and diseases, its cycle of scarcity, etc.; statistics shows also that the domain of population involves a range of intrinsic, aggregate effects, phenomena that are irreducible to those of the family, such as epidemics, endemic levels of mortality, ascending spirals of labour and wealth; lastly it shows that, through its shifts, customs, activities, etc., population has specific economic effects: statistics, by making it possible to quantify these specific phenomena of population, also shows that this phenomenon is irreducible to the dimension of the family.[6]

By the late nineteenth century, frustration with manual methods of compiling statistics was rising. Despite increasing loads and classification projects, pencils, pens, and rulers were still the main tools for classifying, calculating, and summarizing work sheets into journals and ledgers. The United States started to look for alternatives after running into difficulties tabulating the 1880 census. Required by the US Constitution to keep a register of the population, it was trying desperately to keep up with the large-scale immigration of the late 19th century. The Census Bureau could not keep accurate track of the growth and by 1887 was soliciting ideas to help it finish the count and prepare for the next census. The solution would be mechanical and would lead to the formation of International Business Machines (IBM).

Preview of the Smith Effect III: The Census and the Rise of IBM

From the eighteenth century, changes started to occur in the way state sovereignty was generally construed. Mercantilism, which did much to apply the new calculating rationality and arts of government to the welfare of the monarchical state, began to give way to an even broader application of political practices involving the collection and calculation of information. Whereas the instruments of the state under mercantilism worked to increase the wealth of the ruler, the new practices and knowledges of the state paid increasing heed to the newly conceived problem of population. The rigid framework of sovereignty that had previously been modeled on the patriarchal family began to confront an increasing money supply, new numerical techniques allowing demographic accounting, and the expansion of agricultural and goods production.

But it was the American census that provided the most immediate relief for the problems associated with aggregating large amounts of statistical data. Mandated by the US Constitution to be held every 10 years, its calculation ran into difficulties as immigration soared in the late 19th century. By 1880, the task was nearly impossible to complete before the next census was due. The solution was the "census machine," and it changed the trajectory of both corporate and government bureaucracies as well as the information practices that run them.

Statistics: The Calculating Governmentality

This section follows the thesis that Adam Smith's new conception of the wealth of nations created the impetus for the development of statistics and other information-collecting and calculating technologies. This redirection of political economy, from governmental wealth to the vitality and enterprise of the population, led to continual interest and innovation in information practices. The United States Constitution, written in 1787 and ratified in 1788, was influential because it required a census every ten years. This requirement led to Herman Hollerith's tabulating machines, created for the 1890 US Census.

Hollerith's company later merged with two others to form International Business Machines, or IBM. IBM began to sell its tabulating machines and customized census services to countries like Russia and, later, Nazi Germany. The punch-card tabulating systems were then generalized for a wide range of commercial and government purposes, including monitoring racial politics as well as parts management for the Luftwaffe, Germany's air force.
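The tallying that Hollerith's machines performed electromechanically, sorting punched cards and advancing counters by category, can be sketched in miniature. This is a hypothetical illustration only; the card fields and data below are invented, not drawn from any actual census:

```python
# Each "card" records a few census fields, as a punched card encoded
# one person's attributes in fixed column positions.
from collections import Counter

cards = [
    {"sex": "F", "birthplace": "Ireland", "occupation": "mill worker"},
    {"sex": "M", "birthplace": "Germany", "occupation": "farmer"},
    {"sex": "F", "birthplace": "Ireland", "occupation": "domestic"},
    {"sex": "M", "birthplace": "US",      "occupation": "clerk"},
]

def tabulate(cards, field):
    """Tally the cards by one field, as a sorting-and-counting machine would."""
    return Counter(card[field] for card in cards)

by_birthplace = tabulate(cards, "birthplace")
assert by_birthplace["Ireland"] == 2  # two Irish-born persons in this toy deck
```

The point of the machine was that this pass over the deck, repeated for each field of interest, replaced months of clerks transcribing tallies by hand.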

Notes

[1] Notes and quotes from Statistics and Society: Data Collection and Interpretation (1991) by Walter Theodore Federer. CRC Press.
[2] Hacking, I. (1991) "How should we do a history of statistics?" in Burchell, G., Gordon, C. and Miller, P. (eds.) The Foucault Effect: Studies in Governmentality. (Chicago: University of Chicago Press). p. 186. Babbage, Charles. On the Economy of Machinery and Manufactures. London: Charles Knight, 1832.
[3] Manicus, P. (1987) A History and Philosophy of the Social Sciences. (Oxford: Basil Blackwell). pp. 196-197. See also Poovey, M. (1993) "Figures of Arithmetic, Figures of Speech: The Discourse of Statistics in the 1830's," Critical Inquiry. Winter, Vol. 19, No. 2.
[4] Hacking, I. (1991) "How should we do a history of statistics?" in Burchell, G., Gordon, C. and Miller, P. (eds.) The Foucault Effect: Studies in Governmentality. (Chicago: University of Chicago Press). p. 182.
[5] See also Bruce Curtis' The Politics of Population: State Formation, Statistics, and the Census of Canada, 1840-1875. (2002) University of Toronto Press.
[6] Foucault, M. (1991) "Governmentality," in Burchell, G., Gordon, C. and Miller, P. (eds.) The Foucault Effect: Studies in Governmentality. (Chicago: University of Chicago Press). p. 99.

Citation APA (7th Edition)

Pennings, A.J. (2010, Sept 06). The Smith Effect II: "State-istics," Calculating Practices, and the Rise of IT. apennings.com. https://apennings.com/dystopian-economies/the-smith-effect-ii-from-political-arithmetic-to-state-istics-to-it/


The Smith Effect I: Markets, Governments, and the Rise of Information Technologies

Posted on | August 30, 2010 | No Comments

This is the first in a three-part exploration of Adam Smith and how his ideas laid the foundation for modern information technology (IT).

In his book, Reading "Adam Smith" (2002), Michael Shapiro states that because of Adam Smith's contributions, "he is always at least implicitly present when one negotiates modernity's codes and arrangements." What he means is that Smith's writings have become richly woven not only into the fabric of modern thought, its economics and its philosophies, but also into its governmental and economic institutions.[1] Shapiro's "Smith Effect" can be used to understand the influence Adam Smith has had on both the modern emphasis on economic markets and the role and growth of government. Integral to both these processes has been a wide range of information management imperatives that have spurred the development of computers and communications technologies.

Smith wrote his classic text, An Inquiry into the Nature and Causes of the Wealth of Nations (1776), commonly known as The Wealth of Nations, at a time when much of modern thought was being introduced and discussed. Intellectuals in the 18th century were obsessed with challenging the reigning political authority and the basis from which it derived its political legitimacy, namely a vertically oriented, religion-based ruling system. For example, while the emerging United States of America would remain one nation "under God," it instituted a set of horizontally oriented checks and balances based on the machine metaphors of the nascent industrial age.

Smith took aim at the hierarchically oriented mercantile structure that was characterized by a governing sovereignty that derived its authority from the rhetoric of divine origins. His book became a major force in the liberalization of this ruling structure. In the process, he helped create the impetus for a new rationality that would change modern government and how the economy was viewed. This new rationality would both liberalize the economy and draw nations into a new type of governance, the management of populations.

From Treasure to Laboring Energies

European monarchies maintained a strong grip on the land and their subjects by invoking a privileged place in society. By claiming divine legitimacy, they justified their positions in a stationary order. Consequently, early imperial mercantilism conceived of wealth as static hoarding for the monarch and feudal lords, not as a process of dynamic practices involving the enterprise of labor and capital investment, as Smith would later argue. In challenging this organization, Smith’s writings helped “recast divine will as a set of dynamic mechanisms regulating the process of production.”[2]

His argument was based on the Protestant ideas of the time, which challenged the Catholic Church's role and its teaching that God could be petitioned to intervene in worldly affairs. For Smith, divinity was infused in earthly nature and demonstrated by the harmonious order of economic markets based on human desiring. Smith was particularly comfortable using musical metaphors to talk about the workings of the economy. Markets ensured the harmonious working of the world as long as those involved worked to their own benefit. In making this philosophical and political leap, Smith helped shape the moral, political, and social dimensions that have become integral to modern capitalist political economies.

Smith's often-quoted "invisible hand" was mentioned only once in The Wealth of Nations. Still, it has continued to function as powerful currency for his train of thought that relocated the problem of wealth from the realm of the sovereign purse to the processes involved in economic production. For Smith, the nation's wealth was not what could be accumulated for the monarch but rather the sum of goods and services that were available per capita for the population.

Hailing from Scotland, on the periphery of the British kingdom, Smith objected to the hoarding mercantile practices of the English monarchy because he saw that they made people poorer. Smith's writings were significant because he located value in the act of production and in the creative and energetic expenditures of laboring people, not in the amassing of wealth.

Adam Smith set the stage for Karl Marx, not one of his direct students, but certainly one of his most dedicated. What intrigued Marx was that Smith conceptualized the plight of laboring bodies. This meant that Smith, “shifted the focus from national rivalries to the conditions of work, thereby helping to enfranchise a neglected constituency, the working poor, and to draw them into a new conversation on problems of inequity, a conversation which could not be held with the old mercantilist conversation on value.”[3]

In doing so, Smith intellectually recognized the working population and helped transgress certain prejudices against working activities and those engaged in them. As Zuboff argued, “The Middle Ages produced a conception of labor infused with a loathing drawn from three traditions: (1) the Greco-Roman legacy that associated labor with slavery, (2) the barbarian heritage that disdained those who worked the land and extolled the warrior who gained his livelihood in bloody booty, and (3) the Judeo-Christian theology that admired contemplation over action.” Smith’s new intellectual negotiations set the stage for empowering labor and a historic wealth boom based on new types of industrial production and the distribution of affluence among the classes.[4]

The “Invisible Hand” and the Escape from History

To better understand the “invisible hand,” one of modernity’s most prevalent truisms, it is vital to examine Smith’s philosophy of religion and nature. Smith conceived of God as the “Author” of the world, but one that had retreated from the day-to-day world and left behind a structural guarantee that the self and the social order would remain attuned. This guarantee is a form of human desiring we call “enterprise” at its best and “greed” at its worst.

Smith’s upbeat version of the world was one that resulted in an aggregated “harmony” when individuals followed their passions. Shapiro explains, “Among what Smith’s ‘Author’ had left behind as “nature” was the regulative mechanism of a socially felicitous tendency in individual human desiring, which along with some inevitable tendencies in collective arrangements, would eventuate an order that progressed toward general prosperity and broadly distributed human contentment.”[5] What has become commonly known as the “market” was Smith’s divine mechanism that was “infused in nature” to produce a harmonious aggregate outcome when individuals act on their passions.[6]

The “Smith Effect” provides an opportunity for new ways to analyze the social field and the overlap between economic, social, and political spheres. Smith was a crucial critical theorist in his rejection of mercantile thought, and his writings were a forerunner of modern political economy. Two major bodies of economic analysis would emerge from Smith’s writings.

One was the classical liberal tradition that combined Smith’s anti-mercantile stance in favor of “markets” with an increasing emphasis on empirical and quantitative calculation. This stance focused largely on theoretically reconceptualizing Smith’s divinely gifted “invisible hand” in terms of more naturalistic and physics-derived metaphors.

One such metaphor was "equilibrium," which allowed political economy "to escape from history."[7] By focusing on economic tendencies to converge on one price among many competitors, economics was able to devise a theoretical system that could marginalize historical events and focus exclusively on modeling the forces of supply and demand. Economic, or market, equilibrium was conceived as a condition or state in which economic forces become balanced. This philosophical foundation informs mainstream liberal economics in both its Keynesian and Monetarist traditions.
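The equilibrium idea can be shown with a toy model: the market "balances" at the price where quantity demanded equals quantity supplied. The linear curves and numbers below are my own invented illustration, not anything from Smith or the economists discussed here:

```python
# Toy linear supply and demand curves (invented coefficients).
def demand(price):
    """Quantity demanded falls as price rises."""
    return 100 - 2 * price

def supply(price):
    """Quantity supplied rises with price."""
    return 10 + 4 * price

# Setting 100 - 2p = 10 + 4p gives 6p = 90, so p = 15.
equilibrium_price = (100 - 10) / (2 + 4)

# At the equilibrium price the two forces balance exactly.
assert demand(equilibrium_price) == supply(equilibrium_price)
```

Whatever the starting price, the model predicts movement back toward this balance point, which is precisely the ahistorical character the passage describes: the system's history drops out, and only the balancing forces remain.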

The other body of analysis was the Marxist tradition, which drew its investigation from Smith’s concern for the worker and the processes of valuing commodity forms and accumulating capital. This latter posture organized the history of the economy around labor and its role in the “mode of production.” Class struggle between laboring forces and capitalist owners drives the ebb and flow of history. Marx combined Smith’s political economy with German dialectical philosophy and French utopian theory to create a unique and radical analytical perspective.

These two ways of viewing the political economy helped shape the centuries that followed, including the classic division of the Cold War, when the West developed a form of capitalist industrialization based on the New Deal’s negotiated relationship between government and business, while in the “East,” forms of state-controlled communism emerged in China, Cuba, North Korea, and the USSR. In the 1970s, liberalism re-emerged as neoliberalism in reaction to Keynesian/New Deal macro-management of the economy, once again stressing market forces, particularly in the financial sphere.

While Smith opened up new areas for investigating the economy, it would be a mistake to equate his writings with the anti-state rhetoric of modern neoliberalism. His writings marked not so much the end of the state, or even a way of curbing its power, but rather a way of theorizing the economy so that it could be even more productive for the nation. More than ever, the intervention of the state in the economy was legitimized.

This came about largely because the theoretical conversations he set in motion changed the connotations associated with both the state and the economy. For the former, it meant a state with a powerful new system of calculable observations that could be used to guide its actions. To help with these observations, “political arithmetick,” or statistics, emerged as a body of knowledge that could record a wide range of demographic and epidemiological events and processes. These new economic and social surveillance techniques provided new ways to scrutinize social activities and, consequently, new ways to intervene in them. For the latter, it meant a redefinition of the economy from one modeled on the household to a larger set of social activities covering a wide range of laboring activities and entrepreneurial production.

Conclusion

Smith wrote extensively on the proper expenditures of the state, including defense, education, justice, public works, the dignity of the sovereign, and institutions for facilitating the commerce of society. While no doubt a significant contributor to modern liberalism and its commitment to international comparative advantage and “free enterprise,” Smith’s writings helped conceive a more prominent role for government in managing the economy, not a smaller one. In the process, calculating tools and information management practices would be created to facilitate this control.

Preview to The Smith Effect II: Calculating Practices and the Rise of IT

Smith’s writings legitimized the state’s intervention in the economy by identifying the crucial role of managing populations for national prosperity. His contention that the natural order channels individual desires into positive collective outcomes helped to open up flows of capital in service of manufactures. It set in motion conversations that changed the connotations associated with the state and the economy.

A body of knowledge known as statistics emerged as a set of techniques for amassing data on a wide range of demographic matters important to the state and to the emergence of democratic governance. It provided new economic and social surveillance techniques and, with them, new ways to analyze and scrutinize the political economy. Consequently, these techniques and the indicators they enabled suggested ways to intervene in social activities to improve the conditions and productivity of national populations.

These techniques eventually found mechanical enhancement in the tabulating machine and the telegraph, which were used to collect data from wide geographical areas in service of the wealth of the nation. Herman Hollerith invented the census tabulating machine, used to count the US population in accordance with the specifications of the US Constitution. He went on to form companies that were merged into International Business Machines (IBM), the dominant information machine company of the mainframe era, whose personal computer (PC) later solidified the market for “microcomputers.”
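The kind of cross-tabulation Hollerith’s machine mechanized can be sketched in a few lines as a loose modern analogy (the “census records” below are entirely made up):

```python
from collections import Counter

# Each punched card recorded a few categorical fields about one person;
# tabulation is simply counting how often each combination of fields
# occurs across the whole population of cards.
cards = [
    ("NY", "farmer"), ("NY", "clerk"), ("TX", "farmer"),
    ("NY", "farmer"), ("TX", "rancher"),
]
tally = Counter(cards)
print(tally[("NY", "farmer")])  # 2
```

What took clerks years by hand becomes a single pass over the cards, which is precisely the advantage that made the 1890 census tractable.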

Notes

[1] Shapiro, M.J. (1993) Reading “Adam Smith”: Desire, History, and Value. p. xxix.
[2] Shapiro, M.J. (1992) Reading the Postmodern Polity: Political Theory as Textual Practice.
[3] Quote on Smith and his perspective on labor. Shapiro in Reading “Adam Smith”. p. xxxi.
[4] Three traditions that influenced the contempt for labor from Shoshana Zuboff’s In the Age of the Smart Machine: The Future of Work and Power. (1988) pp. 25-26.
[5] Smith regulative mechanism left behind by the “Author”. Shapiro, Reading “Adam Smith” p. xxxii.
[6] Shapiro on Smith’s “market”. Ibid, p. 79.
[7] Peter Manicas on equilibrium as the vehicle for political economy’s escape from history. A History and Philosophy of the Social Sciences. Oxford: Basil Blackwell. p. 40.

Citation APA (7th Edition)

Pennings, A.J. (2010, Aug 30). The Smith Effect: Markets, Governments, and the Rise of Information Technologies. apennings.com https://apennings.com/dystopian-economies/the-smith-effect-markets-and-bureaucracy/


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is a professor at the State University of New York (SUNY) in South Korea. Previously, he taught in the MBA program at St. Edward’s University in Austin, Texas and comparative and digital economies at New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand and was a Fellow at the East-West Center in Hawaii in the 1990s.
