Anthony J. Pennings, PhD

WRITINGS ON AI POLICY, DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL E-COMMERCE

Factors Establishing the Feasibility of Global E-Commerce

Posted on | October 19, 2010 | No Comments

Global e-commerce emerged out of a unique combination of economic, political, and technological circumstances during the 1990s. The following is a partial list of the influential events that led to its development.

  • The fall of the Berlin Wall and the collapse of the Soviet bloc meant the world was no longer sharply divided by Cold War antagonisms, and a pan-capitalist condition of free-flowing information and money began to spread globally.
  • The first Gulf War generated newfound optimism about the US capability to conduct conventional warfare and police “rogue nations” with “smart bombs” and spy satellites, regenerating America’s belief in high technology and the power of the microprocessor.
  • Digital monetarism’s global techno-financial regime, led ideologically by the “Washington Consensus,” continued to discipline countries around the world into modernizing their telecommunications and opening their political economies to the global flows of capital and news.
  • The Internet had emerged by first connecting universities and research centers and then opening up to commercial activities with a series of new “killer apps”: email, FTP (File Transfer Protocol), and Gopher.
  • Tim Berners-Lee, at Europe’s largest nuclear research center, proposed a communication system with hypertext features to facilitate communication among scientists, which he called the World Wide Web. This became the technical foundation for the “click” dynamics of the Internet.
  • A group of university students invented the multimedia browser that allowed users from a wide spectrum of society to “surf” the World Wide Web. They went on to form Netscape, whose Initial Public Offering (IPO) would start a major investment boom in tech companies, while the browser would provide the basic platform for conducting e-commerce.
  • The FCC had made distinctions between voice and data during the 1980s that allowed users to log long periods of Internet surfing without paying exorbitant phone bills, although it meant using the relatively slow 28.8 kbps and 56 kbps modems of the time.
  • A new proactive Democratic government in the US pushed forward new domestic telecommunications legislation in 1996 that spurred widespread interest in the sector, while its global activism helped solidify support for a “global information infrastructure” (GII).
  • The World Trade Organization (WTO) was formed in 1995 and quickly produced two agreements that liberalized telecommunications and trade in information technologies, as some 60 nation-states agreed to open up their markets.
  • Capital markets became enthused about “tech stocks” and poured money into the notorious “dot-coms” and other companies involved in the “new economy” of Internet-based electronic commerce, including telecommunications. Financial instability elsewhere added to the investment rush into the US, as shocks in the world financial system sent money from Asia and other distressed economies looking for a “safe haven” in the American stock markets.
  • IT, and particularly the Internet, became a “media event” in itself as television, magazines, newspapers, websites, and especially the new business channels such as Bloomberg, CNBC, and CNNfn constantly reported on the potential of the “new economy.”
  • Search engines made the information on the web much more accessible and began to offer more sophisticated yet easy-to-use methods for advertising.
  • Y2K software concerns sparked major interest in, and purchases of, new computer systems in order to avoid computer failures on January 1, 2000. As businesses gave up their legacy systems, they looked to integrate new systems that did more than provide “back-office” capabilities; IT was now expected to interface more directly with sales and customer service.

Of course, this list is incomplete, but it’s a start at examining one of the most important trends in human commerce and the future of the web.


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

Google: Monetizing the Automatrix

Posted on | October 18, 2010 | No Comments

Google recently announced its work on a driverless car, to mixed reviews. While the project was a technical success, with only one mishap in 140,000 miles of testing, many felt that Google was losing its focus. I think this latter view underestimates the Google strategy – to monetize the road.

As we move towards the “Automatrix,” the newly forming digital environment for transportation, Google looks to situate its search/advertising business at its center.

Let’s face it; the car is almost synonymous with shopping and consumerism. Whether going to the mall to buy some new shoes, picking up groceries, or going out to look for a new washing machine – the car transports both our bodies and our booty. Nothing in the fridge? Drive out to the nearest Applebee’s, Denny’s, or Olive Garden for some nachos and a Diet Coke. Got kids? Try the drive-thru for a Happy Meal or some Chuck E. Cheese’s pizza after a day at the water park. You get the point: have car, will spend. It’s American.

Google, which wants to organize the world’s information, clearly sees your car as a major generator of that data and the car’s occupants as major traffic generators – the good kind of traffic – on the web, not the road. They want the passenger to focus on the navigation, not the road. They want to provide destinations, stops, places to rest and refresh. The car will provide the movement while “the fingers do the walking,” to draw on a famous Yellow Pages ad. While AC Nielsen, famous for its ratings business, has championed the three-screen advertising measurement (TV, PC, mobile phone), you could say Google is going for a four-screen strategy: PC, mobile, TV, and now the dashboard. Talk about a captive audience! It has the potential to pay off big, adding billions more to Google’s bottom line by tying passengers to the web.

Can driving through downtown Newark, sitting at a light, or leaving a movie theater parking lot really compete with the latest user-generated video on YouTube? As you drive to the airport, wouldn’t you rather be making dinner reservations or checking out entertainment at your flight destination? No, Route 66 is going to be route66.com because, well, Pops Restaurant bought the ad word and you would rather be enjoying a Coke and a burger anyway.

Actually, I’m all for computers driving my car, as long as they are doing it for the other drivers as well. Yes, I enjoy the occasional thrill of driving and, probably more, the relaxing focus the activity demands. However, I prefer looking out the window, listening to music, and even reading a book. GPS has already rescued me from travel maps, as I now need reading glasses to see them anyway. Besides, the road is dangerous. It’s really scary passing that zigzagging car because the driver is zoning out in a conversation with his ex-wife or some teenager is texting the girl he has a crush on.

Sure, I have mixed feelings about sliding into the Automatrix. Taking over the steering wheel seems like a bit of a stretch, even for today’s Moore’s-Law-driven microprocessors. It will require a whole new framework for car safety testing. However, it has been over 40 years since computers guided the Apollo spacecraft to the Moon, so it makes sense to replace the haphazard meat grinder of a system we currently use.

Google, you can drive my car.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

How IT Came to Rule the World-The Information Standard and Other Sovereignties

Posted on | October 3, 2010 | No Comments

President Nixon’s decision to close the gold window in 1971 signaled a dramatic shift in US international financial policy and the future of the world political economy. The move largely meant the end of the containment of international finance set up at the end of World War II. No longer was the US constrained by the financial rules established at the United Nations Monetary and Financial Conference at Bretton Woods, New Hampshire. At that conference, known more commonly as Bretton Woods, leaders from the major Allied countries tied the US dollar to gold and major currencies around the world to the dollar. With the end of this de facto gold standard, a new type of global power emerged based on news flows, financial information, and computer and communications technologies.[1]

Empowered by new developments in computer services and data communications, currency markets turned electronic in the wake of Nixon’s decision. Banks throughout the world started to trade the US dollar, the British pound, the German mark, the French franc, the Russian ruble, the Italian lira, the Japanese yen, the Korean won, the Brazilian real, and others. Traders bought and sold currencies, betting on the direction of price movements that had previously been pegged under the Bretton Woods rules. Through new electronic conduits and financial news services, they monitored economic information, military movements, political crises, and weather forecasts, all for the purpose of making instantaneous decisions about the viability of a country’s money and other financial instruments.

This decision-making capability propelled financial traders, whom Tom Wolfe called “the Masters of the Universe” in his 1987 novel The Bonfire of the Vanities, into a major power. Through the vast funds they accumulated in their portfolios, they effectively began to discipline countries around the world through the force of their trading decisions. With the end of the gold standard’s discipline, a new power emerged based on the utility of global communications and information technologies, what Walter Wriston, the notorious and visionary Citibank CEO, called the “information standard” in his controversial book, The Twilight of Sovereignty.[2]

This global information standard became a sovereign power in itself. Nation-states and organizations were caught up in the opportunities and consequences of the new financial trading system that began to structure modern life along the dictates of a new techno-economic imperative. When Reuters offered international price information over data communication lines, it initiated the beginning of a global pan-optic market system that read and interpreted the world according to the regimen of electronic finance. This system expanded rapidly, globally, and comprehensively. It reached into the policies and practices of nearly every nation and organization, both private and public. Reuters caught a break when the Arab-Israeli War broke out in October of 1973, sending the newly freed currency markets into a frenzy and panicked dealers to their computer monitors. Reuters’ “Money Monitor Rates” became the news agency’s major source of revenue and a pioneer of the electronic marketplace.

While the mechanics of the information standard were based on its capability to develop virtual markets using the Reuters electronic news and trading system, the “energy” of the system was provided by the infusion of debt taken on by countries around the world. Ironically, it was the oil crises that created the surpluses of dollars that were lent to nation-states around the world. Addiction to oil drove the growth of “eurodollars” – US dollars held outside the country’s geographical boundaries – that led to that debt. Banks pressured countries around the world to take loans for a variety of projects. Growing national debt during the 1970s led to the so-called “Third World Debt Crisis” that blew up in the early 1980s and gave financial traders substantial leverage within this global system of discipline.

The information standard began disciplining the world political economy and its nation-states into adopting an agenda that included: 1) privatizing government assets and agencies while capitalizing domestic industries on newly electronic stock markets; 2) deregulating domestic industries and taking down barriers to flows of capital and investment; 3) reducing government expenditures and increasing taxes to pay off debt; and 4) disciplining labor forces into lower-cost workers or innovating entrepreneurs. This new global political economy combined a new “free enterprise” fundamentalism, led by Margaret Thatcher and Ronald Reagan, with a system of unprecedented capital mobility.

Empowered by the calculative and organizing powers of the spreadsheet, global finance targeted debt-ridden governments and began a process of “privatizing” public assets such as airlines, broadcasting, electricity, transportation, oil fields, and telecommunications by valuing assets, creating state-owned enterprises (SOEs), and then finally selling them off as corporate entities to global institutional investors such as pension and sovereign funds. Most significantly, government-owned telecommunications systems were sold off and listed on domestic and international share markets. These former PTTs (Post, Telephone and Telegraph administrations) eventually incorporated Internet Protocols (IP) and began opening up to the World Wide Web and its flows of capital, news, global e-commerce, and social media.

Public and private institutions began to succumb to the new logic of digital finance and a system of hyper-real representational strategies. Both types of organizations fell under the discipline of the financial markets, with the former particularly susceptible to bond and currency traders, while the latter came under constant surveillance by the stock markets and lenders. Central to this emerging regime of “digital monetarism” were the knowledge disciplines of accounting and finance, which congealed their techniques into a new tool, the electronic spreadsheet. The original “killer app” of the personal computer revolution, this versatile program enabled the widespread calculation of financial formulas and “what-if” scenarios, allowing the plotting of a wide variety of corporate acquisitions, initial public offerings (IPOs), leveraged buyouts (LBOs), and mergers.

The use of the electronic spreadsheet exploded after IBM introduced its own “Personal Computer” in August of 1981. Soon after, Lotus 1-2-3 became available for the “PC” and all the “IBM-compatible” clones such as Compaq and Dell. Lotus 1-2-3 was named for its spreadsheet, graphing, and database capabilities, which combined to produce an extraordinary new facility to both conceptually and textually organize financial information. Lotus 1-2-3 didn’t make the transition to the graphical user interface, however, and Microsoft’s Excel, originally developed for the Apple Macintosh, became the dominant spreadsheet application.

In a new era of spreadsheet capitalism, countries were forced to succumb to a new logic of numerical, graphical and textual representations – a realm of computerized hyper-mediated information organized around the techno-economic imperative. Money emerged as a leveraging factor and VisiCalc, Lotus 1-2-3 and Excel became new tools of control and discipline. Facilitated by high-speed network technologies and powerful trading workstations, the information standard began to subject organizations, both public and private, to a new international discipline. Combined with innovations in mathematical algorithms, global money morphed into a variety of financial instruments traded in electronic “dark pools” and on interconnected financial exchanges.
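To make the spreadsheet’s “what-if” logic concrete, here is a minimal sketch of the kind of scenario table an analyst might have built in VisiCalc, Lotus 1-2-3, or Excel – written in Python for readability, and with purely hypothetical figures – recomputing a debtor country’s annual debt service as the assumed interest rate changes.

    # A toy "what-if" table of the kind early spreadsheets made routine.
    # All figures are hypothetical; only the scenario logic is the point.

    def annual_debt_service(principal, rate, years):
        """Level annual payment on a loan (standard annuity formula)."""
        return principal * rate / (1 - (1 + rate) ** -years)

    principal = 10_000_000_000   # $10 billion in dollar-denominated loans (made up)
    years = 15

    print("Rate    Annual debt service (USD)")
    for rate in (0.06, 0.10, 0.14, 0.18):   # alternative interest-rate scenarios
        print(f"{rate:4.0%}    {annual_debt_service(principal, rate, years):,.0f}")

Recalculating a whole table like this took seconds on a personal computer, which is precisely what made leverage, debt service, and buyout scenarios so easy to model and compare.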

The world economy began to undergo what Harvey called a “time-space compression” due to new permissive technologies such as jet airplanes and IP-based telecommunications.[3] This has meant a shift from vertically organized firms to new networked organizations that privilege inter-organizational ties through such means as outsourcing and sub-contracting. Spatial and temporal dimensions of the economy are being reorganized by the need to reduce turnover times for flexible production and marketing strategies on a global scale. For example, coordinating the logistics of containerization, inventory control, and packaging needed to compete in the new marketplace requires contact with a wide array of competing services.

Notes

[1] A classic source on Bretton Woods was Moffit, M. (1983) The World’s Money. NY: Simon & Schuster, Inc.
[2] Wriston’s interpretation of the Information Standard in The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World was organized around a rhetoric of assurance, not a critical analysis. He argued that the power of multinational corporations, nation-state dictatorships, and any aggregation of power antithetical to democratic prospects would fall to the sovereign power of the information standard.
[3] Harvey, D. (1989) The Condition of Post-Modernity. Oxford: Basil Blackwell. p. 147.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World: Transformation of the Internet

Posted on | October 1, 2010 | No Comments

A remarkable transformation took shape in the few short years between the Internet’s 25th anniversary in 1994 and the end of the millennium. During these six years, the Internet transformed from a very novel but quaint system for sending ASCII email messages, transferring files, and linking homepages into a global system of electronic communications and commerce. What was once a tiny network linking a handful of university and research institute mainframes soon expanded to include small workstations, personal computers, laptops, and mobile wireless devices. The Internet was no longer confined to campuses, but could be accessed from homes, offices, cars, and even on the move.

The Promise of E-Commerce

What sparked this transition in such a short time? The Internet’s new uses were varied, but it was clear that its potential for global e-commerce was attracting investment capital of staggering proportions. The Clinton Administration pursued a set of national policies that supported the development of a “national information infrastructure,” wired schools, and the growth of e-commerce. These included a moratorium on Internet taxes, privatization of radio spectrum, the creation of a domain registration administration, and a new pro-competitive policy framework, the Telecommunications Act of 1996.

Global Flows of Cash

Consequently, venture capitalists and the financial markets began to dramatically fund technology and “dot-com” companies. Individual investors also got involved through mutual funds, workplace pension accounts, 401(k)s, and new online trading brokerages such as Ameritrade and E*Trade. Capital returning from the “emerging markets” of Asia and the former USSR during the international financial crises of 1997 and 1998 streamed into the coffers of Internet startups, leading to fierce competition in the race to develop the most effective and profitable e-commerce sites. Some called it “The New Economy.”

Companies such as Amazon.com, eBay, RealNetworks, and Yahoo, as well as associated “tech” stocks such as Cisco, Dell, and Intel, became the darlings of investors. Even America Online, with a somewhat ancillary history, was able to take advantage of the share markets to raise money to buy Netscape and Time-Warner and stake out a considerable presence on the World Wide Web. Billions of dollars poured into the development of new companies for business-to-consumer (B2C) and business-to-business (B2B) interchange. With this cash, they were able to integrate and use a wide variety of traditional computer, media, and telecommunications technologies such as legacy databases, cable networks, and satellite systems.

The Crash of 2000

By April 2000, the “dot-com” market started to dissipate. A combination of factors, including new reporting requirements from the US Securities and Exchange Commission (SEC), brought a new discipline to the “dot-coms” and other technology stocks. The SEC required all public companies (except foreign companies and companies with less than $10 million in assets and fewer than 500 shareholders) to file periodic reports, registration statements, and other forms electronically through EDGAR, its Electronic Data Gathering, Analysis, and Retrieval system.

The investment bubble collapsed with the preponderance of poor “earnings” reports, but for the most part, the technology and a substantial number of skilled employees emerged to continue the process of globalizing electronic commerce.

Citation APA (7th Edition)

Pennings, A.J. (2010, Oct 1) How IT Came to Rule the World: Transformation of the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/transformation-of-the-internet/


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD was on the New York University faculty from 2002-2012 teaching digital media, information systems management, and comparative political economy. He is currently a professor at the State University of New York in South Korea (SUNY Korea).

Apple, Silicon Valley and the Counter-Cultural Impulse

Posted on | September 24, 2010 | No Comments

Power to the People

A curious subculture had been developing around the use of computers during the post-Vietnam War years of the 1970s. Working off the counter-cultural energy of the 1960s, it challenged “the establishment,” the political-corporate interests that had entangled America in the Indochina war. Then, in 1974, Ted Nelson, the originator of the hypertext concept, wrote a book called Computer Lib about the importance of decentralizing these machines. Subsequently, this book became a manifesto urging people to claim the power of computers for themselves and not leave it in the hands of the military and corporate elite.

Nelson’s agenda did not go unnoticed by the youth of Silicon Valley, particularly the future founders of Apple Computers, Steve Jobs and Steve Wozniak. Coming of age in the early 1970s, they were heavily influenced by the emerging currents of music, poetry, and political philosophy. Sentiments against the Vietnam War and “The Establishment” were strong at the time and contributed to their view that the individual should be empowered by technological means. In 1976, they formed Apple and tapped the swelling counter-cultural rejection of the militarization and corporatization of computing.

From the crumbs of MAD (Mutually Assured Destruction) and the “Space Race,” these two impertinent young men started a small company that changed the face of computing. Steve Jobs and Steve Wozniak were not from Jackson, Tennessee; Albany, Georgia; or Burlington, Vermont; but from a suburban complex of industries and small towns amalgamated by the arms and space races. Both grew up in a culture inundated with electronics – Silicon Valley. They had ready access to computer clubs, electronic surplus stores, trade shows, technical libraries, and an economic system of electronic production.

They did start Apple Computers in their garage. Still, the car displaced from that garage drove through streets of technologically sophisticated suburbs bred and nurtured by the demands of the Cold War. Within this electronics-rich environment, they concocted a relatively primitive computer from readily available equipment in the Silicon Valley area. The first Apple computer was extraordinary for its size, price, and basic microprocessing abilities, which together promised a future of individual empowerment through computing power.

Long before Vice-President Al Gore’s “e-rate” and National Information Infrastructure (NII) would democratize computer access for schoolchildren, Woz would have access to a PDP-8 computer in high school. When they started their business in Jobs’ garage, they had access to computer clubs, electronic surplus stores, corporate technical libraries, trade shows, and a culture of electronic engineering and production.

    “The Woz was not the only student in Silicon Valley with such a dream. Actually, in some ways, he was fairly typical. Many students at Homestead High had parents in the electronics industry. Having grown up with the new technology around them, these kids were not intimidated by it. They were accustomed to watching their parents mess around with oscilloscopes and soldering irons. Homestead High also had teachers who encouraged their students’ interest in technology.”[1]


Apple was an offspring of the military-industrial complex, but it was rebellious. Wozniak and Jobs achieved “hacker” status early on because of their Blue-Box business. This device allowed them to make unlimited long-distance telephone calls at a time when such calls were actually expensive. Moreover, the telephone system was arguably the largest electrical machine in the world. Dominated by AT&T, the network had increasingly been attacked by “phone phreaks” who used various gadgets to mimic the signaling tones used in the telephone system to route calls.

As Jobs told it, they created their Blue-Box after finding an AT&T technical journal at the Stanford Linear Accelerator, a massive device that assists basic research in nuclear physics. In one of the most famous cases, Wozniak, who would later invent the original Apple computer, called the Vatican in Rome to talk to the Pope. Impersonating Henry Kissinger, he failed to get through to the Pope, but he tells the story gleefully, and it stands as an interesting example of the impertinence and resourcefulness of this young subculture in trying to use technology for its counter-cultural purposes.[2]

The Altair microcomputer caught the attention of the Homebrew Computer Club, an informal group that met at Stanford University during 1974 and 1975, the last years of the Vietnam War. This club was a strange brew of well-educated hobbyists and hippies living amid the military-industrial complex while heavily influenced by the counterculture movement. Members of this group once got the Altair to play music by placing a transistor radio next to the microcomputer to pick up the radio-frequency interference emitted by its circuitry. They actually programmed it to play the Beatles’ “Fool on the Hill.”

Two regular members of the Homebrew Computer Club were recent high school graduates who worked nearby in the computer industry. Steve Jobs and Steve Wozniak also viewed the Altair, but they began to receive a lot of attention when they brought in their own microcomputer. The scene is dramatized in the movie Pirates of Silicon Valley. As Jobs says, “I never had a problem with the Altair, until I tried to use it.” Of course, it was the Altair that inspired Bill Gates to leave Harvard and get into the software business.

On April 1, 1976, they started Apple Computers. The name reportedly came from Jobs’ belief that the apple was the perfect fruit: it has high nutritional value, tastes good, and comes in an attractive, protective package.[3] But Jobs’ genuine concern was a type of spiritual nutrition that he thought was lacking in modern society and certainly in the dominant computer industry. His dream was to sell “the first real packaged computer” that could empower and help enlighten the individual.

Jobs and Wozniak decided not to go with the 8080 chip that was fascinating to the Homebrew crowd and chose instead to go with a variation of a Motorola chip. Motorola was one of the original licensees of Bell Labs’ transistor and continued to play an important role in chip design and production. Its 6800 family of microchips, in particular, had a substantial impact on the emergence of the microcomputer. Several of its employees left to form a company called MOS Technology that released an imitative chip called the 6502. MOS was soon sold to Commodore, which used the 6502 in its well-regarded computers. It was the 6502 that became a workhorse for the Acorn, Atari, and Commodore microcomputers. Perhaps most importantly, it was readily and cheaply available by 1976 and used by a new company called Apple Computers.[4]

Silicon Valley was not really interested in microcomputers. Instead, its companies saw their markets as military systems and other industrial products, such as manufacturing sensors. The two kids from Apple, however, managed to find enough venture capital to produce a small number of computers. Jobs, a fruitarian at the time, went to Arthur Rock, a venture capitalist, for money to get the company off the ground. Rock had originally funded Intel and referred them to a recent Intel retiree, Mike Markkula, who became an important part of Apple’s management team. With the new funding, they contracted with a local company to build 1,000 computers. The Apple II was launched at the West Coast Computer Faire in 1977. It was a big hit, and they signed up distributors. Initially, the market was hobbyists, but Apple soon began to market to education and business.

It was Apple Computers that first implemented the smaller 5-¼ inch disk drive. Wozniak redesigned the controller chips for the disk drive into a more elegant configuration, reducing the number needed from around fifty to five and considerably reducing the space required for chips in the Apple II. Shugart Associates was a leader in developing the 5-¼ inch floppy that it sold to Apple. The new floppy drive system allowed many third-party software developers to produce and market software that could be easily installed and used on the microcomputer.[6] Apple sold the new device, which could hold 113 Kbytes of information, for $495. The drives were important not just for storing data; they would prove crucial for distributing software applications.

While Woz earned his title as the “Mozart of digital design” through his design of the Apple II, Jobs helped conceive of the computer as a democratizing tool with the motto “One person – one computer.”[5] The microcomputer was sold as a tool that would balance the unequal relationship between institutions and the individual. It would empower individuals and allow their inner artist to emerge. The Apple II became the darling of the counter-cultural crowd. It would remain a symbol of resistance against the corporate forces of IBM and, later, the predatory practices of Microsoft.

The message here is not that computer creativity could only emerge in the midst of the military-industrial complex, but rather that context matters. Enabling infrastructures matter. Later, the Internet would spread similar opportunities to young people through PCs and dial-up connections to ISPs, and through webpages that could be easily built with HTML, GIFs, and Adobe Photoshop JPEGs.

Notes

[1] Quote about Steve Wozniak “Woz” from a chapter called “The Prankster” in Freiberger, P. and Swaine, M. (2000) Fire in the Valley: The Making of the Personal Computer. Second Edition. NY: McGraw-Hill. p. 255.
[2] Triumph of the Nerds video.
[3] The Apple name from http://apple2history.org/history/ah02.html accessed July 6, 2005.
[4] Useful information on Zilog and Motorola chips came from George Gilder’s Microcosm. p. 108-112.
[5] Wozniak as the Mozart of digital design from Triumph of the Nerds video, Part One.
[6] Apple’s new floppy disk proved decisive. From Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 266-267.

Citation APA (7th Edition)

Pennings, A.J. (2010, Sept 11). Apple, Silicon Valley, and the Counter-Cultural Impulse. apennings.com https://apennings.com/dystopian-economies/apple-and-the-counter-cultural-impulse/


© ALL RIGHTS RESERVED


Anthony J. Pennings, PhD has been on the NYU faculty since 2001 teaching digital media, information systems management, and global political economy.

How IT Came to Rule the World, 2.7: The Origins of Microsoft

Posted on | September 22, 2010 | No Comments

The Intel 8080, the Altair, and the Formation of Microsoft

As kids, Bill Gates and Paul Allen dreamed of having their own Fortune 500 company.[1] The two became friends (and sometimes adversaries) when both attended the prestigious Lakeside School in Seattle in the early 1970s. But their friendship was mediated by a third entity, a Teletype ASR-33 connected remotely to a computer. As video terminals were rare and expensive, this popular teleprinter was often recruited to provide an interface to computers. Lakeside used the ASR-33 to connect to a timeshare service offered by General Electric (GE) in the Seattle area.

However, the service, based on sharing the resources of a PDP-10, soon proved to be expensive. Despite an infusion of $3,000 into the Lakeside computer account by the Lakeside Mothers Club, the boys (Lakeside was an all-male school) soon ran out of computer time. But luckily, the mother of one Lakeside student was a co-founder of a new company called the Computer Center Corporation, which offered students computer time on its PDP-10 in exchange for helping debug its software.

Allen and Gates became quite proficient with the machine, and even after the project ended, they continued to “hack” into it. Although they were eventually caught, they gained a notorious but beneficial reputation for their hacking. Rather than continue hacking, however, they went into business, forming the Lakeside Programmers Group with a few other students.[2]

The Lakeside Programmers Group primarily traded programming services for computer time, but it laid the foundation for their next enterprise, Traf-O-Data. This new company was formed in 1972 to sell computerized traffic-analysis systems to municipalities. They planned to string rubber cables across roads and highways and use a microprocessor to develop statistics on traffic flow.

Their technology was based on the Intel 8008 chip, which had enamored Allen, who subsequently seduced Gates into helping him develop a BASIC interpreter for it. By this time, Allen was enrolled at Washington State University, and his dorm became their headquarters. The 8008 was the first 8-bit microprocessor, but working with it was a bit awkward, so they used an IBM System/360 on campus to simulate the 8008.

At the same time, both were hired by defense contractor TRW to develop a simulator for the 8008 chip.[3] The two labored and eventually came up with a workable system for the car counter, but it had difficulties with its paper-based printer. Traf-O-Data was not very profitable, as it soon found itself competing against free services from the State of Washington. The final Traf-O-Data product was not really a computer, but it provided valuable experience for the two young programmers.[4]

After Gates’ graduation, the two were in Boston, where they made a bold move towards their vision of creating a major computer company. In the winter of 1975, Paul Allen, who was now working for Honeywell, picked up the January issue of Popular Electronics at Harvard Square in Cambridge. Excitedly, he took it to the dorm room of his friend Bill Gates, who was enrolled at nearby Harvard University.

The magazine issue sparked their quest to enter and shape the new computer era. They saw their opportunity to leverage their experience with BASIC and gain a foothold in the emerging microcomputer industry. The magazine showed a low-cost microcomputer built around the Intel 8080 chip that was in desperate need of a programming language. The Altair, as it was called, was actually a kit that had to be assembled by the purchaser. It was marketed by a company called MITS (Micro Instrumentation and Telemetry Systems).

Gates and Allen had been following the development of the 8080 chip. Using Traf-O-Data stationery as their institutional identity, they contacted the developer of the MITS machine to offer their services. The two had some experience with Intel chips from their Traf-O-Data days, so they wrote a simulation of the Altair’s 8080 chip on a Harvard PDP-10 in order to build a version of BASIC that would run on the Altair. The key was their questions about how to use a Teletype machine to read and input data; this call convinced the MITS people that they were serious. Just a few months later, Allen flew to New Mexico to present their software. The memory in the Altair was so small that they were unsure their version of BASIC would work. But after a day’s delay and some last-minute software adjustments by Allen, the software worked, and the demonstration was a success. It even allowed them to play a game of Lunar Lander on the Teletype printer.
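Their approach – writing and testing software against a simulated chip before ever touching the real hardware – is easy to picture with a toy example. The Python fragment below is not their simulator (theirs ran on a PDP-10 and modeled the full 8080 instruction set); it is a hypothetical miniature that interprets four real 8080 opcodes just to show the fetch-decode-execute idea.

    # Toy illustration of simulating a processor in software, in the spirit of
    # the PDP-10 simulation described above. Handles only four real 8080 opcodes.

    def run_8080_fragment(program):
        a, b, pc = 0, 0, 0           # accumulator, B register, program counter
        memory = list(program)
        while True:
            opcode = memory[pc]
            if opcode == 0x3E:       # MVI A, d8 : load the next byte into A
                a = memory[pc + 1]
                pc += 2
            elif opcode == 0x06:     # MVI B, d8 : load the next byte into B
                b = memory[pc + 1]
                pc += 2
            elif opcode == 0x80:     # ADD B : A = A + B, with 8-bit wraparound
                a = (a + b) & 0xFF
                pc += 1
            elif opcode == 0x76:     # HLT : stop and return the accumulator
                return a
            else:
                raise ValueError(f"opcode {opcode:#04x} not handled in this toy")

    # MVI A,2 ; MVI B,3 ; ADD B ; HLT  ->  prints 5
    print(run_8080_fragment([0x3E, 0x02, 0x06, 0x03, 0x80, 0x76]))

A full simulator of this kind let Gates and Allen run and debug their BASIC interpreter long before an Altair ever arrived in Cambridge.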

Allen was offered a job on the spot and became Vice-President of Software at MITS. Gates flew down in the summer and helped out while they worked on their new company at night. On July 23, 1975, the two companies signed a contract giving MITS exclusive rights to their BASIC, with royalties going to their new company, Micro-Soft. The relationship between MITS and Micro-Soft soon soured. Micro-Soft was not making much money from the deal. MITS sold BASIC for $75 with the kit while charging $500 for it separately. It became extremely attractive for Altair users to trade bootleg copies of BASIC rather than buy it from MITS. Although the Altair depended on BASIC to do anything useful, MITS saw its business as selling the hardware, not the software. Consequently, marketing the software was not a priority.

On February 3, 1976, Gates sent his infamous letter accusing most Altair owners of stealing the BASIC software by duplicating the paper tape. He claimed that only 10% of them had bought BASIC. Soon Gates got his father, a successful lawyer in Seattle, involved in a lawsuit to get BASIC back from Pertec, the company that had bought MITS in the meantime.

The year 1977 was a decent one for Micro-Soft, with $381,715 in revenues.[5] They got BASIC back and, in August, began negotiations with Apple to license the programming language for $21,000. Micro-Soft produced versions of BASIC for new processors and machines as they came out, including the 6502 that Wozniak was using for the Apple II and Radio Shack’s TRS-80. The latter’s marketing capabilities made an extraordinary impact on the popularity of the microcomputer, and Micro-Soft’s BASIC was on most of them. When the Commodore PET was designed, Micro-Soft also got the call to provide its BASIC.

Allen, Gates, and crew worked on the Apple project despite the agreement that royalties would not be forthcoming until it shipped the next year. Ironically, it was support from Apple that provided a basic level of financial backing for Microsoft. Although Wozniak had designed a BASIC early on for the Apple II, it was not the “floating point” version that many users were requesting. Micro-Soft soon developed a version of BASIC for the Apple II and received a check for $10,500 as the initial part of its 10-year license fee. It was one of the rare times that Gates allowed software to be licensed on a flat-fee basis, rather than requiring a royalty payment on every copy sold.[6] The new version, called Applesoft BASIC, was released in November 1977 and was improved and rereleased the following year.

Micro-Soft was very important to Apple in its early days. The Apple II’s bus architecture made expansion possible, and Micro-Soft came up with the SoftCard to allow the Apple computer to run CP/M. But it was BASIC that was crucial to the success of the Apple II, and Steve Jobs later encouraged Microsoft to create a version for the Macintosh. Unfortunately, this project was a disaster, so Gates strong-armed Jobs into accepting it or losing the license for the Apple II’s BASIC. Since Apple Computers was still highly dependent on sales of the Apple II, and BASIC was a strong complementary component of the microcomputer, Jobs killed the in-house MacBasic project and used the Microsoft version, at least until the Apple II was discontinued.[7]

In 1978, Gates agreed to move the company back to their hometown of Seattle and change the name to Microsoft. Allen, in particular, had grown tired of the desert and convinced Gates to move the company. The three years in Albuquerque were very challenging, as the small company (16 employees in 1978) worked day and night keeping up with the new computers and chips coming onto the market. CP/M was becoming the most popular cross-platform operating system due to Kildall’s BIOS, which allowed the OS to be quickly adapted for new machines.

In January of 1979, they moved to Bellevue, Washington, and started focusing on Intel-based 8086 16-bit machines. By March 1979, the newly renamed Microsoft had 48 OEM (Original Equipment Manufacturer) customers for its 8080 BASIC programming language, 29 for FORTRAN, and 12 for COBOL. During that summer, they developed Pascal and APL languages as well. The Intel-based microcomputer industry was starting to take off, and Gates and Allen had positioned themselves in the center of it.

Notes

[1] Fortune 500 dream story told by Paul Allen in an interview on Robert X. Cringely’s Triumph of the Nerds.
[2] This information on Allen and Gates’ early years was from Laura Rich’s The Accidental Zillionaire. John Wiley & Sons, Inc.
[3] Allen and Gates became employees of TRW during Gates’ senior year, with Gates earning a salary of some $30,000.
[4] Traf-O-Data and TRW employment information from Fire in the Valley, pp. 30-32.
[5] Micro-Soft’s 1977 revenues from Laura Rich’s The Accidental Zillionaire.
[6] Licensing of Microsoft’s BASIC to Apple from apple2history.com. Accessed September 19, 2003.
[7] Microsoft’s early reliance on Apple from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 265.


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World, 2.6: The PC and the Floppy Disk

Posted on | September 21, 2010 | No Comments

The development of the floppy disk drive was a crucial factor in the success of the personal computer. Originally developed by David Noble at IBM in 1971 as a backup storage mechanism for the System/360’s magnetic core memories, the floppy disk was soon put to use for other purposes. Alan Shugart, an employee at IBM, saw the implications of the new device for smaller computers. He realized that the floppy disk could provide a storage device that was faster, offered random access (meaning that one did not have to rewind an entire tape to find desired information), and could be portable. He started a company called Shugart Associates to build and market the drives. However, for the microcomputer to be effective, it would need to combine the power of the microprocessor with the new storage mechanism, and that required a new software package to run the disk – what IBM had already called a “Disk Operating System.”[1]

It was Gary Kildall who pioneered the first effective disk operating system for microcomputers. Kildall earned his doctoral degree through the military’s ROTC program and so had a choice of going to Vietnam or teaching computer science at the Naval Postgraduate School in Monterey, California. While teaching, he was also hired as a consultant by Intel to write software for its microprocessors, including a compiler later called PL/M. The compiler would allow programs written in languages like FORTRAN on larger computers to be used with the microprocessor. It was used on Intel’s Intellec-4 and Intellec-8, small computers that could lay some claim to the title of the first microcomputers but were never marketed as such. Later, Kildall began emulating Intel’s new 8080 microprocessor on an IBM computer (Microsoft founders Gates and Allen would soon do something similar to write software for the Altair) and developed a new version of PL/M for it. He also decided at that time to write a program to control the mainframe’s disk drive. Using commands developed by DEC to access data from its DECtape, he began to write the code for the new operating system. DEC’s OS/8 and later its RT-11 had been important developments for the PDP minicomputer series and showed that smaller computers could compete with the mainframes.[2] Pulling the pieces together, Kildall created his new operating system, called CP/M, short for Control Program/Monitor.

Intel didn’t really want Kildall’s OS, but the software soon became the standard for a number of new microcomputers. CP/M was announced as a commercial product in April 1976. Kildall soon quit teaching to form a new company with his wife called Intergalactic Digital Research (later just Digital Research) to market the operating system. CP/M was soon used by a range of small computers, including the Osborne, the first portable microcomputer, and the Kaypro.

As CP/M became the standard for microcomputer operating systems, it inspired imitation. It was the foundation for another important operating system called QDOS (Quick and Dirty Operating System), which was bought by a small company called Microsoft as the basis for its own microcomputer operating system. MS-DOS became the software foundation of Microsoft’s empire and of the successful early run of the IBM Personal Computer.[3]
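Part of what made CP/M so easy to move to new machines was the way it separated generic operating-system logic from machine-specific input/output (the layer CP/M called the BIOS). The Python sketch below is a loose analogy rather than CP/M’s actual structure: the “operating system” function talks only to an abstract disk interface, so adapting it to new hardware means rewriting just the small hardware-specific class.

    # A loose analogy (not CP/M's real code): generic OS logic reaches hardware
    # only through a small interface, so porting means swapping out one class.

    class DiskInterface:
        """Machine-specific layer, analogous in spirit to CP/M's BIOS."""
        def read_sector(self, track: int, sector: int) -> bytes:
            raise NotImplementedError

    class HypotheticalDrive(DiskInterface):
        """Stand-in for one machine's floppy controller: 40 tracks x 16 sectors."""
        def __init__(self):
            self.media = {(t, s): bytes(128) for t in range(40) for s in range(16)}

        def read_sector(self, track, sector):
            return self.media[(track, sector)]

    def load_file(disk: DiskInterface, sector_list):
        """Hardware-independent 'OS' logic, reused unchanged on every machine."""
        return b"".join(disk.read_sector(t, s) for t, s in sector_list)

    print(len(load_file(HypotheticalDrive(), [(0, 0), (0, 1)])))   # 256 bytes

The same division of labor is why CP/M, and later MS-DOS, could spread across so many otherwise incompatible machines: only the thin hardware layer had to be rewritten for each one.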

Notes

[1] The beginnings of the floppy disk from Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press. p. 236-237.
[2] Paul E. Ceruzzi’s A History of Modern Computing. Second Edition. Cambridge, MA: MIT Press was the source for Kildall’s development of CP/M from DEC sources. p. 238.
[3] Gary Kildall’s early story also from Robert X. Cringely’s Accidental Empires. New York: HarperBusiness, A division of HarperCollins Publishers. pp. 55-59.


© ALL RIGHTS RESERVED

Anthony J. Pennings, PhD is Professor of Global Media at Hannam University in South Korea. Previously, he taught at St. Edwards University in Austin, Texas, and was on the faculty of New York University from 2002-2012. He also taught at Victoria University in Wellington, New Zealand, and was a Fellow at the East-West Center in Hawaii in the 1990s.

How IT Came to Rule the World, 2.5: Intel and the PC

Posted on | September 14, 2010 | No Comments

The computers used for the Moon landing were already out of date when Neil Armstrong walked on the lunar surface in 1969, but the decision to use integrated circuits or “chips” by the Apollo project paved the way for the microprocessor revolution and one of its main offspring, the personal computer.

NASA decided early on to standardize its Apollo flight technology on integrated circuits (ICs) and nurtured them into reliable, relatively high-performance digital processors. Reliability, cost, and the ease of manufacturing these “chips” had been sufficiently subsidized by the space program (and by the missile programs of the MAD, or “Mutually Assured Destruction,” strategy) to the point where integrated circuits and their next stage, the microprocessor, could be used in business-related computers. In what would become a common geek term, the PC was the “killer app” for the microchip.

While Jack Kilby at Texas Instruments is generally credited as the first to construct an integrated circuit, his contemporaries at Fairchild conceived of a production process that could mass-produce the small chips.

While Kilby’s ICs required their combination of transistors, resistors, and capacitors to be connected with gold wires and assembled manually, Robert Noyce and others at Fairchild were developing a literal printing process to construct the ICs. Their “planar process” printed thin metal lines on top of an insulating silicon oxide layer to connect the integral components of the chip. At first they could only connect a few components, but as they refined their “photolithography” method, hundreds, then thousands of connections could be made. By the time the Internet became a household word in the 1990s, millions of transistors were being placed on a single chip.

In 1968, with the Apollo project well established, integrated circuit co-inventor Robert Noyce and fellow Fairchild “traitor” Gordon Moore left the company to form a new semiconductor company. What emerged was Silicon Valley stalwart Intel, the future producer of some of the industry’s most sophisticated microprocessors and the latter half of the infamous “Wintel” combination (Windows OS/Intel microprocessor) that would dominate PC sales throughout the rest of the century.

After twenty years of government backing, the microprocessing industry was about to crawl out on its own. And it was the microcomputer that would give the semiconductor industry the legs to become viable in the commercial arena.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
