Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

CISCO SYSTEMS: FROM CAMPUS TO THE WORLD’S MOST VALUABLE COMPANY, PART THREE: PUSHING TCP/IP

Posted on April 20, 2017

Len Bosack and Sandy Lerner combined several technologies being developed at Stanford University and in the Silicon Valley area to form the networking behemoth Cisco Systems. However, success was by no means foreseen in the early years. The turning point came in 1985, when the pair obtained access to Bill Yeager's source code for the multiple-protocol "Blue Box" router. Yeager's software became the foundation for the Cisco router operating system and a major stimulant for the adoption of the Transmission Control Protocol and Internet Protocol (TCP/IP) in data communications systems around the world.

In 1980, Yeager became responsible for networking computers at the Stanford Medical School. Devices were proliferating on campus, including the DEC10, PDP11/05, and VAX systems, and especially Xerox machines such as PARC Lisp machines and Altos file servers and printers. The Xerox influence was substantial, as the project started with routing the PARC Universal Packet (PUP) for the Xerox PARC systems and mainframes. The router was later configured to include IP addresses for the new VAX 750s and Xerox's own proprietary protocol suite, Xerox Network Systems (XNS).

After some controversy, the Stanford Office of Technology Licensing eventually decided to allow Cisco to use the technology. The venerable institution ultimately decided it would try "to make the best of a bad situation," and on April 15, 1987, it licensed the router software and two computer boards to Cisco for $19,300 in cash, royalties of $150,000, and product discounts. It refused equity in Cisco Systems as "a matter of policy."[1] The agreement named Yeager as the principal developer of the source code, making him one of the unsung heroes of the Internet age.

The couple initially ran the company from their home at 199 Oak Grove Avenue in Atherton, CA. Using their credit cards for startup capital, they set up an office and started assembling routers in their living room. Self-financed, they even installed a $5,000 mainframe computer in their garage. They needed the big computer to stay on the ARPANET and take orders for their network equipment, making them early e-commerce innovators.[2] Bosack focused on technology and Lerner on marketing. Lerner's ad meme "IP Everywhere" was ahead of its time.

The early years were tough. Venture capital was difficult to acquire, and the couple reportedly made over 70 visits to potential investors to find money for their company. At that time, only semiconductor companies were being funded, and the Internet was barely known outside of academia. Finally, Sequoia Capital stepped in for a considerable percentage of the business. Don Valentine had passed on Steve Jobs and Apple Computer and had consequently developed a more open mind toward new innovations. Sequoia put in $2.5 million for one-third of the company and the right to make major management decisions.

By the end of 1986, the company was growing rapidly. Although it took two years to get out of the garage, the computer and communications revolution was taking off. PCs were becoming commonplace and, even more important, becoming networked. TCP/IP was also gaining traction. The Department of Defense mandated its use on the ARPANET and funded projects that coded TCP/IP implementations for IBM machines and operating systems such as UNIX. The emerging NSFNET also required TCP/IP protocols and compliant networking technologies. By mid-1985, almost 2,000 computers hosted TCP/IP technology.

Despite the growing enthusiasm for TCP/IP in the military, academic, and research institute sphere, the major manufacturers of computer communications equipment were focused on the OSI model and believed market forces would eventually move in its favor. However, in 1986, advocates of TCP/IP took action to improve and promote their protocols. The Internet Activities Board (IAB) implemented a two-part strategy. The first part was to encourage more participation in TCP/IP standards development, which resulted in the May 1986 publication of RFC 985, "Requirements for Internet Gateways," and other recommendations. The second was to inform equipment vendors about the features and advantages of TCP/IP. This involved organizing several vendor conferences, including the "TCP/IP Vendors Workshop" on August 25-27, 1986, and the "TCP/IP Interoperability Conference" in March 1987.

While some vendors were disappointed that no certification and testing process was forthcoming, the conferences allowed the advocates of TCP/IP to give equipment manufacturers some guidance on incorporating the protocols. Dan Lynch led the organization of these conferences, which in 1988 came under the heading of INTEROP. As the industry took shape with innovations like the Simple Network Management Protocol (SNMP), introduced at INTEROP '88, Cisco was one of its main beneficiaries.

In 1986, Cisco introduced its first remote access server, the F Gateway Server (FGS), as well as its first commercial multi-protocol network router, the Advanced Gateway Server (AGS). The "Massbus-Ethernet Interface Subsystem" was an interface card made for DEC computers that could bridge local area networks running different protocols, such as TCP/IP and PUP; the highest line rate on the system was 100 Mbps FDDI. The AGS was capable of connecting multiple LAN and WAN networks. It offered a throughput of 10 Mbps, could process 300 packets a second, supported 200 routing tables, and cost approximately $5,550. The AGS was quickly adopted in networks such as CSUNet at Colorado State University. Soon, universities all around the country were calling and emailing Cisco about its equipment.

In November 1986, Cisco moved to its first office at 1360 Willow Road, Menlo Park, CA. Revenues had reached $250,000 a month. By May of 1988, sales had doubled, and just three months later, they doubled again. By 1989, Cisco had three products and 115 employees and reported revenues of $27 million.[2]

Notes

[1] Information on Cisco's origins is relatively scarce and dominated by Cisco public relations. Tom Rindfleisch of Stanford University and Bill Yeager, a Senior Staff Engineer at Sun Microsystems, Inc., present a larger story at http://smi-web.stanford.edu/people/tcr/tcr-cisco.html.
[2] Information on Cisco's habitats from Segaller, S. (1999) Nerds 2.0.1: A Brief History of the Internet. New York: TV Books. pp. 240-247.





Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Lotus 1-2-3 – A Star is Born

Posted on April 7, 2017

In November 1982, on the giant floor of the Comdex show in Las Vegas, Lotus 1-2-3 first made its mark. While VisiCalc for the Apple II had shown the viability of both digital spreadsheets and the new "microcomputers," Lotus 1-2-3 showed that spreadsheets would become indispensable for modern organizations and global digital finance.

Initially, Jonathan Sachs was worried. He had spent nearly a year working with Mitch Kapor and his small startup company, Lotus Development Corporation, on a new spreadsheet program for the recently released IBM PC. A former programmer at Data General Corp., he was apprehensive about the prospects of his new creation. As one account put it, "It was too difficult for its intended audience, he thought, and would scare users away. With questions in his mind and an updated resume in hand, Sachs took his baby to the crowds on the Las Vegas show floor."

Lotus had begun to publicize its new spreadsheet nearly half a year before the Comdex show, and a Wall Street Journal article just before the event was particularly helpful. The new application proved nearly as popular as Apple's debut at the West Coast Computer Faire. Lotus 1-2-3 turned out to be a big hit with the swarming exhibit crowd.

“By the time the workers started to tear down the exhibit stalls at the end of the Comdex show, Lotus had taken about $3 million in orders based on the demo alone. Little did Sachs know his creation would change the computer industry forever.”[1] Nor did he know it would change the world of finance.

But that was just the start of their success. Soon Lotus 1-2-3 would top the sales lists for all computer software. After the new electronic spreadsheet was officially released on January 26, 1983, the company logged sales of some 60,000 copies in its first month. Because software can be reproduced quickly once it is developed, they were able, barely, to keep up with demand. Within a few months, 1-2-3 was heading distributor Softsel Computer Products Inc.'s best-seller list, where it would stay for the next two years. Lotus 1-2-3 would go on to become the best-selling computer software application of the 1980s.

Lotus 1-2-3's success was based on a combination of great programming skills and market savvy. Sachs and Kapor had decided to create a spreadsheet that would better the functionality and speed of VisiCalc, the tremendously popular application designed for the Apple II. Kapor had created graphics capabilities for VisiCalc, while Sachs had written software for minicomputers at Data General. Because of his experience with the popular market, Kapor called the marketing shots, while Sachs, who had the deeper programming experience, worked to realize Kapor's conceptual software ideas.

With startup funds from Ben Rosen, who had left Morgan Stanley to become a venture capitalist, the small firm began to work on a new product for the business market. Together they decided to create a new microcomputer spreadsheet using a faster programming language than most of the competition, such as Context MBA and SuperCalc. Their target was a software application for the newly released IBM PC market.

Instead of the easier Pascal programming language, Kapor and Sachs decided to use the more tedious assembly language to construct their new software package. Assembly worked closer to the machine language of the computer. That made it much harder to program, but it could provide a faster and more robust final spreadsheet package. They spent 10 months developing the application, which grew from a core product to the final 1-2-3 through a series of decisions to add various features.

Step by step, they built and tested the new prototype, at one point dropping a word processor module because it was too difficult. As Sachs said afterwards, "From a programmer's standpoint, it was a mind-boggling challenge to write that much code in assembly language that fast and get it to be really solid." The finished product, Lotus 1-2-3 Release 1 for DOS, which included a spreadsheet, graphing capabilities, database storage, and a macro language, required only 192K of RAM to run. Because it was written in assembly language, it was much faster than its major competitors, Context MBA, Multiplan, and VisiCalc, despite including the extra features.

Just as VisiCalc helped Apple's sales, Lotus 1-2-3's popularity helped IBM's sales take off. Launched in the late summer of 1981, the IBM PC faced stiff competition from the Apple II and a host of new computer manufacturers using the CP/M operating system. Although IBM had the name recognition, particularly in the business world, its PC still needed the kind of practical application that would justify its expense. Lotus 1-2-3 supplied the type of incentive that would put a PC "on top of every desk in the business world." Sachs reminisced on the impact of Lotus 1-2-3 on the IBM PC: "It was pretty amazing because a factor of five more PCs got sold once that software was available. It was very closely tuned to the original IBM PC and pretty much used all of its features."[2] Lotus could also run on "IBM-compatible" machines such as the simultaneously developed Compaq portable computer.

Both Sachs and Kapor left Lotus Development Corporation in the mid-1980s. The company's fast growth had created many problems that diminished their enthusiasm: supply shortages, labor problems, and disputes with distributors took their toll on the original cast. Sachs left in 1985 after the introduction of Release 2 for MS-DOS, which included add-in support, better memory management, more rows, and support for math coprocessors. Kapor left in 1987 after Release 2.01 was introduced.

But neither left before Lotus 1-2-3, with its combination of graphics, spreadsheets, and data management, caught the eye of business entrepreneurs and corporate executives who saw the value of a computer program that simplified the enormous volume of numerical calculation and manipulation needed by the modern corporation. By October 1985, CFO magazine was reporting that "droves of middle managers and most financial executives are crunching numbers with spreadsheet programs such as Lotus 1-2-3."[3]

Notes

[1] Information about Lotus at Comdex from "From the floor at Comdex/Fall in 1982, Lotus 1-2-3 hit the ground running and has not slowed down." CMP Media LLC. http://www.crn.com/sections/special/supplement/816/816p71_hof.asp. Accessed August 26, 2003.
[2] Quotes attributed to Sachs are taken from the same CMP Media article.
[3] Quote from CFO on the impact of Lotus 1-2-3 in the corporate world from David M. Katz, "The Taking of Lotus 1-2-3? Blame Microsoft." CFO.com, December 31, 2002.






That Remote Look: History of Sensing Satellites

Posted on March 27, 2017

We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win, and the others, too.

– President John F. Kennedy, September 12, 1962

During U.S. President Kennedy's speech at Rice University in Houston, where he dedicated the nearby new Manned Spacecraft Center, he stressed that the US would not only go to the Moon but would also "do the other things." He mentioned:

“Within these last 19 months at least 45 satellites have circled the Earth. Some 40 of them were made in the United States of America. Transit satellites are helping our ships at sea to steer a safer course. TIROS satellites have given us unprecedented warnings of hurricanes and storms, and will do the same for forest fires and icebergs.”

The TIROS-1 satellite was launched on April 1, 1960, from Cape Canaveral, Florida, and carried two TV cameras and two video recorders. The satellite was built primarily by RCA, a major TV and radio manufacturer. Short for Television Infrared Observation Satellite, TIROS weighed 122 kg and stayed up for only 78 days. Nevertheless, it showed the practicality of using electromagnetic sensing to view cloud formations and observe patterns for weather event prediction.

President Dwight Eisenhower had been secretly coordinating the space program as part of the Cold War since the early 1950s. He had become accustomed to the valuable photographic information obtained from spy planes. After his administration took office in early 1953, tensions with Communist countries increased rapidly, and once the USSR conducted successful atomic and hydrogen bomb tests, he considered satellites a crucial new Cold War technology.

Eisenhower's "New Look" policy identified aerospace as a decisive component of future US military strategy. The successful D-Day invasion of Europe, which he had managed as head of the Allied Forces during World War II, had been meticulously reconnoitered with low- and high-altitude photography from various reconnaissance aircraft. Given the growing nuclear capacity of the USSR, he particularly wanted satellites that could assess how rapidly the Communists were producing long-range bombers and where they were being stationed. In addition, as the Soviets began to deploy rocket technology siphoned from defeated Nazi Germany, locating and monitoring launchpads for nuclear ballistic missiles became important.

The top-secret Corona spy program was the first attempt to map the Earth from space with satellites. The Corona spacecraft were built by Lockheed for the CIA and Air Force and equipped with 70mm "Keyhole" cameras. These started with an imaging resolution of approximately 40 feet, enough to locate airfields and large rockets.

The downing of an American U-2 spy plane during a USSR overflight on May 1, 1960, accelerated the need for satellite-based surveillance. President Eisenhower had proposed an "open skies" plan at a 1955 summit conference in Geneva with England, France, and Russia that would allow each country to make flights over the others' sovereign territory to inspect launchpads capable of rocketing Intercontinental Ballistic Missiles (ICBMs) into space. Soviet leader Nikita Khrushchev had refused the proposal and ordered missiles to bring down the high-altitude US spy plane. Khrushchev took pleasure in displaying the wreckage for the international press and in the subsequent show trial of pilot Francis Gary Powers. The U-2 would show its value once again when it detected Soviet missiles in Cuba, but the new competition to conquer space would dramatically improve aerospace technology and the ability to see from space.

Like most of the early US attempts to achieve space flight, the first Keyhole-equipped satellites failed to achieve orbit or suffered other technical failures. The US had also obtained its rocket technology from the Nazis, and early adaptations such as the A-4 and Redstone rockets required much testing before reliable launches occurred. This knowledge was applied to the next-generation Thor-Agena rockets that were used as launch vehicles for Corona spy satellites from June 1959. By the late summer of 1960, a capsule containing the first Keyhole film stock was retrieved in mid-air by an Air Force cargo plane as it parachuted back down to Earth. Keyhole resolution improved to 10 feet by 1963 and to 5 feet by 1967.

The USSR, though, set a precedent for orbital overflight with its Sputnik satellites. While Eisenhower had sought at the Geneva Summit to assure the world of America's peaceful intentions in space, the Soviets launched an R-7 ICBM 100 km into space two years later carrying a payload the size of a beach ball called Sputnik. It is still a matter of speculation whether Eisenhower baited the USSR into going into orbital space first, but when the US and other countries around the world failed to protest the overflight of Sputnik, it set the legal precedent for satellites flying over other countries.

As the "Space Race" heated up during the mid-1960s, rocket capabilities improved and new applications were conceived. The Mercury and Gemini space capsules began to use innovative photographic technologies to capture Earth images. Weather satellites like TIROS-1 had been monitoring Earth's atmosphere since 1960, and the idea of sensing land and ocean terrains was being developed. Although the details of the spy satellites were highly classified, enough information about the possibilities of high-altitude sensing of Earth terrains circulated in the scientific community. In 1965, William Pecora, the director of the U.S. Geological Survey (USGS), proposed a satellite program to gather information about the natural resources of our planet. The idea of remote sensing was born, and the USGS would partner with NASA to take the lead.

NASA, the National Aeronautics and Space Administration, had been created in 1958 to engage the public's imagination and support for the civilian uses of spacecraft. The Apollo program was conceived as early as 1960 and would eventually reach the Moon. The program also sparked reflection, not just on reaching the apex of an extraordinary human journey, but on the origins of that trip. We went to the Moon, but we also discovered our home planet, what Buckminster Fuller called "Spaceship Earth."

History was made on Aug. 23, 1966, when the first photo of the Earth from the perspective of the Moon was transmitted by NASA's Lunar Orbiter I and received at the NASA tracking station at Robledo De Chavela near Madrid, Spain. The image was taken during the spacecraft's 16th orbit and was the first view of Earth captured by a spacecraft from the vicinity of the Moon. Lunar Orbiter was a series of five unmanned missions designed to help select landing sites for the Apollo program. In mapping the Moon's surface, they pioneered some of the earliest remote sensing techniques.

In 1966, the USGS and the Department of the Interior (DOI) began working together to produce their own Earth-observing satellite program. They faced many obstacles, including budget problems due to the increasing costs of the war in Vietnam. But they persevered, and on July 23, 1972, the Earth Resources Technology Satellite (ERTS) was launched. It was soon renamed Landsat 1, the first in a series of satellites launched to observe and study the Earth's land masses. It carried a system of cameras built for remote sensing by the Radio Corporation of America (RCA) called the Return Beam Vidicon (RBV). Three independent cameras sensed different spectral wavelengths to obtain visible and near-infrared (IR) photographic images of the Earth. RBV data was processed to 70 millimeter (mm) black-and-white film rolls by NASA's Goddard Space Flight Center and then analyzed and archived by the USGS Earth Resources Observation and Science (EROS) Center.

The second device on Landsat 1 was the Multispectral Scanner (MSS), built by the Hughes Aircraft Company. It provided radiometric images of the Earth through the ability to distinguish very slight differences in energy, and this type of instrument continues to be a major contributor to Earth-sensing data.
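To illustrate how such multispectral measurements are put to work, here is a minimal sketch in Python of the Normalized Difference Vegetation Index (NDVI), a standard calculation applied to red and near-infrared bands from Landsat-class sensors; the tiny arrays are hypothetical stand-ins for real scanner data:

    import numpy as np

    # Hypothetical reflectance values for two spectral bands, standing in
    # for the red and near-infrared (NIR) measurements a scanner like the
    # MSS records for each ground cell.
    red = np.array([[0.12, 0.10], [0.45, 0.08]])
    nir = np.array([[0.52, 0.48], [0.50, 0.43]])

    # Healthy vegetation absorbs red light and reflects NIR strongly, so
    # NDVI approaches +1 over dense vegetation and falls toward 0 (or
    # below) over bare soil, water, or clouds.
    ndvi = (nir - red) / (nir + red)
    print(ndvi)

Comparing an index like this across repeated passes is what allows researchers to track vegetation and land-use change over time.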

The Landsat satellite program has been the longest-running program for the acquisition and archiving of satellite-based images of Earth. Since the early 1970s, Landsat satellites have constantly circled the Earth, taking pictures, collecting spectral information, and storing it for scientific and emergency management services. These images serve various uses, from gauging global agricultural production to monitoring the risks of natural disasters.

A successful partnership between NASA and the USGS, the Landsat program plays a critical role in monitoring, analyzing, and managing the earth resources needed for sustainable human environments. It manages and provides the largest archive of remotely sensed land data in the world, both current and historical. Landsat uses a passive approach, measuring light and other energy reflected or emitted from the Earth. Much of this light is scattered by the atmosphere, but techniques have been developed for the Landsat space vehicles to improve image quality dramatically. Each day, Landsat 8 adds another 700 high-resolution images to this unparalleled database, allowing researchers to assess changes in Earth's landscape over time. Landsat 9, launched into space in September 2021 to replace Landsat 7, has even more sophisticated technologies. With better radiometric and geometric imaging capacity than previous Landsats, it adds about 1,400 scenes per day to the Landsat global land archive publicly available from the USGS.

Since 1960, the National Oceanic and Atmospheric Administration (NOAA) and its predecessor agencies have worked with NASA to build and operate two fleets of satellites to monitor the Earth. One is the Polar-orbiting Environmental Satellites (POES), which fly north and south over the Arctic and Antarctic regions. These make about 14 orbits a day, with each rotation covering a different band of the Earth.

The other is the Geostationary Operational Environmental Satellite (GOES) system, which operates in the higher geosynchronous "Clarke Belt." This position allows the satellites to measure reflected radiation and some Earth-emitted energies from a single stationary vantage point over set locations, recording a wide range of atmospheric and terrestrial information for weather and potential disaster warnings.
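As a quick check on the altitude of that belt, here is a short Python sketch using Kepler's third law; the constants are standard textbook values rather than figures from this post:

    import math

    # For a circular orbit, Kepler's third law gives the radius at which
    # the orbital period equals one sidereal day (~86,164 s), i.e., the
    # geostationary "Clarke Belt."
    MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
    SIDEREAL_DAY = 86164.1      # seconds
    EARTH_RADIUS_KM = 6378.1    # equatorial radius

    radius_m = (MU_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
    altitude_km = radius_m / 1000 - EARTH_RADIUS_KM
    print(f"Geostationary altitude: about {altitude_km:,.0f} km")
    # about 35,786 km above the equator

A satellite parked at that altitude completes one orbit per day and so appears to hover over a fixed point on the equator, which is what lets GOES stare continuously at the same hemisphere.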

Both the Landsat and GOES satellites provide a constant stream of data and imagery for understanding weather events and earth resources, and both are vital to observing current meteorological and land-based events that warrant monitoring, study, and reporting.

Citation APA (7th Edition)

Pennings, A. J. (2017, Mar 27). That Remote Look: History of Sensing Satellites. apennings.com. https://apennings.com/digital-geography/that-remote-look-history-of-sensing-satellites/





How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part IV: Al Gore and the Internet

Posted on March 23, 2017

This is the fourth part of my argument about how the Internet changed from a military network to a wide-scale global network of interconnected networks. Part I discussed the impact of the Strategic Defense Initiative (SDI), or "Star Wars," on funding for the National Science Foundation's adoption of the ARPANET. In Part II, I looked at how fears about nuclear war and Japan's artificial intelligence (AI) efforts propelled early funding of the Internet. In Part III, I introduced the "Atari Democrats" and their early role in crafting the legislation that created the Internet. This follow-up makes some points about Al Gore's role in the success of the Internet.

This story should probably have been laid to rest a while ago, but I was always intrigued by it. The issue says a lot about the way election campaigns operate, about the role of science and technology in the economy, and especially about the impact of governance and statecraft on economic and technological development.

I'm talking about the famous story of Al Gore and the "invention of the Internet" meme. The story started after the Vice President was interviewed by CNN's Wolf Blitzer in 1999 and gained traction during the 2000 presidential campaign against George W. Bush. The accusation circulated that Gore claimed he "invented the Internet," and the phrase was used to tag the Vietnam vet and Vice President as a "liar" and someone who couldn't be trusted. Here is what he actually said.

Of course, the most controversial part of this interview about Vice President Gore's plans to announce his candidacy was this statement: "During my service in the United States Congress, I took the initiative in creating the Internet." That was turned into "inventing the Internet" and used against him in the 2000 elections.

The meanings are quite different. Inventing suggests a combination of technical imagination and manipulation usually reserved for engineers. To "create" has a much more flexible meaning, indicating more of an art or a craft. There was no reason to say he "invented" the Internet except to frame the claim in a way that suggested he designed it technically and had patents to prove it, which does sound implausible. Gore would win the popular vote in 2000 but failed in his bid for the Presidency when the Supreme Court's ruling settled Florida's electoral votes against him in a close and controversial election.

The controversy probably says more about how little we understand technological development and how impoverished our conversation about network infrastructure and information technologies was. The history of information technologies, particularly communications networking, has been one of interplay between technical innovation, market dynamics, and intellectual leadership guiding policy actions. The communications infrastructure, probably the world's largest machine, required a set of political skills to manifest. In addition to the engineering skills that created the famed data packets and their TCP/IP protocols, political skills were needed to secure funding, make regulatory changes, and guide the international frameworks.

Al Gore got support from the people generally considered to be the "real" inventors of the Internet. While Republicans continued to ridicule (or "swiftboat") Gore for supposedly claiming he "invented the Internet," many in the scientific community, including Robert Kahn and Vinton Cerf, the engineers who designed the Internet's core protocols, verified Gore's role:

    As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s. But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises. – Vint Cerf

Gore is a wealthy man with considerable economic and political success. He can defend himself and has probably come to terms with what happened. He is interesting because he has been a successful legislator and a mover of public opinion. He can also take much credit for bringing global warming and climate change to the public's attention and for mobilizing action on carbon credit markets and the acceleration of developments in alternative energy. His actions and tactics are worth studying. We need more leaders who can create a positive future rather than obsessing over tearing down what we have.

In the 1980s, Congressman and then Senator Gore was heavily involved in sponsoring legislation to research and connect supercomputers. Gore, who had served in Vietnam as a military journalist, was an important member of the "Atari Democrats." Along with Senator Gary Hart, Robert Reich, and other Democrats, he pushed forward "high tech" ideas and legislation for funding and research. The meaning of the term varied, but "Atari Democrat" generally referred to a pro-technology, pro-free-trade social liberal. The term emerged around 1982 and generally linked them to the Democrats' Greens and an emerging "neo-liberal" wing. It also suggested that they were "young moderates who saw investment and high technology as the contemporary answer to the New Deal."

The New York Times also discussed the tensions that emerged during the 1980s between the traditional Democratic liberals and the Atari Democrats, who attempted to find a middle ground. High-speed processors and new software systems were recognized at the time as crucial components in developing a number of new military armaments, especially any space-based "Star Wars" technologies.

So what happened? The Supercomputer Network Study Act of 1986 directed the Office of Science and Technology Policy to study critical issues and options regarding communications networks for supercomputers at universities and federal research facilities in the United States and required the Office to report the results to Congress within a year. The bill was not voted on but was attached to Senate Bill S. 2184, the National Science Foundation Authorization Act for Fiscal Year 1987. Still, the report was produced and pointed to the potential role of the NSF. It also led to an agreement for the NSFNET backbone to be managed by Merit in partnership with IBM.

In October 1988, Gore sponsored additional legislation for "data superhighways" in the 100th Congress: S. 2918, the National High-Performance Computer Technology Act of 1988. The Act was to fund a three-gigabit-per-second national network. Specifically, the bill would:

(1) support the development of a three-gigabit-per-second national research computer network to link government, industry, and education communities;
(2) convene a committee to advise on network user needs; and
(3) determine the most efficient mechanism for assuring operating funds for the long-term maintenance and use of such a network.

It also directed the National Telecommunications and Information Administration to evaluate current telecommunications regulations and how they influenced private industry participation in the data transmission field. Although no legislative action was taken on the bill, its provisions resurfaced in the high-performance computing legislation that followed.

That subsequent supercomputing legislation supported the University of Illinois' National Center for Supercomputing Applications, where the Mosaic browser was created and released in April 1993. Mosaic drew on Tim Berners-Lee's work at CERN on hypertext protocols and the first Web server; Berners-Lee is generally credited with developing the World Wide Web.





Space Shuttles, Satellites, and Competition in Launch Vehicles

Posted on February 25, 2017

The NASA space shuttle program provided a valuable new launch vehicle for satellites. This post recounts the beginnings of US space shuttle development and its impact on satellite launches.

The notion of a reusable spacecraft had been a dream since the days of Flash Gordon in the 1930s, but a number of technical problems precluded its feasibility for NASA's objectives. Foremost was the lack of sufficient insulation to protect the shuttle during multiple re-entries. NASA had instead relied on the launch-only rocket model inherited from Nazi Germany's work on the V-2, which killed approximately 10,000 civilians in attacks on England, but whose successors later launched the crews that landed on the Moon. With the Apollo program winding down in the early 1970s, new plans for a reusable space shuttle were developed and put forward.

In January 1972, President Nixon announced that NASA would begin a program to build a Space Transportation System (STS), more commonly known as the Space Shuttle. During the previous summer, Nixon had been convinced by John Ehrlichman and Caspar Weinberger that the US should pursue the Space Shuttle.

NASA had floated various plans for a "Reusable Ground Launch Vehicle" as early as 1966, but in the wake of the public's boredom with the Moon visits, enthusiasm for space exploration diminished. Democrats running for President in 1972 were critical of the billions of dollars needed for the "space truck." Senator Edmund Muskie (D-ME) campaigned on the promise of shelving the space shuttle. Senator Walter Mondale (D-MN), another candidate for president, called the Space Shuttle program "ridiculous" during a nationally televised debate. Many felt that problems of housing, urban decay, and poor nutrition for children were higher priorities.

But the NASA budget that Congress passed 277-60 in the spring of 1972 included funding for the Space Shuttle, and Nixon's resounding electoral victory later that year ensured the administration's support, at least for a while.

The next ten years were challenging ones for NASA, which faced numerous funding and technical problems. The space agency made up for its diminishing budget by allocating more internal funds to the space shuttle project. Although enthusiasm for space exploration had diminished, the practical uses of space-based satellites were encouraging.

The space shuttle was to be launched on the back of a traditional rocket, maintain a relatively low orbit, and then glide down to a runway on Earth. This latter part was particularly difficult, as temperatures exceeded 3,000 degrees Fahrenheit during the descent. The problem was solved by gluing some 33,000 silica thermal tiles to the bottom of the vehicle.

As the STS descended at 25 times the speed of sound, it also needed a complex guidance system to direct it. The avionics (guidance, navigation, and control) system used four computers to coordinate data from star trackers, gyros, accelerometers, and inputs from ground-based laboratories to guide the spacecraft. Whereas the Mercury flights were satisfied with landings within a mile of their pickup ships, the space shuttle required a precise landing on a specific runway after a several-thousand-mile glide.[1]

On April 12, 1981, the space shuttle Columbia blasted off on its inaugural flight, STS-1, from the Kennedy Space Center. After 54 hours and 37 Earth orbits, it landed safely at Edwards Air Force Base in California. (I remember the event because my little kitten, Marco Polo, went up to the TV and started to paw at the descending spacecraft.) During its initial flight, it successfully tested the cargo bay doors that needed to open to launch satellites from the shuttle's maximum orbit toward the geosynchronous orbit thousands of miles higher. During the next flight, astronauts tested a Canadian remote manipulator arm designed to retrieve satellites from orbit and repair them.

The space shuttle provided a significant boost to the satellite industry. Columbia's fifth flight successfully launched two commercial satellites, the Canadian Anik C and Satellite Business Systems' (SBS) third satellite. The remote manipulator arm later proved useful when it retrieved and repaired the Solar Max satellite in April 1984 and, later, one of Indonesia's Palapa satellites that had failed to reach the geosynchronous Clarke orbit.

The program ran very smoothly until the Challenger space shuttle blew up on a chilly January morning in 1986. Seventy-three seconds after launch, the spacecraft exploded, killing the entire crew. The disaster stopped shuttle launches for over two years. During this time, President Reagan announced that when the shuttle resumed service, it would carry few if any commercial satellites. Reagan's intention was to privatize launch services and reserve the shuttle for military and scientific activities, including the infamous "Star Wars" program to create a space-based shield to protect the US from attack.[2]

Incidentally, the plan to privatize space launches proved disastrous for the US, as the Europeans and Chinese quickly captured a significant share of the market. The Ariane and Long March rockets emerged as viable alternatives to the space shuttle. Under pressure from the rapidly growing telecommunications industry and the transnational corporate users who needed the additional communications capacity, the Bush and Clinton administrations repeatedly approved the launching of American satellites by other countries.

Notes

[1] On the Space Shuttle's development, see Henry C. Dethloff, http://history.nasa.gov/SP-4219/Chapter12.html. Accessed February 24, 2006.
[2] Winter, F. (1990) Rockets into Space. Cambridge, MA: Harvard University Press. pp. 113-126.





GOES-16 Satellite and its Orbital Gaze

Posted on February 7, 2017

"With this kind of resolution, if you were in New York City and you were taking a picture of Wrigley Field in Chicago, you'd be able to see home plate." So says Eric Webster, vice president and general manager of environmental solutions and space and intelligence systems for the Harris Corp. of Fort Wayne, Indiana, about the capabilities of the newly launched GOES-16 (Geostationary Operational Environmental Satellite). But what this statement fails to convey is the comprehensive view of the Earth that the satellite provides and the extraordinary amount of information that can be gleaned from its images.

NASA launched GOES-16, formerly known as GOES-R, on November 19, 2016, and after testing, it became operational earlier this year. The satellite provides powerful new eyes for monitoring potential disasters, including floods and other weather-related dangers. It was built for the National Oceanic and Atmospheric Administration (NOAA) by Lockheed Martin in Denver, Colorado, with imagers by Harris, and launched on an Atlas rocket.

With 16 different spectral channels and improved resolution, the satellite lets scientists monitor a variety of events such as hurricanes, volcanoes, and even wildfires. Its two visible, four near-infrared, and ten infrared channels allow the identification and monitoring of a number of earth and atmospheric events. Unlike the earlier GOES-13, it can combine data from the sixteen spectral channels of its Advanced Baseline Imager (ABI) to produce high-resolution composite images.

Operating from geosynchronous orbit roughly 36,000 km (22,240 miles) above the equator, the satellite can take images of the entire earth disk with its ABI instrument. It can also focus on a single continent or a smaller region that may be affected by a specific weather event. Parked at 89.5 degrees West longitude, the satellite has a good view of the Americas all the way to the coast of Africa. (A future GOES satellite will focus on the Pacific side.) It can take a full-disk image of the Earth every 15 minutes, a smaller image of the continental U.S. every 5 minutes, and an image of a specific locale every 30 seconds.
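Those cadences add up quickly. As a rough illustration, here is a short Python sketch (the scene names and intervals simply restate the figures above) converting each scan interval into images per day:

    # Scan cadences described above, converted into images per day.
    MINUTES_PER_DAY = 24 * 60

    cadences_in_minutes = {
        "full disk": 15,          # entire earth disk every 15 minutes
        "continental U.S.": 5,    # CONUS scan every 5 minutes
        "specific locale": 0.5,   # mesoscale view every 30 seconds
    }

    for scene, interval in cadences_in_minutes.items():
        print(f"{scene}: {MINUTES_PER_DAY / interval:.0f} images per day")
    # full disk: 96, continental U.S.: 288, specific locale: 2880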

Photo from the NOAA Photo Library

What is most significant is what the satellite can do to inform the public of weather events and potential disasters. It can monitor water vapor in the atmosphere and depict rainfall rates. It can gauge melting snowpacks, predict spreading wildfires and measure the poisonous sulfur dioxide emissions of volcanic eruptions. It can sense sea surface temperatures and provide real-time estimates of the intensity of hurricanes, including central pressure and maximum sustained winds.

One of the most valuable benefits will be monitoring the key ingredients of severe weather, like lightning and tornadoes. GOES-16 also carries the Geostationary Lightning Mapper (GLM), which watches for severe conditions primarily by detecting lightning. It uses high-speed cameras that take pictures 200 times per second, allowing it to detect cloud-to-ground lightning as well as lightning between clouds. These features allow forecasters to decrease the warning time for severe weather events.

GOES-16 will reduce the risks associated with weather and other potential disasters throughout the Americas and provide much-needed support for first responders as well as policymakers.






How Schindler Used the List

Posted on January 28, 2017

When Schindler’s List (1993) was released, I was living in Wellington, New Zealand. But I caught the film during the winter holidays in Hawaii. When I got back to Wellington, I read the book Schindler’s Ark and wrote this article for the city’s newspaper in anticipation of the movie’s NZ premiere in March. It appeared in The Evening Post on March 8, 1994. In it, I examine the political ideology and technology used by the Nazis.


Schindler’s List (1993), Steven Spielberg’s acclaimed movie on the Holocaust, premieres in Wellington on Friday. Dr Anthony Pennings backgrounds the reasons for the programme of mass genocide.

The cinematic adaptation of Thomas Keneally's 1982 novel Schindler's Ark by Steven Spielberg has won international acclaim as one of the best movies of the year. The story of Oskar Schindler credits the Austrian-born industrialist with saving over 1200 Jews from almost certain slaughter in Nazi death camps during the Second World War. By employing them as slave labour in his factories, he was able to harbour them from the mass genocide programme conducted throughout the German-occupied territories.

Although excellent narratives about Oskar Schindler, the book and movie lack adequate descriptions of why and how the Nazis conducted their murders. Not that any justification can be given for the killing of nearly six million Jews, but the popular stories lack the historical background needed to come to grips with the horrible actions of the Nazis. The rationales behind the Nazi extermination programme against the Jews are not as obscure as some people would think, though often hard to hear for our enlightened, liberal ears. The belief in "humanity" and the equality of races, although predominant in our time, is a rather new idea with a weak historical foundation.

One of the strongest challenges to the enlightenment ideals that advanced these ideas was the German National Socialist movement, a parochial, tribal movement based on a belief in racial superiority. The Nazis believed that the Germans embodied the Aryan bloodlines, which gave them privileged access to a type of spiritual plane or electrical force that could make them living gods.

They sought to destroy communism, democracy, industrial capitalism, and other forces that supposedly threatened their Aryan bloodlines and sought the rule of wise priest-kings who were imbued with mystical power.

They believed that any dilution of their gene pool through mixing with "lower races" would lock them out of their Garden of Eden. This deeply held mystical paganism was strengthened by the teachings of Darwinism and the pseudo-science of eugenics, which emerged in the late 19th century. These new beliefs gave the Nazis a rationalisation, however misguided, for their fears of mixing with outsiders.

The Nazis believed the Jewish race was the chief threat to the Germanic people. This belief can be traced back to the writings of Martin Luther, the first best-selling author, who not only sparked the Protestant Reformation but also left a lasting anti-Semitic legacy with his later writings. According to Luther, Jews were second only to the devil in their capacity for evil.

The later Nazis also used metaphorical devices to denigrate the Jews, such as in the film The Eternal Jew, which interspersed images of ghetto Jews with footage of rat hordes to suggest Jews were unsanitary and less than human.

Using a vast network of radio relays and loudspeakers dispersed throughout German cities, Adolf Hitler was able to preach his xenophobic version of the Jewish threat to millions of Germans. He argued that the ultimate goal of the Jew was world domination, and that the Jewish doctrine of Marxism, in particular, would mean the end of governance by the "aristocratic principle of nature," the only hope for the German-Aryan bloodlines. Parliamentarism, the press, and the trade union movement were other conspiratorial techniques of the Jews, who would ultimately face the Aryan in a worldwide apocalyptic battle.

The Nazi Volkdom (the merging of race politics with the machinery of the State) became committed to eliminating the Jews (and other “sub-races” such as the Slavs) as a matter of national policy. Hitler’s elite warrior class, the black-uniformed SS (Schutzstaffel, or Defence Corps) became the main instrument for carrying out the Race and Resettlement Act, their euphemism for the extermination process.

Headed by Heinrich Himmler, this new group took charge of the secret police (the infamous Gestapo) and the concentration camps which were being built to hold political prisoners and other “anti-Reich” elements such as Bolsheviks and Freemasons. Pledged to give their lives to the Fuhrer, this treacherous and highly indoctrinated Teutonic brotherhood carried out the Holocaust orders.

Two groups, in particular, conducted most of the killings: the Totenkopfverbande, which bore the chilling death's head insignia on its lapel, and the Einsatzgruppen, a special police force whose tactics shocked even many of the German generals. They combined precise military training and a high level of technocratic competence in pursuit of their ideal of a German racial utopia. The cost would be the lives of several million Jews from Western Europe, 1.7 million from the Soviet Union, and the incredible figure of three million from Poland, where most of the Schindler's List story takes place.

What is so extraordinary about the Nazi pogrom is that the full force of modernity, with its technologies of chemical production, engineering design, information management, and logistical transport, was brought together under the management of a highly indoctrinated, or at least compliant, professional class. Bureaucratic and scientific advances were marshalled to carry out the ghastly killings.

The SS spread over the occupied territories to co-ordinate the corralling and transporting of Jews. From small villages, medium-sized cities, industrial centres, and other locations around Europe and the Soviet Union, millions of Jewish families were set into motion.

At first, the Jews were sent to ghettos in the large cities or to industrial factories and other sites of slave labour. As the war progressed, however, the "resettlement" process took priority. Competition arose between the Army and the SS over the use of the railroads, but the Army's need for supplies, reinforcements, and sometimes retreat was secondary to the ideological satisfaction of the Final Solution.

Even the war effort's need for skilled labour gave way to Himmler and the SS who, with Hitler's blessing, only increased their extermination efforts as the prospects for winning the war dimmed. Trains flowed day and night with human cargo destined for the death camps at Auschwitz (2,000,000 estimated killed), Belzec (600,000), Chelmno (340,000), Majdanek (1,380,000), Sobibor (250,000) and Treblinka (800,000).

As a scholar of communications, I have been deeply influenced by Cambridge professor Jack Goody, whose Logic of Writing and the Organization of Society (1986) has helped me understand some of the crucial relationships between information technology and the politics of modern life.

Innovators in bureaucracy and population technology, the Germans were leaders in the use of telegraph and teletype communications to control their national administrators and armies. By the turn of the century, the Germans had transformed British “political arithmetic” into “statistics” (state-istics), numerical techniques in the service of State and population administration. They used the tabulating machines and punch cards designed for the US census to identify and control the population. These techniques were taken up by the SS in their management of the Final Solution.

From its first spoken word, "Name?", Schindler's List investigates the political technology used in the Holocaust. The census was integral to the process, as it allowed the Nazis to round up Jews and begin the continual selection of who would be eligible for work, who would be transported to a concentration camp, and who would be killed. Everyone was assigned a number that was tattooed on their arms. Every number had an associated punch card. Every name needed to be accounted for, registered, and given a position.

The list is an ancient political technology, which Spielberg chose as a major motif. It is linked to the film's narrative in a meaningful way, reinforcing some of the main themes, such as the bureaucratic momentum of the Nazi machine. A striking example comes when Schindler's trusted accountant, Itzhak Stern (played by Ben Kingsley), forgets to bring his working papers one day and winds up on a train awaiting deportation to an extermination camp. Schindler rushes down to the station to intervene but is told nothing can be done, as Stern is now on the list to be transported. Schindler gets an exemption only after he convinces the SS officer that he has the influence to have the officer sent to the Russian Front within weeks.

The list and its physical counterpart, the line, figure prominently throughout the film as media of control and efficiency. The line is a particularly brutal and yet effective political technology. It renders people passive and orderly. Disrupting or attempting to escape its smooth, linear surface is an invitation for punishment or death, as many Jews discover.

However, the list also becomes a technology of resistance and escape. With the Russians advancing, Schindler's factory must yield to the Final Solution. He bribes enough Nazi officials, however, to transport 1200 of his Jews to a new location near his hometown of Zwittau. From within the Nazi bureaucratic maze, Schindler's list emerges as a ticket to freedom for the Jews. The list is a manifest for getting on the train to Schindler's new factory. Getting on the list is a matter of life and death.

For Keneally, it is a modern-day Noah’s Ark. As he writes in Schindler’s Ark about the legends that developed around Schindler’s list: “The list is an absolute good. This list is life.”

It is difficult to say whether Schindler's List has a happy ending. Spielberg is much harder on Schindler than Keneally. Whereas the latter credits him with an early transformation, the movie-maker waits until nearly the end to acknowledge his attempts to put the welfare of the Jews ahead of his own self-interest. He invokes a Talmudic verse inscribed on the ring offered as a gift to Schindler by the Jews he saved: "Whoever saves one life, saves the world entire."

Schindler is overcome with grief at the end as he calculates the lives he could have saved with the money he wasted. Ultimately, we are left with this moral balance sheet.

Dr Anthony J Pennings is a political scientist and a lecturer in communications studies at Victoria University. He is not Jewish but his parents lived under Nazi occupation in the Netherlands, a country that had 75 percent of its Jewish population shipped to Nazi concentration camps. 

Citation APA (7th Edition)

Pennings, A. J. (2017, January 28). How Schindler used the list. apennings.com. https://apennings.com/the-smith-effect/how-schindler-used-the-list/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Digital Spreadsheets – The Time-Space Power of Accounting, Part 1

Posted on | January 22, 2017 | No Comments

Part of an upcoming book on Digital Money and Spreadsheet Capitalism

Accounting is, understandably, an acquired taste, but it should be recognized as a key component of an organization’s structure and a major source of its longevity and power to grow. One of the first major uses of computers was to make the accounting process easier and faster. Later, the spreadsheet became a key technology in the accounting process and in its use for management decisions. Accounting practices and their associated technologies are complicit in the formation of modern capitalism and the way it develops.

This new series of posts continues my analysis of the digital spreadsheet as a technology of power by considering its integration into organizational information systems, particularly accounting. This post covers the important historical role of accounting in developing time-space dominance for early bureaucracies and in the formation of capitalism. The next post investigates how spreadsheets have transformed accounting and the modern global economy.

While accounting is often dismissed as a realm of the mundane, a more serious inquiry connects it to real power over material and communicative domains. Anthony Giddens’ theory of “time-space power” is particularly useful here, as it has the control of information and communication at its core. The former head of the London School of Economics and Political Science, Giddens developed a wide-ranging analysis of social systems that connected information technologies, including accounting and bookkeeping, to economic, political, and social power.

For Giddens, there is no overall mechanism or motor of social change such as class conflict or universal progress. Instead, he claimed, societies can better be understood through a process of “structuration” that reconciles the influence of human actors and the rigidity of social structures. Structuration includes the production of time-space power – the ability to reproduce and expand social systems (such as corporations, governments, and other collectivities) over chronological spans of time and geographical distances of space.[1]

Drawing on anthropologist Jack Goody, Giddens pointed out that the keeping of written accounts such as ledgers and lists about people, objects, and events generated new types of social control and organizational power. In The Logic of Writing and the Organization of Society (1986), Goody studied ancient temples and monasteries and argued that writing techniques were developed as a form of social power. Lists became containers: not just an aid to memory, but a definite means of encoding and protecting information, first as a mechanism to store it over time and later as a basis for more narrative forms.

Giddens argued that every social system ‘stretches’ across time and space and that “information systems,” including early media such as books and clay tablets, have been critical to this dynamic. Writing, lists, and tables, as well as modern computer-based technologies, combine storage abilities meant to capture and hold information over long durations with media that can be transported or transmitted over long distances. Steam engines and then modern communications systems, starting with the Victorian telegraph, allowed relevant financial and logistical information to be gathered quickly over large spans of geographical space.

Another influence on Giddens’ perspective on the historical role of accounting was the seminal sociologist Max Weber. Weber studied the emergence of capitalism and identified several key precursors, including cities; the separation of the household from the company; contract laws; the bureaucratic nation-state; filing systems; and the organized control of territory by a unified government that allows commerce to develop. He especially stressed the importance of money and the associated role of accounting.

Weber saw money culture and the associated role of double-entry bookkeeping as central components in the development of capitalism and modern bureaucracies. New accounting techniques allowed businesses to keep track of items and inventories and to balance assets with monetary accounting. This enabled the calculation of the inflows and outflows of money and helped determine sources of profits and losses. Bookkeeping as a system of information, along with file keeping, allowed crucial organizational information to be maintained and supported the stability of organizations over time.

Giddens continued this train of thought, emphasizing that accounting “…allows for the distancing of economic relations across time-space, facilitating the storage and co-ordination of information used to regularize such relations.”[2] Lists in narrative and numerical representations form the basis of accounting techniques, most notably double-entry bookkeeping.

Giddens emphasized, “Double-entry bookkeeping allows the adjusting of inflows and outflows that occur over long periods of time-space.”[3] The system of double-entry accounting developed over the years from a simple writing technology using lists and journals of written text, moved toward more abstract bookkeeping, and eventually spawned an accounting discipline.
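
Since the next post turns to spreadsheets, it may help to make the double-entry mechanism concrete. The short sketch below is a minimal, hypothetical Python illustration (my own, not drawn from Giddens or any source cited here) of the invariant he describes: every transaction is written twice, as a debit in one account and a credit in another, so the balances across all accounts always sum to zero, and any discrepancy signals an error in the books.

    from collections import defaultdict

    class Ledger:
        """A toy double-entry ledger; account names and amounts are illustrative."""

        def __init__(self):
            self.accounts = defaultdict(int)  # account name -> balance (in cents)
            self.journal = []                 # chronological record of entries

        def post(self, debit, credit, amount, memo=""):
            # Each transaction is recorded twice: once as a debit, once as a credit.
            self.journal.append((debit, credit, amount, memo))
            self.accounts[debit] += amount
            self.accounts[credit] -= amount

        def trial_balance(self):
            # The double-entry invariant: balances across all accounts sum to zero.
            return sum(self.accounts.values())

    ledger = Ledger()
    ledger.post("Inventory", "Cash", 5000, memo="purchase cloth")
    ledger.post("Cash", "Sales", 8000, memo="sell finished goods")
    assert ledger.trial_balance() == 0  # the books are internally consistent
    print(dict(ledger.accounts))

The point of the sketch is the error-checking property discussed below: because arithmetic consistency can be verified mechanically, the record itself appears to guarantee its own accuracy.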

With numbers having mostly shed their connotations of mysticism and superstition by the 19th century, they were quickly becoming a preferred mode of representing business fact. Business accounting’s “credit-ability” was sanctified by the long-term development of a system of accounting with double-entry bookkeeping as its central method. Especially with the new mechanical techniques of calculation, accounting’s influence expanded culturally and geographically.

Historically, Mary Poovey’s analysis of the influence of double-entry bookkeeping on the rise of the European mercantile class is instructive here. She argued that the system of books used in early accounting helped raise the status of merchants through the verification of debits and credits. The process of recording inventories of wealth and transactions in a series of lists, journals, and ledgers ultimately rendered them in terms of monetary accounts in a single currency. This provided a growing system of trust among merchants that formed the basis of Western capitalism.

    The nature of the double-entry fact can be grasped by recognizing that this system of bookkeeping did not simply record the things merchants traded so that they could keep track of assets or calculate profits and losses. Instead, as a system of writing, double-entry bookkeeping produced effects that exceeded transcription and calculation. One of its social effects was to proclaim the honesty of merchants as a group. One of its epistemological effects was to make the formal precision of the double-entry system, which drew on the rule-bound system of arithmetic, seem to guarantee the accuracy of the details it recorded. – Mary Poovey[4]

Double-entry bookkeeping provided a structured narrative that offered a trusted representation of the organization to owners and investors. Its techniques emerged first to check for errors, but they later resulted in the separation of a business from its owner, a precursor condition for the emergence of the corporation and the wide-scale success of capitalism. The stock ticker, for example, allowed ownership to be dispersed more easily over space. Ticker-tape machines delivered stock prices, the first electrically powered broadcast news, to investors over wide geographical spaces.

New levels of certainty brought on by accounting methods created widespread social changes that transformed Western society. Along with the proliferation of capitalism and the modern corporation, the growing governmental use of “state-istics” created a wider social dynamic that included a new level of trust in the numbers used in science and engineering.

Accounting and other numerical techniques changed Western civilization. Modern capitalism emerged only after the integration of telegraphic systems with the information processing of accounting. The telegraph emerged in the 19th century as a key technology for collecting and organizing accounting information. It should be no surprise that Western Union, the first modern corporation, connected telegraph systems across the continental United States.

While double-entry bookkeeping made capitalism possible, the spreadsheet took it to new levels of possibility. Monetary accounting provided the written “real-time” constitution of a corporation, and with the spreadsheet, this power multiplied. Accounting as a type of information storage became integral to an organization’s power and consequently its long-term survival. Not only did accounting procedures and calculations become vastly faster; new types of analysis and information were produced, and the transmission of accounting information expanded. Spreadsheets increased organizational tempo and coordination over distances.

Notes

[1] Macintosh, N.B., and R.W. Scapens. “Structuration Theory in Management and Accounting.” In Anthony Giddens: Critical Assessments, Volume 4, edited by Christopher G. A. Bryant and David Jary.
[2] Giddens, Anthony. A Contemporary Critique of Historical Materialism. Power, Property and the State. Vol. 1. Berkeley: U of California, 1981. Print.
[3] Giddens, Anthony. A Contemporary Critique of Historical Materialism. Power, Property and the State. Vol. 1. Berkeley: U of California, 1981. Print. p. 117.
[4] Poovey, Mary. A History of the Modern Fact. Chicago: University of Chicago Press, 1998. p. 30.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
