Hypertext, Ad Inventory, and the Use of Behavioral Data
Posted on | October 14, 2021 | No Comments
Artificial Intelligence (AI) and the collection of “big data” are quickly transforming from technical and economic challenges to governance and political problems. This post discusses how the World Wide Web (WWW) protocols became the foundation for new advertising techniques based initially on hypertext coding and online auction systems. It also discusses how the digital ad economy became the basis of a new means of economic production based on the wide-scale collection of data and its processing into extrapolation products and recommendation engines that influence and guide user behaviors.
As Shoshana Zuboff points out in her book, Surveillance Capitalism (2019), the economy expands by finding new things to commodify, to make into products or services that can be sold.[1] When the Internet was opened to commercial traffic in the early 1990s and the World Wide Web established the protocols for hypertext and webpages, a dramatic increase in content and ad space became available. New virtual “worlds” of online real estate emerged. These digital media spaces were made profitable by placing digitized ads on them.
Then, search engines emerged that commodified popular search terms for advertising and also began to produce extraordinary amounts of new data to improve internal services and monitor customer behaviors. Data was increasingly turned into prediction products for e-commerce and social entertainment. Much of this data is collected via advertising processes, but also from purchasing behaviors and general sentiment analysis based on all the online activity that can be effectively monitored and registered. The result is a systemic expansion of a new system of wealth accumulation that is dramatically changing the conditions of the prevailing political economy.
The Internet’s Killer App
The World Wide Web was the “killer app” of the Internet and became central to the modern economy’s advertising, data collection, e-commerce, and search industries. Killer apps are computer applications that make the technology worth purchasing. Mosaic, Netscape, Internet Explorer, Firefox, Opera, and Chrome were the main browsers for the WWW that turned Internet activities into popular and sometimes profitable pastimes.
In addition, computer languages made the WWW malleable. Markup languages like HTML were utilized to display text, hypertext links, and digital images on web pages. Programming languages like JavaScript, Java, Python, and others made web pages dynamic, first by working within the browser and then on servers that could determine the assembly and content of a web page, including where to place advertisements.
Hypertext and the Link Economy
The World Wide Web actualized the dream of hypertext, linking a “multiverse” of documents long theorized by computer visionaries such as Vannevar Bush and Ted Nelson. Hypertext provides digital documents with links to other computer resources. What emerged from these innovations was the link economy and the meticulous collection and tracking of information based on the words, numbers, graphics, or images that people “clicked.”
Apple’s HyperCard in the late 1980s created documents with “hot buttons” that could access other documents within that Apple Macintosh computer. Tim Berners-Lee at CERN used one of Steve Jobs’ NeXT computers to create the hypertext environment grafted onto the Internet that became the World Wide Web. The HyperText Transfer Protocol (HTTP) allowed links in a cyber document on a web browser to retrieve information from anywhere on the Internet, thus the term World Wide Web.
The “click” within the WWW is an action with a finger on a mouse or trackpad that initiates a computer request for information from a remote server. For example, online advertising entices a user to click on a banner to reach an advertiser’s website. The ability to point and click to retrieve a specific information source created an opportunity to produce data trails that could be registered and analyzed for additional value. This information could be used for quality improvement and also for estimating probabilities of future behaviors.
All these new websites, called “publishers” by the ad industry, contained the potential for “impressions” – spaces on the website that contained code that called to an ad server to place a banner ad on the website. The banner presented the brand and allowed visitors to click on the ad to go to a designated site. Over the years, this process became quite automated.
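The mechanics can be sketched with a toy example. The Python sketch below is a minimal simulation of the round trip a publisher’s ad slot initiates: the page asks a hypothetical ad server for a banner, the server logs the impression, and a click is logged when the visitor follows the banner to the advertiser’s landing page. The function names, campaign data, and URLs are invented for illustration and do not represent any particular ad platform.

# Toy model of an ad server filling an impression and logging a click.
# All campaign data, URLs, and function names are hypothetical.
import time

CAMPAIGNS = [
    {"id": "acme-shoes",
     "banner_url": "https://ads.example.com/acme-banner.gif",
     "landing_page": "https://www.example.com/shoes"},
]

IMPRESSION_LOG = []   # one row per banner served
CLICK_LOG = []        # one row per banner clicked

def serve_ad(publisher_site, slot_id):
    """Pick a campaign for the slot, log the impression, return the banner URL."""
    campaign = CAMPAIGNS[0]   # a real server would target and auction here
    IMPRESSION_LOG.append({"ts": time.time(), "site": publisher_site,
                           "slot": slot_id, "campaign": campaign["id"]})
    return campaign["banner_url"]

def record_click(publisher_site, campaign_id):
    """Log the click, then return the advertiser's landing page to redirect to."""
    CLICK_LOG.append({"ts": time.time(), "site": publisher_site,
                      "campaign": campaign_id})
    return next(c["landing_page"] for c in CAMPAIGNS if c["id"] == campaign_id)

banner = serve_ad("news-site.example", "top-banner")            # an impression
destination = record_click("news-site.example", "acme-shoes")   # a click-through
print(banner, destination, len(IMPRESSION_LOG), len(CLICK_LOG))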
Ad Metrics
When a web page retrieves and displays an ad, it is called an impression. Cost per impression, usually priced as CPM (cost per mille, or cost per thousand impressions), is one monetization strategy that measures an advertiser’s costs when their ad is shown. The advertiser pays a set rate for every one thousand times the ad is called to the site. Online ads have undergone a bit of a resurgence because they do more for branding than previously recognized.
A somewhat different strategy is based on the click-through rate or CTR. In the advertising world, CTR is a fundamental metric. It is the number of clicks that a link receives divided by the number of times the ad is shown:
CTR = (clicks ÷ impressions) × 100
For example, if an ad has 1,000 impressions and five clicks, then your CTR would be 0.5%. A high CTR is a good indication that users find your ads intriguing enough to click. Averages closer to 0.2 or 0.3 percent are considered quite successful as banner popularity has decreased.
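To make the arithmetic concrete, here is a short Python sketch of the two calculations. The figures are the illustrative ones used above plus an invented CPM example; nothing here comes from an actual campaign.

# CTR and CPM arithmetic (illustrative figures only).
def ctr(clicks, impressions):
    """Click-through rate as a percentage."""
    return clicks / impressions * 100

def cpm_cost(impressions, cpm_rate):
    """Cost of a CPM buy: the rate is priced per 1,000 impressions."""
    return impressions / 1000 * cpm_rate

print(ctr(5, 1000))            # 0.5  -> the 0.5% example above
print(cpm_cost(50000, 2.00))   # 100.0 -> 50,000 impressions at a hypothetical $2.00 CPM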
The Monetization of Search
An advertiser can also pay the publisher when they specifically drive traffic to a website. This is called Pay-per-click (PPC) or cost per click (CPC). PPC is now used by search engines as well as some website publishers.
PPC can be traced to 1996 when Planet Oasis launched the first pay-per-click advertising campaign. A year later, the Yahoo! search engine and hundreds of other content providers began using PPC as a marketing strategy. Pricing was based on a flat-rate cost per click ranging from $0.005 to $0.25. Companies vied for the prime locations on host websites that attracted the most web traffic. As competition increased for preferred online ad spaces, the click-based payment system needed a way to arbitrate the advertisers’ interest.
This led to the first auction systems based on PPC. A company called GoTo.com was created at Idealab, a Pasadena-based incubator run by Bill Gross. Later renamed Overture, it launched the first online auction system for search in 1998.
Gross thought the concept of Yellow Pages could be applied to search engines. These large books were significant money makers for telephone companies. Businesses would pay to have their names and telephone numbers listed or purchase an ad listed under a category like bookstore, car insurance, or plumber.
Many words entered into online searches were also strongly connected to commercial activities and potential purchases. Therefore, it made sense that advertisers might pay to divert a keyword search to their proprietary websites. How much they would pay was the question.
GoTo.com’s real-time keyword bidding system set a specific price to be paid each time an advertiser’s link was clicked. The company even developed an online marketplace so advertisers could bid against one another for better placement. It started with clicks worth only 1 cent but anticipated that valuable keywords would be worth far more. It championed PPC to emphasize that it was more important that a link be clicked than merely seen. By the end of the dot.com bust in 2001, the company, by then renamed Overture, was making a quarter of a billion dollars a year.
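The bid-ranking idea can be sketched in a few lines of Python. This is a simplified, first-price pay-per-click model in the spirit of the GoTo/Overture marketplace: listings for a keyword are ordered by bid, and a clicked advertiser pays its own bid. The keyword, advertiser names, and bid amounts are invented, and later systems used more elaborate second-price auctions.

# Simplified first-price PPC keyword auction: rank listings by bid, and
# charge a clicked advertiser its own per-click bid. All names and bids
# are invented for illustration.

bids = {
    "insurance": [("AcmeInsure", 0.75), ("BudgetCover", 0.40), ("SafeCo", 1.10)],
}

def ranked_listings(keyword):
    """Return (advertiser, bid) pairs for a keyword, highest bid first."""
    return sorted(bids.get(keyword, []), key=lambda pair: pair[1], reverse=True)

def charge_for_click(keyword, position=0):
    """The advertiser in the clicked position pays its own per-click bid."""
    advertiser, bid = ranked_listings(keyword)[position]
    return advertiser, bid

print(ranked_listings("insurance"))     # SafeCo, AcmeInsure, BudgetCover
print(charge_for_click("insurance"))    # ('SafeCo', 1.1) pays $1.10 for the click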
The tech recession in the early 2000s put new pressures on Internet companies to develop viable revenue models. Google had developed the best search engine with its PageRank system but wasn’t making enough money to cover its costs. PageRank ordered search results based on how many reputable websites linked to a given site. A company like Panasonic would rank well because many legitimate sites linked to it, while pages that gamed other search engines simply by listing the names of major companies did not get the same priority from Google. But good search did not mean good profits.
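The intuition behind link-based ranking can be illustrated with the textbook power-iteration form of PageRank: a page’s score is built up from the scores of the pages that link to it, moderated by a damping factor. This is a classroom sketch on an invented three-page link graph, not Google’s production algorithm.

# Textbook PageRank by power iteration on a toy link graph (not Google's code).
DAMPING = 0.85

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += DAMPING * rank[page] / len(pages)
            else:
                for target in outlinks:           # pass rank along each outbound link
                    new_rank[target] += DAMPING * rank[page] / len(outlinks)
        rank = new_rank
    return rank

toy_web = {"panasonic.example": ["reviews.example"],
           "reviews.example": ["panasonic.example", "blog.example"],
           "blog.example": ["panasonic.example"]}
print(pagerank(toy_web))   # the page with the most inbound links scores highest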
The dominant strategy at the time was to build a portal to other sites. People would come to the website for the content, and the banner ads would provide revenues. Companies would license search capabilities from DEC’s AltaVista or Inktomi and build content around it. This is how companies like HotBot and Yahoo! progressed. So it was a mystifying surprise when Google rolled out its website with no content or banners. Just a logo with an empty form line for entering search terms.
Informed by Overture, Google rolled out a new advertising model called AdWords in late 2000. Initially a CPM (cost-per-thousand-impressions) model, it developed into a subscription model that allowed Google to manage marketing campaigns. Then, in 2002, a revamped AdWords Select incorporated PPC advertising with an auction-based system based on the work at Idealab.
Overture sued Google for infringement of its intellectual property but eventually settled. The company, which had changed its name from GoTo.com to Overture, was acquired by Yahoo! In its marketplace, advertisers “bid” against each other to be ranked high for popular words, so when someone searched and clicked on a word like “insurance,” the sites of the highest bidders appeared first, ordered by their bids. It also automated the advertiser account process. A settlement was agreed to for millions of Google shares in exchange for intellectual property rights to the bidding and pay-per-click systems. The episode marked a turning point in the digital ad economy. Built on keyword search and auctioning, and combined with “big data” processing in the style of MapReduce (later popularized in open source as Hadoop), Google’s AdWords became an immediate revenue driver.
Google also bought YouTube in 2006 and eventually created a new ad market for streaming videos. They used a new advertising product called AdSense that was offered in 2003 after Google acquired Applied Semantics. AdSense served advertisements based on site content and audience. It placed ads around or on the video based on what it sensed the content was about. Monetization depended on the type of ad, the cost of views or CPM, and the number of views.
Using Behavioral Data
Facebook’s social media platform started its ascent in 2005, but it also needed a way to monetize its content. It first focused on gathering users and building its capital base, which it used, in part, to acquire several companies for their technology, such as the news aggregator FriendFeed. By 2009, it had determined that advertising and data-gathering would be its profit-making strategy, built around Facebook Ads and Pages.
Facebook started as a more traditional advertising medium, at least conceptually. It would provide content designed to capture the user’s awareness and time, and then sell that attention to advertisers. Advertising had always merged creativity and metrics to build its business model. Facebook capitalized on the economies of the user-generated content (UGC) model and added user feedback features such as the “like” button. Sharing and space for comments provided a more interactive experience, adding, in effect, dopamine hits.
Facebook had the tools and capital to build an even more elaborate data-capturing and analysis system. It started to integrate news provided via various feeds, using coding techniques and XML components to move beyond just a user’s friends’ content. Facebook built EdgeRank, an algorithm that decided which stories appeared in each user’s newsfeed. It used hundreds of parameters to determine what would show up at the top of the user’s newsfeed based on their clicking, commenting, liking, sharing, tagging, and, of course, friending.
Facebook then moved to more dynamic machine learning-based algorithms. In addition to the Affinity, Weight, and Time Decay metrics that were central to EdgeRank, some 100,000 individual weights were factored into the new algorithms. Facebook began using what we can call artificial intelligence to curate the pictures, videos, and news that users saw. This aggressive curation has raised concerns and increased scrutiny of Facebook and its algorithms’ impact on teens, democracy, and society.
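EdgeRank was publicly described as summing, over the interactions (“edges”) attached to a story, the product of an affinity score, an edge weight, and a time-decay factor. The Python sketch below implements that published formula in schematic form; the specific weights, affinities, and half-life are invented, and the later machine-learning rankers with their roughly 100,000 weights were far more complex than this.

# Schematic EdgeRank-style scoring: for each story, sum affinity x weight x
# time decay over its edges. Weights, affinities, and the decay half-life are
# invented; the real system used many more signals.
import math

EDGE_WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 6.0, "tag": 3.0}

def time_decay(edge_age_hours, half_life_hours=24.0):
    """Older interactions count for less."""
    return math.exp(-math.log(2) * edge_age_hours / half_life_hours)

def story_score(edges, affinity_for):
    """edges: list of (actor, action, age_hours); affinity_for: actor -> affinity."""
    return sum(affinity_for.get(actor, 0.1) * EDGE_WEIGHTS[action] * time_decay(age)
               for actor, action, age in edges)

affinity = {"close_friend": 0.9, "acquaintance": 0.2}
story_a = [("close_friend", "comment", 2), ("acquaintance", "like", 30)]
story_b = [("acquaintance", "like", 1)]
print(story_score(story_a, affinity) > story_score(story_b, affinity))  # True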
Frances Haugen, a former Facebook employee, testified to Congress in October 2021 about the potential dangers of Facebook’s algorithms. Legal protections allowed her to present thousands of pages of data from Facebook research.[2] The new scrutiny has raised questions about how much Facebook, and other platforms like it, can operate opaque “black box” AI systems outside of regulatory oversight.
Summary
This post discussed how the hypertext protocols created an opportunity to gather useful data for advertisers and also became money makers for web publishers and search engines. The Internet and the World Wide Web established the protocols for hypertext and webpages, allowing a dramatic increase in available content and, with it, ad space. The web’s click economy not only allowed users to “surf” the net but also collected information on those activities to be tallied and processed by artificial intelligence.
Subsequently, information on human actions, emotions, and sentiments was mined as part of a new means of economic production and wealth accumulation. Advanced algorithmic and data science techniques gather this behavioral data and use it to predict and groom user behaviors.
Citation APA (7th Edition)
Pennings, A.J. (2021, Oct 14). Hypertext, Ad Inventory, and the Use of Behavioral Data. apennings.com https://apennings.com/global-e-commerce/hypertext-ad-inventory-and-the-production-of-behavioral-data/
Notes
[1] Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs.
[2] Federal laws, including the Dodd-Frank Act (a 2010 Wall Street reform law) and the Sarbanes-Oxley Act (a 2002 legal reaction to the Enron scandal), give some protections to corporate “whistleblowers.”
© ALL RIGHTS RESERVED

Tags: Adsense > Adwords > click-through-rate (CTR) > Cost per Click (CPC) > Cost per impression (CPM) > DEC AltaVista > Goto.com > HotBot > hypertext > Hypertext theory > Overture.com > Pay-per-click (PPC) > Shoshana Zuboff
The North-South Politics of Global News Flows
Posted on | October 3, 2021 | No Comments
“Free flow was at once an eloquent democratic principle and an aggressive trade position on behalf of US media interests.”
– Herman, E.S., and McChesney, R. W., The Global Media [1]
The “free-flow” of transborder news and data communications became a hot topic for international governance and politics in the 1970s after the US went off the global gold standard. It was the dawning of a new age of digital monetarism. Freeing the flow of capital from the constraints of individual nation-states (including the US since the New Deal) was one of the foremost global issues of the Reagan Administration in the 1980s, and news became a contentious part of the process. Major areas of disagreement emerged between the South’s Non-Aligned Movement (NAM) and the North (Group of 5) that would shape the future of the global economy.
This research summarizes and discusses the globalization of capital movement and the information and news that lubricates its transactional systems. An international information divide, literally a divide by national boundaries, needed to be transcended for this globalization to work. I examine the perspective of what has been called the “South” – the countries that inhabit the southern hemisphere (except for Australia and New Zealand) and their historic struggles with the more developed “North.” These countries organized into a ninety-member Non-Aligned Movement (NAM) and subsequently began to attack what they considered a new type of “neo-colonialism.” While quite different, almost all these countries were concerned about the growing strength and influence of corporate power based in the First World “North” countries.[2]
Governments became increasingly concerned that the computerized society, with its international data flows, could affect their citizens and interfere with national security, cultural sovereignty, and economic success. One of the concerns was exemplified by the debates over what was called “Transborder Data Flow” (TDF).[3] TDF was first used in discussions on privacy protection by the OECD in June of 1974. Then, in a subsequent OECD seminar in 1977, the definition expanded to include nonpersonal information. South countries also expressed concerns about social and cultural information flowing in from developed countries, while news about their own countries often focused on natural disasters, political instability, and other topics that did not show them in a positive light.
The South called for both a New International Economic Order (NIEO) and a New World Information Order (NWIO), which would provide a collective voice and address these issues. Galtung and Vincent listed the NIEO’s five basic points. The first was better terms of trade for the Third World. Countries of the South wanted improved terms of trade with the countries of the North. Tariffs and other restrictions in the First and Second Worlds were a significant concern, as was the tendency for the South to export raw materials to the North, while the flow in the opposite direction tended to be value-added products such as cars, processed foods, military arms, and electronics. Second, South countries wanted more control over productive assets in their own countries. Capital, nature, labor, technology, and management of foreign corporate branches tended to elude local concerns.[4]
These countries wanted to set up locally controlled industries leading to “import-substitution,” replacing foreign-produced products with those made within the nation. Third, South countries wanted more Third World interaction, meaning increased South-South trade and economic and technical cooperation between developing countries. Fourth, they wanted more Third World counter-penetration, such as financial investment in “rich” countries. Lastly, they wanted more Third World influence in world economic institutions, such as the World Bank, the IMF, and UNCTAD, as well as in the policies and activities of transnational corporations. In 1974, the NIEO was adopted by a special session of the United Nations General Assembly.
These concerns were followed up by calls for a “New World Information and Communication Order” (NWICO), an important but largely rhetorical attack on the global news media, particularly newswires like the Associated Press. In 1976, UNESCO (the United Nations Educational, Scientific and Cultural Organization) established what later became known as the MacBride Commission, after its chair, Sean MacBride of Ireland. The commission was charged with studying global communications. Its report, Many Voices, One World (MacBride Commission 1980/2004), outlined the main international problems in communication and summarized NWICO’s primary philosophical thrust. Two years later, UNESCO passed the Mass Media Declaration, which spelled out the ethical and professional obligations of the mass media and its journalists. After the MacBride Commission “vaguely” endorsed the NWICO in 1980, UNESCO passed a resolution calling for the elimination of international media imbalances and stressing the importance of news serving national development.[5]
These issues are relevant because they foreshadowed problems inherent in the internationalization of the Internet. They are also indicative of the increasing tensions building between North and South as the North attempted to use the “Third World debt crisis” to institute a set of structural reforms designed to open these countries to the flows of money-capital, data processing, and finance-related news. Going off gold and the oil crises of the 1970s resulted in most of these countries going heavily into debt, which left them vulnerable to pressure by the North. The Reaganomic response was swift and effective.
The MacBride Report reflected UNESCO’s traditional concerns with the “free flow” of information and calls for a “plurality” of communication channels, but it was released at a time when the new Reagan and Thatcher governments were setting out their own agendas for an international order of communications and capital flows. In the wake of international money’s new demands for information and news, they wanted to maintain a strong neoliberal stance on international communication. Despite strong international opposition, the US withdrew from UNESCO and stopped paying membership dues to the United Nations.
But the primary strategy was a structural adjustment of domestic economies to open them up to money-capital and news. By making additional lending subject to the scrutiny of the International Monetary Fund (IMF), the North pressured the South to liberalize their economies and protect the free flow of information moving through their borders. Thus, in conjunction with the financial industry’s need for international data communications, the debt crisis allowed the North to pave the way for the globalization of news and, eventually, the Internet.
Consequently, the main thrust of this research argues that the road for the international Internet and e-commerce was substantially paved by the attempts to free up the global flows of financial news and information needed for the new regime of digital monetarism. Share markets were opened to international investments, and governments were pressured to privatize public utilities and other government assets. A new era of spreadsheet capitalization was emerging that allowed for inventorying and valuing of assets. Turning these assets into tradeable securities was heavily reliant on information and news flows. News became a contentious issue during the 1970s, especially for the “Third World,” which tied it to other issues of economic and informational importance.
This post argues that international flows of information and news were substantially altered in the late 1970s and early 1980s. Freeing the flow of capital from the constraints of individual nation-states (including the US government) was the foremost international issue of the Reagan Administration outside of the Cold War. The securitization of assets required information and news to adequately price and sell on global sharemarkets. Reagan’s tax cuts became the new foreign aid as the US deindustrialized, and capital flows created a global system of digital finance supply chains. By the 1990s, the digital global system had entrenched itself, and a condition of pan-capitalism developed, with South countries becoming “emerging markets” in the global order.[6]
Notes
[1] Herman, E.S., and McChesney, R. W. (1997) The Global Media: The New Missionaries of Global Capitalism. London: Cassell. p. 17.
[2] Chilcote, R.H. (1984) Theories of Development and Underdevelopment. Boulder, CO: Westview Press.
[3] Turn, R. (1980) “An Overview of Transborder Data Flow Issues,” Proceedings of the 1980 IEEE Symposium on Security and Privacy. Oakland, CA, USA. doi: 10.1109/SP.1980.10010
https://doi.ieeecomputersociety.org/10.1109/SP.1980.10010
[4] Galtung, J. and Vincent, R. (1992) Global Glasnost: Towards a New World Information and Communication Order. NJ: Hampton Press.
[5] Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communications Institute. Honolulu, Hawaii.
[6] Tehranian, M. (1999) Global Communication and World Politics: Domination, Development, and Discourse. Boulder, CO: Lynne Rienner Publishers. p. 83.
Ⓒ ALL RIGHTS RESERVED

Tags: MacBride Commission > news flow > NIEO > NWICO > NWIO > Transborder Data Flow (TDF) > UNESCO
ARPA and the Formation of the Modern Computer Industry, Part 2: Memex, Personal Computing, and the NSF
Posted on | September 26, 2021 | No Comments
With World War II winding down, President Roosevelt asked Vannevar Bush, his “czar” of all federally funded scientific research, for a set of recommendations on the application of the lessons learned during the war. The President was particularly interested in how the scientific and technological advances achieved in the war effort could improve issues like national employment, the creation of new industries, and the health of the nation’s population. This post looks at Bush’s contribution to the formation of the National Science Foundation and the computing developments that led to interactivity and networking.
Bush managed the Office of Scientific Research and Development under Roosevelt and later became President Truman’s science advisor. While not actually stationed in New Mexico, Bush had the overall supervisory responsibility for building the first atomic bomb and was uniquely positioned to understand the new technologies and scientific advances coming out of the war. As a result, Bush took up the President’s challenge and wrote two articles that would have far-reaching consequences.
Bush’s articles provided the rationale for funding a wide range of technological and scientific activities and inspired a new generation of researchers. His first written response, “Science, the Endless Frontier,” led to the formation of the National Science Foundation (NSF). The NSF provided widespread funding for scientific and technological research throughout the country’s universities and research institutes.[1] In the mid-1980s, it would take control of the non-military part of the ARPANET and link up supercomputing centers in response to the Japanese economic-technological threat. The NSFNET, as it was called, would also standardize the TCP/IP protocols and lead to the modern Internet.
Bush’s second response, “As We May Think,” was published in the Atlantic Monthly in July 1945, just a few months after Hitler committed suicide and a month before the atomic bombs Bush’s organization helped develop were dropped on Japan. The article received lukewarm attention at first, but it persisted and inspired many people, including J.C.R. Licklider, who pursued a vision of computing interactivity and communications based on Bush’s ideas.
In a war that increasingly turned to science and technology to provide the ultimate advantage, Bush’s responsibilities were crucial to the outcome of World War II. These burdens also placed him in a troublesome position of needing to read, digest and organize an enormous amount of new scientific information. This responsibility led him to develop and forward the idea of an information technology device known as the “memex,” something he had been working on in the late 1930s while he was a professor and the Vice-President of MIT.[2]
The memex is arguably the model for the personal computer and was a distinct vision of man-machine interactivity that motivated Licklider’s interest in time-sharing technologies and networking. Bush’s conception of a new device aimed at organizing and storing information at a personal level led to a trajectory of government-sponsored research projects that aimed to realize his vision. In 1960, Licklider, a lecturer at MIT, published “Man-Computer Symbiosis,” a theoretical article on real-time interactive computing.[3] Licklider, a psychologist by training, later moved to the Department of Defense’s Advanced Research Projects Agency (ARPA) in 1962 to become the first director of its Information Processing Techniques Office (IPTO).
It was the intersection of his vision with the momentum of the Cold War that led to the fruition of Bush’s ideas, largely through the work of Licklider. The first timesharing systems were constructed at MIT with funding from ARPA as well as the Office of Naval Research. Developed between 1959 and 1962, these efforts led to a working model called the Compatible Time-Sharing System (CTSS). Using the new IBM 7090 and 7094 computers, CTSS proved that the time-sharing concept could work, even though it only linked three computers.
The military later supplied MIT with a $3 million grant to develop man-machine interfaces. By 1963, Project MAC, as it was called, connected some 160 typewriter consoles throughout the campus and in some faculty homes, with up to 30 users active at any one time. It allowed for simple calculations, programming, and eventually what became known as word processing. In 1963, the project was refunded and expanded into a larger system called MULTICS (Multiplexed Information and Computing Service), with Bell Labs also collaborating in the research. MULTICS demonstrated the capacity to handle 40-50 users, use cathode ray tube (CRT) graphic devices, and accommodate users who were not professional programmers.[4]
As the cases of computing and timesharing show, the military-industrial tie drove early computing developments and created the foundation for the Internet to emerge. Funding for a permanent war economy introduced the information-processing regime to the modern world. In conjunction with research institutes like MIT, MITRE, and RAND, and corporations such as IBM and GE, as well as the Bell System, IT got its start.
Licklider’s notion of an “Inter-Galactic Computer Network” began to circulate as a vague idea through a like-minded group of computer scientists who were beginning to see the potential of connected computers. The IPTO was beginning to seed the literal invention of computer science as a discipline and its establishment in universities around the country. In Licklider’s memo of April 25, 1963, he addressed the “members and affiliates” of the network that had coalesced around his vision, and the money of ARPA. His concern was that computers should be able to communicate with each other easily and provide information on demand. The project was posed in terms of cross-cultural communications. The concept helped ARPA change its focus from what went on inside the computer to what went on between computers.
The technology was not quite there yet, but the expertise was coming together that would change computing and data communications forever. Using military money, Licklider began supporting actual projects to create computer technologies that expanded Bush’s vision. A little-known corporation called Bolt, Beranek, and Newman (BBN) was one of the most significant players to come out of this new complex of agencies and companies working on computing projects. The bond between this small corporation, MIT, and ARPA produced a packet-switched network that became the precursor to today’s modern Internet.
In conjunction with the National Science Foundation, ARPA pursued human-computer interactivity and subsidized the creation of computer science departments throughout the country. It funded time-sharing projects and the first packet-switching technology, which would become the foundational technology of the Internet.
Notes
[1] Bush stayed with the government throughout the 1940s directing science funding and then becoming the first head of the National Science Foundation after it was established in 1950.
[2] Information on Bush’s early conception of the memex from M. Mitchell Waldrop’s (2001) The Dream Machine: J.C.R. Licklider and the Revolution that Made Computing Personal. New York: The Penguin Group. Particularly useful is the second chapter that focuses on Bush.
[3] Licklider, J.C.R. (1960) “Man-Computer Symbiosis,” IRE Transactions on Human Factors in Electronics. March.
[4] Denicoff, M. (1979) “Sophisticated Software: The Road to Science and Utopia,” in Dertouzos, M.L. and Moses, J. (eds.) (1979) The Computer Age: A Twenty Year View. Cambridge, Massachusetts: The MIT Press. pp. 370-74.
© ALL RIGHTS RESERVED

Tags: ARPA > BBN > Memex > National Science Foundation (NSF) > Vannevar Bush
Engineering the Politics of TCP/IP and the Enabling Framework of the Internet
Posted on | September 22, 2021 | No Comments
The Internet was designed with a particular architecture – an open system that would accept any computer and connect it to any other computer. A set of data networking protocols allowed any application on any device to communicate through the network to another application on another device. Email, web files, text messages, and data from sensors could be sent quickly over the Internet without using significant power or other resources from the device. The technical “architecture” of the Internet was designed to empower the network’s edges – the users and their hardware. Its power has been borne out as those edges are no longer just large mainframes and supercomputers but have continued to incorporate new devices like PCs, laptops, smartphones, and the tiniest of sensors in the emerging Internet of Things (IoT).
This post explores the “political engineering” of the Internet protocols and the subsequent policy framework for a national and global data communications network that empowered users and created an open environment for competition, social interaction, and innovation. This system has been challenged over the years by programmatic advertising, oligopolistic ISPs, security breaches, and social media. But it’s still a powerful communications system that has changed commerce, entertainment, and politics worldwide.
The Power of Protocols
What gives communicative power to the Internet’s architecture are the protocols that shape the flows of data. With acronyms like TCP, IMAP, SMTP, HTTP, FTP, as well as UDP, BGP, and IP, these protocols formed the new data networks that would slowly become the dominant venue for social participation, e-commerce, and entertainment. These protocols were largely based on a certain philosophy – that computer hosts should talk to computer hosts, that networks were unreliable and prone to failure, and that hosts should confirm with other hosts that the information was passed to them successfully. The “TCP/IP suite” of protocols emerged to enact this philosophy and propel the development of the Internet.[1]
TCP/IP protocols allow packets of data to move from application to application, or from web “clients” to “servers” and back again. They gather content such as keystrokes from an application and package them for transport through the network. Computer devices use TCP to turn information into packets of data – 1s and 0s – sent independently through the web using Internet Protocol. Each packet has the address of its destination, the source of origination, and the “payload,” such as part of an email or video.
The nodes in the network “route” the packets to the computer where they are headed. Destinations have IP addresses that are included in routing tables regularly updated in routers across the Internet. This involves some “handshaking” and acknowledging of the connections and packets received between what we have been alternatively calling the edges/devices/hosts/applications/processes.
More specifically, a “process” on an application on one device talks to a “process” on an application on another device. So, for example, a text application like Kakao, Line, WhatsApp, or WeChat communicates to the same application on another device. Working with the device’s operating system, TCP takes data from the application and sends it into the Internet.
There, it gets directed through network routers to its final destination. The data is checked on the other side, and if mistakes are found, it requests the data be sent again. IMAP and SMTP retrieve and move email messages through the Internet, and most people will recognize HTTP (Hypertext Transfer Protocol) from accessing web pages. This protocol requests a file from a distant server, sets up a connection, and then terminates that connection when the files download successfully. Connecting quickly to a far off resource, sometimes internationally, and being able to sever the link when finished is one of the features that makes the web so successful.
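The request-connect-terminate cycle described above can be seen in a few lines of Python using the standard socket library. This is a bare-bones sketch of an HTTP/1.1 GET rather than what a full browser does (no TLS, caching, or redirects), and example.com is simply a stand-in host.

# Bare-bones HTTP GET over a TCP socket: open a connection, request a file,
# read the response, and close the link. No TLS, caching, or redirects;
# just the connect/request/terminate cycle described above.
import socket

HOST = "example.com"   # stand-in web server

with socket.create_connection((HOST, 80), timeout=10) as s:    # TCP handshake
    request = ("GET / HTTP/1.1\r\n"
               f"Host: {HOST}\r\n"
               "Connection: close\r\n\r\n")                     # ask the server to end the session
    s.sendall(request.encode("ascii"))                          # hand the bytes to TCP
    chunks = []
    while True:
        data = s.recv(4096)                                     # packets reassembled by TCP
        if not data:
            break
        chunks.append(data)
# The connection is severed when the 'with' block exits.
print(b"".join(chunks)[:80])   # status line and the first headers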
HTTP is at the center of what has been called the World Wide Web (WWW). Mostly called the “web” these days, it combined the server with the “browser” to provide a powerful new utility – the website. Hypertext Markup Language (HTML) enabled the browser to present text and images on a 2-D color screen. The WWW empowered the “dot.com” era and allowed many people to develop computer skills to produce websites. Every organization had to have an online “presence” to remain viable, and new organizations were started to take advantage of the fantastic reach of the web. Soon, server-side software empowered a myriad of new possibilities on the net, including browser-based email, e-commerce, search, and social media.
Devices connect to or “access” an Internet Service Provider (ISP), either from a home or school, or via Wi-Fi connections at a café or a public network in a train or park. Mobile subscriptions allow access to a wireless cell tower with a device antenna and SIM card. Satellite service is becoming more available, primarily through HughesNet, ViaSat, and increasingly SpaceX’s Starlink as more low-orbit satellites are launched. Starlink is teaming up with T-Mobile in the US to connect a smartphone directly to the low-orbit satellite network.
Physical media make a difference in good Internet access by providing the material access to the ISP. Various types of wires and fiber optic cables or combinations provide the critical “last mile” connection from the campus, home premise, or enterprise. Ethernet connections or wireless routers connect to a modem and router from your cable company or telco ISP to start and end communication with the edge devices.
Conceptually, the Internet has been divided into layers, sometimes referred to as the protocol stack. These are the:
- Application
- Transport
- Network
- Link
- and Physical layers.
The Internet’s layer schematic outlasted the Open Systems Interconnection (OSI) model by offering a more efficient representation that simplified the process of developing applications. Layers help conceptualize the Internet’s architecture for instruction, service, and innovation. They visualize the services that one layer of the Internet provides to another using protocols and Application Programming Interfaces (APIs). They provide discrete modules that are distinct from the other levels and serve as a guideline for application development and network design and maintenance.
The Internet’s protocol stack makes creating new applications easier because software needs to be written only for the applications at the endpoints (client and server) and not for the network core infrastructure. Developers use APIs to connect to sockets, a doorway from the Application layer to the next layer of the Internet. Developers have some control of the socket interface software with buffers and variables but do not have to code for the network routers. The network is to remain neutral to the packets running through it.
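This endpoint-only division of labor can be sketched with Python’s socket API: a tiny echo server and client are the only code that has to be written, while everything between the two sockets is left to the network. The port number and message are arbitrary, and a real service would add error handling.

# Endpoint-only programming: application code lives at the two ends (a tiny
# echo server and a client); nothing is written for the routers in between.
# The port and message are arbitrary.
import socket, threading, time

def echo_server(port=50007):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()           # wait for one client connection
        with conn:
            data = conn.recv(1024)       # read from the socket buffer
            conn.sendall(data)           # echo it straight back

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.5)                          # give the server a moment to start listening

with socket.create_connection(("127.0.0.1", 50007)) as cli:
    cli.sendall(b"hello, endpoint")
    print(cli.recv(1024))                # b'hello, endpoint'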
The Network layer is where the Internet Protocol (IP) does its work. At this layer, the packets are repackaged or “encapsulated” into larger packets called datagrams. These also have an address on them that might look like 192.45.96.88. Computers and networks only use numerical addresses, so they need to use the Domain Name System (DNS) if the address is an alphabetical name like apennings.com.
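The name-to-number step can be illustrated with Python’s standard library: before IP can route a datagram, the human-readable name is resolved to a numerical address through DNS. This is only a sketch, and the addresses returned depend on the DNS servers visible from the network you run it on.

# Resolving a domain name to the numerical IP address that routers use.
# The output depends on the DNS servers visible from your network.
import socket

name = "apennings.com"
ip_address = socket.gethostbyname(name)      # DNS lookup: name -> dotted-quad IP
print(f"{name} resolves to {ip_address}")

# The reverse direction may fail if no reverse (PTR) record is published.
try:
    host, _, _ = socket.gethostbyaddr(ip_address)
    print(f"{ip_address} maps back to {host}")
except socket.herror:
    print("no reverse DNS entry")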
Large networks have many possible paths, and the router’s algorithms pick the best routes for the data to move them along to the receiving host. Cisco Systems became the dominant supplier of network routers during the 1990s.
Although the central principle of the Internet is the primacy of the end-to-end connection and verification (hosts talk to hosts and verify the successful movement of data), the movement of the data through the network is also critical. The network layer in the TCP/IP model transparently routes packets from a source device to a destination device. The job of the ISPs is to take the data encapsulated at the transport and network layers and transport it – sometimes over long distances via microwave towers, fiber optic cables, or satellites. The term “net neutrality” has emerged to protect the end-to-end principle and restrict ISPs from interfering with the packets at the network layer. If ISPs were allowed to examine data from the Application layer, they could alter speed, pricing, or even content based on different protocols.
The diffusion of the TCP/IP protocol was not inevitable. Computer companies like IBM, Honeywell, and DEC developed their own proprietary data communications systems. Telecommunications companies had already established X.25 protocols for packet-switched data communications with X.75 gateway protocols used by international banks and other major companies. TCP looked like a long shot, but the military’s subsequent decisions in 1982 to mandate it and National Science Foundation’s NSFNET support secured momentum for TCP/IP. Then, in 1986, the Internet Advisory Board (IAB) began to promote TCP/IP standards with publications and vendor conferences about its features and advantages. By the time the NSFNET was decommissioned in 1995, the protocols were well established.
The Philosophy of TCP
The military began to conceptualize the decentralized network as part of its defense against nuclear attack in the early 1960s. Conceived primarily by Paul Baran at RAND, packet-switching was developed as a way of moving communications around nodes in the network that were destroyed or rendered inoperable by attack. Packets could be routed around any part of the network that was congested or disabled. If packets going from San Francisco to New York City could not get through a node in Chicago, they could be routed around the Windy City through nodes in other cities. As networks were being considered for command and control operations, planners had to consider that eventually computers would not only be in fixed installations but in airplanes, mobile vehicles, and ships at sea. The Defense Advanced Research Projects Agency (DARPA) funded Vint Cerf and others to create what became the TCP and IP protocols to connect them.
The Internet was also informed by a “hacker ethic” that emerged at MIT in the late 1950s and early 1960s as computers moved away from punch-cards and began to “time-share” their resources. Early hacking stressed openness, decentralization, and sharing information. In addition, hackers championed merit, digital aesthetics, and the possibilities of computers in society. Ted Nelson’s Computer Lib/Dream Machines (1974) was influential as the computer world moved to California’s Silicon Valley.
The counter-culture movement, inspired by opposition to the Vietnam War, was also important. Apple founders Steve Jobs and Steve Wozniak were sympathetic to the movement, and their first invention was a “blue box” device to hack the telephone system. Shortly after, the Apple founders merged hacktivism with the entrepreneurial spirit as they emphasized personal empowerment through technology in developing the Apple II and Macintosh.
The term hackers has fallen out of favor because computers are so pervasive and people don’t like to be “hacked” and have their private data stolen or vandalized. But the hacker movement started with noble intentions and continues to be part of web culture.[2]
Developing an Enabling Policy Framework
Although the Internet was birthed in the military and nurtured as an academic and research network, it was later commercialized with the intention of providing an enabling framework for economic growth, education, and new sources of news and social participation. The Clinton-Gore administration was looking for a strategy to revitalize the struggling economy. “It’s the Economy, Stupid” was their mantra in the 1992 campaign that defeated President George H. W. Bush, and they needed to make good on the promise. Framing data networks as information highways cast them as infrastructure and earned the information and telecommunications sectors both government and private investment.
Initially, Vice-President Gore made the case for “information highways” as part of the National Information Infrastructure (NII) plan and encouraged government support to link up schools and universities around the US. He had been supporting similar projects as one of the “Atari Democrats” since the early 1980s, including the development of the NSFNET and the supercomputers it connected.
As part of the National Information Infrastructure (NII) plan, the US government handed over interconnection to four Network Access Points (NAPs) in different parts of the country. It contracted with big telecommunications companies to provide the backbone connections. These allowed ISPs to connect users to a national infrastructure, provide new e-business services, link classrooms, and create electronic public squares for democratic debate.
The US took an aggressive stance in both controlling the development of the Internet and pressing that agenda around the world. After the election, Gore pushed the idea of a Global Information Infrastructure (GII) designed to encourage competition both in the US and globally. This offensive resulted in a significant decision by the World Trade Organization (WTO) that reduced tariffs on IT and network equipment. Later, the WTO encouraged the breakup of the national post, telegraph, and telephone administrations (PTTs) that dominated national telecommunications systems. The Telecommunications Act of 1996 and the administration’s Framework for Global E-Commerce were additional key policy positions on Internet policy. The result of this process was essentially the global Internet structure that gives us relatively free international data, phone, and video service.
Summary
As Lotus and Electronic Frontier Foundation founder Mitch Kapor once said: “Architecture is politics.” He added, “The structure of a network itself, more than the regulations which govern its use, significantly determines what people can and cannot do.” The technical “architecture” of the Internet was primarily designed to empower the network’s edges – the users and their hardware. Its power has been borne out as those edges are no longer just large mainframes and supercomputers but laptops, smartphones, and the tiniest of sensors in the emerging Internet of Things (IoT). Many of these devices have as much or more processing power than the computers the Internet was invented and developed on. The design of the Internet turned out to be a unique project in political engineering.
Citation APA (7th Edition)
Pennings, A.J. (2021, Sep 22). Engineering the Politics of TCP/IP and the Enabling Framework of the Internet. apennings.com. https://apennings.com/telecom-policy/engineering-tcp-ip-politics-and-the-enabling-framework-of-the-internet/
Notes
[1] Larsen, Rebekah (2012) “The Political Nature of TCP/IP,” Momentum: Vol. 1, Iss. 1, Article 20.
Available at: https://repository.upenn.edu/momentum/vol1/iss1/20
[2] Steven Levy described more specific hacker ethics and beliefs in chapter 2 of Hackers: Heroes of the Computer Revolution. These include openness, decentralization, free access to computers, world improvement, and upholding democracy.
Ⓒ ALL RIGHTS RESERVED

Tags: global e-commerce > Net Neutrality > Network Layers > NII > NREN > TCP/IP > Telecommunications Act of 1996
ARPA and the Formation of the Modern Computer Industry, Part I: Transforming SAGE
Posted on | September 12, 2021 | No Comments
In response to the Soviet Sputnik satellites launched in late 1957, US President Dwight D. Eisenhower formed the Advanced Research Projects Agency (ARPA) within the Department of Defense (DoD). As the former leader of the Allied forces during D-Day and the invasion of the European theater, he was all too aware of the problems facing the military in a technology-intensive era. ARPA was created, in part, to research and develop high technology for the military and bridge the divide between the Air Force, Army, Marines, and Navy.
Under pressure because of the USSR’s continuous rocket launches, the Republican President set up ARPA despite considerable Congressional and military dissent. Although it scaled back some of its original goals, ARPA went on to subsidize the creation of computer science departments throughout the country, funded the Internet, and consistently supported projects that enhanced human/computer interactivity.
Forming ARPA
Headquartered in the Pentagon, ARPA was established to develop the US lead in science and technology innovations applicable to the military and help it respond quickly to any new challenges. Eisenhower had multiple suspicions about the military and its industrial connections. However, he did believe in basic research and appointed a man with similar notions, Neil McElroy, the head of Procter & Gamble, as his Secretary of Defense. McElroy pushed his vision of a “single manager” for all military-related research through Congress. Despite objections by the heads of the various armed forces, Eisenhower sent a request to Congress on January 7, 1958, for startup funds to create ARPA and appointed its director, a vice-president from General Electric. Shortly after, Congress appropriated funds for ARPA as a line item in an Air Force appropriations bill.[1]
Roy Johnson came to head ARPA from GE, dreaming of human-crewed space stations, military moon bases, orbital weapons systems, global surveillance satellites, and geostationary communications satellites. But by the end of ARPA’s first year, Eisenhower had established NASA, dashing Johnson’s space fantasies. Space projects moved to the new civilian agency or back to the individual military services, including the covert ones like those of the CIA’s spy planes and satellites. ARPA desperately searched for a new mission and argued effectively for going into “basic research” areas that were considered too “far out” for the other services and agencies.
With the Kennedy Administration taking office and its appeal for the nation’s “best and brightest” to enter government service, ARPA found its prospects improving. It looked aggressively for talent to develop the best new technologies. Behavioral research, command and control, missile defense, and nuclear test detection were some of the newest projects taken on by ARPA, although not necessarily “basic” research. The new agency also got increasingly involved with computers, especially after Joseph Carl Robnett “JCR” Licklider joined the staff in October 1962.[2]
ARPA’s Information Processing Techniques Office (IPTO)
The IPTO emerged in the early 1960s with the charge of supporting the nation’s advanced computing and networking projects. Initially called the Office of Command and Control Research, its mandate was to extend the knowledge gained by researching and developing the multi-billion dollar SAGE (Semi-Automatic Ground Environment) project and extend it to other command and control systems for the military.
SAGE was a joint project by MIT and IBM with the military to computerize and network the nation’s air defense system. It linked a wide array of radar and other sensing equipment throughout Canada and the US to what was to become the Colorado-based NORAD headquarters. SAGE was meant to detect aircraft (bombers and later ICBMs) coming over the Arctic to drop nuclear bombs on Canada and the US. The “semi-automatic” in SAGE meant that humans would be a crucial component of the air defense system, and that provided an opening for Licklider’s ideas.
SAGE consisted of some 50 computer systems located throughout North America. Although each was a 250-ton monster, SAGE computers had many innovations that further sparked the dream of man-machine interactivity. These included data communications over telephone lines, cathode ray terminals to display incoming data, and light pens to pinpoint potential hostile aircraft on the screen. ARPA’s IPTO helped transform SAGE innovations into the modern IT environment.
From Batch to Timesharing
Throughout the 1960s, three directors at IPTO poured millions of dollars into projects that created the field of computer science and got computers “talking” to people and to each other. Licklider had the Office of Command and Control Research changed to Information Processing Techniques Office (IPTO) when he moved from BBN to ARPA to become its first director. Licklider was also from MIT, but what made him unusual was that he was a psychologist amongst a majority of engineers. He got his Ph.D. from the University of Rochester in 1942 and lectured at Harvard University before working with the Air Force. Foremost on his agenda was to encourage the transition from “batch processing” to a new system called “timesharing” to promote a more real-time experience with computers, or at least a delay measured in seconds rather than hours or days.
These new developments meant the opportunity for new directions, and Licklider would provide the guidance and government’s cash. During the mid-1950s, Licklider worked on the SAGE project focusing mainly on the “human-factors design of radar console displays.”[3] From 1959 to 1962, he was a Vice-President for BBN, overseeing engineering, information systems, and psycho-acoustics projects. He was also involved in one of the first time-sharing cases at BBN with a DEC PDP-1 before taking a leave of absence to join ARPA for a year.[4]
Licklider swiftly moved IPTO’s agenda towards increasing the interactivity of computers by stressing Vannevar Bush’s ideas and the notion of a more personal and interactive computing experience. An influential military project at MIT was the TX-2, one of the first computers to be built with transistors and a predecessor to the PDP line of computers. It also had a graphics display, unlike most computers that used punch cards or a teletypewriter. The TX-2 was located at MIT’s Lincoln Laboratories and had a major influence on Licklider. The brilliant psychologist would ride the waves of Cold War grant monies and champion research and development for man-machine interactivity, including a radical new computer-communications technology called timesharing.
Early computer users submitted their requests and punch cards to a receptionist at a computer center. Then a team of computer operators would run several (or a “batch”) of these programs at a time. The results were usually picked up a day or two after submitting the requests. After Bell Labs developed transistor technology, individual transistors were wired into circuit boards, creating the “second generation” computer series. This new technology allowed vacuum tubes to be replaced by a smaller, cheaper, and more reliable technology and produced an exciting increase in processing speeds. Faster technology eventually led to machines that could handle several different computing jobs at one time – timesharing.
Time-sharing allowed several users to share a computer by taking advantage of these increasing processing speeds. It also enhanced computer communications by allowing users to connect via teletype and, later, cathode-ray terminals. Rather than punching out programs on stacks of paper cards and submitting them for eventual processing, time-sharing made computing a more personal experience by making it more immediately interactive. Users could interact with a large mainframe computer via teletypewriters originally used for telex communications and the cathode-ray terminals used for televisions.
Timesharing emerged from the MIT environment with support from the US government. The sets of procedures used for timesharing originated at MIT after it received an IBM 704 in 1957, a version of the AN/FSQ-7 developed for SAGE. John McCarthy, a Sloan Fellow from Dartmouth, recognized the possibilities of sharing the computer’s capabilities among several users. As the keyboard replaced punch cards and magnetic-tape-to-magnetic-tape communication as the primary means of data entry, it became easier for the new computers to switch their attention among various users.[5]
As its human users paused to think or look up new information, the computer could handle the requests of other users. Licklider pressed the notion of timesharing to increase the machine’s interactivity with humans, but the rather grandiose vision would not be immediately accepted throughout the military-related sphere of ARPA. Computing was still in a relatively primitive state in the early 1960s, but ARPA would soon be won over.
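To make the contrast with batch processing concrete, here is a minimal round-robin scheduling sketch in Python. It is my own illustration, not code from the period, and the user names and time slice are invented; the point is simply that each job gets a short turn on the processor, so every user sees a response within seconds instead of waiting for an entire batch to finish.

from collections import deque

# Hypothetical jobs: (user, units of compute still needed)
jobs = deque([("alice", 5), ("bob", 3), ("carol", 4)])

QUANTUM = 1  # each turn, a job gets one small slice of processor time
clock = 0

# Round-robin scheduling: cycle through the users, giving each a short
# slice, so no one waits for a whole "batch" to finish before responding.
while jobs:
    user, remaining = jobs.popleft()
    clock += QUANTUM
    remaining -= QUANTUM
    if remaining > 0:
        jobs.append((user, remaining))  # back of the queue
    else:
        print(f"t={clock}: {user}'s job finished")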
First on Licklider’s list was Systems Development Corporation (SDC), a RAND spin-off that had done most of the programming for the SAGE project. ARPA had inherited SDC, and a major part of the IPTO budget was set aside to help it transition from the SAGE air defense project to command and control computing. SDC had been given one of SAGE’s AN/FSQ-32 mainframes, but to Licklider’s chagrin, it was used for batch processing. Licklider thought it ridiculous to use the machine in this manner, where responses often took hours or even days to help a commander react to battle situations.[6] He immediately went to work persuading SDC to switch from batch processing to time-sharing, bringing in allied colleagues such as Marvin Minsky for seminars to cajole SDC.
Soon SDC was convinced, and Licklider moved on to other time-sharing projects, pouring ARPA money into like-minded efforts at MIT and Carnegie Mellon. Luckily, he had joined ARPA the same month as the Cuban Missile Crisis. The event raised concerns about the ability of the President and others high in the chain of command to get timely, effective information. In fact, Kennedy had been pushing for better command and control support in the budget, reflecting his concerns about being the Commander-in-Chief of a major nuclear power.
In the next part I will examine timesharing and the first attempts to commercialize it as a utility.
Notes
[1] Background on ARPA from Hafner, K. and Lyon, M. (1998) Where Wizards Stay Up Late. New York: Touchstone. pp. 20-27.
[2] A much more detailed version of these events can be found in a chapter called “The Fastest Million Dollars,” in Hafner, K. and Lyon, M. (1998) Where Wizards Stay Up Late. New York: Touchstone. pp. 11-42.
[3] Information on Licklider’s involvement with SAGE from Campbell-Kelly, M. and Aspray, W. (1996) Computer: A History of the Information Machine. Basic Books, pp. 212-213.
[4] Information on J.C.R. Licklider’s background at BBN from the Computing Encyclopedia Volume 5: People (2002). Smart Computing Reference Series.
[5] Evans, B.O. “Computers and Communications” in Dertouzos, M.L. and Moses, J. (1979) The Computer Age: A Twenty Year View. Cambridge, Massachusetts: The MIT Press. p. 344.
[6] A good investigative job on Licklider and SDC was done by Waldrop, M. Mitchell (2001) The Dream Machine: J.C.R. Licklider and the Revolution that Made Computing Personal. New York: The Penguin Group.
Tags: AN/FSQ-7 > ANSFQ-32 > ARPA > batch processing > J.C.R. Licklider > SAGE (Semi-Automatic Ground Environment) > Systems Development Corporation (SDC)
ICT and Sustainable Development: Some Origins
Posted on | August 17, 2021 | No Comments
I teach a course called ICT for Sustainable Development (ICT4SD) every year. It refers to information and communications technologies (ICT) enlisted in the service of cities, communities, and countries to help them become economically and environmentally healthy. An important consideration for sustainability is that current activities do not compromise the conditions and resources that future generations will need. Sustainable Development (SD) is an offshoot of traditional “development,” which dealt primarily with national economies organizing to “take off” into westernized, pro-growth, industrial scenarios, with some consideration of the colonial vestiges they needed to address.
While development was also cognizant of the need to support agriculture, education, governance, and health activities, SD put a major focus on related environmental issues and social justice. (See Heeks) SD has been embraced by the United Nations (UN), which issued seventeen Sustainable Development Goals (SDGs) that were adopted by all UN member states in 2015.
In this post, I briefly introduce ICT4D and its connection to SD, how it emerged, and why it is beneficial. Of particular importance are the economic benefits of ICT and recognizing those same benefits in the renewable energy technologies so crucial to sustainable development.
ICT was not well understood by development economists and was largely ignored by funding agencies, except for telephone infrastructure. Literacy and education were early concerns. Book production, radio, and then television sets were monitored as crucial indicators of development progress. Telephones and telegraphs helped transact business over longer distances but were installed and managed by government agencies called Post, Telephone, and Telegraph (PTT) administrations. PTTs found funding difficult and were challenging to manage, given their technical complexity and enormous geographical scope. Satellites were used in some countries, such as India and Indonesia, and facilitated better mass communications as well as distance education and disaster management.
Most of the economic focus in “developing countries” was on the extraction and growing of various commodities, utilizing low-cost labor for manufacturing, or adding to the production processes of global supply chains. It was only when television and films became important domestic industries that “information products” were recognized economically in the development process.
New dynamics were introduced to development and economic processes with computerization and ICTs. I began my career as an intern on a National Computerization Policy program at the East-West Center in Honolulu, Hawaii. Inspired by the Nora-Minc Report in France, it was part of the overall emphasis on development at the Center’s Communications Institute. I had an office next to Wilbur Schramm, one of the most influential development pioneers with his Mass Media and National Development: The Role of Information in the Developing Countries (1964).[1]
With my mentor, Syed Rahim, I co-authored Computerization and Development in Southeast Asia (1987), which serves as a benchmark study in understanding the role of ICT in development. One objective of the book was to study the mainframe computers that were implemented, starting in the mid-1960s, for development activities. These “large” computers, some with merely 14K of RAM, were installed in many government agencies dealing with development activities: agriculture, education, health, and some statistical organizations. We also looked at the narratives being created to talk about computerization at that time. For example, the term “Information Society” was becoming popular. Also, with the rise of the “microcomputer” or personal computer (PC), the idea of computer technology empowering individuals was diffusing through advertisements and other media.
Information economics opened up some interesting avenues for ICT4D and sustainable development. Initially, it was concerned with measuring different industrial sectors and how many people were employed in each area, such as agriculture, manufacturing, information, and services. Fritz Machlup wrote The Production and Distribution of Knowledge in the United States (1962), which showed that information goods and services accounted for nearly 30 percent of the U.S. gross national product. A major contributor to information economics, he concluded that the “knowledge industry” employed 43 percent of the civilian labor force.
Machlup was also a student of Ludwig von Mises, a central figure in what is known today as the “Austrian School of Economics.” But Machlup was soon overshadowed by fellow “members” Friedrich von Hayek and Milton Friedman, and by the resurgence of von Mises himself. While this debate was primarily directed against mainstream Keynesian economics, it was also significant for development studies, as these economists saw government activities as running counter to the dynamics of the market. The main nemesis of the Austrian school was socialism and government planning. While most developing countries were not communist, the Cold War was a significant issue playing out in countries worldwide.
The Austrian movement had a significant impact in the 1970s and 1980s. Transactions in the economy were seen as knowledge-producing activities, and these economists focused on the use of prices as communication or signaling devices in the economy. This led to a new emphasis on markets, and both Hayek and Friedman received Nobel Prizes for their work.
For context, President Nixon had taken the US off the gold standard in August 1971, and the value of the US dollar dropped sharply. But currency markets were then free to operate on market principles. It was also a time when the microprocessor was invented and computers were becoming more prominent. In 1973, Reuters set up its Money Monitor Rates, the first virtual market for foreign exchange transactions. It used computer terminals to display news and currency prices and charged banks both to subscribe to the prices and to post them. With the help of the Group of 5 nations, it brought order to international financial markets, especially after the Arab-Israeli War broke out in late 1973. The volatility of the war ensured the economic success of the Reuters technology, and currency markets have been digitally linked ever since.
Many development theorists by that time were becoming frustrated by the slow progress of capitalism in the “Third World.” Although the Middle East war was short, it resulted in rising oil prices around the world. This was a major strain on developing countries that had bought into mechanized development and the “Green Revolution” of the 1960s, which emphasized petroleum-based fertilizers and pesticides. The Arab-dominated Organization of Petroleum Exporting Countries (OPEC) began an embargo of western countries for their support of Israel, which refused to withdraw from the occupied territories. Oil prices increased by 70 percent, and the US suffered additional setbacks as it ended the war in Vietnam and inflation raged.
A split occurred between traditional development studies and market fundamentalists. British Prime Minister Margaret Thatcher and US President Ronald Reagan were strong advocates of the Austrian School. Both had been taken by Hayek’s The Road to Serfdom (1944) and stressed a pro-market approach to development economics. The IMF was mobilized to pressure countries to undergo “structural adjustment” towards more market-oriented approaches to economic development. The PTTs were a primary target, and investment strategies were utilized to turn them into state-owned enterprises (SOEs), with parts sold off to domestic and international investors.
Researchers began to focus on the characteristics or “nature” of information. As economies became more dependent on information, more scholarship was conducted. It became understood that information is not diminished by use or by sharing, although its value certainly varies, often with time. The ability to easily share information by email and FTP created interest in network effects and the viral diffusion of information.
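As a rough illustration of the network-effect intuition (not a precise valuation model), the Metcalfe-style rule of thumb below counts the potential connections among n users, which grows much faster than n itself; the user counts are invented.

# Metcalfe-style rule of thumb: potential links among n users = n*(n-1)/2.
# Purely illustrative; real network value is much harder to measure.
def potential_links(n):
    return n * (n - 1) // 2

for users in (10, 100, 1000):
    print(f"{users:>5} users -> {potential_links(users):>7} potential connections")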
These characteristics of information became particularly important after the development of the Internet, which quickly globalized. Vice-President Gore’s Global Information Infrastructure (GII) became the foundation for the World Trade Organization’s Information Technology Agreement (ITA) and the privatization of telecommunications services. Tariffs on information and communications technologies decreased significantly. Countries that had gotten into debt in the 1970s were pressured into selling off their telecommunications infrastructure to private interests, and they quickly adopted the Transmission Control Protocol and Internet Protocol (TCP/IP).
Other studies focused on the efficiencies of production brought on by science and technology, specifically reducing the marginal costs of producing additional units of a product. Marginal costs have been a major issue in media economics because electronic and then digital technologies have steadily increased the efficiency of reproducing these types of products. Media products have historically had high initial production costs but decreasing marginal costs for the “manufacture” or reproduction of each additional unit.
If we start with books, for example, we know it is time-consuming to write a book, and the first physical copies are likely to be expensive, especially if only a small number are actually printed. But as traditional economies of scale are applied, each additional book becomes cheaper to produce. Electronic copies in particular have become very cheap to produce and even to distribute through the Internet, although that hasn’t necessarily resulted in major price decreases.
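A back-of-the-envelope sketch with made-up numbers shows the arithmetic: average cost per copy falls as the print run grows, and nearly collapses for digital copies whose marginal cost approaches zero.

# Illustrative figures only: a fixed cost of writing and editing a book,
# plus a per-copy ("marginal") cost of reproduction.
FIXED_COST = 50_000        # authoring, editing, typesetting
MARGINAL_PRINT = 4.00      # paper, printing, binding per copy
MARGINAL_DIGITAL = 0.01    # bandwidth and storage per download (near zero)

def average_cost(copies, marginal):
    # Average cost per copy = (fixed + marginal * copies) / copies
    return (FIXED_COST + marginal * copies) / copies

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} copies: print ${average_cost(n, MARGINAL_PRINT):.2f}"
          f"  digital ${average_cost(n, MARGINAL_DIGITAL):.2f}")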
Digital outputs are generally unusual economic products. They have characteristics that make it difficult to exclude people from using them, and they are not used up in consumption. Microsoft faced this problem in the early days of the microcomputer when it was getting started and criticized computer hobbyists for sharing cassette tapes of its programs. Later, its investment in the MS-DOS operating system and subsequently Windows paid off handsomely when it was able to sell them at enormous margins for IBM PCs and then “IBM compatibles” such as Acer, Compaq, and Dell. That is how Bill Gates became the richest man in the world (or one of them).
The issue of marginal costs has resonated with me for a long time, due to my work on media economics and what economists call “public goods.” In some of my previous posts, I addressed the taxonomy of goods based on key economic characteristics. Public goods, such as digital and media products, are misbehaving economic goods in that they are not used up in consumption and are difficult to exclude from use. These writings examined what kinds of products are conducive to reduced marginal costs and what social systems are conducive to managing these different types of goods. Originally, the focus was more on media products like film, radio, and television, and then on digital products like games and operating systems. Will these efficiencies apply to sustainable development?
Can the economics of media products apply to other products? More recently, sustainable technologies like solar and wind have been examined for their near-zero marginal costs. A major voice on this topic is Jeremy Rifkin, who is most noted for his book The Third Industrial Revolution (2011), which refers to the importance of concurrent communications, energy, and transportation transitions. We have moved from an integrated political economy based on telephone/telex communications and carbon-combustion energy and transportation to one based on digital communications and clean energy. Two other books by Rifkin, The Zero Marginal Cost Society and The Green New Deal, are significant points of departure for sustainable development.
Sustainable development initiatives by definition look to economize and reduce costs for the future. It is important to analyze the characteristics of economic goods and their social implications, as this level of understanding clarifies market structures and the types of regulation they require.
ICT4D has struggled to claim a strong narrative and research stake in the trajectory of development. The Earth Institute’s ICTs for SDGs: Final Report: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals (2015) and the World Bank’s (2016) World Development Report were significant boosts for ICT4D, especially for economic development, and the move towards sustainable development.
Citation APA (7th Edition)
Pennings, A.J. (2021, Aug 17). ICT and Sustainable Development: Some Origins. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/ict-and-sustainable-development-marginal-costs/
Notes
[1] Schramm, W. (1964) Mass Media and National Development: The Role of Information in the Developing Countries. Stanford: Stanford University Press.
[2] Sachs J et al (2016) ICT & SDGs: How Information and Communications Technology can Accelerate Action on the Sustainable Development Goals. The Earth Institute: Columbia University. Accessed at https://www.ericsson.com/assets/local/about-ericsson/sustainability-and-corporate-responsibility/documents/ict-sdg.pdf. 15 Jan 2019
Tags: computerization > marginal costs > Network effects > Nora-Minc Report > Wilbur Schramm
Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains
Posted on | August 12, 2021 | No Comments
Are we ready for the age of diamonds? Instead of mining gold or even Bitcoin for a currency base, why not use diamonds from captured carbon? And why not use recaptured carbon from the atmosphere via hemp? Instead of “conflict diamonds,” why not have “climate diamonds?”
Can we move to a diamond standard? Can diamonds replace gold and back other currencies? Can this be done in an economical and sustainable process? This post examines the processes of biological carbon sequestration and the manufacture of diamonds that can be used in a fintech environment. While I’m generally ok with the prospects of fiat money, if a hard money alternative emerged that was dynamic and contributed to climate security, why not try it?
The process of creating “industrial diamonds” is well established and has produced impressive results. Diamonds can be “grown” using small ones as seeds and adding to the crystalline structure in two ways. One uses superheated “greenhouses” with pressurized methane and hydrogen. The other uses CVD (chemical vapor deposition), a low-pressure, vacuum-based system that uses heat or microwaves to bond carbon-rich gases to the diamond seeds.
I wrote my Ph.D. dissertation about money and standards, so I’ve been thinking about this topic a lot. In Symbolic Economies and the Politics of Global Cyberspaces (1993), I examined the social forces that drive us to use general equivalents like money and the forces that establish monetary standardization in a digital environment. So, I’m not entirely convinced of my argument so far, but I want to consider available systems of regenerative agriculture and manufacturing that can be mobilized for making climate diamonds and tie them into newer generation cryptocurrency developments.
I was intrigued by a video from “Have a Think.” It presents an intriguing industrial scenario for growing and using hemp to sequester CO2 from the air and produce several very valuable byproducts, one of which can be transformed into sparkling diamonds. Hemp is controversial due to its connection with THC, a psychoactive substance, but that is only present in certain strains. Hemp has a rich history and was particularly important for the ropes needed by sea-faring ships that relied entirely on wind.
Hemp is a dynamic plant that grows quickly. In the process, it can produce several types of industrial products, including lubricants for cars and wind turbines as well as ingredients for cosmetics, soaps, and printer ink. Its fibers can also be made into products like cloth, paper, and rope. The seeds have positive nutritional benefits and may be a replacement for soy protein. Hemp can also be processed to produce biochar, a type of charcoal used for fertilizers, graphene products, and the manufactured diamonds mentioned extensively in this post.
Agriculture is going through a technological transformation, with increased use of big data, hydroponics, and robotics. Hydroponics is a way of growing plants in a water-based medium in a protective environment. Hemp farms can remove significant amounts of carbon dioxide (CO2) from the air and produce the oils and fibers mentioned above in a clean and economical way. Biological carbon sequestration is probably better than geologic sequestration, which injects carbon into underground porous rock formations. Both may be necessary for reducing climate threats, but sequestration is better done by plants that can produce many valuable byproducts.
Can we create a new monetary standard based on climate diamonds? Is it feasible? Is that something we want to do? Globally, we have been on an information standard since the 1970s, anchored by the strength of the US dollar and hedged by multiple financial instruments worldwide. The new diamond market will likely grow within the cryptocurrency environment.
Much of this depends on the future of cryptocurrency platforms and the digital ledger systems emerging in new generations. Cardano’s blockchain platform, for example, is evolving to create a peer-to-peer transactional system to trade many types of value, such as labor and portions of investment vehicles like houses and art. Imagine, for example, going to Home Depot and buying your gardening supplies with informational “deeds” to fractions of your car, your house, your future work, or your diamonds. Diamonds are likely to be another “value” in a chain of intersecting commodities classified on blockchains with dynamic pricing and smart contracts.
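As a thought experiment only, the sketch below models those fractional “deeds” as a simple in-memory ledger. It is not how Cardano or any actual blockchain implements tokens or smart contracts; the asset, owners, share counts, and the single “contract” rule are all invented for illustration.

# A toy ledger of fractional "deeds" to an asset (house, car, diamond).
# An illustration of the idea only, not a real smart contract.
from dataclasses import dataclass, field

@dataclass
class FractionalAsset:
    name: str
    total_shares: int
    holdings: dict = field(default_factory=dict)  # owner -> shares held

    def issue(self, owner, shares):
        self.holdings[owner] = self.holdings.get(owner, 0) + shares

    def transfer(self, sender, receiver, shares):
        # The "contract" rule: you can only spend shares you actually hold.
        if self.holdings.get(sender, 0) < shares:
            raise ValueError("insufficient shares")
        self.holdings[sender] -= shares
        self.holdings[receiver] = self.holdings.get(receiver, 0) + shares

diamond = FractionalAsset("climate diamond #1", total_shares=1000)
diamond.issue("alice", 1000)
diamond.transfer("alice", "home_depot", 25)  # paying with a sliver of an asset
print(diamond.holdings)  # {'alice': 975, 'home_depot': 25}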
Diamonds have utility based on their beauty but also their durability and strength. Most notable is the sparkling effervescence that makes diamond jewelry a treasured symbol of relationship and commitment. Their crystalline structure is used in high-tech products like audio speakers, as it can vibrate rapidly without deforming or degrading sound quality. Their high heat and voltage tolerance makes diamond-based microprocessors an increasingly viable component of digital technologies.
Their hardness also makes them extremely valuable for drill bits. They have practical uses in delicate drilling for art, dentistry, and manufacturing. With a melting point of around 3550 degrees Celsius, they have the durability to drill industrial metals, geothermal wells, and underground tunnels.
Diamonds can also be money. They are portable, durable, measurable, and difficult to counterfeit. For diamonds, size matters, although color and clarity matter as well. Bigger is better, and they pack a lot more value per unit of size than gold. Gold also has higher storage costs that quickly eat up any gains from price appreciation. Granted, neither diamonds nor gold are generally used as currency, primarily because they lack acceptability by merchants and interoperability between financial systems. Try cashing in your diamonds at Home Depot.
That is why the cryptocurrency environment is the most likely solution to that problem, barring an economic collapse. While the dollar is going digital, officially as a CBDC (Central Bank Digital Currency), it will not be replaced by Bitcoin or other “cryptocurrencies” like Ethereum and Litecoin. Instead, other “values” will line up in relation to the dollar. The Fed will regulate but protect cryptocurrencies because it knows that the financial system wants to trade anything, anywhere, anytime. So cryptocurrencies will survive, and diamonds will find their place within their ethereal block-chained cyberspace. In the future, who knows?
This is an exploratory essay stimulated by the “Have a Think” video but also shaped by my interest in fintech, monetary policy, and cryptocurrencies. The reintroduction of hemp in the modern economy and its potential for absorbing carbon dioxide can be a powerful addition to the global economy. It’s not entirely about taking the CO2 out of the air and bonding the carbon to diamonds. Rather, I am hopeful that the green manufacturing of diamonds will help incentivize and stimulate an industrial process with multiple benefits for the economy and the environment.
Citation APA (7th Edition)
Pennings, A.J. (2021, Aug 12). Diamonds are a World’s Best Friend? Carbon Capture and Cryptocurrency Blockchains apennings.com https://apennings.com/dystopian-economies/electric-money/diamonds-are-a-worlds-best-friend/
Tags: Carbon Vapor Deposition (CVD) > climate diamonds > conflict diamonds > hemp > hydroponics > industrial diamonds
Modern Monetary “Practice” and the Fear of Inflation in a Low-Supply Economy
Posted on | August 1, 2021 | No Comments
One of America’s most potent political myths is that you, as a citizen, pay for government spending. People talk about paying for this or that program, but what really happens is that Congress appropriates the money for spending. Then the Treasury instructs the Federal Reserve Bank to credit the spending accounts. Taxes and borrowing are separate entries. The issue is gaining scrutiny as the US economy reconciles 2020’s economic output slowdown due to COVID-19 with record government spending.
A relatively new area of economic analysis, called Modern Monetary Theory (MMT), emerged from practitioners in the finance industry telling a different story about government spending. It is worth examining because it is based on the practices of financial traders, particularly those who work with government bonds. Warren Mosler was critical in formulating MMT and shedding light on the actual processes involved in government spending. Thus my emphasis on Modern Monetary “Practice,” as it starts with this description of the spending process, which allows us to reframe its dynamics.
Spending should not be seen as a panacea for the economy. Spending can be wasteful and lead to inflation. Spending needs to be productive. The $28 trillion debt accumulated by May 2021 is worthy of monitoring, but what does it really mean? What are its implications?
Taxes are registered by the government, yes, but it’s not like household economics. A household needs a breadwinner, someone to bring home the bacon, to load up the metaphors. Someone needs to have money to pay the bills. Governments operate under a different set of rules and responsibilities. They can print or mint minor amounts of money and use the Fed for larger quantities. Government provides the money for the economy to operate and the incentive – taxes – to make people want to own it. Mosler argues that governments have a monopoly on their currency and the responsibility to get it into the economy, by spending, to enable markets to work.
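A stylized sketch of that sequence, with invented figures and a single aggregate “reserves” number standing in for real central-bank accounting: spending credits bank reserves first, and taxes drain them as a separate, later entry, rather than the household-style “earn first, then spend” flow.

# Stylized single-number "ledger" for illustration only.
reserves = 0  # aggregate bank reserves held at the central bank

def government_spends(amount):
    # Treasury instructs the central bank to credit recipients' banks.
    global reserves
    reserves += amount

def taxes_collected(amount):
    # Tax payments drain reserves back out of the banking system.
    global reserves
    reserves -= amount

government_spends(100)  # the appropriation is spent first...
taxes_collected(80)     # ...taxes are a separate, later entry
print(f"net reserves added to the economy: {reserves}")  # 20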
Central banks can purchase bonds and, quite frankly, whatever else they want to buy. The Fed traditionally bought only government treasuries but now regularly buys mortgage-backed securities in a process called quantitative easing. Ideally, it can sell these treasuries and securities to absorb money from the economy if it smells inflation. The banking sector also creates money when it loans money to consumers.
The Treasury auctions bonds, but do they pay for spending? They essentially provide:
- Time deposits for investors;
- Hedge instruments for traders;
- Opportunities for foreign countries to keep their currencies cheap versus the dollar;
- A vehicle for the Fed to influence the money supply and coordinate interest rates.
Borrowing should be seen as a political strategy to keep the financial system secure, provide a stable hedge, and manage the dollar’s value.
So, rather than worrying about “paying” for something, US citizens should be active in deciding how taxes should be used in public policy. US tax policy should be designed to tax what the country doesn’t want. Well, that isn’t going to be easy. But it is what democracy is about. Spending should also be determined by what will keep the US safe and secure. It should keep the economy productive while providing opportunities and avoiding excessive inflation.
This last point is important. Inflation is the primary limiting factor when it comes to spending and is a calculus between supply and “effective” demand. “Too many dollars chasing too few goods” is the standard explanation by economists. Spending is easier for a government to coordinate than supply. A good example occurred during the COVID-19 pandemic, when the US government passed several emergency spending packages to support businesses and families, especially airlines hurt by the shutdown in travel. While the economy skyrocketed due to the fiscal and monetary stimulus, slowdowns in production, disruptions of supply chains, and people staying at home caused a significant spike in inflation in early 2021.
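A rough way to see “too many dollars chasing too few goods” is the textbook equation of exchange, MV = PQ. The numbers below are invented: if the money stock and its velocity hold roughly steady while real output falls, the implied price level has to rise.

# Equation of exchange: M * V = P * Q  =>  P = (M * V) / Q
# All figures are invented for illustration.
M, V = 1000, 2.0               # money stock and velocity of circulation
Q_before, Q_after = 500, 400   # real output falls (supply-chain disruption)

P_before = (M * V) / Q_before  # 4.0
P_after = (M * V) / Q_after    # 5.0
print(f"implied price level rises {100 * (P_after / P_before - 1):.0f}%")  # 25%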
Inflation in the US had been largely absent since Nixon took the country off the gold standard in the early 1970s. At that time, the dollar’s value dropped, and OPEC countries restricted oil production, wanting to drive up prices to make up for the diminishing value of the US greenback. Meanwhile, lacking banking systems due to Islamic restrictions on credit, they recycled US dollars through a global eurodollar system. Called “petrodollars,” these funds were used by banks worldwide to coordinate syndicated loans for countries needing dollars for energy purchases and development projects. “Economic hit men” scoured the globe and pressured countries to borrow the money, eventually creating what was called the “Third World Debt Crisis” in the early 1980s.
Since the 1980s, financialization and the commercialization of Cold War technologies created sufficient competition and disruption to keep prices down. Primarily information technologies, they increased productivity, reducing labor and resource costs. Globalization also created new forms of interstate competition and cooperation, as supply chains supported innovation and higher-quality products. The US government also floated bonds internationally to countries like England and Japan, which strengthened the dollar and kept it the world’s reserve currency.
The COVID-19 pandemic presents unprecedented economic challenges, particularly with its 2021 resurgence as the delta variant. A rising stock market saw the DJIA hit 35,000 and the S&P 500 hit 4,395.26, with a market capitalization of US$38.2 trillion. But concerns about inflation grew as the prices of commodities such as copper, lumber, and oil increased. A computer chip shortage also raised concerns about the production and pricing of cars, computers, and other goods dependent on microprocessing capability. But the Fed and others saw this phenomenon as “transitory,” citing disruption, demographics, debt, and productivity as factors that would reduce inflationary pressures.
So the economy looks to be at risk in late 2021. Will the practical application of MMT provide operational guidance for a new era of prosperity? Can infrastructure and climate change solutions provide sufficient returns on these investments? The big question is whether government spending on such programs can avoid significant inflationary pressures. With COVID, we are struggling with how to spend in a low-output economy.
Tags: inflation > Modern Monetary Theory (MMT) > petrodollars