Anthony J. Pennings, PhD

WRITINGS ON DIGITAL STRATEGIES, ICT ECONOMICS, AND GLOBAL COMMUNICATIONS

The International Politics of Domain Name Governance, Part Two: ICANN and the Clinton-Gore Administration

Posted on | November 12, 2020 | No Comments

This post is the second in a series about the global politics of domain name registration and management. Domain names are critical identifiers of web resources that make them easy for users to access. Part One, about Jon Postel, discussed the heroic but ad hoc process of managing addresses in the earliest days of the Internet. As the World Wide Web (WWW) emerged with the invention of the hypertext transfer protocol (HTTP), management of the domain name system became crucial for e-commerce. It also became a controversial issue in international politics. The Clinton-Gore administration saw the Internet as a major opportunity but also as a tricky infrastructure to manage, with complications involving other countries.

The monopoly for the domain name registry system was turned over to InterNIC (Internet Network Information Center) in 1995. It was operated by Network Solutions, a subsidiary of Science Applications International Corporation (SAIC), a private company heavily engaged in work for the Pentagon and the National Security Agency (NSA). Led by a board of ex-NSA, CIA, and DoD officials, the company made money from issuing customized Internet addresses.

These domain names became very valuable as the WWW and its “dot.com” economy started to expand rapidly in the mid-1990s. The commercialization of the NSFNet in 1992 and the introduction of the Mosaic browser in 1993 spread the hope of a “new economy.” In 1995, the highly successful Netscape IPO, built on another successful browser, unleashed new investment in high technology and Internet stocks.

The Clinton-Gore administration became particularly aggressive in creating the Internet’s policy framework for domestic and international expansion and commerce. Branded first as the National Information Infrastructure (NII) in 1993 and then as the Global Information Infrastructure (GII) in 1994, the new vernacular introduced by Vice President Gore allowed for a government interventionist approach. The GII was a conceptual framework that challenged telecommunications companies worldwide to pave the way for data communications and all the related services promised by ISDN.

At home, they pushed an enabling framework for the NII that encouraged private investment, promoted and protected competition, and provided open access to the Internet for consumers and service providers. This approach also emphasized advancing universal service to avoid the digital divide – a society of information “haves” and “have nots.”

Internationally, their work to set up the World Trade Organization (WTO) facilitated the modernization of telecom networks worldwide and broke down tariff barriers to global information technology trade. In his 1994 speeches to the ITU and the GATT (General Agreement on Tariffs and Trade), Gore set up the conditions for the World Wide Web we know today. He traveled to Marrakesh, Morocco, and at the closing meeting of the Uruguay Round of the GATT negotiations called for the creation of a World Trade Organization.

A world trade organization had been one of the original objectives of the New Deal-era Bretton Woods agreements at the end of World War II but never received US Congressional approval. On November 29, 1994, however, a bipartisan vote in the House of Representatives moved the implementing bill to the Senate, which approved it 76-24 on December 1. The WTO would quickly conclude two historic agreements that liberalized global trade in information technology (1996) and telecommunications services (1997).

In 1996, Ira Magaziner set up an interagency group to study domain names as part of his responsibility for international trade in the Clinton-Gore administration. Magaziner’s position paper, “A Framework for Global Electronic Commerce,” was announced by President Clinton and Vice President Al Gore at a public event on July 1, 1997. It became the basis for the administration’s policy of managed liberalization for e-commerce and for the management of the domain name system.

The Clinton-Gore administration wanted to hold off efforts by the United Nations and its International Telecommunication Union (ITU) to manage the Internet. They valued the international organizations but felt the Internet required a more dynamic organizational structure to facilitate its complex growth. Other nations, particularly the BRICs (Brazil, Russia, India, China), questioned the efficacy of US management of the World Wide Web. The US stood its ground, however, and staked its claim for control over the Internet.

Magaziner reflected on the problems facing a growing Internet at the time. Governments wanted to tax transmission bits, place tariffs on electronic commerce, and censor the Internet. Debates on digital signatures, regulating prices, and intellectual property (IP) issues such as domain name trademarks were also coming to the fore.

“For this potential to be realized fully,” the draft report stated, “governments must adopt a nonregulatory, market-oriented approach to electronic commerce, one that facilitates the emergence of a transparent and predictable legal environment to support global business and commerce. Official decision makers must respect the unique nature of the medium and recognize that widespread competition and increased consumer choice should be the defining features of the new digital marketplace.”

The ITU had been an essential “club” in which the world’s telecommunications agencies coordinated technical standards for telephony and electromagnetic spectrum allocations. But because it was a one country, one vote organization, the U.S. was vulnerable to ITU decisions. And that meant its businesses were vulnerable too. On May 1, 1997, eighty organizations signed a Memorandum of Understanding (MoU) addressing the way generic Top-Level Domains (gTLDs) were allocated and managed. The MoU had been drafted by the International Ad Hoc Committee (IAHC), established to address perceived problems with the existing method of registering generic top-level domains on the Internet.

In July 1997, President Clinton directed the Commerce Department to privatize domain name management, and in September 1997, Network Solutions (NASDAQ: NSOL) had an initial public offering (IPO) and became a public company. In the first five months of 1998, Network Solutions Inc. (NSI) registered more than 340,000 domain names, an increase of 73 percent over the same period in 1997.

But the company was overwhelmed by the extraordinary growth of the Internet. Its registration and billing systems could not keep up with the volume of domain name requests. NSI was losing its near-monopoly over the domain name business, and the company began preparing for a new competitive environment. Still at issue was whether Internet oversight would eventually move from U.S. control to an international body.

In late 1998, the Clinton-Gore administration introduced a new domain name framework to encourage competition and effectively manage the DNS. The U.S. Department of Commerce took ownership of the process. Ira Magaziner and others helped design a new organization, the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN was created as a not-for-profit company to administer the Internet name and address system and to help set its policy from the bottom up.

ICANN received preliminary approval from the Commerce Department to manage the Internet domain name system (DNS) in November 1998. The two organizations signed a Memorandum of Understanding (MOU) that provided for the gradual privatization of DNS management. This involved deploying a network of database servers worldwide to keep track of IP addresses and quickly connect domain names to the requested sites. A dispute resolution system was also set up to resolve issues regarding the ownership of domain names.

In the next post, I explore ICANN’s transition to global multistakeholder community management and the end of oversight by the Commerce Department’s National Telecommunications and Information Administration (NTIA).

Notes

[1] Drezner, D. (2004). The Global Governance of the Internet: Bringing the State Back In. Political Science Quarterly, 119(3), 477-498. doi:10.2307/20202392
[2] Drezner, Daniel W. All Politics Is Global: Explaining International Regulatory Regimes. Princeton, N.J.: Princeton U, 2008. Print. Chapter on “Global Governance of the Internet.” http://press.princeton.edu/titles/8422.html


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Russia and the Era of Pan-Capitalism

Posted on | October 9, 2020 | No Comments

The fall of the Berlin Wall and the disintegration of the Union of Soviet Socialist Republics (USSR) and its Communist bloc meant the world was no longer significantly divided by Cold War antagonisms. Communist China had already embraced market dynamics and global trade, and a pan-capitalist condition of free flows of information and money was spreading globally. This post briefly discusses the breakup of the USSR and the globalization of digital capitalism.[1]

Although it led a Communist bloc, the USSR had become deeply indebted to the global banking system, exerting additional pressure on an economy already addicted to military spending. The term “Eurodollar” reportedly gets its name from a Soviet-owned bank in France that handled dollars held outside the US after the Chinese Communist Revolution in 1949. The Soviets placed their dollars in the Paris-based Banque Commerciale pour l’Europe du Nord, whose cable address happened to be “Eurobank,” and in the Moscow Narodny Bank in London. These deposits were soon traded by other European banks and purportedly took the name “Eurodollar” from the Paris cable address.

As mentioned in my earlier work on digital monetarism, Eurodollars were the prime credit vehicle for recycling OPEC’s petrodollars worldwide during the 1970s. Through syndicated lending, banks lent Eurodollars excessively to many nation-states, including those in the Soviet bloc.

The result of petrodollar recycling was the “Third World Debt Crisis,” which created havoc throughout the 1970s and into the 1980s. Debt put pressure on public resources, which were often transformed into state-owned enterprises (SOEs) and eventually sold off to investors. Excessive debt led to an era of imposed deregulation, liberalization, and privatization. Although painful, it opened up the telecommunications world to the Internet and its World Wide Web applications.

The “official” start of the world debt crisis can be traced to the March 1981 announcement by Poland that it could not meet its payment commitments. Previously, banks had found it easy to reschedule old debt and lend debtor countries new Eurodollars. An “umbrella theory” circulated which held that the Soviet Union would guarantee the debts of the countries in its sphere. But the USSR was having economic problems that went unnoticed by the banks. It still retained a high credit rating, and the banks continued to pour money into it.

By 1984, the Communist bloc was gathering significant debt and the economy was faltering, largely due to the drop in oil prices. Defecting spies were reporting that the USSR was a mess. Workers were unmotivated because the store shelves were empty. Lines to purchase scarce goods were everywhere. Nearly half the Soviet economy was devoted to military spending, with much of the rest producing shoddy and scarce consumer goods, determined and designed by Communist committees.

With Reagan’s “Star Wars” missile defense initiative, Communist Party General Secretary Mikhail Gorbachev knew that the USSR could not keep up with the capitalist world’s innovation and spending. He pleaded with Ronald Reagan at the Reykjavík Summit in 1986 to give up the militarization of space and instead work to reduce nuclear weapons. But Reagan refused. So Gorbachev instead began a public relations campaign to encourage more debate about the USSR and its political and economic future.

In June of 1989, Gorbachev made the call to Poland to tell its Communist party leaders to accept the results of their democratic election. The decision accelerated the processes of glasnost and perestroika throughout the Soviet system. The first term meant political openness – media freedom, democratization of power, and the release of political prisoners. The second meant gradually allowing entrepreneurial activities, reforming state-owned industries, and privatizing government assets.

Deng Xiaoping had already started to reform Communism in China during the 1980s with his “socialism with Chinese characteristics.” Deng and other post-Mao Communist leaders argued that “China had mistakenly entered into Communism during its feudal stage instead of waiting until advanced capitalism, as Marx had theorized. Private ownership and a market economy were suddenly embraced as solutions, not problems. This allowed the Chinese Communist Party to legitimize both its turn to market capitalism and the continuance of its political control over the country through Marxist ideology.”[2]

The fall of the Berlin Wall in 1989 began the process of dismantling the Warsaw Pact and, with it, the Soviet bloc. Started in 1955, the Warsaw Pact was initially a defense treaty among Albania, Bulgaria, Czechoslovakia, East Germany, Hungary, Poland, Romania, and the Soviet Union. But after East Germany left, the other countries clamored to leave as well.

Czechoslovakia broke from the Soviet orbit, and the Baltic states (Estonia, Latvia, and Lithuania) soon declared their independence from the USSR, along with Belarus and Ukraine. Some former republics joined Azerbaijan, Kazakhstan, Kyrgyzstan, Moldova, Turkmenistan, Tajikistan, and Uzbekistan to create the Commonwealth of Independent States (CIS).

President George H. W. Bush met with Gorbachev in early December 1989, just a month after Europe’s “9/11,” the November 9 dissolving of the barriers between East and West Germany. Meeting in Malta, they resumed START negotiations on nuclear arms control and came to an agreement on how conventional forces in Europe would be dismantled. Gorbachev’s decision to allow a multi-party system and presidential elections in Russia also began to destabilize Communist control and contributed to the collapse of the Soviet Union.

A coup was engineered by Communist hardliners in August of 1991, and although it failed, Gorbachev resigned by Christmas. But not before dissolving the Central Committee of the Communist Party of the Soviet Union and resigning as its General Secretary. Also in 1991, the Soviet military relinquished control over the militaries of the other republics. Russia agreed to take on the USSR’s debt, in excess of US$70 billion.

By promising glasnost and perestroika, Gorbachev changed the political dynamic of a dying system. It was the promise of a political and legal infrastructure for a democratic and market political economy integrated into the world system. The process accelerated in Russia with the election of Boris Yeltsin in 1991 as the first President of the new country. Yegor Gaidar, an economist known for pushing free markets, became the Prime Minister. Yeltsin worked with a group of opportunistic Russians to outmaneuver the Communist directors of the USSR economy and take control of major industries; many went on to become billionaires, the so-called “oligarchs.”

In the first few months of 1992, the new government freed prices and legalized entrepreneurial activity in a process called “shock therapy,” a rapid liberalization of the economy. By 1995, Yeltsin worked with Russian banks to raise cash to help privatize major companies. Under the Russian “loans for shares” program, the banks lent the government money in exchange for temporary shares in state-owned companies. When the government defaulted on the loans, the banks auctioned off major stakes in companies involved in aluminum, oil, nickel, and other important resources, as well as in food production, telecommunications, and media.

The end of the Warsaw Pact signaled a new liberalization in Moscow and the former satellite countries of the USSR. However, the optimism was displaced by economic “shock therapy” – severe austerity and privatization that crippled economic recovery. The resultant chaos led to the return of the Russian “strong man.” Vladimir Putin became President of Russia in 1999.

Notes

[1] I use the term digital capitalism here because many parameters operate to shape varieties of capitalism.
[2] Pennings, A. (2014, April 22). E-Commerce with Chinese Characteristics. Retrieved from http://apennings.com/global-e-commerce/china-e-commerce-ipo/


The International Politics of Domain Name Governance, Part One: The Czar

Posted on | October 1, 2020 | No Comments

An ongoing global concern deals with the control and management of the World Wide Web’s Domain Name System (DNS). The basic issue is whether the address you put into your browser connects to the right computer and retrieves the right information. When you type in apennings.com, for example, how does it get to my blog, and how can you be sure you are getting the right site? What if you typed in google.com and it directed you to yahoo.com? Or if you typed in Amazon.com and it directed you to a site for Barnes and Noble or some other bookseller? These scenarios are possible if the domain name system is not managed correctly.

The Domain Name System (DNS) is a distributed directory service that matches website addresses to the numerical addresses of the right computers. As the Internet has grown exponentially and globally, governance and management issues continue to be complicated and contentious. What is at stake? In this post, I look at the beginnings of the DNS and the influence of Jon Postel.
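The name-to-address matching described above can be tried from any networked machine; here is a minimal sketch using Python’s standard library (the address returned for any real domain will vary):

```python
import socket

def resolve(hostname):
    """Ask the operating system's DNS resolver for the IPv4 address
    behind a hostname, just as a browser does before connecting."""
    return socket.gethostbyname(hostname)

# The loopback name is a stable example; real domains return whatever
# address their operators have published in the DNS.
print(resolve("localhost"))  # 127.0.0.1
```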

It was recognized early on that managing Internet addresses would be a global concern. Internet traffic was increasing domestically and across borders. Decentralization provided the technical and operational strategy to globalize the Internet and its World Wide Web (WWW). It would provide quicker responses and decrease network traffic congestion. Maintenance issues, including redundancy and backing up systems, were easier to manage. Globalization of the Internet, however, raised other issues.

Daniel Drezner identified three reasons to be concerned about the governance of the Internet. The first was that an “actor” such as a government, corporation, or NGO could take over the Internet; any actor that could benefit from controlling the connections between users and the sites they want to visit would certainly warrant scrutiny. Second, it was important for a legal system to be created to ensure that trademarked names were not captured and monopolized by “cybersquatters,” who could withhold or exploit important trademarked names such as “mcdonalds.com” or “toyota.com.” Third, a lot of money was at stake in the creation of domain names: little cost is involved in producing them, so providing domain names is like printing money in some respects.[1]

So when you type in the address of the website you want to access, DNS makes sure you make the connection and find the right file. ARPANET, the original Internet that came to life in September 1969, first addressed the issue in the early 1970s. Jon Postel of the University of Southern California (USC) in Los Angeles took up the challenge and was eventually nicknamed “God” because of his power over the early Internet’s addressing system. Postel started by writing addresses on scraps of paper and would continue until a global network was established.

Postel’s influence ranged from the network’s inception to his death in 1998. On March 26, 1972, Postel started collating a catalog of numerical addresses like 123.47.17.49. He asked network administrators to submit information on socket numbers and network service activities at each host computer. He worked with the Stanford Research Institute (now SRI International) to develop a simple text file called HOSTS.TXT that tracked hostnames and their numerical addresses. Published as RFC 433 in December 1972, the list proposed a registry of port number assignments to network services. He also called himself the “czar” of socket numbers as he pledged to keep a list of all addresses. SRI would distribute the list to all Internet hosts.
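The HOSTS.TXT registry described above worked like a flat lookup table: one line per host, distributed to every machine on the network. A minimal sketch of that approach (the entries below are invented for illustration, not historical records):

```python
# A flat registry in the spirit of HOSTS.TXT: one "address hostname"
# pair per line. These entries are invented for illustration.
HOSTS_TXT = """\
10.0.0.51  USC-ISI
10.2.0.6   SRI-NIC
10.3.0.4   MIT-AI
"""

def parse_hosts(text):
    """Build a hostname -> numerical address table from the flat file."""
    table = {}
    for line in text.splitlines():
        if line.strip():
            address, hostname = line.split()
            table[hostname] = address
    return table

print(parse_hosts(HOSTS_TXT)["SRI-NIC"])  # 10.2.0.6
```

The weakness is obvious: every host needed a fresh copy of the whole file, which is exactly what the hierarchical DNS later replaced.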

The Domain Name System (DNS) was primarily designed by Paul Mockapetris of the Information Sciences Institute at USC. It was adopted by ARPANET in 1984. The DNS originally consisted of six generic Top-Level Domains (TLDs): .com (commercial), .edu (education), .gov (government), .mil (military), .net (network provider), and .org (organization). The designation of domain names below them, like hawaii.edu or ecommerce.gov, was left to the discretion of the administrators of the various networks. As the Internet expanded globally, individual countries were allotted two-letter suffixes such as .nl for the Netherlands, .nz for New Zealand, and .kr for South Korea. The first domain name was reportedly symbolics.com, registered through the DNS on March 15, 1985.
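The hierarchy described above is read right to left: the root delegates to a TLD like .gov, which in turn delegates to the registered name. A short sketch of that delegation order (the function name is my own, not part of the DNS specification):

```python
def resolution_order(domain):
    """List the zones consulted right to left, mirroring how DNS
    delegates from a TLD down to the registered name."""
    labels = domain.lower().split(".")
    zones = []
    for i in range(len(labels)):
        zones.append(".".join(labels[-(i + 1):]))
    return zones

print(resolution_order("ecommerce.gov"))  # ['gov', 'ecommerce.gov']
print(resolution_order("hawaii.edu"))     # ['edu', 'hawaii.edu']
```

This delegation is what let network administrators manage names below a TLD without any central coordination.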

In 1988, the U.S. gave the DNS contract to USC’s Information Sciences Institute (ISI). This gave Mockapetris and Postel the opportunity to continue working together and with SRI International. They carried on the functions of address management in what became known as the Internet Assigned Numbers Authority (IANA), which continues to this day. IANA was funded by the U.S. government under a contract with the Defense Advanced Research Projects Agency (DARPA). At this time, the Internet started to expand rapidly in the U.S. and abroad.

Notes

[1] Drezner, D. (2004). The Global Governance of the Internet: Bringing the State Back In. Political Science Quarterly, 119(3), 477-498. doi:10.2307/20202392


Oliver Stone’s Platoon: Jesus Christ Superstar vs. the Marlboro Man

Posted on | August 12, 2020 | No Comments

In Oliver Stone’s award-winning film, Platoon (1986), Charlie Sheen plays Chris Taylor, a “coming of age” infantry soldier trying to reconcile his identity between the influences of two sergeants in his US Army platoon. The setting is the Vietnam War circa late 1967. The sergeants, played by Willem Dafoe and Tom Berenger, were directed to represent two mythical poles of ideology and belief that have come to heavily influence American political culture. I refer to these two men and the contrasting themes they represent as “Jesus Christ Superstar” vs. “the Marlboro Man.”

Platoon (1986) won Best Picture at the 1987 Academy Awards and received additional awards for Best Film Editing and Best Sound, as well as a nomination for Best Cinematography. Oliver Stone won an “Oscar” for Directing and was nominated for Writing. Both actors playing the sergeants were nominated for Best Actor in a Supporting Role, which brings me back to the two conflicting myths.

With Sgt. Elias (Willem Dafoe) representing “Jesus Christ Superstar” and Sgt. Barnes (Tom Berenger) “the Marlboro Man,” the movie condenses a series of meanings into the two contending perspectives. These viewpoints divided America and haunt its view of the war to this day. Barnes characterizes the tension succinctly at one point: “there’s the way it ought to be, and there’s the way it is.” Barnes, who was shot seven times, including in the face, has the physical scars to represent “the way it is.”

Jesus Christ Superstar was a rock opera released as an album in 1970, staged on Broadway in 1971, and adapted as a movie in 1973. The film was shot in Israel and other Middle Eastern locations and was the eighth highest-grossing film of that year. It reconciled different gospels of the Bible and focused on the relationship between Jesus, Judas, and Mary Magdalene, emphasizing betrayal and picking one’s battles. It was in some ways an anthem of the time, as its rock and roll music and energy resonated with the “hippie” counterculture that emerged during the height of the Vietnam War. Jesus of Nazareth, with his long hair, certainly looked the part. Stone “paints” Elias and several other soldiers with iconography from the era. Peace symbols, headbands, drugs, and Sixties music like Jefferson Airplane’s “White Rabbit” are used to represent this counterculture.

It’s hard to portray Elias as a Jesus-like pacifist when he volunteered for three tours in “the Nam” and was a skilled and experienced soldier. But from the first scene, we see him carrying a machine gun on his shoulders like a cross and climbing up a mountainside like Jesus ascending Calvary. As Sergeant O’Neill from another squad says about Elias after an argument, “Guy’s in three years and he thinks he’s Jesus fuckin’ Christ or something.”

Elias is portrayed as the more sensitive leader. We next encounter him helping Chris and offering to carry much of the load from his amateurishly stuffed backpack. Most importantly, he is the voice of restraint when the platoon is searching a Vietnamese village for guns and ammunition. When Sgt. Barnes shoots a Vietnamese village woman during an interrogation, Elias confronts him and initiates a fistfight.

This scene creates tension between the two, as Barnes faces a potential court-martial for the murder. The conflict eventually ends with Barnes shooting Elias during a battle with the Viet Cong. The shots don’t kill him, though, and as the platoon is being evacuated by helicopters, he is sighted from the air being chased by Vietnamese troops. He is shot several times in the back but struggles to continue. Finally, as he falls to his knees, writhing in pain, a medium shot shows him with his arms outstretched and gaze toward the heavens, as if he were being crucified.

The Marlboro Man was another iconic figure of the Vietnam era. It became the masculine symbol of the famous cigarette brand. Invented to subvert the early impression that Marlboro cigarettes were for women, it successfully became the icon of rugged individualism and aggressive patriarchy. The first scene of Barnes shows him in the jungle with a pack of Marlboro cigarettes strapped to his helmet.

Barnes was clearly the leader of the platoon, as even the lieutenant deferred to his judgment. His first words in the movie were “Get a move on, boy” to Chris, in his Southern accent. He is regularly portrayed as the tough but competent, no-nonsense leader. At one point, while criticizing the pot smokers for what he calls their attempt to “escape reality,” he says, “I am reality.”

Oliver Stone served in Vietnam and was awarded the Bronze Star medal. The story was based roughly on his experience there. In Stone’s interview with Joe Rogan, he speaks to his respect for both sergeants. While Stone clearly favors Elias, his portrayal of Barnes is surprisingly sympathetic, and we see how both men influence Chris.

Chris arrives in Vietnam as a “cherry,” a virgin to the war experience. But after he recovers from being shot during their first ambush, he befriends a black man named King and a “surfer dude” from California named Crawford. They are all assigned to cleaning the latrines and the scene allows Chris to tell his story of why he quit college and enlisted in the Army. “I wasn’t learning anything. I figured why should just the poor kids go off to war and the rich kids always get away with it?” The others laugh off his naivety but invite him to the Feel Good Cave, a bunker where they “party” by playing music and smoking pot.

King introduces Taylor as the resurrected “Chris” to the “heads,” including those soldiers played by Johnny Depp and Forrest Whitaker. Elias is there smoking pot as well and welcomes Chris with a “hit” of marijuana blown through the barrel of a rifle. You can hear Grace Slick singing “feed your head” as Chris says he feels good and can’t feel the pain from his injury. Elias responds, “feeling good is good enough.”

Tom Berenger is masterful in his performance as Sgt. Barnes. While Elias is “partying” with the “stoners,” Barnes is listening to country music and playing cards while drinking whiskey and beer with his group. Later, after Elias is dead, Barnes goes to the Feel Good bunker to confront Elias’ friends in the platoon. With a bottle of Jack Daniel’s Tennessee whiskey in hand, he goes on to criticize the recently departed Elias.

    Elias was full of shit. Elias was a Crusader. Now, I got no fight with any man who does what he’s told, but when he don’t, the machine breaks down. And when the machine breaks down, we break down. And I ain’t gonna allow that in any of you. Not one.

The scene ends with Chris attacking Barnes, who quickly subdues the young soldier. Barnes is convinced not to kill him, as he would face ten years in military prison for killing an enlisted man.

In a later battle, the platoon is overrun by Viet Cong, and an airstrike is called in to bomb the camp. Barnes takes advantage of the chaos to try to kill Chris, but the sergeant is knocked out by the concussion of the bombing. Chris and Barnes barely survive. When Barnes asks Chris to get a medic, Chris shoots him in retaliation for Elias’ death.

As Chris is airlifted from the battleground, his voice-over narrates an inner conflict:

    I think now, looking back, we did not fight the enemy; we fought ourselves. And the enemy was in us. The war is over for me now, but it will always be there, the rest of my days as I’m sure Elias will be, fighting with Barnes for what Rhah called possession of my soul. There are times since, I’ve felt like the child born of those two fathers. But, be that as it may, those of us who did make it have an obligation to build again, to teach to others what we know, and to try with what’s left of our lives to find a goodness and a meaning to this life.


Memes, Propaganda, and Virality

Posted on | July 31, 2020 | No Comments

Today we are faced with a new and potentially ominous form of manipulation, an insidious form of propaganda dissemination. I’m talking about the viral spread of memes. The “meme” has emerged as a powerful communication device in the modern world of social media such as Facebook, Instagram, and Twitter. A meme here refers to a digital image, usually a JPEG or PNG, with a short text caption that is easily posted and shared. Usually, memes convey a joke or political message that can be diffused quickly and widely to a broad audience.

In this post, I examine the proliferation of memes and their potentially damaging effect on political culture. I discuss the rhetoric of memes and particularly the viral spread of memes as a new form of propaganda. Propaganda utilizes elements of powerful media techniques to have specific effects on the political and social consciousness of individuals.

What is propaganda? One of my colleagues at New York University had an apt description. Media ecologist Neil Postman called propaganda an “intentionally designed communication that invites us to respond emotionally, immediately and in an either-or manner.” Propaganda is the use of powerful rhetorical forms that work on the individual to energize, promote, stimulate, and determine ideologies. Propaganda can mobilize support for political action as well as pressure for the legislation and implementation of specific policies.

    Meme wars are a consistent feature of our politics, and they’re not just being used by internet trolls or some bored kids in the basement, but by governments, political candidates, and activists across the globe. Russia used memes and other social-media tricks to influence the US election in 2016, using a troll farm known as the Internet Research Agency to seed pro-Trump and anti-Clinton content across various online platforms. Both sides in territorial conflicts like those between Hong Kong and China, Gaza and Israel, and India and Pakistan are using memes and viral propaganda to sway both local and international sentiment. – Joan Donovan

What makes memes more insidious is that the propaganda is administered by some of a person’s most trusted friends. The goal of a meme is to spread rapidly through a population – to go viral. This is a diffusion process in which a meme is shared from person to person or person to group, often traveling across weak social links. The goal is to reach a point of exponential growth in the meme’s exposure and influence.

The success of virality depends on a high “pass-along rate,” where individuals are motivated to share the meme with others. A common communication network facilitates virality, as a meme may be “retweeted” in the Twitter environment or “shared” on Facebook. It helps if the process is easy – clicking a button rather than cutting and pasting. This sets the bar low, but sharing decisions are made very quickly and depend on relatively weak motivations.

An important measure is the virality rate: the number of people who went on to share your meme compared to the number of unique views or impressions it had during a certain period. You can get this metric by dividing the total number of shares of the meme by the number of impressions. Multiply that figure by 100 to get your virality rate percentage. The higher the percentage, the better the meme performed.
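As a quick sketch, the arithmetic above can be written in a few lines of Python (the share and impression counts below are hypothetical):

```python
def virality_rate(shares: int, impressions: int) -> float:
    """Virality rate as a percentage: (shares / impressions) * 100."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return shares / impressions * 100

# A meme shared 150 times out of 10,000 impressions:
print(virality_rate(150, 10_000))  # 1.5
```

A rate of 1.5% would mean roughly one viewer in 67 passed the meme along during the measurement period.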

Memes can be created with a program like Adobe’s Photoshop or Adobe Spark. You can also use a specialized meme generator application on the Internet or your mobile phone, such as Word Swag.

Memes are designed to crystallize or fix sets of meanings that, Postman argued, cause us to react quickly and emotionally. They draw on bits of culture, such as slogans, cartoon images, and brand items that are small and easily remembered. They are packaged semiotically, with images and text juxtaposed in ways that invite us to construct more complex associations. They are usually structured enough to steer us toward some preferred meanings, yet evocative enough to draw on the reader’s string of cultural, economic, or political associations. Memes are usually vague enough to leave much for our imaginations to fill in.

Memes are much like posters in that they remove authorship. The effect can be ominous, creating an anonymous yet authoritative “voice of God.” No one “has to answer for transgressive or hateful ideas.” Memes can weaponize half-truths, lies, and inappropriate material easily and diffuse them quickly through society.





Anthony J. Pennings, Ph.D. is Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. He has also taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

Letteracy and Logos

Posted on | May 5, 2020 | No Comments

More than ever, people are interested in how visual design influences the production of meaning as well as its intellectual and emotional effects. Typography is the design and arrangement of letters and text so that the writing is easy to understand, appealing, and conveys an appropriate set of feelings and meanings. A logo is the graphic signature of a person or organization, meant to encapsulate and communicate that organization’s preferred symbolic meanings.

Below is one of my favorite TED talks about typography.

This blog post discusses the importance of typography in visual design, such as in a magazine or webpage layout, as well as in the use of logos. A new type of literacy has emerged in the new media age that Seymour Papert (1993) and others began to call “letteracy.” Papert was critical of the idea of introducing letters too early in a child’s development, but recognized that connecting with culture and history requires alphabetic literacy.

“Letteracy” suggests a larger conversation about global visual culture and why people are increasingly interested in the impact of typography in our media world. A twist on “literacy,” it points to the discrepancy between a world in which reading is pervasive and our relative ignorance of how letters are designed and how they influence us. One of the first questions to ask is, “What are letters?”

Capturing Sound

Letters are phonographic – they code the sounds of language in scripted figures. A few writing systems, like Chinese characters, are ideographic: they code ideas into their figures. Phonographic writing has the advantage of coding everyday language in its letters while being flexible enough to incorporate new words. Ideographic writing requires extensive memorization and social mentoring to enforce meanings and consistency in sound reproduction.

Asian societies like Korea, and to a lesser extent Japan, have replaced Chinese characters with phonographic characters. Korea instituted “Hangul,” which is phonographic but has some iconic aspects: the characters represent the movements of the tongue and lips used to produce the sounds. The change allowed Korea to achieve a high rate of reading literacy among its population. Japan has two sets of phonographic characters, hiragana and katakana. Both are sound-based, but each character represents a whole syllable – the vowel and the consonant together. To make the situation a bit more complicated, Japanese still uses “Kanji,” ideographic characters borrowed from China.

From Printing Press to Desktop Publishing

Johannes Gutenberg is credited with inventing both the European printing press and the production of durable metal typefaces around 1440 AD. The technology had also been developed earlier in China and Korea, but conditions in Europe were better for its expansion. Printing presses in China and Korea were state-based projects that eventually withered. Conversely, religious, market, and political conditions in Europe improved the technology’s chances of success.

The first best-seller? The Christian Bible. In 1517, the Protestant Reformation began, emphasizing individual reading of the Bible over the services of the Catholic Church and its priests. Printing also helped Europe develop separate nation-states as people became more literate in their local languages. Printed materials in different dialects began to consolidate community identities. People began to identify with others who spoke the same dialect and to recognize them as sharing the same national values. Benedict Anderson called these “imagined communities” in his book of the same name.

Thanks to Steve Jobs and the Apple Macintosh graphical user interface (GUI), different typefaces were added to personal computers. Along with WYSIWYG (“what you see is what you get”) display, the new GUI enabled desktop publishing. This democratized the printing “press.” Consequently, understanding the importance of different styles of letters became an important literacy of the digital age.

Part of this literacy is an understanding of the various meanings associated with typography. Type fonts can be designed and used with various purposes in mind. The “Power of Typography” video above explains this in more detail.





Anthony J. Pennings, Ph.D. is Professor in the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas and has taught there in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.

US INTERNET POLICY, PART 2: THE SHIFT TO BROADBAND

Posted on | March 24, 2020 | No Comments

This post is second in a series that I am producing during the COVID-19 pandemic about the importance of telecommunications policy in ensuring the widespread availability of affordable high-speed Internet access. Teaching online and working from home have gone from fringe activities to central components of life. As we move to a Smart New Deal to transform American life, how we structure our digital environments will be central. This post discusses the transition from dial-up modems in the early days of the Internet to high-speed broadband connections. With that technical transition and the FCC’s 2005 decision, the competitive environment that the Clinton-Gore administration built collapsed into the cable-telco duopoly we see today and made “net neutrality” an issue.

The Internet Service Providers (ISPs) mentioned in US Internet Policy, Part 1 facilitated the process of getting individuals, businesses, and government “online” by linking to the Internet backbone and going “retail” with dial-up modems and then high-speed broadband connections. The term “online” emerged as a way to distinguish data communications from telephony, which was highly regulated by the Communications Act of 1934. ISPs offered businesses and consumers high-speed data services for accessing the World Wide Web, hosting websites, and handling large file transfers (FTP). The key was accessing the rapidly expanding Internet packet-switching backbone network that had been developed by the National Science Foundation (NSF).

The National Science Foundation’s backbone network (NSFNET) began data transmissions at 56 kilobits per second (Kbit/s) but was upgraded to T1 lines in 1988, sending at 1.544 megabits per second (Mbit/s). It eventually consisted of some 170 smaller networks connecting research centers and universities. In 1991, the NSFNET backbone was upgraded to T3 lines sending data at 45 Mbit/s. Beginning in 1993, NSFNET was privatized, and a new backbone architecture incorporating the private sector was solicited.

The next-generation very-high-performance Backbone Network Service (vBNS) was developed as the successor to the NSFNET. vBNS began operation in April 1995 and was developed with MCI Communications, now part of Verizon. The new backbone consisted primarily of Optical Carrier (OC) lines, each of which banded several fiber-optic strands together to increase the total capacity of the line. The interconnected Optical Carrier (OCx) lines operated at 155 Mbit/s and higher. These high-speed trunk lines soon multiplied their capacities from OC-3 operating at 155 Mbit/s, to OC-12 (622 Mbit/s), OC-24 (1244 Mbit/s), and OC-48 (2488 Mbit/s). By 2005, OC-48 was surpassed by OC-192 (9953.28 Mbit/s) and 10 Gigabit Ethernet.
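The OC hierarchy scales linearly: each OC-n level runs at n times the 51.84 Mbit/s base rate of OC-1, which is where figures like 9953.28 Mbit/s for OC-192 come from. A quick Python sketch of the levels mentioned above:

```python
OC1_MBITS = 51.84  # SONET base rate (OC-1)

def oc_rate_mbits(n: int) -> float:
    """Approximate line rate of OC-n in Mbit/s."""
    return n * OC1_MBITS

for n in (3, 12, 24, 48, 192):
    print(f"OC-{n}: {oc_rate_mbits(n):.2f} Mbit/s")
```

The rounded figures in the text (155, 622, 1244, 2488 Mbit/s) fall out of the same multiplication.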

As part of the NSFNET’s decommissioning in 1995, these backbone links connected to the four national network access points (NAPs) in California, Chicago, New Jersey, and Washington D.C. The backbone expanded to multiple carriers that coordinated with ISPs to provide high-speed connections for homes and businesses.

At first, consumers used analog dial-up modems over telephone lines, with speeds increasing to 14.4 kilobits per second (Kbit/s, or just “14.4K”) by 1991 and 28.8 Kbit/s in 1994. Soon the 33.6 Kbit/s standard arrived, which many thought to be the upper limit for phone-line transmissions. But the 56K modem was soon available, and new standards continued to push the speed of data over the telephone system. The 56K modem was invented by Dr. Brent Townshend for an early music streaming service. His design avoided an analog-to-digital conversion that seriously hampered data speeds and allowed content to be switched digitally to the consumer’s terminal device, usually a PC.
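To see what those modem generations meant in practice, here is a back-of-the-envelope transfer-time calculation in Python (protocol overhead and compression are ignored, so real-world times were somewhat longer):

```python
def transfer_seconds(size_mb: float, speed_kbits: float) -> float:
    """Seconds to move size_mb megabytes at speed_kbits kilobits per second."""
    bits = size_mb * 8 * 1_000_000
    return bits / (speed_kbits * 1_000)

# A 1 MB file at each dial-up generation:
for speed in (14.4, 28.8, 33.6, 56.0):
    print(f"{speed:4.1f}K: {transfer_seconds(1, speed):.0f} s")
```

Even at 56K, a single megabyte took over two minutes, which is why the shift to broadband mattered so much for the emerging Web.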

Also, during the 1990s, the telcos were conducting tests of a new technology called ADSL (Asymmetric Digital Subscriber Line). It was initially designed to provide video over copper lines to the home. The Baby Bells, in particular, wanted to offer television services to compete with cable television. It was called asymmetric because it could send data downstream to the subscriber faster (256 Kbit/s to 9 Mbit/s) than upstream (64 Kbit/s to 1.54 Mbit/s) to the provider.

ADSL was able to use electromagnetic frequencies that telephone wires carry but do not use for voice. ADSL services separated the signals on the line into three bands of frequencies: one for telephone calls and the other two for uploading and downloading Internet traffic. Different versions and speeds emerged based on the local telco’s ability and willingness to bring an optical fiber link close to the neighborhood or “to the curb” next to a household or business location.
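As an illustration, the three-band split looks roughly like the following sketch; the exact cut-off frequencies vary by standard and vendor, and these figures are rounded from the common first-generation ADSL band plan:

```python
# Approximate ADSL frequency bands over a copper pair, in kHz:
BANDS_KHZ = {
    "voice": (0, 4),            # ordinary telephone calls
    "upstream": (26, 138),      # subscriber -> provider
    "downstream": (138, 1104),  # provider -> subscriber
}

for name, (lo, hi) in BANDS_KHZ.items():
    print(f"{name:10s} {lo:5d}-{hi:5d} kHz  ({hi - lo} kHz wide)")
```

The far wider downstream band is what makes the service asymmetric: much more spectrum is devoted to data flowing toward the subscriber.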

They were soon called Digital Subscriber Lines (DSL), and they began to replace dial-up modems. High demand and competition from cable companies with high-speed coaxial lines pressured ISPs and telcos to adopt DSL technologies. DSL and the new cable technologies that carried Internet traffic as well as television came to be collectively called “broadband” communications.

Internet traffic grew at a fantastic rate during the late 1990s as individuals and corporations rushed to “get on the web.” The rhetoric of the “new economy” circulated and fueled investments in web-based companies and telecommunications providers.

A temporary investment bubble emerged, as many companies lacked the technology or business expertise to turn a profit. Dot.coms such as Drkoop.com, eToys.com, Flooz.com, GeoCities, Go.com, Kozmo.com, Pets.com, theGlobe.com, and Webvan.com failed for a variety of reasons, but mainly because of flawed business plans and the premature expenditure of investment capital.

Similarly, many carriers such as Global Crossing, WorldCom, and ISPs overestimated web traffic and built excess capacity. In the wake of the dot.com crash in 2000 and the telecom crash in 2002, many ISPs filed for bankruptcy, including Wall Street darlings like Covad, Excite@home, NorthPoint, PSINet, Rhythms NetConnections, and Winstar Communications.

The broadband industry changed significantly after the 2000 election, its technological infrastructure devastated by the dot.com crash of 2000 and the telecom crash of 2002.

Furthermore, Internet policy changed after the Bush administration was reelected in 2004. The FCC effectively revoked Computer II in 2005 when it redefined carrier-based broadband as an information service.

This meant that broadband was effectively deregulated and the telcos could compete with the ISPs. Instead of offering backbone services and being required to interconnect with the ISPs, the telcos became ISPs themselves and were no longer required to provide independent ISPs with their connection to the Internet. The competitive environment that had nurtured Internet growth was effectively decimated.





Anthony J. Pennings, PhD is Professor of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

US Internet Policy, Part 1: The Rise of ISPs

Posted on | March 15, 2020 | No Comments

Much of the early success of the Internet in the USA can be attributed to the emergence of a unique organizational form, the Internet Service Provider or “ISP,” which became the dominant provider of Internet and broadband services in the 1990s. These organizations resulted from a unique set of regulatory directives that pushed the Internet’s development and created a competitive environment that encouraged the proliferation of ISPs and the spread of the World Wide Web.

In this series on Internet policy, I look at the rise of the World Wide Web, the shift to broadband, deregulation, and consolidation of broadband services. Later in the series, I address the issue of net neutrality and raise the question, “Is Internet service a utility? Is it an essential service that should be made universally available to all Americans at regulated prices?”

The first ISPs began as US government-funded entities that served the research and education communities of the early Internet. Secured by Al Gore in 1991, legislation signed by President George H.W. Bush created the model of the National Research and Education Network (NREN), a government-sponsored internet service provider dedicated to supporting the needs of the research and education communities within the US. Internet2, Merit, NYSERNET, OARnet, and KanRen were a few of the systems that provided schools and other non-profit organizations access to the World Wide Web. While dial-up services like CompuServe existed in the early 1980s, only later were the ISPs released for commercial traffic and services.

While telecommunications carriers had been moving some Internet traffic since the late 1980s, their role expanded dramatically after the Internet began to allow commercial activities. In June of 1992, Congressman Rick Boucher (D-Va) introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the US National Science Foundation Network (NSFNET). “A few months later, while waiting for Arkansas Governor William Jefferson Clinton to take over the Presidency, outgoing President George Bush, Sr. signed the Act into law.” The amendment allowed advertising and sales activities on the NSFNET.

As part of the National Information Infrastructure (NII) plan, the US government decommissioned the US National Science Foundation Network (NSFNET) in 1995. It had been the publicly financed backbone for most IP traffic in the US. The NII handed over interconnection to four Network Access Points (NAPs) in different parts of the country to create a bridge to the modern Internet of many private-sector competitors.

These NAPs contracted with the big commercial carriers such as Ameritech, Pacific Bell, and Sprint for new facilities to form a network-of-networks, anchored around Internet Exchange Points (IXPs). The former regional Bell companies were to be primarily wholesalers, interconnecting with ISPs. This relatively easy process of connecting routers put the “inter” in the Internet, but the interconnection points also became sites of performance degradation and unequal power relations.

As the Internet took off in the late 1990s, thousands of new ISPs set up business to commercialize the Internet. The major markets for ISPs were: 1) access services, 2) wholesale IP services, and 3) value-added services offered to individuals and corporations. Access services were provided for both individual and corporate accounts and involved connecting them to the Internet via dial-up, ISDN, T-1, frame-relay, or other network connections. Wholesale IP services were primarily offered by facilities-based providers like MCI, Sprint, and WorldCom’s UUNET (a spinoff of a DoD-funded seismic research facility) and involved providing leased capacity over their backbone networks. Value-added services included web hosting, e-commerce, and network-resident security services. By the end of 1997, over 4,900 ISPs existed in North America, although most of them had fewer than 3,000 subscribers.[2]

FCC policy had allowed unlimited local phone calling for enhanced computer services, and early Internet users connected to their local ISP using modems over POTS (Plain Old Telephone Service). ISPs quickly developed software, distributed on CD-ROMs, that could be easily installed on a personal computer. The software usually put an icon on the desktop that, when clicked, would dial the ISP automatically, provide the password, and connect the user to the Internet. A company called Netscape created a popular “browser” that allowed text and images to be displayed on the screen. The browser used what was called the World Wide Web, a system for accessing files quickly from computer servers all over the globe.

The ISPs emerged as an important component of the Internet’s accessibility and were greatly aided by US government policy. The distinctions made in the FCC’s Second Computer Inquiry in 1981 allowed ISPs to bypass many of the regulatory roadblocks experienced by traditional communication carriers. Telcos were to provide regulated basic services, while “enhanced services” were to stay unregulated. Schiller explained:

    Under federal regulation, U.S. ISPs had been classed as providers of enhanced service. This designation conferred on ISPs a characteristically privileged status within the liberalized zone of network development. It exempted them from the interconnection, or access, charges levied on other systems that tie in with local telephone networks; it also meant that ISPs did not have to pay into the government’s universal service fund, which provided subsidies to support telephone access in low-income and rural areas. As a result of this sustained federal policy, ISPs enjoyed a substantial cross-subsidy, which was borne by ordinary voice users of the local telecommunications network.[3]

ISPs looked to equip themselves for potential new markets and also to connect with other companies. For example, IBM and telecom provider Qwest teamed up to offer web hosting services. PSINet bought Metamor not only to transfer data but to host, design, and move companies from old software environments to new ones. ISPs increasingly saw themselves not only as providers of a transparent data pipe but also as providers of value-added services such as web hosting, colocation, and support for domain name registration.

The next part of this series will discuss the shift to higher-speed broadband capabilities. Later posts will address the consolidation of the industry that began in 2005, when the FCC changed the regulatory regime for wireline broadband services.

Notes

[1] Hundt, R. (2000) You Say You Want a Revolution? A Story of Information Age Politics. Yale University Press. p. 25.
[2] McCarthy, B. (1999) “Introduction to the Directory of Internet Service Providers,” Boardwatch Magazine’s Directory of Internet Service Providers. Winter 1998-Spring 1999. p. 4.
[3] Schiller, D. (1999) Digital Capitalism. The MIT Press. p. 31.




Anthony J. Pennings, PhD is Professor and Undergraduate Director at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from http://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

    Follow apennings on Twitter

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.