The International Politics of Domain Name Governance, Part Two: ICANN and the Clinton-Gore Administration
Posted on | November 12, 2020 | No Comments
This post is the second in a series about the global politics of domain name registration and management. Domain names are critical identifiers of web resources that facilitate easy access for users. Part One about Jon Postel discussed the heroic but ad hoc process of managing addresses in the earliest days of the Internet.
This post describes the Clinton-Gore administration’s endeavors to internationalize the Internet through the Global Information Infrastructure (GII) and the WTO while maintaining control of the domain name system (DNS) through the Internet Corporation for Assigned Names and Numbers (ICANN). This stance came in the face of calls for an international organization, such as the International Telecommunication Union (ITU), to manage the Internet instead. By 1998, ICANN had agreed to coordinate the DNS and oversee the worldwide network of database servers that keeps track of IP addresses and helps users make quick connections to requested sites.
As the World Wide Web (WWW) emerged with the invention of the hypertext transfer protocol (HTTP), management of the domain name system became crucial for e-commerce and other uses of the Internet. It also became a controversial issue in international politics. The Clinton-Gore administration saw the Internet as a major opportunity for the US, but also as a historically tricky infrastructure to manage, with complications involving other countries.
The monopoly for domain name registration was operated through InterNIC (the Internet Network Information Center) by Network Solutions Inc., which in 1995 became a subsidiary of Science Applications International Corporation (SAIC), a private company heavily engaged in activities for the Pentagon and the National Security Agency (NSA). Led by a board that included ex-NSA, CIA, and DoD officials, the company made money from issuing customized Internet addresses.
These domain names became very valuable as the WWW and its “dot-com” economy started to expand rapidly in the mid-1990s. The commercialization of the NSFNet in 1992 and the introduction of the Mosaic browser in 1993 spread the hope of a “new economy.” In 1995, the highly successful Netscape IPO, based on another popular browser, unleashed new investment in high technology and Internet stocks.
The Clinton-Gore administration became particularly aggressive in creating the Internet’s policy framework for domestic and international expansion and commerce. Branded initially as the National Information Infrastructure (NII) in 1993 and later as the Global Information Infrastructure (GII) in 1994, Vice-President Gore’s new vernacular allowed for a government interventionist approach. The GII was a conceptual framework to challenge telecommunications companies worldwide to pave the way for data communications and all the related services promised by ISDN.
At home, they pushed an enabling framework for the NII that encouraged private investment, promoted and protected competition, and provided open access to the Internet for consumers and service providers. This approach also emphasized advancing universal service to avoid a digital divide – a society of information “haves” and “have-nots.”
Internationally, their work to set up the World Trade Organization (WTO) facilitated the modernization of telecom networks worldwide and broke down tariff barriers to global information technology. In his 1994 speeches to the ITU and the GATT (General Agreement on Tariffs and Trade), Gore set up the conditions for the World Wide Web we know today. Traveling to Marrakesh, Morocco, for the closing meeting of the Uruguay Round of GATT negotiations, he called for the creation of the WTO.
An international trade organization was one of the original objectives of the New Deal’s Bretton Woods agreements at the end of World War II but never received US Congressional approval. On November 29, 1994, however, a bipartisan vote in the House of Representatives moved the bill to the Senate, which approved the WTO 76-24 on December 1. The WTO would quickly conclude two historic agreements that liberalized global trade in information technology (1996) and basic telecommunications services (1997).
In 1996, Ira Magaziner set up an interagency group to study domain names as part of his responsibility for international trade in the Clinton-Gore administration. Magaziner’s position paper, “A Framework for Global Electronic Commerce,” was announced by President Clinton and Vice President Al Gore at a public event on July 1, 1997, and became the basis for the administration’s policy of managed liberalization for e-commerce and the management of the domain name system.
The Clinton-Gore administration wanted to hold off efforts by the United Nations and its International Telecommunication Union (ITU) to manage the Internet. They valued the international organizations but felt the Internet required a more dynamic organizational structure to facilitate its complex growth. Other nations, particularly the BRICs (Brazil, Russia, India, China), questioned the efficacy of US management of the World Wide Web. The US stood its ground, however, and staked its claim for control over the Internet.
Magaziner reflected on the problems facing a growing Internet at the time. Governments wanted to tax transmission bits, place tariffs on electronic commerce, and censor the Internet. Debates on digital signatures, regulating prices, and intellectual property (IP) issues such as domain name trademarks were also coming to the fore.
“For this potential to be realized fully,” the draft report stated, “governments must adopt a nonregulatory, market-oriented approach to electronic commerce, one that facilitates the emergence of a transparent and predictable legal environment to support global business and commerce. Official decision makers must respect the unique nature of the medium and recognize that widespread competition and increased consumer choice should be the defining features of the new digital marketplace.”
The ITU had been an essential “club” for the world’s telecommunications agencies to coordinate technical standards for telephony and electromagnetic spectrum allocations. But because the ITU was a one-country, one-vote organization, the U.S. was vulnerable to its decisions. And that meant its businesses were vulnerable too. An International Ad Hoc Committee (IAHC) was established to address perceived problems with the existing method of registering generic top-level domains (gTLDs) on the Internet, and on May 1, 1997, eighty organizations supported its Memorandum of Understanding (MoU) addressing the way gTLDs were allocated and managed.
On July 1, 1997, President Clinton directed the Commerce Department to privatize domain name management, and in September 1997, Network Solutions (NASDAQ: NSOL) became a public company with an initial public offering (IPO). In the first five months of 1998, Network Solutions Inc. (NSI) registered more than 340,000 domain names, an increase of 73 percent over the same period in 1997.
But the company was overwhelmed by the extraordinary growth of the Internet. Its registration and billing systems could not keep up with the volume of domain name requests. NSI was losing its near-monopoly over the domain name business, and the company began preparing for a new competitive environment. Still at issue was whether Internet oversight would eventually move from U.S. control to an international body.
In late 1998, the Clinton-Gore administration introduced a new domain name framework to encourage competition and effectively manage the DNS. The U.S. Department of Commerce took ownership of the process. Ira Magaziner and others helped design a new organization, the Internet Corporation for Assigned Names and Numbers (ICANN), created as a not-for-profit company to administer the Internet name and address system and help set its policy from the bottom up.
ICANN received preliminary approval from the Commerce Department to manage the Internet domain name system (DNS) in November 1998. The two organizations signed a Memorandum of Understanding (MOU) that provided for the gradual privatization of DNS management. This involved deploying a network of database servers worldwide to keep track of IP addresses and facilitate the quick connection of domain names to requested sites. A dispute resolution system was also set up to resolve issues regarding the ownership of a domain name.
In the next post, I explore ICANN’s transition to management by a global multistakeholder community and the end of DNS oversight by the Commerce Department’s National Telecommunications and Information Administration (NTIA).
Citation APA (7th Edition)
Pennings, A.J. (2020, Nov 12). The International Politics of Domain Name Governance, Part Two: ICANN and the Clinton-Gore Administration. apennings.com https://apennings.com/global-e-commerce/the-international-politics-of-domain-name-governance-part-two-icann-and-the-clinton-gore-administration/
Notes
[1] Drezner, D. (2004). The Global Governance of the Internet: Bringing the State Back In. Political Science Quarterly, 119(3), 477-498. doi:10.2307/20202392
[2] Drezner, D. W. (2008). All Politics Is Global: Explaining International Regulatory Regimes. Princeton, NJ: Princeton University Press. See the chapter “Global Governance of the Internet.” http://press.princeton.edu/titles/8422.html
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Clinton-Gore Administration > DNS > Domain Name System > ICANN > Ira Magaziner > Jon Postel > WTO agreement on basic telecommunication services
Russia, the Fall of the USSR, and the Era of Pan-Capitalism
Posted on | October 9, 2020 | No Comments
The fall of the Berlin Wall and the disintegration of the Communist Union of Soviet Socialist Republics (USSR) bloc meant the world was no longer significantly divided by Cold War antagonisms. Communist China had already embraced market dynamics and global trade. This post briefly discusses the breakup of the USSR and the globalization of carbon-based digital capitalism leading to a new Russia headed by Vladimir Putin and the oligarchs.[1]
Although it led a Communist bloc, the USSR had become deeply indebted to the global banking system, exerting additional pressure on a system already addicted to military spending. Especially after oil was discovered in the Samotlor field in 1969, the USSR wanted to take advantage of the 1970s oil crises and invest in exploiting the vast gas and oil reserves of Siberia and even the Arctic. Imports of oil and gas equipment increased eightyfold in value from 1970 to 1983.
But it needed US dollars and other strong currencies to invest in its petro industries and procure the needed expertise and technology.[2]
The term “Eurodollar” reportedly gets its name from the cable address – “Eurobank” – of a Soviet-controlled bank in France that handled dollars the Soviets wanted kept out of US reach after the Chinese Communist Revolution in 1949. The Soviets placed their dollars in the Paris-based Banque Commerciale pour l’Europe du Nord and the Moscow Narodny Bank in London, and as these deposits were traded by other European banks, the “Eurodollar” name stuck.
As mentioned in my earlier work on digital monetarism, Eurodollars were the prime credit vehicle for recycling OPEC’s petrodollars worldwide during the 1970s. Through syndicated lending, banks lent Eurodollars excessively to many nation-states, including those in the Soviet bloc.
The result of petrodollar recycling was the “Third World Debt Crisis” that created havoc throughout the 1970s and into the 1980s. Debt put pressure on public resources, which were often transformed into state-owned enterprises (SOEs) and eventually sold off to investors. Excessive debt led to an era of imposed deregulation, liberalization, and privatization. Although painful, it opened up the telecommunications world to the Internet and its World Wide Web applications.
The “official” start of the world debt crisis can be traced to the March 1981 announcement by Poland that it could not meet its payment commitments. Previously, banks had found it easy to reschedule old debt and lend borrowers new Eurodollars. An “umbrella theory” circulated, which held that the Soviet Union would guarantee the debts of the countries in its sphere. But the USSR was having economic problems that went unnoticed by the banks. The bloc still retained a high credit rating, and the banks continued to pour money into it.
By 1984, the Communist bloc was accumulating significant debt and the economy was faltering, largely due to the drop in oil prices. Defecting spies were reporting that the USSR was a mess. Workers were unmotivated because the store shelves were empty. Lines to purchase scarce goods were everywhere. Nearly half the Soviet economy was devoted to military spending, and the other half to producing shoddy and scarce consumer goods, determined and designed by Communist committees.
With Reagan’s “Star Wars” missile defense initiative, Communist Party General Secretary Mikhail Gorbachev knew that the USSR could not keep up with the capitalist world’s innovation and spending. He pleaded with Ronald Reagan at the Reykjavík Summit of 1986 to give up the militarization of space and instead work to reduce nuclear weapons. But Reagan refused. So Gorbachev began a public relations campaign to encourage more debate about the USSR and its political and economic future.
In June of 1989, Gorbachev made the call to Poland to tell its Communist party leaders to accept the results of their democratic election. The decision accelerated the processes of glasnost and perestroika throughout the Soviet system. The first term meant political openness – media freedom, democratization of power, and the release of political prisoners. The second meant gradually allowing entrepreneurial activities, reforming state-owned industries, and privatizing government assets.
Deng Xiaoping had already started to reform Communism in China during the 1980s with his “socialism with Chinese characteristics.” Deng and other post-Mao Communist leaders argued that “China had mistakenly entered into Communism during its feudal stage instead of waiting until advanced capitalism, as Marx had theorized. Private ownership and a market economy were suddenly embraced as solutions, not problems. This allowed the Chinese Communist Party to legitimize both its turn to market capitalism and the continuance of its political control over the country through Marxist ideology.”[3]
The fall of the Berlin Wall in 1989 began the process of dismantling the Warsaw Pact and, with it, the Soviet bloc. Started in 1955, the Warsaw Pact was initially a defense treaty among Albania, East Germany, Poland, Hungary, Romania, Bulgaria, Czechoslovakia, and the Soviet Union. But after East Germany left, the other countries clamored to leave as well.
Czechoslovakia broke from the Soviet sphere, while the Baltic states (Estonia, Lithuania, and Latvia) declared their independence from the USSR, along with the Republic of Belarus and Ukraine. Some joined Azerbaijan, Kazakhstan, Kyrgyzstan, Moldova, Turkmenistan, Tajikistan, and Uzbekistan to create the Commonwealth of Independent States (CIS).
President George H. W. Bush met with Gorbachev in early December 1989, just a month after Europe’s “9/11” – the November 9 dissolving of the barriers between East and West Germany. Meeting in Malta, they resumed START negotiations on nuclear arms control and came to an agreement on how conventional forces would be dismantled in Europe. Gorbachev’s decision to allow a multi-party system and presidential elections in Russia also began to destabilize Communist control and contributed to the collapse of the Soviet Union.
A coup was engineered by Communist hardliners in August of 1991, and although it failed, Gorbachev resigned by Christmas – but not before dissolving the Central Committee of the Communist Party of the Soviet Union and resigning as its General Secretary. Also in 1991, the Soviet military relinquished control over the militaries of the other republics. Russia agreed to take on the debt held by the USSR, in excess of US$70 billion.
By promising glasnost and perestroika, Gorbachev changed the political dynamic of a dying system. It was the promise of a political and legal infrastructure for a democratic and market-based political economy integrated into the world system. The process accelerated in Russia with the election of Boris Yeltsin in 1991 as the first President of the new country. Yegor Gaidar, an economist known for pushing free markets, became acting Prime Minister. Yeltsin worked with a group of opportunistic Russians to outmaneuver the Communist directors of the USSR economy and take control of major industries, many going on to become billionaires – the so-called “oligarchs.”
In the first few months of 1992, the new government freed prices and legalized entrepreneurial activity in a process called “shock therapy” – rapid liberalization of the economy. By 1995, Yeltsin was working with Russian banks to raise cash to help privatize major companies. The Russian “loans for shares” program lent the government money in exchange for temporary shares in state-owned companies. When the government defaulted on the loans, the banks auctioned off major stakes in companies involved in aluminum, oil, nickel, and other important resources, as well as food production, telecommunications, and media.
The end of the Warsaw Pact signaled a new liberalization by Moscow and the former satellite countries of the USSR. However, the optimism was displaced by economic “shock therapy” – severe austerity and privatization that crippled economic recovery. The resultant chaos led to the return of the Russian “strong man.” Vladimir Putin became President of Russia at the end of 1999.
Notes
[1] I use the term digital capitalism here as many parameters operate to shape types of capitalism.
[2] The USSR went heavily into debt to buy equipment for its nascent oil industry.
[3] Pennings, A. (2014, April 22). E-Commerce with Chinese Characteristics. Retrieved from https://apennings.com/global-e-commerce/china-e-commerce-ipo/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Boris Yeltsin > Deng Xiaoping > eurodollars > shock therapy > state-owned enterprises (SOEs) > USSR > Warsaw Pact > Yegor Gaidar
The International Politics of Domain Name Governance, Part One: The Czar
Posted on | October 1, 2020 | No Comments
An ongoing global concern deals with the control and management of the World Wide Web’s Domain Name System (DNS). The basic issue is whether the address you put into your browser connects to the right computer and retrieves the right information. When you type in apennings.com, for example, how does it get to my blog, and how can you be sure you are getting the right site? What if you typed in google.com and it directed you to yahoo.com? Or if you typed in Amazon.com and it directed you to a site for Barnes and Noble or some other bookseller? These scenarios are possible if the domain name system is not managed correctly.
The Domain Name System (DNS) is a distributed directory service that matches website addresses to the numerical IP addresses of the right computers. As the Internet has grown exponentially and globally, governance and management issues continue to be complicated and contentious. What is at stake? In this post, I look at the beginnings of the DNS and the influence of Jon Postel.
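As a concrete illustration of that matching function, here is a minimal Python sketch that resolves a hostname through your system’s DNS resolver. The hostname is just an example; the addresses returned will depend on the domain’s current DNS records.

# A minimal sketch of DNS resolution in practice, using Python's
# standard library (which delegates to the system's resolver).
import socket

def resolve(hostname: str) -> list[str]:
    """Return the IP addresses that DNS currently maps a hostname to."""
    results = socket.getaddrinfo(hostname, None)
    # Each result is (family, type, proto, canonname, sockaddr);
    # the first element of sockaddr is the IP address string.
    return sorted({info[4][0] for info in results})

# If DNS is managed correctly, these are the addresses your browser
# would connect to when you type in the name.
print(resolve("apennings.com"))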
It was recognized early on that managing Internet addresses would be a global concern. Internet traffic was increasing domestically and across borders. Decentralization provided the technical and operational strategy to globalize the Internet and its World Wide Web (WWW). It would provide quicker responses and decrease network traffic congestion. Maintenance issues, including redundancy and backing up systems, were easier to manage. Globalization of the Internet, however, raised other issues.
Daniel Drezner identified three reasons to be concerned about the governance of the Internet. The first was that an “actor” such as a government, corporation, or NGO could take over the Internet; any actor that could benefit from controlling the connections between users and the sites they want to visit would certainly warrant scrutiny. Second, it was important for a legal system to be created to ensure that trademarked names were not captured and monopolized by “cybersquatters,” who could withhold or exploit important trademarked names such as “mcdonalds.com” or “toyota.com.” Third, a lot of money was at stake in the creation of domain names. Little cost is involved in their production; providing domain names is like printing money in some respects.[1]
So when you type in the address of the website you want to access, DNS makes sure you make the connection and find the right file. ARPANET, the original Internet that came to life in September 1969, first addressed the issue in the early 1970s. Jon Postel of the University of Southern California (USC) in Los Angeles took up the challenge and was eventually nicknamed “God” because of his power over the early Internet’s addressing system. Postel began by writing addresses on scraps of paper and continued managing them until a global network was established.
Postel’s influence ranged from the network’s inception to his death in 1998. On March 26, 1972, Postel started collating a catalog of numerical addresses like 123.47.17.49, asking network administrators to submit information on socket numbers and network service activities at each host computer. Published as RFC 433 in December 1972, the effort proposed a registry of port number assignments to network services, and Postel called himself the “czar” of socket numbers as he pledged to keep a list of all addresses. He also worked with the Stanford Research Institute (now SRI International) to develop a simple text file called HOSTS.TXT that tracked hostnames and their numerical addresses; SRI would distribute the list to all Internet hosts.
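To see why this flat-file approach eventually hit its limits, here is a toy Python model of a HOSTS.TXT-style lookup: one centrally distributed table mapping every hostname to a numerical address. The entries and the format below are simplified inventions for illustration.

# A toy model of the pre-DNS HOSTS.TXT approach: a single flat table,
# distributed to every host, mapping hostnames to numerical addresses.
# The entries below are hypothetical and the format is simplified.
HOSTS_TXT = """
HOST : 10.0.0.73 : USC-ISI :
HOST : 10.2.0.11 : SRI-NIC :
"""

def parse_hosts(text: str) -> dict[str, str]:
    """Build a name -> address table from a HOSTS.TXT-style file."""
    table = {}
    for line in text.strip().splitlines():
        fields = [f.strip() for f in line.split(":")]
        if fields and fields[0] == "HOST":
            address, name = fields[1], fields[2]
            table[name] = address
    return table

hosts = parse_hosts(HOSTS_TXT)
print(hosts["SRI-NIC"])  # -> 10.2.0.11
# The obvious flaw: every host needed a fresh copy of the whole file,
# which is why the hierarchical DNS eventually replaced it.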
The Domain Name System (DNS) was primarily designed by Paul Mockapetris of the Information Sciences Institute at USC and was adopted by ARPANET in 1984. The ARPA DNS originally consisted of six top-level domain (TLD) types: .com (commercial), .edu (education), .gov (government), .mil (military), .net (network provider), and .org (organization). The designation of domain names below them, like hawaii.edu or ecommerce.gov, was left to the discretion of the administrators of the various networks. As the Internet expanded globally, two-letter suffixes such as .nl for the Netherlands, .nz for New Zealand, and .kr for South Korea were allotted to individual countries. The first domain name registered through the DNS was reportedly symbolics.com, on March 15, 1985.
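That hierarchy can be read straight off a name. Here is a minimal sketch, using the example domains above plus the reserved example name example.nl, of how a domain decomposes from the top-level domain down:

# A minimal sketch of how a domain name decomposes hierarchically,
# read right to left from the top-level domain (TLD) downward.
ORIGINAL_TLDS = {"com", "edu", "gov", "mil", "net", "org"}

def decompose(domain: str) -> dict[str, str]:
    """Classify a domain's TLD and pull out its second-level label."""
    labels = domain.lower().split(".")
    tld = labels[-1]
    if tld in ORIGINAL_TLDS:
        kind = "original ARPA gTLD"
    elif len(tld) == 2:
        kind = "two-letter country code (ccTLD)"
    else:
        kind = "later gTLD"
    return {"domain": domain, "tld": tld, "type": kind,
            "second_level": labels[-2] if len(labels) > 1 else ""}

for name in ("hawaii.edu", "ecommerce.gov", "symbolics.com", "example.nl"):
    print(decompose(name))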
In 1988, the U.S. government gave the DNS contract to USC’s Information Sciences Institute (ISI), giving Mockapetris and Postel the opportunity to continue working together and with SRI International. They carried on the functions of address management in what became known as the Internet Assigned Numbers Authority (IANA), which continues to this day. IANA was funded by the U.S. government under a contract with the Defense Advanced Research Projects Agency (DARPA). At this time, the Internet started to expand rapidly in the U.S. and abroad.
The next post on DNS will discuss how the Clinton administration played a significant role in shaping the development and governance of the Domain Name System (DNS) during the 1990s.
Notes
[1] Drezner, D. (2004). The Global Governance of the Internet: Bringing the State Back In. Political Science Quarterly, 119(3), 477-498. doi:10.2307/20202392
Citation APA (7th Edition)
Pennings, A.J. (2020, Oct 1). The International Politics of Domain Name Governance, Part One: The Czar. apennings.com https://apennings.com/global-e-commerce/the-international-politics-of-domain-name-governance-part-one-the-czar/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. He teaches digital economics, broadband policy, networked robotics, as well as ICT and Sustainable Development. Before joining SUNY Korea, he spent most of his career on the faculty of New York University. Previously, he also taught at Marist College in New York, and Victoria University in New Zealand.
Tags: DNS > Domain Name System > Paul Mockapetris > Postel > Top Level Domain (TLDs)
Digital Disruption in the Film Industry – Gains and Losses – Part 2: Non-Linear Editing
Posted on | August 26, 2020 | No Comments
The Matrix (1999) won four Oscars, edited on the Avid, a non-linear video editing computer system that launched a revolution in the production of audio-visual materials. A few years before, James Cameron had edited much of Titanic (1997) himself with an Avid application on a computer at his house.
This post returns to the theme of digital disruption in the film industry. In Part 1, digital cameras were discussed. Below, I examine the transformation of post-production practices with the advent of non-linear editing (NLE) and digitally enabled special effects (F/X), paying particular attention to the introduction of the Avid NLE. These technologies came about when computer microprocessing power was sufficiently miniaturized and software applications became efficient enough for immediate interaction.
To get a quick overview of the digital transformation in the film industry, this summary of Keanu Reeves’ documentary Side by Side (2012) is useful:
NLE clearly came about because customers were under-served by traditional film/video editing technology. While chemically-coated film had been used for over a hundred years, it required a massive industry to operate and thrive. Cameras and their film reels were heavy and difficult to transport, and silver halide film was dangerous and complicated to develop.
Cameras would capture the action on film, but the footage could not be viewed until after it was processed. The director and his team would gather after the shooting to view the “dailies” and decide which takes were good and what adjustments to lighting or camera style were needed.
Editing was a manual chore requiring sifting through scores of film reels to find the right clips. It was a laborious activity that often left editors with bloody fingers. The final project required literally cutting the film and taping the pieces together.
Finally, only a few techniques existed to add special effects to the film in post-production.
Early NLE technology was cumbersome and weak, but it progressed rapidly, in line with Clayton Christensen’s theory of disruptive innovation. While early developments in digital video were crude and offered few aesthetic or production efficiencies to challenge film, innovations accumulated rapidly as the PC integrated the fruits of Moore’s Law, the rapid increase of transistors on a microprocessor chip. Eventually, the sophistication of digital technologies presented a major threat to the film industry.[1]
While the digital non-linear editing process was conceptualized in the early 1970s, when CBS and Memorex collaborated on the CMX 600, the technology was not practical enough for commercial use. It used washing machine-sized storage disks and cost a quarter of a million dollars to record and edit low-resolution black-and-white images.[2]
The next stage in digital editing stemmed from George Lucas’s special effects empire with the invention of EditDroid, a computer that used footage stored on LaserDiscs. EditDroid was sold by a George Lucas spin-off company, DroidWorks. The technology did not work very well, and the company shut down in 1987.
The major disruption came with the development of the Avid NLE system. Bill Warner, the founder of Avid Technology, Inc., had been injured in an accident that left him unable to walk. In 1984, he sought to buy digital editing equipment but was surprised that the technology didn’t really exist on the market.
Warner spent the next few years working to create a non-linear editing system. In 1987, he invented the digital editor and formed the Avid company. At first, the digital editing took place on the Apple Macintosh, which would then render the edits to tape offline by controlling a stack of regular video editing machines.
Warner recalled: “I started the company in September of 1987. I got going and then in November of 1987 boom, the stock market fell apart, Black Monday, 500 point fall in the Dow and people said to me, they said, they said oh, now you’ll never raise money. And I said I just need one — I am just one person who needs money. I need $500,000. It’s still out there. I’ll get it.” He took his Avid technology to the National Association of Broadcasters (NAB) annual trade show and found the response overwhelming.
A colleague of mine, Patricia Amaral, was at the 1988 NAB conference and was intrigued by the Avid. As head of the Media Lab at the University of Hawaii, she worked with Professors Stan Harms and Dan Wedemeyer to get the Avid NLE. In 1989, the University of Hawaii became the first educational institution to purchase an Avid.
I was intrigued when I first saw the Avid non-linear digital editing system as a PhD student at the University of Hawaii in 1989. It was a clunky system, but it connected to a new Apple Mac that brought up thumbnails of all the clips that had been digitized. Using a mouse and keyboard, the clips could be edited and assembled and transformations added, but video assemblage took a while because the editing technology still required the actual rendering to take place on traditional 3/4-inch tape drives. At the end of the day, the editors would start that laborious task, and in the course of a few hours, they would finish their edits.
How does it fit the requirements of Christensen’s disruptive innovation model? For Christensen, almost all disruptive innovations happen when a new entrant enters a market with simple technology that eventually improves enough to seriously disrupt it. He distinguishes disruptive innovation from sustaining innovation, in which a company continues to improve an established product. While film editing made continuous improvements over the course of its history, NLE made transformative changes over a short period of time. The results were so weak in the beginning that film advocates quickly dismissed the technology, only to be surprised by how quickly it became competitive.[3]
Now, several NLEs compete with each other in the cinema, consumer, and television industries. Adobe Premiere Pro, Avid Media Composer, DaVinci Resolve, and Final Cut Pro X are the current leaders in NLE technology. The editors below discuss the current state of NLE editing.
In an upcoming post, I will discuss digital’s influence on special effects.
Notes
[1] Christensen, C. M., Raynor, M. E., & McDonald, R. “What Is Disruptive Innovation?” Harvard Business Review, December 2015. hbr.org/2015/12/what-is-disruptive-innovation
[2] Oldham, G. First Cut 2: More Conversations with Film Editors, p. 4.
[3] Raynor, Michael E.; Christensen, Clayton M. (2003-10-09). The Innovator’s Solution: Creating and Sustaining Successful Growth. (p. 18). Perseus Books Group. Kindle Edition.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at St. Edwards University in Austin, Texas, where he maintains his US address. From 2002-2012 was on the faculty of New York University and held his first academic position at Victoria University in New Zealand. He has also spent ten years at the East-West Center in Honolulu, Hawaii.
Tags: Avid > Clay Christensen > DroidWorks > EditDroid > NLE > Patricia Amaral > Side by Side movie > University of Hawaii Media Lab
Oliver Stone’s Platoon: Jesus Christ Superstar vs. the Marlboro Man
Posted on | August 12, 2020 | No Comments
In Oliver Stone’s award-winning film Platoon (1986), Charlie Sheen plays Chris Taylor, a “coming of age” infantry soldier trying to reconcile his identity between the influences of two sergeants in his US Army platoon. The setting is the Vietnam War circa late 1967. The sergeants, played by Willem Dafoe and Tom Berenger, were directed to represent two mythical poles of ideology and belief that have come to heavily influence American political culture. I refer to these two men and the contrasting themes they represent as “Jesus Christ Superstar” vs. “the Marlboro Man.”
Platoon (1986) won Best Picture at the 1987 Academy Awards and received additional awards for Best Film Editing and Best Sound, along with a nomination for Best Cinematography. Oliver Stone won an “Oscar” for directing and was nominated for writing. Both sergeants were nominated for Best Actor in a Supporting Role, which brings me back to the two conflicting myths.
With Sgt. Elias (Willem Dafoe) representing “Jesus Christ Superstar” and Sgt. Barnes (Tom Berenger) “the Marlboro Man,” the movie condenses a series of meanings into two contending perspectives. These viewpoints divided America and haunt its view of the war to this day. Barnes characterizes the tension succinctly at one point: “there’s the way it ought to be, and there’s the way it is.” Barnes, who was shot seven times, including in the face, has the physical scars to represent “the way it is.”
Jesus Christ Superstar was a rock opera released as an album in 1970, staged on Broadway in 1971, and adapted as a movie in 1973. The film was shot in Israel and other Middle Eastern locations and was the eighth highest-grossing film of that year. It reconciled different gospels of the Bible and focused on the relationship between Jesus, Judas, and Mary Magdalene, emphasizing betrayal and picking one’s battles. It was in some ways an anthem of the time, as its rock and roll music and energy resonated with the “hippie” counterculture that emerged during the height of the Vietnam War. Jesus of Nazareth, with his long hair, certainly looked the part. Stone “paints” Elias and several other soldiers with iconography from the era. Peace symbols, headbands, drugs, and Sixties music like Jefferson Airplane’s “White Rabbit” are used to represent this counterculture.
It’s hard to portray Elias as a Jesus-like pacifist when he volunteered for three tours in the “Nam” and was a skilled and experienced soldier. But from the first scene, we see him carrying a machine gun on his shoulders like a cross and climbing up a mountainside like Jesus ascending Calvary. As Sergeant O’Neil from another squad says about Elias after an argument, “Guy’s in three years and he thinks he is Jesus fuckin’ Christ or something.”
Elias is portrayed as the more sensitive leader. We next encounter him helping Chris and offering to carry much of the load from his amateurishly stuffed backpack. Most importantly, he is the voice of restraint when the platoon is searching a Vietnamese village for guns and ammunition. When Sgt. Barnes shoots a Vietnamese village woman during an interrogation, Elias confronts him and initiates a fistfight.
This scene creates a tension between the two, as Barnes faces a potential court-martial for the murder. The conflict eventually ends with Barnes shooting Elias during a battle with the Viet Cong. The shots don’t kill him, though, and as the platoon is being evacuated by helicopters, he is sighted from the air being chased by Vietnamese troops. He is shot several times in the back but struggles to continue. Finally, as he falls to his knees, writhing in pain, a medium shot shows him with his arms outstretched and his gaze towards the heavens, as if he were being crucified.
The Marlboro Man was another iconic figure of the Vietnam era, the masculine symbol of the famous cigarette brand. Invented to subvert the early impression that Marlboro cigarettes were for women, the figure became the icon of rugged individualism and aggressive patriarchy. The first scene of Barnes shows him in the jungle with a pack of Marlboro cigarettes strapped to his helmet.
Barnes was clearly the leader of the platoon, as even the lieutenant deferred to his judgment. His first words in the movie were “Get a move on, boy” to Chris, in his Southern accent. He is regularly portrayed as the tough but competent, no-nonsense leader. At one point, while criticizing the pot smokers for what he calls their attempt to “escape reality,” he says, “I am reality.”
Oliver Stone served in Vietnam and was awarded the Bronze Star medal. The story was based roughly on his experience there. In Stone’s interview with Joe Rogan, he speaks to his respect for both sergeants. While Stone clearly favors Elias, his portrayal of Barnes is surprisingly sympathetic, and we see how both men influence Chris.
Chris arrives in Vietnam as a “cherry,” a virgin to the war experience. But after he recovers from being shot during their first ambush, he befriends a black man named King and a “surfer dude” from California named Crawford. They are all assigned to cleaning the latrines and the scene allows Chris to tell his story of why he quit college and enlisted in the Army. “I wasn’t learning anything. I figured why should just the poor kids go off to war and the rich kids always get away with it?” The others laugh off his naivety but invite him to the Feel Good Cave, a bunker where they “party” by playing music and smoking pot.
King introduces Taylor as the resurrected “Chris” to the “heads,” including those soldiers played by Johnny Depp and Forest Whitaker. Elias is there smoking pot as well and welcomes Chris with a “hit” of marijuana blown through the barrel of a rifle. You can hear Grace Slick singing “feed your head” as Chris says he feels good and can’t feel the pain from his injury. Elias responds, “feeling good is good enough.”
Tom Berenger is masterful in his performance as Sgt. Barnes. While Elias is “partying” with the “stoners,” Barnes is listening to country music and playing cards while drinking whiskey and beer with his group. Later, after Elias is dead, Barnes goes to the Feel Good bunker to confront Elias’ friends in the platoon. With a Jack Daniel’s Tennessee whiskey bottle in hand, he goes on to criticize the recently departed Elias.
Elias was full of shit. Elias was a Crusader. Now, I got no fight with any man who does what he’s told, but when he don’t, the machine breaks down. And when the machine breaks down, we break down. And I ain’t gonna allow that in any of you. Not one.
The scene ends with Chris attacking Barnes, who quickly subdues the young soldier. Barnes is convinced not to kill Chris, as he would face ten years in military prison for killing an enlisted man.
In a later battle, the platoon is overrun by Viet Cong, and an airstrike is called in to bomb the US camp. Barnes takes advantage of the chaos to try to kill Chris but is knocked out by the concussion of the bombing. Chris and Barnes barely survive. When Barnes asks Chris to get a medic, Chris shoots him in retaliation for Elias’ death.
As Chris is airlifted from the battleground, his voice-over narrates an inner conflict:
I think now, looking back, we did not fight the enemy; we fought ourselves. And the enemy was in us. The war is over for me now, but it will always be there, the rest of my days as I’m sure Elias will be, fighting with Barnes for what Rhah called possession of my soul. There are times since, I’ve felt like the child born of those two fathers. But, be that as it may, those of us who did make it have an obligation to build again, to teach to others what we know, and to try with what’s left of our lives to find a goodness and a meaning to this life.
Citation APA (7th Edition)
Pennings, A.J. (2020, Aug 12) Oliver Stone’s Platoon: Jesus Christ Superstar vs. the Marlboro Man. apennings.com https://apennings.com/books-and-films/oliver-stones-platoon-jesus-christ-superstar-vs-the-marlboro-man/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea teaching Visual Rhetoric and ICT for Sustainable Development. From 2002-2012 he was on the faculty of New York University where he taught digital economics and media management. In graduate school he participated yearly in the Hawaii International Film Festival (HIFF) with Roger Ebert. He lives in Austin, Texas, when not in Korea.
Tags: Academy Award Best Picture > Charlie Sheen > Jesus Christ Superstar > Johnny Depp > Marlboro Man > Oliver Stone > Vietnam War
Memes, Propaganda, and Virality
Posted on | July 31, 2020 | No Comments
Today, we are faced with a new and potentially ominous form of manipulation, an insidious form of propaganda dissemination: the viral spread of memes. The “meme” has emerged as a powerful communication device in the modern world of social media, such as Facebook, Instagram, and Twitter (X). A meme here refers to a digital image, usually a JPEG or PNG, with a short text caption that is easily posted and shared. Usually, memes imply a joke or political message that can be diffused quickly and widely to a broad audience.
In this post, I examine the proliferation of memes and their potentially damaging effect on political culture. I discuss the rhetoric of memes, particularly their viral spread as a new form of propaganda that can easily be passed from friend to friend. Propaganda utilizes elements of powerful media techniques to have specific effects on the political and social consciousness of individuals.
One of my former colleagues at New York University had an apt description of propaganda. Media ecologist Neil Postman called propaganda an “intentionally designed communication that invites us to respond emotionally, immediately and in an either-or manner.” Propaganda is the use of powerful rhetorical forms that work on the individual to energize, promote, stimulate, and determine ideologies. It can mobilize support for political action as well as pressure for the legislation and implementation of specific policies.
Meme wars are a consistent feature of our politics, and they’re not just being used by internet trolls or some bored kids in the basement, but by governments, political candidates, and activists across the globe. Russia used memes and other social-media tricks to influence the US election in 2016, using a troll farm known as the Internet Research Agency to seed pro-Trump and anti-Clinton content across various online platforms. Both sides in territorial conflicts like those between Hong Kong and China, Gaza and Israel, and India and Pakistan are using memes and viral propaganda to sway both local and international sentiment. – Joan Donovan
What makes memes more insidious is that the propaganda is administered by some of a person’s most trusted friends. The goal of a meme is to spread rapidly through a population – to go viral. This is a diffusion process in which a meme is shared from person to person or person to group, even across relatively weak social links. The goal is to reach a point of exponential growth in the meme’s exposure and influence.
The success of virality depends on a high “pass-along rate,” where individuals are motivated to share the meme with others. A common communication network facilitates virality, as a meme may be “retweeted” in the Twitter environment or “shared” on Facebook. It helps if the process is easy – clicking a button rather than cutting and pasting. This sets the bar low, but sharing decisions are made very quickly and depend on relatively weak motivations.
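As a toy illustration of that compounding dynamic, here is a short Python sketch with invented numbers; the pass-along rate here stands for the average number of new viewers each current viewer generates.

# A toy sketch of viral diffusion: each generation of viewers passes
# the meme along at a fixed average rate, so reach compounds.
def viral_reach(seed_viewers: int, pass_along_rate: float, generations: int) -> int:
    """Total people reached after a number of sharing generations."""
    reached = seed_viewers
    current = seed_viewers
    for _ in range(generations):
        current = int(current * pass_along_rate)
        reached += current
    return reached

# Hypothetical numbers: 100 initial viewers, each generation reaching
# 1.5x the previous one. Above 1.0 the spread grows; below it, it dies out.
print(viral_reach(100, 1.5, 5))  # -> 2074 people after five generations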
An important measure is the virality rate: the number of people who went on to share your meme compared to the number of unique views or impressions it had during a certain period. You can get this metric by dividing the total number of shares of the meme by the number of impressions, then multiplying that figure by 100 to get your virality rate percentage. The higher the percentage, the better the meme performed.
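That calculation is simple enough to express directly. A minimal sketch, with invented numbers:

# A minimal sketch of the virality-rate calculation described above.
def virality_rate(shares: int, impressions: int) -> float:
    """Virality rate (%) = total shares / impressions x 100."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return shares / impressions * 100

# Hypothetical example: a meme seen 20,000 times and shared 500 times.
print(f"{virality_rate(shares=500, impressions=20_000):.1f}%")  # -> 2.5%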
Memes can be created with a program like Adobe’s Photoshop or Adobe Express. You can also use a specialized meme generator application on the Internet or your mobile phone, such as Word Swag. Canva, Imgur, Imgflip, and Livememe are other applications. But please, be honest and do your homework.
Memes are designed to crystallize or fix sets of meanings that, Postman argued, cause us to react quickly and emotionally. They draw on bits of culture, such as slogans, cartoon images, and brand items that are small and easily remembered. They are packaged semiotically, with images and text juxtaposed in ways that invite us to construct more complex associations. They are usually structured enough to draw us into some preferred meanings, yet evocative enough to draw on the reader’s string of cultural, economic, or political associations. Memes are usually vague enough to leave much for our imaginations to interject.
Memes are much like posters in that they remove authorship. The effect can be ominous, creating an anonymous yet authoritative “voice of God.” No one “has to answer for transgressive or hateful ideas.” Memes can weaponize half-truths, lies, and inappropriate material easily and diffuse them quickly through society.
Citation APA (7th Edition)
Pennings, A.J. (2020, Jul 31). Memes, Propaganda, and Virality. apennings.com https://apennings.com/meaning-makers/memes-virality-and-propaganda/
Ⓒ ALL RIGHTS RESERVED
Anthony J. Pennings, Ph.D. is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002-2012 was on the faculty of New York University. He has also taught at Marist College in New York, and Victoria University in New Zealand. He keeps his American home in Austin, Texas. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii where he worked on a National Computerization Policies Project.
Tags: media ecology > Memes > Neil Postman > propaganda > Virality
Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera
Posted on | June 9, 2020 | No Comments
Keanu Reeves produced a thoughtful and timely documentary on the move from photochemical to digital film. In Side by Side (2012), he interviews some of the best directors and directors of photography (DPs) about the transformation of the filmmaking process from celluloid to silicon. The transition, which has taken decades, is worth examining through the lens of Clay Christensen’s theory of disruptive innovation, which examines how a technology can start out “under the radar” as an inferior and cheaper version that is continuously improved until it disrupts a major industry.
The documentary looks at the development of digital cameras, editing, and special effects processes, as well as distribution and projection systems. For each category, it examines the differences between film and digital video. In interviews with prominent actors, directors and editors, the pros and cons of each are discussed along with the obvious arc of the movement towards the increased use of digital processes in cinema.
In this post, I introduce some of the issues in the move to digital cameras within the context of disruptive innovation theory. The theory is instrumental in understanding how a technology like digital video, which was obviously inferior for so long, could emerge to seriously challenge the beauty and institutional momentum of celluloid cinema.
Film emerged as a working technology during the 1880s and 1890s, with significant improvements in cameras and projection technologies made primarily by Thomas Edison and the French Lumière brothers. Innovations occurred over the next 100 years, but the invention of the digital charge-coupled device (CCD) in 1969 at Bell Labs marked the beginning of a disruptive trend in digital cameras that would slowly continue to improve until they became a major competitor to film cameras.
The CCD was initially developed for spy satellites during the Cold War but was later integrated into consumer and commercial video products. It was used by Steven Sasson at Kodak to invent the first digital camera in 1975. The technology was very crude, however, and Kodak did not see it as a worthy replacement for its film-based cameras. Neither did the film industry.
It was the Japanese electronics company Sony that developed the camcorder in the 1980s based on digital CCD technology and continued development into the generation of 4K cameras. Similar resolution was achieved by the Red One camera that rocked the film industry in 2007. The same company announced its Red Dragon 6K sensor at NAB 2013.
CCDs were largely replaced by complementary metal-oxide semiconductor (CMOS) sensors in digital cameras. A spinoff from NASA’s Jet Propulsion Laboratory (JPL), CMOS used less power and allowed cameras to be smaller. It made an enormous impact on social media, bringing awareness to uprisings in the Arab Spring and other crises around the world.
“Digital cinematography” emerged by 2002 with George Lucas’s Star Wars: Episode II – Attack of the Clones, which was shot entirely in a high-definition digital format. Although the lack of digital projection systems meant that the footage was transferred to film to play in theaters, the film still caused major controversy as the industry debated digital’s pros and cons. While these cameras were initially clearly inferior to their film counterparts, Sony and other companies stayed on the digital trail, improving them unmistakably to the point where early adopters like Lucas took a chance.
By committing first to consumer markets, digital cameras found the resources to continually improve. Later they showed additional characteristics that film couldn’t match, such as the ability to store huge amounts of footage in very small devices. This meant no more transporting cans of film – a major consideration when shooting in remote and hazardous locations. Increased storage capacity also meant the ability to shoot for longer periods of time – often to the actors’ chagrin, but with the benefit of maintaining momentum.
Another benefit was being able to watch what was being shot instantaneously on a monitor and to review the shots while still on the set. This capability was very popular with directors but shifted power away from the directors of photography.
The last fifteen years of camera development have witnessed the disruption of the entire global complex known as “Hollywood.” New cameras such as the Red Dragon and other digital technologies for editing, special effects, and distribution played havoc with the entire chain of creative processes established in the film world and its production and distribution circuits. Digital convergence has also broken down the barriers between film and television and expanded cinematic presentations to a wide variety of screens, from mobile phones to stadium jumbotrons.
Are we still in the infancy of an entirely new array of digital televisual experiences? The year 2007 marked a critical threshold for the use of digital technologies in cameras, but the innovations continued, including the integration of the digital camera into the smartphone and the wide-scale adoption of digital high definition. Rather than LIDAR, it is the digital camera that has been adopted by Tesla and other EV makers to enhance automation and safety.
Film continues to be championed by directors like Christopher Nolan who used it in Interstellar (2014) and Dunkirk (2017). Quentin Tarantino is also known for being dedicated to film in his movies, including Once Upon a Time in Hollywood (2019).
In the next post, I look at the disruptive influence of digital editing, particularly the non-linear editing (NLE) that was also made possible by the digital revolution.
Citation APA (7th Edition)
Pennings, A.J. (2020, Jun 9). Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera. apennings.com https://apennings.com/multimedia/digital-disruption-in-the-film-industry-gains-and-losses-part-1-the-camera/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. He wrote this during a brief stay at the Digital Media Management program at St. Edwards University in Austin, Texas. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Bell Labs > CCD > charge-coupled device > Christensen > Clay Christensen > Digital cinematography > Digitalistic > disruptive innovation theory > French Lumiere brothers > Red Dragon 6K Sensor > Sony Camcorder
“Letteracy” and Logos
Posted on | May 5, 2020 | No Comments
More than ever, people are interested in how visual design influences the production of meaning and its intellectual and emotional effects. In the new media age, a new type of literacy has emerged that Seymour Papert (1993) and others began to call “letteracy.” Papert was critical of the idea of introducing letters too early in a child’s development but recognized that connecting with culture and history required understanding alphabets and their significance.
“Letteracy” suggests a larger conversation about global visual culture and why people are increasingly interested in the impact of letters, typographies, and logos in our media world. A twist on “literacy,” it points to the discrepancy between a world in which reading is pervasive and our relative ignorance of how letters are designed and how they influence us.
This blog post explores letteracy by examining the significance of the alphabet and then focusing on the importance of fonts and typography in visual design, such as in a magazine or webpage layout, as well as in the use of logos.
Letters Capture Sound
One of the first questions to ask is, “What are letters?” Letters are the characters of an alphabet, used in written language to represent sounds. They are the building blocks of words and are fundamental to written communication in many languages. Each letter typically represents one or more sounds in a spoken language, and when combined in various sequences, letters form words that convey meaning.
Letters are phonographic – they code the sounds of language in scripted figures. A few writing systems, like Chinese characters, are ideographic; they code ideas into their figures. Phonographic writing has the advantage of coding everyday language in its letters while being flexible enough to incorporate new words. Ideographic writing requires extensive memorization and social mentoring to enforce meanings and consistency in sound reproduction.
Asian societies like Korea, and to a lesser extent Japan, have replaced Chinese characters with phonographic characters. Korea instituted “Hangul,” which is phonographic but has some iconic aspects: the characters represent the movements of the tongue and lips used to produce sounds. The change allowed Korea’s population to achieve a high rate of reading literacy. Japan has two sets of phonographic characters, hiragana and katakana. Both are sound-based, but each character represents a whole syllable – a consonant and vowel together. To complicate matters further, Japan still uses “Kanji,” ideographic characters borrowed from China.
Fonts and Typography
A key distinction in letteracy is between “font” and “typography.” The terms are often used interchangeably, but they refer to different aspects of written or printed text. A font is a specific style or design of a set of characters that share consistent visual characteristics – letters, numbers, punctuation marks, and symbols. Font characteristics include attributes such as typeface, weight, style (e.g., regular, italic, bold), size, and spacing. Examples of fonts include Arial, Times New Roman, Helvetica, and Comic Sans.
Typography, on the other hand, encompasses the art and technique of arranging type to make written language readable, legible, and visually appealing. It is the design and arrangement of letters and text so that the writing is easy to understand, appealing, and conveys an appropriate set of feelings and meanings. Typography involves the selection and use of fonts, as well as considerations such as layout, spacing, alignment, hierarchy, color, and overall design. Good typography involves careful attention to detail and consideration of the intended audience, context, and purpose of the text.
Below is one of my favorite TED talks about typography.
Spacing
Part of this literacy is understanding the various meanings associated with typography. Type fonts can be designed and used with various purposes in mind. The “Power of Typography” video above explains this in more detail. As the speaker points out, spacing is an important issue in typography. Kerning, tracking, and leading are three terms that describe how space is used and help us with the denotative analysis of type.
Kerning deals with the distance between two letters. Words are indecipherable if the letters are tooclose or too far apart. They can also be awkward to read when some letters have wi d e r spacing and others narrower.
Tracking involves adjusting the spacing throughout the e n t i r e word. It can be used to change the spacing equally between every letter at once. Tracking can make a single word seem airy and impressive but can quickly lead to difficulty in reading if used excessively.
Leading is a design aspect that determines how text is spaced vertically in lines. It is the distance from one line of type to the next – traditionally measured from baseline to baseline – and gives lines enough room to remain legible.
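For readers designing for the web, these three print-era terms map onto CSS properties that can be set by hand or from a script. Below is a minimal TypeScript sketch – assuming a browser environment and a paragraph with the hypothetical id “sample” – that applies the usual mappings: font-kerning for kerning, letter-spacing for tracking, and line-height for leading.

// A minimal sketch: adjusting kerning, tracking, and leading with CSS.
// Assumes a browser DOM and a paragraph with id="sample" (hypothetical).
const sample = document.getElementById("sample");

if (sample) {
  // Kerning: apply the font's built-in spacing for letter pairs like "AV".
  sample.style.setProperty("font-kerning", "normal");

  // Tracking: uniform extra space between every letter; keep values small,
  // or the word becomes airy and hard to read.
  sample.style.setProperty("letter-spacing", "0.05em");

  // Leading: vertical spacing of lines; a unitless multiple of the font
  // size (here 1.5) is a common starting point for body text.
  sample.style.setProperty("line-height", "1.5");
}

Print designers traditionally measure leading in points from baseline to baseline; on the web, a unitless line-height multiplier is the more flexible choice because it scales with the font size.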
From Printing Press to Desktop Publishing
Johannes Gutenberg is credited with inventing both the printing press and durable metal type in Europe around 1440 AD. The technology had also been developed in China and Korea, but conditions in Europe were better for its expansion. Printing presses in China and Korea were state-based projects that eventually withered. In contrast, religious, market, and political conditions in Europe improved the technology’s chances of success.
The first best-seller? The Christian Bible. In 1517, the Protestant Reformation began, emphasizing reading of the Bible over the services of the Catholic Church and its priests. It also helped Europe develop separate nation-states as people became more literate in their local languages. Printed materials in different dialects began to consolidate community identities. People began to identify with others who spoke the same dialect and to recognize them as sharing the same national values. Benedict Anderson called these “imagined communities” in the book of the same name.
Thanks to Steve Jobs and the Apple Macintosh graphical user interface, multiple typefaces came to personal computers. Along with WYSIWYG (“what you see is what you get”) display, the new GUI enabled desktop publishing and democratized the printing “press.” Consequently, understanding the importance of different styles of letters became an important literacy of the digital age.
Logos
A logo is the graphic signature of a person or organization. It is meant to encapsulate and communicate the preferred symbolic meanings of an organization that cannot be imparted through words alone. A logo is a simple, abstracted representation of an individual or corporate identity. It is a constructed icon meant to immediately denote and connote meaning.
A logo should maintain its clarity at many different sizes. It should be designed as an easily remembered icon that represents its bearer – one that can be seen and recognized instantly and reduced in size without disappearing from view or losing its legibility.
It may include visual elements such as colors, forms, fonts, and shapes, and even dress codes. It should be effective in both black-and-white and color, and it should reproduce well across different media (paper, RGB screens, etc.) and packaging. Few logos achieve these standards, but those that succeed play an important role in determining success.
Conclusion
Letteracy provides a framework for understanding the significance of letters, typography, and logos in visual design. By appreciating the art and science of typography and recognizing the power of logos, we can better comprehend and communicate through the visual language of text and symbols in our media-saturated world.
Citation APA (7th Edition)
Pennings, A.J. (2020, May 5) “Letteracy” and Logos. apennings.com https://apennings.com/visual-literacy-and-rhetoric/letteracy-and-logos/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002 to 2012, he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. His American home is in Austin, Texas, where he has taught in the Digital Media BS and MBA programs at St. Edwards University. He joyfully spent nine years at the East-West Center in Honolulu, Hawaii.
Tags: desktop publishing > Hangul > letteracy > phonographic > print capitalism > printing press > visual literacy