Digital Disruption in the Film Industry – Gains and Losses – Part 2: Non-Linear Editing
Posted on | August 26, 2020 | No Comments
The Matrix (1999) won four Oscars after being edited on the Avid, a non-linear video editing computer system that launched a revolution in the production of audio-visual materials. A few years before, James Cameron edited much of Titanic (1997) himself with an Avid application on a computer at his house.
This post returns to the theme of digital disruption in the film industry. Part 1 discussed digital cameras. Below, I examine the transformation of post-production practices with the advent of non-linear editing (NLE) and digitally enabled special effects (F/X), paying particular attention to the introduction of the Avid NLE. These technologies became possible when computer micro-processing power was sufficiently miniaturized and software applications became efficient enough for immediate interaction.
For a quick overview of the digital transformation in the film industry, this summary of Keanu Reeves’ documentary Side by Side (2012) is useful:
NLE clearly came about because customers were often under-served by traditional film/video editing technology. While chemically coated film had been used for over a hundred years, it required a massive industry to operate and thrive. Cameras and their film reels were heavy and difficult to transport, and silver halide film stock is hazardous and complicated to develop.
Cameras captured the action on film, but the footage could not be viewed until it was processed. The director and the production team would gather after shooting to view the “dailies,” deciding which takes were good and what adjustments to make to camera lighting or style.
Editing was a manual chore that required sifting through scores of film reels to find the right clips. It was a laborious activity that often left editors with bloody fingers. The final cut required literally cutting the film strips and taping them together.
Finally, only a few techniques existed to add special effects to the film in post production.
Early NLE technology was cumbersome and weak, but it progressed rapidly. This is in line with Clayton Christensen’s theory of disruptive innovation. While early developments in digital video were crude and offered few aesthetic or production efficiencies to challenge film, innovations accumulated rapidly as the PC integrated the fruits of Moore’s Law, the rapid increase in the number of transistors on a microprocessor chip. Eventually, the sophistication of digital technologies presented a major threat to the film industry.[1]
While the digital non-linear editing process was conceptualized in the early 1970s, when CBS and Memorex collaborated on the CMX 600, the technology was not practical enough for commercial use. It used washing machine-sized storage disks and cost a quarter of a million dollars to record and edit low-resolution black-and-white images.[2]
The next stage in digital editing stemmed from George Lucas’s special effects empire with the invention of EditDroid, a computer system that used footage stored on LaserDiscs. EditDroid was sold by a George Lucas spin-off company, DroidWorks. The technology never worked very well, and the company shut down in 1987.
The major disruption came with the development of the Avid NLE system. Bill Warner, the founder of Avid Technology, Inc., had been injured in an accident that left him unable to walk. In 1984, he sought to buy digital editing equipment but was surprised that the technology didn’t really exist on the market.
Warner spent the next few years working to create a non-linear editing system. In 1987, he invented the digital editor and formed the Avid company. At first, the digital editing took place on an Apple Macintosh, which would then render the edits to tape offline by controlling a stack of conventional video editing machines.
He started the company in September 1987, just weeks before the stock market crashed. Warner recalled, “I started the company in September of 1987. I got going and then in November of 1987 boom, the stock market fell apart, Black Monday, 500 point fall in the Dow and people said to me, they said, they said oh, now you’ll never raise money. And I said I just need one — I am just one person who needs money. I need $500,000. It’s still out there. I’ll get it.” He took his Avid technology to the National Association of Broadcasters (NAB) annual trade show and found the response overwhelming.
A colleague of mine was at the 1988 NAB conference and was intrigued by the Avid. Patricia Amaral, who ran the Media Lab at the University of Hawaii, worked with Professors Stan Harms and Dan Wedemeyer to acquire one. In 1989, the University of Hawaii became the first educational institution to purchase an Avid.
I first saw the Avid non-linear digital editing system as a PhD student at the University of Hawaii in 1989. It was a clunky system, but it connected to a new Apple Mac that brought up thumbnails of all the clips that had been digitized. Using a mouse and keyboard, the clips could be edited and assembled and transformations added, but final video assembly took a while because the technology still required the actual rendering to be done with traditional 3/4-inch tape drives. At the end of the day, the editors would start that laborious task, and over the course of a few hours the machines would finish the edits.
How does this fit Christensen’s disruptive innovation model? For Christensen, almost all disruptive innovations happen when a new entrant enters a market with simple technology that eventually improves enough to seriously disrupt incumbents. He distinguishes disruptive innovation from sustaining innovation, in which a company continues to improve an established product. While film editing made continuous improvements over the course of its history, NLE made transformative changes over a short period of time. The results were so weak in the beginning that film advocates quickly dismissed the technology, only to be surprised by how quickly it became competitive.[3]
Now, several NLEs compete with each other in the cinema, consumer, and television industries. Adobe Premiere Pro, Avid Media Composer, DaVinci Resolve, and Final Cut Pro X are the current leaders in NLE technology. The editors below discuss the current state of NLE editing.
In an upcoming post, I will discuss digital’s influence on special effects.
Notes
[1] Christensen, Clayton M., Michael E. Raynor, and Rory McDonald. “What Is Disruptive Innovation?” Harvard Business Review, December 2015. hbr.org/2015/12/what-is-disruptive-innovation.
[2] Oldham, Gabriella. First Cut 2: More Conversations with Film Editors, p. 4.
[3] Christensen, Clayton M., and Michael E. Raynor. The Innovator’s Solution: Creating and Sustaining Successful Growth. Perseus Books Group, 2003, p. 18. Kindle edition.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at St. Edwards University in Austin, Texas, where he maintains his US address. From 2002-2012 he was on the faculty of New York University, and he held his first academic position at Victoria University in New Zealand. He has also spent ten years at the East-West Center in Honolulu, Hawaii.
Tags: Avid > Clay Christensen > DroidWorks > EditDroid > NLE > Patricia Amaral > Side by Side movie > University of Hawaii Media Lab
Oliver Stone’s Platoon: Jesus Christ Superstar vs. the Marlboro Man
Posted on | August 12, 2020 | No Comments
In Oliver Stone’s award-winning film Platoon (1986), Charlie Sheen plays Chris Taylor, a “coming of age” infantry soldier trying to reconcile his identity between the influences of two sergeants in his US Army platoon. The setting is the Vietnam War circa late 1967. The sergeants, played by Willem Dafoe and Tom Berenger, were directed to represent two mythical poles of ideology and belief that have come to heavily influence American political culture. I refer to these two men and the contrasting themes they represent as “Jesus Christ Superstar” vs. “the Marlboro Man.”
Platoon (1986) won Best Picture at the 1987 Academy Awards and received additional awards for Best Film Editing and Best Sound, as well as a nomination for Best Cinematography. Oliver Stone won an “Oscar” for Directing and was nominated for Writing. Both sergeants were nominated for Best Actor in a Supporting Role, which brings me back to the two conflicting myths.
With Sgt. Elias (Willem Dafoe) representing “Jesus Christ Superstar” and Sgt. Barnes (Tom Berenger) “the Marlboro Man,” the movie condenses a series of meanings into two contending perspectives. These viewpoints divided America and haunt its view of the war to this day. Barnes characterizes the tension succinctly at one point: “there’s the way it ought to be, and there’s the way it is.” Barnes, who was shot seven times, including in the face, has the physical scars to represent “the way it is.”
Jesus Christ Superstar was a rock opera released as an album in 1970, staged on Broadway in 1971, and adapted as a movie in 1973. The film was shot in Israel and other Middle Eastern locations and was the eighth highest-grossing film of that year. It reconciled different gospels of the Bible and focused on the relationship between Jesus, Judas, and Mary Magdalene, emphasizing betrayal and picking one’s battles. It was in some ways an anthem of the time, as its rock and roll music and energy resonated with the “hippie” counterculture that emerged during the height of the Vietnam War. Jesus of Nazareth, with his long hair, certainly looked the part. Stone “paints” Elias and several other soldiers with iconography from the era. Peace symbols, headbands, drugs, and Sixties music like Jefferson Airplane’s “White Rabbit” are used to represent this counterculture.
It’s hard to portray Elias as a Jesus-like pacifist when he volunteered for three tours in the “Nam” and was a skilled and experienced soldier. But from the first scene, we see him carrying a machine gun on his shoulders like a cross and climbing up a mountainside like Jesus ascending Calvary. As Sergeant O’Neill from another squad says about Elias after an argument, “Guy’s in three years and he thinks he’s Jesus fuckin’ Christ or something.”
Elias is portrayed as the more sensitive leader. We next encounter him helping Chris and offering to carry much of the load from his amateurishly stuffed backpack. Most importantly, he is the voice of restraint when the platoon is searching a Vietnamese village for guns and ammunition. When Sgt. Barnes shoots a Vietnamese village woman during an interrogation, Elias confronts him and initiates a fistfight.
This scene creates a tension between the two, as Barnes faces a potential court-martial for the murder. The conflict eventually culminates with Barnes shooting Elias during a battle with the Viet Cong. The shots don’t kill him, though, and as the platoon is being evacuated by helicopter, he is sighted from the air being chased by Vietnamese troops. He is shot several times in the back but struggles to continue. Finally, as he falls to his knees, writhing in pain, a medium shot shows him with his arms outstretched and his gaze toward the heavens, as if he were being crucified.
The Marlboro Man was another iconic figure of the Vietnam era. It became the masculine symbol of the famous cigarette brand. Invented to subvert the early impression that Marlboro cigarettes were for women, it successfully became the icon of rugged individualism and aggressive patriarchy. The first scene of Barnes shows him in the jungle with a pack of Marlboro cigarettes strapped to his helmet.
Barnes was clearly the leader of the platoon, as even the lieutenant deferred to his judgment. His first words in the movie, delivered in his Southern accent, are “Get a move on, boy” to Chris. He is regularly portrayed as the tough but competent, no-nonsense leader. At one point, while criticizing the pot smokers for what he calls their attempt to “escape reality,” he says, “I am reality.”
Oliver Stone served in Vietnam and was awarded the Bronze Star medal. The story was based roughly on his experience there. In Stone’s interview with Joe Rogan, he speaks to his respect for both sergeants. While Stone clearly favors Elias, his portrayal of Barnes is surprisingly sympathetic, and we see how both men influence Chris.
Chris arrives in Vietnam as a “cherry,” a virgin to the war experience. But after he recovers from being shot during the platoon’s first ambush, he befriends a black man named King and a “surfer dude” from California named Crawford. They are all assigned to cleaning the latrines, and the scene allows Chris to tell the story of why he quit college and enlisted in the Army: “I wasn’t learning anything. I figured why should just the poor kids go off to war and the rich kids always get away with it?” The others laugh off his naivety but invite him to the Feel Good Cave, a bunker where they “party” by playing music and smoking pot.
King introduces Taylor as the resurrected “Chris” to the “heads,” including soldiers played by Johnny Depp and Forest Whitaker. Elias is there smoking pot as well and welcomes Chris with a “hit” of marijuana blown through the barrel of a rifle. You can hear Grace Slick singing “feed your head” as Chris says he feels good and can’t feel the pain from his injury. Elias responds, “feeling good is good enough.”
Tom Berenger is masterful in his performance as Sgt. Barnes. While Elias is “partying” with the “stoners,” Barnes is listening to country music and playing cards while drinking whiskey and beer with his group. Later, after Elias is dead, Barnes goes to the Feel Good bunker to confront Elias’ friends in the platoon. With a Jack Daniels Tennessee whiskey bottle in hand, he goes on to criticize the recently departed Elias.
Elias was full of shit. Elias was a Crusader. Now, I got no fight with any man who does what he’s told, but when he don’t, the machine breaks down. And when the machine breaks down, we break down. And I ain’t gonna allow that in any of you. Not one.
The scene ends with Chris attacking Barnes, who quickly subdues the young soldier. Barnes is convinced not to kill Chris, as he would face ten years in military prison for killing an enlisted man.
In a later battle, the platoon is overrun by the Viet Cong, and an airstrike is called in to bomb the US camp. Barnes takes advantage of the chaos to try to kill Chris, but the sergeant is knocked out by the concussion of the bombing. Chris and Barnes barely survive. When Barnes asks Chris to get a medic, Chris shoots him in retaliation for Elias’ death.
As Chris is airlifted from the battleground, his voice-over narrates an inner conflict:
I think now, looking back, we did not fight the enemy; we fought ourselves. And the enemy was in us. The war is over for me now, but it will always be there, the rest of my days as I’m sure Elias will be, fighting with Barnes for what Rhah called possession of my soul. There are times since, I’ve felt like the child born of those two fathers. But, be that as it may, those of us who did make it have an obligation to build again, to teach to others what we know, and to try with what’s left of our lives to find a goodness and a meaning to this life.
Citation APA (7th Edition)
Pennings, A.J. (2020, Aug 12) Oliver Stone’s Platoon: Jesus Christ Superstar vs. the Marlboro Man. apennings.com https://apennings.com/books-and-films/oliver-stones-platoon-jesus-christ-superstar-vs-the-marlboro-man/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea, teaching Visual Rhetoric and ICT for Sustainable Development. From 2002-2012 he was on the faculty of New York University, where he taught digital economics and media management. In graduate school, he participated yearly in the Hawaii International Film Festival (HIFF) with Roger Ebert. He lives in Austin, Texas, when not in Korea.
Tags: Academy Award Best Picture > Charlie Sheen > Jesus Christ Superstar > Johnny Depp > Marlboro Man > Oliver Stone > Vietnam War
Memes, Propaganda, and Virality
Posted on | July 31, 2020 | No Comments
Today, we are faced with a new and potentially ominous form of manipulation, an insidious form of propaganda dissemination: the viral spread of memes. The “meme” has emerged as a powerful communication device in the modern world of social media, such as Facebook, Instagram, and Twitter (X). A meme here refers to a digital image, usually a JPEG or PNG, with a short text caption that is easily posted and shared. Memes usually carry a joke or political message that can be diffused quickly and widely to a broad audience.
In this post, I examine the proliferation of memes and their potentially damaging effect on political culture. I discuss the rhetoric of memes, and particularly their viral circulation, as a new form of propaganda passed easily from friend to friend. Propaganda utilizes powerful media techniques to produce specific effects on the political and social consciousness of individuals.
One of my former colleagues at New York University had an apt description of propaganda. Media ecologist Neil Postman called it “intentionally designed communication that invites us to respond emotionally, immediately and in an either-or manner.” Propaganda is the use of powerful rhetorical forms that work on the individual to energize, promote, stimulate, and determine ideologies. It can mobilize support for political action as well as pressure for the legislation and implementation of specific policies.
Meme wars are a consistent feature of our politics, and they’re not just being used by internet trolls or some bored kids in the basement, but by governments, political candidates, and activists across the globe. Russia used memes and other social-media tricks to influence the US election in 2016, using a troll farm known as the Internet Research Agency to seed pro-Trump and anti-Clinton content across various online platforms. Both sides in territorial conflicts like those between Hong Kong and China, Gaza and Israel, and India and Pakistan are using memes and viral propaganda to sway both local and international sentiment. – Joan Donovan
What makes memes more insidious is that the propaganda is administered by some of a person’s most trusted friends. The goal of a meme is to spread rapidly through a population, to go viral. This is a diffusion process in which a meme is shared from person to person or person to group, often across weak social ties. The goal is to reach a point of exponential growth in the meme’s exposure and influence.
The success of this virality depends on a high “pass-along rate,” where individuals are motivated to share the meme with others. A common communication network facilitates virality, as a meme may be “retweeted” in the Twitter environment or “shared” on Facebook. It helps if the process is easy: clicking a button rather than cutting and pasting. This sets the bar low, but sharing decisions are made very quickly and depend on relatively weak motivations.
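The “exponential growth” claim can be made concrete with a toy diffusion model. Everything in this Python sketch is hypothetical and mine, not from the post: assume each share is seen by a fixed number of viewers, a fixed fraction of whom share it again. If that product (new shares generated per share) exceeds 1.0, exposure compounds with every generation:

```python
def viral_views(initial_shares: int, views_per_share: int,
                share_probability: float, generations: int) -> int:
    """Toy model: total views accumulated over several sharing generations."""
    shares, total_views = initial_shares, 0
    for _ in range(generations):
        views = shares * views_per_share
        total_views += views
        shares = round(views * share_probability)  # viewers who pass it along
    return total_views

# Hypothetical numbers: 10 seed posts, 50 viewers each, 4% pass-along rate.
# 50 * 0.04 = 2.0 shares per share, so exposure doubles each generation.
print(viral_views(10, 50, 0.04, generations=10))  # 511500 views
```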
An important measure is the virality rate, the number of people who went on to share your meme compared to the number of unique views or impressions it had during a certain period. You can get this metric by dividing the number of total shares of the meme by the number of impressions. Multiply that figure by 100 to get your virality rate percentage. The higher the percentage, the better the meme.
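To make that arithmetic concrete, here is a minimal Python sketch of the calculation just described. The function name and the sample numbers are hypothetical, included only to illustrate the formula (shares divided by impressions, times 100):

```python
def virality_rate(shares: int, impressions: int) -> float:
    """Percentage of viewers who went on to share the meme."""
    if impressions <= 0:
        raise ValueError("impressions must be a positive number")
    return (shares / impressions) * 100

# Hypothetical example: a meme seen 12,500 times and shared 450 times.
rate = virality_rate(shares=450, impressions=12_500)
print(f"Virality rate: {rate:.1f}%")  # prints: Virality rate: 3.6%
```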
Memes can be created with a program like Adobe’s Photoshop or Adobe Express. You can also use a specialized meme generator application on the Internet or your mobile phone, such as Word Swag. Canva, Imgur, Imgflip, and Livememe are other applications. But please, be honest and do your homework.
Memes are designed to crystallize or fix sets of meanings that, Postman argued, cause us to react quickly and emotionally. They draw on bits of culture, such as slogans, cartoon images, and brand items, that are small and easily remembered. They are packaged semiotically, with images and text juxtaposed in ways that invite us to construct more complex associations. They are usually structured enough to draw us into preferred meanings, yet evocative enough to play on the reader’s string of cultural, economic, or political associations. Memes are usually vague enough to leave much for our imaginations to fill in.
Memes are much like posters in that they remove authorship. The effect can be ominous, creating an anonymous yet authoritative “voice of God.” No one “has to answer for transgressive or hateful ideas.” Memes can weaponize half-truths, lies, and inappropriate material easily and diffuse them quickly through society.
Citation APA (7th Edition)
Pennings, A.J. (2020, Jul 31). Memes, Propaganda, and Virality. apennings.com https://apennings.com/meaning-makers/memes-virality-and-propaganda/
© ALL RIGHTS RESERVED
Anthony J. Pennings, Ph.D. is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002-2012 he was on the faculty of New York University. He has also taught at Marist College in New York and Victoria University in New Zealand. He keeps his American home in Austin, Texas. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii, where he worked on a National Computerization Policies Project.
Tags: media ecology > Memes > Neil Postman > propaganda > Virality
Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera
Posted on | June 9, 2020 | No Comments
Keanu Reeves produced a thoughtful and timely documentary on the move from photochemical to digital film. In Side by Side (2012), he interviews some of the best directors and directors of photography (DPs) about the transformation of the filmmaking process from celluloid to silicon. The transition, which has taken decades, is worth examining through the lens Clay Christensen provides with his theory of disruptive innovation. His theory examines how a technology can start out “under the radar” as an inferior, cheaper version that is continuously improved until it disrupts a major industry.
The documentary looks at the development of digital cameras, editing, and special effects processes, as well as distribution and projection systems. For each category, it examines the differences between film and digital video. In interviews with prominent actors, directors and editors, the pros and cons of each are discussed along with the obvious arc of the movement towards the increased use of digital processes in cinema.
In this post, I introduce some of the issues in the move to digital cameras within the context of disruptive innovation theory. The theory is instrumental in understanding how a technology like digital video, which was obviously inferior for so long, could emerge to seriously challenge the beauty and institutional momentum of celluloid cinema.
Film coalesced into a working technology during the 1880s and 1890s, with significant improvements in cameras and projection technologies made primarily by Thomas Edison and the French Lumiere brothers. Innovations occurred over the next 100 years, but the invention of the charge-coupled device (CCD) at Bell Labs in 1969 marked the beginning of a disruptive trend in digital cameras, which would slowly improve until they became a major competitor to film cameras.
The CCD was initially developed for spy satellites during the Cold War but was later integrated into consumer and commercial video products. Steven Sasson at Kodak used it to build the first digital camera in 1975. The technology was very crude, however, and Kodak did not see it as a worthy replacement for its film-based cameras. Neither did the film industry.
It was the Japanese electronics company SONY that developed the camcorder in the 1980s based on CCD technology and continued development into the generation of 4K cameras. Similar resolution was achieved by the Red One camera that rocked the film industry in 2007. The same company, Red, announced its Red Dragon 6K sensor at NAB 2013.
CCDs were largely replaced by complementary metal-oxide semiconductor (CMOS) sensors in digital cameras. A spinoff from NASA’s Jet Propulsion Laboratory (JPL), CMOS used less power and allowed cameras to be made smaller. It made an enormous impact on social media, bringing awareness to uprisings in the Arab Spring and other crises around the world.
“Digital cinematography” emerged by 2002 with George Lucas’s Star Wars: Episode II – Attack of the Clones, which was shot entirely in a high-definition digital format. The lack of digital projection systems meant that the footage had to be transferred to film to play in theaters, but the film still caused major controversy as the industry debated digital’s pros and cons. While these cameras were initially clearly inferior to their film counterparts, SONY and other companies kept improving them, unmistakably, to the point where early adopters like Lucas took a chance.
By committing first to consumer markets, digital cameras found the resources to continually improve. Later, they showed additional characteristics that film couldn’t match, such as the ability to store huge amounts of footage in very small devices. This meant no more transporting cans of film, a major consideration when shooting in remote and hazardous locations. Increased storage capability also meant the ability to shoot for longer periods of time, often to the actors’ chagrin but with the benefit of maintaining momentum.
Another benefit was being able to watch what was being shot instantaneously on a monitor and to review the shots while still on the set. This capability was very popular with directors but shifted power away from the directors of photography.
The last fifteen years of camera development have witnessed the disruption of the entire global complex known as “Hollywood.” New cameras such as the Red Dragon and other digital technologies for editing, special effects, and distribution played havoc with the entire chain of creative processes established in the film world and its production and distribution circuits. Digital convergence has also broken down the barriers between film and television and expanded cinematic presentations to a wide variety of screens, from mobile phones to stadium jumbotrons.
Are we still in the infancy of an entirely new array of digital televisual experiences? The year 2007 marked a critical threshold for the use of digital technologies in cameras, but the innovations continued, including the integration of the digital camera into the smartphone and the wide-scale adoption of digital high definition. Rather than LIDAR, it is the digital camera that has been adopted by Tesla and other EV makers to enhance automation and safety.
Film continues to be championed by directors like Christopher Nolan who used it in Interstellar (2014) and Dunkirk (2017). Quentin Tarantino is also known for being dedicated to film in his movies, including Once Upon a Time in Hollywood (2019).
In the next part of this series, we look at the disruptive influence of digital non-linear editing (NLE). Following that is a look at the special effects also made possible by the digital revolution.
Citation APA (7th Edition)
Pennings, A.J. (2020, Jun 9). Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera. apennings.com https://apennings.com/multimedia/digital-disruption-in-the-film-industry-gains-and-losses-part-1-the-camera/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. He wrote this during a brief stay at the Digital Media Management program at St. Edwards University in Austin, Texas. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Bell Labs > CCD > charge-coupled device > Christensen > Clay Christensen > Digital cinematography > Digitalistic > disruptive innovation theory > French Lumiere brothers > Red Dragon 6K Sensor > Sony Camcorder
“Letteracy” and Logos
Posted on | May 5, 2020 | No Comments
More than ever, people are interested in how visual design influences the production of meaning and its intellectual and emotional effects. In the new media age, a new type of literacy has emerged that Seymour Papert (1993) and others began to call “letteracy.” Papert was critical of the idea of introducing letters too early in a child’s development but recognized that connecting with culture and history required understanding alphabets and their significance.
“Letteracy” suggests a larger conversation about global visual culture and why people are increasingly more interested in the impact of letters, typographies, and logos in our media world. A twist on “literacy,” it points to the discrepancy between a world in which reading is pervasive and the relative ignorance of how letters are designed and have an influence on us.
This blog post explores letteracy by considering the significance of the alphabet and then focusing on the importance of fonts and typography in visual design, such as in a magazine or webpage layout, as well as in the use of logos.
Letters Capture Sound
One of the first questions to ask is “What are letters?” Letters typically refer to characters of the alphabet, which are used in written language to represent sounds. Letters are the building blocks of words and are fundamental to written communication in many languages. Each letter typically represents one or more sounds in a spoken language, and when combined in various sequences, they form words that convey meaning.
Letters are phonographic: they code the sounds of language in scripted figures. A few writing systems, like Chinese characters, are ideographic: they code ideas into their figures. Phonographic writing has the advantage of coding everyday language in its letters while being flexible enough to incorporate new words. Ideographic writing requires extensive memorization and social mentoring to enforce meanings and consistency in sound reproduction.
Asian societies like Korea, and to a lesser extent Japan, have replaced Chinese characters with phonographic characters. Korea instituted “Hangul,” which is phonographic but has some iconic aspects: the characters represent the movements of the tongue and lips used to produce sounds. The change allowed Korea’s population to achieve a high rate of reading literacy. Japan has two sets of phonographic characters, hiragana and katakana. Both are sound-based, but each character represents a whole syllable, a vowel and a consonant together. To make the situation a bit more complicated, Japanese still uses “Kanji,” ideographic characters borrowed from China.
Fonts and Typography
A key distinction in letteracy is between the terms “font” and “typography” that are often used interchangeably, but they refer to different aspects of written or printed text. A font refers to a specific style or design of a set of characters that share consistent visual characteristics. This set would include letters, numbers, punctuation marks, and symbols. Font characteristics include attributes such as typeface, weight, style (e.g., regular, italic, bold), size, and spacing. Examples of fonts include Arial, Times New Roman, Helvetica, and Comic Sans.
Typography, on the other hand, encompasses the art and technique of arranging type to make written language readable, legible, and visually appealing. It is the design and arrangement of letters and text so that the writing is easy to understand, appealing, and conveys an appropriate set of feelings and meanings. Typography involves the selection and use of fonts, as well as considerations such as layout, spacing, alignment, hierarchy, color, and overall design. Good typography involves careful attention to detail and consideration of the intended audience, context, and purpose of the text.
Below is one of my favorite TED talks about typography.
Spacing
Part of this literacy is an understanding of the various meanings associated with typography. Typefaces can be designed and used with various purposes in mind. The “Power of Typography” video above explains this in more detail. As the speaker points out, spacing is an important issue in typography. Kerning, tracking, and leading are three terms that describe the importance of space and help us with denotative analysis.
Kerning deals with the distance between two letters. Words are indecipherable if the letters are tooclose or too far apart. They can also be awkward to read when some letters have wi d e r spacing and others narrower.
Tracking involves adjusting the spacing throughout the e n t i r e word. It can be used to change the spacing equally between every letter at once. Tracking can make a single word seem airy and impressive but can quickly lead to difficulty in reading if used excessively.
Leading is a design aspect that determines how text is spaced vertically in lines. It deals with the distance |||||||||||||||||||||||||||||| from the bottom of the words above to the top of the words below in order to make them legible.
From Printing Press to Desktop Publishing
Johannes Gutenberg is credited with inventing both the printing press and the production of durable metal type around 1450. The technology had also been developed in China and Korea, but conditions in Europe were better for its expansion. Printing presses in China and Korea were state-based projects that eventually withered. Conversely, religious, market, and political conditions in Europe improved the technology’s chances of success.
The first best-seller? The Christian Bible. In 1517, the Protestant Reformation began, emphasizing the reading of the Bible over the services of the Catholic Church and its priests. It also helped Europe develop separate nation-states as people became more literate in their local languages. Printed materials in different dialects began to consolidate community identities. People began to identify with others who spoke the same dialect and to recognize them as sharing the same national values. Benedict Anderson called these “imagined communities” in the book of the same name.
Thanks to Steve Jobs and the Apple Macintosh’s graphical user interface (GUI), different typefaces were added to personal computers. Along with WYSIWYG (“what you see is what you get”) displays, the new GUI enabled desktop publishing. This democratized the printing “press.” Consequently, understanding the importance of different styles of letters became an important literacy of the digital age.
Logos
A logo is the graphic signature of a person or organization. It is meant to encapsulate and communicate the preferred symbolic meanings of an organization that cannot be imparted through words alone. A logo is a simple, abstracted representation of an individual or corporate identity. It is a constructed icon meant to immediately denote and connote meaning.
A logo should maintain its clarity in many different sizes. It should be designed as an easily remembered icon that represents its bearer. It is meant to be seen and recognized instantly. It is also meant to be reduced in size without disappearing from view or losing its legibility.
It may include visual elements such as colors, forms, fonts, and shapes, and even dress codes. It should be effective in both black-and-white and color, and it should reproduce well across different media and packaging (paper, RGB screens, etc.). Few logos achieve these standards, but those that succeed play an important role in determining success.
Conclusion
Letteracy provides a framework for understanding the significance of letters, typography, and logos in visual design. By appreciating the art and science of typography and recognizing the power of logos, we can better comprehend and communicate through the visual language of text and symbols in our media-saturated world.
Citation APA (7th Edition)
Pennings, A.J. (2020, May 5) “Letteracy” and Logos. apennings.com https://apennings.com/visual-literacy-and-rhetoric/letteracy-and-logos/
© ALL RIGHTS RESERVED
Anthony J. Pennings, Ph.D. is a Professor at the Department of Technology and Society, State University of New York, Korea. From 2002-2012 he was on the faculty of New York University. Previously, he taught at Marist College in New York and Victoria University in New Zealand. His American home is in Austin, Texas, where he has taught in the Digital Media BS and MBA programs at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.
Tags: desktop publishing > Hangul > letteracy > phonographic > print capitalism > printing press > visual literacy
US Internet Policy, Part 2: The Shift to Broadband
Posted on | March 24, 2020 | No Comments
This post is the second in a series I am producing during the COVID-19 pandemic about the importance of telecommunications policy in ensuring the widespread availability of affordable high-speed Internet access. Teaching online and working from home have gone from fringe activities to central components of life. As we move toward a Smart New Deal to transform American life, how we structure our digital environments will be central. This post discusses the transition from dial-up modems in the early days of the Internet to high-speed broadband connections. With that technical transition and the FCC’s 2005 decision, the competitive environment that the Clinton-Gore administration built collapsed into the cable-telco duopoly we see today, making “net neutrality” an issue.
The Internet Service Providers (ISPs) mentioned in US Internet Policy, Part 1 facilitated the process of getting individuals, businesses, and government “online” by linking to the Internet backbone and going “retail,” first with dial-up modems and then with high-speed broadband connections. The term “online” emerged as a way to distinguish data communications from telephony, which was highly regulated by the Communications Act of 1934. ISPs offered businesses and consumers high-speed data services for accessing the World Wide Web, hosting websites, and providing large file transfers (FTP). The key was accessing the rapidly expanding Internet packet-switching backbone network that had been developed by the National Science Foundation (NSF).
The National Science Foundation’s backbone network (NSFNET) began data transmissions at 56 kilobits per second (Kbit/s) but was upgraded to T1 lines in 1988, transmitting at 1.544 megabits per second (Mbit/s). It eventually consisted of some 170 smaller networks connecting research centers and universities. In 1991, the NSFNET backbone was upgraded to T3 lines sending data at 45 Mbit/s. From 1993, NSFNET was privatized, and a new backbone architecture incorporating the private sector was solicited.
The next-generation very-high-performance Backbone Network Service (vBNS) was developed as the successor to the NSFNET. vBNS began operation in April 1995 and was developed with MCI Communications, now a part of Verizon. The new backbone consisted primarily of Optical Carrier (OC) fiber lines, each of which banded several fiber-optic cables together to increase the total capacity of the line. The interconnected Optical Carrier (OCx) lines operated at 155 Mbit/s and higher. These high-speed trunk lines soon multiplied their capabilities, from OC-3 operating at 155 Mbit/s to OC-12 (622 Mbit/s), OC-24 (1,244 Mbit/s), and OC-48 (2,488 Mbit/s). By 2005, OC-48 was surpassed by OC-192 (9,953.28 Mbit/s) and 10 Gigabit Ethernet.
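The OC designations scale linearly from a common base rate, which is why the figures above line up as simple multiples. Below is a minimal Python sketch of that relationship, using the standard SONET base rate of 51.84 Mbit/s for OC-1 (the rounded numbers in the paragraph above are these exact values):

```python
OC1_RATE_MBPS = 51.84  # SONET OC-1 base line rate in Mbit/s

def oc_rate(n: int) -> float:
    """Return the line rate of an OC-n optical carrier in Mbit/s."""
    return n * OC1_RATE_MBPS

# Reproduce the backbone speeds mentioned above.
for n in (3, 12, 24, 48, 192):
    print(f"OC-{n}: {oc_rate(n):,.2f} Mbit/s")
# OC-3: 155.52 / OC-12: 622.08 / OC-24: 1,244.16 / OC-48: 2,488.32 / OC-192: 9,953.28
```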
As part of the NSFNET’s decommissioning in 1995, these backbone links connected to the four national network access points (NAPs) in California, Chicago, New Jersey, and Washington, D.C. The backbone expanded to multiple carriers that coordinated with ISPs to provide high-speed connections for homes and businesses.
At first, consumers used analog dial-up modems over telephone lines, at speeds that increased to 14.4 kilobits per second (Kbit/s) by 1991 and 28.8 Kbit/s in 1994. Soon, the 33.6 Kbit/s modem arrived, which many thought to be the upper limit for phone-line transmissions. But the 56K modem soon became available, and new standards continued to push the speed of data over the telephone system. The 56K modem was invented by Dr. Brent Townshend for an early music streaming service. His design avoided an analog-to-digital conversion that seriously hampered data speeds and allowed content to be switched digitally to the consumer’s terminal device, usually a PC.
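To give a sense of what those speeds meant in practice, here is a rough back-of-the-envelope Python sketch of how long a 1 MB file (roughly one high-quality image) would take to download at each modem generation. The file size and the assumption of zero protocol overhead or compression are mine, for illustration only:

```python
FILE_SIZE_BITS = 1_000_000 * 8  # a 1 MB file (decimal megabyte) in bits

# Nominal dial-up modem speeds in bits per second.
MODEM_GENERATIONS = {"14.4K": 14_400, "28.8K": 28_800, "33.6K": 33_600, "56K": 56_000}

for name, bps in MODEM_GENERATIONS.items():
    minutes = FILE_SIZE_BITS / bps / 60
    print(f"{name} modem: about {minutes:.1f} minutes per megabyte")
# 14.4K: ~9.3 min / 28.8K: ~4.6 min / 33.6K: ~4.0 min / 56K: ~2.4 min
```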
Also during the 1990s, the telcos were conducting tests of a new technology called ADSL (Asymmetric Digital Subscriber Line). It was initially designed to provide video over copper lines to the home; the Baby Bells, in particular, wanted to offer television services to compete with cable television. It was called asymmetric because it could send data downstream to the subscriber faster (256 Kbit/s-9 Mbit/s) than upstream (64 Kbit/s-1.54 Mbit/s) to the provider.
ADSL was able to utilize electromagnetic frequencies that telephone wires carry but don’t use for voice. ADSL services separated the telephone signals into three bands of frequencies: one for telephone calls and the other two for uploading and downloading Internet activities. Different versions and speeds emerged based on the local telco’s ability and willingness to get an optical fiber link close to the neighborhood, or “to the curb,” next to a household or business location.
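The three-band split can be sketched as a small data structure. The frequency ranges below are approximate figures from the common ADSL-over-POTS standard (ITU G.992.1); they are included as an illustration, not as values given in the original post:

```python
# Approximate ADSL frequency allocation on a copper phone line, in kHz
# (figures from the common ITU G.992.1 ADSL-over-POTS standard).
ADSL_BANDS_KHZ = {
    "voice":      (0, 4),       # ordinary telephone calls
    "upstream":   (26, 138),    # subscriber to provider (uploads)
    "downstream": (138, 1104),  # provider to subscriber (downloads)
}

for band, (low, high) in ADSL_BANDS_KHZ.items():
    print(f"{band:>10}: {low}-{high} kHz ({high - low} kHz of bandwidth)")
```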
These services were soon called Digital Subscriber Lines (DSL), and they began to replace dial-up modems. High demand, and competition from cable companies with high-speed coaxial lines, pressured ISPs and telcos to adopt DSL technologies. DSL and the new cable technologies that carried Internet traffic as well as television came to be collectively called “broadband” communications.
Internet traffic grew at a fantastic rate during the late 1990s as individuals and corporations rushed to “get on the web.” The rhetoric of the “new economy” circulated and fueled investments in web-based companies and telecommunications providers.
A temporary investment bubble emerged as many companies lacked the technology or business expertise to obtain profits. Dot.coms such as Drkoop.com, eToys.com, Flooz.com, GeoCities, Go.com, Kozmo.com, Pets.com, theGlobe.com, and Webvan.com failed for a variety of reasons, but mainly because of flawed business plans and the premature expenditure of investment capital.
Similarly, many carriers such as Global Crossing, WorldCom, and ISPs overestimated web traffic and built excess capacity. In the wake of the dot.com crash in 2000 and the telecom crash in 2002, many ISPs filed for bankruptcy, including Wall Street darlings like Covad, Excite@home, NorthPoint, PSINet, Rhythms NetConnections, and Winstar Communications.
The broadband industry changed significantly after the 2000 election, its technological infrastructure devastated by those two crashes.
Furthermore, Internet policy changed after the Bush administration was reelected in 2004. The FCC revoked Computer II in 2005 when it redefined carrier-based broadband as an information service.
This meant that broadband was effectively deregulated and that telcos could go on to compete with the ISPs. Instead of offering backbone services and being required to interconnect with the ISPs, the telcos became ISPs themselves and were no longer required to provide independent ISPs their connection to the Internet. The competitive environment that had nurtured Internet growth was effectively decimated.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
US Internet Policy, Part 1: The Rise of ISPs
Posted on | March 15, 2020 | No Comments
Much of the early success of the Internet in the USA can be attributed to the emergence of a unique organizational form, the Internet Service Provider or “ISP,” which became the dominant provider of Internet and broadband services in the 1990s. These organizations resulted from a unique set of regulatory directives that pushed the Internet’s development and created a competitive environment that encouraged the proliferation of ISPs and the spread of the World Wide Web.
In this series on Internet policy, I look at the rise of the World Wide Web, the shift to broadband, deregulation, and the consolidation of broadband services. Later in the series, I address the issue of net neutrality and raise the question: “Is Internet service a utility? Is it an essential service that should be made universally available to all Americans at regulated prices?”
The first ISPs began as US government-funded entities that served the research and education communities of the early Internet. Secured by Al Gore in 1991, legislation signed by President George H. W. Bush created the model of the National Research and Education Network (NREN), a government-sponsored Internet service provider dedicated to supporting the needs of the research and education communities within the US. Internet2, Merit, NYSERNET, OARnet, and KanRen were a few of the systems that provided schools and other non-profit organizations access to the World Wide Web. While dial-up services like CompuServe existed in the early 1980s, only later were the ISPs released for commercial traffic and services.
While telecommunications carriers had been moving some Internet traffic since the late 1980s, their role expanded dramatically after the Internet began to allow commercial activities. In June of 1992, Congressman Rick Boucher (D-VA) introduced an amendment to the National Science Foundation Act of 1950 that allowed commercial activities on the US National Science Foundation Network (NSFNET). “A few months later, while waiting for Arkansas Governor William Jefferson Clinton to take over the Presidency, outgoing President George Bush, Sr. signed the Act into law.” The amendment allowed advertising and sales activities on the NSFNET and marked the advent of online commercial activities.
As part of the National Information Infrastructure (NII) plan, the US government decommissioned the US National Science Foundation Network (NSFNET) in 1995. It had been the publicly financed backbone for most IP traffic in the US. The NII handed over interconnection to four Network Access Points (NAPs) in different parts of the country to create a bridge to the modern Internet of many private-sector competitors.
These NAPs contracted with the big commercial carriers of the time, such as Ameritech, Pacific Bell, and Sprint, for new facilities to form a network-of-networks anchored around Internet Exchange Points (IXPs). The former regional Bell companies were to be primarily wholesalers, interconnecting with ISPs. This relatively easy process of connecting routers put the “inter” in the Internet, but the interconnection points also became sites of performance degradation and unequal power relations.
As the Internet took off in the late 1990s, thousands of new ISPs set up business to commercialize the Internet. The major markets for ISPs were: 1) access services, 2) wholesale IP services, and 3) value-added services offered to individuals and corporations. Access services were provided for both individual and corporate accounts and involved connecting them to the Internet via dial-up, ISDN, T-1, frame-relay or other network connections. Wholesale IP services were primarily offered by facilities-based providers like MCI, Sprint, and WorldCom UUNET (a spinoff of a DOD-funded seismic research facility) and involved providing leased capacity over its backbone networks. Value-added services included web-hosting, e-commerce, and networked resident security services. By the end of 1997, over 4,900 ISPs existed in North America, although most of them had fewer than 3,000 subscribers.[2] See the below video and this response for how much things have changed.
FCC policy had allowed unlimited local phone calling for enhanced computer services, and early Internet users connected to their local ISPs using modems over POTS (Plain Old Telephone Service). ISPs quickly developed software, distributed on CD-ROMs, that could be easily installed on a personal computer. The software usually put an icon on the desktop that, when clicked, would dial the ISP automatically, provide the password, and connect the user to the Internet. A company called Netscape created a popular “browser” that allowed text and images to be displayed on the screen. The browser used what was called the World Wide Web, a system for accessing files quickly from computer servers all over the globe.
The ISPs emerged as an important component of the Internet’s accessibility and were greatly aided by US government policy. The distinctions made in the FCC’s Second Computer Inquiry in 1981 allowed ISPs to bypass many of the regulatory roadblocks experienced by traditional communication carriers. These distinctions opened up possibilities and created protections for computer communications: telcos were to provide regulated basic services, while “enhanced services” were to stay unregulated. Dan Schiller explained:
Under federal regulation, U.S. ISPs had been classed as providers of enhanced service. This designation conferred on ISPs a characteristically privileged status within the liberalized zone of network development. It exempted them from the interconnection, or access, charges levied on other systems that tie in with local telephone networks; it also meant that ISPs did not have to pay into the government’s universal service fund, which provided subsidies to support telephone access in low-income and rural areas. As a result of this sustained federal policy, ISPs enjoyed a substantial cross-subsidy, which was borne by ordinary voice users of the local telecommunications network.[3]
ISPs looked to equip themselves for potential new markets and to connect with other companies. For example, IBM and the telecom provider Qwest teamed up to offer web-hosting services. PSINet bought Metamor not only to transfer data but to host, design, and move companies from old software environments to the new digital environment. ISPs increasingly saw themselves not only as providers of a transparent data pipe but also as providers of value-added services such as web hosting, colocation, and support for domain name registration.
The next part of this series discusses the shift to higher-speed broadband capabilities. Later posts address the consolidation of the industry that began in 2005, when the FCC changed the regulatory regime for wireline broadband services.
Notes
[1] Hundt, R. (2000) You Say You Want a Revolution? A Story of Information Age Politics. Yale University Press. p. 25.
[2] McCarthy, B. (1999) “Introduction to the Directory of Internet Service Providers,” Boardwatch Magazine’s Directory of Internet Service Providers. Winter 1998-Spring 1999. p. 4.
[3] Schiller, D. (1999) Digital Capitalism. The MIT Press. p. 31.
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor and Undergraduate Director at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Congressman Rick Boucher > enhanced services > FCC > ISDN > ISPs > Net Neutrality > Second Computer Inquiry > Section 706 > Telecommunications Act of 1996 > Worldcom
Five Stages of ICT for Global Development
Posted on | February 19, 2020 | No Comments
Summarized remarks from “Five Stages of Global ICT4D: Governance, Network Transformation, and the Increasing Utilization of Surplus Behavioral Data for Prediction Products and Services,” presented at the 27th AMIC Annual Conference on 17-19 June 2019 at Chulalongkorn University, Bangkok, Thailand.
This presentation will explore and outline the following stages of economic and social development utilizing information and communications technologies (ICT). The ICT acronym has emerged as a popular moniker, especially in international usage, for the digital technology revolution and is often combined with “development” to form ICT4D. Development is a contested term with currency in several areas. Still, in global political economy, it refers to the process of building environments and infrastructure needed to improve the quality of human life and bridge equity divides. Often this means enhancing a nation’s agriculture, education, health, and other public goods that are not strictly economy-related but improve well-being and intellectual capital.
Of particular interest is the transformation of public-switched networks for Internet Protocol (IP)-based services and then WiFi and mobile use. Data-intensive solutions are beginning to address many development issues. However, a growing concern is that data is being collected extensively and used intrusively to manipulate behaviors.
The stages are categorized as:
1) Containment/Development/Modernization; 2) New World Economic and Information Orders; 3) Structural Adjustment and Re-subordination; 4) Global ICT Integration; and 5) Smart/Sustainable Mobile and Data-Driven Development.
Using a techno-structural approach, the explication of these stages will provide historical context for understanding trends in ICT innovation and implementation. This approach recognizes the reciprocal effects between technological developments and institutional power, and it employs a “dual-effects hypothesis” to illustrate the parallel potentials of ICT4D as both a democratizing and a totalizing force. This research will also provide insights into the possibilities of ICT diffusion in developing environments.
1) Containment/Development/Modernization
The US emerged as the primary economic and military power in the aftermath of World War II. Arms and materiel sales before the war had brought much of the world’s gold to the US, where it was moved inland to Fort Knox, Kentucky. Franklin Delano Roosevelt (FDR) had sought to limit finance in the aftermath of the “Crash of 1929” and extended that effort globally with the 1944 conference at Bretton Woods, New Hampshire. “Bretton Woods” created the International Monetary Fund (IMF), the World Bank, and the International Trade Organization (rejected by Congress), and also instituted a dollar-gold standard that tied the US dollar to gold at $35 an ounce and other international currencies to the US dollar at set rates. This system was designed to “contain” financial speculation and encourage trade and development.
The other aspect of containment, more widely known, is the containment of Communism. The painful success of the USSR in World War II in countering the Nazis on the eastern front, and its appropriation of German atomic and rocketry technology, presented an ideological and military threat to the US and its allies. The USSR’s launch of the Sputnik satellites in 1957 resulted in the US formation of NASA and ARPA. The Communist revolution in China in 1949 and China’s explosion of an atomic bomb on Oct. 16, 1964, spurred additional concern. The resultant Cold War and Space Race spurred technological development and competition for “developing countries” worldwide.
“Development” and “modernization” characterized the post-World War II US prescription for economic development around the world, and especially in newly decolonized nation-states. Modernization referred to a transitional process of moving from “traditional” or “primitive” communities to modern societies based on scientific rationality, abstract thought, and the belief in progress. It included urbanization, economic growth, and a “psychic mobility” that could be influenced by various types of media. Scholars talked of an eventual “takeoff” if the proper regimen was followed, particularly the adoption of new agricultural techniques termed the “Green Revolution.”[1] Information and Communications Technologies (ICTs) were rarely stressed, but “five communication revolutions” (print, film, radio, television, and later, satellites) were beginning to be recognized as making a contribution to national development.
Communication technologies were beginning to spread information about modern practices in agriculture, health, education, and national governance. Some early computerization projects continued population analysis, such as census work that had begun with tabulation machines, while mainframes and minicomputers were increasingly utilized by government agencies to gather statistics for development planning.
Telegraphy and telephones were strangely absent from much of the discussion but were important for government activities as well as large-scale plantations, mining operations, transportation coordination, and maritime shipping. Because of their large capital requirements and geographic expanse, countries uniformly developed state-controlled Post, Telephone, and Telegraph (PTT) entities. Organized with the help and guidance of the International Telecommunication Union (ITU), the oldest United Nations entity, PTTs struggled to provide basic voice and telegraphic services. However, they provided needed jobs, technical resources, and currency for the national treasury.
Wilbur Schramm’s (1964) book Mass Media and National Development made crucial links between media and national development. Published by Stanford University Press and UNESCO, it examined the role of newspapers, radio, and television. Its emphasis on the role of information in development also laid the foundation for the analysis of computerization and ICT in the development process. I had an office next to Schramm for many years at the East-West Center’s Communication Institute that he founded, while I worked on the National Computerization Policy project that resulted in the ICT4D benchmark study Computerization and Development in Southeast Asia (1987). Herbert Dordick, Meheroo Jussawalla, Deane Neubauer, and Syed Rahim were key scholars in the early years of ICT4D at the East-West Center.[1a]
2) New World Economic and Information Orders
Rising frustrations and concerns about neo-colonialism due to the power of transnational corporations (TNCs), especially news companies, resulted in a collective call by developing countries for various conceptions of a “New World Economic and Communication Order.” It was echoed by UNESCO in the wake of the OPEC oil shocks and the resulting Third World debt crisis. The issue was primarily the imbalanced flow of news and information from the developed North to the developing South. In part, it was the preponderance of news dealing with disasters, coups, and other calamities that many countries felt restricted flows of foreign investment. The calls caused a backlash in the US and other developed countries concerned about the independence of journalism and the free flow of trade.[2]
It was followed by concerns about obstacles hindering communications infrastructure development and how telecommunications access across the world could be stimulated. In 1983, designated World Communication Year, the Independent Commission for Worldwide Telecommunications Development met several times to discuss the importance of communication infrastructure for social and economic development and to make recommendations for spurring its growth.
The Commission consisted of seventeen members – communication elites from both private and public sectors, representing a number of countries. Spurred on by growing optimism about the development potential of telecommunications, they investigated ways Third World countries could be supported in this area. They published their recommendations in The Missing Link (1984), soon to be called the “Maitland Report” after its chair, Sir Donald Maitland of the United Kingdom. This report brought recognition to the role of telecommunications in development and opened up resources from international organizations such as the World Bank.
The transition from telegraph and telex machines to computers also raised concerns about data transcending national boundaries. The Intergovernmental Bureau for Informatics (IBI), which had been set up as the International Computation Centre (ICC) in 1951 to help countries gain access to major computers, began to study national computerization policy issues in the mid-1970s.
It increasingly focused on transborder data flows (TDF) that moved sensitive corporate, government, and personal information across national boundaries. The first International Conference on Transborder Data Flows was organized in September 1980, followed by a second in 1984; both were held in Rome, Italy. The increasing use of computers raised questions about accounting and economic data escaping political and tax scrutiny. The concern was that these data movements could act like a “Trojan horse,” compromising a country’s credit ratings and national sovereignty, as well as individual privacy.
3) Structural Adjustment and Re-subordination
Instead of the new orders called for by developing countries, what emerged was an era of “structural adjustment” enforced by the International Monetary Fund that targeted national post, telephone, and telegraph (PTT) agencies and other aspects of government administration and ownership. Long considered agents of national development and employment, PTTs came under increasing criticism for their antiquated technologies and lack of customer service.
In the early 1970s, Nixon ended the Bretton Woods regulation of the dollar-gold standard, resulting in very volatile currency markets. Oil prices increased, and dollars flowed into OPEC countries, only to be lent out to cash-poor developing countries. The flow of petrodollar lending and rising “third world debt” pressured PTTs to add new value-added data networks and undergo satellite deregulation. Global circuits of digital money and news emerged, such as Reuters Money Monitor Rates and SWIFT (Society for Worldwide Interbank Financial Telecommunication). These networks, among the first to use packet-switching, linked currency exchange markets worldwide in arguably the first virtual market.
A new techno-economic imperative emerged that changed the relationship between government agencies and global capital. PC spreadsheet technologies were utilized to inventory, value, and privatize PTTs so they could be corporatized and listed on electronically linked share-market exchanges. Communications markets were liberalized to allow domestic and international competition for new telecommunications services and for sales of digital switches and fiber-optic networks. Developing countries became “emerging markets,” consistently disciplined by the “Washington Consensus,” a set of policy prescriptions designed to keep them open to transborder data flows and international trade.[3]
4) Global ICT Integration
Packet-switching technologies, standardized into the ITU’s X.25 and X.75 protocols for PTT data networks, gave way to ubiquitous TCP/IP networks by the late 1990s. Cisco Systems became the principal enabler with a series of multi-protocol routers designed for enterprises, governments, and eventually telcos. Lucent, Northern Telecom, and other telecommunications equipment suppliers quickly lost market share as the Internet protocols, mandated by the US military’s ARPANET and later by the National Science Foundation’s NSFNET, were carried over ISDN, ATM, and SONET infrastructure in telcos around the world.
The Global Information Infrastructure (GII), introduced at the annual ITU meeting in Buenos Aires in March 1994 by Vice President Gore, targeted national PTT monopolies and government regulatory agencies. He proposed a new model of global telecommunications based on competition instead of monopoly. He stressed the rule of law and the interconnection of new networks to existing networks at fair prices. Gore followed up the next month in Marrakesh, Morocco, at the closing meeting of the Uruguay Round of the GATT (General Agreement on Tariffs and Trade) negotiations, which called for the inclusion of the GATS (General Agreement on Trade in Services) covering everything from circuses to education, radio and television, and telecommunications services. At this meeting, participants also called for the creation of the World Trade Organization (WTO).
Formed in 1995, the WTO held two meetings in 1996 and 1997 that created a new era of global communications and development. Members party to the new multilateral arrangement met quickly in Singapore in 1996 to reduce tariffs on the international sales of a wide variety of information technologies. The Information Technology Agreement (ITA) was signed by 29 participants in December 1996 and expanded at the Nairobi Ministerial Conference in December 2015 to cover an additional 201 products valued at over $1.3 trillion per year. These agreements allowed Korea to successfully market early CDMA mobile handsets and develop a trajectory of success in the smartphone market.
In 1997, the WTO met in Geneva and established rules for the continued privatization of national telecommunications operations. Sixty-nine WTO member nations, including the U.S., signed the Agreement on Basic Telecommunications Services in 1997, which codified new rules for telecommunications deregulation under which countries agreed to privatize and open their telecommunications infrastructures to foreign investment and competition from other telcos.
The agreements came at a crucial technological time. The World Wide Web (WWW) was a working technology, but it would not have lived up to its name if the WTO had not negotiated reduced tariffs on crucial networking and computer equipment. The resultant liberalization of data and mobile services around the world made possible a new stage in global development.
Hypertext, Ad Markets, and Search Engines
The online economy emerged with the Internet and its hypertext click environment. Starting with advertising and the keyword search and auction system, a new means of economic production and political participation emerged, based on the wide-scale collection and rendition of surplus behavioral data for prediction products and services.
As Shoshana Zuboff points out in Surveillance Capitalism (2019), the economy expands by finding new things to commodify, and the Internet provided a multitude of new products and services that could be sold. When the Internet was privatized in the early 1990s and the World Wide Web (WWW) established the protocols for hypertext and webpages, new virtual worlds of online media spaces were enabled. These were called “inventory,” or, more plainly, ad space.
Behavioral data is the information produced as a result of actions that can be measured on a range of devices connected to the Internet, such as a PC, tablet, or smartphone. Behavioral data tracks the sites visited, the apps downloaded, or the games played. Cloud platforms claim human experience as free raw material for translation into behavioral data. Some of this data is applied to product or service improvements; the rest is declared proprietary behavioral surplus and fed into advanced manufacturing processes known as “machine intelligence.” Automated machine processes can capture knowledge about behaviors but also shape behaviors.
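To make this pipeline concrete, here is a minimal sketch in Python of how raw device events might be aggregated into a feature vector that a prediction model could consume. The event types, field names, and aggregation choices are hypothetical illustrations, not any platform’s actual schema:

```python
# Minimal sketch only: hypothetical event records and feature names,
# not any actual platform's data pipeline.
from collections import Counter

# Hypothetical behavioral events captured from a device (sites, apps, games).
events = [
    {"type": "site_visit", "item": "news.example.com"},
    {"type": "app_open", "item": "maps"},
    {"type": "site_visit", "item": "shop.example.com"},
    {"type": "game_session", "item": "puzzle", "minutes": 34},
]

def to_features(events):
    """Aggregate raw events into a feature vector for a prediction model."""
    counts = Counter(e["type"] for e in events)
    return {
        "site_visits": counts["site_visit"],  # browsing signals
        "app_opens": counts["app_open"],      # app-usage signals
        "game_minutes": sum(e.get("minutes", 0) for e in events
                            if e["type"] == "game_session"),
    }

print(to_features(events))
# {'site_visits': 2, 'app_opens': 1, 'game_minutes': 34}
```

The point of the sketch is that the surplus is created in the aggregation step: the user supplies the raw events, while the derived features become proprietary inputs to prediction.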
Surplus behavioral and instrumental data is turned into prediction products, such as recommendation engines for e-commerce and entertainment, that anticipate what people will do now, soon, and later. Prediction products are traded in a new kind of marketplace for behavioral predictions called behavioral futures markets. These are currently used primarily in advertising systems based on click-through rates (CTR), Pay-Per-Click (PPC) pricing, and real-time bidding auctions.
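To illustrate the mechanics of such a system, here is a simplified, hypothetical sketch of an expected-value ad auction in Python. It ranks bids by bid × predicted CTR and charges a second-price-style fee; the advertisers and numbers are invented, and real exchanges add quality scores, price floors, and fees:

```python
# Simplified, hypothetical sketch of a real-time bidding auction that ranks
# ads by expected value (bid x predicted click-through rate).
bids = [
    {"advertiser": "A", "bid": 2.00, "pctr": 0.010},  # expected value 0.020
    {"advertiser": "B", "bid": 1.50, "pctr": 0.020},  # expected value 0.030
    {"advertiser": "C", "bid": 3.00, "pctr": 0.005},  # expected value 0.015
]

ranked = sorted(bids, key=lambda b: b["bid"] * b["pctr"], reverse=True)
winner, runner_up = ranked[0], ranked[1]

# In a second-price scheme, the winner pays just enough to beat the runner-up:
# price * winner_pctr = runner_up_bid * runner_up_pctr
price = runner_up["bid"] * runner_up["pctr"] / winner["pctr"]
print(f"Winner: {winner['advertiser']}, pays ${price:.2f} per click")
# Winner: B, pays $1.00 per click
```

Ranking by expected value rather than raw bid is what ties the auction to behavioral prediction: the better the platform predicts clicks, the more its inventory is worth.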
5) Smart/Sustainable Mobile and Data-Driven Development
The aggressive trade negotiations and agreements of the 1990s significantly reduced the costs of ICT devices and communication exchanges worldwide, making possible a wide variety of new commercial and development activities based on ICT capabilities. We are at the halfway point for the Sustainable Development Goals (SDGs) outlined by the United Nations in 2015. The SDGs provide an additional impetus for ICT4D, as they encourage infrastructure building and support for key development activities that ICT can assist, such as monitoring Earth and sea resources and providing affordable health information and communication.
A key variable is the value of the US dollar, the world’s primary transacting currency. A global shortage of dollars due to high interest rates or political risk means higher prices for imported goods, regardless of lower tariffs. The post-Covid crisis in Ukraine has stressed supply chains for key materials and rare earth minerals from Russia and Ukraine, further adding to potential costs and geopolitical risk. ICT4D is highly reliant on global supply chains making digital devices readily available at reasonable prices.
The near-zero marginal costs of digital products make information content and services more accessible for developing countries. Books, MOOCs, and other online services provide value to a vast population with minimal costs to reach each additional person. Platform-based services providing agricultural, health, and other development services offer low-cost accessibility and outreach, allowing new applications to scale significantly at low cost. Incidentally and significantly, renewable energy sources like solar and wind also provide near-zero marginal costs for producing electricity. Like digital products, they require high initial investments but produce output at low cost once operational.
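A quick back-of-the-envelope calculation shows why these cost structures matter for development at scale. The fixed and marginal cost figures below are illustrative assumptions only:

```python
# Illustrative near-zero marginal cost arithmetic; all figures are assumed.
fixed_cost = 1_000_000.0   # e.g., producing a MOOC or platform (assumed)
marginal_cost = 0.01       # cost to serve one additional user (assumed)

for users in (1_000, 100_000, 10_000_000):
    avg_cost = fixed_cost / users + marginal_cost
    print(f"{users:>10,} users -> average cost per user ${avg_cost:,.2f}")

#      1,000 users -> average cost per user $1,000.01
#    100,000 users -> average cost per user $10.01
# 10,000,000 users -> average cost per user $0.11
```

The average cost per user collapses toward the marginal cost as adoption grows, which is why digital services (and, analogously, solar and wind generation) reward wide diffusion.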
Mobility, broadband, and cloud services are three significant technologies presenting positive prospects for ICT4D. Mobile broadband technologies that bypass traditional wireline “last mile” infrastructure have been a major boost to the prospects for ICT4D. They provide significant connectivity across a wide range of the population and with key commercial and government entities. 4G LTE technologies currently provide the most practical service, as 5G base stations can consume over 60% more power than LTE equivalents and also require more sites because their range is shorter.
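The range trade-off can be sketched with simple geometry: the number of cells needed to cover a given area grows with the inverse square of the cell radius. The radii below are illustrative assumptions, and the 60% power figure is the one cited above:

```python
# Rough sketch: station counts scale with 1/r^2 for a fixed coverage area.
# Radii and the 60% power premium are assumptions for illustration.
import math

area_km2 = 1_000.0      # service area to cover (assumed)
lte_range_km = 5.0      # illustrative LTE cell radius (assumed)
g5_range_km = 2.0       # illustrative 5G cell radius (assumed)

def stations_needed(radius_km):
    return math.ceil(area_km2 / (math.pi * radius_km ** 2))

lte_stations = stations_needed(lte_range_km)
g5_stations = stations_needed(g5_range_km)

# If each 5G station draws ~60% more power than an LTE station:
relative_power = (g5_stations * 1.6) / lte_stations
print(f"LTE: {lte_stations} stations, 5G: {g5_stations} stations")
print(f"5G network power is roughly {relative_power:.1f}x the LTE network")
```

Under these assumptions, the power penalty compounds: more stations, each drawing more power, which matters most where electricity is expensive or unreliable.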
Enhanced connectivity strengthens network effects. Blockchain technologies and cryptocurrencies, the Internet of Things (IoT), and the proliferation of web platforms are some of the current conceptions of how reduced costs for communications and information analysis are enhancing network effects and creating value from the collection and processing of unstructured data.
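One common, if rough, formalization of network effects is Metcalfe’s law (a standard formulation, not one named in the text above): the number of possible pairwise connections among n users is n(n-1)/2, so potential value grows roughly with the square of adoption while per-user cost stays flat. A minimal illustration:

```python
# Metcalfe-style network effect: possible pairwise links among n users.
def possible_links(n_users: int) -> int:
    return n_users * (n_users - 1) // 2

for n in (10, 100, 10_000):
    print(f"{n:>6} users -> {possible_links(n):,} possible connections")

#     10 users -> 45 possible connections
#    100 users -> 4,950 possible connections
# 10,000 users -> 49,995,000 possible connections
```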
This project will expand on these stages and provide a context for a further investigation of ICT for development drawing on historical and current research. Of particular concern is the implementation of policies and practices related to contemporary development practices, but commercial and monetization techniques are important as well.
Notes
[1a] Dordick, Herbert S. and Deane Neubauer. (1985) “Information as Currency: Organizational Restructuring Under the Impact of the Information Revolution.” Bulletin of the Institute for Communications Research, Keio University, No. 25, 12–13. This article was particularly insightful on the dynamics that would later pressure PTTs to adopt IP technologies, leading to the World Wide Web.
[1] Rostow, W.W. (1960) Stages of Economic Growth: A Non-Communist Manifesto. Cambridge: Cambridge University Press. See also Rostow, W.W. (1965) Stages of Political Development. Cambridge: Cambridge University Press.
[2] An excellent discussion of the various development and new world communication and economic order discourses can be found in Majid Tehranian’s (1999) Global Communication and World Politics: Domination, Development, and Discourse. Boulder, CO: Lynne Rienner Publishers. pp. 40-41. Also, see Jussawalla, M. (1981) Bridging Global Barriers: Two New International Orders. Papers of the East-West Communications Institute. Honolulu, Hawaii.
[3] Wriston, W. B. (1992) The Twilight of Sovereignty: How the Information Revolution Is Transforming Our World. New York: Scribner.
Citation APA (7th Edition)
Pennings, A. J. (2020, February 19). Five Stages of ICT for Global Development. apennings.com. https://apennings.com/how-it-came-to-rule-the-world/planting-to-platforms-five-stages-of-ict-for-global-development/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD, is a Professor in the Department of Technology and Society, State University of New York, Korea. From 2002-2012 he was on the faculty of New York University. Previously, he taught at Hannam University in South Korea, Marist College in New York, and Victoria University in Wellington, New Zealand. His American home is in Austin, Texas, where he taught in the Digital Media MBA program at St. Edwards University. He joyfully spent 9 years at the East-West Center in Honolulu, Hawaii.
Tags: Computerization and Development in Southeast Asia > five communication revolutions > ICT > ICT4D > Maitland Commission > Wilbur Schramm