Digital Disruption in the Film Industry – Gains and Losses – Part 1: The Camera
Posted on June 9, 2020
Keanu Reeves produced a thoughtful and timely documentary on the move from photochemical to digital film. In Side by Side (2012), he interviews some of the best directors and directors of photography (DPs) about the transformation of the filmmaking process from celluloid to silicon. The transition, which has taken decades, is worth examining through the lens of Clay Christensen’s theory of disruptive innovation. The theory examines how a technology can start out “under the radar” as an inferior, cheaper version that is continuously improved until it disrupts a major industry.
The documentary looks at the development of digital cameras, editing, and special effects processes, as well as distribution and projection systems. For each category, it examines the differences between film and digital video. In interviews with prominent actors, directors, and editors, the pros and cons of each are discussed, along with the clear arc of cinema’s movement toward the increased use of digital processes.
In this post, I introduce some of the issues in the move to digital cameras within the context of disruptive innovation theory. The theory is instrumental in understanding how a technology like digital video, which was obviously inferior for so long, could emerge to seriously challenge the beauty and institutional momentum of celluloid cinema.
Film emerged as a working technology during the 1880s and 1890s, with significant improvements in cameras and projection technologies made primarily by Thomas Edison and the French Lumière brothers. Innovations continued over the next 100 years, but the invention of the charge-coupled device (CCD) at Bell Labs in 1969 marked the beginning of a disruptive trend: digital cameras would slowly improve until they became a major competitor to film cameras.
The CCD was initially developed for spy satellites during the Cold War but was later integrated into consumer and commercial video products. Steven Sasson at Kodak used it to build the first digital camera in 1975. The technology was very crude, however, and Kodak did not see it as a worthy replacement for its film-based cameras. Neither did the film industry.
It was the Japanese electronics company Sony that developed the camcorder in the 1980s based on CCD technology and continued development into the generation of 4K cameras. Comparable resolution was achieved by the Red One camera that rocked the film industry in 2007; its maker, Red Digital Cinema, announced the Red Dragon 6K sensor at NAB 2013.
CCDs were largely replaced in digital cameras by complementary metal-oxide semiconductor (CMOS) sensors, a spinoff from NASA’s Jet Propulsion Laboratory (JPL). CMOS used less power and allowed cameras to be made smaller. Embedded in mobile phones, it made an enormous impact on social media, bringing awareness to the uprisings of the Arab Spring and other crises around the world.
“Digital cinematography” emerged by 2002 with George Lucas’s Star Wars: Episode II – Attack of the Clones, which was shot entirely in a high-definition digital format. Although the lack of digital projection systems meant the footage had to be transferred to film to play in theaters, the movie still caused major controversy as the industry debated digital’s pros and cons. While these cameras were initially and clearly inferior to their film counterparts, Sony and other companies kept improving them to the point where early adopters like Lucas were willing to take a chance.
By committing first to consumer markets, digital cameras found the resources to improve continually. In time they offered capabilities that film couldn’t match, such as the ability to store huge amounts of footage on very small devices. This meant no more transporting cans of film – a major consideration when shooting in remote and hazardous locations. Increased storage capacity also meant the ability to shoot for longer stretches of time – often to the actors’ chagrin, but with the benefit of maintaining momentum. A rough comparison of reel and card capacities is sketched below.
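To make the storage point concrete, here is a minimal back-of-envelope sketch in Python. The figures are illustrative assumptions on my part (a 1,000 ft reel of 4-perf 35mm running at 24 fps, and a compressed 4K data rate of roughly 36 MB/s), not numbers from the documentary:

```python
# Back-of-envelope: how much footage fits on a film reel vs. a memory card.
# All figures below are illustrative assumptions, not exact specifications.

FRAMES_PER_FOOT = 16      # assumed: 4-perf 35mm film
FPS = 24                  # assumed: standard sound speed
REEL_FEET = 1000          # assumed: a common camera reel length

reel_minutes = REEL_FEET * FRAMES_PER_FOOT / FPS / 60
print(f"One {REEL_FEET} ft reel holds ~{reel_minutes:.0f} minutes of footage")

DATA_RATE_MB_S = 36       # assumed: compressed 4K raw data rate
CARD_GB = 512             # assumed: a pocket-sized memory card

card_minutes = CARD_GB * 1000 / DATA_RATE_MB_S / 60
print(f"One {CARD_GB} GB card holds ~{card_minutes:.0f} minutes at {DATA_RATE_MB_S} MB/s")
print(f"That is roughly {card_minutes / reel_minutes:.0f} reels' worth in one pocket")
```

Under these assumptions, a reel yields about eleven minutes of footage while a single pocket-sized card holds around four hours – roughly twenty reels – which is the logistical advantage described above.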
Another benefit was being able to watch what was being shot instantaneously on a monitor and to review the shots while still on set. This capability was very popular with directors but shifted power away from the directors of photography.
The last fifteen years of camera development have witnessed the disruption of the entire global complex known as “Hollywood.” New cameras such as the Red Dragon and other digital technologies for editing, special effects, and distribution played havoc with the entire chain of creative processes established in the film world and its production and distribution circuits. Digital convergence has also broken down the barriers between film and television and expanded cinematic presentations to a wide variety of screens, from mobile phones to stadium jumbotrons.
Are we still in the infancy of an entirely new array of digital televisual experiences? The year 2007 marked a critical threshold for the use of digital technologies in cameras, but the innovations continued, including the integration of the digital camera into the smartphone and the wide-scale adoption of digital high definition. Rather than lidar, it is the digital camera that Tesla and other EV makers have adopted to enhance automation and safety.
Film continues to be championed by directors like Christopher Nolan, who used it in Interstellar (2014) and Dunkirk (2017). Quentin Tarantino is also known for his dedication to shooting on film, including Once Upon a Time in Hollywood (2019).
In the next post, we look at the disruptive influence of digital editing, followed by a look at non-linear editing (NLE), which was also made possible by the digital revolution.
Citation APA (7th Edition)
Pennings, A. J. (2020, June 9). Digital disruption in the film industry – gains and losses – part 1: The camera. apennings.com. https://apennings.com/multimedia/digital-disruption-in-the-film-industry-gains-and-losses-part-1-the-camera/
© ALL RIGHTS RESERVED
Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. He wrote this during a brief stay at the Digital Media Management program at St. Edward’s University in Austin, Texas. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edward’s University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.
Tags: Bell Labs > CCD > charge-coupled device > Christensen > Clay Christensen > Digital cinematography > Digitalistic > disruptive innovation theory > French Lumiere brothers > Red Dragon 6K Sensor > Sony Camcorder