Provisionally, we can define cinema as the on-screen (and even large-screen) presentation of moving images that have been pre-recorded photo-chemically on some support (most often, strips of celluloid).
Among the practices of modern, mass-disseminated visual culture, the cinema was perhaps the pre-eminent form for at least the first half of the twentieth century (after which it confronted challenges for audiences’ attention from television), and it certainly continues to exert great influence on both leisure activities and the visual education of publics worldwide into the twenty-first century. As film theorists of the 1970s were fond of declaring, the cinema combines a technical apparatus – the technology for capturing images and for presenting them to viewers – with a mental apparatus – the psychological dispositions that encourage spectators to invest cinematic images with great affective power. That so many ordinary citizens report that their first response to the media images of 9/11 was to be reminded of Hollywood big-explosion disaster films (and perhaps even to wonder whether they weren’t seeing “just a movie”) is a sobering reminder of the cinema’s ongoing emotional power, its ability to insinuate its imaginary visions into our very mental make-up.
Historical Challenges To Absolute Definitions Of Cinema
The cinema emerged out of the rich visual culture of nineteenth-century western society as an art that gave motion to the still images of photography. While the earliest films were single-shot renditions of everyday reality and delighted spectators simply by projecting their ordinary world to them, very quickly creative filmmakers used editing and staged scenes to construct narrative fictions that grew in complexity. The art of film came in the public imagination to be identified with entertainment stories geared for mass consumption around the globe.
Paradoxically, though, while its very popularity makes it seem easy to say what the cinema is, most definitions of the form capture only some part of the cinematic experience as it has evolved over time and been disseminated across the globe. To return to the provisional definition offered above: the actual history of cinema suggests that virtually every term in such a “definition” is at best a generalization from practices particular to just some phases of cinema’s complex history. For instance, although it is easy to associate film with movement and thereby distinguish it from more seemingly static visual culture such as photography (and here the shared Greek root of “cinema” and “kinetic” can encourage such association), some filmmakers play with stasis in ways that directly challenge cinema’s ostensible kinesis. The motion of the film strip through the projector does not necessarily imply motion in the images presented on the screen. An extreme example is Andy Warhol’s 1964 film Empire, which lasts close to eight hours and offers a single fixed shot of New York’s Empire State Building. While there is some “action” in the film (for example, a blinking light on the building comes on as night falls), it is minimal, and Warhol slowed the experience down even further by shooting at a faster frame rate than the projection speed (which generates more footage for a given stretch of real time and therefore extends projection time).
Warhol has been termed an experimental or avant-garde filmmaker, and it has been one of the goals of experimenters in the arts to challenge any attempt to fix artistic forms within rigid definitions. Where narrative diversion has been a, if not the, primary historical form of cinema, avant-garde artists seek other functions for the art form. Additional challenges to an encompassing definition come from the evolution of technologies for the capture and display of images and from changes in the roles those images take on for their various publics.
For example, it might be tempting to distinguish cinema from, say, television by the supposed intimacy of the latter within the privatized realm of domesticity (television as a medium of the home). But it is clear that from the second half of the twentieth century on, various forms of convergence between screen practices, both large and small, public and private, blurred the boundaries between cinema and other forms of moving-image culture. Is the ritual holiday presentation of The wizard of Oz (1939) on television part of cinema history (the holiday viewings are in fact where the popular veneration of the film really took off), part of television history, or both? Is the viewing of a feature film on a computer screen or even, as is increasingly possible, on a mobile phone still part of the cinematic experience we traditionally assume involves public movie-going? Conversely, is the viewing of a loud, explosion-filled Hollywood blockbuster on a television screen at home not to be called movie viewing, especially if the television is a large-screen “home theater” (which may be hooked up to a domestic version of multitrack Dolby and THX sound; see Digitization and Media Convergence)?
In today’s movie business, the public projection of a film in theaters is often only the first stage in a process in which the individual film moves across multiple media platforms – from theater to DVD, to subscription cable television and cable video-on-demand, to computer screen, to large-screen home projection, to hand-held device, and on and on. If we are tempted to think stereotypically of cinema as an independent cultural form centered on the experience of seeing films in a theater, such a singular image of cinema seems increasingly partial and inadequate. Most media experts predict, moreover, that digital filmmaking and digital projection will eventually supplant most production and exhibition on the physical format of film stock, and this rendering of film images as virtual data will no doubt further the convergence of cinema with other forms of digital culture such as television shows and video games.
From The Historical Variability Of The Cinematic Experience To Hollywood Hegemony
It is likely, then, that an understanding of cinema requires less an ontology (that is, a definition of the very being, the very nature, of the cultural form) than a history or a sociology of the various forms of cinema that have been and will be. Indeed, if we associate “cinema” with the experience of public, theatrical projection of moving images, that is because such experience historically became the socially dominant form for the art. Revealingly, many of the very first films were viewed not in public, theatrical contexts but on a person-by-person basis: in particular, the Edison Kinetoscope of the late nineteenth century ran the film strip inside a large box into which single spectators peered one at a time. It was not inevitable that screen images would cease to be viewed in such privatized, individualized fashion, and it was in large part the impact of market conditions – the financial need to bring in more revenue by bringing more people into contact with the cultural form – that led to larger-screen projection viewable by collective audiences rather than by individuals alone.
Clearly, the historically hegemonic form of cinema through the twentieth century was the narrative fiction film, with the US film industry in particular assuming domination of the world market. In the earliest decades, Europe had been at the forefront of production and distribution, but World War I left many European film companies in a shambles, and US firms quickly gained a comparative advantage in the world market for film entertainment. For many audiences, cinema means “the movies,” a term that evokes an entertaining, diverting, and kinetically moving form of leisure culture based on identification with on-screen characters involved in stories, often of romance and action. That identification is fostered by film techniques that render the image seductive (for example, in Hollywood’s so-called Golden Age, soft focus might be used in close-ups of stereotypically beautiful women stars to give them an additionally attractive, ethereal quality) and by the star system itself, which implied that certain people stood out from the crowd and deserved special adulation for their grace and glamour.
The seductiveness of Hollywood fictions centered on vibrant protagonists could be so strong as to cast its spell over cinematic forms that set out precisely to move away from fiction and offer an imputed direct engagement with everyday reality. Thus, many works from documentary cinema adopt compelling narrative form and center on figures who fight the odds to insist on their place in the world: from Robert Flaherty’s portrayal of the resilient Eskimo who keeps at his battle with savage nature in Nanook of the North (1922) to Michael Moore’s documentary box office hits like Fahrenheit 9/11 (2004), the history of the nonfiction film is to a large degree the tale of its complicated encounter with fiction, narrative engagement, and entertainment.
Throughout its history, indeed, the documentary film has had to bear the charge that its seemingly direct presentation of reality is in fact mediated in ways that blur the boundaries of truth and fiction. The infamous manner in which Leni Riefenstahl’s 1935 documentary Triumph of the will set out to propagandize for Adolf Hitler as a seemingly god-like figure of incomparable charisma is only the most explicit example of a constant temptation in documentary to turn instruction into preplanned and prepackaged emotional persuasion. The documentary may set out to educate, but it also frequently intends to seduce, and here the rhetorical devices of Hollywood storytelling are an evident resource: not for nothing was the celebrated wartime propaganda series Why we fight (1942–1945), which set out in rousing fashion to mold ordinary citizens’ support for the Allied war effort, put together by Frank Capra, a Hollywood director well known for films of intense emotionalism and sentimental appeal.
Hollywood’s hegemony over publicly shared image culture throughout the globe has raised concerns among censorship organizations, moral custodians, and even governments over the power of this visual medium. For instance, in the domestic US market, worry over the potentially deleterious effect of audience identification with glamorously portrayed but morally dubious characters led to the imposition of a code of permissible film content that took definitive shape at the beginning of the sound era (the late 1920s and early 1930s) and then remained in force for decades.
At the same historical moment, a group of conservative religious figures who worried over the movies’ impact on morals sought to give their condemnation scientific standing by turning to social scientists for experiments that would “prove” the cinema’s negative effects on behavior, conduct, and even health and personal well-being: with funding from the Payne Fund, a private foundation concerned with moral reform, the scientists produced over 10 volumes through the first half of the 1930s. The results of their investigations were somewhat mixed, but there was clear pressure on these social scientists to come up with conclusions suggesting that popular cinema was likely to have deleterious effects on public morals. However compromised in their objectivity, the Payne Fund studies on the influence of movies are still cited in the literature of communication studies as one of the first concerted efforts by social science to conduct research on media effects for a specific form of modern popular culture – albeit efforts highly biased in intent and highly questionable in their control of method, collection of data, and interpretation of results.
Alternatives To Hollywood
Globally, there has historically been concern about the hegemony of Hollywood over other national cinemas, and resistance to the perceived power of the American feature film has manifested itself most immediately in such forms as quotas (percentages of indigenous production versus American films allowed into various national or regional markets), incentives (such as tax breaks) for local filmmakers, subventions for filmmaking on indigenous subject matter, and so on. The history of cinema is simultaneously the history of a dominant industry centered in Hollywood and of the diverse alternatives that have arisen elsewhere over time. This “elsewhere” can include American sites in which there is an attempt to articulate non-Hollywood modes of production: for example, the avant-garde film scenes of New York and San Francisco in the 1960s, or the rough urban filmmaking of blaxploitation in the 1970s (which ironically was eventually largely co-opted and commercialized by Hollywood).
In some cases, the intent of filmmakers outside the Hollywood system is to rival Hollywood at its own game by intensifying the entertainment propensities of various genres of playful diversion. For instance, Italian “spaghetti westerns” of the 1960s or Hong Kong “kung fu” action films of more recent years take inspiration from Hollywood genres of violence (for example, the western or the gangster story) as well as from the Hollywood investment in the glamorous star (in the Hong Kong case, the male star, whose characters exhibit extreme prowess in scenes of combat). In this way, these non-Hollywood genres set out to articulate their own particularly seductive form of action through choreographed set pieces in which violent men move virtually balletically as they go about their work of mayhem.
In the 1980s, to take a very different example, a French film industry that was decidedly commercial in intent revitalized itself by focusing on the screwball genre of popular, fast-talking comedies of amorous and personal misunderstanding (as in the big indigenous hit Trois hommes et un couffin [1985], about three carefree bachelors who suddenly “inherit” a baby from a single mother and must learn familial responsibility). Perhaps predictably, the success of such genre cinema has caught Hollywood’s attention and led to such phenomena as the hiring of global talent into the American system (for example, the revered Hong Kong action director John Woo was brought to Hollywood, as were seductive actors like Chow Yun-Fat) and Hollywood remakes of successful foreign genre films (thus, Trois hommes became Three men and a baby [1987]).
Other global filmmakers and film industries find their own specificity less by extending or emulating Hollywood genre entertainment than by producing films that they assume offer audiences experiences falling outside the purview of Hollywood escapism. For instance, in the 1960s and 1970s, a number of filmmakers in so-called “third world” countries set out, often with government support in the form of production subventions, to challenge what they saw as the risk of their media industries’ dependency on US hegemony in the cultural realm. Against the glossy visual perfection of the Hollywood film, some filmmakers (for example, in Latin America) thus militated for an activist cinema that would be rough or deliberately imperfect in look and politically confrontational in subject matter. This cinema was sometimes termed “third cinema” or “imperfect cinema.”

Yet other filmmakers have been driven by the assumption that cinematic forms other than the Hollywood one, geared ostensibly to passively absorbed entertainment, need to be encouraged. The alternative for these filmmakers is to create works that they assume lead to the audience’s intellectual engagement with what it is seeing and to reflection on artistic and existential issues as well as political ones. Thus, in the period after the Russian Revolution of 1917, a number of Soviet filmmakers such as Dziga Vertov and Sergei Eisenstein set out to construct a dialectical cinema that would refuse easy entertainment and dramatically compel spectators toward new social attitudes. Likewise, in the late 1950s and early 1960s especially, what was termed “European art cinema” was rich with films that offered thematic complexity, moral dilemma, intellectual puzzle, dis-identification with character, and philosophical reflection, along with an antinarrative refusal of closure and of the happy ending that ties up all enigmas. One strategy was to end with a freeze frame in which characters were caught in the midst of making a decision about life options – see, for instance, the famous last shot of François Truffaut’s The 400 blows (1959), with its truant boy stuck at the ocean’s edge, not sure where to go – and in which it was assumed the audience itself had to puzzle out what could or even should happen next.
Significantly, this emphasis in the 1950s and 1960s on an “art cinema” that ostensibly encourages the audience both to think and to appreciate concerted experimentation with cinematic language was complemented in the 1960s by criticism and scholarship that themselves emphasized a cinema of aesthetic distinction from Hollywood light diversion. Central to this new critical discourse was an emphasis on the film director as “auteur” – that is, a veritable author who, even when working from a screenplay written by someone else, imposed his or her own thematic vision on the finished work of film art. By the 1970s, the Hollywood film industry – which had seen its efficiency of production and its grasp of the world filmmaking scene challenged by the coming of television, the break-up of the old studios, and an influx of new subject matter in the areas of sex and violence that showed up Hollywood’s censorious fastidiousness as old-fashioned and out of step with the times – came to reinvigorate itself precisely by a turn to auteur-driven filmmaking.
For a while, Hollywood filmmaking ceased to seem conformist and gave witness to films that appeared to have “something to say.” For instance, Robert Altman’s McCabe and Mrs Miller (1971) is essentially a “European”-style art film made from within the Hollywood system: the work employs ambiguity of character (is McCabe [Warren Beatty] a hero or not?), distinctiveness of visual style, critique of dominant power structures, thought-provoking thematic resonance, relative explicitness of violence and sexual reference, lack of full narrative closure, deconstruction of genre (in this case, the western) and its pleasures, and so on. However, as much as the “auteur” was intended to serve a directly critical function – reminding one that escapist cinema was precisely a site in which personal voice was rare – the notion clearly also provided a sort of branding by which artful films could be promoted and sold to their target audiences. In this way, a cinema of supposed distinction rejoins the commercial impulses of dominant movie entertainment.