Digital media, also known as “new media,” comprise content created, disseminated, and/or stored using digital computers or mobile devices (video games, blogs, etc.), as well as their physical embodiment (DVDs, flash memory sticks, etc.). Digital media are often defined in contrast to “analog media,” new media in contrast to “mass media.” The history of digital media documents the move of computers from glorified calculators to devices for human communications, entertainment, and creative production, linking digital media to earlier interactive machines and media.
Central to this move, and to computers more generally, is memory: one of the definitions of digital media is digital storage device. The first clear articulation of computer memory to store both data and instructions was John von Neumann’s 1945 First draft of a report on the EDVAC, in which he compared vacuum tubes to human neurons. More concretely, J. Presper Eckert developed a mercury delay line to produce “regenerative memory.” Static magnetic memory – based on recording technology developed in Germany during World War II – was first used in 1952 for an MIT test machine. The term digital media, however, does not simply mean digital storage: not only do early forms of digital storage (CRTs, ferrite-coated magnetic tapes, and so on) not register now as digital media, but the emergence of digital media also affected the notion of the medium as a storage device. The term digital media encompasses broader definitions of medium as a form of circulation, dissemination, and communication. These understandings emphasize algorithms and human use over storage, and thus the difference between technology and media. Important to these definitions are the development of higher-level programming languages, real-time and graphic user interfaces, and networked communications – all of which rely upon discrete hardware.
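To make the stored-program principle concrete, the following minimal sketch is a hypothetical toy machine, not a model of the EDVAC or any historical design: instructions and data sit in a single memory array, and one fetch-decode-execute loop runs whatever the program counter points at. The opcodes, addresses, and sample program are invented for illustration.

```python
# Toy illustration of the stored-program concept: instructions and data share
# one memory array, and the machine repeatedly fetches, decodes, and executes
# whatever the program counter points at. All opcodes here are invented.

LOAD, ADD, STORE, HALT = 0, 1, 2, 3  # hypothetical instruction set

def run(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, addr = memory[pc]         # fetch the instruction at pc
        pc += 1
        if op == LOAD:                # copy a memory cell into the accumulator
            acc = memory[addr]
        elif op == ADD:               # add a memory cell to the accumulator
            acc += memory[addr]
        elif op == STORE:             # write the accumulator back to memory
            memory[addr] = acc
        elif op == HALT:
            return memory

# The program (cells 0-3) and its data (cells 4-6) occupy the same memory.
memory = [(LOAD, 4), (ADD, 5), (STORE, 6), (HALT, 0), 2, 3, 0]
print(run(memory)[6])  # prints 5
```

Because the program itself resides in memory, it can in principle be read and rewritten like any other data, which is one reason stored-program machines could later be used to help program themselves.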
Digital, Real-Time Computation, And Human Augmentation
Higher-level programming languages, which enable language-based exchange between man and machine, depend on the programmability of discrete computation. Because digital machines treat ranges of signals as one value, they allow for the precision necessary for solving equations numerically and for ensuring that instructions, stored in computer memory, are accurately read and executed. The first implemented higher-level language was Short Code for the UNIVAC (1950); the first widely used language was FORTRAN, developed by IBM (1954–1957). Stemming from the desire to use the computer to program itself, higher-level programming languages hide the specificities of hardware and memory allocation from the programmer, and focus the programmer’s attention on questions of problem flow and data relations. Computers are thus part of the drive within media development to offer a highly technologized, equipment-free reality (Benjamin 1968).
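As a rough illustration of why treating ranges of signals as one value makes stored instructions reliable, the sketch below uses a toy model (the voltage levels, threshold, and noise bounds are invented, not those of any real machine): bits are written as noisy voltages and read back by thresholding, so modest analog noise never changes the recovered value.

```python
import random

# Toy model of discrete signaling: any voltage below the threshold reads as 0,
# anything above reads as 1, so modest noise does not corrupt the stored bits.
# The voltage levels, threshold, and noise range are invented for illustration.
LOW, HIGH, THRESHOLD = 0.0, 5.0, 2.5

def write(bits):
    """Encode bits as ideal voltages, then add a little analog noise."""
    return [(HIGH if b else LOW) + random.uniform(-1.0, 1.0) for b in bits]

def read(voltages):
    """Decode by thresholding: a whole range of voltages counts as one value."""
    return [1 if v > THRESHOLD else 0 for v in voltages]

bits = [1, 0, 1, 1, 0, 0, 1]
assert read(write(bits)) == bits  # noise within +/-1.0 V never flips a bit
```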
Software as a mediating entity also requires real-time interfaces, where real time means computation that occurs “in conjunction with instruments receiving and responding to stimuli from the external environment” (OED). The first real-time digital computer was the WHIRLWIND, a post–World War II project initiated by the US Navy and MIT to produce a universal cockpit simulator. It eventually evolved into an Air Force-funded, continent-wide air defense system known as SAGE, already obsolete by the time it was first fully implemented in 1963 (Edwards 1996). Real-time interfaces are the basis for time-sharing systems such as UNIX, developed at Bell Labs around 1970, which make the user feel as though he or she controls a shared machine and are thus considered the basis for personal computing (Campbell-Kelly & Aspray 1996). Real-time interfaces also make possible graphic user interfaces (GUIs), in which computers respond in real time to user input framed as an intervention onto a graphical surface.
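The time-sharing idea behind such systems can be sketched as a toy round-robin scheduler (a simplification for illustration, not UNIX’s actual scheduling): the machine cycles through users’ pending jobs a few steps at a time, quickly enough that each user experiences the shared computer as his or her own.

```python
from collections import deque

# Toy round-robin time-sharing: each "job" is a list of steps, and the
# scheduler runs a small slice of every job per pass, so no user waits long.
# The user names, job steps, and slice size are invented for illustration.

def time_share(jobs, slice_size=2):
    queue = deque(jobs.items())
    while queue:
        user, steps = queue.popleft()
        for step in steps[:slice_size]:       # run a short burst for this user
            print(f"{user}: {step}")
        if len(steps) > slice_size:           # unfinished jobs rejoin the queue
            queue.append((user, steps[slice_size:]))

time_share({
    "alice": ["edit", "compile", "run"],
    "bob": ["print report"],
})
```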
Graphic user interfaces – along with a concerted commercial effort to turn computers from industrial to personal commodities – have helped to make computers personal multimedia machines. GUIs are part of a series of attempts to make the computer a vehicle of empowerment and mastery, rather than an object of grudging acceptance and resentment. Douglas Engelbart, who demoed the first mouse in 1968, argued as early as 1962 for the development of computer interfaces to augment human intellect. Such systems, he argued, would enable the type of associative linking (and thus human progress) advocated by Vannevar Bush in his highly influential “As we may think” article (1945). Theodor Nelson, who coined the term “hypertext” and was also influenced by Bush’s ideas, similarly argued that responsive computer display systems “can, should and will restructure and light up the mental life of mankind” (Nelson 1987). Whether or not such displays could or would effect, or have effected, such results, GUIs have been a key component of the spread of desktop computers and the automation of office work.
Internet Communications As Media
The Internet has provided a means of circulation that makes digital media mass media. Like the technologies described above, the Internet was a military creation designed to augment human intelligence and turn humans into more creative producers. The Internet stems from ARPANET, a US military-funded network designed to link together several research sites. The main impetus behind its development was the belief that time-sharing computer networks could bring about a “man-computer symbiosis” (Licklider 1960). The most popular use of ARPANET was an unforeseen one: electronic mail, an application that would also drive widespread civilian use of the Internet. In 1983, with the formal adoption of TCP/IP, the Internet proper – a network of networks, based on an open protocol to link differently configured local area networks – emerged.
Commercial transactions were prohibited on the Internet until 1991, and before then the popular forms of communication were email, newsgroups (1979), Internet Relay Chat (IRC, 1988), and multi-user text-based games (1978) such as MUDs and MOOs. Access to the Internet backbone was limited to academia, the government, and the military, although there were some public bulletin board systems run by organizations such as the WELL in San Francisco. Communications were mainly text-based, although image files could be downloaded.
The emergence of the Internet as a popular mass medium coincided with its commercialization, the privatization of its backbone (early 1990s), and the development of the world wide web (WWW) by Tim Berners-Lee in 1990. The WWW enabled the mass sharing of linked files and provided an easy-to-use, readily accessible way to post personal and commercial content. At first, the WWW was populated by personal and institutional homepages, designed with a distinctly amateur DIY aesthetic (now perpetuated as an indicator of “authentic” digital content). Soon flashy “push” media (content delivered to users whether or not they had requested it) were developed for the WWW. With the frenetic laying down of fiber optic cable in the late 1990s, high-bandwidth media such as video and music became available, and “real time” referred less to synchronous computing and more to synchronous transmission. Massively Multiplayer Online Role-Playing Games (MMORPGs) also emerged. Technology, however, does not guarantee use: fiber optic cable was initially laid down in the 1980s in anticipation of demand for the video phone.
During the mid- to late 1990s, the Internet was conflated with William Gibson’s “cyberspace” and was viewed as an alternative space that defied government regulation, big business, and intellectual property (Chun 2006). The information superhighway was to fix all our political problems, from racial discrimination to the ills created by the mass media. The Internet was supposedly uncensorable, since it allegedly treated censorship like a network outage and thus routed around it. It was also seen as a fundamentally anonymous and participatory medium. According to Janet Abbate (1999), “the culture of the Internet challenges the whole distinction between producers and users.” Information, the hacker ethic argued, wanted to be free (Levy 1984). This did not mean, however, that all of the utopian discussions around the Internet were non-commercial. New companies, such as amazon.com, seemed to threaten the continued existence of brick-and-mortar companies. Digital media were also sold as the medium of convergence, and soon, it was prophesied, all mass media would converge into one narrowcasting digital form.
Video Games And Other “Media”
Video games are a crucial form of digital media, and their history intersects with and diverges from the history of communications media. Video games underlie literary conceptions of cyberspace (Chun 2006).
The first computer game is widely accepted to be Spacewar, developed by Steve Russell for the PDP-1 minicomputer at MIT in 1962; the first major hit was the arcade game Pong in 1972. The development of microchips in the late 1970s enabled more graphically complex arcade games and the development of relatively affordable digital home game consoles. Japan has a strong game machine tradition, and in 1978 Taito’s Space Invaders became the first computer game to be more popular than local pachinko machines (Malliet & de Meyer 2005). It and Pac-Man, introduced in 1980, were cross-continental hits. During this period, pre-assembled personal computers such as the Commodore 64 (1982) and the Macintosh (1984), on which computer games could be played and copied, were also introduced, and these multi-use computers threatened the game console business.
The game console, however, came back in the 1990s with new machines by Nintendo, Sega, and Sony, and later Microsoft. Made possible by the globalization of computer production, which led to cheaper microprocessors and memory, the return of the console signaled a move away from the personal computer as the universal machine, toward ubiquitous computing: from iPods, introduced in 2001, to smartphones. SMS text messaging, first developed for GSM phones in Europe, is extremely popular globally and has started to penetrate the US market (Lee & Keeter 2006). Digital media have thus not simply signaled a convergence in hardware, but also a proliferation of it.
Digital media have also penetrated other forms of media. For instance, digital special effects have made animation and stop-image photography a more prominent factor in the practice and history of cinema (Manovich 2001). Devices and services such as TiVo have also affected normal television-watching practices, making it possible to skip through commercials and to post recorded television shows on the Internet. This change in audience behavior is most distinct in fan sites and blogs, linked to early print fanzines, on which users post their own narratives based on the shows and stars of interest and share information about the shows. These have influenced the development of various series, with directors reputed to visit fan sites regularly. Television shows also have official websites, which offer teasers for upcoming episodes, episode summaries, photo galleries, and so on. The question is: what now is not digital media?
Current Usage
New media have become both more and less than expected. They are used by over one billion people worldwide, and commercial transactions are now commonplace. Their very existence, however, has not eradicated older institutions and governmental regulations. The Internet, after all, is based on networking protocols, and these can be changed: policy-based routing protocols used by cable providers, for example, have challenged the equal treatment and circulation of data. In addition, intellectual property laws introduced by the US government make “fair use” of digital media difficult. The Motion Picture Association of America (MPAA) has launched an aggressive anti-piracy campaign directed against content-sharing sites, as well as individual end users. The use of the Internet to prosecute people, from terrorists to journalists, and its use in commercial forms of surveillance have belied its status as an “anonymous medium.”
All this has not meant the demise of digital media DIY culture and free circulation. New peer-to-peer networks, such as Gnutella, are truly distributed and thus cannot be easily shut down. Sites such as YouTube.com enable users to post their own videos, and blogs and indymedia sites have come to rival news sites as sources of credible information. More damagingly for local newspapers, sites such as Craigslist, on which people can freely post classified ads, have taken away an important income source. Many popular sites, such as facebook.com and myspace.com, herald a return to homepages, albeit in an extremely standardized but easier-to-use form. MySpace and blogs are part of the much-heralded Web 2.0, centered on social networking sites that enable the quick and easy posting of micro-content.
References
- Abbate, J. (1999). Inventing the Internet. Cambridge, MA: MIT Press.
- Benjamin, W. (1968). The work of art in the age of mechanical reproduction. In Illuminations: Essays and reflections (trans. H. Zohn). New York: Schocken Books, pp. 217–251.
- Bush, V. (1945). As we may think. The Atlantic Monthly (July). At theatlantic.com/doc/194507/bush, accessed June 20, 2007.
- Campbell-Kelly, M., & Aspray, W. (1996). Computer: A history of the information machine. New York: Basic Books.
- Chun, W. (2006). Control and freedom: Power and paranoia in the age of fiber optics. Cambridge, MA: MIT Press.
- Edwards, P. (1996). The closed world: Computers and the politics of discourse in Cold War America. Cambridge, MA: MIT Press.
- Engelbart, D. (1962). Augmenting human intellect: A conceptual framework. AFOSR-3233 summary report contract AF49(638)-1024, Air Force Office of Scientific Research. At www.bootstrap.org/augdocs/friedewald030402/augmentinghumanintellect/ahi62index.html, accessed June 20, 2007.
- Lee, R., & Keeter, S. (2006). How Americans use their cell phones. Pew Internet & American Life Project. At www.pewinternet.org/pdfs/PIP_Cell_phone_study.pdf, accessed June 20, 2007.
- Levy, S. (1984). Hackers: Heroes of the computer revolution. Garden City, NY: Anchor Press/Doubleday.
- Licklider, J. C. R. (1960). Man-computer symbiosis. IRE Transactions on Human Factors in Electronics, HFE-1, 4-11. At memex.org/licklider.pdf, accessed June 20, 2007.
- Malliet, S., & de Meyer, G. (2005). The history of the video game. In Handbook of computer game studies. Cambridge, MA: MIT Press, pp. 23–45.
- Manovich, L. (2001). The language of new media. Cambridge, MA: MIT Press.
- Nelson, T. (1987). Computer lib/dream machines, 2nd edn. Redmond, WA: Tempus Books/Microsoft Press. (Original work published 1974).
- Von Neumann, J. (1945). First draft of a report on the EDVAC. Contract No. W-670-ORD-492, Moore School of Electrical Engineering, University of Pennsylvania, Philadelphia. At qss.stanford.edu/~godfrey/vonNeumann/vnedvac.pdf, accessed June 20, 2007.
- Wardrip-Fruin, N., & Montfort, N. (2003). The new media reader. Cambridge, MA: MIT Press.