Resource Guide

EAI's Online Resource Guide for Exhibiting, Collecting & Preserving Media Art is a comprehensive resource addressing key issues in current practice and critical dialogue around exhibiting, collecting and preserving single-channel video, computer-based art and media installation.

Media art presents a unique and evolving set of exhibition, collection and preservation challenges. For four decades, artists have created electronic works, from single-channel video to art made from digital source code, that demand new practices and vocabularies for exhibitors and collectors. The variable ecology of media art, with its reproducible forms, changing technologies and mutable contexts, is one of its most dynamic if challenging attributes.

The guide features a range of essential information, including best practices; basic questions; agreements and reports; equipment and technical guidelines; interviews with artists, curators, educators, collectors, conservators, archivists, technicians and other specialists; case studies of significant projects and organizations; hard-to-find and out-of-print articles, conference papers and essays; a glossary, a guide to media formats, and related resources.

The EAI Resource Guide was funded by New Art Trust. The Preservation section, which was created in collaboration with IMAP, was supported by public funds from the New York State Council on the Arts, a state agency.

In 2007 the EAI Resource Guide was given the Award for Outstanding Contributions to Archives by the Archivists Roundtable of New York.



A Kinetic History

A Kinetic History: The EAI Archives Online is a digital resource that celebrates a remarkable artistic and cultural legacy. EAI's 30th anniversary in 2001 was the catalyst for this ongoing initiative to digitize and provide online access to EAI's extensive archives of rare materials about the emergent video art movement of the late 1960s and early 1970s.

The project charts the history of EAI from its roots in the alternative art discourses of the late 1960s, and also illuminates the evolution of an artistic movement and its cultural and art historical context. Primary materials (documents, catalogues, video footage, ephemera) and contextualizing essays trace a rich and eclectic trajectory of art, artists and ideas, from Kinetic Art to the Avant-Garde Festival to contemporary video art.

The first two "chapters" of this project are currently available online: Howard Wise & the Howard Wise Gallery, and Beginnings: Sponsored Projects 1970-74.

A work-in-progress, this "living archive" will continue to expand, linking the history of the media arts to its future.



Digitization Project

EAI has launched a major initiative to digitize the media artworks in the EAI collection. The goal of this project is to ensure that the EAI collection of over 3,500 experimental video and media works by artists—which is recognized as one of the leading historical collections of its kind—will continue to be accessible and viable for current and future generations.

The first phase of this project was funded with generous support by the New York State Council on the Arts, a state agency, through their Digitization Program.

The initial phase allowed EAI to begin the process of creating and storing uncompressed digital files of the media works in the collection. This ongoing process facilitates the preservation and distribution of the collection on current and future digital formats, and will ensure its viability as new digital technologies are developed.

In addition, the project allows for broader and more efficient access to the EAI collection through a series of initiatives that include enhanced online streaming of excerpts of works through our Online Catalogue, video documentation of our public programs, advanced access for research and preview, and a digital viewing station at EAI for educators, curators and the public.

In 2009 EAI launched the first digital access component of this project: an "on demand" digital interface for EAI's on-site Viewing Room. This system allows visitors—including students, curators and educators—to directly access over 1,300 artists' video works from the EAI collection for private on-site viewing, research and study.

In 2010 EAI launched the newest access phase of our Digitization Initiative: the EAI Online Viewing Room. This service allows for secure, private online viewing of works in the EAI collection for preview and research purposes. Please contact the EAI office to inquire about this new service.


High-Definition Video Guide

In the past decade, "widescreen" media has permeated electronic design: from television monitors and computer screens to streaming video interfaces and phones, the boxy shape once associated with television sets has all but disappeared. The force behind the widening of our screens is high-definition (HD) video, which is increasingly ubiquitous, but still a source of confusion for many.

Manufacturers are phasing out tube televisions in favor of flat-panel LCDs and plasmas. HD video cameras are becoming more and more affordable. As a result, HD video is becoming the "default" mode for the production of artists' video—and for museum acquisition of these works.

If we agree that mindfulness about the proper display of electronic art is necessary to maintain the integrity of the work, then a basic awareness of how this new medium works is crucial. In what ways is HD different from other forms of video? How do these factors visibly affect the picture? How can older analog works be properly displayed with today's technology? All in all, how will HD video impact collection, exhibition and preservation?

This addendum to EAI's Online Resource Guide explains HD technology and its implications for curators, conservators, registrars, art historians and educators. The goal is not to mandate best practices, but to offer the foundation of a consistent vocabulary. Even more, the aim is to initiate dialogue across the field about the challenges and possibilities in this new chapter in the history of the moving image.


HD video can be thought of as the third generation of video technology. Analog video was the first, offering instantaneous playback of recorded images. Next came digital video, which emulated the analog video process in the form of digital data, opening up the possibilities of non-linear editing. HD video represents another paradigm shift. Not only is HD "larger" than older forms of video (in resolution, screen size and file size), but as we will see, it is uniquely "flexible," with more technical variables.

The term "HD" is often used as a meaningless marketing buzzword—look around and you"ll find "HD" sunglasses, "HD" lightbulbs, moisturizer, acrylic paint. However, the term does have a concrete, objective definition. HD is digital video with a resolution of at least 1280 x 720 pixels.

HD is a broad category with a minimum quantitative requirement. Just as a person must have at least one million dollars to be called a millionaire, video must meet this minimum number of pixels to qualify for HD status. Therefore, the term can apply to the full spectrum of digital video contexts, from IMAX 3D to YouTube. This guide will focus on the forms of HD video likely to be encountered in museums, galleries and classrooms.
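This minimum can be stated as a simple check. The sketch below (in Python; the function name is our own invention, not an industry API) makes the threshold concrete:

```python
# The HD threshold described above: digital video qualifies as HD
# when its resolution is at least 1280 x 720 pixels.
HD_MIN_WIDTH, HD_MIN_HEIGHT = 1280, 720

def is_hd(width: int, height: int) -> bool:
    """Return True if a resolution meets the minimum HD pixel count."""
    return width >= HD_MIN_WIDTH and height >= HD_MIN_HEIGHT

print(is_hd(720, 480))    # digital SD (NTSC) -> False
print(is_hd(1280, 720))   # 720p -> True
print(is_hd(1920, 1080))  # 1080i/1080p -> True
```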

Analog, digital SD, digital HD

What do we mean, specifically, when we say HD video is the "third generation" of video? The concept best applies to non-broadcast video, which is what is most often found in museum collections and small archives. The first generation of non-broadcast or "personal" video began in the late 1960s, when the first portable video systems arrived on the American market. These "portapaks" (not a brand name) were self-contained, battery-powered, reel-to-reel videorecorders using analog tape; throughout the 1980s, they were replaced by smaller cameras that recorded to videocassettes, most popularly Betamax, VHS and Hi8. The era of analog video lasted through the mid-1990s, when digital standard-definition video became available to consumers in formats like DV, which allowed users to record on tape and edit on a computer. Today, the new HD formats are replacing DV, Digital Betacam and other standard-definition digital formats.

The leap from standard-definition (SD) video to HD is as significant as the analog-to-digital transition. In fact, we can group the first two generations together as "SD," since both are based on the same legal television standards (NTSC in North America and Japan, and PAL throughout much of the rest of the world). SD digital video translated the standards for analog recording (NTSC) into digital form. By contrast, HD video has a purely digital foundation and does not rely on NTSC or PAL.

Note: The rest of this guide will pertain specifically to video in North America, describing contemporary HD video as an outgrowth of the former NTSC television standard. It will not cover the PAL or SECAM standards. Also, from this point on, "SD" will refer to analog video as well as standard-definition digital video.


"High definition" is a relative term. Like all forms of video, the concept of HD video originated in the technology and legal standards of the television broadcast industry. A television standard is an industry-wide platform of specifications for generating and interpreting electronic signals so that all systems (broadcast sending/receiving, videorecording/playback) are compatible in a given region. Because these legal rules drive the development of new recording/playback formats and other video equipment, it is important to have a basic understanding of how they function.

The history of HD video stretches to the earliest days of television in the United States. In 1941, the National Television Systems Committee (NTSC) formulated a standard for black-and-white television in North America. Then in 1953 the committee revised its standard for color television. "NTSC Color" was designed to be backwards-compatible with the black-and-white television infrastructure: viewers had to buy color sets in order to see color broadcasts in color, but they could still watch the same transmissions in black and white on their existing sets. This decision made economic sense, but it also meant that most elements of the television formula (including the resolution, scanning method and aspect ratio) would remain the same as in the 1940s.

Both North America and Japan implemented the NTSC Color standard. In the early 1970s the Japanese public broadcaster NHK began developing a more efficient transmission system, which had a higher resolution and a wider aspect ratio, and brought this high-definition prototype to the United States for demonstrations in the following decade. In 1982, a group called the Advanced Television Systems Committee (ATSC) formed to create new television standards to replace NTSC Color in North America.

Contending proposals for the future of television came from all corners of the television and electronics industries. In 1996, the ATSC reached a consensus and gained the approval of the Federal Communications Commission (FCC). A major goal of the new ATSC standard was to phase out analog broadcasting over the next decade. Analog broadcasting was discontinued and replaced with all-digital transmission in June 2009.

Today the ATSC standard is officially in place. Unlike NTSC, which mandated only one recipe for video signals, the ATSC makes room for many variations and includes several subcategories. One subcategory is Standard-Definition Television (SDTV), the digital equivalent of the original analog NTSC standards. High-Definition Television (HDTV) has a new set of digital parameters. Within HDTV, there are many different permutations of variables. For most practical purposes, it is only necessary to get acquainted with the following three types of HD video:

- 720p 
- 1080i 
- 1080p 

A discussion of the differences between SD and HD video will clarify these types.


How does HD video differ from SD? In general, the technical parameters offer more options; this level of "customizability" is the key to the versatility of HD video across different applications, from streaming video to broadcasting, and from tape-based to digital storage.  Specifically, HD video differs from SD video in four basic ways:

- aspect ratio
- resolution
- scanning method
- frame rate

What are these parameters, and how do they affect the image?

1) Aspect ratio

Aspect ratio is the shape of the image, regardless of its size or resolution, represented as a ratio of width to height. The aspect ratio of SD video is always 4:3, a roughly square shape. HD video approximates widescreen cinema, with a constant aspect ratio of 16:9.

Because of the basic difference in image shape, widescreen content must be resized or cropped to fit on SD screens. Most people are familiar with "letterboxing" from DVDs: the image is resized so that it fits within the 4:3 frame, with black bars on the top and bottom of the screen. Inversely, SD content must be pillarboxed to fit in a widescreen frame: black bars appear on either side of the image to maintain the original 4:3 aspect ratio.
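The letterbox/pillarbox arithmetic can be sketched in a few lines of Python (fit_into is an illustrative helper, and 720 x 540 stands in here for a square-pixel 4:3 source):

```python
from fractions import Fraction

def fit_into(src_w, src_h, frame_w, frame_h):
    """Scale a source to fit inside a frame without distortion.

    Returns the scaled picture size; black bars fill the leftover space.
    """
    scale = min(Fraction(frame_w, src_w), Fraction(frame_h, src_h))
    return int(src_w * scale), int(src_h * scale)

# Letterboxing: 16:9 content (1920 x 1080) shown on a 4:3 screen (640 x 480)
w, h = fit_into(1920, 1080, 640, 480)
print(w, h, (480 - h) // 2)   # 640 360 60 -> 60-pixel bars top and bottom

# Pillarboxing: 4:3 content (720 x 540) shown in a 16:9 frame (1920 x 1080)
w, h = fit_into(720, 540, 1920, 1080)
print(w, h, (1920 - w) // 2)  # 1440 1080 240 -> 240-pixel bars left and right
```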

***Practical consideration for exhibitors:

In EAI's experience, the #1 problem with HD displays in museums and galleries is the presentation of video in the wrong aspect ratio. When SD video is displayed on HD equipment, most monitors will automatically scale the source to the 16:9 aspect ratio, stretching and distorting the image. To maintain the correct aspect ratio, the monitor may need to be manually set to 4:3 using the remote and menus. Remember, SD video sources should not fill a wide screen. Instead, the image should be pillarboxed.

2) Resolution

Resolution is the level of image detail, determined by the number of pixels that make up the picture.

Analog video is composed of lines rather than pixels, based on the analog principle of interlaced scanning (discussed below). For this reason, analog resolution is expressed in terms of lines: NTSC video is scanned at 525 lines, of which about 480 carry the visible picture. Translated into the digital realm, this is equal to 720 x 480 pixels.

Digital resolution can be expressed as width by height (horizontal pixel count x vertical pixel count), but it is more often expressed in terms of the vertical dimension only. This is vertical resolution.

The vertical resolution of analog NTSC/digital SDTV is 480.

            720 x 480 = 345,600 pixels per frame

The vertical resolution of HDTV/HD can be either 720 or 1080.

            1280 x 720 = 921,600 pixels per frame

            1920 x 1080 = 2,073,600 pixels per frame

Therefore, full HD video has about six times the resolution of analog video.
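These pixel counts, and the factor of six, are simple arithmetic:

```python
sd = 720 * 480         # digital SD frame
hd_720 = 1280 * 720    # 720p frame
hd_1080 = 1920 * 1080  # full HD frame

print(sd, hd_720, hd_1080)  # 345600 921600 2073600
print(hd_1080 / sd)         # 6.0 -- full HD has six times the pixels of SD
```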

***Practical consideration for exhibitors:

HD monitors and projectors are fixed-pixel displays, meaning they can display only a certain number of pixels. Native resolution is the exact number of pixels on a display. A monitor or projector will always scale the source (video signal) to its own native resolution, which means throwing away surplus picture information as necessary (i.e., if your source resolution is 1080 but your monitor's native resolution is 720, you won't see the full-quality image). The display resolution should be equal to or higher than the source resolution. For this reason, it's wise to invest in a full HD (1920 x 1080) monitor or projector for large displays.
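The rule of thumb above can be restated as a small sketch (display_outcome is an illustrative helper that compares vertical resolutions only, ignoring aspect-ratio scaling):

```python
def display_outcome(source_h: int, native_h: int) -> str:
    """Describe what a fixed-pixel display does with a given source."""
    if source_h > native_h:
        return "downscaled: surplus picture detail is discarded"
    if source_h < native_h:
        return "upscaled: no detail is gained"
    return "native: 1:1 pixel mapping, full quality"

print(display_outcome(1080, 720))   # 1080 source on a 720 monitor
print(display_outcome(480, 1080))   # SD source on a full HD monitor
print(display_outcome(1080, 1080))  # matched source and display
```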

3) Scanning Method

Like motion picture film, video is a series of still pictures displayed fast enough to create the illusion of motion. Scanning is the process of drawing electronic pictures on the screen. Video images are scanned in one of two ways: interlaced or progressive.

Interlace scanning is the conventional analog method of video and television display. Analog television was designed for cathode ray tube (CRT) monitors. Inside a CRT, an electron gun at the back of the tube draws lines of light on the screen in a left-to-right, top-to-bottom motion. To draw one frame of video, the gun must complete 525 horizontal passes in 1/30th of a second. To expedite this process, the interlace method takes advantage of the persistence of vision. Instead of drawing the entire frame of video at once, the interlace system very quickly draws every other line, then goes back to the top to fill in the remaining lines.

Put another way, every frame of video is split in half. These halves are called fields. Since a frame lasts 1/30th of a second, a field lasts 1/60th of a second. In any given frame, the first field contains the odd-numbered lines (1, 3, 5 ... 525) and the second field contains the even-numbered lines (2, 4, 6 ... 524). Because the fields are displayed so quickly, the viewer's brain fills in the blanks, forming the impression of a complete picture.

SD video is always interlaced. For digitally interlaced SD video, the process is basically the same: the pixels that make up the frame are divided into two fields, in which only half the pixels are active at a time; the fields are interlaced together to form one complete frame.
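The division of a frame into fields can be modeled in a few lines (a toy sketch; which field is scanned or transmitted first varies by format, so treat the ordering here as illustrative):

```python
def split_fields(frame):
    """Split a frame (a list of scan lines, line 1 first) into two fields."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ...
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = ["line %d" % n for n in range(1, 9)]  # a tiny 8-line "frame"
odd, even = split_fields(frame)
print(odd)   # ['line 1', 'line 3', 'line 5', 'line 7']
print(even)  # ['line 2', 'line 4', 'line 6', 'line 8']
```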

[Figure: one field of interlaced NTSC video]

HD video can be either interlaced (denoted i) or progressive (p).

Progressive scanning displays the entire frame at once. We can think of motion picture film as "progressive," because each frame of film is a complete picture. Because the image is not split in half, the progressive display method produces a smoother, more stable picture than interlacing. But this higher level of "data density" can slow things down—progressive video produces larger file sizes and requires more bandwidth. This is why interlacing is still an option for HD: it is a more "economical" method.

***Practical consideration for exhibitors:

Interlacing can cause image problems in the HD realm. CRT televisions are inherently interlaced; watching a tube television, the human eye flawlessly "laces" the two fields together and sees a constant image. Flat-panel displays, on the other hand, are inherently progressive, composed of a fixed grid of pixels that translate digital data into light and color. The entire matrix of pixels is activated at once to create a frame of video. Progressive devices display interlaced sources through a process called de-interlacing. All HD monitors and projectors have this software built in. De-interlacing is an algorithm that combines the two interlaced fields, by blending or averaging them together, into a single progressive frame. For relatively static images, de-interlacing works well, but fast-motion images tend to cause visible errors, or artifacts.
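A blend de-interlacer can be sketched very simply (a toy model on brightness values; production de-interlacers are motion-adaptive and far more sophisticated):

```python
def deinterlace_blend(field_a, field_b):
    """Average two fields, line by line, into one progressive frame."""
    return [[(a + b) / 2 for a, b in zip(line_a, line_b)]
            for line_a, line_b in zip(field_a, field_b)]

# Two tiny 2x2 fields of brightness values, captured 1/60th of a second apart.
field_1 = [[100, 100], [50, 50]]
field_2 = [[80, 80], [70, 70]]
print(deinterlace_blend(field_1, field_2))  # [[90.0, 90.0], [60.0, 60.0]]
```

On a static image the two fields match and the blend is invisible; when the subject moves between fields, the averaged values smear, producing the artifacts described above.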

4) Frame Rate

Frame rate is the speed at which individual frames are recorded and played back, measured in frames per second (fps). The frame rate for motion picture sound film was universally standardized in the early 1930s at 24 fps. Video frame rates are usually higher. The 1941 NTSC standard set the frame rate for video at 30 frames per second. Color NTSC reduced the rate very slightly, to 29.97 fps, so that the added color information (signaled by the "color burst" in each line) would not interfere with the rest of the broadcast signal. (Although the NTSC frame rate is a fraction short of 30 fps, it is commonly referred to as "30 fps" for the sake of convenience.)

While NTSC SD video is recorded and played back at only one possible frame rate, HD video can be recorded in several rates, most commonly 29.97 (aka "30") fps, 59.94 (aka "60") fps, and 24 fps. The higher the frame rate, the greater the data density, which again translates into larger file sizes or more bandwidth. For this reason, frame rate is also called "temporal resolution."
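The "29.97" figure is not arbitrary: the NTSC color frame rate is exactly 30000/1001 frames per second, as a quick calculation shows:

```python
from fractions import Fraction

ntsc_color = Fraction(30000, 1001)  # exact NTSC color frame rate
print(float(ntsc_color))            # 29.97002997002997
print(round(float(ntsc_color), 2))  # 29.97

# Higher frame rates mean proportionally more data per second of video.
print(Fraction(60000, 1001) / ntsc_color)  # 2 -- "60 fps" doubles the frame count
```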

***Practical consideration for exhibitors: A detailed knowledge about frame rates is necessary for video editors and technical staff, but not as crucial for curators or exhibition planners, because the rate is locked in at the point of recording, and video playback equipment is unlikely to cause problems in this area.

HD MODES: 720p, 1080i and 1080p

Now that we understand the basic differences between SD and HD video, and particularly the concepts of resolution and aspect ratio, it will be easier to discuss the three HD "modes," 720p, 1080i and 1080p. Sometimes referred to as "flavors," these will be called "modes" in this guide in order to distinguish them from "formats," which will be covered in the following section. 

The three modes are designated by their resolution and scanning method. All three share a single aspect ratio, 16:9, though each supports several frame rates.

720p is progressive video composed of 1280 horizontal pixels and 720 vertical pixels.
resolution: 1280 x 720
used in: broadcast television, internet downloads/streaming, camcorders, video games, artists' video

1080i is interlaced video composed of 1920 horizontal pixels and 1080 vertical pixels.
resolution: 1920 x 1080
used in: same as 720p

It is important to understand that 1080i is not superior to 720p. In fact, the two modes are equivalent in quality when applied to their best uses: progressive 720p is ideally suited to high-motion subjects, while interlaced 1080i, with its higher pixel resolution, is ideal for more static subjects.

Where do these values come from? 720p and 1080i were competing systems proposed to the ATSC as possible HDTV standards. Roughly half of the major American TV networks supported one system, the other half supported the other; in the end both were adopted. Each network is now committed to only one (ESPN, ABC and Fox use 720p, while PBS, CBS, NBC and HBO use 1080i).

1080p combines high resolution with progressive scanning. This "full HD" mode is supported by Blu-ray and most other HD video formats, and nearly all HD monitors and projectors.
resolution: 1920 x 1080
used in: Blu-ray, internet downloads/streaming, camcorders, video games, artists' video

1080p is not currently used in broadcasting because it exceeds the bandwidth capacity of the parameters for television transmission established by the ATSC. However, YouTube offers a 1080p option for streaming video, and content viewed in this way is likely to be visibly inferior in quality to broadcast HD television. How is this possible?

To explain this disparity in quality, we must add to our existing HD parameters (aspect ratio, resolution, scanning type and frame rate) a fifth variable: data rate.

Data rate (or bit rate) is the number of bits (binary digits) that are conveyed or processed per second, expressed in bits per second, or bit/s. The ATSC standard specifies a fairly high data rate, while YouTube and other streaming video portals use a much lower data rate for transmission. (For more information on data rate and digital compression, see the Resources section below.)
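Data rate translates directly into storage and bandwidth: file size is simply bit rate multiplied by duration. A rough sketch (19.39 Mbit/s is the maximum ATSC transport-stream rate; the 5 Mbit/s streaming figure is an illustrative assumption):

```python
def file_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Approximate file size in decimal gigabytes for a given bit rate."""
    megabits = bitrate_mbps * minutes * 60  # total megabits transferred
    return megabits / 8 / 1000              # 8 bits per byte, 1000 MB per GB

print(round(file_size_gb(19.39, 60), 2))  # one hour at full ATSC rate: 8.73 GB
print(round(file_size_gb(5.0, 60), 2))    # one hour at a streaming rate: 2.25 GB
```

The gap between those two figures is why a "1080p" stream can look visibly worse than a 1080i broadcast: resolution is only one of the variables that determine picture quality.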

In conclusion, 720p, 1080i and 1080p are three HD modes used for different purposes—they do not necessarily represent three tiers of image quality. A highly-compressed 1080p file may look worse than a less-compressed 720p file, and two 1080p files encoded with different compression schemes may differ dramatically in quality.


So far, this guide has covered video standards (technical specifications for broadcasting, such as NTSC and ATSC) and "modes" (the record and playback settings for HD: 720p, 1080i and 1080p). Formats are carriers of video information—the devices and/or software that contain video data. As with standard-definition video, most HD formats fall into three categories: videotape, optical media and digital files. The most common formats encountered in museum collections are listed below.


Videotape Formats:

HDCAM - HD version of Sony's Digital Betacam. Former mastering/production format in broadcast industry, now largely replaced by HDCAM SR. Suitable for preservation mastering, theatrical exhibition.

HDCAM SR - An upgrade to the HDCAM format. SR ("superior resolution") offers a lower rate of compression and hence higher image quality. Current mastering/production format in the broadcast industry. Suitable for preservation mastering, theatrical exhibition and use in high-end production facilities.

D5-HD - HD version of the Panasonic D-5 format. Not widely used for archiving or preservation mastering in the United States due to the popularity of the competing HDCAM/SR.

HDV - HD version of the standard-definition DV format. Cassettes are the same size as mini-DV tapes and come in 720p and 1080i varieties. A corresponding digital-file format, also called HDV, allows for seamless transition between file and tape. Best for theatrical exhibition, museum/gallery exhibition and editing, but may also be suitable for preserving content originally created on this format.


Optical Media Formats:

Blu-ray - Blu-ray is the disc format for HD video display, capable of much higher resolution than DVD. Blu-ray discs record in the 1080p mode and can store five times as much information as DVDs (25 GB for single-layer discs, 50 GB for dual-layer). The information is written to the disc with a blue laser, which has a shorter wavelength than DVD's red laser; this means data can be recorded in finer pits on the surface of the disc. Suitable for theatrical exhibition and gallery/museum exhibition, and far less expensive than any of the HD videotape formats. Blu-ray is not an archival format and should not be used for preservation mastering.

Note: Blu-ray formerly competed with Toshiba's HD-DVD, but attained market dominance in 2008. HD-DVD is now obsolete.

File Formats:

The easiest and most popular method for delivering HD video is in the form of digital files saved on hard drives or other storage media, or transmitted electronically via the web as downloads or video streams. As a format category, digital files are conceptually more complex than videotape or optical media; when discussing files, "format" is useful only as a blanket term, encompassing a number of components with more precise definitions. These components include codecs, wrappers, media players and other forms of proprietary or open-source software. For an introduction to this topic, see "A Primer on Codecs for Moving Image and Sound Archives" and other links in the Resources section. The principles of digital archiving are the same for SD and HD video. However, it is important to be mindful that HD video files are much larger than SD files, requiring more storage space. 

Files can be exhibited in several ways, including: 

- Computer connected to monitor
- Hard drive connected to computer connected to monitor
- Hard drive player device connected to monitor
- Download/streaming application on player or computer connected to monitor

***For museum and gallery exhibition of HD video files, the following questions are crucial:

- Is an HD cable (HDMI or DVI) connecting the source to the monitor?
- Is the relevant software (operating system, media player) upgraded to the latest version? 
- If using a computer (like a Mac mini), does the installation space offer sufficient ventilation to keep the computer from overheating?
- Is the file playing at the proper aspect ratio, resolution and scanning method?


A note about cathode ray tubes
Many video experts prefer professional cathode ray tube (CRT) monitors to flat-panel monitors. CRTs offer the best brightness, contrast and color reproduction of any display system, producing true "inky blacks" where flat panels can only produce shades of gray. CRTs are definitely the best method for displaying "born-analog" video.

CRTs work well in bright and dark rooms. Sony no longer manufactures Trinitron monitors, and these excellent displays will probably become more valuable over time. If you must get rid of your Trinitron or other professional-grade monitor, consider donating it to another institution rather than throwing it away.

A note about cables
To display HD sources, it is necessary to use HD digital connections. HDMI (High-Definition Multimedia Interface) cables are the best option, because they carry high-resolution video and high-quality audio. DVI cables are the second-best option, because they carry only high-resolution video and require a separate audio connection.

A note about HD monitors

HD (flatscreen) monitors usually need to be calibrated after purchase. New monitors leave the factory with the brightness setting on maximum so that they will stand out on the retail showroom floor. Turning down the brightness to a medium level will reduce power consumption, extend the life of the monitor and improve the picture.

Plasma monitors come in large sizes only (from 42 inches to 65 inches) and are best for use in dimly lit rooms.

- produces good range of colors, deep blacks
- wide viewing angles (i.e., the image looks correct to a viewer off to one side, or above/below the front-and-center space in front of the screen)

- prone to screen burn-in for the first 100 hours of use. (Burn-in is permanent discoloration of the screen caused by a prolonged display of non-moving images, text or graphics.)
- heavy
- not energy-efficient
- some models have a glass screen that reflects ambient light; these should be used in dim or dark rooms only
- malfunctions at high altitudes

LCD (Liquid Crystal Display) monitors come in a wide range of sizes, from 13" to 65".

- lighter and more energy-efficient than plasmas
- screens are less light-reflective than plasmas, so LCDs are suitable for use in lit rooms (i.e., museum/gallery display)

- An inherent design limitation impairs contrast and causes blacks in the image to appear gray: the entire screen is continuously backlit by a fluorescent panel
- also due to fluorescent backlighting, colors may appear excessively bright
- narrower viewing angle (image is distorted or not visible from top, bottom and sides)

LCD-LED displays improve on LCD technology. Instead of fluorescent backlighting, these sets use LED (light-emitting diode) backlighting. The benefit of LED backlighting is "local dimming": the ability to dim or completely shut off the backlight behind darker areas of the screen.

- Like LCDs, they weigh less and are more energy-efficient than plasmas
- Like plasmas, they deliver excellent contrast: bright whites and deep blacks

- higher cost


Resources
"A Primer on Codecs for Moving Image and Sound Archives: 10 Recommendations for Codec Selection and Management" by Chris Lacinak, Audiovisual Preservation Solutions

"Digital Tape Preservation Strategy: Preserving Data or Video?" by David Rice and Chris Lacinak, 2009

Digital Video and HDTV by Charles Poynton, San Francisco: Morgan Kaufmann Publishers, 2003.

How Video Works by Marcus Weise and Diana Weynand, 2nd Edition, New York: Elsevier, 2007.


(c) 2010 Leah Churner/Electronic Arts Intermix