Conversation with Sarah Ziebell
Robert Wilson Audio/Visual Collection
Sarah Ziebell Mann

Between 2004 and 2006, the Theatre on Film and Tape Archive at The New York Public Library for the Performing Arts preserved the audiovisual materials in the archives of performance artist, director, and designer Robert M. Wilson. The Project Coordinator was Sarah Ziebell Mann. This interview was conducted via e-mail in January 2006. Photo: Einstein on the Beach by Robert Wilson & Philip Glass, 1976, by Fulvio Roiter. Courtesy of the Byrd Hoffman Water Mill Foundation

Jeff Martin: Can you give me a sense of the scale of the project: how many individual tapes, how many titles, and how many different formats?

Sarah Ziebell: The Robert Wilson Audio/Visual Collection includes performance, rehearsal, and interview recordings and video art pieces representing Wilson's forty-year career. These workshop, rehearsal, performance, and reference recordings were Wilson's personal archives, to which he referred back time and again over the years when creating new pieces.

The Robert Wilson Audio/Visual Collection is headquartered in the Theatre on Film and Tape Archive (TOFT) of The New York Public Library for the Performing Arts. TOFT acquired the collection from the Byrd Hoffman Water Mill Foundation in 2001. Included with the acquisition was a donation of over $1 million to enable its preservation and cataloging over a period of two years.

There are approximately 1,500 items in the collection.

  • 100 films on 8mm, Super-8, and 16mm
  • 330 audio recordings on 1/4" and 1/8" tape
  • 1,100 video recordings on 2", 1", 3/4", and 1/2" open reel, Beta SP, DigiBeta, Betamax, Mini DV, D2, DVCAM, 8mm, Hi-8, VHS, S-VHS, and VHS-C [NTSC, PAL, and SECAM]

Some 108 separate works are documented in the Robert Wilson Audio/Visual Collection. Some works have one item related to them; others have nearly 200. The collection is now fully preserved and cataloged and is available for viewing at The New York Public Library for the Performing Arts. For information on Robert Wilson titles, see http://catnyp.nypl.org/ (do a word search on "Robert Wilson Audio/Visual Collection" to limit the search to just those items).

There are a couple of important points I would like to make about working with a collection of this type. Due to its inherent material, structural, and intellectual complexities, multimedia art is often most in danger of becoming completely inaccessible. Robert Wilson's work has been executed primarily on the stage, and performance is, by nature, ephemeral. Moving-image and sound media have been employed as devices for harnessing his creations, and those devices may be rudimentary, fragmentary, and, often, seriously deteriorated. In preserving Wilson's work, we were confronted not only with the ephemerality of performance, but also with the fleeting materiality of audio/visual media itself. Further, there is a peculiar problem inherent in performance documentation: the main "work" is embedded in the performance, not in the performance recording. The performance documentarian's hand is intentionally more anonymous than is the case with cinema or video. This presents significant challenges for the archivist or curator working with this type of material.

We encountered significant preservation problems with this collection. Many of the recordings were on formats that are unplayable at our archives. Even worse, a good portion was so deteriorated as to be completely unviewable. In the end, after consulting with multiple labs and following every known video restoration strategy (established and experimental!), we determined that 23 videos were beyond rescue. For all of these reasons, the Robert Wilson Audio/Visual Collection proved to be an extremely challenging one to process.

JM: How was the project staffed?

SZ: I was the Project Coordinator; we also had a Cataloger, Laura Jenemann; a Library Technical Assistant, Christopher Depp; and several volunteers and interns who worked on the project.

JM: How much information did you have about the tapes when the project began?

SZ: The donor's foundation, the Byrd Hoffman Water Mill Foundation (BHF), has its own archives. BHF supplied us with an export from their FileMaker collection management database, which we migrated into a Lotus Approach database.

Generally, information supplied by BHF included production title, working title for the item, date, venue, format, recording system, and running time. For some items, the information was minimal, and for others, more complete.

We were able to get some additional information, although not much, from the labels that had been created by BHF. They sometimes contained title information, dates, venues, and, for the audio recordings, some playback specifications.

JM: Did you work with an extant numbering system or create a new one?

SZ: BHF had two numbering systems that it had used over the years, which we retained throughout the processing of the collection. We found that sometimes these numbers had meaning (they helped us to determine the sequencing of recordings), but other times they were meaningless. When we reached the portion of the project when we were cataloging the items, we assigned NYPL class marks (unique location numbers for each object), and those became the dominant numbering system. The old numbering system was still retained and embedded in the catalog records so that the items can still be searched that way.

JM: What type of database were you working with? Was it NYPL's?

SZ: Our department uses Lotus Approach, a simple and somewhat outmoded database program. Prior to my arrival at the Library, the Wilson data was migrated to Approach, and that database ended up being the one we used throughout the processing project. The database served several roles for us: most important, it enabled us to group all items for a given production; it allowed us to track the processing status of each item (sent out to a given lab for preservation, scheduled to be remastered in-house, preservation completed, cataloging completed, etc.); it allowed us to track the exact location of each item; and it enabled the easy keeping of processing statistics. The NYPL will continue to maintain the Approach database beyond the completion of the project as an internal resource. The NYPL's catalog will serve as the public database for the collection. Here is a simple record from the NYPL catalog:

[Image: sample NYPL catalog record]
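The roles that database played can be sketched in a few lines of Python. This is purely illustrative: the record fields, IDs, and status values below are hypothetical stand-ins, not the actual Lotus Approach schema.

```python
from collections import Counter, defaultdict

# Hypothetical item records; the field names, IDs, and status values
# are illustrative stand-ins, not the project's actual schema.
items = [
    {"id": "RW-0001", "production": "Einstein on the Beach", "status": "preservation completed"},
    {"id": "RW-0002", "production": "Einstein on the Beach", "status": "sent to lab"},
    {"id": "RW-0003", "production": "the CIVIL warS", "status": "cataloging completed"},
]

# Role 1: group all items for a given production.
by_production = defaultdict(list)
for item in items:
    by_production[item["production"]].append(item["id"])

# Role 2: tally processing statistics by status.
stats = Counter(item["status"] for item in items)

print(by_production["Einstein on the Beach"])  # ['RW-0001', 'RW-0002']
print(stats["sent to lab"])                    # 1
```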

JM: How did you keep track of the tapes as they moved through the process?

SZ: The Lotus Approach database helped us track the items. Additionally, for each production title, we completed a separate preservation report. This report allowed us to view all of the holdings for a given item in context and to determine duplication, sequencing, etc. This report documented our preservation decisions: to which lab an item was sent, whether an item was being duplicated in-house, and whether an item was a duplicate of something else and did not need to be preserved, for example.

JM: Why was it decided to do much of the remastering in-house?

SZ: Our archive decided to remaster and produce service copies in-house for a portion of the videos that did not have major format-obsolescence or preservation issues.

There were a couple of reasons for this decision:

  1. Special funds for remastering were available through the Robert Wilson Audio/Visual Collection.
  2. Project staff was available to learn the system and to train others.
  3. The Theatre on Film and Tape Archive is primarily a producer of its own materials. Having the ability to remaster holdings and make service copies in-house could save money in the long run.
  4. The archive also had some desire to edit its material.

As an alternative, we considered outsourcing all of the preservation to labs, but ultimately decided against this because we felt that we needed to have more control over the process, workflow, etc.

JM: What equipment did you use?

SZ: The in-house system, custom built by 1Beyond of Boston, allowed for real-time encoding of materials in a variety of formats for the purposes of creating uncompressed digital files for editing and compressed MPEG-2 files for DVD service copies. Simultaneous with the encoding was the transfer of the recording to Beta SP or DigiBeta in order to create a preservation master.

The system consisted of a series of decks for video playback; a media converter and two computers for digitization, editing, and DVD authoring; and a DVD printer. The video decks included the following formats: 3/4" tri-standard, VHS tri-standard, Beta SP NTSC and PAL, DigiBeta, Mini-DV, and DVCAM. The hardware consisted of a Convergent SD-Connect media converter, two standard-definition 2GB workstations with dual 3.06 GHz Xeon processors, and a Primera Bravo II DVD printer. We used Canopus DV Storm 2+ Pro software for video capture; Adobe Premiere Pro, Adobe Audition, MPEGCraft, and EDIUS software for editing; and Adobe Encore software for DVD authoring. Using two networked computers allowed us to increase the volume of material processed by using one computer for digitization while the other handled editing and DVD authoring. This worked very well for us.

JM: What formats did you decide to use for creating masters and viewing copies?

SZ: All films were preserved on film. They were also transferred to Beta SP, from which DVD service copies were produced. All audio recordings were preserved as uncompressed Broadcast WAV files. Service copies were created on CDs. The audio work was completed at outside labs. All analog videos were preserved on Beta SP, and all digital videos on DigiBeta. Service copies were made on DVDs.

We considered doing all video preservation on DigiBeta, but our department only has one DigiBeta deck (which is included in the video remastering system). We felt that we would need additional decks if we wanted to use this as a primary preservation medium, and the funds to purchase them were not available at that time.

JM: What was the approximate percentage of the tapes done in-house and those sent out? How did you determine which ones would be done in-house?

SZ: When we created the preservation reports, we analyzed the quality of each recording, classifying its condition according to the following scale:

  • Excellent (Image and sound quality is unimpeachable. Candidate for in-house dupe.)
  • Good (Image and sound quality is acceptable. Candidate for in-house dupe.)
  • Fair (Image and sound are visible/audible, but the quality is not acceptable. Candidate for outside lab work.)
  • Poor (Image and/or sound is unplayable because of condition. Absolutely needs outside lab work.)
  • Format unplayable (Condition cannot be adequately assessed because item cannot be viewed/listened to. Absolutely needs lab work.)
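In code form, the routing this condition scale implies might look like the following sketch (hypothetical Python, not the project's actual tooling):

```python
# Mapping from the condition scale described in the interview to a
# preservation route. The scale is from the interview; the function
# itself is an illustrative sketch.
ROUTES = {
    "excellent": "in-house dupe",
    "good": "in-house dupe",
    "fair": "outside lab",
    "poor": "outside lab",
    "format unplayable": "outside lab",
}

def route(condition: str) -> str:
    """Return the preservation route for a condition rating."""
    return ROUTES[condition.lower()]

print(route("Good"))  # in-house dupe
print(route("Poor"))  # outside lab
```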

We ended up remastering and producing DVDs for over sixty hours' worth of materials in-house; the rest required outside lab work. Because of strict deadlines tied to the project's funding and the high level of research demand for the materials, the preservation timeline was compressed. Had there been more time, we could have processed more materials in-house.

JM: Describe the typical process, if there was such a thing, for reformatting a tape.

SZ: For each remastering project, we completed a simple worksheet, previewing the source tapes, confirming their running times, and assessing their overall appearance. We chose the appropriate encoding rate for each project by consulting a DVD Bit Budgeting chart provided to us by our system's vendor, 1Beyond. For a tape with a relatively good appearance, we encoded at 7 million bits per second; for a tape with a poor appearance, we encoded at 8.5 million bits per second. Here is a copy of the DVD Bit Budgeting chart we used:

[Image: DVD Bit Budgeting chart]
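As a rough cross-check on such a chart, the arithmetic behind DVD bit budgeting can be sketched as follows. The 4.7 GB single-layer capacity is a standard figure; this simplified calculation folds audio into the total rate and ignores filesystem and menu overhead, so real budgets are somewhat tighter.

```python
DVD_CAPACITY_BYTES = 4.7e9  # single-layer DVD-5, decimal gigabytes

def max_minutes(encode_rate_bps: float) -> float:
    """Approximate running time that fits on one disc at a given total
    encoding rate (video + audio), ignoring disc overhead."""
    seconds = DVD_CAPACITY_BYTES * 8 / encode_rate_bps
    return seconds / 60

print(round(max_minutes(7_000_000)))   # 90  (about 90 minutes at 7 Mbit/s)
print(round(max_minutes(8_500_000)))   # 74  (about 74 minutes at 8.5 Mbit/s)
```

This is why the higher 8.5 Mbit/s rate, reserved for poor-looking tapes, meaningfully reduces how much material fits on each disc.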

If, due to their length, the source tapes had to be split up among several Beta SP/DigiBeta tapes (for preservation masters) and DVDs (for service copies), we previewed each to determine an appropriate breaking point near the maximum capacity of the destination media. In order to maintain consistent image and sound quality, we used the same encoding rate across an entire remastering project. With recordings that extended across more than one tape/DVD, we allowed for a one-minute overlap with the previous recording and built this overlap into our Beta SP and DVD stock and running-time estimates.
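The splitting-with-overlap scheme just described can be sketched as a small function. This is an illustration of the arithmetic only, not the archive's actual tool:

```python
def segment(total_min: float, capacity_min: float, overlap_min: float = 1.0):
    """Split a long recording into (start, end) segments, each at most
    capacity_min long, with each new segment starting overlap_min
    before the previous one ended. All times are in minutes."""
    segments = []
    start = 0.0
    while True:
        end = min(start + capacity_min, total_min)
        segments.append((start, end))
        if end >= total_min:
            return segments
        start = end - overlap_min  # back up to create the overlap

# A 160-minute recording split across ~89-minute destination media:
print(segment(160, 89))  # [(0.0, 89.0), (88.0, 160)]
```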

After completing the worksheet, we configured the media converter and computers for the appropriate capture settings and set up the source and destination video decks, checking to make sure that the video signal was coming through properly and making any necessary time-base correction adjustments. We then played the source recording, and it was simultaneously encoded as an MPEG-2 file in preparation for creating a DVD and copied onto Beta SP/DigiBeta to make the preservation master. Later, we did any necessary editing of the digital file (usually, none was required) and then authored our DVD. We created a copy of the raw MPEG-2 file on DVD which could be used later, if needed, to produce another DVD without having to re-digitize the recording. Finally, we completed quality-control checking of both the Beta SP/DigiBeta preservation masters and DVDs to make sure there were no recording errors.

JM: What were some of the most common problems you encountered? How did you solve them?

SZ: The most common problems were dropouts and other kinds of tape noise, which we analyzed using the system's vectorscope and waveform monitor and corrected using the time base corrector. When we encountered a loss of image clarity or detail, we chose a higher encoding rate. It was very important to do quality control of our work, especially the DVDs, as sometimes the authoring process was faulty.

JM: What was the average time to reformat a tape-or, what was the ratio of tape hours to work hours?

SZ: Approximately 2 1/2 hours of work for every hour of source material.

JM: How many different outside vendors did you end up working with? How did you choose them?

SZ: Given the format diversity and level of deterioration we had in this collection, we found that we had to employ an à-la-carte model of lab selection, choosing several vendors, each for a particular strength. Some items had to be sent out to multiple labs before they could be adequately preserved. Not including sub-contractors, we used eight audio/visual labs throughout the project.

JM: Did you encounter any problems?

SZ: Overall, our experience with vendors was excellent. The problems we had were minor in comparison to the wonders performed by the labs on our very distressed materials. Here are two issues that came up:

  • Sometimes it seemed that labs did not perform adequate quality control of their own work before returning it to us. This meant that we had to invest a great deal of time doing quality control on our end and occasionally had to send items back to labs to be redone (mostly DVDs).
  • Flawed DVD authoring processes. Even though DVDs are preferable to many tape stocks as a service-copy format because of their size and navigational features, there are still huge disparities in quality across labs. These disparities relate a great deal to the authoring workflow and software. Common problems included digital artifacts, rolling bars, incorrect chaptering, DVDs simply not playing in our machines, etc.

JM: Are there things you would have done differently in the project? What lessons did you learn that might be valuable for others?

SZ: I sometimes wonder if the decision to remaster material in-house was the best fit for this collection. The project had tremendous time constraints, and implementing a new system and training staff to use it took quite some time. However, it is definitely true that we learned a great deal through doing remastering in-house, which equipped us to evaluate the work of the outside labs that we hired. As for lessons learned:

  • Before a collection can be preserved or cataloged, it must be inventoried as completely as possible. Inventory should proceed on a work-by-work basis, ordered by each work's importance weighed against the urgency of its preservation needs.
  • Quality control: Upon completion of duplication, we checked the image and sound quality of each preservation master and service copy by spot-checking the beginning, middle, and end of each item. When in doubt about a quality matter, we compared a duplicate to the archival original.
  • Don't give up when you're told something is unsavable. Always get a second opinion.
  • Everyone on a project needs to have an understanding of everyone else's job. A cataloger needs to be able to inventory, do quality control, and understand the remastering process. A technical assistant needs to understand what goes into a catalog record, etc. Cross-training is critical.

JM: What would you say to someone considering an in-house reformatting project like this one?

SZ: You need to have a bulk of materials that are on similar formats in good condition to make it worthwhile. Expect installation and training to be time-consuming. Form a good relationship with your vendor. Think long and hard about whether this solution really fits your collection. Make sure you do quality control of your own material! Keep good records about what you do to an item and record statistics so that you can gauge your progress. Realize that equipment will continue to be migrated; build those costs into your budget.

© 2006-2009 | Independent Media Arts Preservation, Inc.