Only ten years ago, the big challenges experienced by preservation people at ALA conferences were 1) finding out beforehand which days the preservation committee meetings were scheduled for, 2) getting from one hotel to the next in the convention town and finding the right room, and 3) hearing what was said, even in some of the smallest committees and discussion groups, because 10 or 20 unrelated groups would too often be scheduled to meet in the same large room, driving up the decibel level to that of a large cocktail party. Now we can receive a complete schedule of meetings by e-mail, in time to make airline reservations; our meetings are not so widely scattered; and we do not have to share meeting rooms.
The big challenge today is partly a consequence of success: scheduling conflicts. More people are coming, and they keep setting up new committees and discussion groups. The preservation meetings went from 9:30 am June 26 to 4 pm June 29, without a break except for eating and sleeping. Each of the last three days had three or four conflicting preservation meetings in the afternoon, and the first two days had evening meetings from 8 to 10 pm. We had an embarrassment of riches, hard to complain about, but as a result this report will not be as complete as it might have been. Still, the major developments and trends were usually discussed in more than one meeting, and this redundancy helps fill in the gaps. As usual, this report is written from personal notes taken at the time. Any errors can be corrected in a later issue, if readers will point them out.
Although the economy is still slow, with library funding being cut at every level, the main topics of discussion were not budgets and staff cuts, but large-scale and long-term matters: setting national preservation policy, microfilm issues, regional cooperative preservation, and various large projects involving computers and electronic media.
The "Chicago Conference" was reported or referred to in three meetings, because of its potential for influencing the direction of preservation in the U.S. It was cosponsored by the Association of Research Libraries (ARL) and the University of Chicago, to "provide a forum for research libraries to significantly advance planning for a comprehensive preservation program" that would be cooperative and coordinated. Management teams (library directors, heads of collection development, and preservation officers) from 16 large ARL libraries, those with "mature preservation programs," were invited to participate along with representatives from organizations with a strong commitment to preservation.
Approximately 60 people in all met on May 27-29 to identify the most important issues in the preservation of research library materials. The participants met in small and large groups to articulate the issues and work toward consensus. Some of those present reported feeling frustrated or impatient at what seemed to be slow progress or a lack of definite conclusions. However, the issues were large and complex, and the frustration may itself be a sign that the discussants were on the right track: if the chosen task were something that could easily be done through established channels, it would probably be trivial.
Issues discussed included preservation vs. access, local vs. national efforts, research and development, government and national awareness, cost, and mechanisms for coordination. A task force accountable to conference participants was created to move forward the preservation agenda developed at the Conference. It will report on its progress in October 1992. The University of Chicago Library will publish the Conference proceedings and distribute them widely.
A separate task force on the preservation of science was upgraded to a subcommittee in the Preservation Management Committee (the action arm of the Preservation Administrators Discussion Group), since its work will continue for the foreseeable future. Jennifer Banks reported that the task force is looking for advocates of preservation within the field of science to work with: credible and informed spokespeople. (She would appreciate any suggestions; call her at 617/253-5664.)
Despite current problems with deteriorating acetate bases and mirroring, properly produced and stored microfilm is still the preservation master of choice. At the Electronic Preservation conference June 3-4, sponsored by the Wisconsin Preservation Program (WISPPR), two of the four speakers saw microfilm as the archival master even in systems that store information on CD-ROMs or magnetic tape. (The CD-ROM would be the "positive copy.") This is still in the future, though, because it is not yet economically feasible to copy the microfilm at 1200 dots per inch, a level of resolution considered necessary to prevent loss of data.
Lee Jones reported that the Micrographic Preservation Service (MAPS) applies the sulfiding treatment on request to any film they process, for added stability, and marks the label to show this has been done. They are considering whether to put a note to this effect into the database too. They agree with the Image Permanence Institute that the treatment should be required on at least all master negatives.
MAPS has patented an exposing system that can be built into computer-controlled cameras, such as the H&Ks that they use, and allows an operator to achieve very tight density ranges within a reel of film regardless of the quality of the original image. The National Library of Scotland and possibly the German National Library will want to license the system after it is made available. MAPS is also helping the Deutsche Bücherei set up a microfilm lab, and will train personnel for 8-12 weeks at the MAPS lab this year. (Since ALA, MAPS has ordered more of the computer-controlled H&K cameras, which will bring the total to 13 by the end of the summer. It can now arrange the purchase of H&K cameras for any organization that must deal with a US source. Often government agencies, both national and state, are precluded from foreign purchase without tremendous red tape.)
At the ALA Midwinter meeting in January, Pat Battin of CPA had raised the question whether quality control and productivity of contract microfilming and other contracted services were adequate. At the meeting in June, the availability of stored microfilm, and its quality, were also questioned because one library had tried to buy a number of films and found that some were defective or missing. No decision has been made yet to look into this. [A survey to determine the extent of the problem would be quite a large project because of the number of variables that would have to be included. But it would be a good idea to check whether the ongoing national microfilming program has had the planned result: permanent masters, safely stored and easily retrievable for the purpose of making service copies on demand. -Ed.]
The way books are selected for microfilming is a matter of continuing debate. If use and condition of the individual book are selection criteria, then books have to be selected on an individual basis, which is seen as too expensive by some. If they are not, entire collections can be targeted for filming, which makes selection easier and keeps the collection all in the same format, but results in the filming of some books that are not brittle or that will never be looked at again. [A detailed description of the costs of selecting individual books for filming of a certain collection at Columbia is in "A Cost Model for Preservation: The Columbia University Libraries' Approach," by Carolyn Harris, Carol Mandel and Robert Wolven, in LRTS v. 35 #1, pp. 33-54. The processes described include more than selection for microfilming, though. They also involve the actual microfilming, cleaning, inventory, replacement of missing volumes, mending, binding, boxing, withdrawal, and photocopying, with cataloging and other processing done as needed. Half of the books in the collection on which this study was done did not need anything done to them, and only 12% needed microfilming. The collection had only 155,380 books in it, but the total cost, including the time and costs of everyone involved in the project, was $1,623,874. About 40% of that total was for microfilming, which came to about $60/volume.]
George Farr reported that NEH had received 214 grant applications in the last cycle from libraries and consortia, of which 54 were funded, for a total of $18 million. (If you add in museum libraries, the total goes up to 62.) Thirteen of these, according to a handout that lists and describes the June 1992 grants, were for over a half million dollars, but only three of the 13 were for microfilming projects. (The rest were for education, environmental control, rehousing, cataloging, etc.) An even half million volumes will have been preserved by microfilming, with NEH aid, by the end of this cycle.
The program on preservation of microfilm is reported separately in this issue by Carol Unger.
The "LaGuardia Eight" project for digital imaging was referred to in a couple of committee meetings, but was not described in any detail. It is named for the airport at which technical and administrative representatives of eight research libraries met to plan sharing of protocols for digital preservation. The group, which is collaborating on an informal basis, includes the libraries at Cornell, Harvard, Yale, Tennessee, Pennsylvania State, Princeton, Southern California, and Stanford. They are still exploring organizational questions; for more information contact the CPA.
A task force of the Preservation Management Committee did a little survey of the software used to manage budgets in preservation departments. Six out of ten respondents used software, all commercially produced, and of this number, three used Lotus 1-2-3. The others used Symphony, Quattro or R:base. They used it not only for managing the budget, but for supplies, binding statistics, student wages, grants, work statistics and inventory, among other things.
"Calipr" (for California Preservation) is the name of an automated tool to assess preservation needs of book and document collections for institutional or statewide planning. Barclay Ogden described it in the Preservation Administrators Discussion Group, saying it may supersede the condition survey. It was developed and tested in manual form in 1989 at Berkeley, and after further development, different versions were programmed by both the University of California and RLG. The RLG version will be used by the New York State program in October. Debra McKern took Calipr with her to Egypt when she went there on her consulting trip last year and was very pleased with its flexibility. The University of California version is available now for $32.50 from the California State Library Foundation in Sacramento.
Calipr runs on a PC. It takes into account, for each item in the sample, its condition, exposure (likelihood of damage from frequent use or poor storage conditions), and value (importance to institutional programs, uniqueness or rarity, inclusion in a comprehensive collection, and/or value as an artifact). Condition, exposure and value together determine priority for preservation. Reports can be produced on the whole sample, or only high-use materials, or most valuable materials, or high-use, high-value materials. Forty-three California libraries have used it. Nonpreservation people administered it in all cases. Barclay Ogden said they learned that preservation is not a bottomless pit, and they felt when it was done that they had an idea where to begin. Their survey results have been pooled to help make the case for state funding for preservation.
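The scoring model described above lends itself to a simple sketch. The article does not give Calipr's actual scoring rules or weights, so everything below (the 1-3 scales, the additive score, the field and function names) is a hypothetical illustration of how condition, exposure, and value could combine into a preservation priority, with report filters like the ones the article mentions (whole sample, high-use, high-value, or both).

```python
# Hypothetical sketch of a Calipr-style priority calculation.
# The real tool's scoring rules are not described in the article;
# only the three factors and the report filters are.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    condition: int  # 1 (good) .. 3 (poor)
    exposure: int   # 1 (low risk) .. 3 (heavy use / poor storage)
    value: int      # 1 (routine) .. 3 (unique, rare, or artifactual)

def priority(item: Item) -> int:
    # Simple additive score; the real tool may weight factors differently.
    return item.condition + item.exposure + item.value

def report(sample, min_exposure=1, min_value=1):
    # Filter the sample the way the article describes the reports:
    # whole sample, high-use only, high-value only, or high-use high-value.
    hits = [i for i in sample if i.exposure >= min_exposure and i.value >= min_value]
    return sorted(hits, key=priority, reverse=True)

sample = [
    Item("County atlas, 1887", condition=3, exposure=2, value=3),
    Item("Paperback reprint", condition=2, exposure=3, value=1),
    Item("Recent monograph", condition=1, exposure=1, value=1),
]

# "Most valuable materials" report: only items with value >= 2.
for item in report(sample, min_value=2):
    print(item.title, priority(item))
```

Keeping the score additive makes pooling results across libraries straightforward, which matches the statewide use the article describes; a real instrument would need to define each scale far more carefully.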