Library of Congress Workshop on Etexts - Part 7

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ LARSEN * Issues of scalability and modularity * Geometric growth of the Internet and the role played by layering * Basic functions sustaining this growth * A library's roles and functions in a network environment * Effects of implementation of the Z39.50 protocol for information retrieval on the library system * The trade-off between volumes of data and its potential usage * A snapshot of current trends *

Ronald LARSEN, associate director for information technology, University of Maryland at College Park, first addressed the issues of scalability and modularity. He noted the difficulty of anticipating the effects of orders-of-magnitude growth, reflecting on the twenty years of experience with the Arpanet and Internet. Recalling the day's demonstrations of CD-ROM and optical disk material, he went on to ask whether the field has yet learned how to scale new systems to enable delivery and dissemination across large-scale networks.

LARSEN focused on the geometric growth of the Internet from its inception circa 1969 to the present, and the adjustments required to respond to that rapid growth. To illustrate the issue of scalability, LARSEN considered computer networks as including three generic components: computers, network communication nodes, and communication media. Each component scales (e.g., computers range from PCs to supercomputers; network nodes scale from interface cards in a PC through sophisticated routers and gateways; and communication media range from 2,400-baud dial-up facilities through 4.5-Mbps backbone links, and eventually to multigigabit-per-second communication lines), and architecturally, the components are organized to scale hierarchically from local area networks to international-scale networks. Such growth is made possible by building layers of communication protocols, as BESSER pointed out.

By layering both physically and logically, a sense of scalability is maintained from local area networks in offices, across campuses, through bridges, routers, campus backbones, fiber-optic links, etc., up into regional networks and ultimately into national and international networks.

LARSEN then illustrated the geometric growth, over a two-year period through September 1991, of the number of networks that comprise the Internet. This growth has been sustained largely by the availability of three basic functions: electronic mail, file transfer (ftp), and remote log-on (telnet). LARSEN also reviewed the growth in the kind of traffic that occurs on the network. Network traffic reflects the joint contributions of a larger population of users and increasing use per user. Today one sees serious applications involving moving images across the network--a rarity ten years ago. LARSEN recalled and concurred with BESSER's main point that the interesting problems occur at the application level.

LARSEN then illustrated a model of a library's roles and functions in a network environment. He noted, in particular, the placement of on-line catalogues onto the network, with patrons obtaining access to the library increasingly through local networks, campus networks, and the Internet.

LARSEN supported LYNCH's earlier suggestion that we need to address fundamental questions of networked information in order to build environments that scale in the information sense as well as in the physical sense.

LARSEN supported the role of the library system as the access point into the nation's electronic collections. Implementation of the Z39.50 protocol for information retrieval would make such access practical and feasible. For example, this would enable patrons in Maryland to search California libraries, or other libraries around the world that conform to Z39.50, in a manner that is familiar to University of Maryland patrons. This client-server model also supports moving beyond secondary content into primary content. (The notion of how one links from secondary content to primary content, LARSEN said, represents a fundamental problem that requires rigorous thought.) After noting numerous network experiments in accessing full-text materials, including projects supporting the ordering of materials across the network, LARSEN revisited the issue of transmitting high-density, high-resolution color images across the network and the large amounts of bandwidth they require. He went on to address the bandwidth and synchronization problems inherent in sending full-motion video across the network.
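The value LARSEN attributes to Z39.50 is the uniform client-server search interface: any conformant server answers the same kind of request, so a patron can query a remote catalogue with familiar semantics. The following is a minimal in-process sketch of that idea only; the class names, catalogue contents, and record format are illustrative assumptions, and no actual Z39.50 message encoding is attempted.

```python
class Z3950StyleServer:
    """Stand-in for a library catalogue that speaks a common search protocol."""
    def __init__(self, name, records):
        self.name = name
        self._records = records  # hypothetical catalogue: title -> record id

    def search(self, query):
        # Every conformant server answers the same request the same way.
        return [rec for title, rec in self._records.items()
                if query.lower() in title.lower()]

def federated_search(query, servers):
    """One client interface, many catalogues: the patron's view never changes."""
    hits = []
    for server in servers:
        hits.extend(server.search(query))
    return hits

# Two catalogues on opposite coasts, searched through one familiar interface.
maryland = Z3950StyleServer("UMD", {"Steam Turbines": "UMD-001"})
california = Z3950StyleServer("UC", {"Turbine Design": "UC-733"})
print(federated_search("turbine", [maryland, california]))
```

The design point is the indirection: the client depends only on the shared search interface, not on how any individual library stores its catalogue.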

LARSEN illustrated the trade-off between volumes of data, in bytes or orders of magnitude, and the potential usage of those data. He discussed transmission rates (particularly, the time it takes to move various forms of information) and what one could do with a network supporting multigigabit-per-second transmission. At the moment, the network environment includes a composite of data-transmission requirements, volumes, and forms, ranging from steady to bursty (high-volume) and from very slow to very fast. This aggregate must be considered in the design, construction, and operation of multigigabit networks.
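The trade-off LARSEN describes can be made concrete with back-of-the-envelope transfer times. The line rates below are the ones mentioned in his scalability discussion (2,400-baud dial-up, a 4.5-Mbps backbone, a gigabit-class future link); the object sizes are illustrative assumptions, not figures from the talk.

```python
def transfer_seconds(size_bytes, bits_per_second):
    """Idealized transfer time: payload bits divided by line rate."""
    return size_bytes * 8 / bits_per_second

# Assumed sizes: a plain-text book versus one high-resolution color image.
objects = {"300-page text (1 MB)": 1_000_000,
           "one color image (30 MB)": 30_000_000}
rates = {"2,400 bps dial-up": 2_400,
         "4.5 Mbps backbone": 4_500_000,
         "1 Gbps future link": 1_000_000_000}

for obj, size in objects.items():
    for link, bps in rates.items():
        print(f"{obj} over {link}: {transfer_seconds(size, bps):,.2f} s")
```

Even this crude model shows the orders-of-magnitude spread: the same image that ties up a dial-up line for hours moves in a fraction of a second on a gigabit link, which is why bursty image and video traffic dominates network design.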

LARSEN's objective is to use the networks and library systems now being constructed to increase access to resources wherever they exist, and thus, to evolve toward an on-line electronic virtual library.

LARSEN concluded by offering a snapshot of current trends: continuing geometric growth in network capacity and number of users; slower development of applications; and glacial development and adoption of standards. The challenge is to design and develop each new application system with network access and scalability in mind.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ BROWNRIGG * Access to the Internet cannot be taken for granted * Packet radio and the development of MELVYL in 1980-81 in the Division of Library Automation at the University of California * Design criteria for packet radio * A demonstration project in San Diego and future plans * Spread spectrum * Frequencies at which the radios will run and plans to reimplement the WAIS server software in the public domain * Need for an infrastructure of radios that do not move around *

Edwin BROWNRIGG, executive director, Memex Research Institute, first polled the audience in order to seek out regular users of the Internet as well as those planning to use it some time in the future. With nearly everybody in the room falling into one category or the other, BROWNRIGG made a point regarding access, namely that numerous individuals, especially those who use the Internet every day, take for granted their access to it, the speeds with which they are connected, and how well it all works.

However, as BROWNRIGG discovered between 1987 and 1989 in Australia, if one wants access to the Internet but cannot afford it or has some physical boundary that prevents her or him from gaining access, it can be extremely frustrating. He suggested that because of economics and physical barriers we were beginning to create a world of haves and have-nots in the process of scholarly communication, even in the United States.

BROWNRIGG detailed the development of MELVYL in academic year 1980-81 in the Division of Library Automation at the University of California, in order to underscore the issue of access to the system, which at the outset was extremely limited. In short, the project needed to build a network, which at that time entailed use of satellite technology, that is, putting earth stations on campus and also acquiring some terrestrial links from the State of California's microwave system. The installation of satellite links, however, did not solve the problem (which actually formed part of a larger problem involving politics and financial resources).

For while the project team could get a signal onto a campus, it had no means of distributing the signal throughout the campus. The solution involved adopting a recent development in wireless communication called packet radio, which combined the basic notion of packet-switching with radio. The project used this technology to get the signal from a point on campus where it came down, an earth station for example, into the libraries, because it found that wiring the libraries, especially the older marble buildings, would cost $2,000-$5,000 per terminal.

BROWNRIGG noted that, ten years ago, the project had neither the public policy nor the technology that would have allowed it to use packet radio in any meaningful way. Since then much had changed. He proceeded to detail research and development of the technology, how it is being deployed in California, and what direction he thought it would take.

The design criteria are to produce a high-speed, one-time, low-cost, high-quality, secure, license-free device (packet radio) that one can plug in and play today, forget about it, and have access to the Internet.

By high speed, BROWNRIGG meant 1 megabit and 1.5 megabits per second. Those units have been built, he continued, and are in the process of being type-certified by an independent underwriting laboratory so that they can be type-licensed by the Federal Communications Commission. As is the case with citizens band, one will be able to purchase a unit and not have to worry about applying for a license.

The basic idea, BROWNRIGG elaborated, is to take high-speed radio data transmission and create a backbone network that at certain strategic points will "gateway" into a medium-speed packet radio (i.e., one that runs at 38.4 kilobits per second), so that perhaps by 1994-1995 people like those in the audience could, for the price of a VCR, purchase a medium-speed radio for the office or home, have full network connectivity to the Internet, and partake of all its services, with no need for an FCC license and no regular bill from the local common carrier. BROWNRIGG presented several details of a demonstration project currently taking place in San Diego and described plans, pending funding, to install a full-bore network in the San Francisco area. This network will have 600 nodes running at backbone speeds, and 100 of these nodes will be libraries, which in turn will be the gateway ports to the 38.4-kbps radios that will give coverage for the neighborhoods surrounding the libraries.

BROWNRIGG next explained Part 15.247, a new rule within Title 47 of the Code of Federal Regulations enacted by the FCC in 1985. This rule challenged the industry, which has only now risen to the occasion, to build a radio that would run at no more than one watt of output power and use a fairly exotic method of modulating the radio wave called spread spectrum. Spread spectrum in fact permits the building of networks so that numerous data communications can occur simultaneously, without interfering with each other, within the same wide radio channel.

BROWNRIGG explained that the frequencies at which the radios would run are very short wave signals, well above standard microwave and radar. With a radio wave that small, one watt becomes a tremendous punch per bit and thus makes transmission at reasonable speed possible. In order to minimize the potential for congestion, the project is undertaking to reimplement software that has long been available in the networking business and is taken for granted now, for example, TCP/IP, routing algorithms, bridges, and gateways. In addition, the project plans to take the WAIS server software in the public domain and reimplement it so that one can have a WAIS server on a Mac instead of a Unix machine. The Memex Research Institute believes that libraries, in particular, will want to use the WAIS servers with packet radio. This project, which has a team of about twelve people, will run through 1993 and will include the 100 libraries already mentioned as well as other professionals such as those in medicine, engineering, and law. Thus, the need is to create an infrastructure of radios that do not move around, which, BROWNRIGG hopes, will solve a problem not only for libraries but for individuals who, by and large today, do not have access to the Internet from their homes and offices.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ DISCUSSION * Project operating frequencies *

During a brief discussion period, which also concluded the day's proceedings, BROWNRIGG stated that the project was operating at four frequencies. The slow speed is operating at 435 megahertz, and it would later go up to 920 megahertz. At the high-speed frequencies, the 1-megabit radios will run at 2.4 gigahertz, and the 1.5-megabit radios at 5.7 gigahertz.

At 5.7 gigahertz, rain can be a factor, but it would have to be tropical rain, unlike what falls in most parts of the United States.
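BROWNRIGG's remark that "a radio wave that small" gives one watt "a tremendous punch per bit" can be checked with the basic wavelength relation λ = c/f. The frequencies below are the ones stated in the discussion; the calculation itself is a simple physics sketch, not from the talk.

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_cm(freq_hz):
    """Wavelength in centimeters for a given frequency: lambda = c / f."""
    return C / freq_hz * 100

# The project's stated operating frequencies.
for label, f in [("435 MHz", 435e6), ("920 MHz", 920e6),
                 ("2.4 GHz", 2.4e9), ("5.7 GHz", 5.7e9)]:
    print(f"{label}: {wavelength_cm(f):.1f} cm")
```

At 2.4 GHz the wavelength is about 12.5 cm and at 5.7 GHz about 5.3 cm, which is also why the 5.7-GHz band is the one where heavy rain (whose drops approach that scale) can attenuate the signal.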

SESSION IV. IMAGE CAPTURE, TEXT CAPTURE, OVERVIEW OF TEXT AND IMAGE STORAGE FORMATS

William HOOTON, vice president of operations, I-NET, moderated this session.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ KENNEY * Factors influencing development of CXP * Advantages of using digital technology versus photocopy and microfilm * A primary goal of CXP; publishing challenges * Characteristics of copies printed * Quality of samples achieved in image capture * Several factors to be considered in choosing scanning * Emphasis of CXP on timely and cost-effective production of black-and-white printed facsimiles * Results of producing microfilm from digital files * Advantages of creating microfilm * Details concerning production * Costs * Role of digital technology in library preservation *

Anne KENNEY, associate director, Department of Preservation and Conservation, Cornell University, opened her talk by observing that the Cornell Xerox Project (CXP) has been guided by the assumption that the ability to produce printed facsimiles or to replace paper with paper would be important, at least for the present generation of users and equipment. She described three factors that influenced development of the project: 1) because the project has emphasized the preservation of deteriorating brittle books, the quality of what was produced had to be sufficiently high to return a paper replacement to the shelf; 2) the system had to be cost-effective, which meant that it had to be cost-competitive with the processes currently available, principally photocopy and microfilm; and 3) CXP was only interested in using new or currently available hardware and software.

KENNEY described the advantages that using digital technology offers over both photocopy and microfilm: 1) The potential exists to create a higher quality reproduction of a deteriorating original than conventional light-lens technology allows. 2) Because a digital image is an encoded representation, it can be reproduced again and again with no resulting loss of quality, as opposed to the situation with light-lens processes, in which there is a discernible difference between a second and a subsequent generation of an image. 3) A digital image can be manipulated in a number of ways to improve image capture; for example, Xerox has developed a windowing application that enables one to capture a page containing both text and illustrations in a manner that optimizes the reproduction of both. (With light-lens technology, one must choose which to optimize, text or the illustration; in preservation microfilming, the current practice is to shoot an illustrated page twice, once to highlight the text and the second time to provide the best capture for the illustration.) 4) A digital image can also be edited, with density levels adjusted to remove underlining and stains and to increase legibility of faint documents. 5) On-screen inspection can take place at the time of initial setup and adjustments made prior to scanning, factors that substantially reduce the number of retakes required in quality control.

A primary goal of CXP has been to evaluate the paper output printed on the Xerox DocuTech, a high-speed printer that produces 600-dpi pages from scanned images at a rate of 135 pages a minute. KENNEY recounted several publishing challenges in producing faithful and legible reproductions of the originals, which the 600-dpi copy for the most part successfully met. For example, many of the deteriorating volumes in the project were heavily illustrated with fine line drawings or halftones, or came in languages such as Japanese, in which the buildup of characters composed of varying strokes is difficult to reproduce at lower resolutions; a surprising number of them came with annotations and mathematical formulas, which it was critical to be able to duplicate exactly.

KENNEY noted that 1) the copies are being printed on paper that meets the ANSI standards for performance, 2) the DocuTech printer meets the machine and toner requirements for proper adhesion of print to page, as described by the National Archives, and thus 3) the paper product is considered to be the archival equivalent of preservation photocopy.

KENNEY then discussed several samples of the quality achieved in the project that had been distributed in a handout, for example, a copy of a print-on-demand version of the 1911 Reed lecture on the steam turbine, which contains halftones, line drawings, and illustrations embedded in text; the first four loose pages in the volume compared the capture capabilities of scanning to photocopy for a standard test target, the IEEE Std 167A-1987 test chart. In all instances scanning proved superior to photocopy, though only slightly so in one.

Conceding the simplistic nature of her comparison of scanning to photocopy, KENNEY described it as one representation of the kinds of settings that could be used with the scanning capabilities of the equipment CXP uses. KENNEY also pointed out that CXP investigated the quality achieved with binary scanning only, and noted the great promise in gray-scale and color scanning, whose advantages and disadvantages need to be examined. She argued further that scanning resolutions and file formats can represent a complex trade-off among capture time, file size, fidelity to the original, on-screen display, printing, and equipment availability. All these factors must be taken into consideration.

CXP placed primary emphasis on the production, in a timely and cost-effective manner, of printed facsimiles that consisted largely of black-and-white text. With binary scanning, large files may be compressed efficiently and in a lossless manner (i.e., no data is lost in the process of compressing [and decompressing] an image--the exact bit-representation is maintained) using Group 4 CCITT compression (CCITT being the French acronym for the International Telegraph and Telephone Consultative Committee). CXP was getting compression ratios of about forty to one. Gray-scale compression, which primarily uses JPEG, is much less economical and can represent a lossy compression (i.e., not lossless), so that as one compresses and decompresses, the illustration is subtly changed. While binary files produce a high-quality printed version, it appears 1) that other combinations of spatial resolution with gray and/or color hold great promise as well, and 2) that gray scale can represent a tremendous advantage for on-screen viewing. The quality associated with binary and gray scale also depends on the equipment used.
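The forty-to-one figure can be put in perspective with rough size arithmetic for 600-dpi binary (one bit per pixel) page images. The 600-dpi resolution, the binary encoding, and the roughly 40:1 Group 4 ratio come from the talk; the 8.5 x 11 inch page dimensions and the 300-page volume are illustrative assumptions.

```python
def raw_binary_bytes(width_in, height_in, dpi):
    """Uncompressed size of a 1-bit-per-pixel scan at a given resolution."""
    pixels = (width_in * dpi) * (height_in * dpi)
    return pixels / 8  # 8 pixels per byte at 1 bit each

raw = raw_binary_bytes(8.5, 11, 600)   # one uncompressed page, ~4.2 MB
compressed = raw / 40                  # after ~40:1 Group 4 compression
book = compressed * 300                # an assumed 300-page volume
print(f"page raw: {raw/1e6:.1f} MB, compressed: {compressed/1e3:.0f} KB, "
      f"book: {book/1e6:.1f} MB")
```

A page drops from roughly 4.2 MB to about 105 KB, so a whole 300-page book stays near 30 MB, which is what made storing and retransmitting scanned volumes practical on early-1990s storage and networks.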

For instance, binary scanning produces a much better copy on a binary printer.

Among CXP's findings concerning the production of microfilm from digital files, KENNEY reported that the digital files for the same Reed lecture were used to produce sample film using an electron beam recorder. The resulting film was faithful to the image capture of the digital files, and while CXP felt that the text and image pages represented in the Reed lecture were superior to those of the light-lens film, the resolution readings for the 600-dpi version were not as high as standard microfilming.

KENNEY argued that the standards defined for light-lens technology are not totally transferable to a digital environment. Moreover, they are based on definition of quality for a preservation copy. Although making this case will prove to be a long, uphill struggle, CXP plans to continue to investigate the issue over the course of the next year.

KENNEY concluded this portion of her talk with a discussion of the advantages of creating film: it can serve as a primary backup and as a preservation master to the digital file; it could then become the print or production master and service copies could be paper, film, optical disks, magnetic media, or on-screen display.

Finally, KENNEY presented details regarding production:

* Development and testing of a moderately-high resolution production scanning workstation represented a third goal of CXP; to date, 1,000 volumes have been scanned, or about 300,000 images.

* The resulting digital files are stored and used to produce hard-copy replacements for the originals and additional prints on demand; although the initial costs are high, scanning technology offers an affordable means for reformatting brittle material.

* A technician in production mode can scan 300 pages per hour when performing single-sheet scanning, which is a necessity when working with truly brittle paper; this figure is expected to increase significantly with subsequent iterations of the software from Xerox; a three-month time-and-cost study of scanning found that the average 300-page book would take about an hour and forty minutes to scan (this figure included the time for setup, which involves keying in primary bibliographic data, going into quality control mode to define page size, establishing front-to-back registration, and scanning sample pages to identify a default range of settings for the entire book--functions not dissimilar to those performed by filmers or those preparing a book for photocopy).

* The final step in the scanning process involved rescans, which happily were few and far between, representing well under 1 percent of the total pages scanned.

In addition to technician time, CXP costed out equipment (amortized over four years), the cost of storing and refreshing the digital files every four years, and the cost of printing and binding (book-cloth binding) a paper reproduction. The total amounted to a little under $65 per single 300-page volume, with 30 percent overhead included--a figure competitive with the prices currently charged by photocopy vendors.
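The structure of that per-volume figure can be sketched as a simple cost model. Only the 30 percent overhead, the one-hour-forty-minute scan time, and the roughly $65 total come from the talk; the component dollar amounts below are illustrative assumptions chosen to show the shape of the calculation, not CXP's actual cost breakdown.

```python
def per_volume_cost(technician_rate_per_hr, scan_hours, equipment_share,
                    storage_refresh_share, print_and_bind, overhead=0.30):
    """Direct costs per volume, marked up by an overhead fraction."""
    direct = (technician_rate_per_hr * scan_hours + equipment_share
              + storage_refresh_share + print_and_bind)
    return direct * (1 + overhead)

# Illustrative inputs: 1 h 40 min of technician time plus amortized shares
# for equipment, four-year storage refresh, and book-cloth printing/binding.
cost = per_volume_cost(technician_rate_per_hr=12.0, scan_hours=5/3,
                       equipment_share=8.0, storage_refresh_share=4.0,
                       print_and_bind=18.0)
print(f"${cost:.2f} per 300-page volume")  # in the neighborhood of $65
```

The point of the model is KENNEY's economic argument: once the capture cost is sunk, each additional copy avoids the technician and equipment terms entirely, which is why reprints can be produced for a fraction of photocopy prices.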

Of course, with scanning, in addition to the paper facsimile, one is left with a digital file from which subsequent copies of the book can be produced for a fraction of the cost of photocopy, with readers afforded choices in the form of these copies.

KENNEY concluded that digital technology offers an electronic means for a library preservation effort to pay for itself. If a brittle-book program included the means of disseminating reprints of books that are in demand by libraries and researchers alike, the initial investment in capture could be recovered and used to preserve additional but less popular books. She disclosed that an economic model for a self-sustaining program could be developed for CXP's report to the Commission on Preservation and Access (CPA).

KENNEY stressed that the focus of CXP has been on obtaining high quality in a production environment. The use of digital technology is viewed as an affordable alternative to other reformatting options.