Peer reviewed: Shekhel, Alex; O'Brien, Mike – Library Hi Tech, 1989
Describes the evaluation of four relational database management systems (RDBMSs): Informix Turbo, Oracle 6.0 TPS, Unify 2000, and Relational Technology's Ingres 5.0, to determine which is best suited for library automation. Discusses the evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries. (CLB)
Descriptors: Benchmarking, Comparative Analysis, Database Management Systems, Evaluation Criteria
Peer reviewed: Sugnet, Chris, Ed. – Library Hi Tech, 1986
Representatives of six vendors of online library systems address key issues related to system performance: designing, configuring and sizing systems; establishing performance criteria; using benchmark and acceptance tests; the risks of miscalculations; vendor, consultant, and library roles; and related topics. (Author/EM)
Descriptors: Benchmarking, Design Requirements, Field Tests, Guidelines
Peer reviewed: Drabenstott, John, Ed. – Library Hi Tech, 1986
Five library consultants address issues that affect online system performance: options in system design that relate to diverse library requirements; criteria that most affect performance; benchmark tests and sizing criteria; minimizing the risks of miscalculation; and the roles and responsibilities of vendors, libraries, and consultants.…
Descriptors: Benchmarking, Design Requirements, Field Tests, Guidelines


