| Descriptor | Count |
| Information Storage | 7 |
| Information Theory | 7 |
| Performance | 7 |
| Tables (Data) | 7 |
| Algorithms | 6 |
| Coding | 6 |
| Mathematical Models | 6 |
| Evaluation | 5 |
| Comparative Analysis | 4 |
| Data Processing | 4 |
| Models | 3 |
| Source | Count |
| Information Processing & Management | 12 |
| Author | Count |
| Moffat, Alistair | 2 |
| Burns, Catherine M. | 1 |
| Chen, Zhengxin | 1 |
| Constantinescu, Cornel | 1 |
| Culik, Karel II | 1 |
| Decroos, Francis | 1 |
| Feygin, Gennady | 1 |
| Grumbach, Stephane | 1 |
| Howard, Paul G. | 1 |
| Kari, Jarkko | 1 |
| Levitin, Anany | 1 |
| Publication Type | Count |
| Journal Articles | 12 |
| Reports - Evaluative | 12 |
| Speeches/Meeting Papers | 6 |
| Reports - Descriptive | 1 |
Peer reviewed: Moffat, Alistair; And Others – Information Processing & Management, 1994
Evaluates the performance of different methods of data compression coding in several situations. Huffman's code, arithmetic coding, fixed codes, fast approximations to arithmetic coding, and splay coding are discussed in terms of their speed, memory requirements, and proximity to optimal performance. Recommendations for the best methods of…
Descriptors: Coding, Data Processing, Evaluation, Experiments
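To make the coding methods concrete, here is a minimal Huffman-coding sketch in Python; it is a textbook construction shown for illustration, not the implementation evaluated in the article, and the sample string is invented.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code from symbol frequencies (standard heap-based construction)."""
    freq = Counter(text)
    # Each heap entry: (frequency, tie-breaker, {symbol: codeword-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prefix the two merged subtrees with 0/1 and push the combined node back.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

code = huffman_code("abracadabra")
print(code)
print("".join(code[c] for c in "abracadabra"))
```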
Peer reviewed: Moffat, Alistair; And Others – Information Processing & Management, 1994
Describes an approximate document ranking process that uses a compact array of in-memory, low-precision approximations for document length. Combined with another rule for reducing the memory required by partial similarity accumulators, the approximation heuristic allows the ranking of large document collections using less than one byte of memory…
Descriptors: Database Design, Database Management Systems, Full Text Databases, Information Retrieval
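A rough sketch of the underlying idea: quantise each document length into a single byte on a logarithmic scale so the normalisation table for ranking fits in memory. The quantiser, its bounds, and the function names are illustrative assumptions, not the paper's exact scheme.

```python
import math

def quantise_length(doc_len, lo=1.0, hi=1e6, levels=256):
    """Map a document length onto one of 256 buckets on a logarithmic scale."""
    doc_len = min(max(doc_len, lo), hi)
    frac = math.log(doc_len / lo) / math.log(hi / lo)
    return min(int(frac * levels), levels - 1)

def approx_length(bucket, lo=1.0, hi=1e6, levels=256):
    """Reconstruct an approximate length from the one-byte bucket (bucket midpoint)."""
    frac = (bucket + 0.5) / levels
    return lo * (hi / lo) ** frac

lengths = [120, 3400, 98000]
buckets = bytearray(quantise_length(n) for n in lengths)   # one byte per document
print([round(approx_length(b)) for b in buckets])           # close to the true lengths
```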
Peer reviewed: Levitin, Anany; Redman, Thomas – Information Processing & Management, 1995
Discusses a list of characteristics (dimensions) that are crucial for data model quality. Fourteen quality dimensions are singled out and organized in six categories: content, scope, level of detail, composition, consistency, and reaction to change. Two types of correlations among dimensions called "reinforcements" and…
Descriptors: Change, Check Lists, Data, Definitions
Peer reviewed: Howard, Paul G.; Vitter, Jeffrey Scott – Information Processing & Management, 1994
Describes a detailed algorithm for fast text compression. Related to the PPM (prediction by partial matching) method, it simplifies the modeling phase by eliminating the escape mechanism and speeds up coding by using a combination of quasi-arithmetic coding and Rice coding. Details of the use of quasi-arithmetic code tables are given, and their…
Descriptors: Algorithms, Coding, Electronic Text, Information Storage
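The abstract mentions Rice coding; below is a small sketch of textbook Rice coding (Golomb coding with a power-of-two parameter), shown only to illustrate the primitive, not the authors' coder.

```python
def rice_encode(n, k):
    """Rice-code a non-negative integer: unary quotient, separator, then k low-order bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, "0{}b".format(k)) if k else "")

def rice_decode(bits, k):
    """Inverse of rice_encode for a single codeword."""
    q = bits.index("0")                          # length of the unary prefix
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

for n in (0, 3, 9, 37):
    cw = rice_encode(n, 3)
    assert rice_decode(cw, 3) == n
    print(n, cw)
```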
Peer reviewed: Constantinescu, Cornel; Storer, James A. – Information Processing & Management, 1994
Presents a new image compression algorithm that employs some of the most successful approaches to adaptive lossless compression to perform adaptive online (single pass) vector quantization with variable size codebook entries. Results of tests of the algorithm's effectiveness on standard test images are given. (12 references) (KRN)
Descriptors: Algorithms, Coding, Data Processing, Evaluation
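For orientation, a minimal fixed-codebook vector quantiser over 2x2 image blocks is sketched below; the article's method is adaptive, single-pass, and uses variable-size codebook entries, none of which this toy example attempts.

```python
import numpy as np

def quantise_blocks(image, codebook, block=2):
    """Replace each block x block tile of `image` with its nearest codebook entry (squared error)."""
    h, w = image.shape
    out = np.empty_like(image)
    indices = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            tile = image[i:i + block, j:j + block].reshape(-1)
            errs = ((codebook - tile) ** 2).sum(axis=1)
            k = int(errs.argmin())
            indices.append(k)                    # only these indices need to be stored
            out[i:i + block, j:j + block] = codebook[k].reshape(block, block)
    return out, indices

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8)).astype(float)
codebook = rng.integers(0, 256, size=(16, 4)).astype(float)  # 16 codewords of 4 pixels each
recon, idx = quantise_blocks(image, codebook)
print(len(idx), "indices, mean abs error", np.abs(recon - image).mean().round(1))
```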
Peer reviewed: Feygin, Gennady; And Others – Information Processing & Management, 1994
Presents two new algorithms for performing arithmetic coding without employing multiplication and discusses their implementation requirements. The first algorithm, suitable for an alphabet of arbitrary size, reduces the worst case excess length to under 0.8%. The second algorithm, suitable only for alphabets of less than 12 symbols, allows even…
Descriptors: Algorithms, Coding, Comparative Analysis, Evaluation
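The general idea behind multiplication-free arithmetic coding is to replace the range-times-probability product with a cheap approximation, accepting a small excess code length. The toy sketch below truncates the probability to its most significant set bits so the product becomes shifts and adds; it illustrates the principle only and is not either of the algorithms in the article.

```python
def shift_add_product(range_, count, k, terms=2):
    """
    Approximate range_ * count / 2**k using only shifts and adds by keeping
    just the `terms` most significant set bits of `count` (count < 2**k assumed).
    """
    approx, kept = 0, 0
    for bit in range(count.bit_length() - 1, -1, -1):
        if count >> bit & 1:
            approx += range_ >> (k - bit)        # contributes range_ * 2**bit / 2**k
            kept += 1
            if kept == terms:
                break
    return approx

# Probability 5/8 needs only two set bits: the approximation is exact.
print((1 << 16) * 5 // 8, shift_add_product(1 << 16, 5, 3))
# Probability 7/8 has three set bits: the truncation undershoots (excess code length).
print((1 << 16) * 7 // 8, shift_add_product(1 << 16, 7, 3))
```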
Peer reviewed: Culik, Karel II; Kari, Jarkko – Information Processing & Management, 1994
Presents an inference algorithm that produces a weighted finite automaton (WFA) representing a real-valued function, in particular the grayness function of a graytone image. Image-data compression based on the new inference algorithm produces a WFA with a relatively small number of edges. Image-data compression results, alone and in combination with wavelets, are discussed.…
Descriptors: Algorithms, Coding, Comparative Analysis, Data Processing
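For readers unfamiliar with WFA, the value a weighted finite automaton assigns to a quadrant-address string is a product of one matrix per symbol between an initial and a final vector; the sketch below shows that evaluation with invented matrices. Image-inference WFA satisfy additional constraints (e.g. average preservation) not modelled here.

```python
import numpy as np

def wfa_value(initial, transitions, final, address):
    """Value a WFA assigns to an address string: initial vector times one matrix per symbol times final vector."""
    v = np.asarray(initial, dtype=float)
    for sym in address:
        v = v @ transitions[sym]
    return float(v @ np.asarray(final, dtype=float))

# Tiny 2-state WFA over quadrant labels 0..3 (weights chosen for illustration only).
initial = [1.0, 0.0]
final = [0.5, 1.0]
transitions = {s: np.array([[0.5, 0.25 * s], [0.0, 1.0]]) for s in range(4)}

# Grayness of the sub-square addressed by quadrant 3, then quadrant 1.
print(wfa_value(initial, transitions, final, [3, 1]))
```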
Peer reviewed: Savoy, Jacques – Information Processing & Management, 1997
Discussion of evaluation methodology in information retrieval focuses on average precision over a set of fixed recall values as a measure of the retrieval effectiveness of a search algorithm. Highlights include a review of traditional evaluation methodology with examples, and a statistical inference methodology called the bootstrap.…
Descriptors: Algorithms, Evaluation Methods, Information Retrieval, Mathematical Formulas
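A minimal sketch of both ingredients: interpolated precision averaged over fixed recall points for one query, and a percentile bootstrap interval over per-query scores. The recall points, the tiny example data, and the percentile method are illustrative choices, not the article's exact protocol.

```python
import random

def precision_at_recall(relevant, ranking, recall_points=(0.25, 0.5, 0.75, 1.0)):
    """Interpolated precision at fixed recall levels, averaged (single query)."""
    rel_seen, precisions = 0, {}
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            rel_seen += 1
            precisions[rel_seen / len(relevant)] = rel_seen / rank
    # Interpolated precision at recall r: best precision at any recall >= r.
    interp = lambda r: max((p for rec, p in precisions.items() if rec >= r), default=0.0)
    return sum(interp(r) for r in recall_points) / len(recall_points)

def bootstrap_mean_ci(scores, reps=1000, alpha=0.05):
    """Percentile bootstrap confidence interval for the mean per-query score."""
    means = sorted(
        sum(random.choices(scores, k=len(scores))) / len(scores) for _ in range(reps)
    )
    return means[int(reps * alpha / 2)], means[int(reps * (1 - alpha / 2)) - 1]

per_query = [precision_at_recall({"d1", "d4"}, ["d1", "d7", "d4", "d9"]),
             precision_at_recall({"d2"}, ["d5", "d2", "d8"])]
print(per_query, bootstrap_mean_ci(per_query))
```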
Peer reviewed: Decroos, Francis; And Others – Information Processing & Management, 1997
Describes an investigation into the feasibility of spectral methods in information science, particularly for analyzing academic library circulation data. Signal analysis methods were used to detect periodicity, and spectral methods show promise for analyzing time series and other signals in information science. (Author/LRW)
Descriptors: Academic Libraries, Case Studies, Data Analysis, Evaluation Methods
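As an illustration of periodicity detection with spectral methods, the sketch below finds the dominant period of a synthetic circulation series from its FFT power spectrum; the data and the 7-day cycle are invented, not taken from the study.

```python
import numpy as np

def dominant_period(series, sample_spacing=1.0):
    """Detect the strongest periodic component of a series via the FFT power spectrum."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                             # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=sample_spacing)
    peak = power[1:].argmax() + 1                # skip the zero-frequency bin
    return 1.0 / freqs[peak]

# Synthetic daily circulation counts with a 7-day cycle plus noise.
rng = np.random.default_rng(1)
days = np.arange(365)
loans = 50 + 10 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 2, size=days.size)
print(round(dominant_period(loans), 1))          # close to 7.0
```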
Peer reviewed: Grumbach, Stephane; Tahi, Fariza – Information Processing & Management, 1994
Analyzes the properties of genetic sequences that cause the failure of classical algorithms used for data compression. A lossless algorithm, which compresses the information contained in DNA and RNA sequences by detecting regularities such as palindromes, is presented. This algorithm combines substitutional and statistical methods and appears to…
Descriptors: Algorithms, Coding, Comparative Analysis, Databases
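One regularity the abstract names is the palindrome; in nucleic-acid sequences this usually means a substring equal to its own reverse complement. A small sketch of detecting such sites follows; the compressor described in the article combines substitutional and statistical methods and is not reproduced here.

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    return seq.translate(COMPLEMENT)[::-1]

def find_palindromes(seq, length=6):
    """Positions where a window equals its own reverse complement (e.g. GAATTC)."""
    hits = []
    for i in range(len(seq) - length + 1):
        window = seq[i:i + length]
        if window == reverse_complement(window):
            hits.append((i, window))
    return hits

print(find_palindromes("TTGAATTCAGGATCCA"))      # finds GAATTC at 2 and GGATCC at 9
```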
Peer reviewed: Chen, Zhengxin – Information Processing & Management, 1996
Discusses similarities and differences between knowledge acquisition in expert systems and requirement acquisition in other kinds of information systems (particularly decision support systems). Examines role-limiting methods for automated knowledge acquisition and their integration from the problem-solving perspective. Suggests a feasibility…
Descriptors: Comparative Analysis, Data Processing, Decision Support Systems, Expert Systems
Peer reviewed: Burns, Catherine M.; Vicente, Kim J. – Information Processing & Management, 1996
Describes an empirical evaluation that investigated the criteria by which designers of human-machine systems evaluate design information. Professional designers of nuclear power plant control rooms rated hypothetical information search questions in terms of relevance, importance, cost, and effort based on Rouse's model of information search…
Descriptors: Computer System Design, Correlation, Costs, Designers


