Descriptor

| Descriptor | Count |
| --- | --- |
| Evaluation Methods | 10 |
| Models | 10 |
| Information Retrieval | 7 |
| Relevance (Information… | 6 |
| Mathematical Formulas | 2 |
| Measurement Techniques | 2 |
| Performance Factors | 2 |
| Semantics | 2 |
| Standards | 2 |
| Subject Index Terms | 2 |
| Tables (Data) | 2 |
Source

| Source | Count |
| --- | --- |
| Information Processing &… | 10 |
Author

| Author | Count |
| --- | --- |
| Shaw, W. M., Jr. | 2 |
| Amati, Gianni | 1 |
| Bodner, Richard C. | 1 |
| Brooks, Terrence A. | 1 |
| Chignell, Mark H. | 1 |
| Crestani, Fabio | 1 |
| Dominich, Sandor | 1 |
| Gwizdka, Jacek | 1 |
| Jones, Karen Sparck | 1 |
| Lam, W. | 1 |
| Lang, S. D. | 1 |
Publication Type

| Publication Type | Count |
| --- | --- |
| Journal Articles | 10 |
| Reports - Descriptive | 6 |
| Reports - Research | 4 |
Peer reviewed: Menou, Michel J. – Information Processing & Management, 1995
Discusses concepts and theories of information and suggests a model for assessment of information-as-contents. Proposes a revised formulation of Brookes' fundamental equation and possible approaches for describing attributes of the beneficiaries and their knowledge structure. (JKP)
Descriptors: Cognitive Structures, Evaluation Methods, Information Theory, Models
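For context, Brookes' fundamental equation, which the abstract says the article proposes to revise, is conventionally written as:

```latex
% Brookes' fundamental equation of information science:
% an increment of information \Delta I transforms a knowledge
% structure K[S] into a modified structure K[S + \Delta S].
K[S] + \Delta I = K[S + \Delta S]
```

The article's revised formulation is not given in the abstract, so only the standard form is shown here.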
Peer reviewed: Mostafa, J.; Lam, W. – Information Processing & Management, 2000
Presents a multilevel model of the information filtering process that permits document classification. Evaluates a document classification approach based on a supervised learning algorithm, measures the accuracy of the algorithm in a neural network that was trained to classify medical documents on cell biology, and discusses filtering…
Descriptors: Algorithms, Classification, Cytology, Evaluation Methods
Peer reviewed: Shaw, W. M., Jr.; And Others – Information Processing & Management, 1997
Describes a study that computed low performance standards for the group of queries in 13 information retrieval (IR) test collections. Derived from the random graph hypothesis, these standards represent the highest levels of retrieval effectiveness that can be obtained from meaningless clustering structures. (Author/LRW)
Descriptors: Evaluation Methods, Hypothesis Testing, Information Retrieval, Measurement Techniques
Peer reviewed: Dominich, Sandor – Information Processing & Management, 2003
Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…
Descriptors: Computation, Evaluation Methods, Information Retrieval, Interaction
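Activation spreading over a term/document network is the general idea behind such connectionist retrieval models. The following is a minimal illustrative sketch of generic spreading activation, not Dominich's interaction information retrieval method; the function name, graph representation, and parameters are all assumptions for illustration:

```python
def spread_activation(adj, seeds, steps=2, decay=0.5):
    # adj: {node: [neighbor, ...]} -- a term/document network
    # seeds: query nodes, given an initial activation of 1.0
    act = {node: 0.0 for node in adj}
    for s in seeds:
        act[s] = 1.0
    for _ in range(steps):
        new = dict(act)
        for node, neighbors in adj.items():
            if not neighbors:
                continue
            # each node passes a decayed share of its activation
            # to its neighbors
            share = decay * act[node] / len(neighbors)
            for nb in neighbors:
                new[nb] += share
        act = new
    # higher activation = stronger association with the query
    return act
```

Documents would then be ranked by their final activation values.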
Peer reviewed: Brooks, Terrence A. – Information Processing & Management, 1997
Analyzes relevance assessments of topical descriptors for bibliographic records along two dimensions: (1) a vertical conceptual hierarchy of broad to narrow descriptors, and (2) a horizontal linkage of related terms. The data were analyzed for a semantic distance and semantic direction effect as postulated by the Semantic Distance Model. (Author/LRW)
Descriptors: Bibliographic Records, Evaluation Methods, Models, Relevance (Information Retrieval)
Peer reviewed: Shaw, W. M., Jr.; And Others – Information Processing & Management, 1997
Describes a study that computed the low performance standards for queries in 17 test collections. Predicted by the hypergeometric distribution, the standards represent the highest level of retrieval effectiveness attributable to chance. Operational levels of performance for vector-space and other retrieval models were compared to the standards.…
Descriptors: Comparative Analysis, Evaluation Methods, Information Retrieval, Measurement Techniques
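The chance-level standard described in this abstract can be illustrated with a small sketch. Assuming a collection of N documents of which R are relevant, and a retrieval of n documents drawn purely at random, the number of relevant documents retrieved follows a hypergeometric distribution, and the expected chance precision reduces to the query's generality R/N. The function names and parameters below are illustrative, not taken from the paper:

```python
from math import comb

def hypergeom_pmf(k, N, R, n):
    # P(exactly k relevant documents in a random sample of n
    # from a collection of N documents containing R relevant ones)
    return comb(R, k) * comb(N - R, n - k) / comb(N, n)

def chance_precision(N, R, n):
    # expected precision of a purely random retrieval of n documents
    expected_relevant = sum(k * hypergeom_pmf(k, N, R, n)
                            for k in range(min(R, n) + 1))
    return expected_relevant / n
```

For example, with N = 1000, R = 50, and n = 10, the expected chance precision is 0.05, i.e. exactly R/N; an operational system must exceed this level before its effectiveness can be attributed to anything other than chance.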
Peer reviewed: Syu, Inien; Lang, S. D. – Information Processing & Management, 2000
Explains how a competition-based connectionist model for diagnostic problem-solving is adapted to information retrieval. Topics include probabilistic causal networks; Bayesian networks; the neural network model; empirical studies of test collections that evaluated retrieval performance; precision results; and the use of a thesaurus to provide…
Descriptors: Competition, Evaluation Methods, Information Retrieval, Mathematical Formulas
Peer reviewed: Jones, Karen Sparck – Information Processing & Management, 2000
Reviews the TREC (Text Retrieval Conference) program, considering the test results, the findings for information retrieval, and the lessons TREC offers for information retrieval evaluation. Topics include the ad hoc retrieval task; indexing models; document and query descriptions; search strategies; and the user's request as the dominant factor in…
Descriptors: Conferences, Evaluation Methods, Indexing, Information Retrieval
Peer reviewed: Chignell, Mark H.; Gwizdka, Jacek; Bodner, Richard C. – Information Processing & Management, 1999
Introduces an evaluative framework for metasearch engine performance and illustrates its use in two experiments. Results are used to characterize some properties of leading search engines (as of 1998). Significant interactions were observed between search engine and two other factors: time of day and Web domain. (Author/AEF)
Descriptors: Computer System Design, Evaluation Criteria, Evaluation Methods, Information Retrieval
Peer reviewed: Amati, Gianni; Crestani, Fabio – Information Processing & Management, 1999
Describes and evaluates a learning model for information filtering and selective dissemination of information which is an adaptation of the generalized probabilistic model of information retrieval. The model is based on the concept of uncertainty sampling that allows for relevance feedback both on relevant and nonrelevant documents. (Author/LRW)
Descriptors: Evaluation Methods, Feedback, Information Retrieval, Learning Processes
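Uncertainty sampling, as the abstract describes it, solicits relevance feedback on the documents the model is least certain about, since those judgments are the most informative for learning. A minimal sketch of the selection step, assuming the filter exposes an estimated relevance probability per document (the function and variable names are mine, not from the paper):

```python
def uncertainty_sample(doc_probs, k=3):
    # doc_probs: {doc_id: estimated P(relevant | document)}
    # Return the k documents whose estimate is closest to 0.5,
    # i.e. those on which the model is least certain and
    # relevance feedback (positive or negative) helps most.
    return sorted(doc_probs, key=lambda d: abs(doc_probs[d] - 0.5))[:k]
```

The user's judgments on the sampled documents would then be fed back to re-estimate the model, which is how the approach exploits feedback on both relevant and nonrelevant documents.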


