Cresswell, John; Schwantner, Ursula; Waters, Charlotte – OECD Publishing, 2015
This report reviews the major international and regional large-scale educational assessments, including international surveys, school-based surveys and household-based surveys. The report compares and contrasts the cognitive and contextual data collection instruments and implementation methods used by the different assessments in order to identify…
Descriptors: International Assessment, Educational Assessment, Data Collection, Comparative Analysis
Akour, Mutasem; Sabah, Saed; Hammouri, Hind – Journal of Psychoeducational Assessment, 2015
The purpose of this study was to apply two types of Differential Item Functioning (DIF), net and global DIF, as well as the framework of Differential Step Functioning (DSF) to real testing data to investigate measurement invariance related to test language. Data from the Program for International Student Assessment (PISA)-2006 polytomously scored…
Descriptors: Test Bias, Science Tests, Test Items, Scoring
Katzman, John – New England Journal of Higher Education, 2014
It is so easy to criticize the SAT that most observers overlook the weaknesses of its architect, the College Board. This author contends that, until the latter is replaced, however, the former will never be fixed. The College Board has every incentive to create a complex, stressful, expensive college admissions system. Because it is accountable to…
Descriptors: Standardized Tests, Testing Programs, Program Administration, Cost Effectiveness
Arffman, Inga – Educational Measurement: Issues and Practice, 2013
The article reviews research and findings on problems and issues faced when translating international academic achievement tests. The purpose is to draw attention to the problems, to help to develop the procedures followed when translating the tests, and to provide suggestions for further research. The problems concentrate on the following: the…
Descriptors: Achievement Tests, Translation, Testing Problems, Test Construction
Wang, Wen-Chung; Chen, Hui-Fang; Jin, Kuan-Yu – Educational and Psychological Measurement, 2015
Many scales contain both positively and negatively worded items. Reverse recoding of negatively worded items might not be enough for them to function as positively worded items do. In this study, we commented on the drawbacks of existing approaches to wording effect in mixed-format scales and used bi-factor item response theory (IRT) models to…
Descriptors: Item Response Theory, Test Format, Language Usage, Test Items
Starr, Karen – AASA Journal of Scholarship & Practice, 2014
This article is a commentary on Australia's involvement in the Programme for International Student Assessment (PISA) tests. It provides a rationale for Australia's participation in the PISA programme, the influences of PISA involvement on education policies and practices, and considerations and implications for school leaders and education…
Descriptors: Foreign Countries, Educational Policy, Educational Practices, Comparative Education
Di Giacomo, F. Tony; Fishbein, Bethany G.; Buckley, Vanessa W. – College Board, 2013
Many articles and reports have reviewed, researched, and commented on international assessments from the perspective of exploring what is relevant for the United States' education systems. Researchers make claims about whether the top-performing systems have transferable practices or policies that could be applied to the United States. However,…
Descriptors: Comparative Testing, International Assessment, Relevance (Education), Testing Programs
Schleicher, Andreas – Online Submission, 2016
The OECD Programme for International Student Assessment (PISA) provides a framework in which over 80 countries collaborate to build advanced global metrics to assess the knowledge, skills and character attributes of the students. The design of assessments poses major conceptual and technical challenges, as successful learning. Beyond a sound…
Descriptors: International Assessment, Educational Methods, Educational Policy, Minimum Competency Testing
Unsworth, Len – Pedagogies: An International Journal, 2014
Interpreting the image-language interface in multimodal texts is now well recognized as a crucial aspect of reading comprehension in a number of official school syllabi such as the recently published Australian Curriculum: English (ACE). This article outlines the relevant expected student learning outcomes in this curriculum and draws attention to…
Descriptors: Foreign Countries, National Curriculum, Reading Comprehension, Reading Tests
Topçu, Mustafa Sami; Arikan, Serkan; Erbilgin, Evrim – Australian Educational Researcher, 2015
The OECD's Programme for International Student Assessment (PISA) enables participating countries to monitor 15-year-old students' progress in reading, mathematics, and science literacy. The present study investigates persistent factors that contribute to the science performance of Turkish students in PISA 2006 and PISA 2009. Additionally, the study…
Descriptors: Foreign Countries, Science Achievement, Science Tests, Testing Programs
Shohamy, Elana – Language and Intercultural Communication, 2013
While much of the work in language testing is concerned with constructing quality tests in order to measure language knowledge in reliable and valid ways, there has been a significant movement in language testing research that examines tests in the context of their use in education and society. This line of research departs from the notion that…
Descriptors: Language Tests, Testing, Evaluation Research, Ideology
Benítez, Isabel; Padilla, José-Luis – Journal of Mixed Methods Research, 2014
Differential item functioning (DIF) can undermine the validity of cross-lingual comparisons. While many efficient statistics for detecting DIF are available, few general explanations for DIF results have been established. The objective of the article was to study DIF sources by using a mixed methods design. The design involves a quantitative phase…
Descriptors: Foreign Countries, Mixed Methods Research, Test Bias, Cross Cultural Studies
Wagner, Daniel A.; Babson, Andrew; Murphy, Katie M. – Current Issues in Comparative Education, 2011
Timely and credible data on student learning has become a global issue in the ongoing effort to improve educational outcomes. With the potential to serve as a powerful diagnostic tool to gauge the overall health and well-being of an educational system, educational assessments have received increasing attention among specialists and the media.…
Descriptors: Low Income, Educational Objectives, Outcomes of Education, Educational Change
Debeer, Dries; Buchholz, Janine; Hartig, Johannes; Janssen, Rianne – Journal of Educational and Behavioral Statistics, 2014
In this article, the change in examinee effort during an assessment, which we will refer to as persistence, is modeled as an effect of item position. A multilevel extension is proposed to analyze hierarchically structured data and decompose the individual differences in persistence. Data from the 2009 Programme for International Student Assessment…
Descriptors: Reading Tests, International Programs, Testing Programs, Individual Differences
Australian Council for Educational Research, 2015
Monitoring Trends in Educational Growth (MTEG) offers a flexible, collaborative approach to developing and implementing an assessment of learning outcomes that yields high-quality, nationally relevant data. MTEG is a service that involves ACER staff working closely with each country to develop an assessment program that meets the country's…
Descriptors: Educational Development, Educational Trends, Progress Monitoring, Educational Quality