Project Period
September 1, 1999 – August 31, 2002
Level of Access
Open-Access Report
Grant Number
9970853
Submission Date
3-9-2004
Abstract
This study examined two major questions: Do national and state assessments provide consistent information on the performance of state education systems? If discrepancies between national and state assessment results are found, what accounts for them?
Data came from national and state assessments in grade 4 and grade 8 mathematics from 1992 to 1996 in Maine and Kentucky: National Assessment of Educational Progress (NAEP), Kentucky Instructional Results Information System (KIRIS), and Maine Educational Assessment (MEA). Here is a very brief summary of major research findings:
1. NAEP and state assessments reported inconsistent results on the performance level of students in Maine and Kentucky across grades and years. Both MEA and KIRIS appear to have more rigorous performance standards, which reduces the percentage of students identified as performing at the Proficient/Advanced level. These discrepancies may be understood in light of differences between NAEP and the state assessments in their definitions of performance standards and their methods of standard setting.
2. The achievement gaps between different groups of students appeared somewhat smaller on the state assessments than on NAEP. These discrepancies may be explained by differences between NAEP and the state assessments in the representation of student groups in their testing samples, the distribution of item difficulties in their tests, and the differential impact of state assessments on low-performing students and schools.
3. The achievement gains reported by the states' own assessments were considerably greater than those reported by NAEP, and the size of the difference was not consistent across grades. These gaps and inconsistencies might be related to differences between the national and state assessments in the stakes of testing for school systems, as well as to changes in test format that affect test equating.
The study findings caution against using either national or state assessment results alone to evaluate the performance of a particular state education system. This report also provides preliminary analyses of the sources of inconsistencies and discrepancies between national and state assessments. Although these findings may not generalize to all states, they suggest that policymakers and educators should become more aware of the unique features and limitations of current national and state assessments. While NAEP can be used to cross-check and validate a state's own assessment results, each state's unique assessment characteristics (both policy and technical aspects) need to be considered. The study has implications for comparing and/or combining results from national and state assessments.
Rights and Access Note
This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. In addition, no permission is required from the rights-holder(s) for educational uses. For other uses, you need to obtain permission from the rights-holder(s).
Recommended Citation
Lee, Jaekyung; McIntire, Walter; and Coladarci, Theodore, "Exploring Data and Methods to Assess and Understand the Performance of SSI States: Learning from the Cases of Kentucky and Maine" (2004). University of Maine Office of Research Administration: Grant Reports. 110.
https://digitalcommons.library.umaine.edu/orsp_reports/110
Additional Participants
Other Collaborators or Contacts
Benjamin Wright, University of Chicago
Ken Wong, University of Chicago
Ruey Yahle, Curriculum Coordinator, Orono School District