National Assessment of Educational Progress

The National Assessment of Educational Progress (NAEP) is the largest continuing and nationally representative assessment of what U.S. students know and can do in various subjects. NAEP is a congressionally mandated project administered by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the United States Department of Education. The first national administration of NAEP occurred in 1969.[1] The National Assessment Governing Board (NAGB) is an independent, bipartisan board that sets policy for NAEP and is responsible for developing the framework and test specifications. The board, whose members are appointed by the U.S. Secretary of Education, includes governors, state legislators, local and state school officials, educators, business representatives, and members of the general public. Congress created the 26-member Governing Board in 1988.

NAEP results are designed to provide group-level data on student achievement in various subjects, and are released as The Nation's Report Card.[2] There are no results for individual students, classrooms, or schools. NAEP reports results for different demographic groups, including gender, socioeconomic status, and race/ethnicity. Assessments are given most frequently in mathematics, reading, science and writing. Other subjects such as the arts, civics, economics, geography, technology and engineering literacy (TEL) and U.S. history are assessed periodically.


In addition to assessing student achievement in various subjects, NAEP also surveys students, teachers, and school administrators to help provide contextual information. Questions asking about participants' race or ethnicity, school attendance, and academic expectations help policy makers, researchers, and the general public better understand the assessment results.


Teachers, principals, parents, policymakers, and researchers all use NAEP results to assess student progress across the country and develop ways to improve education in the United States. NAEP has been providing data on student performance since 1969.[3][4]


NAEP uses a sampling procedure that allows the assessment to be representative of the geographical, racial, ethnic, and socioeconomic diversity of the schools and students in the United States. Data is also provided on students with disabilities and English language learners. NAEP assessments are administered to participating students using the same test booklets and procedures, except accommodations for students with disabilities,[5][6] so NAEP results are used for comparison of states and urban districts that participate in the assessment.
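NAEP's actual design is a complex multistage stratified sample; as a rough illustration of the underlying idea only, the sketch below draws a proportional random sample from each stratum so the sample mirrors the population's composition. The region labels, school records, and sampling fraction are all hypothetical.

```python
import random

def stratified_sample(population, strata_key, fraction, seed=0):
    """Draw a proportional random sample from each stratum so that the
    sample mirrors the population's composition."""
    rng = random.Random(seed)
    strata = {}
    for item in population:
        strata.setdefault(strata_key(item), []).append(item)
    sample = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical school records grouped by region.
schools = [{"id": i, "region": r} for i, r in enumerate(
    ["Northeast"] * 20 + ["South"] * 40 + ["Midwest"] * 25 + ["West"] * 15)]
picked = stratified_sample(schools, lambda s: s["region"], fraction=0.2)
print(len(picked))  # → 20 (4 Northeast, 8 South, 5 Midwest, 3 West)
```

Because each stratum is sampled at the same rate, the regional makeup of the sample matches the population's, which is the property that lets group-level estimates generalize.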


There are two NAEP websites: the NCES NAEP website and The Nation's Report Card website. The first site details the NAEP program holistically, while the second focuses primarily on the individual releases of data.

Each participating state has a NAEP State Coordinator who is responsible for:

coordinating the NAEP administration in the state,

assisting with the analysis and reporting of NAEP data, and

promoting public understanding of NAEP and its resources.

The technology and engineering literacy (TEL) assessment covers three content areas:

Technology and society – deals with the effects that technology has on society and on the natural world, and with the sorts of ethical questions that arise from those effects.

Design and systems – covers the nature of technology; the engineering design process by which technologies are developed; and basic principles of dealing with everyday technologies, including maintenance and troubleshooting.

Information and communication technology – includes computers and software learning tools; networking systems and protocols; hand-held digital devices; and other technologies for accessing, creating, and communicating information and for facilitating creative expression.

Oral Reading Study

The Oral Reading Study was undertaken to discover how well the nation's fourth-graders can read aloud a typical grade 4 story. The assessment provided information about students' fluency in reading aloud and examined the relationship among oral reading accuracy, rate, fluency, and reading comprehension.

America's Charter Schools

America's Charter Schools was a pilot study conducted as part of the 2003 NAEP assessments in mathematics and reading at the fourth-grade level. While charter schools are similar to other public schools in many respects, they differ in several important ways, including the makeup of the student population and their location.

Private Schools

Private schools educate about 10 percent of the nation's students. In the first report, assessment results for all private schools and for the largest private school categories (Catholic, Lutheran, and Conservative Christian) were compared with those for public schools (when applicable). The second report examined differences between public and private schools in 2003 NAEP mean mathematics and reading scores when selected characteristics of students and/or schools were taken into account.

Technology-Based Assessment Project

The Technology-Based Assessment project was designed to explore the use of technology, especially the use of the computer as a tool to enhance the quality and efficiency of educational assessments.

Criticism

NAEP's heavy use of statistical hypothesis testing has drawn some criticism related to interpretation of results. For example, the Nation's Report Card reported "Males Outperform Females at all Three Grades in 2005" as a result of science test scores of 100,000 students in each grade.[14] Hyde and Linn criticized this claim, because the mean difference was only 4 out of 300 points, implying a small effect size and heavily overlapped distributions. They argue that "small differences in performance in the NAEP and other studies receive extensive publicity, reinforcing subtle, persistent, biases."[15]
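Hyde and Linn's point can be made concrete with a standardized effect size (Cohen's d). The 4-point gap on a roughly 300-point scale comes from the report above; the standard deviation of about 35 scale points used below is an assumption for illustration only.

```python
import math

def cohens_d(mean_diff, sd1, sd2, n1, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return mean_diff / pooled_sd

# 4-point gap as reported; the ~35-point SD is assumed for illustration.
d = cohens_d(mean_diff=4, sd1=35, sd2=35, n1=50_000, n2=50_000)
print(round(d, 3))  # → 0.114, "small" by Cohen's conventions (d < 0.2)
```

With samples this large, even a tiny d is statistically significant, which is exactly the gap between "significant" and "meaningful" that the critics highlight.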


NAEP's choices of which answers to mark right or wrong have also been criticized, a problem that arises in other countries' assessments as well.[16] For example, a history question asked about the 1954 Brown v. Board of Education ruling, explicitly referring to the 1954 decision that identified the problem, not the 1955 decision that ordered desegregation. NAEP asked students to "describe the conditions that this 1954 decision was designed to correct," and marked wrong those students who mentioned segregation without mentioning desegregation. In fact, the question asked only about existing conditions, not remedies, and in any case the 1954 decision did not order desegregation.[17][18] The country waited until the 1955 Brown II decision to hear about "all deliberate speed." Another history question marked wrong students who knew the U.S. fought Russians as well as Chinese and North Koreans in the Korean War. Other released questions in math and writing have drawn similar criticism. Math answers have penalized students who understand negative square roots, interest on loans, and errors in extrapolating a graph beyond the data.[19][20]


NAEP's claim to measure critical thinking has also been criticized. UCLA researchers found that students could choose the correct answers without critical thinking.[21]


NAEP scores each test by a statistical method, sets cutoffs for "basic" and "proficient" standards, and gives examples of what students at each level accomplished on the test. The process to design the tests and standards has been criticized by Western Michigan University (1991), the National Academy of Education (1993), the Government Accountability Office (1993), the National Academy of Sciences (1999),[22][23] the American Institutes for Research and RTI International (2007),[24] Brookings Institution (2007[25] and 2016[24]), the Buros Center for Testing (2009),[22] and the National Academies of Sciences, Engineering, and Medicine (2016).[24]
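The cutoff mechanism described above can be sketched as a simple lookup. The cut scores below (208, 238, and 268 on the 0-500 grade 4 reading scale) follow published NAEP achievement-level tables, but treat them here as illustrative assumptions rather than authoritative values.

```python
from bisect import bisect_right

# Grade 4 reading cut scores on NAEP's 0-500 scale (assumed from
# published NAEP achievement-level documentation).
CUTOFFS = [208, 238, 268]
LEVELS = ["Below Basic", "Basic", "Proficient", "Advanced"]

def achievement_level(scale_score):
    """Map a scale score to its NAEP achievement level; a score at or
    above a cut score falls into that level."""
    return LEVELS[bisect_right(CUTOFFS, scale_score)]

print(achievement_level(240))  # → Proficient
```

The disputes cited above concern where these cut scores are set and how the levels are labeled, not the mechanics of the mapping itself.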


Interpretation of NAEP results has been difficult: NAEP's category of "proficient" on a reading test given to fourth graders reflects students who do well on the test and perform at seventh grade level.[24] NAEP's category of "proficient" on a math test given to eighth graders reflects students who do well on the test and perform at twelfth grade level.[25] The fact that few eighth graders are proficient by this standard, and thus achieve at twelfth grade level, has been misinterpreted to allege that few eighth graders achieve even at eighth grade level.[26] NAEP says, "Students who may be proficient in a subject, given the common usage of the term, might not satisfy the requirements for performance at the NAEP achievement level."[24] James Harvey, principal author of A Nation at Risk, says, "It's hard to avoid concluding that the word was consciously chosen to confuse policymakers and the public."[24]

James W. Pellegrino; Lee R. Jones; Karen J. Mitchell, eds. (1999), Grading the Nation's Report Card: Evaluating NAEP and Transforming the Assessment of Educational Progress, doi:10.17226/6296, ISBN 978-0-309-06285-5

Official website

NAEP Data Explorer

NAEP Questions Tool

NAEP assessment reports since 2005

Massachusetts NAEP Web page

The National Assessment of Educational Progress (NAEP) - From the Education Resources Information Center Clearinghouse on Tests, Measurement, and Evaluation.

National Assessment Governing Board