Understanding Value-Added Analysis of Student Achievement
What is Value-Added Analysis?
Value-added analysis is a statistical technique that uses student achievement data over time to measure the learning gains students make. This methodology offers a way to estimate the impact schools and teachers have on student learning isolated from other contributing factors such as family characteristics and socioeconomic background. In other words, value-added analysis provides a way to measure the effect a school or teacher has on student academic performance over the course of a school year or another period of time.
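The core idea can be illustrated with a toy calculation. This sketch is purely hypothetical and is not the model used by TAP or any vendor; operational value-added systems use far more sophisticated statistical models with multiple years of data and many controls. Here, each student's current score is predicted from their prior score with a simple regression, and a teacher's estimated effect is the average amount by which their students beat (or miss) that prediction:

```python
# Toy illustration of the value-added idea. All scores and teacher labels
# are invented for the example; real models are much more complex.
# Each record: (prior_year_score, current_year_score, teacher).
records = [
    (400, 430, "A"), (420, 455, "A"), (380, 410, "A"),
    (400, 415, "B"), (450, 460, "B"), (390, 400, "B"),
]

# Step 1: fit a simple least-squares line predicting current from prior score.
n = len(records)
mean_x = sum(r[0] for r in records) / n
mean_y = sum(r[1] for r in records) / n
slope = (sum((r[0] - mean_x) * (r[1] - mean_y) for r in records)
         / sum((r[0] - mean_x) ** 2 for r in records))
intercept = mean_y - slope * mean_x

# Step 2: a teacher's estimated "value added" is the average residual of
# their students, i.e., actual score minus the score predicted from prior
# achievement. Positive means students gained more than expected.
effects = {}
for prior, current, teacher in records:
    residual = current - (intercept + slope * prior)
    effects.setdefault(teacher, []).append(residual)

for teacher, res in sorted(effects.items()):
    print(teacher, round(sum(res) / len(res), 1))
```

Because the prediction is built from prior achievement, a teacher whose students start several grade levels behind can still show a strongly positive estimate if those students grow more than similar students elsewhere.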
Academic Attainment vs. Academic Growth
Academic attainment is the level of achievement a student reaches at a point in time (e.g., on the state standardized test given at the end of any given school year). Usually reported as a specific numerical score or standard of achievement (e.g., basic, proficient, advanced), attainment levels are what are typically used to rate school and/or teacher performance.
In contrast, academic growth is the amount of academic gain or progress a student makes over a period of time (e.g., on the state test given over several grades). Value-added analysis is a methodology for measuring academic growth and attributing it to the impact the school or teacher has had on student learning.
Benefits of Using Value-Added Analysis
Value-added analysis provides a more useful indicator of school and teacher performance than student attainment levels, the measure commonly used in public education today, for several reasons.
First, value-added analysis provides a more accurate way to measure student academic progress. Value-added analysis tracks the same students over time and compares their test scores across several years. In contrast, systems like Adequate Yearly Progress (AYP) compare, for example, one year's fourth-grade math scores to the previous year's fourth-grade math scores. This yields an inaccurate comparison because the groups of students may be significantly different from year to year (Braun, 2005).
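The difference between the two approaches can be made concrete with invented numbers. In this hypothetical example, this year's incoming fourth-grade cohort happens to be weaker than last year's, so an AYP-style cohort comparison shows a decline even though every individual student made substantial gains:

```python
# Hypothetical scores illustrating why cohort-to-cohort comparison can
# mislead. The two fourth-grade lists contain DIFFERENT children.
last_year_4th = [450, 460, 470]   # last year's fourth graders
this_year_4th = [420, 430, 440]   # this year's (weaker incoming) cohort

# Longitudinal view: the SAME students, (3rd-grade score, 4th-grade score).
same_students = [(390, 420), (400, 430), (410, 440)]

def avg(xs):
    return sum(xs) / len(xs)

# AYP-style comparison: this year's cohort average vs. last year's.
ayp_style_change = avg(this_year_4th) - avg(last_year_4th)

# Value-added-style comparison: each student's own growth.
avg_growth = avg([post - pre for pre, post in same_students])

print(ayp_style_change, avg_growth)  # cohort change is negative, growth is positive
```

The cohort comparison reports a 30-point drop while every tracked student gained 30 points, which is exactly the distortion Braun (2005) describes.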
Second, value-added analysis provides a measure for how much of an impact the school and individual teachers have on student achievement. Looking at the attainment level of a school or classroom on a state test provides little information about the impact the school or teacher has had on the final score as compared with other influences on student achievement like family background and socioeconomic status. By judging only one score, it is difficult to identify how much of that score was influenced by factors outside of the school as compared to other factors that can be controlled within the school (e.g., the contributions of the teacher and school).
Third, when student achievement is tied to accountability systems, value-added analysis provides a fairer method to measure school and teacher impact on student achievement because it takes into account where a student started the school year academically and how much that student grew. Judging a school's or teacher's performance by student academic attainment levels alone is unfair because some students may enter a teacher's classroom already at high levels of achievement or, conversely, several grade levels behind their peers. Without considering the academic growth teachers and schools achieve with their students, some teachers and schools may be inaccurately credited with making a significant impact while others may be unfairly penalized.
How TAP Schools Use Value-Added Data
School districts that are implementing TAP district-wide often use value-added data to identify schools, grades and content areas that have or have not increased student achievement. These data help district officials plan how to target professional development so that it is most effective for teachers and schools. Districts can also use these data to identify effective teachers and administrators who can be utilized as mentors for others at schools that have not made significant academic gains.
At the school level, TAP leadership teams utilize value-added data to address the instructional needs of teachers both at the individual and group levels. By analyzing teacher value-added scores and comparing them to a teacher's evaluation scores (based on observations of classroom instruction), leadership teams are able to identify "best practices" that are having a positive impact on student achievement. Leadership teams can then share these best practices with other teachers during weekly cluster group meetings (professional learning communities) to promote effective instruction. Leadership team members also use comparative data to conference with teachers on a one-on-one basis and inform the development of teachers' individual professional growth plans to reach instructional goals.
At the classroom level, teachers analyze the value-added data from their own students by subgroups (such as high, medium and low performing students) to identify trends in their own instruction. The data may reveal that their instruction is targeted more to a specific subgroup and, as a result, teachers make adjustments in their instruction. This data analysis process allows teachers to meet the needs of all students more effectively and support the individual academic growth of their students regardless of their ability level.
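A minimal sketch of this kind of subgroup analysis follows. The cut points, scores, and group labels are all hypothetical assumptions for illustration; the point is simply to group students by prior achievement and compare average growth across the groups:

```python
# Hypothetical classroom data: (prior_score, current_score) per student.
# Assumption: cut points of 410 and 440 on the prior-year score define
# the low / medium / high performing subgroups.
students = [(380, 415), (400, 430), (420, 440),
            (430, 445), (460, 465), (470, 472)]

def band(prior):
    """Assign a student to a subgroup based on prior-year score."""
    return "low" if prior < 410 else "medium" if prior < 440 else "high"

# Collect each subgroup's growth (current score minus prior score).
growth = {}
for prior, current in students:
    growth.setdefault(band(prior), []).append(current - prior)

for b in ("low", "medium", "high"):
    print(b, sum(growth[b]) / len(growth[b]))
```

In this invented data, low-performing students grow far more than high-performing ones, a pattern that might prompt a teacher to add more challenge and extension work for students who are already proficient.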
Resources on Value-Added Analysis
Comparisons Among Various Educational Assessment Value-Added Models
William L. Sanders, Presented at "The Power of Two" National Value-Added Conference, October 16, 2006
This paper discusses the differences between value-added models as well as their advantages and disadvantages.
FORUM: "Accountability Gains: Are we measuring achievement gains accurately enough?"
Education Next (2002) No. 2
This forum includes four articles by Dale Ballou, Anita A. Summers, Jay P. Greene and Donald R. McAdams discussing the pros and cons of value-added measurement of student achievement.
Research Brief: The Promise and Peril of Using Value-Added Modeling to Measure Teacher Effectiveness
RAND Corporation (2004)
This brief summarizes the findings of a longer research report, "Evaluating Value-Added Models for Teacher Accountability," by Daniel F. McCaffrey, Daniel M. Koretz, J.R. Lockwood and Laura S. Hamilton (2004), which compares several value-added models and discusses the strengths and weaknesses of using such methodology for both diagnostic and accountability purposes.
Roundtable Discussion on Value-Added Analysis of Student Achievement: A Summary of Findings
The Working Group on Teacher Quality (2007)
Summarized from a roundtable discussion held in October 2007 among policymakers, researchers and practitioners, this document presents major themes, findings and lessons learned in value-added analysis of student achievement. The purpose of the discussion was to create a broader understanding of how value-added analysis of student achievement can be used as an indicator of teacher effectiveness and the implications this has for policy and practice.
Using Student Progress to Evaluate Teachers: A Primer on Value-Added Models
Henry I. Braun (September 2005)
This policy perspective provides reader-friendly information on the more technical issues associated with value-added modeling.
Value-Added Modeling: The Challenge of Measuring Educational Outcomes
Barbara Elizabeth Stewart (2006)
This article provides a summary of the history, definition, strengths and weaknesses of value-added modeling.
Battelle for Kids
Battelle for Kids is a non-profit organization based in Ohio whose activities include helping districts and schools use value-added data to improve instruction.
Houston Independent School District's Accelerating Student Progress, Increasing Results & Expectations (ASPIRE) Program
This website provides information on the value-added model used in Houston Independent School District's ASPIRE program. The site includes a guide to value-added for parents and families.