Ten Reasons Value-Added Measures Are a Bad Idea

Districts across the country are adding value-added measures (VAM) to their teacher evaluations due to state and federal pressure to "hold teachers accountable." At first glance, it makes sense: the best way to measure a teacher's effectiveness is to look at what students are learning. However, like New Coke and Crystal Pepsi, VAM is a really bad idea that sounds great to the people at the top. Here are ten reasons why:

  1. Loss of Leadership: When VAM scores become the dominant metric, administrators lose the opportunity to evaluate teachers and offer guidance on curricular decisions. After all, their input often makes up only a small percentage of the new evaluations.
  2. The Immeasurable: So much of what teachers do is observable but not measurable: guiding paradigm shifts, helping with conceptual development, fostering creativity, and pushing critical thinking. VAM scores fail to capture these immeasurable aspects of teaching.
  3. Bad Competition: VAM pits teachers against one another, because demonstrated growth is ultimately what leads to contract renewal. For all the talk of collaboration and shared leadership, what incentive do teachers have to work together if the bottom line is a VAM score?
  4. Cheating: When the bottom line is a test score and nothing else, teachers face pressure to cheat. That's the nature of hyper-competitive environments. Think Enron, Wall Street, juiced-up athletes, or the New England Patriots.
  5. Bad Metrics: Multiple-choice tests are among the worst methods for assessing conceptual development, connective thinking, critical thinking, analysis, and performance standards. And yet, because of the need to be objective and efficient, multiple-choice tests are almost always used in VAM scores.
  6. The Human Factor: There are things that VAM scores never consider. For example, if a teacher leaves for two months on medical leave, there is almost never a policy that accounts for the shift in achievement that might result.
  7. Transience: VAM scores assume that students remain in the same location with the same teachers throughout the year. Often, that's not the case. For example, I have lost thirteen students and gained nine over the course of this year. I don't teach the same group I taught during first quarter. Do we count the new kids, even if the teacher wasn't with them to "add value"? Or do we leave them out, giving teachers an incentive to focus on the students who have been there all year while neglecting the newer ones?
  8. Wrong Focus: VAM scores confuse student learning with student achievement. Teaching to the test is already an issue in many schools; VAM puts it on steroids, because it becomes too risky to trust that project-based, constructivist, authentic learning will transfer to high test scores.
  9. Compacted Curriculum: In many cases, VAM is based upon a pre- and post-assessment. Because scores need to be calculated before contracts are issued, many schools (including my district) have moved post-assessments to the third quarter. So an already crowded, test-heavy curriculum becomes even more rushed than before.
  10. Job Placement: I know teachers who have already decided to abandon gifted and honors classes, because VAM is all about growth, and students who start near the top have less room to demonstrate it. An evaluation method shouldn't factor into a teacher's choice of job placement.