By Steve Hansen
QCS Managing Editor
The Tucumcari Education Association has added its voice to a growing outcry among educators against the use of “value-added models” for teacher evaluations.
Locally, the teachers have administrators on their side.
The value-added model is a complex statistical formula that, based on standardized test scores, produces a score used to evaluate teacher effectiveness.
The formula is so complex and apparently ambiguous that even the upper-echelon scientists at Los Alamos National Laboratory have a hard time figuring out how it works, according to Christina Fleming, the president of the education association. Fleming aired her protest before the Tucumcari Schools Board of Education on Monday night.
Tucumcari Superintendent Aaron McKinney also registered an objection.
“I don’t want to evaluate teachers on a standard that I don’t understand myself,” he said. “The teachers are upset and I don’t blame them.”
McKinney said he is confused by findings that rate one school a C, or average, even though most of its teachers are rated effective, while rating another school a B, or above average, even though its teachers overall are rated less effective.
The New Mexico Public Education Department, however, has adopted the value-added model, as have many other states, Fleming pointed out.
One of those states is Texas, she said, where the Houston Federation of Teachers, a teachers’ union, has filed a federal lawsuit over the use of the value-added model, arguing that the formula is so opaque its results cannot be effectively challenged, denying teachers due process, according to the federation’s website.
The American Statistical Association recently evaluated the value-added method and found it lacking, according to a Washington Post blog that Fleming cited in her presentation.
According to the Post, among the statistical association’s objections were:
• “VAMs (value-added models) are generally based on standardized test scores and do not directly measure potential teacher contributions toward other student outcomes.”
• “VAMs typically measure correlation, not causation: Effects — positive or negative — attributed to a teacher may actually be caused by other factors that are not captured in the model.”
Economists for the National Bureau of Economic Research, however, say that higher value-added scores for teachers lead to more economic success for their students later in life, according to the Washington Post blog. That work, the blog said, has been challenged by others, including the National Education Policy Center at the University of Colorado Boulder.
Fleming also pointed to a Santa Fe New Mexican news article in which state Public Education Department secretary Hanna Skandera acknowledged that Santa Fe teachers’ evaluation scores had been unfairly lowered because student surveys, which had not been submitted before the evaluations were calculated, were left out.
Using her own evaluation as an example, Fleming demonstrated the effect of eliminating student evaluations. With the student evaluations figured in, she scored 112, rated minimally effective; without them, she scored 102, which, she pointed out, is only 10 points above the minimum acceptable level.
Fleming, however, has consistently been rated highly under other teacher evaluation methods, she and others said.
Fleming said her ratings are also unfairly affected by the fact that her chemistry I class is taken by all students, not just those interested in science, which means her effectiveness is unfairly determined by overall test scores from the entire student body.
Her advanced chemistry class, however, though populated by more academically oriented students, is generally taken by students who are not subject to the standardized tests, which keeps the test scores on which she is evaluated artificially low.