
Queens skeptical about teacher evaluations

Photo by Christina Santucci
By Rich Bockmann

Queens stakeholders had mixed reactions to the publication of the city Department of Education’s controversial teacher data reports and raised questions about the impact making such information public would have on the borough’s schools.

The DOE compiled the reports over the last three academic years, assigning a numeric score to rate the effectiveness of nearly 18,000 fourth- through eighth-grade teachers across the city.

The reports were designed to measure how much a teacher helped his or her students improve over the course of a year based on state English and math test scores, expressed as a percentile score from 0 to 99.

They were intended to be used internally, along with other measures, to help identify teachers’ strengths and weaknesses. But in 2010 several media outlets sued to have the reports released under the state Freedom of Information Law, and earlier this month the state’s courts ruled the DOE was required to make them available to the public.

Critics said the reports are flawed and paint an inaccurate picture of teacher performance.

But Principal Anthony Lombardi of PS 49 in Middle Village said he used the reports, though he relied mostly on his own observations when evaluating his teachers. He said that if the data sample were expanded and the margin of error greatly reduced, the reports could be a useful tool.

“Overall, I’m in favor of transparency in schools. I’d be more comfortable if I knew the info was completely accurate, without such a disparity in terms of margin of error,” Lombardi said. “I think there is some potential in the use of data, but I think many things have to be worked out and we need a few more years under our belt.”

During a two-year-long legal process, critics of the reports, including the United Federation of Teachers, cited faults such as flawed tests, bureaucratic errors and a large margin of error — in some cases as much as 54 percentile points.

James Eterno, a history teacher and UFT chapter leader at Jamaica High School, said he would be critical of any teacher evaluation method devised by education policymakers whom he said “know nothing about education.”

“Arne Duncan [U.S. education secretary] right on down to Andrew Cuomo, John King [state education commissioner], Mayor Bloomberg … most of them have no experience in education,” he said. “Cuomo, you could fit what he knows about education onto the head of a pin.”

“It’s very difficult to have people come into a school and try to quantify it like it’s some kind of science. Teaching is an art, not a science,” he continued.

On Saturday, several news organizations published the reports. The day before, city Schools Chancellor Dennis Walcott expressed his concern that the public would use them to judge individual teachers.

“Let me be clear on where I stand: The data is now two years old, and it would be irresponsible for anyone to use this information to render judgments about individual teachers,” he said. “Teacher Data Reports were created primarily as a tool to help teachers improve and not to be used in isolation.”

Walcott stressed the reports did not tell the full story about a teacher’s performance.

“They provide one important perspective on how well teachers were doing their most important job — helping students learn — using a method called ‘value-added’ that has been found to predict a teacher’s future success better than any other technique,” he said.

This method considers where a student is academically at the beginning of the year and takes into account factors outside the teacher’s control — such as poverty level and English-language learner status — to set a standard by which to judge a teacher’s impact.
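In rough terms, a value-added model predicts each student’s year-end score from prior performance and background factors, compares that prediction with the actual result, and ranks teachers by their students’ average “excess” gain. The sketch below is a hypothetical, heavily simplified illustration of that idea in Python, not the DOE’s actual formula; the toy data, coefficients and field names are assumptions made only for the example.

# Hypothetical, simplified value-added sketch -- NOT the DOE's actual model.
# Idea: predict each student's year-end score from prior score and background
# factors, then rank teachers by their students' average excess gain.
from statistics import mean

# Assumed toy data: (teacher, prior_score, poverty_flag, ell_flag, actual_score)
students = [
    ("Teacher A", 650, 1, 0, 672), ("Teacher A", 700, 0, 0, 705),
    ("Teacher B", 640, 1, 1, 641), ("Teacher B", 690, 0, 0, 688),
    ("Teacher C", 660, 0, 1, 690), ("Teacher C", 710, 1, 0, 730),
]

def predicted(prior, poverty, ell):
    # Made-up coefficients standing in for a fitted model that controls
    # for factors outside the teacher's control (poverty, ELL status, etc.).
    return prior + 10 - 5 * poverty - 5 * ell

# Average residual (actual minus predicted) per teacher = raw value-added.
by_teacher = {}
for teacher, prior, pov, ell, actual in students:
    by_teacher.setdefault(teacher, []).append(actual - predicted(prior, pov, ell))
value_added = {t: mean(gains) for t, gains in by_teacher.items()}

# Convert to 0-99 percentile ranks, mirroring the reports' scoring scale.
ranked = sorted(value_added, key=value_added.get)
for i, teacher in enumerate(ranked):
    pct = round(99 * i / (len(ranked) - 1)) if len(ranked) > 1 else 50
    print(f"{teacher}: value-added {value_added[teacher]:+.1f}, percentile {pct}")

In practice the DOE’s model was far more elaborate, which is the complexity critics such as Stavisky pointed to below.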

State Sen. Toby Stavisky (D-Whitestone), a member of the state Senate Education Committee, said she did not think the complex formula used to measure a teacher could account for so many student variables.

“I don’t think the offsets are enough,” she said. “I’ve had math through calculus and I saw the equation and it made no sense to me at all.”

Jeanette Segal, president of the District 26 Community Education Council in northeast Queens, said she found a number of scores to be incongruent with the effectiveness of teachers she knows personally.

“There are a couple of teachers I have personal contact with who scored low and I’m dumbfounded as to how that could happen,” she said. “I know of one teacher in particular who I think is amazing and the fact that her score was near the bottom is shocking to me. It raises a red flag.”

Segal said she was concerned about what kind of reaction the reports’ publication would create among parents.

“My concern is the panic principals will be facing next September. Parents are saying they’re already requesting their child’s teacher get changed,” she said. “Everyone is going to want that ‘A’ teacher — that 99 — but that can’t happen.”

Bayside mother Kathy Relly said she was opposed to rating teachers at all and was sickened by the media’s decision to publish the reports.

“I think it’s sad because these teachers are really hard workers,” she said.

Segal agreed that she did not think it was wise to publicize teacher evaluations.

“Teachers should be evaluated. There’s not a job out there where you don’t get an evaluation,” she said. “I don’t agree with [publishing them]. What are you really going to do with that information?”

The 2009-10 school year was the last one for which the DOE compiled teacher data reports.

The DOE started using the data in 2008 as one factor in tenure decisions and will continue to use similar data provided by the state in the future.

A department spokesman said that since the DOE and the UFT had not yet come to a final agreement on a teacher-evaluation system, he could not comment on whether or not that information would be made public either voluntarily or by FOIL request.

Reach reporter Rich Bockmann by e-mail at rbockmann@cnglocal.com or by phone at 718-260-4574.