Teacher performance data and its discontents
The Los Angeles Unified School District has kicked the hornet’s nest of teaching quality assurance by proposing to publish a list of six thousand teachers’ students’ gains and losses on a statewide test in English and math. The exercise is consequential because it turns out performance is not random but, for many teachers, strongly correlated from year to year: some see their students’ relative scores go up year after year, and some teachers’ students do worse, again year after year. For the moment, I’m carefully avoiding language like “raise their students’ scores” and “good teachers and bad teachers.” Jonathan Zasloff has the links in his post here, along with a well-deserved thumb in the eye of the LA teachers union president.
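To make “strongly correlated from year to year” concrete, here is a toy simulation of my own, not the value-added methodology behind the LAUSD numbers and not drawn from the actual data: give each imaginary teacher a persistent effect plus classroom-level noise, then correlate the observed mean score gains across two years.

    # Toy sketch (invented numbers, not the published value-added model): 500
    # imaginary teachers, each with a stable "true effect" plus classroom noise.
    # How strongly do observed mean gains in consecutive years correlate?
    import numpy as np

    rng = np.random.default_rng(0)
    n_teachers = 500
    true_effect = rng.normal(0, 5, size=n_teachers)             # persistent part of the gain
    gain_y1 = true_effect + rng.normal(0, 5, size=n_teachers)   # observed mean gain, year 1
    gain_y2 = true_effect + rng.normal(0, 5, size=n_teachers)   # observed mean gain, year 2

    r = np.corrcoef(gain_y1, gain_y2)[0, 1]
    print(f"year-to-year correlation of observed teacher gains: r = {r:.2f}")
    # With signal and noise of equal size, r comes out near 0.5: far from random,
    # but also far from a noiseless ranking of teachers.

The only point of the sketch is that a persistent teacher component shows up as a year-to-year correlation well above zero, which is apparently what the LAUSD numbers show.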
KQED’s Forum had an hour on this today, and started it off on the wrong foot with the title: Evaluating Teachers. This common shorthand is an instant source of mischief: of course no-one has the right to evaluate another human being, and what’s meant (I hope) is evaluating teacher performance, but even that version is off the rails. Evaluating teacher performance is almost entirely sideways to what we want, which is improving student learning. Still, data like the LAUSD files can be a start in the right direction.
I’m having ambivalence overload even thinking about this story. On the one hand, measuring performance is desperately important for improving quality in any service or production process, and it appears the LAUSD has an enormously useful resource here. On the other, it’s so easy to measure it wrong and do the wrong thing with the measurements. Measuring teacher performance is especially difficult because what we really care about, which is contribution to the lifetime performance of students (productivity, happiness, income, menschlichkeit, and more), happens long after the teaching; because different students click with different [kinds of] teachers; because different teachers are provided different resources, especially including differently supportive parents and student peer sociology; and because the world is just noisy and full of random stuff. So, on the one hand, the LAUSD data looks at a narrow measure of value acquired by students (fairly stupid statewide short-answer tests in two subjects), but on the other hand, there appear to be stable, significant effects of individual teachers.

Now what? The obvious answer is, fire the teachers in the bottom third, and give the ones at the top a nice raise. There are certainly a few LAUSD teachers who should be fired, but like so many obvious things, this reaction is almost completely wrong (it does respond to several of our worst instincts, including a desire that things be simple and a wish to punish). In the first place, of course, every set of measurements has a bottom third; indeed, though it will shock you to learn this, fully half my wonderful students are below average, no matter how much I shame and ridicule them when every other student scores below the median on a midterm again. I taught an honors course once and only admitted the top half, and would you believe it, one out of two of those stars slacked off and dropped into the bottom half during the semester!
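Since so much of the argument rides on that honors-course anecdote, here is a minimal sketch of the statistics behind it, again a made-up simulation rather than anything from the LAUSD files: hold each student’s ability fixed, add measurement noise to two exams, and count how many first-exam stars land below the median the second time.

    # Regression to the mean in miniature: nothing about the students changes
    # between exams, yet noisy measurement pushes many "top half" scorers below
    # the median next time. All parameters are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    ability = rng.normal(0, 1, size=n)           # stable part of performance
    exam1 = ability + rng.normal(0, 1, size=n)   # first measurement: signal + noise
    exam2 = ability + rng.normal(0, 1, size=n)   # second measurement, same students

    top_half = exam1 >= np.median(exam1)             # "admit" the exam-1 top half
    dropped = exam2[top_half] < np.median(exam2)     # ...who fall below the exam-2 median
    print(f"exam-1 stars below the exam-2 median: {dropped.mean():.0%}")
    # With noise as large as the signal, roughly a third of the "stars" drop
    # purely by chance, no slacking off required.

None of which says the LAUSD effects are pure noise (the year-to-year stability argues otherwise); it only says any single year’s ranking should be read with that one-third in mind.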
More important, no organization has ever fired its way to success; 50% of new teachers in urban school districts already leave in the first three years, and we see how well that’s working for us. (That fact, along with a good bit of the thinking in this post, is courtesy of my colleague Alan Schoenfeld, an actual education professor who was nice enough to hip me to a lot of interesting background on this issue.)
What teachers need, and don’t have, is a dispiritingly long list of resources. The first, and most important, is each other: teaching is probably the most isolating and isolated profession this side of pathology. Teachers never see each other work, almost never get to talk to each other about individual students, and have practically no opportunity for the core practice of quality assurance, which is observing and then discussing a particular practice, comparing alternatives, in a group of peers. Public school teachers in California also lack a long list of pretty basic stuff, from decent, clean, maintained buildings to copy paper, not to mention supportive staff and competent leadership at the school and district level, and too often, parents who are on their side and at the kids’ side.
What the LA performance data does is highlight a batch of teachers at the top of the distribution whose classrooms need to be visited by their peers, perhaps by videotape, and discussed. The point is not that everyone should be completely focused on increasing these test scores, but that a successful record at that measurable result is a good (not perfect) indicator of teaching practices that, if observed and discussed, will lead to better outcomes for students on a variety of dimensions.
It also highlights a batch at the bottom, almost all of whom (not all; some cases are hopeless) need to have their attention focused on what they are doing by habit or instinct that isn’t working, and to be shown (not just told) some alternatives. Not one of them wants to be a bad teacher! There may be a couple with values so weird that they know how to teach effectively but deliberately hold back unless they are paid more to deliver; making policy for bizarre cases, if they exist at all, is absurd.
All this warm and fuzzy collaboration is expensive (an hour in a quality circle is an hour not in class and a cost for a substitute), a real challenge for administrators to schedule, and more work for managers. It’s a safe bet that real quality assurance will pay off in reduced costs, but not instantly, and in any case a lot of people don’t believe it. It’s possible to waste enormous sums in a failing school district without achieving much learning, breaking a lot of hearts and spirits along the way, but in California we’ve spent a couple of decades feeding the horse one less straw a day, waiting for him to learn to live on nothing, and it isn’t working for us. In the end, improving school performance depends on being willing to invest (not spend; invest!) what it takes to get it right, and on management willing to do the heavy retail lifting rather than simpleminded stick-and-carrot tricks. The LA Times enterprise may cut either way; it might just further infuriate and demoralize the workforce, but if it’s handled right, it could be a place to step off in a useful direction.
Cross-posted from blog site The Reality-Based Community.