Using Student Learning in Teacher Assessment
A strategy for building a strong collegial culture while taking student learning into account in assessing teachers.
In the early 2000s, education experts sold the federal government and many schools on a logical-sounding way to hold teachers accountable: use test scores, value-added measures (VAM), and student learning objectives (SLOs) as a big part of teacher evaluation. Teachers pushed back, with good reason: Using test scores and VAM to judge individual educators has major design flaws, among them that this year’s A teacher can be next year’s F teacher because of random variations that have nothing to do with teaching quality.
That’s why the American Statistical Association, the American Educational Research Association, and numerous researchers have cautioned against high-stakes use of such measures. What’s more, VAM data are just numbers—they don’t give teachers and principals detailed information that can be used to appreciate and improve what’s going on in classrooms.
The result of all this: a contentious labor-management dynamic and few gains in student achievement.
The good news is that the 2015 Every Student Succeeds Act opens the door for states to implement more-effective policies. But what does a better system look like? How can we get past seeing students as numbers on a spreadsheet and get teachers and school leaders working as partners in pursuit of growth and success for all students?
Two questions remain essential: Are students learning? And how will educators respond when some students aren’t successful? We believe it’s possible to answer both without using test scores. More and more schools are dialing back the pressure and using lower-key measures of student learning throughout the school year.
Here’s what this approach looks like: Administrators make frequent short, unannounced classroom visits (at least once a month), followed promptly by face-to-face listening/coaching conversations; teacher teams meet regularly to discuss planning, pedagogy, and assessment results; and teacher assessment is saved until the end of the school year, pulling together observations, other points of contact, and teachers’ self-assessments.
Can student learning be part of teacher evaluation? Here are five ways it can, each with powerful potential for building trust and improving teaching, leadership, and learning.
How Principals Can Incorporate Student Learning in Teacher Assessment
1. Frequent classroom visits: During informal classroom visits, principals look over students’ shoulders or sit down next to them and ask questions like, “What are you learning today?” and “How will you know when you’ve succeeded?” Insights from these chats inform the teacher-principal conversations that follow.
2. Post-visit check-ins: In post-observation conversations—ideally in teachers’ classrooms when students aren’t there—teachers and administrators look together at student work, exit tickets, and other assessments. They then discuss how students are understanding the content, celebrate what’s working, and talk about possible tweaks to instruction.
3. Curriculum planning meetings: Principals drop in on teachers’ unit planning meetings and make suggestions on ways to check for understanding during lessons, in unit tests, and in performance tasks requiring deeper engagement. Without high-quality assessments, analysis of student learning will be unproductive.
4. Collaborative data meetings: When teacher teams sit down to discuss the results of common assessments, principals join in and help make these meetings an engine for improvement. Many teachers complain that their professional-learning community meetings are not productive, which means their schools are squandering time and money looking at data in a process that’s neither substantive nor satisfying.
The discussion shouldn’t be only, “What do these assessment results say?” but also, “What are we going to do about it?” Many collaborative teams look at data. Truly effective teams consider not only the data but the teaching practices that led to those results. In the best meetings, you hear questions like, “Your students did better on this item than mine—what did you do?”
5. Goal setting: Same-grade/same-subject teacher teams set goals and at the end of the year report to the principal on their students’ progress.
Here’s an example: At the beginning of the year, the members of a second-grade team do a baseline assessment of each student’s reading level and set a specific goal for student growth—perhaps 100 percent of students reading at least on grade level by June.
Teachers seek out the best methods and materials and implement them through the year, with special attention on students who need to accelerate their growth. They track student progress, share ideas, and fine-tune instruction, observed and coached by colleagues and the principal.
At year’s end, teachers do another formal reading assessment and report all their students’ progress to the principal: “Ninety-one percent of our kids are at or above grade level!” The principal makes a notation of the team’s collective accomplishments in each teacher’s individual performance evaluation.
Transforming Accountability
This last item is the basic idea behind SLOs but done at the team level with low-stakes, school-based accountability. Because teams report before-and-after data within the same school year with the same teachers, there’s a much better chance that they will set ambitious goals, use rigorous measures they respect, care about the results, use during-the-year data to improve instruction, spur each other on (especially team members who don’t seem to be pulling their weight), and at the end of the year take real pride and satisfaction in their collective gains in student learning.
This fundamentally transforms accountability from a threatening and mysterious process into a credible reflection of the impact of teachers on their kids. So yes, student learning can be a legitimate element in teacher supervision and evaluation, as long as we avoid the mistakes of the VAM era.