And Then There Was Roster Verification
I received an email last night with a 30-page document attached showing the recommendations to the Teacher and Leader Effectiveness (TLE) Commission from the various working groups that developed them: the Value Added/Student Growth Measures for Teachers of Non-Tested Grades/Subjects and Teachers Without a Teaching Assignment. In all, this document contains suggestions for how to quantify the effectiveness of 18 different classifications of certified teachers. These include:
| | |
|:---|:---|
|Early Childhood/PreK|Elementary (1-6) Non-Tested|
|English Language Learners|Fine Arts|
|Library Media Specialists|Nurses|
|Physical Education|Reading Specialists/Response to Intervention|
|School Psychologists|Secondary: Non-Tested Subjects|
|Special Education|Speech Language Pathologists|
I am repulsed by the idea that we have to come up with some sort of a quantitative measure to evaluate some of these groups of teachers (and nurses, really!?!), but I decided to play along and read through the sections. What I found were some drastically disparate ways to calculate teacher effectiveness. All include some level of new training for both the teachers and the principals who would evaluate them. And most ask for more time to come up with a workable plan.
The different recommendations include some similar language that we rarely use when talking about education reform initiatives. We are going to have to learn these terms the same way our students learn academic vocabulary. I have already discussed Value-Added Measures (VAM) on this blog; I am not a fan. I do not believe that an agency incapable of developing a statistically sound report card can develop VAM in a way that is fair to teachers. I'm not convinced that it is achievable in the first place.
Several of the proposals also call for some kind of a matrix, portfolio, or rubric to assess teachers. Principals would have to become familiar with all of these instruments. They also call at various times for different weights on the quantitative pieces of the evaluation. Imagine keeping track of all of that!
The newest term for educators, however, is Roster Verification. The only group that mentioned this process in the report to the Commission was Special Education. In an email to superintendents and principals this week, here’s how the SDE described Roster Verification:
> **Roster Verification – Voluntary, yet Valuable!**
>
> The Oklahoma State Department of Education is offering Roster Verification as a service to school districts this spring. The OSDE will be completing value added analysis for all teachers of TESTED grades and subjects after testing occurs this spring. Value added analysis will be used for INFORMATIONAL purposes so that teachers and administrators have the opportunity to learn about the process and can use data to inform instructional practices during the 2013-2014 school year. This is a NO STAKES process meaning NONE of the value added calculations will be used in evaluations.
>
> Because there are so many different teaching scenarios that occur throughout the year, Roster Verification allows teachers to account for who they taught, for which months during the year, and for what percentage of the instructional time. For example, when I taught 5th grade, we were departmentalized. I was responsible for MANY students' instruction in mathematics and science, but my team member was responsible for their instruction in reading and social studies. Without completing Roster Verification, my value added analysis would be based on my HOMEROOM roster (unless someone uploaded that information differently into the Wave.) As a teacher, I would want to be held accountable for the growth of the students I instructed in math and science, but I would want my partner to be responsible for their growth in reading and social studies. Roster Verification gives teachers the ability to account for such scenarios, therefore value added analysis reports are much more accurate for teachers who were able to complete the Roster Verification process.
The SDE provides even more detailed information on this flyer, which includes training dates, a shout-out to the Gates Foundation for funding, and a picture that would lead you to believe this is about children. One line even promises that Roster Verification will provide "much more accurate value added reports which will be extremely useful as a professional growth tool."
This is not remotely about professional growth. This is about continuing down the path of assigning blame and trying to find a mathematical formula for doing so. In ten years, we will be able to look at the students of two second-grade teachers and see which ones are better prepared for college. We will be able to assign partial credit or blame for that success, or lack thereof, to all the teachers those students have ever had. Over time, we'll have all kinds of data pointing back to that second-grade classroom.
Think back to that bizarre oak tree analogy. The disembodied voice in the video tells us that countless factors go into two farmers raising their oak trees. It also tells us that some of those factors are out of the control of the farmers. (By the way, I still don't know any oak tree farmers.) The argument is that once we remove all of those factors, we can tell which farmer is adding more value. Extending the analogy, we can remove the factors outside the control of teachers, look at the results, and ascertain value-added that way as well. This too makes me uncomfortable.
The factors out of a teacher’s control are too many to count. We will be assigning value-added sometimes ten or twelve years after the fact. Would you want part of your evaluation during your eleventh year in the profession to be based on something you did your first year?
Many districts have chosen not to participate in Roster Verification at this time. Others, for the sake of curiosity, are joining in the trial run. I understand both positions. While I can’t think of one school administrator who wants to see this happen, many want to see what the process looks like since it will happen eventually anyway.
The recommendations to the TLE Commission are non-binding. Commission members can act to accept, revise, or reject the proposals at a later date. Meanwhile, the SDE is wisely pushing for more time to implement the quantitative piece of the evaluation system. While they would be even wiser to scrap it altogether, that won't happen. Too much money, taxpayer and corporate alike, is invested at this point. The agency is philosophically entrenched in this process.