Reason #10 to Pick a New State Superintendent: Ignoring Researchers
We have entered the Top 10! I could introduce the list like Letterman and say that it was sent in from the home office in…wait, where’s the home office again? Not important.
Any of the issues covered in the previous ten posts would make for a decent stand-alone cause to fire Janet Barresi. Imagine a school superintendent hiring people illegally. Or burning time and cash on a testing program only to scrap it while it’s still in development. Or insulting the workforce time and time again. The only reason blogs like this exist is that under Janet Barresi’s watch, head-scratching decisions and mistakes are the norm. They have ceased to surprise us.
With that, let’s review where we’ve been and get going with the next one.
#14 – Value-added Measurements
#13 – Being Damned
#12 – Holding Back State Aid
#10 – Ignoring Researchers
In January 2013, researchers from OU and OSU released a study that was critical of Oklahoma’s A-F Report Cards (Version One). Their concerns were similar to those raised in other states that have run this play from the Florida playbook.
> Accountability systems are only useful if their measures are credible and clear. Despite good intentions, the features of the Oklahoma A-F grading system produce school letter grades that are neither clear nor comparable; their lack of clarity invites unjustified decisions about schools. Further, A-F grades are not productive for school improvement because they do not explain the how or why of low performance. Building on what has already been done, Oklahoma can and should move toward a more trustworthy and fair assessment system for holding schools accountable and embracing continuous, incremental improvement.
Among their findings were several statistical and academic complaints about the system:
- Scores assigned “do not seem to correspond to any recognizable metric.”
- The use of proficiency levels “introduces grouping error.”
- There is “unclear conceptual meaning of the index” for student growth.
- Whole school performance grades are skewed by “overreliance on attendance and graduation rates.”
The researchers went on to put their concerns in more accessible language:
- By not making explicit the threats to the validity of report card grades, the OSDE misinforms the public about the credibility and utility of the A-F accountability system.
- Performance information from the current A-F Report Card has limited improvement value; particularly, it is not useful for diagnosing causes of performance variation.
- The summative aspects of the accountability system overshadow formative uses of assessment and performance.
- High stakes testing, as a cornerstone of school assessment and accountability, corrupts instructional delivery by focusing effort on learning that is easily measured.
It wasn’t just OU and OSU, by the way. An Oklahoma City University professor did a separate study showing that the report card results were largely tied to poverty. Apparently his multivariate regression analysis was too complicated for the Oklahoman, the SDE, and key legislators. They dismissed both studies as propaganda of the Education Establishment and insisted that teachers and administrators were just using poverty as an excuse.
To deny the impact of poverty on student learning would be like denying the impact of the Internet on newspaper circulation (sorry, Tulsa World – I love you, but it’s true). We know that with high-poverty student populations:
- More intensive basic instruction is often necessary;
- The fruits of teachers’ labor often leave for another school in the middle of the year;
- What works for one group of students may not work for another; and
- Past performance doesn’t always predict future results.
Either the legislature heard the criticism and it resonated, or they were frustrated with how the SDE handled the formula development in 2012. During the 2013 legislative session, they wrote new, simpler rules for the A-F Report Cards. The report cards would be easier to understand (though obviously not easier to calculate, as I will discuss in my #9 post).
When the SDE was set to roll out the report cards (Version Two) last fall, the researchers from OU and OSU released another study showing even more statistical flaws. Simple wasn’t better. Poverty still mattered more than all other considerations put together.
In her typical manner, Superintendent Barresi dismissed the findings, famously using air quotes around the word “researchers.”
This act of anti-intellectualism shows that she will say anything to appease the base (the comment was made during a candidate forum). She later hired a researcher from Harvard, made her one of the highest-paid people at the SDE, and had her issue her own findings about the OU/OSU report. SDE spokesperson Phil Bacharach (whom the Lost Ogle discusses on their site today) said the agency wasn’t “casting aspersions” on the OU/OSU report; they were just saying it was completely wrong.
Under Barresi, the SDE dismisses researchers whose findings don’t fit its agenda and expects the public to accept its own data crunching instead. When we don’t, we’re beholden to the status quo. Every piece of Barresi’s reform agenda has one overarching goal – discrediting the work of public schools. When empirical evidence contradicts that agenda, she simply sticks out her tongue and blows raspberries at us.