Measuring education system performance
CALGARY, AB, Mar. 13, 2012, Troy Media/ – It’s that time of year when, across North America, we begin reporting on the performance of the educational system. In Canada, the Fraser Institute’s rankings of schools were released recently to much fanfare in the media.
This hasn’t gone over well with everyone. In Alberta, the Minister of Education is considering prohibiting publication of Provincial Achievement Test (PAT) results upon which the Fraser Institute rankings are based.
A very bad idea
That’s a bad idea. Transparency demands that PAT results be made available to the public. In defence of free speech, we tolerate the expression of offensive ideas. Similarly, transparency demands that we tolerate what is, in my opinion, statistical nonsense, even as it parades itself as objective analysis of school performance.
The Fraser Institute rankings use statistical and scientific terms and methods to give the appearance of scientific rigor where there is none – what famed American physicist Richard Feynman called ‘cargo-cult science’. This tactic fends off effective criticism from reporters and educational administrators who are not familiar enough with statistical methods to challenge it. The result is typically weak-kneed, off-target critiques about the incompleteness of the measures.
The real problem with the Fraser Institute rankings is not that the measures are incomplete. It’s that statistical science and methods are so misused and mangled that the results are nothing more than a statistical fairy tale. It doesn’t matter whether this is a function of intent, ignorance of statistical methods, or some other factor; the end result is the same – a pile of hopelessly corrupted and worthless information that only misleads those naive enough to believe it.
Some of the technical problems with the Fraser Institute’s rankings, concerning hospitals but equally applicable to schools, are outlined in a warning we issued to clients and available on our website at http://www.converge-group.net/386/.
Technical issues concerning statistical analysis tend to be dry, but one of possible interest to general readers is the nature of ranking. Darrell Huff wrote about it in his classic book, How to Lie With Statistics. Ranking is an old ‘statistical trick’ used to make small, meaningless differences look big and important. If your research discovers nothing, ranking makes it look as though it did. (This is also the favoured technique of those selling employee surveys with rankings of ‘the best companies to work for’.) A simple, practical self-defence heuristic for consumers of reports, studies, news stories, or advertising: never trust a ranking.
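For readers who want to see the trick mechanically, a few lines of Python illustrate it. The school names and scores below are entirely hypothetical (not Fraser Institute data); the point is only that rank ordering manufactures a dramatic 1-to-N hierarchy out of statistically trivial differences:

```python
# Hypothetical scores: the four schools differ by a trivial 0.3 points in total.
scores = {
    "School A": 78.4,
    "School B": 78.3,
    "School C": 78.2,
    "School D": 78.1,
}

# Sort highest-to-lowest and assign ranks 1..N.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (school, score) in enumerate(ranked, start=1):
    print(f"#{rank}: {school} ({score})")

# The ranking brands School A a "top performer" and School D a "laggard",
# yet the entire spread from best to worst is a fraction of a point.
spread = max(scores.values()) - min(scores.values())
print(f"Total spread: {spread:.1f} points")
```

Nothing in the ranking itself tells the reader whether the gaps between positions are meaningful or noise – which is precisely why it makes nothing look like something.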
This much-ado-about-nothing strategy is applied with greater sophistication in the United States through ‘value-added assessments’ of individual teachers rather than schools. For example, New York teacher Stacey Isaacson was reportedly denied tenure because her proficiency score of 3.63 was 0.06 points shy of the 3.69 score a computer model predicted it should have been. Are you kidding me? No one of scientific competence would base decisions on such trivial and meaningless results. Certainly no one should be allowed to suffer because of them.
A gleeful media
As is the case in Canada, though, major newspapers in the United States have had a field day, gleefully publishing teacher rankings and waiting for the inevitable newsworthy chaos of recriminations and denials. Bill Gates, whose foundation is largely responsible for this institutionalized stupidity, apologized for the mess in an op-ed piece in The New York Times entitled ‘Shame Is Not the Solution.’
Neither, I would add, is selling statistical fairy tales. This cargo-cult science does the educational system no good, but it brings huge profits to computer analytics companies like SAS, which sell this statistical snake-oil to school boards across the country while ruining the educational system and people’s lives in the process. Think how much good the Bill and Melinda Gates Foundation could do if it actually gave money to helping education, rather than subsidizing Bill’s computer buddies’ systematic destruction of it.
Robert Gerst is a Partner in Charge of Operational Excellence and Research & Statistical Methods at Converge Consulting Group Inc. He is author of numerous peer-reviewed articles and of The Performance Improvement Toolkit: The Guide to Knowledge-Based Improvement.