On Thursday, June 18, the Fraser Institute released the Hospital Report Card Alberta 2009. The purpose of the Report Card, according to the report’s authors, is:
to help patients choose the best hospital for their inpatient care by providing them with information on the performance of acute-care hospitals in Alberta.
The Report Card is primarily concerned with assessing organizational performance. To measure this performance, the authors of the Report Card relied on data obtained from the Canadian Institute for Health Information’s (CIHI) Discharge Abstract Database (DAD). The indicators used were based on a framework developed by the Agency for Healthcare Research & Quality (AHRQ) in the United States.
The Report Card and the accompanying news release have made much of the fact that Alberta Health Services would not release the names of the hospitals reported upon, unlike in Ontario and British Columbia, where the Fraser Institute has released similar Report Cards. As such, the Alberta Report Card simply identifies Hospital 67 as the top-ranked hospital for the year.
The Fraser Institute Report: Cargo Cult Science
The Fraser Institute is to be commended for adhering to the principle of transparency, detailing the methodology and analytical techniques used in preparing Hospital Report Card Alberta 2009. Such transparency is all too often missing from reports in the health services sector and the public sector generally.
It is this transparency that permits us to review the methodology employed in the Report Card. Our conclusion is not favorable. In our opinion, Hospital Report Card Alberta 2009 represents little more than cargo cult science: the tools and techniques of science are used to lend credibility to a pseudo-scientific methodology. Our concern is that this feeble and flawed effort at measuring organizational (hospital) performance may be mistaken by our clients for an appropriate or serious approach to the challenge of assessing organizational performance.
The credibility comes from the Report Card’s use of the CIHI Discharge Abstract Database and the indicator framework provided by the AHRQ. What happens to that data afterwards is where the concern, and the pseudo-science, comes in.
The findings of the report are presented as a set of rankings based on a hospital mortality index. Top-ranked and bottom-ranked hospitals are highlighted (anonymously), as are municipality rankings. The use of rankings in this manner is completely inappropriate. It is a form of economisting, the act or process of converting limited evidence into grand claims (Edward Tufte, Beautiful Evidence): ranking results (including hospital mortality figures) in this way is a long-standing and well-understood technique for making irrelevant or trivial evidence look as though it supports important conclusions.
Ranking takes advantage of a simple truism: in any ranking of a data set (including a ranking of hospitals), roughly half of the observations will fall below the average and half above. This has absolutely nothing to do with hospital performance and everything to do with the definition of the average. A different version of this truism would state that in any ranking or sorting of data, some data points (hospitals) will be at the bottom and some at the top. Again, this has nothing to do with the performance of hospitals (or anything else) and everything to do with what it means to sort data.
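The truism can be demonstrated with a small simulation. The sketch below (entirely hypothetical data, not the Report Card’s) gives 50 fictional hospitals identical underlying mortality, adds random noise, and ranks them: a “best” and a “worst” hospital appear, and roughly half fall below the average, even though every hospital is, by construction, the same.

```python
# Hypothetical illustration: 50 "hospitals" with IDENTICAL true
# performance, differing only by random sampling noise.
import random

random.seed(1)
NUM_HOSPITALS = 50
TRUE_MORTALITY = 0.05  # every hospital has the same underlying rate

# Simulate observed mortality: the true rate plus noise.
observed = {
    f"Hospital {i}": TRUE_MORTALITY + random.gauss(0, 0.01)
    for i in range(1, NUM_HOSPITALS + 1)
}

# Sorting guarantees a "top-ranked" and a "bottom-ranked" hospital ...
ranked = sorted(observed, key=observed.get)
print("Top ranked:   ", ranked[0])
print("Bottom ranked:", ranked[-1])

# ... and roughly half sit below the average, by definition of the average.
mean = sum(observed.values()) / NUM_HOSPITALS
below = sum(1 for v in observed.values() if v < mean)
print(f"{below} of {NUM_HOSPITALS} hospitals are 'below average'")
```

The ranking here is pure noise, yet it produces exactly the kind of winners-and-losers table the Report Card presents.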
This then represents what the Hospital Report Card Alberta 2009 has managed to prove: if you sort Alberta hospitals from highest to lowest, some hospitals will be at the top and some will be at the bottom.
What Hospital Report Card Alberta 2009 does not do is provide any evidence that the performance of any Alberta hospital is better or worse than that of any other Alberta hospital. That is not to say there is no difference between hospitals; there may well be, and probably is. That evidence, however, is not to be found in the Report Card, despite claims to the contrary.
The misuse of scientific or statistical technique in developing the rankings is evident elsewhere in the Report Card. For example, in responding to an FAQ concerning whether the differences in the scores that produced the rankings are meaningful, the report states:
. . . we have compared each institution’s and each municipality’s risk-adjusted rate (per indicator) to the upper and lower bounds of a 95% confidence interval (CI). This additional analysis was performed to measure the statistical significance of each result. Those below the lower CI are statistically ‘better than average’ and those that are above the upper CI are ‘worse than average’
The use of confidence intervals in this way is a gross misrepresentation of significance testing. Statistical significance calculations are measures of detectability, not of meaningfulness or importance. Most statistically significant results are trivial. The use of the technique in the Report Card lends the appearance of scientific method without the substance. The conclusion that those below the lower CI are statistically ‘better than average’ is downright silly.
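The detectability point can be made concrete with a standard two-proportion z-test on invented numbers (none of these figures come from the Report Card). A 0.2-percentage-point difference in mortality, which few would call clinically important, is nowhere near significant with 1,000 patients per hospital, but becomes highly “significant” with 1,000,000: significance tracks sample size, not importance.

```python
# Hedged sketch: statistical significance measures detectability,
# not importance. All patient counts below are hypothetical.
import math

def two_proportion_z(deaths_a, n_a, deaths_b, n_b):
    """Two-proportion z-statistic using the pooled standard error."""
    p_a, p_b = deaths_a / n_a, deaths_b / n_b
    pooled = (deaths_a + deaths_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 5.0% vs 5.2% mortality: the same trivial difference at two sample sizes.
z_small = two_proportion_z(50, 1_000, 52, 1_000)
z_large = two_proportion_z(50_000, 1_000_000, 52_000, 1_000_000)

print(f"z with     1,000 patients each: {z_small:.2f}")  # well inside ±1.96
print(f"z with 1,000,000 patients each: {z_large:.2f}")  # far outside ±1.96
```

The underlying difference is identical in both cases; only the ability to detect it changes, which is precisely why a “below the lower CI” label says nothing about whether a difference matters.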
In retrospect, the decision of Alberta Health Services not to release hospital names is likely a wise one. Releasing the names of hospitals attached to meaningless performance rankings can cause serious damage to perceptions among the public and among hospital staff.
For our clients, it is important to recognize that this is not the way that organizational performance measurement should be done. It gives the appearance of being scientific, but it lacks the substance of scientific and statistical method. The result is bad information, unsuitable for any meaningful purpose, and certainly unsuitable for any task such as evaluating organizations generally and hospitals specifically.
Converge is cautioning that the approach used in Hospital Report Card Alberta 2009 should not be used as a model by clients in the development of their own performance scorecards or dashboards or in the assessment of organizational performance.
About the Author: