SHMI, the new indicator for measuring hospital mortality: more light or more confusion?

After lengthy deliberation, the Department of Health's review of hospital standardised mortality ratios (HSMR) has resulted in a new offspring of the HSMR – the summary hospital-level mortality indicator (SHMI). The SHMI is now the official NHS hospital-wide mortality indicator for acute trusts in England.

Will this new incarnation provide the definitive measure of a hospital's quality and will it put to bed the arguments about the merits and evils of HSMRs? The answer to both questions is 'probably not'.

The experts on the review panel, of which I was a member, debated long and hard about the methodology for producing this indicator. But statistics cannot entirely overcome the complexities of adjusting for the many factors – some unrelated to the quality of care – that contribute to deaths in hospital. The result is an indicator that is inevitably subject to caveats, as provided by the Information Centre. No one, not even its ardent supporters, would maintain that this indicator is an unequivocal marker of hospital quality. No single indicator ever is, given the complexity of an acute trust's patient mix and clinical activities, and there are many ifs and buts associated with this one.
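
To make the mechanics less abstract, here is a minimal sketch of how a standardised ratio of this kind is computed: observed deaths divided by the deaths a case-mix model predicts. This is an illustration only, written in Python for this post – the Spell structure, the toy risk figures and the simple summation are assumptions for the sketch, not the Information Centre's published SHMI methodology, which rests on a far more elaborate risk model.

```python
# Illustrative sketch of a standardised mortality ratio: observed deaths
# divided by the deaths a case-mix model expected. NOT the Information
# Centre's published SHMI methodology - the Spell structure and the toy
# risk figures below are assumptions for the example.
from dataclasses import dataclass


@dataclass
class Spell:
    died: bool            # death in hospital or soon after discharge
    expected_risk: float  # model-predicted probability of death, given
                          # case mix (diagnosis, age, admission method...)


def standardised_mortality_ratio(spells: list[Spell]) -> float:
    """Return observed deaths divided by model-expected deaths.

    A value of 1.0 means deaths matched expectation; above 1.0, more
    deaths than expected; below 1.0, fewer. The ratio itself says
    nothing about *why* the two diverged - coding quality, case-mix
    factors the model misses, and chance all feed into it.
    """
    observed = sum(1 for s in spells if s.died)
    expected = sum(s.expected_risk for s in spells)
    return observed / expected


# Toy example: two high-risk patients who died, one low-risk who survived.
spells = [Spell(True, 0.60), Spell(True, 0.50), Spell(False, 0.05)]
print(standardised_mortality_ratio(spells))  # 2 / 1.15 ≈ 1.74
```

What the sketch makes concrete is that everything contentious sits inside the expected-risk term: the ratio is only as trustworthy as the case-mix adjustment behind it.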

There has been fierce debate among professionals about the validity of this indicator, and whether it is even sensible to try to use hospital-wide mortality as a measure of quality, when most deaths in hospital are unavoidable or inevitable. This debate will not go away and could even be stoked further by the SHMI. Critics may say this is window dressing for an inherently flawed concept.

So why have the SHMI at all? First, such an indicator has been around in one guise or another for many years, in NHS Choices and other publicly available sources of information; withdrawing it from public view would be difficult to explain. Second, the government's open data policy aims to make more, not less, information publicly available for accountability, transparency and to support choice. The Freedom of Information Act also makes publication unavoidable. Third, hospitals should make use of all available information, even if imperfect, for scrutinising the quality of their care. Mortality rates can be a useful prompt for further investigation by providers and regulators – this indicator has, on occasion, helped to identify organisations where there have been serious failures of care. An analogy I have heard is that mortality indicators serve as a smoke alarm – when a smoke alarm goes off, it doesn't necessarily mean the kitchen is on fire but it does mean you should check if there is a problem.

On the other hand, there is a risk that the data may be used inappropriately, without regard to the accompanying health warnings, and that some organisations may incorrectly be categorised as providing poor-quality care. This can mislead patients and the public, and erode confidence in NHS staff and organisations.

Is it therefore impossible to measure the quality of hospital care, and where do we go from here? We have two practical suggestions. First, the focus should be on developing clinical indicators that relate to particular conditions or procedures; these are less prone to the technical problems that come with an SHMI-type summary indicator. Second, the Department of Health and Information Centre should speed up the process of using national clinical audit data as the basis for measuring and reporting on quality. Such data provides a more robust basis for informing health care professionals, patients and the public about the quality of care provided by an organisation. The time will come when NHS data on the quality of its services is a useful and valued currency that serves many audiences and purposes. In the meantime, boards should be focused on reviewing all available data to understand how they are performing and how they can drive improvement.


Comments

#579 Jenifer Smith
DPH
NHS Isle of Wight

I think this is a helpful and balanced view. There is a particular problem, though, with publishing data about something everyone understands – death – manipulated statistically into an indicator almost no one understands – a standardised ratio. Supporting responsible use in an effective way is very difficult for those of us who do understand the statistical basis and have the job of explaining what is published to the local media, patients and professionals.

#581 diana smith
citizen

This is a very helpful blog.

I live in Stafford, where the troubles began in 2007 with the publication of HSMR mortality data as a league table, with the confident assertion that this was a good indicator of quality.

What we can be certain of is that in the case of Stafford the figures were the result of seriously poor record keeping, and the fact that the coding manager was on long-term sick leave.

It has never been possible to make that matter of fact clear to the press, and I know that many of my fellow Staffordians still firmly believe that there were hundreds of "excess deaths" (calculated, on the basis of the flawed HSMR figures, in a way that no one accepts as appropriate), despite the best efforts of David Colin-Thomé and Robert Francis to correct their misapprehensions.

These seriously flawed excess death figures are still regularly used by press and politicians.

I take a keen interest in the development of health statistics, and see many signs that things are improving by leaps and bounds. The specific indicators are potentially of great value, as they are timely and sufficiently focused to prompt scrutiny of small sections of a hospital's practice.

The Birmingham University Hospital approach to building the key indicators that clinicians need to measure their own performance is also remarkable.

As this blog says, there is a real question over whether the HSMRs, or their successor the SHMIs, serve any useful function. I think there will come a time when we recognise them as an important early attempt to comment on quality that has been superseded by more useful methods.

There are currently big question marks over the future of Mid Staffs. It has been so badly affected by the reputational damage that it has major difficulties in attracting the right staff or the throughput of patients – this despite the fact that the staff have made massive efforts to correct anything that may have been wrong.
We are functioning in an unforgiving economic climate. Market signals may be OK in calm waters, but here, when a hospital has been subjected to four years of relentless bad publicity, the market is a pretty poor indicator of what Stafford actually needs.

I look forward to the time when we can measure quality better, and when Mid Staffs, if it survives the damage, is recognised as being in part a bad-data blip on the road to better-quality data.

#583 doc

Yesterday a list of poorly performing hospitals was published based on mortality indicators.
I wonder: if an acutely ill person, especially an older person, is discharged prematurely, would this not skew the indicators to show that the hospital is performing well, as they will probably die at home?
Is such a hospital performing better, or failing to provide what every hospital should, which is care rather than the meaningless data that health professionals are currently being asked to collect?
Having said that, the data has more relevance to younger adults than older adults, as care of the elderly seems to be more accepting of mortality.

#40653 ian spencer
PAS Development
PAS supplier

Diana. You make an interesting point about the coding manager being on sick leave and the poor data quality. Few people know anything about clinical coding and the complexity of the data that feeds into health care stats, yet it all gets reduced to "[invent a number] avoidable [invent a diagnosis/procedure] incidents" for the newspaper headlines. But who is feeding these numbers to the press? This is more than a stats problem. This is politics.
