The NHS and technology: turn it off and on again


With all the excitement around the upcoming general election, it would be quite understandable to have missed the announcement that the Department of Health's long-awaited response to Dame Fiona Caldicott's review of data security, consent and opt-outs has been delayed due to the pre-election period.

As well as being another example of how perpetual political campaigns add uncertainty to policy, this threatens to deepen the paralysis in a debate that is in sore need of advancement. UK e-Health Week, taking place this week, offers a perfect opportunity to reflect on the state of technology within the NHS.

It has been more than three years since the programme to implement wider sharing of patient records, care.data, was suspended by NHS England, with the Royal College of General Practitioners citing a 'crisis in public confidence' over the plan. Ever since, caution and delay have dogged the process to develop a better system of collecting, storing and sharing health data.

The caution is understandable. The public have legitimate trust and privacy concerns about their health data being shared, especially with third parties such as researchers and the private sector. But we also know that the public overwhelmingly appreciate the benefits that safe sharing of health data can bring. Ipsos MORI research, with Cancer Research UK and Macmillan Cancer Support, found that more than 80 per cent of the public support health data being shared for uses such as research and direct care. This is why it is so important to address issues such as safeguards and who can access data.

In addition to privacy, there is now a whole series of other practical and ethical concerns that were scarcely imaginable at the turn of the millennium. One example of this is how much responsibility we delegate to algorithms; the extent to which we allow machines to make decisions about our lives – so-called 'algorithmic accountability'.

In health care, algorithmic accountability raises profound questions about the role of technology in providing care. For example, should the algorithm refer on every patient who has even the slightest chance of developing acute kidney injury, or should it be selective, not over-burdening clinicians, and accept that some patients will be missed? To be clear, risk is an inherent part of care, regardless of who or what delivers it, but consideration needs to be given to the framework within which an algorithm operates and who is accountable for this framework.

If that seems like a hard question to answer right now, there are plenty more. The recent agreement between DeepMind (a technology company owned by Google) and the Royal Free NHS Foundation Trust received a sceptical reaction from the press. The concern was over the scope and scale of patient data that DeepMind was to receive for a new health technology called Streams, an early warning system for kidney failure. The Royal Free has defended itself by saying that its agreement with DeepMind is standard and that data protection issues are a matter for regulators to assess. Nevertheless, the privacy debate – in the DeepMind example and elsewhere – has proved a stumbling block to thinking more widely about what new technology means for the NHS.

DeepMind is just one example of the new partnerships we are seeing. There are similar relationships with IBM Watson and Vitalpac, which are providing artificial intelligence expertise to the NHS, and all of these raise important questions. The health sector needs greater clarity over what is being purchased from technology companies; there are decisions to be made over whether NHS organisations are purchasing a technology or outsourcing care. Artificial intelligence is not a technology like an MRI machine; it supplements or replaces part of the care pathway usually provided by a clinician. How we classify these algorithms is important for the regulation and governance of how the data is used.

Consideration should also be given to ensuring that NHS organisations have the expertise to contract and evaluate these complex technologies when there is such an asymmetry of information between them and the provider company. A final question is about how we ensure an effective market, which rewards technological innovation while also protecting the NHS from excessive prices.

It is imperative that the NHS can access innovation and new technologies. To enable that we need to write new rules of engagement for NHS organisations and technology companies. Delay after delay has prevented us from thinking ahead, as our conversations remain stuck in the discourse on privacy. But technology is not waiting for us to catch up – the NHS has already fallen behind other sectors in delivering innovative technologies, and if we continue to procrastinate then there is a risk that the NHS will be left well and truly behind.

Comments

Dr Julia Powles, Research Associate, University of Cambridge and Cornell Tech (03 May 2017)
A thoughtful post and stimulant, thank you. I would be interested in follow-ups on two matters:

1) What do you have in mind regarding "new rules of engagement" between the NHS and tech companies, specifically? In my experience, the subtext of such an assertion is generally that it's simply too difficult to deal with the distributed, messy, human complexity of patients, practitioners and the public.

Pearl Baker, Independent Mental Health Advocate and Adviser/Carer (03 May 2017)
The system should be about patient care. Is the route of using machines ('algorithmic accountability') really providing 'safe and effective care' based on 'probability' research? We are not all sausages coming out of the same machine.

I am allergic to paracetamol (unusual, I know); it affected my liver. It was through my own investigations that I found this out. I am no good 'dead' as a Carer to more than one individual.

It looks to me as if we are trying to do away with GPs, installing machines everywhere with 'supermarket' questions and answers; many of us look to our computers for answers, 'forced' upon us by the system.

'Safeguarding' is an issue associated with the above. If we go down this route and the system fails to provide a 'duty of care', who is responsible: the machine, the programmer or the patient's GP?

Caldicott and 'confidentiality' is the most serious problem for Carers. If you satisfy the Care Act 2014 by providing emotional and practical support, regardless of what the LA or the patient reports, the problems start: 'safeguarding' is everyone's concern, and Carers have legal status.

The words 'duty of care' have a meaning, and it is not a MACHINE.

DWP Appointees have problems with GPs: as an Appointee you are required to complete forms, including medical information, and failure to provide relevant information can have serious consequences for both patient and Carer. The Caldicott review (to which I responded) should take into account that Carers have legal rights; failure to take into account the 'wider' system, including the contents of the Care Act 2014, could have serious consequences.

The complaints upheld by the PHSO and LGO would need to be read and understood to see how 'safeguarding' in particular has a profound effect on everything to do with health and social care.

Mark Wardle, Consultant Neurologist (17 May 2017)
This is a great post. It is clear to me that the regulatory frameworks have not kept pace with advances in technology and our expectations of the use of such technology. In particular, phrases such as "transfer of data" are less useful in an age where organisations do not themselves host their own data; it is more important, for example, to consider "access to data" than its transfer. In addition, there are criticisms of the use of real patient data to test technologies in order to ensure safety. This might be correct when one buys a completed product, and even in the recent past it would have been reasonable to assume that deploying an application was much like buying anything else "off-the-shelf". However, development nowadays is much more agile, user-centred and incremental, and so ongoing testing, continuous improvement and development should be commonplace.

As a clinician, I for one look forward to a time when the machines can support my work making sure I see the right information at the right time. For too long, we have been information-poor and we need frameworks in place in order to make things better!

Mark
