Can the NHS manage without AI?

In November 2022, artificial intelligence (AI) became a hot topic of discussion in the news following a dramatic improvement in a type of AI called large language models (LLMs). Now, a year later, more than half of UK citizens have heard of one of these newer AI tools. An LLM is AI software that is trained on very large volumes of text, from which it learns how sentences are constructed, the links between words and the unwritten rules of language.
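
To make that idea concrete, here is a deliberately tiny sketch of learning statistical links between words from text and using them to predict what comes next. This is not how production LLMs are built (they use neural networks trained at vastly larger scale), and the corpus is invented for illustration.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the "very large volumes of text" an LLM is
# trained on; these sentences are invented for illustration.
corpus = (
    "the patient was referred to the clinic . "
    "the patient was discharged from the clinic . "
    "the patient was referred to the ward ."
).split()

# Count how often each word follows each other word (a bigram model).
# Real LLMs learn far richer patterns, but the core idea is similar:
# statistical links between words, learned from text.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word: str):
    """Return the word most often seen after `word` in the corpus, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))      # -> 'patient'
print(predict_next("patient"))  # -> 'was'
```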

LLMs have been met with enthusiasm for their potential and have been rapidly tested and compared with people, predominantly using established written examinations in subjects from law to science to health care. This method of testing assumes the performance of LLMs can be compared with that of people – and the results show LLMs often excel in these tests. However, human assessment approaches such as examinations are not an effective way to test software. A standardised test for people is designed to assess knowledge or capabilities, and it relies on assumptions about the broader human context it is built on – assumptions that do not hold for software. Although these early results offer proof of concept for what the technology could do, more evidence generation and validation are needed before these technologies can be used in health and social care. Without robust evaluations, the limits of LLMs are not well defined and any assumptions the models make could be incorrect, so there could be errors and failures with serious consequences for patient safety, outcomes or experience.
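
As a concrete (and intentionally naive) illustration of what exam-style testing measures, here is a minimal sketch of such an evaluation loop. The ask_model function, the questions and the reference answers are all hypothetical stand-ins, not a real benchmark or API.

```python
# Minimal sketch of an exam-style evaluation loop. Scoring by exact string
# match is deliberately naive: a high score proves performance on these
# questions only, and says nothing about how the model fails elsewhere.

test_set = [  # invented questions, not a real clinical benchmark
    {"question": "What does NHS stand for?", "answer": "National Health Service"},
    {"question": "What does LLM stand for?", "answer": "large language model"},
]

def ask_model(question: str) -> str:
    # Hypothetical stand-in: in practice this would call an LLM API.
    return "National Health Service" if "NHS" in question else "unknown"

correct = sum(
    ask_model(item["question"]).strip().lower() == item["answer"].lower()
    for item in test_set
)
print(f"Exam-style score: {correct}/{len(test_set)}")  # -> 1/2 with the stub
```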

In reality, technologists are still trying to develop the methods to benchmark the performance of these technologies; organisations such as Microsoft and Google are working to assess LLMs for health and care applications. An additional problem unique to LLMs is that they can – and do – make up information that is not factually correct or evidence-based. This ‘creativity’ risks introducing information that is neither evidence-based nor clinically validated in response to medical questions. Many questions must be answered before LLMs are ready for clinical use, for example: how much of this creativity is acceptably low risk? What are the best use cases for LLMs and other similar AI in health and care? Can the combination of people and processes mitigate the failings?
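
On that last question, here is one hedged sketch of how a process might mitigate such failings: automatically flag model output containing claims absent from a validated reference, and route it to a human reviewer. The formulary data, the dose-matching rule and the model output are all invented for illustration.

```python
import re

# A stand-in for a validated reference source such as a formulary; the
# drug, doses and model output below are all invented for illustration.
validated_doses = {"paracetamol": {"500mg", "1g"}}

def flag_unvalidated_doses(model_output: str) -> list:
    """Return drug-dose claims in the output absent from the reference."""
    flags = []
    for drug, dose in re.findall(r"([a-z]+)\s+(\d+(?:mg|g))", model_output.lower()):
        if dose not in validated_doses.get(drug, set()):
            flags.append(f"{drug} {dose}")
    return flags

output = "Take paracetamol 5g every four hours."  # a fabricated, unsafe answer
issues = flag_unvalidated_doses(output)
if issues:
    print("Route to human review, unvalidated claims:", issues)  # -> paracetamol 5g
```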

It's logical to assume the most impactful way to improve health care services for patient benefit is by applying these AI tools to patient-facing services. However, we shouldn’t overlook the significant potential for this technology to be applied to operational activity, eg, for reporting and creating funding applications, creating commissioning documentation and procurement information, or by creating job descriptions. LLMs have the potential to reduce the time to complete tasks and in doing so enable the limited time and attention of staff to be used differently. This is particularly important when there’s a shortage of non-clinical staff, for example, health services and public health managers and directors are on the shortage occupation list open to visas for overseas workers to boost staff numbers. As well as managers there are real challenges when it comes to specialist roles in digital, data and technology.

The Hewitt review identified the need to recruit into digital, data and technology roles within health and care to continue the evolution of the health care system towards better use of data. The review flagged the existing shortages in this specialist technical workforce and the challenge of filling vacant posts while offering salaries that compete with those on offer in other sectors, including technology companies. LLMs have the potential to help alleviate some of these pressures if the technologies are developed for non-clinical use in health and care settings.

Non-clinical uses of AI potentially face lower regulatory requirements and shorter timeframes for demonstrating value. AI has the potential to improve system and service efficiency, productivity and accountability while relieving workload pressures and overcoming skills shortages. To realise these benefits, there needs to be greater focus and drive behind the development, evaluation and deployment of AI for non-clinical use, as well as skills and education development so that non-clinical staff can use these technologies to the fullest extent.
