More than just hype: how emerging AI use is assisting health and social care
The NHS and social care systems in England are on a journey towards digitalisation. One technology generating high levels of both excitement and scepticism is artificial intelligence (AI): while many are excited by the opportunities it offers, others feel more doubtful or unsure.
The fast pace of technological development can make it difficult to understand how AI could be applied and what its benefits are. Here, we explore some of the ways that artificial intelligence is currently being developed, tested and used within the NHS and social care. We look at how AI is enhancing the quality of care, workforce support, patient safety, person-centred care, productivity, education and training. This work is based on conversations with a broad range of people from specialists, GPs and dentists to researchers and innovators. The discussions focused on ‘Where do staff see value and potential for AI?’. We’ve included some of their responses throughout this long read.
We provide context for how this technology is, or could be, applied but do not cover the technical specifics. We assume that the factors needed for AI to have a beneficial impact are in place – for example, that the environment AI is deployed within and the equipment it requires work reliably and provide high-quality data, and that the AI has been trained appropriately to mitigate bias in data sources. These are significant challenges that will take focus and effort to overcome.
So if there are so many challenges, why should the health and social care systems put effort into using AI tools? Here is what we found.
Quality of care
Quality of care is important to enable care that is effective and safe and provides a good experience for patients and those who draw on services. Although the NHS and integrated care boards have a duty to continually improve the quality of care, this can be a challenge when the system is under pressure. To support improvement in the quality of care, innovators and suppliers are working on a number of different AI tools that can improve speed, experience, outcomes and decision-making.
Computer vision is one of the more mature forms of AI and has a growing evidence base. It uses specific methodologies (eg, deep learning and machine learning) to identify features within a medical image. These tools can recognise structures (such as heart valves or chambers) to enable measurement or identify abnormalities in an image, such as a cancerous tumour. Computer vision is particularly suited to services and tasks that are image-oriented – for example pathology, radiology, cancer detection and dermatology – and has the potential to make the workload manageable for the number of staff or reduce the time needed to analyse medical images; the latter could help reduce patient waits. Computer vision could also lead to autonomous systems in health care in which AI systems are able to assess medical images without staff oversight; instead, staff would check the technology and outputs for accuracy and safety.
Computer vision also has the potential to improve quality of care by improving speed and consistency when it comes to identifying, segmenting and measuring parts of the body and anomalies in medical images. This in turn assists staff with decision-making, potentially improving response times – for example, routinely checking for blood clots and haemorrhage in medical images can enable quicker response and treatment prioritisation.
Other AI tools in development include 3D virtual models from CT or MRI scans, which are helping to improve surgical planning in trusts and enhance precision in dentistry. These types of models can help reduce the likelihood of complications, giving better outcomes and fewer repeated interventions.
AI video tools are being developed in areas such as minimally invasive diagnosis and treatment, for example in endoscopy and laparoscopy, which use cameras as part of the procedure to routinely capture and use video footage. AI tools can be used for both non-real time and real-time analysis of video which can improve quality of care.
For example, AI tools can record a surgical intervention and write a draft report using natural language processing to summarise the intervention, which staff can then review. It can also incorporate tissue images based on the recording. This type of AI application was initially used just to describe the intervention, but its capability is now expanding to enable it to analyse the intervention as well. It is then able to provide feedback and coaching to a surgeon, and has already demonstrated improvements in care quality.
AI can also provide feedback, which we've demonstrated in our hospital to improve outcomes. We have reduced complications and reduced length of stay for some patients just by using AI to provide some soft metrics on performance and facilitate discussion with coaching.
Neurosurgeon
Other real-time video-based AI applications in development can also provide assistance during surgery, helping to guide where incisions can be made and how much tissue should be removed. In discussions we heard how past negative experiences can lead to overcautious behaviour in surgeons. This type of AI coaching and guidance can help to improve surgeons’ confidence and mitigate overly cautious bias behaviours that result in repeat interventions for patients.
AI can also be applied to text. There is a wealth of freeform text information in the NHS and social care, from GP and social worker notes to patient surveys. AI is able to analyse this text and find patterns and correlations – although there is still a need to understand the effects of bias in the data. Hospitals are applying AI to unstructured text information to detect under-reported and hard-to-recognise conditions across large patient populations, helping to identify people with conditions such as myeloproliferative neoplasms (a rare type of blood cancer) and refractory epilepsy (not responding to treatment), and then using this information to proactively improve treatment and outcomes.
Within local authorities and social care some suppliers and providers are working in collaboration to use non-health information from public services to address wider determinants of health and prevention. For example, AI analysis of data on financial vulnerability and domestic violence can identify people who are at risk of homelessness, facilitating proactive case management support or interventions.
Supporting the workforce
Technology needs to work for and support staff in a well-functioning health and care system; it builds staff enthusiasm for innovation and technology while supporting them to focus on patient care. A number of AI technologies that have the potential to improve the working conditions and workload for staff in health and care are currently being developed, tested and implemented.
AI scribes (also known as ambient voice technologies) are an emerging category of AI that is starting to be used across NHS and social care settings. This type of AI starts by taking an audio recording of a consultation or interaction between staff or a carer and the individual. The recording is then automatically transcribed to text and summarised to provide the most useful and relevant information based on templates and software set-up. If required, staff can add additional observational details using the audio recording. As well as providing useful documentation, AI scribes have the potential to support staff by reducing workload and alleviating cognitive burden.
Actually, they're probably getting 50% of my attention because 50% of my optimal brain is going through ‘How do I store all this so I can retrieve it later?’ And now it's completely different. So my cognitive processing is much more about the problem that I'm being presented with and thinking about solutions and issues.
GP
AI scribes are being tested in a range of settings across health and social care – from outpatient services and general practice to home care and residential care. However, there’s potential and appetite to improve the technology and expand it to more care situations, such as medicine reviews in hospitals, physiotherapy and occupational therapy (where consultations are long and unstructured and have similarities to both general practice and social care interactions) and even emergency handovers.
Early indications are that this type of AI scribe tool can save staff time, yet clinical opinions are mixed as to the time-saving element and the evidence base is still building. However, it is clear that not all AI scribe tools are equal so there is a need for transparent benchmarking. Tools with high error rates through hallucinations (incorrect, misleading or nonsensical text), poor transcription accuracy or other factors can mean more staff time is spent on corrections. Additionally, interoperability limitations and consent requirements could erode projected time savings, while staff with good digital confidence may find that the AI scribes provide little time savings.
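One way to make scribe benchmarking transparent is to publish simple, reproducible accuracy measures alongside clinical review. As an illustrative sketch (the transcripts below are invented, and real evaluation would also need to check clinical meaning, not just word-level accuracy), word error rate can be computed with nothing more than standard Python:

```python
# Minimal word error rate (WER) sketch for comparing an AI scribe's
# transcript against a clinician-checked reference transcript.
# Hypothetical example data, not from any real consultation.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference length."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # Standard Levenshtein edit distance computed over words.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[-1] / max(len(ref), 1)

reference = "patient reports chest pain radiating to the left arm"
transcript = "patient reports chest pain radiating to left arm"
print(f"WER: {word_error_rate(reference, transcript):.2f}")
```

A hallucinated or garbled transcript scores a higher WER, giving a like-for-like basis for comparing tools before the more important step of clinical review.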
Future AI scribes have the potential to go beyond standard consultation interactions and could evolve to support staff, suggest training gaps and improve inclusivity. The AI outputs could suggest follow-on actions for staff, while analysis of multiple consultations could help identify training needs for staff and prevention opportunities at public health level. There is also potential for AI to support translation.
Many staff in health and care work under high pressure and in difficult circumstances – for example, ambulance service call handlers have very demanding roles. These staff are highly trained to have the correct knowledge and skills, but turnover is high. Some ambulance services are interested in whether ambient voice technologies (which record the interactions) could alert managers if a call handler has had a difficult day. This could support staff wellbeing and help with call handler retention by providing managers with insight into any particularly challenging calls that staff have experienced.
If we could use AI to listen into calls to notice when someone’s had a hard day, maybe they’ve had three deaths, and then alert the supervisor that these things have happened. We think that would be quite valuable.
Ambulance Trust Digital Transformation Lead
Generative AI tools (AI that creates text, audio, video or data) are also being trialled to see whether they could support staff to generate documentation that they don’t complete often enough to become experts at but frequently enough that it can take a lot of time – for example, application forms for benefits such as Personal Independence Payment, fit notes or housing support letters. Primary care staff are using generic descriptions in generative AI tools to create a letter with all necessary information, which can then be tailored for the individual. This use of generative AI reduces workload and improves quality and completeness.
You can create letters with non-identifiable information and essentially produce a better letter because GPs are often very time pressed, and it saves huge amounts of time.
GP
Bespoke AI tools are being developed and used in general practice settings to support practice managers and administrative staff. One GP we spoke to has developed a generative AI tool that provides assistive information, drawing on trusted information sources to answer questions on contractual requirements, policies and best practice. This helps practice managers navigate questions and tasks, for example around surgery opening hours, car parking and transport links etc, that would normally require dedicated GP time.
The start point was trying to answer: how do I get expert general practice advice [to my practice manager], which is good and reasonable, and I would trust her to use? And in a way that she can use without massive hurdles.
GP
Similarly, generative AI based tools are being developed and tested by social care providers and local authorities to support staff in accessing relevant information across multiple policies and procedures quickly and easily.
Support services in local areas can vary and change, which makes it a challenge for case workers to find suitable services for an individual. Unfortunately, this can mean incomplete care assessments, which then get rejected. Local authorities are exploring AI tools that are able to take an individual’s information and provide a list of suggested local services tailored to their documented care needs. These sources of support can then be reviewed by the social worker with the individual. This approach can reduce workload and provide data-led suggestions for care packages while keeping people in control.
Social enterprise organisations have developed AI-powered tools to support paid and unpaid carers to complete assessment forms. These tools support the carer while helping to improve the quality and consistency of information, and provide signposting to potential support in the locality.
Patient empowerment, accessibility and communication
Technology can shift and change power and roles, and the same is true for AI. It has the potential to improve how patients experience care and empower them to manage their own health and wellbeing.
People can, and do, struggle to reach hospitals or miss appointments. A number of providers and suppliers are successfully applying AI to identify people at risk of missing appointments, then allocating staff time and resources to contacting them to reduce their specific barriers to access. AI algorithms use data such as social determinants of health, location, age and time of appointments to predict which patients are unlikely to attend an appointment.
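As a rough illustration of how such a prediction works (the features, weights and bias below are invented for demonstration and are not drawn from any deployed NHS model), a simple logistic-style score can rank patients by risk of non-attendance so that outreach time goes to those most likely to miss an appointment:

```python
import math

# Illustrative sketch only: features and weights are invented,
# not taken from any real missed-appointment model.
WEIGHTS = {
    "distance_km": 0.08,         # further from hospital -> higher risk
    "prior_missed": 0.9,         # each previously missed appointment
    "early_morning_slot": 0.4,   # early slots can be harder to attend
    "deprivation_decile": -0.1,  # 10 = least deprived -> lower risk
}
BIAS = -2.0

def missed_appointment_risk(patient: dict) -> float:
    """Logistic score in (0, 1): probability-like risk of non-attendance."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

patients = [
    {"id": "A", "distance_km": 25, "prior_missed": 2,
     "early_morning_slot": 1, "deprivation_decile": 2},
    {"id": "B", "distance_km": 3, "prior_missed": 0,
     "early_morning_slot": 0, "deprivation_decile": 8},
]
# Rank so staff contact the highest-risk patients first.
for p in sorted(patients, key=missed_appointment_risk, reverse=True):
    print(p["id"], round(missed_appointment_risk(p), 2))
```

In practice a model like this would be trained on historical attendance data rather than hand-set weights, and – as the article notes – the output is used to target supportive contact, not to penalise patients.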
Imagine if someone has to spend their weekly food budget to go to the hospital, there'd be a high chance they’d question if this is really required. So we're using AI to predict high likelihood of not turning up. We'll get in touch to find out why they're not going to come. And ask could we send you a taxi, try telephone or group appointments together so that we take up less of your time and you lose less earnings.
Hospital doctor
Integrated care boards (ICBs) are also exploring whether AI can help to improve how and where services can be placed to make them more inclusive and accessible. By linking data across providers and applying AI tools, system leaders can understand gaps in appointment attendance by population and geography. As a next step, some ICBs would like to trial modelling tools to explore different service configurations, based on factors such as location and transport links, to try to improve accessibility for underserved populations.
To help give patients easier access to information on services, parking, transport, opening hours and more, providers in trusts, GP practices and dentistry are working with suppliers to modernise their websites and phone systems. By monitoring phone queries, they are able to recognise common questions and then create bespoke AI tools that are able to answer these questions using text or audio.
GP practices are also starting to use generative AI to improve how they communicate with patients through social media channels. Some GPs and practices have developed generative AI tools to create health information social media posts, reducing time and costs for them while improving public communications.
Health and care charities are trialling the use of AI to support their communities with better services and tools, for example running surveys and collecting information on how people feel about services within their communities. The organisations then use AI tools to analyse this information to understand how services are performing, saving weeks of time. This information also gives providers constructive suggestions for improvement and could enable monitoring of service performance over time.
When people feel informed and knowledgeable, they are more likely to feel empowered to make decisions about their health and care. Charities that support people’s health and wellbeing are using AI tools to improve people’s understanding and knowledge of conditions and services. They are using generative AI to translate text or audio information into another language and then working with communities to adjust for cultural relevance and sensitivities. These generative AI tools are far from perfect and because of the potential for errors introduced by AI the process cannot be automated, so it’s important to include people as part of the process. However, AI can help to create a draft, using less time or money, which can then be improved by engaging with communities and native speakers.
We might use auto translate to get a first draft but then we'll take that first draft and go back to the native speakers in our co-production group and get the scripts checked for tone appropriateness, and we'll also look at the medical accuracy with native language speakers.
Charity CEO
Saving service time and money
Many of the examples mentioned earlier have the potential to improve not just quality of care, staff workload and patient experience but also the productivity of the health and care system itself. While the productivity boost from each individual use of AI can be relatively small, the cumulative gain of multiple tools at different points of a service can enable substantial improvements – assuming released time can be redirected to care. The computer vision-based AI tools mentioned earlier can remove a proportion of images without indication of disease from human oversight, reducing staff workload and improving productivity. Tools that can recognise and analyse images by counting and outlining medically relevant parts can also save staff time on tasks that otherwise take anything from 30 minutes to a few hours.
Foetal imaging and measurements are part of a routine ultrasound scan. Yet an ultrasound scan can be time consuming and have a high degree of variability largely due to the manual nature of the process. For example, taking measurements requires the staff member to repeatedly optimise the probe position, pause the video and take onscreen measurements, and this is repeated at least three times per measurement. AI can make the task easier for staff and save time.
Often, during foetal ultrasound imaging, a specific series of images needs to be captured. AI can help to automatically capture and save the best image in a video sequence, potentially saving staff time. Understanding what would improve the patient experience is important given that pregnancy is an emotive and important life event. AI would need to save staff time while also improving patient experience.
I ran a clinical trial basically comparing the AI-enabled approach with standard manual ultrasound. It didn't change the measurement, so the sensitivity and specificity were about the same, but it massively cut the time it takes to do the scan.
Paediatrician
Pathology (the study of samples such as fluids and tissues from someone’s body to help diagnose what is making them unwell) is a highly image-orientated part of health and care delivery. The pathologist workforce is under pressure with insufficient staff to meet the rising workloads and approximately 20% of staff set to retire in the next five years.
Recent investment into the digitalisation of pathology sets the foundation for AI tools to be used to support staff and improve productivity. Areas where AI can assist productivity include cell counting and grading.
Across health, social care and local authorities, generative AI tools are being used to save time. This includes updating guidelines, capturing minutes of meetings and summarising actions. The summarisation of care records and incidents can help to release staff time, as long as appropriate review processes are in place.
Social care providers are rapidly digitalising their social care records. Thousands of care plans and notes are being created each week by providers. However, many providers do not have access to the analytical staff or expertise needed to analyse this data effectively to create insights, for example of predictors of falls. Instead, several organisations are developing and testing AI tools to help with the analysis of data in care records. They are finding AI can help provide carers with timely and relevant information to provide optimised care and identify changes needed to a person’s care plan. In future, AI could update care plans dynamically based on recent interactions.
The NHS receives approximately 1,000 freedom of information requests each quarter and a hospital can receive approximately 200 requests each quarter. These are essential to give the public transparency of information on NHS services, spend, targets and results. However, the teams dealing with these requests often have high workloads, and completing requests within the necessary time can be a challenge. NHS organisations are exploring the use of AI to support teams and improve productivity by developing draft responses from curated information sources.
Workforce education and training
The rapidly emerging accessibility of AI tools and their increasing capabilities also have the potential to impact how staff access education and training. Ultimately, the aspiration is for AI to tailor learning material based on an individual’s educational needs and gaps, although the current capability is not there yet.

Education is a content-heavy area, often requiring the same course information to be provided in multiple formats. Generative AI is being used to create handouts and summary information from guidance and course material – taking existing information and changing the format and language, or summarising the materials – potentially reducing costs and saving staff time, as well as providing a range of different educational materials for the learner. Generative AI is also being used by students and trainees as a coach to improve coursework: by providing AI with the marking scheme and the coursework, AI can suggest areas for improvement.

In future, AI could help to improve staff training by identifying gaps and areas for support (see AI scribes) alongside in-situ coaching (see AI video tools). For example, the ambient AI tools mentioned earlier (or recordings of staff–patient interactions) have the potential to use AI-based analysis to ascertain whether staff are completing the basics to a high standard – for example, introductions and taking a patient’s history. AI can also provide reflective prompts to help staff improve their consultation skills.
Safety
Safe and effective care is important for patients, staff and the system. AI has a role to play in making health and care services safer. In this section, we briefly illustrate how AI is being explored to improve patient safety.
More people are living longer and with long-term conditions, which means that people needing surgical care are increasingly likely to be older, have comorbidities and be taking multiple medications. This makes it harder to decide which medications to change and what anaesthesia to use during surgery to reduce the risk of adverse reactions and harm. Innovators in the NHS are piloting AI tools to make this complex process more efficient and safer for patients by bringing together information scattered throughout the system to risk assess patients and help staff to adapt their treatment and surgery.
We are increasingly putting people forward for quite major surgery with multiple comorbidities. The interactions of these comorbidities with each other, the interactions of the treatments with each other, and the treatments of this whole package with the surgery is becoming a thing.
Anaesthetist
In 2024, the Patient Safety Incident Response Framework (PSIRF) became the new NHS safety framework. The framework promotes a system-based approach to learning from multiple patient safety incidents. Patient safety incidents can range from log-in failures to pressure sores to serious injury and death. The sheer volume of incidents, combined with limited staff capacity, has stimulated some NHS innovators to explore the potential for AI to support improvements in patient safety as part of PSIRF.
It took them 30 to 40 hours to manually go through 15 incident reports to identify common themes using robust methodology, but only 15 cases. You can imagine if the ask is thousands of cases, it's just not doable.
Specialty Registrar
Staff at a general district hospital are working with academia to explore the potential for AI to analyse patient safety incident information to understand common themes across much larger numbers of reports, saving staff time that can then be applied to supporting improvements in patient safety.
Safety auditing is another area being explored. Ambulance trusts regularly audit a percentage of calls to ensure staff adherence to correct approaches and processes. Providers are exploring whether it’s feasible for AI tools to analyse recordings of calls to check staff are using the correct processes and flag anomalies to a member of staff. It is hoped this could enable all calls, rather than just a proportion, to be audited, allowing for more comprehensive auditing and safety improvements.
Understanding the challenges and consequences of using AI
In our research we encountered ingenuity and innovation in the NHS, with lots of potential applications for AI to support improvements in quality, staff workload, patient safety, productivity, education and training. However, existing research and implementation have highlighted potential challenges and concerns that need to be acknowledged and overcome.
Bias and inconsistent data
A significant challenge is potential bias in the data used to create AI tools, which could embed or worsen inequalities. As well as health inequalities, there are inequalities in access to and experience of health services. Underserved groups face barriers to care and can present later, with more advanced conditions, meaning that early disease datasets – and the AI tools built on them – will under-represent these groups, which in turn limits the capabilities of the AI. If the data represents only what the system sees, it risks perpetuating these biases. In addition, AI clinical scientists in the NHS are unclear how many people would need to be included in the data for it to be representative.
If only 1% of my patients are Hispanic then including in the data a sample of 1% of patients being Hispanic means that we don't know if that’s sufficient for the AI to pick up on their care needs, and there are nuances here that we don't understand and this is a constant that keeps coming up.
AI Clinical Scientist
Significant work is ongoing to understand and mitigate AI and data bias in health and care – ensuring diversity and inclusivity are considered and unfair biases in medical devices are mitigated – and frameworks are being developed to assess possible AI risks. Different approaches to data generation, such as the use of synthetic data (artificially created data that mimics real-world data but doesn’t contain any real-world information), could also help; however, this adds regulatory uncertainty by creating new and unfamiliar concerns.
AI is an advanced technology and the cases outlined in this long read require technical and data infrastructure, as well as staff skills and education. Without addressing the variation in digital maturity across providers, the push towards AI is likely to mean that the most digitally capable organisations are better suited to use AI while the recently digitalised providers struggle. This has potential to widen inequalities and differences in patient outcomes and staff conditions.
I am in a little bit of a black spot in terms of significant innovation from the system point of view, because we're rural, we're always fighting rurality issues because the spend, ideas and the processes seem to be focused on cities and towns and has disregard for some of our issues.
GP
Data drift
Providers and manufacturers carry out periodic checks for changes in the performance of a medical device. But this process is far from codified and is insufficiently robust for AI tools and automation: an AI tool’s performance can degrade as the data it encounters drifts from the data it was trained on, so tools that start safe do not necessarily stay safe. Without knowing when this happens, and without processes for responding rapidly, there is a real risk of increasing harm and endangering patients’ safety.
Unintended consequences
Historically, NHS digitalisation has often taken place one organisation, or even one department, at a time. Unfortunately, AI deployment is continuing this trend. Siloed approaches mean that not enough attention is paid to the downstream effects of AI across providers, with regard to both benefits and drawbacks. This risks creating more pressure elsewhere in the system and worsening patient experience, which continuous improvement approaches can help to mitigate. For example, AI tools such as computer vision speed up medical image analysis, but simply deploying AI at this stage doesn’t increase the number of consultants. Instead, it risks creating a surge of activity at the next contact point, resulting in more frustration for patients who are still having to wait and staff who feel the burden of a workload that’s rapidly increased. Sensitivity and specificity are equally important to avoid overloading pathways.
When we first did the [falls prevention] project we generated 650 people we thought [had] risk of falls using machine learning … [but the provider] only had capacity to deal with 40% of the alerts we generated.
Social care AI supplier
For example, for a single foetal heart lesion, you would generate 40,000 false positives in the UK a year and there are only ten centres that do foetal cardiology. So that's 4,000 just for this one centre. You introduce a test, there’ll be these additional 4,000 patients, which you can't do.
Paediatrician
When using AI, there needs to be consideration of the extent of an AI tool’s impact across a system – a key role for ICB strategic commissioning. AI that addresses one aspect of the patient journey, for example medical image analysis, can help reduce radiography workload by removing images that give a negative diagnosis. But patients who do have conditions still need clinical time; previously this demand might have been more spread out, and accelerating one contact point can overwhelm the next. AI solutions deployed to address a single activity and poorly optimised across the pathway risk overwhelming other parts of the system, creating more work or inefficiencies rather than productivity gains.
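The paediatrician’s arithmetic above can be sketched in a few lines. All figures here are illustrative (roughly 600,000 UK births a year, a hypothetical 93.3% specificity, ten foetal cardiology centres); the point is how quickly even a small false-positive rate translates into downstream workload:

```python
# Back-of-the-envelope sketch of screening false-positive workload.
# All inputs are illustrative assumptions, not published figures.

def false_positives(screened: int, prevalence: float, specificity: float) -> int:
    """People without the condition who are incorrectly flagged by the test."""
    healthy = screened * (1 - prevalence)
    return round(healthy * (1 - specificity))

screened = 600_000    # annual screened population (assumption)
prevalence = 0.001    # rare single heart lesion (assumption)
specificity = 0.933   # i.e. 6.7% of healthy scans flagged (assumption)

fp = false_positives(screened, prevalence, specificity)
print(f"~{fp} false positives a year, ~{fp // 10} per centre across ten centres")
```

Even a test that looks highly specific on paper generates tens of thousands of referrals when applied at population scale, which is why sensitivity and specificity have to be assessed against downstream capacity, not in isolation.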
Conclusion
AI is already here. The recent leap in generative AI tools has sparked interest, but many AI tools have been in development for much longer. AI is no longer about future potential – it is here today, with real benefits if it can be integrated into the health and care system effectively, based on the evidence and limitations of the tools. Many of the hurdles facing the use of AI are not unique to this technology but are long-standing barriers to digital transformation. Leadership drive is needed to establish the infrastructure, implementation capacity and staff development required to embed the benefits of AI and address the challenges in the health and care system.
Adapt to thrive in digital health and care – community of interest
Explore how to adapt and thrive in the developing digital health and care system by joining our new community of interest, with an online platform and monthly online sessions.