Carol Platt: Creating a living hospital

Article information

  • Posted: Friday 6 October 2017

Speaking at The King's Fund's Ideas that change health care event on 6 October 2017, Carol Platt (Innovation Associate at Alder Hey Children's Hospital) shares how Alder Hey is creating a digital ‘living hospital’ that can sense and feel what is happening inside its building and act accordingly.

Transcript

Good afternoon. My name's Carol Platt, and I'm from up North, as you can probably tell by my accent. I work at Alder Hey Children's Hospital, and it's great, actually. I'll give you a little bit of my background: I'm not clinical, but I have worked in the NHS for 41 years, and I actually started my career at Alder Hey, although I haven't been there the entire time. When I started in healthcare, the technology I was using was a shorthand pad, a pencil and a manual typewriter, so as you can see we've come an awful long way in 41 years.

A little bit about our vision of the future at Alder Hey. This is the vision for our new hospital, our hospital in the park. It's a hospital that's unique in Europe, because it's set in a park, and we're calling it the living hospital vision. And why are we saying that? To steal the phrase from one of our most famous songs in Liverpool: Imagine. Ian Hennessy is our clinical director of innovation, and Ian's vision was: just imagine how magical a hospital could be if it did more than just house patients and staff. Imagine how it could sense and feel what was happening within it. Imagine if a building could have a caring heart and a smart brain, and could interact and entertain within it, particularly in paediatrics. A hospital is quite a scary place for children to come into, and equally for their parents as well.

But what we also wanted was a hospital that never tires: one that works 24/7, is always on the go, always continuously learning and always improving.

Well, actually, we moved into that building two years ago, two years ago this weekend just gone. It's an amazing building. I walk out of the innovation hub where I work now and I'm just blown away every time I walk out of there; it's beautiful. As I say, I worked in the old hospital for a long time as well.

What are we going to need in there? We're going to need a brain, and this is where the cognitive computer comes in; we're working with IBM Watson and the Hartree Centre. We're going to need some senses: the internet of things. I've heard about that, and I'm just finding out about it as well; I've only been in the innovation service for three months, so I'm learning a lot. And the heart: with cognitive computing, can we look at how we do sentiment analysis, and something in there about an engaging avatar? We're looking at how we can implement some apps, some mobile technology that will actually engage with the children and collect that information, and some smart screens in the rooms of the hospital. So those are the three things that we want our living hospital to do.
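To make the "brain, senses and heart" idea a little more concrete, here is a minimal, illustrative Python sketch of how sensor readings, a crude sentiment score and a screen response might fit together. Every name and rule in it (SensorReading, LivingHospital, the word lists) is hypothetical and invented for illustration; this is not part of any Alder Hey or IBM Watson system.

```python
# Illustrative sketch only: hypothetical names, not a real hospital system.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class SensorReading:
    """One 'sense' input, e.g. noise or temperature on a ward."""
    location: str
    kind: str          # e.g. "noise_db", "temperature_c"
    value: float
    timestamp: datetime

@dataclass
class PatientFeedback:
    """A message a child or parent sends via an app or smart screen."""
    patient_id: str    # pseudonymised identifier, no real patient data
    text: str
    timestamp: datetime

def toy_sentiment(text: str) -> float:
    """Crude stand-in for sentiment analysis: -1 (anxious) to +1 (happy)."""
    anxious = {"scared", "worried", "hurts", "afraid"}
    happy = {"great", "fun", "better", "happy"}
    words = text.lower().split()
    score = sum(w in happy for w in words) - sum(w in anxious for w in words)
    return max(-1.0, min(1.0, score / 3))

class LivingHospital:
    """The 'brain': stores sense readings and reacts to how children feel."""
    def __init__(self) -> None:
        self.readings: List[SensorReading] = []

    def sense(self, reading: SensorReading) -> None:
        self.readings.append(reading)

    def feel(self, feedback: PatientFeedback) -> str:
        """Decide how the avatar or smart screen in the room should respond."""
        mood = toy_sentiment(feedback.text)
        if mood < 0:
            return "Show a calming animation and alert the play specialist."
        return "Keep the interactive screen in play mode."

# Example: a worried message triggers a calming response.
hospital = LivingHospital()
print(hospital.feel(PatientFeedback("anon-001", "I'm scared it hurts", datetime.now())))
```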

So I'll take you back just a step. This is where we came from two years ago, an old Victorian and Edwardian building, and, as I say, this is where we are now. These are some of the things we've got in there. We have an interactive operating theatre; this is live, this is what we use now. It collects images, and we can look at those anywhere in the hospital. This is one of the great successes that we had with the children: an interactive 4D television in our surgical day case unit. The children colour in little fish or turtles and then launch them onto the screen. We have medical students go round and shadow our patients and families through their experience as well, and when they ask what did you like, everybody loves this one. And this is our 4D immersive ECG and echo lab, so it's just sound and vision.

This is where I work, the innovation lab, or, as we refer to it, the bat cave. It's very Blade Runner; there's a bit of a Blade Runner motif going on there, very industrial. It was supposed to be a plant room and it wasn't needed, so we took it over.

What else are we doing in there? Well, some of the things that we've heard about this morning, the things that you could do, we're actually doing some of this stuff. At the top there we've got Raphael, who is one of our cardiac surgeons, and to the right of him is work that we did with the University of Liverpool. We have an interactive heart built using virtual engineering, technology that was used in the automotive industry. Working with them, we have built a 3D heart from scans. We use this through VR: people come along and put their headsets on and they're absolutely blown away by it. I had Google in this week, showing them around, and nearly every one of them had the headset on. It's like caving: you're going in to look at a heart that you could almost hold in the palm of your hand, and going deep caving inside it. The surgeons are using this to look at how they do preoperative planning on a child's heart. It is an actual child's heart (we had permission from the family to use it) and they'll be looking at it in preoperative planning.

We're working with Microsoft, looking at HoloLens, and we're working with Sensor City, looking at sensors; the one on the right there, next to the 20 pence piece, is a wearable. Going back to wearables: one of the worst things you can do with a child is stick a needle in them. It's a terrible experience, particularly if it's a small baby, where we've got to get lots of blood off them and they've not got much, so could we use some sensors there instead? We've also got 3D printing, which constantly goes; it's like this white noise opposite my desk. We have six printers and we use these, again, for preoperative planning but also for education, and not just for the clinicians and students that we have there: we use it to help patients and families understand their care as well. So we're using all of this.

The guy in the middle there, who looks as though he's talking to a small robot, is Ian Hennessy (he's not the robot, I should add). I don't know if you've ever seen the episode of The Big Bang Theory where Sheldon rolls in on a little iPad. We have a robodoc, and Ian has frequently been in the programme board meeting on a Tuesday night from Florida, in our room.

And there on the right-hand side is a photograph from one of our wards. This is something called No Isolation, which we're trialling at the moment with one of our children from the oncology unit. Some children who have long-term conditions and have to come into hospital for long periods of treatment get isolated from their social and educational streams, and literally about two weeks ago we put this out there to start evaluating it with one of our children.

Going back to the beginning: what did we want to do about building the living hospital? I was fortunate a few years ago to do some work with The King's Fund on the patient and family-centred care model, and that has always been at the heart of, and the driving passion behind, what we do. So we're always building on that, taking it back to what makes a good patient experience, and it isn't just about keeping them entertained. What is the experience actually like?

So if we follow a patient's and family's journey (and it's always the family in there as well, the parents as well), we know that at certain points in that journey there's going to be more anxiety, so we want to see how we can look at that. Equally, parents' anxiety can run alongside that, so we're thinking about how we can pick this information up. About eighteen months ago we started working with the Hartree Centre, which, fortuitously for us, is only up the road at Daresbury in Warrington, and what we're looking to do is create the UK's first cognitive hospital, using IBM Watson technology.

So Watson looks at natural language processing, and it's continuously learning, and this is the bit that I'm involved in. Again, I'm not actually a tech person, and I'm not a clinical person, but I work with tech people and I work with clinicians as well. What we've been doing is building the chatbot. First off, we started just gathering an awful lot of questions. We took about six months going out and finding out what the things were that patients and families wanted to know, and we're going to start off quite small, because there's an awful lot of data and we want to understand it from the start: what are all the challenges that we're going to face around this? So we started to collect all of this information, then we started to embed that knowledge, and then we started to teach the system, and we learnt about intents and entities. If anybody's seen Arrival, I nearly jumped out of my chair when she wrote this thing up there, saying you have to understand this language, you have to break it down into intents and entities, and I went, oh my god, that's all I'm doing at the moment. I can go and talk to aliens.
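For readers unfamiliar with the terms, here is a toy Python sketch of what "intents and entities" mean when training a chatbot: the intent is what the family is trying to ask, and the entities are the specific details inside the question. The intent names, example questions and matching rules below are made up for illustration; this is not the IBM Watson Assistant API or the actual Alder Hey chatbot.

```python
# Toy illustration of intents and entities; all names and examples are invented.
import re
from typing import Dict, List, Optional, Tuple

# Each intent is something a family might be trying to do, with a few example
# utterances of the kind gathered during the question-collection phase.
INTENTS: Dict[str, List[str]] = {
    "find_parking": ["where can I park", "is there a car park", "parking cost"],
    "visiting_hours": ["when can I visit", "what are visiting hours"],
    "prepare_for_surgery": ["can my child eat before the operation",
                            "what should we bring for surgery"],
}

# Entities are the specific things mentioned inside a question.
ENTITY_PATTERNS: Dict[str, str] = {
    "ward": r"\bward\s+[A-Za-z0-9]+\b",
    "time": r"\b\d{1,2}(:\d{2})?\s?(am|pm)\b",
}

def classify(question: str) -> Tuple[Optional[str], Dict[str, str]]:
    """Very crude intent matching by word overlap, plus regex entity spotting."""
    q_words = set(question.lower().split())
    best_intent, best_overlap = None, 0
    for intent, examples in INTENTS.items():
        overlap = max(len(q_words & set(ex.lower().split())) for ex in examples)
        if overlap > best_overlap:
            best_intent, best_overlap = intent, overlap
    entities = {name: m.group(0)
                for name, pattern in ENTITY_PATTERNS.items()
                if (m := re.search(pattern, question, re.IGNORECASE))}
    return best_intent, entities

# Example: prints ('visiting_hours', {'ward': 'ward K2', 'time': '7pm'})
print(classify("Can I visit ward K2 at 7pm?"))
```

A production assistant learns these mappings statistically from many labelled examples rather than by keyword overlap, but the split into intents and entities is the same idea.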

At the moment we're training the system, but we're very close; we're out there at the moment, so we're about to go live with this. For now, as I say, we're keeping it quite small and it's quite anonymised; we're not collecting any patient information, but eventually, yes, we will. All the stuff that I've talked about, the sensors and the 3D and all of that, eventually that's what we want to do: try and bring all of that information in. But we realise we have to start small, because we have to understand the implications of where we're going with this and what all the challenges are.

So this is Oli; if I had clicked on there in the past, it would have taken you straight through to our link. Again, it goes back to the challenges. Obviously in paediatrics, we've heard a lot about adults having access to their information; when it's a child, we've got to think about where that balance lies. We've got to think about third-party integrations, because there's an awful lot of other stuff in there, and we've got multiple platforms. How do we make it child friendly? The people who are going to tell us how this technology should work are probably the children, because they're going to be the ones who are really using it. How do we commercialise it? This is great, but how do we take it forward? And obviously, as I said, there's the security and future integration with our electronic patient record, and always making sure that we get it right.

We're continuously learning and, as I say, he's about to be set out into the wild shortly, and we'll carry on testing with him. So that's what we're doing at Alder Hey, and those are our contact details.

Thank you very much.