It is time to put trust, transparency and fair value at the centre of digital health and care


Health data has the potential to change health and care provision, but it also reveals some of the most personal information about ourselves. Emerging examples demonstrate a remarkable potential for data to give new insights that enable service improvements, such as better understanding unmet need in universal childhood vision screening, enabling blood cancer treatment to be tailored, and training AI software that could improve detection of eye disease.

For data to be used for public benefit, citizens must be able to trust that their data will be used appropriately, with complementary benefits (or fair value) for patients, the public, the system and private companies. Do our current approaches incorporate trust and fair value in a way that is suitable for the new world of digital technologies?

The NHS has been buying physical goods for a long time, so we have a reasonable understanding of what fair value looks like and how to achieve mutual benefit – the NHS can agree a good price and companies can earn a sustainable income – although this process is far from perfect, with variability and inefficiency. But digital technologies increasingly complicate this exchange as we move from tangible physical goods to intangible ones such as software, services and data. There is some potential to learn from the purchasing of physical goods, but data specifically is unique to digital and brings with it a whole new set of challenges.

'What is fair value to the NHS? Is it monetary return, improved outcomes or increased efficiency?'

It is difficult to codify fair value for data into a simple price list, largely because data is highly personal, has multiple variables and is use dependent. Perceived value depends on factors such as how much data there is, its quality, the time span covered, demographic representation and completeness. One person’s blood pressure data has some value in isolation, but it rapidly becomes much more valuable in combination with other data such as heart rate and level of physical activity (eg from a wearable device). This personal data is valuable for an individual and their care provider, but once pooled across a representative population the perceived value increases substantially – the data can be used to improve the health of many. Physical goods can be treated as a simple one-off monetary transaction in return for a medical product, but digital technologies create data that can also benefit companies by improving their algorithms and insights to generate new business. Ideally, we seek equally fair value for the health and care system, patients, the public and companies. It sounds easy until we try to pin down what exactly we mean by fair value, and specifically, what is fair value to the NHS? Is it monetary return, improved outcomes or increased efficiency?

Integral to agreeing on fair value is trust in the institutions, companies and decision-makers that hold, control and use our most personal data. In the UK we are lucky to have a health service that is trusted by the majority of the public to keep data safe and to use it appropriately to improve care. However, we have seen past missteps erode this trust; as a result, many believe that agreements benefit private companies more than the NHS. Unfortunately, there is little evidence that missteps have translated into lessons learnt. For example, the recent test and trace agreement with McKinsey states that no data sharing will occur without written consent from the data controller and relevant data protection in place, which may seem sufficient. But it is unclear how fair value to the NHS and the public fits within the conditions for data access. Just as data protection has become standard, transparency on how fair value will be defined and generated needs to become a standard part of agreements.

While it is a complicated area, there are already good precedents for establishing a shared understanding of fair value. One approach is a citizens’ deliberation, such as the recent engagement with OneLondon citizens, a collaboration with Understanding Patient Data, The King’s Fund and Ipsos Mori. Over several years, multiple citizens’ deliberations have developed the knowledge and processes for engaging citizens on complex topics in a structured way, deriving consensus on fair value and appropriate data usage.

Given what we know about the benefits of involving the public, who are becoming more privacy conscious, we are left with a question: when will this established knowledge be used by data controllers at all levels of the system when working with corporate partners, to modernise best practice for transparency and fair use of data? Drawing on the existing body of work, data controllers sharing patient data need to consider the following questions.

  • How have citizens’ established expectations on fair use of data, and the practices that support them, been incorporated into agreements?

  • How will the public be engaged in the development of any agreement and ongoing activities?

  • If agreements have multi-year durations, how will the initial agreed value be adjusted over time?

'Holding and using data is a privilege based on trust, earned through professional values and behaviour.'

Deliberation on a complex issue like fair value helps develop a shared understanding and a way forward that is more nuanced than simply ‘do not sell data’. Most, if not all, approaches to data agreements can be improved by incorporating these established expectations and engaging with the public. This won’t solve all the problems, but it will help to maintain and improve trust while simultaneously modernising best practice.

Holding and using data is a privilege based on trust, earned through professional values and behaviour. Covid-19 has accelerated digitalisation and created new data streams, and the importance of data and good data practices continues to grow, but we cannot keep putting off the modernisation of our protocols. The system will only work optimally if patients feel safe providing and sharing their data. Without improved protocols there is a real risk that this sense of trust and safety will be further eroded.