Health systems are trusted with lives, but not data

Do those who serve in the military suffer long-term damage to their health? To read the frequent newspaper stories about army veterans down on their luck, you might well think so.

But a study in Scotland suggests very strongly that it isn’t true. Researchers in Glasgow studied the medical records of 56,000 veterans who were born between 1945 and 1985, comparing them with 173,000 people who hadn’t done military service. The results showed that the veterans’ health was better, not worse, than that of civilians.

The study is a classic example of the power of data to answer interesting and important questions. To reach its conclusions it linked military records, hospital admission data, the Census, which shows where people live, and death certificates, which record when, where and how people die. Increasingly, data sources like these are stored digitally, making it much easier to link information gathered by different organisations for different purposes, and to achieve something that nobody could have anticipated at the time the records were made.
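In practice, linkage of this sort boils down to joining records from different sources on a shared, pseudonymised identifier, with names and addresses stripped out before researchers ever see the data. The short sketch below is purely illustrative and is not the method used in the Glasgow study: the field names, the link_id key and the toy figures are all invented for the example.

    # Illustrative sketch of record linkage on a pseudonymised key.
    # Field names, values and the "link_id" identifier are invented;
    # they do not reflect any real NHS, Census or military schema.
    import pandas as pd

    # Each source holds different facts about the same (pseudonymised) people.
    service_records = pd.DataFrame({
        "link_id": [101, 102, 103, 104],
        "veteran": [True, True, False, False],
        "birth_year": [1952, 1968, 1960, 1974],
    })
    hospital_admissions = pd.DataFrame({
        "link_id": [101, 103, 103],
        "admission_cause": ["cardiac", "fracture", "respiratory"],
    })

    # Joining on the pseudonymous key brings the sources together without
    # exposing names or addresses, so outcomes can be compared across groups.
    linked = service_records.merge(hospital_admissions, on="link_id", how="left")
    admissions_per_person = (
        linked.groupby("veteran")["admission_cause"].count()
        / linked.groupby("veteran")["link_id"].nunique()
    )
    print(admissions_per_person)

In real studies the linkage key is normally created and held by a trusted intermediary, so that the people analysing medical outcomes never handle identifying details.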

Most people are willing to allow their medical records to be shared for this kind of research, when it is explained to them. Most are also happy to see personal medical data used to improve health services. But consent weakens when it is proposed that the same data should be available more widely, to industry and charities, for example. Some people bridle at the idea of insurance companies poring over their personal medical records, however carefully they have been anonymised to prevent the identification of individuals.

This failure of trust has created a vicious circle that could make things worse. The more that well-meaning organisations such as the Nuffield Council on Bioethics or Dame Fiona Caldicott, the National Data Guardian, call for tougher penalties for breaches of data security, the more people come to believe that such breaches are a profound threat to their well-being. And every move to improve public willingness to share data by making it easier for doubters to opt out further legitimises these fears.

The danger is that making a fetish of consent at the expense of other public goods, and criminalising accidental data breaches by jailing those responsible, does not serve the wider public interest. The Farr Institute, which represents 21 universities and healthcare bodies in the UK involved in data-based research, believes such recommendations are “politically expedient” and that severe sanctions, if applied inappropriately, “may hinder data sharing without adding safeguards”.

Ironies abound in public attitudes towards personal data. On Facebook, people share their intimate lives without a care. On Twitter, they share their opinions, wise or foolish. On dating sites, they share – well, let’s not go there. They acquire supermarket loyalty cards, which provide small savings in return for the store getting hold of huge quantities of data about their personal shopping preferences. They bank online, trusting the software to protect their interests. They commit indiscretions on email as if the conversation were a whispered exchange in a dark wood. Yet they harbour suspicions of medical data-sharing, despite belt-and-braces safeguards that leave most of these other platforms standing. According to the Philips Future Health Index 2016, one of the largest barriers to the adoption of connected technology in healthcare is data privacy.

Nor do the ironies end there. Hospital Episode Statistics, comprehensive anonymised databases covering every treatment carried out by NHS hospitals, have been in the public domain for decades without complaint. They help pinpoint instances of poor care, such as at Mid Staffordshire NHS Foundation Trust. Yet when an attempt was made to incorporate data from GP practices into the system, it failed to win public trust and was eventually killed off by the Caldicott report of June 2016.

It is this failure that lies behind the current scramble to regain trust. Care.data, as the project was called, had good intentions but a muddled implementation. The calculation seems to have been that if the project had parliamentary approval, as it did through the Health and Social Care Act 2012, and if it merely mimicked what Hospital Episode Statistics were already doing, it could be implemented swiftly and painlessly, more or less without anybody noticing. If so, the ploy failed.

The strategy has now changed, but less than you might suppose. The aim remains to get the maximum possible access to data with the minimum possible box-ticking. Moves by the European Parliament to toughen the European Data Protection Regulation by requiring specific and explicit consent from individuals were strongly opposed by a coalition of research organisations led by the Wellcome Trust. “When safeguards become disproportionate, they benefit no one,” said Dr Jeremy Farrar, director of the trust.

He was right: specific and explicit consent is impossible, since nobody knows at the time data is collected how it may later be used, as the example of the Scottish military veterans makes clear. What is needed is broad consent, backed by strong rules on anonymisation and confidentiality safeguards, within a regime where ethical approval balances the benefit of a particular study against its risks. Fortunately the EU heeded the warning and retreated to a more balanced position in the final text.

Dame Fiona Caldicott’s panel attacks the problem by proposing that once personal data has been anonymised, it ceases to be personal data at all and can safely be used by others. It proposes that de-identification should be the job of NHS Digital (formerly the Health and Social Care Information Centre). Individuals would not be empowered to stop their medical data reaching NHS Digital from either primary or secondary care.

Personal data would continue to be used for the direct care of individuals (as it is now) and, subject to opt-outs, for running the health and social care system and for research. Whether these two uses should be subject to separate opt-outs or a single question was left unanswered.

What happens next? The UK Government launched a consultation, which closed on 7 September 2016. It has yet to make the results public, or to say what it will do. The future of big data as a tool for medical discovery may depend on how persuasively the Government can make the argument.


Nigel Hawkes

About the author

Nigel Hawkes

A science and health journalist, formerly with The Observer and The Times, he is a regular contributor to the British Medical Journal and was director of Straight Statistics, a campaign group for the honest presentation and use of statistical data.

