The Guardian - UK
Cori Crider

‘Our health data is about to flow more freely, like it or not’: big tech’s plans for the NHS


Last December, I had an abortion. Most unwanted pregnancies set a panic-timer ticking, but I was almost counting down the seconds until I had to catch a flight from London, where I live, to Texas, where I grew up, and where the provision of abortion care was recently made a felony. You bleed for a while after most abortions, and I was still bleeding when I boarded the plane home for Christmas.

Going to Texas so soon after the procedure made me consider where the record of my abortion – my health data – would end up. When I phoned an abortion clinic in late November to book an appointment, one of the first questions staff asked was: “May we share a record of your treatment with your GP?”

I hesitated. There’s nothing wrong, in principle, with this question, and a lot that’s right. It’s not just that a complete health record helps my GP treat me. My Texan parents, both scientists, taught me that sharing information with organisations like the NHS can help it plan services and research ways to improve care. I’ve joined clinical studies in the past.

But I also help run a legal campaign group, Foxglove, that takes action against the government and tech companies when they infringe people’s rights. In a series of cases about NHS data since the start of the pandemic, we have defended people’s right to a say about who sees their medical information. This work has exposed me to worrying details about how our medical data can be used, including the Home Office practice of tracking migrants using their health records.

NHS data is special. For decades the government has required GPs to store patients’ records in a standardised way: as well as longhand notes, every interaction with a GP is saved on a computer database in a simple, consistent code. The NHS may hold the richest set of population-wide, machine-readable health data in the world. Many see in that data a source of immense potential – and profit. Ernst & Young has valued NHS patient data at £9.6bn a year. It is particularly valuable to tech giants, who would like to get their hands on NHS datasets to build AI machine-learning systems.

As things stand today, I believe my local GP would safeguard the record of my abortion. By Christmas next year? I’m not so sure. It may no longer be up to them. The government is seeking to overhaul the way it handles every health record in England, and its plans have filled some healthcare workers with alarm.

There is clearly a need to make patient data more consistent and accessible across the NHS. At the moment, caregivers in one part of the NHS often can’t access the records of care their patients received elsewhere. Imagine you have had knee pain for years. Doctors want to operate. At various times you have seen your GP, and been to a hospital in south London, but the surgery is due to be carried out in a different borough. (This kind of thing happens all the time.) A doctor is trying to judge how risky your operation is, to decide if you can go to a health centre, where you might be seen sooner and released quicker, or must be sent to a hospital.

Because your health data is held in many places – at your GP and across various hospitals where you have been – there is not one complete picture of you as a patient they can use to check. Of course the clinician will ask about your health before the operation; but you may not remember, or even know, every detail that could affect how your surgery could go. If this data was better shared, they could create a clearer picture of your health. Without this information, their assessment of your risk for surgery is, at best, an educated guess.

This partial view of the patient is commonplace in the NHS, and it is one reason the government is trying to create a vast new NHS England database, known as the federated data platform (FDP). (Health data is managed nationally, so this system is England-only.) If data from the nation’s hospitals, GPs and social care were fed into a single system, accessible in one place to health service doctors and planners, it could help planners by revealing trends across regions and across the population as a whole. The contract to build and run the platform is worth a huge sum – £480m over five years – with the winning bidder expected to be announced any day.

Putting so much data under central control may increase efficiency – but it also risks failing because of poor consultation and low patient trust. Many would prefer that NHS England invested in its own capacity, instead of farming the work out to private enterprise. The frontrunner for the contract to run the FDP is the US tech firm Palantir, which has performed data analytics work for the US security services, border forces and police. If you don’t know Palantir, you may be familiar with the company’s chair, Peter Thiel, a tech billionaire and Trump supporter, who has funded anti-abortion candidates and invested in anti-birth-control startups. Contacted for this article, Palantir emphasised that it is “not in the business of mining data, nor do we sell or monetise it in any way. What we do is provide tools that help customers understand and organise the information that they hold.” NHS England said that it, not the supplier, would control the patient data held inside the platform. It also pointed to contractual safeguards “to prevent any supplier gaining a dominant role in NHS data management”.

Still, the FDP gives the government a lot of power over health data – and the government hasn’t always earned our trust. In 2019, the head of NHS England and other top health officials discussed nine “commercial models” for access to the nation’s health data with tech and drug executives. A bill currently going through parliament, the data protection and digital information bill, would significantly dilute some of the laws that protect patient privacy. Data protection law – including the EU’s General Data Protection Regulation (GDPR), in force since 2018, and its UK equivalent – limits what can be done with sensitive data such as health records. We mostly associate GDPR with infuriating website pop-ups, but it is a useful safeguard. It was one of the reasons the abortion clinic needed my permission to share my medical record with anyone, including my own doctor.

One of the most important of the proposed changes is the move to redefine what counts as personal data. Through a process of “pseudonymisation”, the system can detach your name and other identifying details. But this process is imperfect and reversible – and patients consistently say they want control over how their health data is used. If the government amends or interprets data protection law to take this choice away from us, it is hard to see how patient confidentiality as we know it will still exist. Our health records are precious to us. We come to the doctor as supplicants. We share worry, fear, pain. Our ability to do so depends on trust, which itself depends on the guarantee of privacy.
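
For readers curious about the mechanics, here is a minimal sketch of what pseudonymisation can look like in practice, and why it is weaker than it sounds. It is purely illustrative: the record fields, the salted-hash scheme and the re-identification scenario are assumptions made for the sake of example, not a description of how NHS England or any supplier actually handles records.

```python
import hashlib

# Illustrative only: the field names and the salted-hash scheme are
# assumptions for this sketch, not how NHS systems actually work.

SECRET_SALT = "held-separately-by-the-data-controller"

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier (an NHS number) with a salted hash."""
    token = hashlib.sha256((SECRET_SALT + record["nhs_number"]).encode()).hexdigest()
    out = {k: v for k, v in record.items() if k != "nhs_number"}
    out["patient_token"] = token
    return out

record = {
    "nhs_number": "9434765919",   # made-up, NHS-style 10-digit number
    "date_of_birth": "1985-03-02",
    "postcode": "W5 2AU",         # hypothetical postcode
    "event": "termination of pregnancy, clinic visit 2022-12-06",
}

print(pseudonymise(record))
# The name and NHS number are gone, but the protection is limited:
# anyone holding the salt can regenerate the token for a known person,
# and the remaining fields (date of birth + postcode + a dated clinic
# visit) are often unique enough to re-identify someone by linking the
# record to another dataset. This is why pseudonymised data is still
# treated as personal data under GDPR.
```
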

After a summer of strikes over pay and conditions, everyone can see that the NHS urgently needs more resources. The NHS could use our data – with strict safeguards – to streamline clinical decision-making and take pressure off doctors and nurses. But the government’s weakening of data protection rules, set against its cosying up to the tech industry, is worrying. In the US in 2021, Google won a deal to access patient records at more than 2,000 clinics to develop healthcare algorithms. Google, like Microsoft and IBM, seeks to access patient records at scale in order to train algorithms to develop apps or organise care. In this troubling scenario, my pain, and yours, may become training data for someone else’s AI.

All this lingered in my mind on the call with the abortion clinic.

Was it all right, they asked, if they shared this record with my doctor?

“I’m sorry, but no,” I said.

Having an abortion was the most isolating experience of my adult life. It taught me how stigma lingers over abortion even in the UK, where treatment is widely available. It also changed how I think about health data: not only mine, but that of my family in Texas, and of patients across the UK. It spurred me to reflect on what is at stake if these changes to patient data go ahead.

With the drive towards AI in healthcare (from triage to diagnosis to drug research), we in England face a choice: will we privilege care and the public sector, or profit? People in the UK treasure the sanctity of our health record – we want control over who sees it, when, and why. What will it mean when we no longer believe a conversation with an NHS doctor is private?

* * *

As I was flicking through Twitter on the early morning train to the clinic, a promoted post I’d never seen before popped up. It was a jokey meme showing a contraceptive pill pack, with the line: “Take your magic pills ladies, you don’t want any Baby Jesuses for Christmas.” Living under surveillance capitalism makes you paranoid. I snapped a screenshot, sent it to two female friends on Signal, then panicked, deleted the picture and turned my phone off.

It was 8am when I sat down at the abortion clinic in Ealing, west London, where a range of women awaited their appointments. Some wore fur coats, others tracksuits and false eyelashes; a few had been crying. A middle-aged woman, who looked as if she was rushing out to her next meeting, just seemed very pissed off. The only man in the waiting room was a teenager who had come along with his girlfriend.

At first I thought my reluctance to share my abortion record had mainly to do with data. I later realised it also had to do with shame. A cloud hangs over abortion. There is a strong residual stereotype of the woman who aborts: someone young, a bit careless, who knows no better.

In the parallel system you enter in England to access abortion, the language reveals a world shadowed by female vulnerability and male violence. I could hear it when I phoned the clinic: “Are you alone?” “Are you safe?” “Please choose a password and a memorable number; we will only use these to contact you about your treatment.” The clinic never used the word “abortion” by text or email. All this signals that you have ventured to a space outside normal healthcare. Nobody hassled me outside the clinic that day, but I’ve seen the protesters with signs outside another London clinic.

Nurses examining data on a computer in a hospital ward in Glasgow. Photograph: Murdo MacLeod/The Guardian

Go for surgery, a friend at a party had whispered; the pills cause far more pain. I remembered lying on the bathroom floor a decade ago during a miscarriage, the weeks of bleeding and pain, and decided to investigate. At the clinic, a seasoned Danish nurse told me I needed an internal ultrasound to assess possible treatments. I asked how soon I could have the surgery. The nurse said there would be a wait, until just before I was due to go to Texas, where there would be no aftercare.

It was only then, when I was lying on the table and the nurse prodded my ovaries and my vagina, that the sobs came. I didn’t want another child. But bodies tend to hold on to the remnants of pregnancies, miscarriages and births; I remembered watching other ultrasounds: the child who developed and the child who did not. I lay on the shore of life and death and felt utterly alone.

The nurse told me it was early, about six weeks, and handed me a tissue. (Six weeks is the criminal threshold in Texas.) She had had this conversation thousands of times. “Were you hoping I wouldn’t see anything?” No, I blubbered, I am just tired. I didn’t mention my rage: at every Baptist or Lutheran or Methodist minister in the small town where I was raised, at Texas, at myself. It was clear the surgery would be too late, as I had to fly to the US. Before leaving the clinic I took the first lot of pills.

You may support abortion rights on principle, as I do. You may also, as I do, admit privately that you find its ethics ambiguous. Having an abortion taught me that a woman’s ability to sustain or end life is simultaneously an awful burden and an awful power. Mulling over the complexity of it all now seems like wallowing in a warm bath. When the reality appears, you are thrown in a cold river. The cells are dividing; a choice must be made. Swim on, or turn back? I felt the power and I hated and feared it, and I realised why so many men hate it too, and seek to control it.

In most of the UK, abortion is still technically a crime. The 1967 Abortion Act does not legalise abortion; it excuses it if two doctors agree that continuing the pregnancy (before 24 weeks) poses a risk to the mental or physical health of the mother. (Northern Ireland is different, with abortion legal on demand up to 12 weeks.) This teaches women that to access care they must construct a narrative that continuing the pregnancy would be damaging to mother or baby, or both.

Perhaps that’s why I didn’t want the clinic to share my data. Having reduced a complex experience to a just-so story once, I didn’t want that story to be fed into a mass data system that could, I worried, one day be shared with the health division of Google, or the German drug giant Merck, or some other company. I saw my abortion as my story, and not one from which I was keen to permit any corporation, however modestly, to profit.

You emerge from this ordeal with a story you guard fiercely. This doesn’t mean you don’t tell it to others: it means you want absolute authority to tell the story as and when you see fit. It was months before I told my Texan family about the abortion; in the service of a wider point, I am writing about it now.

In German legal language, this is “informational self-determination”, or informationelle Selbstbestimmung. To me, the idea means something slightly different from privacy. Privacy connotes a turning away, a hiding. But I didn’t want to hide my abortion: I wanted agency over when the story was told. Real data protection means you deal with your record on your terms.

* * *

In some places, the data trails we leave have become a way to criminalise us. Earlier this year, a heartrending photo from Nebraska showed a girl of 19 sobbing as she was escorted from court, having been sentenced to 90 days in prison for having an abortion at the start of her third trimester. Police had obtained her Facebook messages with her mother about organising the pills. Her mother was given five years for procuring the pills and hiding the body.

In Texas, over Christmas lunch with my sister and her teenage daughter, I wondered what I would do if they needed help. On what platform could we speak safely? How could I send them information, or bootleg them pills? Texas, like most of the US, has no GDPR equivalent, meaning women have no real control over their data. This allows apps such as period trackers to share location and other data with the authorities when the law requires.

This summer, a British woman was sentenced to prison after her search history revealed that, during the pandemic, she had a late-stage abortion. A couple of phrases typed into Google – one was “I need to have an abortion but I’m past 24 weeks” – helped to convict her. (The sentence, which drew broad condemnation from MPs, doctors and feminist groups, was later suspended, but her criminal record stands.)

“The fundamental thing to understand about NHS England is that it is the government,” says Marcus Baw, a GP and IT specialist. By this he means that officials, not clinicians, will make many of the critical decisions about which data flows into the new NHS platform. These officials, not doctors, will interpret the law about who will have access to it, including other government departments. I have yet to see an official give a clear answer to this question: once a copy of my health data has gone into the central platform, who decides what happens to it? The answer will seriously affect the public’s trust in the scheme.

Peter Thiel, cofounder of Palantir, which is bidding for the contract to run the NHS federated data platform, at the Bitcoin 2022 Conference in Miami. Photograph: Marco Bello/Getty Images

This is not a new problem. In 2013, NHS Digital was set up to develop and operate digital services for NHS England. Its first chair, Kingsley Manning, stepped down in 2016, saying he had grown fed up with the Home Office demanding to see migrants’ health data in order to check their location for border enforcement. This practice, understandably, deterred so many migrants from seeking care that the government had to run an information campaign during Covid to encourage people who had migrated to Britain to come for their jabs.

NHS Digital was also set up to protect patient data, with decisions about data-sharing theoretically insulated from central government control. It was abolished last year, in a move Manning described as a “retrograde step” with “the potential for undermining the relationship between clinicians and their patients”.

It is jarring to set these serious questions of patient trust against Rishi Sunak’s breezy promises, last November, about “using our new Brexit freedoms to create the most pro-innovation regulatory environment in the world in sectors like life sciences, financial services, artificial intelligence and data”. Palantir’s CEO, Alex Karp, has praised the UK’s “pragmatic but serious understanding of data protection”. He recently told the BBC: “If you’re a pharmaceutical company and you want to do research on a medicine, you’re going to be able to do things in the UK you would not be able to do easily in the continent.”

The winning bidder for the contract to run the federated data platform will soon be announced. The data protection and digital information bill, which weakens individual control over personal data, has had a second reading, a key step on its way to passing into law. In the name of progress, our health data is about to flow more freely, like it or not.

* * *

Palantir has spent years preparing the ground for its FDP bid. At the start of the pandemic, with the normal procurement rules suspended, Palantir was given a contract to build a national Covid-19 data store for the NHS for the nominal fee of £1, collecting vast sets of patient data to track the pandemic and, for example, create a list of vulnerable people who needed to shield at home. This data was used to distribute ventilators and manage the vaccine rollout. But the scale of the data collection caused alarm: one Whitehall observer described the level of sensitive patient data being swept into the Covid data store as “unprecedented”, and claimed there was insufficient regard for privacy and data protection. In December 2020, Palantir was given a two-year, £23m contract to continue working on the project. That was renewed for another six months at a cost of £11.5m in January this year. Officials had promised to delete the data store when the pandemic was over – instead, Palantir is being paid £24.9m to migrate it to the new system. The procurement documents describe the purpose of the FDP as “to replace the Covid-19 datastore provided by Palantir”. If it’s anything like its predecessor, the central platform of the FDP will retain a copy of at least some patient data for all of England.

The rules of the FDP – the framework for handling patient records – are crucial. In particular, the public needs to know exactly which parts of their health record are being pooled or accessed nationally. Patients are also likely to ask what specific uses the FDP will be put to, and what may be bolted on later. (The procurement documents for the FDP explicitly say the purposes are set to expand.) Once the FDP is built and a permanent national flow of patient data is established, the main thing standing between that data and other uses is the health minister. There will be information governance advisers, and a watchdog, the national data guardian, but at this scale, these soft power checks and balances will be severely tested.

If the data protection and digital information bill passes, there will be even more reason to be concerned about the security of national health data. “Taken together, this bill maximises data and minimises governance of data fed to AIs,” says the privacy group medConfidential. Given Sunak’s stated enthusiasm for the use of AI in healthcare, the concern is that a senior official will later seek to open up the data for commercial purposes. Without clear, strong safeguards, more people will opt out, and the FDP may fail.

Anti-Trump demonstrators with a banner reading ‘Take your hands off our NHS’ in Parliament Square, London, June 2019. Photograph: Tolga Akmen/AFP/Getty Images

The NHS is a boneyard of failed data-centralisation programmes. The first version of a national IT structure for the NHS was proposed by Tony Blair’s government in 2002. The idea was to replace paper records with a single digital version accessible to patients and doctors. The National Programme for IT ran into serious problems with its software and its over-centralised delivery. It cost more than £12bn over 10 years, and it failed partly because the proposed scale of data-sharing was seen as “highly problematic”. In 2014, David Cameron’s government tried again, with a programme called care.data. Again, unresolved questions about corporate access – particularly by insurance companies – sparked an outcry, provoking more than a million people to opt out of sharing their health records. The government eventually gave up and cancelled the programme in 2016. Five years later, the government launched another project, General Practice Data for Planning and Research, to pool the most detailed data the NHS has: GP records. Following an outcry over government plans to make health data available to private companies, there was another huge wave of opt-outs. In August 2021, the scheme was paused. (My organisation, Foxglove, was involved in a legal case that precipitated the freeze.)

This adds up to more than 20 years of health data centralisation plans – costing billions – and all of them foundered on the shoals of patient trust. Every scandal over centralising patient data also tends to increase the number of people exercising their right to “opt out” of sharing.

Statistics for the national opt-out serve as a rough measure of patient trust. It’s not a good picture: the number of patients who have opted out has ballooned over the years to some 3.3 million. (More than a million of those were added during the row over GP records in 2021.)

Opting out is currently the only practical way to register disapproval of state health-data plans, but it’s a blunt instrument. If you don’t want to give Google access to healthcare records to develop its algorithms, for example, or to help the tobacco firm Philip Morris develop inhalers, you’re also refusing positive uses of health data – say, helping NHS wards to plan ahead for numbers of patients vulnerable to flu. The current rate of opt-out is “incredibly problematic”, warns Dr Jess Morley, a researcher at Oxford University who has worked on patient data systems, because it makes the dataset “more random” and damages the value of the health data to the NHS. But to get these 3.3 million people to opt back in, the state will have to persuade them of its intentions – and its competence.

Recent polling by YouGov, organised by Foxglove for a report, also suggests that if the FDP were to go ahead in a form both managed by and accessible to private companies, around 48% of people in England who have not already opted out would do so. Anything like that figure would be disastrous for the NHS, because it would make the data useless.

* * *

Palantir isn’t the only contender for the platform. A team at Oxford University developed an open-source system called OpenSAFELY to support researchers during the pandemic. The system is secure and transparent – every question asked of it is logged, and all approved projects are published. But it seems no one asked OpenSAFELY to help set up the FDP. There were other bidders, from Oracle Cerner to IBM to a “best of British” consortium of small UK companies and universities – but it’s not clear any of them had a serious shot. By the time the procurement opened in January, Palantir had its feet under the NHS table.

Palantir seems an odd choice of partner for the NHS. Originally funded by the CIA’s venture capital arm, In-Q-Tel, its early corporate history mainly involved supporting mass surveillance and the wars in Iraq and Afghanistan. In 2021, Palantir’s CTO, Shyam Sankar, said he hoped to see “something that looks like Palantir inside of every missile, inside of every drone”. (The company notes its services are also used by Ukrainian armed forces to resist Russian aggression, and by the World Food Programme.) Palantir helped the US and UK’s digital spy agencies (the NSA and GCHQ) manage mass surveillance programmes such as XKeyscore, a system that tracked millions of people online. In Los Angeles, the police used Palantir to build a tool, Laser, which promised to “extract” suspected offenders from the community like “tumors”, until community organisers forced the LAPD to scrap it. Not once have I seen a senior British official publicly reflect on how this history may affect trust in public healthcare among, say, migrant groups, or Muslims.

Palantir’s values appear to sit uncomfortably with the NHS. In January this year, Thiel described Britons’ affection for the NHS as “Stockholm syndrome” and said the best thing to do would be to “rip the whole thing from the ground and start over”. The company quickly distanced itself from its chairman’s remarks, which it said he made as a private individual.

This summer Karp hosted Sunak at a baseball game in the US, one in a string of lobbying efforts over the past four years. In 2019, Palantir served watermelon cocktails to the then chair of the NHS, and Palantir representatives joined a closed-door chat at Davos with the then trade secretary, Liam Fox, and other tech executives. Fox’s briefing notes for that meeting said: “We have a huge and as yet untapped resource through the UK’s single healthcare system … We aim to treble industry contract and R&D collaborative research in the NHS over 10 years, to nearly £1bn.” (Again, there’s no suggestion that Palantir was seeking to sell data – just that UK officials were courting it and others to help the UK make health data commercially available.)

Views of Palantir’s usefulness to NHS staff vary. One hospital trust, Chelsea & Westminster, says that Palantir’s platform, called Foundry, which it has been trialling since early 2022, has helped cut waiting lists by 28%. Other local trials of Palantir’s software were apparently less successful. An answer to a parliamentary question earlier this year said that, of the pilots of Foundry software at local NHS hospitals, 11 had been paused or suspended. Some have resumed, but a spokesperson for Liverpool Heart & Chest hospital said it had stopped using Foundry because staff had decided “it didn’t meet our needs”. Other trusts said they had more pressing problems to fix, and at least one said it had no one who could manage the system. A more recent letter to the health and social care committee updated these figures: there are now 36 trials under way at various stages of rollout, with three of those paused but due to restart, and five still suspended.

Turning to Palantir for so much of England’s health data infrastructure and analysis exposes a deeper problem: the NHS’s own lack of data science staff. The NHS badly needs data scientists to make it fit for the future. (Palantir, meanwhile, has hired two of the NHS’s own senior data people.) In February, it was reported that the NHS has more than 3,000 vacant tech roles. NHS data scientists have warned that Foundry is not “user friendly”, and that the cost is “ridiculous”.

What would a better future for NHS data look like? It would probably start more humbly, with broad democratic engagement. The government could ask the public how they would like their data managed. Who do you trust to handle it, who would you like to be able to use it, and under what conditions? Should the opt-outs be reformed to give you more choice?

With a £480m contract about to be awarded, and the DPDI bill trundling ahead, expect to see a pitched battle over these questions in the coming months. It is a battle the government seemingly hopes to avoid, as it so often has in the past, by dodging debate rather than meeting tricky questions head on. Most people want to see the NHS use data better. But they also care deeply about who is given the keys to the kingdom.

Every prior effort to centralise NHS data has failed; this chapter is yet to be written. We can still get it right, and win trust for a better solution. If we don’t, many patients will do as I did, and turn away.

• This article was amended on 14 September 2023 to update the number of ongoing pilots of Foundry software.

