NHS England Puts Pause on AI Project Following Concerns Over Use of GP Data
NHS England has paused its project to use GP data to train an artificial intelligence model, known as Foresight, following concerns raised by GP leaders.
What Is Foresight?
Foresight, an AI model overseen by NHS England, is trained on de-identified NHS data from roughly 57 million patients in England. The model is intended to predict potential health outcomes for patient groups across England, based on what is known about each patient’s condition. NHS England has previously described it as working “like an auto-complete function for medical timelines”.
It’s important to note that de-identified data is not necessarily fully anonymised: in theory, individuals could be re-identified when the data is paired with other datasets.
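To illustrate how this can happen, here is a toy sketch of a so-called linkage attack. All names, postcodes, and field names below are invented for illustration; this is a simplified demonstration of the general risk, not a description of the NHS dataset.

```python
# Toy linkage attack: records with direct identifiers (name, NHS number)
# removed can still be re-identified by joining on "quasi-identifiers"
# such as postcode and birth year. All data below is fictional.

deidentified_records = [
    {"postcode": "SW1A 1AA", "birth_year": 1975, "condition": "diabetes"},
    {"postcode": "M1 2AB", "birth_year": 1988, "condition": "asthma"},
]

# A second, publicly available dataset (e.g. an electoral roll) that
# still contains names alongside the same quasi-identifiers.
public_dataset = [
    {"name": "Alice Smith", "postcode": "SW1A 1AA", "birth_year": 1975},
    {"name": "Bob Jones", "postcode": "M1 2AB", "birth_year": 1988},
]

def link(records, reference):
    """Join the two datasets on shared quasi-identifiers."""
    reidentified = []
    for r in records:
        for p in reference:
            if (r["postcode"], r["birth_year"]) == (p["postcode"], p["birth_year"]):
                reidentified.append({**p, "condition": r["condition"]})
    return reidentified

for match in link(deidentified_records, public_dataset):
    print(f"{match['name']} -> {match['condition']}")
```

In this sketch, every “de-identified” record is matched back to a named individual, which is why regulators treat de-identified data as still carrying privacy risk.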
What Are the Concerns Surrounding Foresight?
The concerns, raised by the British Medical Association (BMA) and the Royal College of General Practitioners (RCGP), are less about the use of an AI model than about the lack of transparency, potential misuse of data, and data governance, specifically around how GP data was used without any consent or communication process.
Lack of transparency: GP leaders say they were not informed that the data, initially collected for Covid-19 research, was being repurposed to train Foresight.
Misuse of data: There are concerns surrounding potential misuse of the data beyond its intended scope, with RCGP chair Professor Kamila Hawthorne stating that “We want to be sure data isn’t being used beyond its scope, in this case to train an AI programme”. Using data for a purpose that differs from the original one, without proper safeguards or justification, would violate the principle of purpose limitation under UK GDPR.
Data governance and consent: It is unclear whether the correct processes were followed to ensure that data was shared in line with patients’ expectations and established governance processes. It is vital that patients are not left worrying that what they tell their GP will be fed into an AI model without appropriate safeguards in place. If patients feel there is a chance their data may be misused, they may provide less information and be less forthcoming, which in turn may impact their medical care.
The need for regulatory oversight: GP leaders have requested an external review of Foresight by the Information Commissioner’s Office (ICO) to assess the legality and regulatory compliance of the system.
Why Does This Matter?
This situation highlights the need to balance technological advancement with the appropriate use of personal data. AI models like Foresight have the potential to transform patient care for the better, but without appropriate safeguards and full transparency, the risk of harm (even if unintentional) is high.
NHS England has agreed to pause the project while its Data Protection Officer conducts a full review of the situation to determine whether further action is required.
What Can We Learn From This?
Although this is an unusual situation, there are a few key takeaways that may also apply to your business:
Transparency builds trust: You must always be upfront about how customer data is used, especially when it’s being repurposed. In this case, GP leaders weren’t told that data collected for Covid-19 research was being used to train Foresight. Failing to communicate changes like this erodes trust and can cause reputational damage. You must clearly communicate any change in data use to the individuals affected.
Consent isn’t just a one-time action: Gaining consent is an ongoing ethical obligation. You must ensure that customer consent is specific, informed, and up to date. This is especially true if data will be used for purposes not originally specified.
De-identified data still carries risk: Even when data is stripped of all direct identifiers, it can still pose privacy risks. You must apply the same level of security to de-identified or anonymised data as you would to identifiable data.
Be prepared for regulatory scrutiny: Don’t wait for issues to arise or a complaint to be made before thinking about compliance. You should always consider whether an assessment such as a Data Protection Impact Assessment (DPIA) is necessary; a DPIA helps you identify and manage privacy risks early and ensures that your processing activities align with legal and ethical standards.
Next Steps
Handling, processing, and storing personal data lawfully isn’t easy. To be confident that you’re compliant, and to address any concerns you may have, contact PRIVACY HELPER today and let us remove the stress of managing data compliance in your business.