Shadow AI is already inside GP surgeries and the NHS can’t control it
Leaked consultations, offshore servers, and compliance blind spots: how unapproved voice-tech is exposing patients and leaving surgeries liable.

When Sky News reported earlier this year that patient data from a GP surgery had been exposed online through a trial voice-tech tool, it barely made the front page. But for doctors inside the system, the story was a warning shot. What looked like a harmless transcription app had been quietly recording consultations and routing them offshore. No consent forms. No safeguards. Just another example of how AI slips into the NHS under the radar and puts both patients and practices at risk.
Voice technology is marketed as a cure for burnout. The pitch is simple: let software handle note-taking, so doctors can spend more time looking patients in the eye. But behind the pitch lies a growing problem. Across the NHS, unapproved AI tools are being used in clinics without the knowledge of patients, regulators, or sometimes even the IT departments that are meant to oversee them.
This hidden market, which experts now call shadow AI, is quietly reshaping the consultation room. And it is happening without the checks and balances that are supposed to keep sensitive data safe.
A system open to risk
Earlier this year, NHS England’s clinical safety team issued an unusual public warning. Ambient voice-tech tools that hadn’t gone through DTAC, the Digital Technology Assessment Criteria, were unsafe. They risked breaching GDPR, the Data Protection Act, and the NHS’s own clinical standards.
The warning was blunt, but it hasn’t stopped adoption. Across the country, surgeries are trialling apps that record patient voices and transcribe consultations in real time. Some are doing it with board approval. Others are doing it quietly, encouraged by vendors offering free pilots. The pressure to cut costs and save time means many are tempted to accept.
What few realise is that the free trial is often the most expensive decision they will ever make.
The real cost of “free”
In one case highlighted by campaigners, a US-based vendor openly admitted that it trained its large language model on consultation transcripts from the UK. The company argued that the data was anonymised, but investigators found that metadata (appointment times, GP names, surgery codes) made re-identification possible.
That isn’t just sloppy practice. It’s a direct breach of the NHS’s “Data Saves Lives” policy and of the ICO’s own guidelines. And the consequences don’t stop at fines. For any clinic, once patient trust is broken, it may never return.
As one GP partner in Manchester told us, “The fine is bad, but what really kills you is when patients stop believing you’re protecting them. That’s when the waiting room empties.”
The problem is no longer hypothetical. Breaches are stacking up.
In Leeds, one AI transcription app mislabelled asthma medication, confusing “Salbutamol” with “Sertraline.” The GP caught it, but another might not have. The result could have been catastrophic.
In Bristol, a children’s unit found that audio recordings were stored unencrypted on Amazon Web Services. When the vendor pushed a routine update, permissions reset. For 48 hours, 93 consultations were searchable on Google.
In Manchester, a whistle-blower revealed that an app advertised as “UK-based only” was in fact routing calls to processors in India and the Philippines. The contract said one thing. The software did another.
Each case is different, but the pattern is clear: vendors move fast, oversight moves slow, and surgeries are left to carry the liability.
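The Bristol case turned on two cloud-storage settings that are straightforward to check automatically. The sketch below is a minimal illustration, not the vendor’s actual tooling: it uses the AWS boto3 SDK to flag a bucket that lacks a public-access block or default server-side encryption. The bucket name is hypothetical, and it assumes read-only credentials are already configured.

```python
# Minimal guard-rail sketch (illustrative only): flag an S3 bucket that does not
# block public access or enforce default encryption.
import sys

import boto3
from botocore.exceptions import ClientError

BUCKET = "example-consultation-audio"  # hypothetical bucket name


def bucket_is_locked_down(s3, bucket: str) -> bool:
    """Return True only if public access is fully blocked and default encryption is set."""
    ok = True
    try:
        cfg = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
        if not all(cfg.values()):
            print(f"{bucket}: public access block is incomplete: {cfg}")
            ok = False
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"{bucket}: no public access block configured at all")
            ok = False
        else:
            raise
    try:
        s3.get_bucket_encryption(Bucket=bucket)
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            print(f"{bucket}: no default server-side encryption configured")
            ok = False
        else:
            raise
    return ok


if __name__ == "__main__":
    s3_client = boto3.client("s3")
    sys.exit(0 if bucket_is_locked_down(s3_client, BUCKET) else 1)
```

Run on a schedule, a check like this would have surfaced the reset permissions within minutes rather than 48 hours. The point is not the specific script but that the safeguard can be automated at all.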
The insurer’s dilemma
Few practices realise the knock-on effects of shadow AI on their indemnity cover. If a data breach is traced to a tool without DTAC approval, insurers are entitled to refuse cover. That means the practice itself must pay damages.
For NHS trusts, that might mean political embarrassment. For private practices, it’s existential. A single breach could destroy financial stability overnight. One compliance lawyer we spoke to put it bluntly: “If you’re private and uninsured, you’re finished.”
Britain isn’t alone. In the United States, the Mayo Clinic suspended trials of an ambient AI tool after HIPAA violations surfaced. In Germany, regulators banned a voice-dictation vendor when it was discovered that consultation data was being shipped to servers in India.
Wherever you look, the story is the same. Tech companies promise efficiency. Healthcare systems, under pressure, accept. And when the breach comes, it’s clinics and patients who pay.
What every surgery should be asking
The question is no longer whether shadow AI is in the NHS. It’s already here. The real question is whether surgeries know what to ask before they use it.
Every GP partner should be able to answer five things today:
Where is patient audio stored: which country, which cloud, with what redundancy?
Does the supplier hold valid DTAC approval, or an NHSX waiver in writing?
Can model retention be switched off, so consultation data isn’t used to train the vendor’s AI?
Has a Data Protection Impact Assessment been signed, dated, and filed?
How will the practice prove human override if the transcript gets a drug name wrong?
If the answer to any of those is silence, the practice is already in danger.
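One way to stop the silence is to write the answers down. The sketch below is a minimal self-audit in Python, assuming nothing beyond the five questions above: field names and the example vendor are hypothetical, but the idea is that an unanswered question becomes a visible, dated gap rather than a shrug.

```python
# Minimal self-audit sketch: record the five answers as structured data and flag the gaps.
from dataclasses import dataclass, fields
from datetime import date
from typing import Optional


@dataclass
class VendorDueDiligence:
    vendor: str
    audio_storage_location: Optional[str] = None    # country, cloud, redundancy
    dtac_approval_reference: Optional[str] = None   # or a written waiver reference
    model_retention_opt_out: Optional[bool] = None  # can training on transcripts be switched off?
    dpia_signed_on: Optional[date] = None           # signed, dated, filed
    human_override_evidence: Optional[str] = None   # how a wrong drug name gets caught and logged

    def unanswered(self) -> list[str]:
        """Names of the questions the practice cannot yet answer."""
        return [f.name for f in fields(self)
                if f.name != "vendor" and getattr(self, f.name) in (None, "")]


if __name__ == "__main__":
    record = VendorDueDiligence(vendor="ExampleScribe")  # hypothetical trial vendor
    gaps = record.unanswered()
    print(f"{record.vendor}: {len(gaps)} of 5 questions unanswered -> {gaps}")
```

A spreadsheet would do the same job. What matters is that the record exists, is dated, and can be produced when a regulator asks for it.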
Regulators won’t wait
When the ICO or the CQC investigates, they ask for simple things. Audit logs. Supplier contracts. DPIAs. Proof of patient consent. Incident reports. They won’t care if the app was free or the vendor persuasive. They’ll care whether the clinic was compliant.
And when the paperwork isn’t there, it isn’t the vendor who pays. It’s the surgery.
There are approved vendors. Tools like Lexicon Echo and Augnito Dictate have DTAC sign-off, UK-based storage, and integration into NHS systems. They are slow to roll out and not always glamorous. But they meet the baseline.
The danger isn’t that safe tools exist. It’s that so many clinics are gambling with unsafe ones, convinced nothing will go wrong.
The reality is stark. Shadow AI is here to stay. Voice-tech is too useful to vanish, and too profitable for vendors to walk away from. But unless surgeries stop treating it as a harmless experiment, the breaches will keep coming.
For practices, the choice is simple. Map what’s in use. Demand contracts. Check compliance. Or risk becoming the next case study.
Because in this story, there are no hypotheticals left. The leaks are real, the breaches are real, and the knock at the door from regulators is only a matter of time.