P R E S S   R E L E A S E

US court ruling could put NHS AI scribe tools in breach of data protection law

“Many people won’t realise the weight of this – but it puts NHS patient data at serious risk,” says Dr Andrew Whiteley, Lexacom

A landmark US court ruling could have major implications for NHS GP practices, according to British medtech firm Lexacom. The company warns that some AI-powered scribe tools may now breach UK data protection law – without practices even realising it.

Where your data goes matters – is your AI scribe in breach of UK data protection law?

The ruling stems from a case brought by The New York Times, in which a US federal court ordered OpenAI to indefinitely retain all ChatGPT and API output logs, including those generated via third-party integrations.

This undermines the long-standing assumption that AI scribe tools used in the NHS do not store patient data – an assumption central to many clinical safety assessments and data protection assurances.

“It’s deeply worrying that a legal decision in another country could have such serious consequences for patient data here in the UK,” says Dr Andrew Whiteley, a former GP and founder of Lexacom.

“Many people won’t realise the weight of this – but it puts NHS patient data at serious risk. Despite what suppliers may be telling NHS practices, this shows that a court outside the UK can override those assurances, compelling big tech companies like OpenAI to retain data indefinitely – even when it is processed on UK soil. This places patient information outside the protection of UK law and exposes both privacy and NHS compliance to serious risk.”

“The only truly safe way to protect patient confidentiality is to ensure that identifiable data is never sent to big tech for AI processing in the first place”

In anticipation of growing privacy concerns around AI in healthcare, Lexacom launched Patient Shield® earlier this year. A first-of-its-kind tool, it ensures patient confidentiality by automatically redacting personally identifiable information before any AI processing takes place.

While public sentiment towards AI in healthcare is becoming more positive – with over half of people (54%) supporting its use in patient care – a recent YouGov survey commissioned by Lexacom found that 73% of people do not trust big tech firms with their personal information. Patient Shield® bridges this trust gap by giving NHS clinicians a safe way to benefit from AI while protecting patient privacy.

“The only truly safe way to protect patient confidentiality is to ensure that identifiable data is never sent to big tech for AI processing in the first place,” adds Dr Andrew Whiteley. “By redacting it before any AI processing occurs, there can be no question of personal data being retained by an AI system or moved offshore – regardless of what a court beyond UK legal jurisdiction may compel a tech provider to do.”

Find out more


With a 25-year track record of NHS-compliant innovation, Lexacom currently supports 60% of UK GP practices. The company pioneered:

  • The first clinical system integration in 1999
  • Mobile dictation in 2013
  • Cloud-based speech recognition in 2016
  • Redaction-first ambient AI with Patient Shield® in 2025

Dr Whiteley is available to comment on:

  • The legal and clinical implications of the OpenAI ruling for UK healthcare
  • Why many ambient AI scribes may now be operating outside NHS compliance standards
  • How Lexacom’s redaction-first model sets a new standard for safe AI in general practice

All figures, unless otherwise stated, are from YouGov Plc. The total sample size was 2,266 adults. Fieldwork was undertaken between 20th and 21st March 2025. The survey was carried out online. The figures have been weighted and are representative of all UK adults (aged 18+).

E N D S
