
Bring Your Own AI - The Risks for Data Protection


It’s becoming common to see employees using their own AI tools at work: a comms officer who drafts with ChatGPT, a finance manager who automates reconciliation through a plug-in, a policy lead who runs data through an “AI summariser” to save time. Small, pragmatic innovations emerge as people find ways to work within systems that often struggle to keep pace with real-world demands.


For Data Protection Officers, this “Bring Your Own AI” trend is both inevitable and risky. Much like the old debates around Bring Your Own Device, it blurs organisational boundaries: personal tools, professional data, invisible processing chains.


Why It Matters: Governance, Contracts and Accountability

Healthcare data governance frameworks are built around a simple assumption: that patient data stays within the systems and contractual boundaries defined by the host organisation. “Bring Your Own AI” undermines that assumption.


  1. Data Protection Law: The host organisation becomes responsible for processing that it may not even know is happening. There’s often no lawful basis or data-sharing agreement in place.

  2. Information Security: Personal tools may not meet the organisation’s policy or security standards.

  3. Contractual Liability: Standard employee contracts rarely cover secondary systems or third-party AI tools, leaving gaps if data is mishandled.

  4. Clinical Record Integrity: Records created via personal tools may not integrate properly with official systems, leading to fragmentation or missing data in the official record.

  5. Information Rights: If a data subject requests a copy of records created in a personal tool, the organisation may not be able to retrieve them.


What Organisations Can Do Right Now

  1. Review Policies and Contracts: Update data handling, information governance, and acceptable-use policies to include all third-party technology, both AI and traditional software. Make sure contracts explicitly restrict use of unapproved systems.

  2. Provide Safe, Approved Alternatives: Offer centrally managed, compliant tools so staff have no need to rely on their own accounts.

  3. Raise Awareness Across Clinical Teams: Many staff assume their own tools are harmless. Clear communication and training can help them understand the risks and organisational accountability requirements.


A Cultural Shift, Not Just a Compliance Task

This isn’t about catching people out; it’s about recognising how fast technology habits are changing in the workplace. Tools that were once personal productivity aids are now powerful, networked systems capable of processing vast amounts of patient data.

The challenge for organisations is to create an environment where innovation and compliance coexist, so that staff can use efficient tools safely within proper governance frameworks.



Accessibility and Reasonable Adjustments

A final note of caution: some employees use specific software or hardware as a reasonable adjustment for a disability or health condition, for example, assistive dictation or speech-recognition tools.


Organisations should not prohibit these automatically. Instead:

  • Ask employees to declare any assistive technologies they rely on.

  • Conduct a quick risk and DPIA review to understand how the tool handles data.

  • Where the tool cannot be safely approved, aim to provide an equivalent, accessible alternative.


Two considerations on timing and framing:

  1. Asking only after an employee is engaged reduces the risk of discrimination or perceived bias in selection.

  2. Framing the question around technical requirements rather than personal conditions keeps the focus on operational readiness, not on the person’s health status.


Perhaps something like:

“Do you require any specific technology, equipment, or configuration to meet health, access, or support needs? This information helps us assess compatibility with our security and interoperability standards.”

This approach respects both data protection and equality obligations, ensuring that clinical safety and inclusivity move forward together.

Do confirm the approach with your HR advisor.


Emma Kitcher, AI and Privacy nerd

 
 
 

