
Mind the Gap: AI, the Equality Act, and Making Sure Your Meeting Scribe Works for Everyone

Updated: Aug 9


Organisations increasingly rely on AI tools to streamline internal processes, and meeting transcription is among the most widely adopted uses. It is easy to see the appeal: AI scribes promise efficient, real-time capture of discussions, action points, and decisions.


But what happens when the tool does not work equally well for everyone? What happens when it consistently misrepresents, mis-transcribes, or outright ignores certain voices?


If your AI cannot understand a member of staff because of their accent, speech pattern, or neurodivergence, it is not just a technical glitch. It may well be a breach of your legal responsibilities under the Equality Act 2010.


The Equality Act: Why Does It Matter?

The Equality Act 2010 protects individuals from discrimination, harassment, and victimisation in the workplace on the basis of protected characteristics.

These include:

  • Race and ethnic background

  • Sex

  • Disability (which includes many forms of neurodivergence)

  • Gender reassignment

  • Age

  • Religion or belief

  • Sexual orientation

  • Pregnancy and maternity

  • Marriage and civil partnership


Under this law, employers have both a duty not to discriminate and a positive duty to make reasonable adjustments where a process or system places someone at a disadvantage.


When AI Transcription Tools Discriminate


Many AI scribes struggle with:

  • Accented English, whether regional, international, or shaped by disability

  • Female voices, particularly in male-dominated conversations, where pitch or cadence may be softer

  • Neurodivergent speech patterns, such as differences in pacing, syntax, or tone


These errors are not benign. If a system fails to transcribe certain voices accurately, or at all, it introduces structural exclusion into your most basic workplace processes.

Imagine being the only person whose contributions are routinely misquoted or omitted. That is not just frustrating. It may be evidence of indirect discrimination, particularly if the pattern correlates with a protected characteristic.


Reasonable Adjustments in a Digital Age

If a staff member is disadvantaged by an AI tool, whether due to a stammer, autism, or any other disability, the employer has a duty to make reasonable adjustments. This could include:

  • Choosing transcription tools with better linguistic range and inclusivity

  • Allowing human review for meetings involving sensitive or complex input

  • Providing alternative methods of engagement, such as written contributions or visual notes


Failure to do this does not just risk poor workplace culture. It could result in a legal claim.


What Employers Should Be Doing


  1. Test your transcription tools in the real world

    Do not just accept vendor claims. Trial your tool with diverse staff, including those with regional accents, neurodivergent communication styles, or non-native English fluency.


  2. Log accuracy issues as inclusion risks

    Treat recurring transcription errors as a workplace equity issue, not just a technical one. Who gets misquoted? Who gets left out of the record?


  3. Embed accessibility checks into procurement

    Ask suppliers how they have tested for bias. What speech samples did they train on? How do they ensure equity across voice types?


  4. Offer alternatives by default

    If someone asks to opt out of AI transcription or requests an alternative format, be ready to say yes. That is part of your reasonable adjustment duty.
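The real-world testing in step 1 and the issue-logging in step 2 can be made concrete by measuring word error rate (WER) per speaker group and comparing the results. The sketch below is a minimal illustration, not a production audit tool: the group labels and trial data are hypothetical, and in practice each speaker's AI output would be paired with a human-verified reference transcript.

```python
# Sketch: compare transcription accuracy across speaker groups using
# word error rate (WER). Group names and trial data are hypothetical
# placeholders; real audits would use human-verified reference
# transcripts paired with the AI scribe's actual output.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard Levenshtein distance over word sequences.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return prev[-1] / max(len(ref), 1)

# Hypothetical trial data: (speaker group, human reference, AI output).
trials = [
    ("group_a", "the action is agreed for friday", "the action is agreed for friday"),
    ("group_b", "the action is agreed for friday", "the faction agreed friday"),
]

by_group: dict[str, list[float]] = {}
for group, ref, hyp in trials:
    by_group.setdefault(group, []).append(word_error_rate(ref, hyp))

for group, scores in by_group.items():
    print(f"{group}: mean WER {sum(scores) / len(scores):.2f}")
```

A persistent gap between groups in a log like this is exactly the kind of pattern that, if it correlates with a protected characteristic, should be treated as an inclusion risk rather than a mere technical defect.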


🧠 AI Is Not Neutral. But It Can Be Accountable.

The promise of AI in the workplace is not efficiency alone. It is the possibility of equity through thoughtful automation. But unless we interrogate how these tools work, and who they exclude, we risk recreating analogue injustices in digital form.

The Equality Act 2010 does not stop being relevant just because a tool is clever.


If anything, the smarter the tool, the greater your responsibility to ensure it serves everyone.


Emma Kitcher, AI Nerd


