How and why employers should support whistleblowing in an AI era

Cath Everett, February 13, 2026
Summary:
The ethical and moral challenges of AI combined with inadequate government and private sector action to tackle them mean that whistleblowing is becoming an increasingly important guardrail in preventing serious incidents from taking place. So, what can employers do to support and even encourage such activity?

Over the last couple of years, the tech industry has seen a growing number of AI whistleblowers. A particularly high-profile case is that of Daniel Kokotajlo, a former researcher in OpenAI’s governance function. He was one of the organizers behind an open letter, signed by 11 current and former employees in 2024, which called on the company to create stronger whistleblowing protections. The aim was to give researchers a ‘right to warn’ about AI dangers without fear of retaliation. Kokotajlo had resigned from OpenAI earlier that year, accusing it of having a “reckless” culture and taking serious risks in the rush to develop Artificial General Intelligence.

Another notable whistleblowing case involved former Meta Global Public Policy Director Sarah Wynn-Williams. She made a series of claims about the social media giant’s culture and employee behavior, including sexual harassment, in her memoir ‘Careless People’, published last year. Meta, for its part, said it fired her for “poor performance and toxic behavior”, and also secured a gagging order to prevent her from publicizing her book. Wynn-Williams subsequently appeared before a US Senate judiciary subcommittee, alleging that Meta had undermined national security so it could build an $18 billion business in China. She also claimed the company worked “hand in glove” with Beijing to build AI-based censorship and surveillance tools aimed at silencing critics of the Chinese Communist Party. Meta denied both claims.

Why AI makes whistleblowing harder

While these cases may be very different, they face a common challenge, points out The Future Society in a blog: an absence of clear legal protections. Existing frameworks are not only inconsistent globally but are also enforced in different ways, which creates uncertainty and discourages action. Moreover, the blog says:

In many jurisdictions, existing whistleblower laws were not designed with AI-specific concerns in mind. As a result, disclosures about issues like unsafe model deployment, inadequate safety protocols or misalignment with stated ethical commitments often fall outside the scope of current regulations. Even when legal frameworks exist, they are often ineffective.

The Society adds that legal threats also act as a powerful deterrent:

Companies frequently frame internal disclosures as trade secret theft, which exposes employees to costly, reputation-damaging litigation. And even when whistleblowers do pursue legal action, their cases can drag on for years, exacting a high financial and emotional toll.

Unfortunately, the nature of AI development only makes these problems worse, the Society warns:

Whistleblowers often raise concerns like corner-cutting on safety, covert military collaborations, or misaligned incentives. These issues are typically difficult to categorize, hard to prove, and easy to dismiss. As AI firms grow more powerful and increasingly partner with national security agencies, their incentives to suppress internal dissent deepen…[But] without credible protections for those who speak up, critical failures will remain hidden until they erupt into public crises.

A key challenge in this context is that even if internal reporting systems exist, they often do not work effectively. As the blog explains:

Employees may not know how to access them, may not trust that their concerns will be acted upon, or may fear that reporting will be seen as a personal attack rather than a professional responsibility. In the high-pressure, high-ambiguity environment of frontier AI development, these social and psychological dynamics can be paralyzing.

Catalysts for employers to take action

So, given this context, what tends to push employers towards positive action, and what can they do to address this worrying situation?

While one in three organizations admit treating whistleblowing as a low priority, the biggest catalyst for change tends to be a desire to take preventative action. This includes wanting to stop a recent media crisis from happening again or taking a risk management-based approach to dealing with changing legislation or mandates. Dr Enya Doyle, Founder of harassment prevention consultancy Enya Doyle Consulting, explains:

Employers often don’t have the time and resources to take whistleblowing seriously and they also often lack in-house expertise. With employment lawyers, you’re often relying on a module they took when they qualified. HR directors don’t usually have the time to dedicate to it, and the leadership team range from apathetic to ‘not a priority’. So, it’s a lethal combination of different priorities. It’s not about intentionally looking the other way, but until a situation blows up, there’s always something else taking people’s attention.

Another challenge is a general “over-reliance on silence”, which means presuming nothing is wrong if people don’t make a fuss. As Doyle says:

If no one’s screaming, there’s an assumption there can’t be a serious problem. But this lack of curiosity is how many cases end up at a tribunal. I’ve seen a number of instances where, if senior leaders had shown any curiosity, the situation could have been avoided. But it’s common to pass the buck to HR or an independent function, which then ends up with a bad reputation.

One of the reasons many people refrain from whistleblowing, beyond a widespread fear of possible repercussions, is that they are unaware their employer has a policy on it or that they have the right to do so, Doyle adds.

Practical steps to support whistleblowing

As a result, she advises introducing a consistent communications campaign rather than talking about it “intensively for three months and never mentioning it again, which happens more frequently than you might think”.

This includes clarifying what the company means by ‘confidential’ and being clear about processes and procedures and how they work in practice. Taking this approach also helps reassure employees that the situation will be dealt with objectively and they will be supported through it. But, Doyle says, it is just as important to:

Ensure your policy is fit-for-purpose. Many aren’t written with the audience in mind, which means they don’t help much. So, ensure it matches your processes and what you actually do.

Stephen Simpson, a Principal HR Strategy and Practice Editor at HR consultancy Brightmine, agrees:

The language you use is important as there’s a stigma around whistleblowing. So, some employers try to avoid technical terms and use ‘raising concerns’ rather than ‘disclosure’, for example.

Simpson also advises offering a variety of confidential, anonymous reporting channels, including hotlines. The idea is that, while line managers are often cited as the first point of contact, an alternative needs to be in place in case the problem lies with them. An appeals process is also useful should people dispute or be unhappy with the outcome of an investigation.

Another important consideration is ensuring that those investigating a whistleblowing situation are properly trained in the particulars rather than just in general investigation techniques. Doyle explains:

So, are they trained in sexual harassment, and do they have a beyond-average awareness of the myths that are told about it? Or do they know the ins and outs of investigating fraud and proprietary information theft? All too often you have senior leaders investigating things with 101 training they received 18 months ago.

Tackling the cultural issues

To encourage whistleblowing, the most important thing of all, believes Helen Dallimore, is to ensure a culture of trust and psychological safety exists within the organization. A Director and Head of Training at workplace culture change consultancy Byrne Dean, she says this often involves bringing about cultural change from the top:

Senior people have to want to hear it and to value diverse views rather than simply enabling a culture of groupthink. It’s very hard to openly challenge the status quo if you don’t feel you can ask questions or be yourself. So, it has to be part of the organization’s values and part of the senior leadership’s approach that if people share different views, they’ll be taken on board. Not everyone may agree with them, but their views need to be valued.

A priority here is helping leaders and managers understand why it makes sense to reframe their emotional reaction to whistleblowing and to welcome rather than fear it or be defensive about it. Dallimore explains:

So, if for example, there are concerns about unethical behavior, welcome people’s input as they’re trying to address the problem rather than create it. The fact they’re raising concerns at all is positive as it shows things are working. If they don’t, it could indicate a lack of psychological safety. So, you can read the data on the number of people who raise concerns in different ways.

As for the thorny issue of retaliation, dealing with it requires proactive anti-retaliation policies, unconscious bias training, and objective processes in areas such as performance reviews.

In relation to the specific issue of AI whistleblowing, meanwhile, there are a couple of options for would-be whistleblowers. Firstly, whistleblowing NGOs Protect, The Signals Network and Whistleblowing International Network have jointly developed a practical “Tech Workers Guide to Whistleblowing”.

Secondly, a dedicated AI Whistleblower Initiative (formerly OAISIS) was set up in early 2024 to support individuals wishing to flag any risks related to the technology in a safe and legal manner.

My take

In my view, The Future Society’s blog sums up beautifully why whistleblowers are so important, particularly in the AI era:

Whistleblowers can help us spot risks before they escalate into serious incidents. As governments prioritize geopolitical advantage and AI companies resist scrutiny, whistleblowers remain one of the few guardrails against a potential serious AI incident ahead. Like their counterparts in the fields of aviation, healthcare, and finance, AI whistleblowers can catalyze vital reforms if we create the conditions that allow them to come forward.

Image credit - Pixabay
