The hidden cost of ‘Shadow AI’

By Caroline Fanning, Chief Employee Success Officer at The Access Group.

A quiet revolution is happening inside companies right now: employees are increasingly turning to unauthorised AI tools to boost their productivity. This is 'Shadow AI' – the use of AI tools without the knowledge or approval of IT. While it may seem harmless, it is creating a critical business risk.

The problem will be familiar to IT teams: Shadow AI is fundamentally Shadow IT with a different coat on, and the established lessons from managing Shadow IT can be applied directly. The good news is that there are proven steps to solve it.

The Rise and Risks of Shadow AI

Why is it a problem? As AI embeds itself deeper into our workflows, so does the risk of leaving it unregulated. Covert use of AI poses serious threats, from major data leaks to compliance violations, resulting in lost revenue and reputational damage. Uncontrolled AI adoption creates a huge blind spot for businesses: sensitive data is fed into systems that have not been thoroughly vetted, and corporate regulatory rules can be ignored or bypassed. Combined with the use of faulty AI tools, this can create complex issues that are difficult to unwind.

The core issue is that the rapid adoption of AI is outpacing corporate governance, and research from The Access Group reveals the scale of it. While 65% of employees said they would happily tell their boss they used AI, a significant portion admit to covert use: 35% of employees say they use AI to "get ahead" in a way they wouldn't tell their boss about, a figure that jumps to 55% among 18-29-year-olds.

When looking at Shadow AI – or Shadow IT in general – it's helpful to ask why this happens. The reality is that people want tools that make their jobs easier and let them do more. If approved tools aren't available (or they are, but employees don't know about them), people will go and find their own – especially when alternatives such as public LLMs are free and readily accessible.

The opportunity cost

Beyond the active security and compliance risks of Shadow AI, there is something else to consider: the opportunity cost. Shadow AI is, by definition, extremely siloed. We've already covered the governance and security risks of this, but it also means the outcomes for the person (and the business) will be limited. There is an opportunity cost when people use siloed, unapproved tools instead of something the business directly controls and knows works efficiently.

Instead of siloed Shadow AI built on generic public tools, teams could be armed with something industry-specific or purpose-built, helping to ensure that AI has the data it needs (and not data it shouldn't) and is integrated across people, processes, and other company technology. Giving the whole team access to the same tool also removes duplicated effort: there is no need for multiple teams to waste time on problems another team has already used AI to solve.

Shadow AI use also means a fragmented and even secretive knowledge base where no one is sharing learning from AI tools. Removing the element of shame and secrecy around AI will ensure companies don’t miss out on the collective improvement gained from using tools out in the open.

From Risk to Opportunity

The best way to counter Shadow AI is to offer something more convenient and better. This pattern holds across sectors: film and TV piracy declined after the rise of streaming because a more convenient, higher-quality alternative existed. If employees have effective AI tools readily available, they are far more likely to use them rather than unreliable or risky alternatives.

Instead of banning Shadow AI, companies should see it as an opportunity. Employees are demonstrating both an appetite for AI and a genuine intent to work more efficiently. By creating policies and providing guidance on AI tools, businesses can turn this risky trend into a strategic advantage before a major incident occurs.

Businesses should give employees actionable ways to harness this enthusiasm for AI: for example, establishing clear governance and providing secure, integrated tools that foster open dialogue. Discussion of AI tool use should be destigmatised and actively encouraged. Suggesting corporate-approved tools keeps the conversation open and lets employees experiment with AI securely and efficiently.

Businesses need to go beyond point AI solutions and empower workers across the organisation with connected AI systems. This alternative to shadow AI not only reduces risk but also means better outcomes for the businesses and employees.

The Solution?


Employees should feel empowered to use AI. Instead of banning it, companies should manage and embrace these tools – turning a potential security threat into a strategic business advantage.
