What is shadow AI? Risks and solutions for businesses

Shadow AI is the unapproved use of generative AI tools and features by employees. Learn more about the risks and what IT teams can do to mitigate shadow AI.

Last updated December 17, 2024

With the growth of new technology comes curiosity, especially for tools like artificial intelligence (AI) and automation that can make work easier and more efficient. Yet even amid this surge of ingenuity, companies sometimes resist adopting new technology at first—or even second—glance.

Resisting change, however, doesn’t necessarily stop employees from secretly dabbling in AI use, especially as tools like Microsoft Copilot, ChatGPT, and Claude make this technology more accessible to non-technical employees. This is called shadow AI, and it is a growing phenomenon across various industries.

Use our guide to learn more about shadow AI, its rise in CX and other industries, risks, and mitigation strategies.

What is shadow AI?

To satisfy their needs in the workplace, almost 50% of customer service agents use shadow AI.

Shadow AI is the use of unapproved or unsanctioned external AI tools without company knowledge or oversight from IT teams. Across industries, employees turn to shadow AI tools like:

  • Content generators for writing emails

  • AI analytics tools to support reports

  • AI HR tools to screen job applicants

  • AI image generators

  • Privately installed AI coding assistants

  • Risk assessment tools to analyse credit or fraud risks

According to our Zendesk Customer Experience Trends Report 2025, almost 50% of customer service agents use shadow AI, including:

  • Unsanctioned generative AI tools and software

  • Unauthorised AI service assistants

  • Unapproved AI-powered productivity tools

Shadow AI is often most prevalent at businesses that are unwilling or unable to implement approved AI solutions in customer service or other functions.

Shadow AI vs. shadow IT

Shadow AI and shadow IT both refer to the unapproved use of technology within an organisation, but the types of technology used and potential risks are where the biggest differences lie:

  • Shadow AI: Use of AI tools and technology without the approval of IT or data governance teams.
  • Shadow IT: The use of unapproved IT software, hardware, or infrastructure, usually on an enterprise network.

Shadow IT risks are contained to the teams or team members using unauthorised tools, while shadow AI risks can occur across an organisation.

The differences between shadow AI and shadow IT

|  | Shadow AI | Shadow IT |
| --- | --- | --- |
| Definition | Use of AI tools and technology without IT or data governance team approval | Use of unapproved IT software, hardware, or infrastructure on an enterprise network |
| Adoption | Adopted by individual employees seeking to improve productivity and tool convenience | Adopted by employees or teams to address IT challenges in real time |
| Governance and compliance | Lacks IT or data team oversight and control | Lacks larger IT or organisation oversight |
| Risks | Data privacy, AI model biases, compliance violations, lack of transparency | Data breaches, regulatory non-compliance, network security threats |
| Cultural impact | Encourages innovation but risks inconsistency in data usage and decision-making | Promotes agility but risks a fragmented IT environment and reinforced silos |
| Example | Customer service team uses an unapproved AI tool to analyse customer sentiment | Employee uses an unapproved storage service to store and share work files |

The rise of shadow AI

In some industries, shadow AI usage has increased as much as 250% year over year.

The explosive growth of AI technology, including generative and conversational AI, has given rise to its grassroots adoption. The increased accessibility of consumer-facing AI tools (that need little to no technical knowledge) and a lack of official AI governance enable employees to seek and use available yet unvetted AI solutions.

According to our 2025 CX Trends Report, shadow AI usage in some industries has increased as much as 250% year over year, exposing companies to significant risk. This development has serious implications for data security, compliance, and business ethics. Many employees opt for shadow AI instead of authorised, business-supported solutions because:

  • They are frustrated with existing tools.

  • They can easily access and navigate available solutions.

  • They need to complete specific tasks.

  • They want to increase personal and team productivity.

This chasm will continue to grow if CX Traditionalists delay the development of AI solutions, whether due to budget, knowledge, or internal support. But as organisations grapple with this new reality, many CX Trendsetters continue to strike a balance between using approved AI solutions, like AI agents and customer experience automation (CXA), and maintaining necessary oversight.

Shadow AI risks

The most common shadow AI risks include data breaches, compliance violations, and unreliable AI outputs such as hallucinations.

If not properly managed or mitigated, shadow AI presents serious organisational risks, including:

  • Security vulnerabilities: From unsecured data access to data leaks, shadow AI often bypasses standard security measures and leaves businesses vulnerable to attacks.
  • Information integrity: Because shadow AI lacks oversight and vetted security protocols, business data may be compromised or tampered with.
  • Compliance challenges: If employees share sensitive data with third-party AI platforms without company permission or knowledge, the risk of violating regulations and NDAs increases.
  • Cybersecurity threats: Unapproved and unvetted tools can introduce bugs, malware, or faulty code into business processes.
  • Inconsistent quality: Isolated, non-interoperable AI solutions may produce inconsistent or unreliable outputs that can harm customer relations or employee reputations.

Even if employees adopt shadow AI to increase efficiency, they can negatively impact resource use, project scaling, and customer data privacy.

Shadow AI management and mitigation

93% of CX Trendsetters agree that sanctioned tools like AI copilot help employees get comfortable with AI and explore advanced use cases.

AI in the workplace is here to stay, so getting rid of shadow AI without having a plan for adopting organisation-wide solutions is nearly impossible. Below, we’ve included a list of management and mitigation best practices to keep in mind while working towards change:

  • Sanction AI tools: If you want to mitigate shadow AI, provide reasonable and helpful tools for your employees to use. Sanctioned tools often have enterprise licences that offer added security.
  • Set clear AI use guidelines: Create clear, concise specifications about your company’s AI use expectations. For service industry brands, follow AI ethics in CX best practices.
  • Develop an AI governance framework: Consider bias and culture while collaborating to create a set of policies and practices that guide the use and deployment of AI systems.
  • Create an AI Centre of Excellence (CoE): This impartial, diverse team or department manages and directs a business’s AI initiatives.
  • Prioritise AI education and training: These programmes should cover the risks and consequences of AI use along with a rundown of how to use specific tools or solutions.
  • Create a safe AI usage culture: With your governance framework leading the way, lean into AI tools, like an AI knowledge base, to signal to your teams that AI use is supported internally.
  • Support business and IT alignment: Ensure AI tools address operational needs and adhere to security, compliance, and performance standards.
  • Provide safe experimentation opportunities: Create designated environments where teams can explore new AI applications in safe, monitored spaces.
  • Encourage AI transparency: Teach your teams how approved AI tools work and how they use data to foster responsible adoption and integration.
  • Use quality assurance (QA) and monitoring tools: Regularly review quality monitoring findings to determine whether your teams are using shadow AI based on the consistency and condition of their work.
  • Look for warning signs: Keep an eye out for unusual data patterns, unexplained productivity spikes, inconsistencies in responses or documentation, and non-standard work outputs (both negative and positive).
  • Assess potential use: Use network monitoring tools, anonymous employee surveys, and expense analyses to gather feedback about in-use tools and potential shadow AI use.
  • Recognise and integrate innovations: Recognise valuable shadow AI innovations and incorporate findings into your business’s systems to encourage continued creativity within approved applications.
  • Reinforce access controls: Restrict access to sensitive data so customer data stays protected even if employees use shadow AI.
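For IT teams that want a concrete starting point, the network-monitoring idea above can be sketched as a simple log scan. This is a minimal illustration only, not a Zendesk feature: the log format, the domain lists, and the `flag_shadow_ai` function are all assumptions for the example, and real deployments would typically rely on a secure web gateway or similar monitoring tooling instead.

```python
# Minimal sketch: flag outbound requests to known generative AI domains
# that are not on the company's approved list. The log format and both
# domain lists are illustrative assumptions, not a real proxy schema.

KNOWN_AI_DOMAINS = {
    "chat.openai.com",
    "claude.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
}

APPROVED_AI_DOMAINS = {
    "copilot.microsoft.com",  # example: the only sanctioned tool here
}

def flag_shadow_ai(proxy_log_lines):
    """Return (user, domain) pairs for unsanctioned AI traffic.

    Each log line is assumed to look like: "<user> <domain>".
    """
    flagged = []
    for line in proxy_log_lines:
        user, domain = line.split()
        if domain in KNOWN_AI_DOMAINS and domain not in APPROVED_AI_DOMAINS:
            flagged.append((user, domain))
    return flagged

sample_log = [
    "alice copilot.microsoft.com",
    "bob chat.openai.com",
    "carol claude.ai",
]

print(flag_shadow_ai(sample_log))
# [('bob', 'chat.openai.com'), ('carol', 'claude.ai')]
```

A scan like this only surfaces candidates for a conversation; pairing it with the surveys and expense analyses mentioned above gives a fuller picture than network data alone.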

It’s time to embrace the growth of AI. Our 2025 CX Trends Report found that 93% of CX Trendsetters agree that using sanctioned AI tools helps employees get comfortable with AI and its advanced use cases, instead of pushing them towards unsanctioned, risky alternatives.

Boost your business with sanctioned AI tools

As organisations navigate shadow AI and its sanctioned counterpart, the key to success lies in proactive management that prioritises technological governance and employee empowerment.

With comprehensive strategies to offer stellar employee support through approved and vetted tools and solutions, companies can transform the challenges of shadow AI into opportunities for innovation and growth. By sanctioning AI-powered tools and software like Zendesk AI copilot, you can also support employee performance with suggested responses, real-time insights, and recommendations for personalisation.

Fight the shadows and set your teams up for AI success with a modern, secure, and sanctioned AI-powered employee service solution.