Research commissioned by Software AG has found that half of all employees are using Shadow AI, i.e. non-company-issued AI tools. The study explores the AI habits of 6,000 knowledge workers. According to the research, 75% of knowledge workers already use AI, a figure set to rise to 90% in the near future. The surprising finding is that more than half of this group are using personal or otherwise non-company-issued tools. More surprising still, half of these employees are so attached to such tools that they would continue using them even if their company banned their use.
This is the Shadow AI phenomenon: employees sourcing and using AI tools to complete tasks for their employers without formal approval or IT oversight. Battling burnout, skills shortages, and reduced headcounts, staff have turned to GenAI as an invaluable tool for boosting productivity and ticking off their day-to-day responsibilities. However, Shadow AI means that businesses have little visibility or control over the AI applications used in-house, which can lead to cybersecurity risks, skills gaps, and inaccurate work.
AI tools: an invaluable toolkit
The research goes on to show that personal AI tools are so valuable that nearly half of workers (46%) would refuse to give them up, even if their organisation banned them completely. This is a powerful signal that organisations need more robust and comprehensive AI strategies if they are to avoid inviting significant risk into their business.
Steve Ponting, Director at Software AG, said: “If 2023 was a year of experimentation, 2024 will be defined as the year that GenAI took hold. 75% of knowledge workers use AI today, and that figure will rise to 90% in the near future. This increase is expected to help save time (83%), make employees’ jobs easier (81%), and improve productivity (71%). As usage increases, so does the risk of cyber-attacks, data leakage or regulatory non-compliance. Consequently, business leaders need to have a plan in place for this before it’s too late.”
The survey found that AI is having a day-to-day impact on individuals: 47% of workers believe these tools will help them to be promoted faster. This suggests a future where AI tools are wholly ingrained in many roles because they are critical to job success.
The AI utility gap
Most knowledge workers (53%) said they use their own AI tools because they prefer their independence, while a further 33% said their IT team does not currently offer the tools they need. This suggests that if businesses want employees to use officially issued tools, they need a different process for determining which ones are actually made available.
Risk Management
Most employees aren’t blind to the risks of their AI choices: large proportions recognise cybersecurity (72%), data governance (70%), and information inaccuracy as potential pitfalls. However, businesses should be concerned that few employees take adequate precautions such as running security scans (27%) or checking data usage policies (29%).
J-M Erlendson, Global Evangelist at Software AG, added: “There is some comfort in the fact that regular users of AI are better prepared to mitigate risks than occasional users. That alone should encourage organisations to implement more rigorous training programmes; many still don’t have anything robust in place. We need this now, because the future, in which 90% of workers use AI, is just around the corner. It will also bring in more of the occasional users, which is a problem. This group is far less adept at taking risk management precautions than their more experienced counterparts, but just as likely to take the risks.”
Erlendson continued, “Shadow AI is not going anywhere, but it is supercharging the operational chaos already engulfing many organisations. A transparent framework for processes, coupled with an understanding of the tools employees want and the training they need, provides good building blocks for better incorporating Shadow AI. It’s clear that AI is not going away, and, collectively, we need to address it in the right way now.”
Research Methodology
Six thousand knowledge workers (defined as those who primarily work at a desk or computer) in the US (2,000), UK (2,000), and Germany (2,000) were surveyed between 13 and 25 September 2024. All respondents were 18 or older, and the sample was census-balanced by age. The survey respondents were independently sourced from RepData and the research was conducted by TEAM LEWIS.
Enterprise Times: What this means for businesses
The concept of Shadow AI is an interesting one, but the phenomenon is as old as business technology itself. IT departments, frontline departments, and the individuals working within them have always occasionally ‘gone it alone’, using software and tools to complete day-to-day tasks without formal approval or IT oversight.
So, if Shadow AI isn’t going away, what can companies do about it? As the report suggests, Shadow AI needs to be embraced and channelled. As more people use AI, enterprises need to create processes that allow for its inclusion, without exerting excessive control over specific tools, which only results in more off-the-books usage.
Transparency of processes is a key building block in this regard. Even if the business doesn’t officially sanction a tool, it should still be properly evaluated. Once its use can be integrated into key processes, data use, compliance, and efficiency can be monitored and optimised.