ThreatQuotient released its 2023 State of Cybersecurity Automation Adoption Research report this week. This is the third year that the research report has been carried out so some trends are beginning to emerge.
This year’s report raised a number of interesting questions around ROI, perceptions of automation and use cases. To get a greater understanding of some of the key issues, Enterprise Times Editor, Ian Murphy, talked with Leon Ward, Vice President, Product Management at ThreatQuotient.
Awareness of automation is on the increase
Automation has been a hot topic in IT for the past few years, especially in some cybersecurity teams. How is that reflected in the report?
Ward replied, “The importance of automation has risen rapidly, with 75% of respondents saying it’s important. That is up from 60% last year. Efficiency is one of the main drivers as companies don’t want to waste anything, especially as cybersecurity operations teams have only so many hours in a day. It is about making them as efficient as possible.
“Regulations and compliance are also driving automation. Interestingly, there’s a significant drop this year for maintaining cybersecurity standards, which was actually the main driver last year.”
Automation needs focus and can be a good thing
In talking about the challenge of automating cybersecurity, Ward commented, “You get a lot of bad outcomes if you don’t focus on the right things.
“Why are people implementing it? There’s absolutely a problem with skills. Skill shortage is one of the main reasons people are looking at automation. It’s hard to find skilled cybersecurity people right now. It’s a real challenge globally. That lack of people and experience and resources and budget for them is one of the things that is driving people to try and automate some stuff and also retain staff.”
Ward believes that using automation to overcome a lack of skills is no bad thing. If automation can remove the drudgery, then it makes jobs more attractive which is likely to attract more people. It also gives those already in role a reason to continue working rather than job hopping.
How does generative AI fit into this?
There are numerous stories about generative AI tools, such as ChatGPT, being used in cybersecurity. It is a topic that always sparks debate at conferences. Ward says that ThreatQuotient is seeing increasing interest in using such services.
One issue that Ward sees is the risk of AI hallucination arising from the incomplete Large Language Models (LLMs) on which generative AI is built. He posed two questions that organisations need to consider when looking at generative AI. “How does AI, how does ChatGPT fit into solving some of these challenges? How do we adopt it in a responsible, trustworthy way?”
From ThreatQuotient’s perspective, Ward points to their integration with OpenAI. That allows organisations to take advantage of LLMs. However, he was clear that the company would not be building its own LLM around cybersecurity and risk. The recent announcement over integration is about allowing organisations to focus on their information.
Automation as an enabler for better auditing
One of the key challenges for respondents in several verticals and across all regions was the impact of compliance and regulation. This has, over recent decades, led to a welter of new standards and certifications. But the challenge with all of those is how to audit them to prove an organisation is in compliance. How does automation improve this?
Ward called out SOC 2, which has a focus on proving that an organization has implemented essential data security controls. SOC 2 Type 1 is about the processes and documentation. What interests Ward is SOC 2 Type 2. He said, “Type two is – we are going to audit you every few months and demand that you show us proof that you have followed all these things, tell us about any violations you’ve identified, and how you’re going to fix those violations.
“It’s a real heavy burden on us. It means that we have to do things like full background checks on people. We have to have employee performance reviews. I have to do performance management on my team, make sure what their goals are and that they’re meeting them, even though they don’t access operational data in any way. But it’s part of that framework of what are you doing and how are you managing?”
Ward believes that automation is necessary to not only meet the requirements but to ensure that continuous auditing is possible. Without automation, the overhead is significant.
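To illustrate what continuous auditing might look like in practice, here is a minimal sketch of automated control checks that produce timestamped evidence on demand. The control names, state fields and evidence format are hypothetical examples, not ThreatQuotient's product or any real SOC 2 tooling.

```python
import json
from datetime import datetime, timezone

# Hypothetical controls: each maps a name to a check over current org state.
CONTROLS = {
    "background_checks_complete": lambda s: s["background_checks"] == s["headcount"],
    "reviews_current": lambda s: s["overdue_reviews"] == 0,
}

def run_audit(state):
    """Evaluate every control and return a timestamped evidence record.

    Running this on a schedule, rather than manually before each audit,
    is the kind of continuous proof a SOC 2 Type 2 audit asks for.
    """
    return {
        "checked_at": datetime.now(timezone.utc).isoformat(),
        "results": {name: check(state) for name, check in CONTROLS.items()},
    }

state = {"headcount": 12, "background_checks": 12, "overdue_reviews": 1}
evidence = run_audit(state)
print(json.dumps(evidence["results"]))
```

The point of the sketch is that each check is cheap to re-run, so the evidence trail accumulates automatically instead of being assembled by hand every audit cycle.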
What other use cases are key in cybersecurity?
Ward sees several areas where automation benefits cybersecurity. There are the traditional areas of analytics and dealing with large amounts of data. Another is incident response, to ensure that organisations have a consistent approach to managing incidents. More importantly, to make sure that everything that has to be done to respond to an incident is done.
Vulnerability management is also a key area for automation. Ward points out that interest in that is up from last year. He said, “This is a great use case for automation, trying to decrease the horrendous amount of noise in vulnerability management reports.
“Classically, a vulnerability scan report will give you a list of all your assets and a whole bunch of things that you could be vulnerable to. It results in too much work for a team to patch and resolve. There are just too many vulnerabilities and too many pieces of software for a security team to be able to remediate, so triaging comes into focus.”
What is interesting about that is the use of automation to identify assets and risk. It allows teams to prioritise their patching and other processes.
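The triage idea above can be sketched in a few lines: rank findings by severity weighted by how critical the affected asset is, with known-exploited issues jumping the queue. The data, field names and scoring weights here are invented for illustration, not ThreatQuotient's actual algorithm.

```python
def triage(findings, top_n=3):
    """Rank vulnerability findings by a simple risk score:
    CVSS severity, weighted by asset criticality and known exploitation."""
    def risk(f):
        score = f["cvss"] * f["asset_criticality"]
        if f["exploited_in_wild"]:
            score *= 2  # actively exploited issues jump the queue
        return score
    return sorted(findings, key=risk, reverse=True)[:top_n]

# Hypothetical scan output: a critical CVE on a low-value asset loses
# to a medium CVE that is being exploited on a critical asset.
findings = [
    {"id": "CVE-A", "cvss": 9.8, "asset_criticality": 0.2, "exploited_in_wild": False},
    {"id": "CVE-B", "cvss": 6.5, "asset_criticality": 1.0, "exploited_in_wild": True},
    {"id": "CVE-C", "cvss": 7.2, "asset_criticality": 0.9, "exploited_in_wild": False},
]

for f in triage(findings):
    print(f["id"])  # prints CVE-B, CVE-C, CVE-A
```

Even this toy scoring shows why raw CVSS alone is noisy: context about the asset and active exploitation reorders the patch queue.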
What is holding back automation?
While respondents were all keen on automation, every single one had experienced problems in the last year. Last year only 97% had experienced issues. So what were the problems?
Ward replied, “The top problem is a lack of trust in the outcome. This is followed by user adoption and then bad decisions.”
These are all related problems. If there is a lack of trust in the outcome, users are unlikely to adopt the solution. If it is making bad decisions due to poorly defined parameters or poor training, users won’t adopt it. It suggests that there is a lot of additional testing and work that needs to be carried out before automation is rolled out to production.
The challenge going forward will be to convince users that automation can be trusted and it will be interesting to see next year, how this has improved.
What KPIs are being used and where do they come from?
Key Performance Indicators (KPIs) are key to understanding anything. Typically you’d expect a list of technical KPIs such as uptime, incident reduction or faster incident response. So what was the key KPI in the report?
Ward replied, “The wellbeing of the team is the key metric this year for measuring ROI on security automation. 61.5% say it’s how well they are managing the team, employee satisfaction and retention. That is up from 39% the previous year.
“We found that the industry that cared most about employee satisfaction and retention for automation was actually the defence industry. Our analysis was that this was because of how hard it is to retain good people to stay in those functions in the military, when there’s so much demand for those skills in the private sector.
“It’s balancing that idea of how much enjoyment, value and satisfaction I get from my job working for the government, versus probably less satisfaction, probably more stress and more money in the private sector. So those who can’t really compete with the money, try to compete with job satisfaction. Get rid of the stuff that’s boring and mundane and the processes they don’t like.”
Wellbeing is a deeper issue than it appears
For many, this will appear a strange KPI. But look in detail at the wider challenges around retention, skills shortages and overworked employees, and it suddenly makes sense. Is it a good enough reason to drive a traditional buying decision? Probably not for many organisations when you use the phrase wellbeing. But combine it with what it delivers, and suddenly it becomes a powerful KPI.
The respondents also provided some other clarity on wellbeing. If it is being used as a measure, it also influences other decisions: investment in smarter tools, greater flexibility in working arrangements and larger teams.
What the report doesn’t do is look at what was seen during COVID. Many IT and especially cybersecurity teams discovered that remote working did not impact their job function. That has followed through with an expectation that you can recruit teams who are solely remote. In the US, it has led to some companies recruiting workers in places where pay and the cost of living are lower. While that hasn’t been seen worldwide, there is still an expectation of remote working.
The tools discussion is also of interest. Tool integration is a major pain point for cybersecurity. Giving people smarter tools to make life easier makes sense. It’s strange, however, that we have had to get to a discussion of wellbeing for this to be recognised.
Enterprise Times: What does this mean?
The report is worth reading. The additional insight from Ward is interesting, especially in how to interpret some of the statistics. Automation is here to stay, which means businesses need to sort out their reasons to automate and address the pain points.
The biggest challenge will be to rebuild trust, and that can only come by focusing on the right areas to automate and then testing thoroughly. Small, successful pieces of automation provide a great foundation. Large, ambitious projects increase the chance of failure.