Gigamon has published its 2024 Hybrid Cloud Security Survey (registration required), and the results are not good. It shows that security teams are losing ground to cybercriminals as cybercrime rises by 20% in a year and detection rates drop.

According to Gigamon, the survey shows that organisations around the world are still startlingly unprepared for modern, sophisticated cyberthreats. More concerning, it also shows a year-on-year decline in detection and response capabilities.

Mark Jow, EMEA Technical Evangelist at Gigamon (Image Credit: LinkedIn)

Mark Jow, EMEA Technical Evangelist at Gigamon, said, “Cyber risk is firmly in the spotlight this year, with governments and boardrooms finally recognising its place at the very top of the business risk register. Yet, cybercriminals are evading detection over a third of the time.

“Today’s MELT-based (Metrics, Events, Logs, and Traces) approaches are no longer enough, as organisations need 360-degree visibility across the hybrid cloud. Whether organisations are fending off AI-powered attacks, integrating AI-powered solutions into hybrid cloud environments, or seeking to establish Zero Trust, deep observability is fundamental to success.”

Survey data points to multiple issues

The survey data points to several serious issues that call into question how well cybersecurity teams are doing. Of additional concern is that there is no single cause for these issues. They cover tooling, processes, technology, work practices and blind spots throughout businesses. Some of these are addressed in a podcast between Karl Van den Bergh, CMO of Gigamon, and Enterprise Times.

All of this leads to six key findings in the report. Each of these is based on multiple data points from the survey.

Organisations are not prepared for today’s attacks

In the last twelve months, 37% failed to detect a breach using their existing tools. 31% only realised they had been breached when an extortion demand was received. A similar number only knew about a breach when data appeared on the dark web. It is not clear what the overlap between these groups is.

All this means that 65% of Security and IT leaders don’t trust their tools. If there is no trust in the tools, there can be no trust in security. It raises the question of what needs to be done to restore trust.

Encrypted East-West traffic is an issue

This is something that Van den Bergh talks about in the podcast. The number of organisations that have visibility into East-West traffic dropped to 40% (down 8% from 2023). That poor visibility into encrypted data (73%) makes it a bigger issue than North-South traffic.

Of great concern is that 93% of malware now lurks behind encryption. If this weren’t serious enough, as Van den Bergh points out in the podcast, 76% trust the encrypted traffic is secure, and 62% say it is less likely that traffic will be inspected. The result is that not only is malware allowed to move undetected through the organisation, but data movement to the point of exfiltration is also missed.

CISOs are under pressure

85% of CISOs report that cloud security is an issue for the board, which aligns with other surveys. Main boards have been taking a much wider interest in cybersecurity challenges over the last decade. Part of that is due to a move away from selling through FUD (Fear, Uncertainty, Doubt) and a focus on risk.

Zero Trust mandates in the US are also having an impact on the board and the CISO. 44% now say that meeting Zero Trust deadlines is a serious concern.

One of the more interesting statistics from the report is that 85% of CISOs want a proactive mindset. But getting there requires greater visibility, and visibility has declined in the last year. What is not clear is how they intend to achieve that. Will it be through new tooling or something else?

Siloed tool stacks need an overhaul

This comes as no surprise. The problems caused by complex tool stacks have been known for years. 69% of respondents complained about their tool stacks. Yet 87% are still spending money on new security tools.

There seems to be no qualitative research examining how the new tools will be used. Are they like-for-like replacements? Are they a consolidation of multiple tools into a single tool? How many integrate into existing platforms? What is the learning cost of adopting them? Do they begin to address existing gaps? Are they future-proofed?

The latter point is important, as only 54% claim they are strongly prepared to identify threats across their hybrid cloud infrastructure.

Regulation is elevating Zero Trust

The report already identified that 44% of CISOs say meeting Zero Trust deadlines is a serious concern. However, Zero Trust goes deeper than that. 64% of organisations expect a mandate in the next two years. Of more interest, 80% of organisations see it as a key priority over the next 18 months. Zero Trust is gaining traction outside the boardroom and throughout the IT and Security leadership.

However, the implementation of Zero Trust will not be simple. Tooling and process change will need to occur. Tooling will take time to acquire, test, configure, deploy and train staff. Process change will also take time and require many organisations to rethink how they operate. This plays into the next takeaway from the report.

Deep Observability is foundational to Zero Trust

Observability is key for any Zero Trust deployment. However, the demands of Zero Trust are high. Despite the expectations, 59% say it requires too many resources to make it worthwhile. That is up from 46% in the 2023 survey and marks a significant increase in awareness and a shift in approach.

It also raises the question of how tools will help reduce that resource burden. Are the right tools for Zero Trust already factored into the money spent on new tools? Will it be an additional cost? Where will the money and resources come from to deploy Zero Trust?

What else can we learn from this report?

The report looks at all of the points above in more detail. Those data points make for interesting reading as they show geographical differences.

For example, respondents from France (55%) and Singapore (49%) have real problems detecting threats using their tools compared to the UK (72%), US (72%) and Germany (70%). Despite this, French respondents saw markedly less data leaked (17%) than Australia (42%) and Singapore (36%).

The section on tooling throws up some concerns over how organisations retool. It highlights that the average life of a tool is just two years. Given the cost of tools, it suggests that few deliver any demonstrable ROI. That problem is compounded by CISOs often bringing tools they know with them rather than adapting to the tools in place.

Of more concern is the statement, “But retooling will never reveal success if organizations cannot feed high-fidelity data to the tools they deploy.” It talks about the challenge of data fidelity and why the lack of it leaves blind spots. Importantly, it points out that tech stacks must be built around deep observability. This is something that respondents admit they lack.

Part of that observability challenge is encryption. Organisations that encrypt all traffic to protect data must work out how to see into that traffic. It is a challenge that many will struggle with.

Enterprise Times: What does this mean?

This is an interesting report with a set of data that raises eyebrows. While most people accept they are losing the cybersecurity battle, few are willing to voice it. This report does just that. It exposes problems with tools, visibility and detection of attacks. It also articulates why things must change to meet the future requirements of Zero Trust.

At 24 pages, the report is not lengthy, but that does not mean it is weak even though it is based purely on quantitative responses. It shows that there is much to be learned by taking this report as a starting point for future qualitative research. That research would give much deeper insight into the issues the report has already identified.

The podcast between Karl Van den Bergh, CMO of Gigamon, and Enterprise Times also touches on this report and provides an additional assessment of the findings.

