Are you doing enough application security testing? (Image Credit: Mitchell Luo on Unsplash)

Poor application security is responsible for over 55% of security attacks, according to the 2020 NTT Ltd Global Threat Intelligence Report. The latest NTT GTIC monthly report (November) takes a closer look at this issue, devoting more than half of its pages to what can be done to improve application security.

Another statistic from the report puts that 55% of attacks against applications into context. NTT cites Gartner's 2020 Security Market Segment report, which puts annual security spending at US$59 billion. But how much of that is focused on application security? The answer is just $3.3 billion, around 5.5% of the budget. A more representative amount would be in excess of $32 billion.

Exacerbating the problem over 2020 has been the move to Work From Home (WFH) as a result of the pandemic. WFH is likely to continue not just into 2021; for many companies, it is a long-term strategy. That means organisations need to pay much more attention to application security.

Why is application security such a problem?

Zach Jones, Sr. Director of Detection Research, WhiteHat Security (Image Credit: LinkedIn)

It is easy to say a lack of funds causes the problem, but that’s a simplistic response. So is it a failure of the way we write software? Possibly. Zach Jones, Sr. Director of Detection Research, WhiteHat Security, US says there are three categories of application security risk:

  1. Assemble: Risk inherited whenever we bring together the components we rely on as the bedrock of our applications, like OS packages, frameworks and libraries.
  2. Build: Risk created when we implement features without security by design or appropriate security controls.
  3. Configure: Risk created when we deploy our applications to enable new functionality without hardening defaults and evolving past development setups.

Jones goes on to write: “Traditional IT security capabilities provide some visibility to risks in the assemble and configure categories, but almost no visibility into the risks of the build category. These risks are created by the features developed to meet the specific needs of the business and the functions the application performs on behalf of a user. Notably, seven of the OWASP top ten application security risks are flaws which fall into the ‘build’ category of application risks.”

What is required to solve the problem?

Jones believes that organisations need to start looking at three key techniques to improve application security – DAST, SAST and SCA. All of these can be delivered through in-house teams or outsourced. The key, however, is not who does it but that someone does the testing.

For each of the three techniques, Jones provides a set of pros and cons. Look at any organisation, and those pros and cons are easy to see.

Dynamic Application Security Testing (DAST)

DAST tests applications from the perspective of an attacker. The test teams use the same tools that are available to attackers to find flaws. It is an approach that most red team testing uses. However, many organisations do not have a red team test process, either internally or externally. By not thinking like an attacker, they fail to understand what makes their applications vulnerable.

To make DAST effective, Jones believes there has to be a mix of automated and manual testing. Known attacks and brute force attacks are best automated. Manual testing is ideal for finding errors in business logic and should be combined with other forms of attack, such as social engineering. The latter is ideal for finding out how users really implement a process rather than how process engineers think it works.
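The automated side of DAST boils down to firing known attack payloads at a running application and inspecting the responses. The sketch below illustrates the idea with a reflected-XSS probe; the payload list, the endpoint URL and the `fetch` callable are illustrative assumptions, not part of any real scanner, and a production tool would use a full HTTP client and far larger payload sets.

```python
# Minimal sketch of an automated DAST-style probe: send attack payloads
# to a parameter and flag responses that reflect them back unencoded,
# a classic indicator of reflected XSS. Names here are hypothetical.

XSS_PAYLOADS = [
    "<script>alert(1)</script>",
    '"><img src=x onerror=alert(1)>',
]

def reflected_unencoded(payload: str, response_body: str) -> bool:
    """True if the raw payload appears verbatim in the response body."""
    return payload in response_body

def scan(fetch, url: str, param: str) -> list[str]:
    """Run every payload through `fetch` and collect those reflected back.

    `fetch(url, param, value)` is assumed to return the response body as
    a string; in a real scanner it would be an HTTP request.
    """
    findings = []
    for payload in XSS_PAYLOADS:
        body = fetch(url, param, payload)
        if reflected_unencoded(payload, body):
            findings.append(payload)
    return findings

# Simulated vulnerable endpoint: echoes the search term straight back.
def fake_fetch(url, param, value):
    return f"<html>Results for {value}</html>"

print(scan(fake_fetch, "https://example.test/search", "q"))
```

Brute-force and known-attack checks like this automate well; the business-logic and social-engineering flaws Jones mentions still need a human tester.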

Static Application Security Testing (SAST)

SAST is widely used inside many organisations and is often integrated into the Integrated Development Environment (IDE). Whenever developers check in code, an automated set of static tests can be run against it. This provides a quick view of any issues with the code and enables them to be rectified before the build phase.

A long-term problem with static testing has been how often the tests are checked, reviewed and updated. It is not uncommon for developers to write their own static tests without any real input from security teams. As a result, SAST can be far less effective than it should be.
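At its core, a SAST rule inspects source code without running it and flags dangerous patterns. The toy rule below, a sketch rather than any real tool's implementation, walks a Python file's syntax tree and reports calls to `eval` and `exec`; commercial SAST products ship hundreds of such rules, maintained (ideally) with security-team input.

```python
# Toy static-analysis rule in the spirit of SAST: parse source code into
# an AST and flag calls to functions on a deny-list, without ever
# executing the code under test.
import ast

RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source: str) -> list[tuple[int, str]]:
    """Return (line_number, function_name) for each risky call found."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in RISKY_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

sample = """\
x = input()
result = eval(x)  # dangerous: arbitrary code execution
"""
print(find_risky_calls(sample))  # [(2, 'eval')]
```

Because the check runs on the code itself, it can be wired into a check-in hook or CI pipeline and fail the build before anything ships.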

Software Composition Analysis (SCA)

SCA looks for known vulnerabilities and other issues in the technology stack. Done correctly, it will identify where patching is required and where the software needs to be updated. The problem inside many corporate IT environments is that old software is a fact of life. There is often not enough budget to keep updating to the latest version. In addition, it relies on proper manifests of software to show what third-party libraries and components have been used that could be vulnerable.

One of the challenges of SCA is that IT departments rarely target it at commercial software. This is partly because licence agreements explicitly ban such analysis and partly due to a false assumption that problems are more likely to sit in code written inside the business. As the vast majority of that code sits on or integrates with commercial software, this is a major weakness.

Is application security an unsolvable problem?

No. What is needed is for application security to be baked into the entire software process. From an NTT perspective, this starts with their Secure by Design approach. It then requires application testing to be treated as a first-level security approach. Commercial pressures often mean the approach is one of “ship it now and patch later”. This is not only building in failure but guaranteeing applications will be breached.

As Jones points out: “Application security is not just about patching. Effective application testing can help identify vulnerabilities and trends or tendencies which may potentially lead to the introduction of future vulnerabilities. Proper application testing is an effective preventive and proactive technique which can help reduce an organization’s overall threat profile.”

What else is in the November GTIC report?

The report also takes a look at the recent takedown of Trickbot and the part NTT played in that operation. The takedown was led and coordinated by the Microsoft Digital Crimes Unit (DCU). NTT carries 40% of global Internet traffic, which enabled it to identify, isolate and verify the C2 servers used by Trickbot.

With Black Friday and Cyber Monday around the corner, the report also takes a brief look at the threats to retail. This follows on from previous reports in which NTT highlighted attacks against the content management systems (CMS) used by retailers. With so many retailers going online for the first time in 2020, attackers have stepped up their efforts. Card skimming, trojans stealing bank details, DDoS and spyware all get a mention.

Enterprise Times: What does this mean?

Like other monthly GTIC reports, this one does not disappoint. Jones' focus on application security is long overdue and something organisations need to spend more time on. As 2020 has shown, the pandemic has created a massive remote workforce issue for organisations. Cybercriminals are exploiting that, and poorly secured applications give them an easy foothold.

Poor application security means that attackers will continue to have access routes into an organisation. Last year 22% of attacks came via web applications. This year, attacks on web applications are up. Before a business thinks that security is too expensive, a data breach not only carries large regulatory fines but can be devastating when it comes to customer retention.

There is no one solution to application security. However, without a proper approach that combines techniques such as DAST, SAST and SCA, it will not improve.
