Varonis releases Data Risk Report 2016

Ken Spinner, VP of Field Engineering, Varonis

Data protection specialist Varonis has published its latest Data Risk Report. The report is based on the data risk assessments that Varonis conducted on its customers during 2016. It shows that companies are failing to implement proper data controls, leaving sensitive data widely available to all employees. This failure increases the risk of successful insider attacks.

According to Ken Spinner, VP of Field Engineering at Varonis: “In data breaches and ransomware attacks, files are targeted because they are high value assets and usually vulnerable to misuse by insiders and outsiders that transgress the perimeter. While organizations focus on outer defenses and chasing threats, the data itself is left broadly accessible and unmonitored.”

How bad is the data security problem?

The headline numbers from the report suggest it is a serious issue.

  • 236.5 million folders containing 2.8 billion files, comprising 3.79 petabytes of data, were analyzed.
  • 48,054,198 folders were open to “global access groups,” or groups that grant access to the entire organization.
  • 47% of organizations had at least 1,000 sensitive files open to every employee; 22% had 12,000 or more sensitive files exposed to every employee.
  • 71% of all folders contained stale data, accounting for almost 2 petabytes of data.
  • 24.4 million folders had unique permissions, increasing complexity and making it more difficult to enforce a least privilege model and comply with regulations like the General Data Protection Regulation (GDPR).

This is not a new problem and is one that continues to get worse. One of the culprits of data leakage has been the demand for greater data sharing and collaboration. Employees no longer collaborate just within their own departments or organisation. Many now collaborate with partners and third parties, especially when new software is being developed. To improve collaboration, employees will copy data to shared document libraries. This exposes documents and data.

Using default permissions a bad idea

Employees are not, however, the major cause of poor data controls. Too many organisations rely on the default permissions on their systems. In a Windows environment that means the Everyone, Domain Users and Authenticated Users groups. These catch-all group permissions mean that data will be available to anyone who can access the network. Varonis found 48 million folders with global group access, representing 20% of all the folders it analysed.
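The audit described above can be approximated with a simple script. This is a minimal, hypothetical sketch, not Varonis's method: it uses Unix "other" permission bits as a rough stand-in for Windows global access groups (on Windows you would inspect folder ACLs for the Everyone or Authenticated Users SIDs instead).

```python
import os
import stat

def world_accessible(path):
    """Return True if 'other' users have read or write access -- a rough
    Unix analogue of a Windows folder exposed to the Everyone group."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IROTH | stat.S_IWOTH))

def find_open_folders(root):
    """Walk a directory tree and collect folders accessible to every local user."""
    open_dirs = []
    for dirpath, _, _ in os.walk(root):
        if world_accessible(dirpath):
            open_dirs.append(dirpath)
    return open_dirs
```

Running this over a file share gives a first-pass list of folders whose effective access is "everyone", which is the population the Varonis figures describe.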

Solving the problem will take time. It requires organisations to review folders and files to see where they can change the group access. During this process, organisations also need to look at data duplication across their network. There is little point in improving the security of files and folders if a file is so widely copied that it effectively still has almost global access.
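Finding those widely copied files is a content-hashing exercise. A minimal sketch (an illustration, not a product feature) groups files by SHA-256 digest and reports every group with more than one copy:

```python
import hashlib
import os
from collections import defaultdict

def hash_file(path, chunk_size=1 << 16):
    """SHA-256 of a file's contents, read in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Group files under root by content hash; return groups with >1 copy."""
    by_hash = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            by_hash[hash_file(path)].append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

Each group of duplicate paths can then be checked: if even one copy sits in a globally accessible folder, locking down the others achieves little.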

IT teams also need to track how permissions are inherited as data is moved around. There will be areas where they can allow wider access to some data. However, across an organisation there will also be some data where the permissions need to be more tightly controlled. It is also important that tightening permissions is carried out carefully. Adding too many groups creates complexity that can be hard to manage. The harder permissions are to manage, the less effective they will be and the more likely it is that users will find a way around them.

It is not just data that acquires too many permissions. Over time, any user will acquire more permissions than they need for their job. As users get promoted or move departments, permissions are rarely revoked. This is not a new issue, but it is still often ignored inside organisations.
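This kind of permission creep can be surfaced by comparing what each user currently holds against a baseline for their current role. The sketch below is a hypothetical illustration of that review; the role names and baselines are invented for the example:

```python
def audit_users(memberships, role_of, baseline_for_role):
    """Return {user: excess groups} for every user whose group memberships
    exceed the baseline defined for their current role. Excess groups are
    candidates for revocation in a least-privilege review."""
    report = {}
    for user, groups in memberships.items():
        excess = set(groups) - set(baseline_for_role[role_of[user]])
        if excess:
            report[user] = sorted(excess)
    return report
```

Run periodically, such a report catches the promoted or transferred users whose old permissions were never revoked.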

Data classification a requirement for new legislation

The introduction of data classification solutions is also a necessity. Classification covers files that contain personal data on employees and customers, as well as business-sensitive data such as Intellectual Property (IP), accounting information and trade secrets.

Data classification allows IT teams to track where a file is being stored or sent. It allows them to intercept files that are being moved to cloud storage not owned by the business. It also provides a way to stop files being emailed out of the organisation. While this might cause some problems for collaboration teams, it is about protecting corporate data.
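At its simplest, classification means scanning content for markers of sensitive data. The sketch below is a deliberately naive illustration, assuming regex patterns are enough; real classifiers use far more robust detection (checksums, context, machine learning) and far more pattern types:

```python
import re

# Illustrative patterns only -- invented for this example, not a real rule set.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_ni_number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def classify(text):
    """Return the set of sensitive-data labels detected in a piece of text."""
    return {label for label, pat in PATTERNS.items() if pat.search(text)}
```

Once a file carries a label, policy engines can block it from leaving via email or unsanctioned cloud storage.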

There is already a large amount of legislation around privacy and data protection. Companies that operate across multiple countries must prove that they are protecting data in every country in which they operate. Some countries are now making data sovereignty a requirement. Data classification will ensure that sensitive data can also be geo-fenced to comply with regulatory requirements.

Stale data wastes money

Stale data is data that is no longer being used regularly or even at all. Large organisations using tiered storage solutions track how often files are opened and used. Files are then migrated across storage tiers as their usage drops. For many organisations this is more about reducing the cost of their data storage than data security.
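Identifying stale data typically starts from last-access timestamps. A minimal sketch (with the caveat that many file systems are mounted with access-time updates disabled, so atime is only an approximation) flags files untouched for a configurable period:

```python
import os
import time

def stale_files(root, max_age_days=365):
    """Files under root not accessed within max_age_days -- candidates
    for migration to a cheaper storage tier or for archiving."""
    cutoff = time.time() - max_age_days * 86400
    stale = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getatime(path) < cutoff:
                stale.append(path)
    return stale
```

Tiered storage products automate exactly this loop, demoting files as their last-access date recedes.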

Varonis discovered that over 67% of data, 1.95 petabytes out of 2.8 petabytes, was stale. This represents a staggering 169 million folders. From both a storage cost and a data security perspective it is hard to see why this data is online and readily accessible. More importantly, a very large part of this data had default permissions.


Data security is constantly in the headlines as the rate of data breaches increases. The problem for many organisations is that poor historical practices persist. Resolving them will take time and money, and will inevitably cause short-term issues with users who object to their data access being limited.

However, data and privacy legislation is being tightened around the world. The EU GDPR will hit next year. Companies that do not deal with this issue now will find themselves facing massive fines when they have a data breach.

Improving data security will also pay off over the long term. For most organisations, the opportunity to reduce the costs of their data storage by 50% is a dream. Yet the wasted cost of holding stale data shows that there are significant savings to be made. If that process also includes improving security and lowering the risk of a data breach and subsequent fine, it raises the question: “What are you waiting for?”

