One of the big risks companies take when moving to the cloud is forgetting that they still need to protect their servers. This ranges from backing up data locally in case the cloud provider goes out of business to creating global disaster recovery plans that ensure the loss of one cloud provider, or part of the Internet, will not prevent a company from continuing to operate.
What is CTERA launching?
The launch is CTERA Cloud Server Data Protection. It enables companies to protect data on a single cloud or, as is becoming increasingly common, across multiple clouds. At the moment there is direct support for Amazon Web Services (AWS), Microsoft Azure and OpenStack. Missing from this list is Google Cloud, which will raise some eyebrows, and CTERA have yet to indicate when they will support it.
All data is de-duplicated and encrypted before being moved across cloud instances. This should reduce traffic and provide good security. What is not clear is whether CTERA will de-duplicate all the data globally or only on a cloud server by cloud server basis.
This is an important distinction, as it is not unusual to have multiple copies of data on servers in any one location. De-duplicating across all the servers in a given cloud would deliver a significant data reduction for many companies. If CTERA can go further and carry out a secondary de-duplication of the data once it reaches the enterprise, this would help make it possible for companies to keep multiple copies of data over time.
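To illustrate the difference, here is a minimal sketch of source-side de-duplication followed by encryption, assuming fixed-size chunks, SHA-256 fingerprints and Fernet encryption from the Python cryptography package. These are illustrative choices; CTERA has not published details of its chunking or key handling.

```python
# Minimal sketch: de-duplicate chunks first, encrypt only what is new.
# Chunk size, hashing and encryption scheme are assumptions for illustration.
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

CHUNK_SIZE = 4 * 1024 * 1024            # 4 MiB chunks (illustrative)

key = Fernet.generate_key()             # a per-tenant key in practice
cipher = Fernet(key)
seen = set()                            # fingerprint index; held by the
                                        # backup service in a real system

def backup_stream(data: bytes):
    """Yield (fingerprint, ciphertext) only for chunks not already stored."""
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint in seen:
            continue                    # duplicate: a reference is enough
        seen.add(fingerprint)
        yield fingerprint, cipher.encrypt(chunk)
```

The scope of that fingerprint index is exactly the question raised above: an index shared across all servers in a cloud, or across clouds, removes far more duplicate data than one kept per server.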
CTERA say that the solution is not just about public cloud. It can be deployed to public, private and managed clouds, which should cover most scenarios companies will face. As the data can be directed to specific locations, IT operations teams can take compliance requirements into account when setting it up. This means that data can be kept in-country or in-region as demanded by different regulators.
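As a rough illustration, a placement policy of this kind might map each class of data to a permitted backup location, as in the sketch below. The cloud names, regions and dataset labels are hypothetical and do not represent CTERA configuration syntax.

```python
# Hypothetical data-residency policy: pin each dataset's backups to a
# location its regulator permits. Names and regions are illustrative only.
PLACEMENT_POLICY = {
    "eu-customers": {"cloud": "azure",     "region": "westeurope"},
    "us-customers": {"cloud": "aws",       "region": "us-east-1"},
    "internal":     {"cloud": "openstack", "region": "on-prem-dc1"},
}

def backup_target(dataset: str) -> dict:
    """Resolve where a dataset's backups may legally be stored."""
    try:
        return PLACEMENT_POLICY[dataset]
    except KeyError:
        raise ValueError(f"no compliant placement defined for {dataset!r}")
```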
In the press release CTERA provide a list of benefits from this solution. It includes:
- A cloud-agnostic backup solution that enables organizations to protect data in their virtual private clouds (VPC) while also leveraging the native object storage service of any cloud (Amazon Web Services, Microsoft Azure, IBM Cloud, OpenStack and more) to cost-effectively store backup data. This unique capability frees organizations from being tied to any platform-specific or hypervisor-level data protection offering.
- A multi-tenant data protection management system capable of protecting thousands of tenants and tens of thousands of servers from a single console – where data is encrypted by the user at the source to eliminate the risk of an administrator recovering sensitive data to the wrong user.
- A fully-automated service delivery platform that creates dramatic operational efficiency, eliminating the manual IT intervention associated with backup administration by leveraging RESTful APIs, configuration templates and policy-based automated server discovery to easily scale every aspect of ITaaS service requests, service provisioning, chargeback and more.
- Advanced network and storage efficiency through the use of source-based global deduplication and compression, as well as an incremental-forever data protection architecture that minimizes network and storage costs while also enabling WAN optimized backup for cross-region and cross-cloud data protection.
- Application-aware data protection tools for application-consistent backup and granular, file-level recovery of Microsoft SQL Server, SharePoint, and other applications running in any cloud infrastructure.
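The automation bullet above mentions RESTful APIs, configuration templates and policy-based server discovery. The sketch below shows roughly what such a workflow could look like; the base URL, endpoint paths and payload fields are all hypothetical, as the press release does not document CTERA's actual API.

```python
# Hypothetical automation flow: find unprotected servers, attach a backup
# template to each. Endpoints and fields are assumptions, not CTERA's API.
import requests

BASE = "https://backup.example.com/api/v1"     # placeholder portal address
HEADERS = {"Authorization": "Bearer <token>"}  # token acquisition omitted

def protect_discovered_servers(template: str = "gold-daily") -> None:
    """Attach a backup template to every discovered, unprotected server."""
    servers = requests.get(f"{BASE}/servers", params={"protected": "false"},
                           headers=HEADERS, timeout=30).json()
    for server in servers:
        requests.post(f"{BASE}/servers/{server['id']}/policy",
                      json={"template": template},
                      headers=HEADERS, timeout=30).raise_for_status()
```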
IT operations teams need to get a grip
Quoting an Ovum report from November 2014, Liran Eshel, CEO and co-founder of CTERA, said: “By 2016, nearly 75 percent of all enterprises will have implemented a hybrid cloud computing strategy.” As we are almost at 2016, it is hard to see that figure being achieved, with many enterprise customers yet to extend their existing backup and protection mechanisms to the cloud.
One of the reasons for this is that many IT teams know little about the cloud instances taken out by their business units and individual users. They are still struggling to develop their own ways of monitoring and locating cloud instances from both a management and a security perspective. This is likely to be the major focus of most IT operations and IT security teams for 2016.
At the same time, many people purchasing cloud believe that they no longer have to back up data, assuming it is part of the service they are paying for. As we have seen over the last two years when there have been cloud outages, the small print is all important. Most cloud vendors will make a best effort to back up and restore cloud instances. However, these are shared backups, not individual backups, and restoring them is often a lengthy process.
It appears that the lessons from almost two decades of hosting websites have yet to be properly understood. Most IT departments developed their own robust approaches to backing up data and websites as part of their daily, weekly and monthly backup cycles.
Cloud requires an approach that is just as robust, or else companies are going to start losing data. We are now at the point where an increasing amount of new data is created in the cloud and spends its entire life there. With no local or controlled backup, the loss of that data will have an impact on enterprises, and this is a governance issue that must be addressed.
Eshel also commented: “The last 10 years of data protection advancements for virtual environments go out the window when enterprises can no longer manage infrastructure at the hypervisor level. Traditional tools were not designed to natively leverage the advantages of cloud infrastructure and do not adhere to the basic principles of cloud orchestration and automation.
“We are happy to take a leadership position in enabling adoption of hybrid and ‘all-in’ cloud strategies by providing a truly unique, cloud-native and yet cloud-agnostic platform for data-protection as a service delivery.”
Conclusion
The explosion of cloud usage outside the control of IT departments means that they have to review their governance of data. While the current focus is on discovering cloud usage and identifying sensitive data stored in the cloud, urgent attention must be paid to including anything discovered in a structured backup programme.
Users might see IT as a blocker to their cloud ambitions, but the moment anything is lost they are the first to blame IT rather than accept their own failure. To deal with this, companies must educate their users so that when they buy cloud they also look at the backup processes. If they do not want IT involved, they must organise their own data protection or accept that any loss of data is down to their own governance failure, which has the potential to create a more serious incident with a regulator.
Hi Ian –
Thanks for helping to break a lot of this down. The platform’s deduplication is “global” and has the ability to compare and reduce duplicate data across all servers in a given security domain (secure tenant). The reason we bound the process to a tenant is that each tenant generates and owns its own encryption key – and you can’t share data across secure cloud tenants.
HTH
Jeff Denworth
SVP, Marketing
CTERA Networks
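Jeff’s point about tenant-bound deduplication can be illustrated with a short sketch: if chunk fingerprints are derived with a per-tenant key, identical data in two tenants produces different fingerprints and can never be matched across the boundary. The keyed-fingerprint approach below is one plausible mechanism, not a description of CTERA’s implementation.

```python
# Why dedup stops at the tenant boundary: per-tenant keys make fingerprints
# of identical data differ between tenants. Illustrative, not CTERA's code.
import hashlib
import hmac
import os

tenant_a_key = os.urandom(32)  # each tenant generates and owns its own key
tenant_b_key = os.urandom(32)

def fingerprint(tenant_key: bytes, chunk: bytes) -> str:
    """Derive a chunk fingerprint scoped to one tenant's key."""
    return hmac.new(tenant_key, chunk, hashlib.sha256).hexdigest()

chunk = b"identical bytes stored by two different tenants"
assert fingerprint(tenant_a_key, chunk) != fingerprint(tenant_b_key, chunk)
```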