Hewlett Packard Enterprise (HPE) has announced HPE Synergy, calling it a new platform to run both traditional and cloud-native applications.
HPE Synergy was originally announced back in June at HP Discover Las Vegas. In a blog post, Celia Lawren provided much more detail than was available at Discover, calling Synergy a Composable Infrastructure. This is not just about hardware: Lawren announced a Unified API designed to make it easier for HP partners to deliver new solutions, and at HPE Discover in London this week it will be interesting to see how many partners have solutions ready for the official launch of HPE Synergy.
In the press release announcing HPE Synergy, Antonio Neri, Executive Vice President and General Manager of the Enterprise Group at Hewlett Packard Enterprise said: “Market data clearly shows that a hybrid combination of traditional IT and private clouds will dominate the market over the next five years. Organizations are looking to capitalize on the speed and agility of the cloud but want the reliability and security of running business critical applications in their own datacenters.
“With HPE Synergy, IT can deliver infrastructure as code and give businesses a cloud experience in their datacenter.”
What does HPE Synergy offer?
The focus for HPE Synergy is the acceptance that cloud is not, and will not be, the end game for every customer, no matter how large or small. This means vendors have to support data and applications split not just across on-premises and cloud, but across multiple clouds as well.
In order to achieve this, HPE has decided the best approach is to rethink the way it does provisioning. At the heart of this is the ability to address all resources, regardless of where they are located, as if they were part of the same system, handled by what HPE describes as a software defined intelligence. This has to go much deeper than just provisioning: applications need to be able to deal with a complex infrastructure where latency and asynchronous behaviour are the norm.
There are three key parts to HPE Synergy:
- Fluid Resource Pools
  - Compute, storage and fabric networking that can be composed and recomposed to the exact need of the application
  - Boots up ready to deploy workloads
  - Supports all workloads – physical, virtual and containerized
- Software Defined Intelligence
  - Self-discovers and self-assembles the infrastructure you need
  - Repeatable, frictionless updates
- A Unified API
  - Single line of code to abstract every element of infrastructure
  - 100% infrastructure programmability
  - Bare-metal interface for Infrastructure as a Service
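To make the "infrastructure as code" claim concrete, the sketch below shows what composing compute, storage and fabric through a single API request might look like. This is an illustrative assumption, not the actual Synergy or OneView API: the endpoint path, template name and field names are all hypothetical.

```python
import json

def compose_server_profile(name, template, cpu_cores, storage_gb, network):
    """Build the JSON body for a single provisioning request that
    composes compute, storage and fabric resources together.
    All field names here are hypothetical, for illustration only."""
    return {
        "name": name,
        # Hypothetical template URI; real APIs typically reference
        # a pre-defined profile template like this.
        "templateUri": f"/rest/server-profile-templates/{template}",
        "compute": {"cores": cpu_cores},
        "storage": {"capacityGB": storage_gb},
        "connections": [{"network": network}],
    }

# One declarative request describes the whole composed system;
# the payload would be POSTed to the unified API endpoint.
profile = compose_server_profile("web-tier-01", "web-template",
                                 cpu_cores=16, storage_gb=500,
                                 network="prod-vlan-100")
print(json.dumps(profile, indent=2))
```

The point of the model is that recomposing the infrastructure for a different workload is just another such request with different values, rather than a manual reconfiguration of servers, storage and networking.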
What stands out here is the introduction of bare metal capacity. As was seen in the recent cloud versus bare metal test by VoltDB, bare metal delivers far superior performance for key workloads. HPE has clearly taken that on board and is focusing on customers who want to run mission critical workloads at speed.
Overprovisioning still a major problem
One of the key goals for HPE Synergy is to help companies reduce overprovisioning. HPE has a long history in the virtualisation market, especially across its blade server customer base. As a result it understands that dealing with overprovisioning is not simply about reducing resources and improving utilisation.
Under a hybrid cloud model, HPE Synergy will use its single resource pool approach to help companies match on-premises and cloud capacity to requirements. Part of the solution will be helping companies move workloads to the cloud when they need temporary resource increases, but to do that it will also need to match demand with compliance requirements.
HPE has a history with compliance solutions and a comprehensive set of ITIL solutions. It will no doubt seek to convince customers that it can bring those to bear on HPE Synergy. Success here will not only help reduce overprovisioning and its associated costs but also improve data privacy and security. These are key goals for all vendors at the moment, and customers are paying much more attention to providers capable of delivering an out-of-the-box solution rather than selling consultancy.
Pricing, availability and bandwidth
HPE Synergy will not be available until Q2 2016, and there are no details on cost or how the solution will be sold. For example, will there be a cut-down version for smaller customers, or is HPE Synergy a one-size-fits-all solution? What level of commitment will customers need to make in terms of cloud usage, and how will that impact HPE's SME customer base?
One of the biggest challenges for hybrid is bandwidth, and this announcement made no mention of HPE launching its own competitor to IBM SoftLayer and Microsoft. It may be that HPE will follow the path of many other cloud providers seeking to solve the hybrid challenge and do a deal with Equinix. At the moment we just don't know, and HPE will need to resolve this quickly.
While many will see HPE as being late to the whole cloud market, let alone hybrid, it has been around it for some time. What it hasn't done is articulate what its products can do, nor has it managed to deliver a set of solutions that appeal to customers. Mixing bare metal, on-premises and cloud gives it a chance to restructure its cloud offerings and start anew.
Looking at the orchestration and provisioning behind HPE Synergy it is clear that the company has gone back to its blade system and Matrix roots and drawn on the lessons there. If it can solve the bandwidth issue and really make hybrid appear as a single compute pool it will have taken a significant step in the right direction to making itself a key player in the cloud market.