Networking specialist Mellanox Technologies has announced its latest data centre interconnect solutions. The ConnectX-6 adapters, Quantum switches and LinkX cables deliver a 200Gb/s HDR InfiniBand interconnect. The first products will ship in early 2017 and are targeted at HPC, machine learning, big data and other data-intensive environments. The Mellanox LinkX solutions will also include a range of 200Gb/s copper and Silicon Photonics fibre cables.
According to Eyal Waldman, president and CEO of Mellanox Technologies: “The ability to effectively utilize the exponential growth of data and to leverage data insights to gain that competitive advantage in real time is key for business success, homeland security, technology innovation, new research capabilities and beyond. The network is a critical enabler in today’s system designs that will propel the most demanding applications and drive the next life-changing discoveries.”
What do the Mellanox ConnectX-6 network cards deliver?
This is a complete set of solutions for the data centre. There are single and dual-port ConnectX-6 adapters, which support both InfiniBand and Ethernet. More importantly, they are CPU-architecture agnostic, with Mellanox claiming support for “x86, GPU, POWER, ARM, FPGA and others”.
One of the challenges for end users will be making effective use of the available bandwidth. To help with that, the ConnectX-6 will use an updated version of the Mellanox Multi-Host technology. This allows up to eight hosts to connect to a single adapter by segmenting the PCIe adapter into multiple interfaces. It will enable cloud service providers to deliver greater throughput in multi-tenant environments, and it will also reduce their infrastructure costs as fewer NICs and cables are required.
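To make the segmentation idea concrete, the sketch below models a single physical adapter being carved into per-host interfaces. This is purely an illustrative model, not Mellanox's API: the class names, the even lane split and the x16 connector size are all our assumptions.

```python
# Illustrative model of Multi-Host-style adapter segmentation.
# NOT Mellanox's API: names, lane counts and the even split are assumptions.
from dataclasses import dataclass


@dataclass
class HostInterface:
    host_id: int
    pcie_lanes: int  # lanes carved out of the physical connector for this host


def segment_adapter(total_lanes: int = 16, hosts: int = 8) -> list[HostInterface]:
    """Split one physical PCIe adapter into per-host interfaces.

    Multi-Host allows up to eight hosts to share one adapter; here we
    simply divide the lanes evenly as a sketch of the idea. All hosts
    then share the adapter's single 200Gb/s network port.
    """
    if not 1 <= hosts <= 8:
        raise ValueError("Multi-Host supports between 1 and 8 hosts")
    if total_lanes % hosts != 0:
        raise ValueError("lanes must divide evenly in this simplified model")
    lanes_each = total_lanes // hosts
    return [HostInterface(host_id=h, pcie_lanes=lanes_each) for h in range(hosts)]


ifaces = segment_adapter()
print(len(ifaces), "hosts, x" + str(ifaces[0].pcie_lanes), "each")
```

In this toy model, eight hosts sharing an x16 adapter each see their own x2 interface; the cost saving comes from those eight hosts needing one NIC and one cable instead of eight of each.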
Mellanox claims the ConnectX-6 NICs can deliver “200 million messages per second and ultra-low latency of 0.6usec (microseconds)”. This should meet the performance requirements of even the most data-intensive users. One area where this has an obvious benefit is for customers who need to do SSL inspection. At this speed they will be able to open, check and close messages without any change to current network performance. SSL inspection is increasingly being used to check for sensitive data being moved outside an enterprise. It is also used to identify C&C commands sent to infected machines, and data being exfiltrated as the result of a security breach.
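A quick back-of-the-envelope check, using only the two figures quoted above, shows how tight the per-message budget is and (via Little's law) how many messages the adapter must keep in flight to sustain that rate:

```python
# Back-of-the-envelope check of the quoted ConnectX-6 figures.
messages_per_second = 200e6  # "200 million messages per second"
latency_s = 0.6e-6           # "ultra-low latency of 0.6usec"

# Time budget between consecutive messages at that rate.
inter_message_ns = 1e9 / messages_per_second
print(f"one message every {inter_message_ns:.0f} ns")  # 5 ns

# Little's law: messages that must be in flight to sustain the rate.
in_flight = messages_per_second * latency_s
print(f"~{in_flight:.0f} messages in flight at any instant")  # ~120
```

In other words, a new message starts every 5ns, so the adapter has to pipeline roughly 120 messages concurrently rather than handling them one at a time.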
Quantum switches also delivering on cost and performance
Customers can now reduce the number of switches they run. Mellanox claims the Quantum 200Gb/s HDR InfiniBand switch is the fastest switch available. It will support 40 ports of 200Gb/s InfiniBand or 80 ports of 100Gb/s InfiniBand connectivity. This gives a total of 16Tb/s of switching capacity with a low latency of 90ns.
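The 16Tb/s figure follows from the port counts if, as is conventional for switch capacity, both directions of each full-duplex port are counted. That counting convention is our assumption; Mellanox does not spell out the arithmetic:

```python
# Sanity-check the quoted 16Tb/s switching capacity.
# Assumes the figure counts both directions of each full-duplex port.
hdr_capacity_gbps = 40 * 200 * 2  # 40 ports x 200Gb/s HDR, both directions
edr_capacity_gbps = 80 * 100 * 2  # 80 ports x 100Gb/s, both directions

print(hdr_capacity_gbps / 1000, "Tb/s")  # 16.0 Tb/s
assert hdr_capacity_gbps == edr_capacity_gbps  # both port modes give the same total
```

Both port configurations resolve to the same 16Tb/s total, which is why the switch can offer either mode from one chip.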
It also supports the same Multi-Host technology as the ConnectX-6 adapters. This will allow data centre operations teams to reduce the number of switches and cables they manage. The reduction will not only save on capital expenditure but will also improve airflow by removing blockages caused by cabling.
No FPGA on-board?
Mellanox has been putting FPGAs on its switches for a while now. This allows customers to deploy their own applications, in a sandbox, at the network edge. It has also been developing its own applications aimed at telcos and at delivering edge-based data encryption. The ability to deploy apps that validate, pre-sort and pre-select data on network cards reduces the amount of data that eventually flows over the network. This can only be a good thing for customers with very large data flows.
Surprisingly, there is no mention here of any FPGA technology on the ConnectX-6 cards or the Quantum switches. This may be because this is the first generation of the technology, or because Mellanox feels the solution is fast enough without it. Either way, it seems like a missed opportunity.
Over the last 18 months Mellanox has been raising the stakes in the network market, and it now offers speeds from 40Gb/s to 200Gb/s. This latest announcement appears to have caught Cisco and other network equipment providers by surprise. With the network once again becoming the bottleneck, raising the speed of the data centre network should earn Mellanox a lot of attention.