
Cloud computing: Why Microsoft's open source data plan is a big step forward

Microsoft’s new Project Zipline compression algorithm is fast enough to compress data while it’s being written to an SSD or uploaded from an IoT device, and efficient enough to achieve up to 96 percent compression on the Microsoft internal workloads it was first developed for. The reason it’s so fast and so efficient is that it uses a custom hardware accelerator to look for many times more patterns than compression algorithms can usually handle; data that matches any of those patterns is replaced by a reference to the pattern, which takes up much less space.

So as well as publishing the specification of the compression algorithm as part of its contribution to the Open Compute Project (OCP) Foundation, Microsoft is also publishing the Verilog register-transfer level (RTL) files required to create the silicon that runs the algorithm.

It’s planning to do the same with the next version of Project Cerberus, the hardware ‘root of trust’ specification that aims to protect firmware against malware and to remove worries about whether the hardware you order has been tampered with before it reaches you. The first version has a separate controller that plugs into the PCI bus on the server, but Microsoft wants to see the same protection move into silicon like the CPU, and even into memory and storage. And again, when the second Cerberus spec is contributed to OCP, it will include the RTL files so vendors can easily add it to their silicon designs.

That physical implementation is the tricky bit; it’s the kind of thing that hardware suppliers are usually left to figure out on their own when they deliver systems built on open standards, because it’s part of how they compete with each other. But you want vendors to adopt your open standard, because it’s more useful to you the more people are using it.
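Zipline’s actual pattern tables and hardware matcher aren’t described in detail here, but the underlying idea — replacing recognised byte patterns with short references into a shared table — can be sketched in a few lines. This toy Python example (the pattern list and function names are illustrative, not part of the Zipline specification) shows why a larger set of patterns yields better compression: more of the input can be swapped for small reference tokens.

```python
# Toy dictionary-based compressor: bytes that match a known pattern are
# replaced by an integer reference into the pattern table; everything
# else passes through as a literal. Real Zipline matches vastly more
# patterns, in hardware, at line rate — this only sketches the principle.

def compress(data: bytes, patterns: list[bytes]) -> list:
    """Greedily replace known patterns with integer references."""
    out, i = [], 0
    while i < len(data):
        for idx, p in enumerate(patterns):
            if data.startswith(p, i):
                out.append(idx)           # one small token replaces len(p) bytes
                i += len(p)
                break
        else:
            out.append(data[i:i + 1])     # literal byte, no pattern matched
            i += 1
    return out

def decompress(tokens: list, patterns: list[bytes]) -> bytes:
    """Expand references back into the patterns they stand for."""
    return b"".join(patterns[t] if isinstance(t, int) else t for t in tokens)

# Hypothetical pattern table for a workload full of repetitive headers.
patterns = [b"GET /index.html ", b"HTTP/1.1\r\n"]
msg = b"GET /index.html HTTP/1.1\r\n"
tokens = compress(msg, patterns)
assert decompress(tokens, patterns) == msg
assert len(tokens) == 2  # two references stand in for 26 literal bytes
```

The round trip is lossless, and the compression ratio depends entirely on how much of the input the pattern table covers — which is why building the matching into silicon, where thousands of patterns can be checked in parallel, makes such a difference.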
If you want to move data from Azure to Adobe’s marketing cloud, or between SAP and Dynamics, the way the three companies encourage customers to do under the Open Data Initiative, it would make more sense to move it while it’s still compressed rather than expanding it first. And doing that means both clouds, and any servers you’re using, will need to understand Zipline.

Specifying silicon designs with Verilog and the RTL files that describe the circuitry is hard. Engineers with those skills are few and far between, which is why it’s hyperscale clouds rather than ordinary enterprises that take advantage of the flexibility of FPGAs to deliver hardware precisely designed to run a specific algorithm efficiently.

By handing out the kind of implementation design that would usually stay inside the company, Microsoft makes it far easier to build Project Zipline and Cerberus into products, so they’re likely to get used. Intel, AMD, Ampere, Arm, Marvell and […]
