Dear HPE, When it Comes to Big Data, All Software is “Core”
September 9, 2016
• HPE is no longer burdened by application delivery management, IT operations management, big data, enterprise security, and information management software, all of which it deemed “non-core.”
• Unfortunately, with the sale of these offerings to Micro Focus, HPE has dropped the very assets that would have driven business value from its remaining portfolio.
Not even a full year has passed since Hewlett Packard Enterprise broke off from Hewlett-Packard Company and launched as HPE last November, creating a standalone company equipped with a pretty impressive software portfolio covering the cloud, data center infrastructure, and workplace applications. That was a lot to take in, given the storied history of Hewlett-Packard Company. But I think the data and analytics industry looked favorably on the idea of HPE as an enterprise-oriented firm, especially one in possession of software assets like Vertica, IDOL, and Haven.
And now here we are facing yet another, new HPE, one no longer burdened by application delivery management, IT operations management, big data, enterprise security, and information management software, which it deemed “non-core.” Non-core? Really?
I understand the company’s stated desire to simplify. By offloading its software assets to Micro Focus, HPE should in theory be able to act more quickly in responding to market demands using what is now a streamlined portfolio specific to the rapidly evolving orchestrated infrastructure marketplace. Add to this the potential for HPE to go private in the future, and the market should see improved competition between HPE, Dell EMC, Cisco, and IBM.
What worries me about this divestiture (or as HPE has suggested, a “spin-merge”) is that HPE has dropped the very technologies that would have driven forward its newfound focus on cloud enablement and resource orchestration. In a formal statement on September 7th, CEO Meg Whitman said that HPE was sticking with, and still believed in, its original goal from November 2015 of building hybrid IT capable of enabling “the emerging intelligent edge that will power campus, branch and IoT applications for decades to come.”
Unfortunately, HPE customers will now need to look outside of HPE for their “non-core” data and analytics software in support of opportunities like IoT. They can go to Micro Focus, the new home of Vertica, IDOL, and Haven, and then perform their own integration. But why do that when so many vendors offer complete solutions (hardware, software, and services) built to address specific use cases within unique verticals? Ideas like Oracle’s Big Data Appliance or Dell EMC’s Isilon Data Lake make sense because they transfer what’s great about the cloud to the on-premises data center, namely scalability, speed (of deployment), and simplicity.
Certainly HPE can focus on creating differentiated hardware solutions capable of meeting any and all big data use cases, but it will have to do so without the speed and simplicity available from its solution-complete rivals. More importantly, it will have to make do without a business outcome. Hardware matters, but as we’ve seen with the rise of cloud services, hardware is truly a commodity item. What makes one bit of silicon different or better than another is what runs on that bit of burnt sand. What makes a big data solution matter to the business is business, not equipment — not equipment as an end unto itself anyway.
Given HPE’s earlier emphasis on building converged systems in support of analytics software from SAP and Microsoft as part of its HPE ConvergedSystem for Big Data portfolio, what HPE customers will likely get is what they have now: a single offering that pulls together software from multiple vendors. In that way, they can look forward to a wide range of hardware systems capable of supporting a wide range of software. That’s not a bad thing. Choice is important. It’s just not what we were looking forward to with the launch of HPE last November.