While both HPE and Microsoft offer machine learning platforms with numerous possibilities for developers and data scientists, HPE Haven OnDemand is a diverse collection of data APIs designed with flexibility in mind, allowing developers to quickly perform data tasks in the cloud.
Data is everywhere. It is big, complex, and growing exponentially in volume. Perhaps most importantly, data is not a fad, and the challenges associated with it are not going anywhere. Organizations are inundated with data these days, and turning it from liability to asset can be a challenge; the greatest potential asset is insight. With such a wide variety of data-related tools available today, it can be difficult to know where to begin looking for help.
This is where HPE Haven OnDemand comes in. HPE Haven OnDemand is a cloud services platform that simplifies how you interact with data, allowing it to be transformed into an asset anytime, anywhere. HPE Haven OnDemand provides a collection of machine learning application programming interfaces (APIs) for interacting with structured and unstructured data in a variety of ways.
HPE Haven OnDemand also includes APIs for anomaly detection, trend analysis, and a variety of other analytics tasks. A full overview of the APIs can be found here.
HPE Haven OnDemand is currently making headlines with its Machine Learning as a Service offering, which is hosted on Microsoft's Azure cloud. Since both companies have a machine learning platform, this article also contrasts the two offerings and toolsets. We will take an introductory look at HPE Haven OnDemand, with a focus on one of the most common and useful contemporary data-related tasks: prediction. A prediction task will be undertaken, and the process discussed. To put HPE Haven OnDemand's services in perspective, a similar process will be carried out using Azure Machine Learning, and the differences between the two platforms will be highlighted.
First off, signing up for HPE Haven OnDemand is straightforward, with common sign-on options including Google, Facebook, and Twitter authentication, as well as HP Passport. I was signed up with my Google ID in a matter of seconds. The only other step necessary prior to employing the various APIs is to generate an API key, after which I was on my way. More information on getting started can be found here.
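Once you have an API key, the Haven OnDemand APIs are called as plain REST endpoints, with the key passed along as a request parameter. The sketch below is a minimal illustration of assembling such a request in Python; the base URL, the `analyzesentiment` endpoint, and the `apikey` parameter name are assumptions drawn from memory of the platform's conventions, so consult the official API documentation for the exact endpoint names and versions.

```python
import urllib.parse

# Assumed base URL for synchronous Haven OnDemand API calls
BASE_URL = "https://api.havenondemand.com/1/api/sync"


def build_request(api, params, api_key):
    """Assemble the full URL (with query string) for a synchronous API call.

    `api` is the endpoint name, `params` its request parameters, and
    `api_key` the key generated from your Haven OnDemand account.
    """
    query = dict(params, apikey=api_key)
    return f"{BASE_URL}/{api}/v1?" + urllib.parse.urlencode(query)


# Illustrative call to a sentiment analysis endpoint
url = build_request(
    "analyzesentiment",
    {"text": "Haven OnDemand makes data tasks easy."},
    api_key="YOUR_API_KEY",
)
# The resulting URL can then be fetched with any HTTP client, e.g.:
#   import urllib.request
#   response = urllib.request.urlopen(url)
```

The same pattern applies to the prediction APIs used later in this article: only the endpoint name and parameters change, while the API key travels with every request.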
In order to make predictions, some data is obviously required.