Monday, December 18, 2017

Azure Machine Learning in Practice: Productionalization

Today, we're going to finish up our Fraud Detection experiment.  If you haven't read our previous posts in this series, it's recommended that you do so.  They cover the Preparation, Data Cleansing, Model Selection, Model Evaluation, Threshold Selection, Feature Selection and Feature Engineering phases of the experiment.  In this post, we're going to walk through the Productionalization process.

Productionalization is the process of taking the work we've done so far and making it accessible to the end user.  This is by far the most important process.  If we are unable to connect the end user to the model, then everything up until now was for nothing.  Fortunately, this is where Azure Machine Learning really differentiates itself from the rest of the data science tools on the market.  First, let's create a simple experiment that takes our testing data and scores that data using our trained model.  Remember that we investigated the use of some basic engineered features, but found that they didn't add value.
Productionalization
Now, let's take a minute to talk about web services.  A web service is a simple resource that sits on the Internet.  A user or application can send a set of data to this web service and receive a set of data in return, assuming they have permission to do so.  In our case, Azure Machine Learning makes it incredibly simple to create and deploy our experiment as an Azure Web Service.
Set Up Web Service
On the bar at the bottom of the Azure Machine Learning Studio, there's a button for "Set Up Web Service".  If we click it, we get a neat animation and a few changes to our experiment.
Predictive Experiment
We can see that we now have two new modules, "Web Service Input" and "Web Service Output".  When the user or application hits the web service, these are what they interact with.  The user or application passes a data set to the web service as a JSON payload.  Then, that payload flows into our Predictive Experiment and is scored using our model.  Finally, that scored data set is passed back to the user or application as a JSON payload.  The simplicity and flexibility of this request-response pattern means that virtually any environment can easily integrate with Azure Machine Learning experiments.  However, we need to deploy it first.
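
To make that round trip concrete, here's a minimal sketch of what the exchange might look like.  The column names and values are hypothetical placeholders for our fraud features, and the exact payload shape depends on the service's swagger definition, so treat this as an illustration rather than the service's actual contract.

    import json

    # Hypothetical request payload.  "input1" is the default name of the
    # Web Service Input; the columns below are placeholders for whatever
    # features our Predictive Experiment expects.
    request_payload = {
        "Inputs": {
            "input1": [
                {"Time": 0, "V1": -1.36, "V2": -0.07, "Amount": 149.62}
            ]
        },
        "GlobalParameters": {}
    }

    # Hypothetical response: the scored data set echoed back through the
    # Web Service Output, with the model's label and probability appended.
    response_payload = {
        "Results": {
            "output1": [
                {"Time": 0, "V1": -1.36, "V2": -0.07, "Amount": 149.62,
                 "Scored Labels": "0", "Scored Probabilities": 0.03}
            ]
        }
    }

    print(json.dumps(request_payload, indent=2))
    print(json.dumps(response_payload, indent=2))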
Deploy Web Service
Just like with creating the web service, deployment is as easy as clicking a button on the bottom bar.  Unless you have a reason to do otherwise, it's good practice to deploy a new web service, as opposed to a classic one.
Web Service Deployment
Now, all we have to do is link it to a web service plan and we're off!  You can find out more about web service plans and their pricing here.  Basically, you can pay as you go or you can buy a bundle at a discount and pay for any overages.  Now, let's take a look at a brand new portal, the Azure Machine Learning Web Services Portal.
Azure Machine Learning Web Services Portal
This is where we can manage and monitor all of our Azure Machine Learning Web Services.  We'll gloss over this for now, as it's not the subject of this post.  However, we may venture back in a later post.  Let's move over to the "Consume" tab.
Azure Machine Learning Web Service Consumption Information
On this tab, we can find the keys and URIs for our new web services.  However, there's something far more powerful lurking further down on the page.
Sample Web Service Code
Azure Machine Learning provides sample code for calling the web service in four languages: C#, Python, Python 3+ and R.  This is amazing for us because we're not developers.  We couldn't code our way out of a box.  But Azure Machine Learning makes it so easy that we don't have to.
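
To give you a feel for what that sample code boils down to, here's a hedged Python sketch of calling the web service with the requests library.  The URI, API key and feature columns are placeholders; swap in the real values from the Consume tab, and note that the samples generated in the portal may use urllib instead of requests.

    import requests

    # Placeholders -- copy the real values from the "Consume" tab.
    API_KEY = "YOUR_API_KEY"
    URI = "https://<region>.services.azureml.net/<your-service-path>/execute?api-version=2.0&format=swagger"

    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + API_KEY,
    }

    # One (hypothetical) transaction to score.
    payload = {
        "Inputs": {
            "input1": [
                {"Time": 0, "V1": -1.36, "V2": -0.07, "Amount": 149.62}
            ]
        },
        "GlobalParameters": {}
    }

    response = requests.post(URI, headers=headers, json=payload)
    response.raise_for_status()

    # The scored data set comes back as JSON, mirroring the Web Service Output.
    print(response.json())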

Hopefully, this post sparked your imagination for all the ways that you could utilize Azure Machine Learning in your organization.  Azure Machine Learning is one of the best data science tools on the market because it drastically slashes the amount of time it takes to build, evaluate and productionalize your machine learning algorithms.  Thanks for reading.  We hope you found this informative.

Brad Llewellyn
Data Science Consultant
Valorem
@BreakingBI
www.linkedin.com/in/bradllewellyn
llewellyn.wb@gmail.com
