Machine Learning Life Cycle: Build, Train, Publish to Model Catalog, and Deploy a ML Model

Lab 02 Practices

Estimated Time: 40 minutes

Setup Instructions for the Lab

The setup required to run this practice was already completed in Lab 01. Activate the notebook session to continue working with this lab.

Steps to activate and open the notebook session:

a. In the navigation menu, navigate to Machine Learning under Analytics & AI and click Data Science.
b. Select your compartment.
c. Click the project created in Lab 01.
d. Select the notebook session used in Lab 01.
e. Click the three dots on the right to open the Actions menu. Select Activate.
f. Click Open to open the notebook session.

Note: In case you are creating a new project and notebook session, refer to the instructions given in Lab 01.

Get Started

Overview

This lab will guide you through an entire machine learning life cycle. To start, the environment is set up. This is followed by a practical example of how to build, train, and save a machine learning model. The model is then published to the model catalog. The model catalog is a central repository of model artifacts and metadata. In the Oracle Cloud Infrastructure Data Science service, the model catalog lets data scientists share and control access to models and deploy them as REST endpoints via the Data Science Model Deployment service. The model catalog is centered on documentation, reproducibility, and auditability: these are critical concepts in ML experimentation.

Note: Model reproducibility and auditability are very real concerns for regulated industries that need to comply with audit rules and other traceability requirements.

The model in the model catalog is deployed as a REST endpoint using the Data Science Model Deployment service. A common use case for doing this is when an application needs to perform inference, but you want to keep the model separate from the application.
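The build-and-train step described above is implemented in the lab with ADS on a real employee-retention dataset. Purely as a dependency-free illustration of the underlying idea, here is a minimal binary classifier (logistic regression trained by batch gradient descent) on synthetic data; the feature names and the data-generating rule are hypothetical and are not the lab's dataset:

```python
# Illustrative sketch only: the lab's notebook uses ADS and a real
# employee-attrition dataset. This stand-alone version trains a tiny
# logistic-regression classifier on synthetic, hypothetical data.
import math
import random

def sigmoid(z: float) -> float:
    z = max(-30.0, min(30.0, z))  # clip to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression with batch gradient descent."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(d):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict(w, b, xi):
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Hypothetical features: [overtime_hours_per_week, years_at_company]
random.seed(0)
X = [[random.uniform(0, 20), random.uniform(0, 15)] for _ in range(200)]
y = [1 if xi[0] > 10 else 0 for xi in X]  # synthetic attrition rule

w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
print(f"training accuracy: {accuracy:.2f}")
```

In the lab itself this step is handled by scikit-learn and ADS rather than hand-written gradient descent; the sketch is only meant to make the "build and train" phase of the life cycle concrete.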
You will use ADS to deploy the model from a notebook. The dataset used in the lab practice is a binary classification dataset used to predict employee retention. While this example uses a classifier, the steps learned in this exercise are applicable to many other types of machine learning models.

In this lab, you'll:

a. Open and explore DataScience_Lab02.ipynb.
b. Publish the trained model to the model catalog and deploy the model for real-time inference.

Prerequisites

- Successful completion of Setup as described in Lab 01.
- Ensure that you select the General Machine Learning for CPUs on Python 3.8 conda environment, following the instructions for setting up a conda environment in Lab 01.

Open and Explore DataScience_Lab02.ipynb

A notebook has been prepared containing all the necessary Python code to build, train, and save the machine learning model for binary classification for predicting employee attrition with ADS.

Tasks

1. In the file browser, navigate to the directory . This directory was
2. Open the notebook DataScience_Lab02.ipynb by double-clicking it. A new tab opens in the workspace on the right.
3. Notice that the upper right corner of the notebook tab displays the name of the conda environment being used by this notebook. Confirm that the name you see is the slug name of the General Machine Learning for CPUs on Python 3.8 conda environment (generalml_p38_cpu_v1).
4. Scroll through each cell in the notebook and read the explanations. When you encounter a code cell, execute it (using Shift + Enter) and view the results. For executable cells, the "[ ]" changes to "[*]" while executing, then to a number, such as "[1]", when complete.
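The conda-environment check in step 3 can also be performed from a notebook cell. A minimal sketch, assuming (as is conventional for conda environments, including those in OCI Data Science notebook sessions) that the slug is simply the final path component of the environment prefix; the example path below is hypothetical:

```python
# Report the slug of the active conda environment from inside a notebook.
# Assumption: the slug (e.g. generalml_p38_cpu_v1) is the last path
# component of the environment prefix; verify this against your own setup.
import os
import sys

def conda_slug(prefix: str) -> str:
    """Return the final path component of a conda environment prefix."""
    return os.path.basename(os.path.normpath(prefix))

# In a live notebook session you would inspect the interpreter prefix:
print(conda_slug(sys.prefix))

# With a hypothetical OCI Data Science prefix:
print(conda_slug("/home/datascience/conda/generalml_p38_cpu_v1"))
# → generalml_p38_cpu_v1
```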
(If you run short on time, you can use the Run menu to run the remaining cells and then review the results.)

When a model is saved with ADS, metadata about your model object is automatically extracted and stored in the model catalog. You will review some of the key extracted metadata.

1. Select your project.
2. On the project details page, click the Models option available under Resources.
3. Click the model you just saved with the DataScience_Lab02 notebook.
4. On the left pane under Resources, you can see details of Model provenance, Model taxonomy, Model introspection, and the input and output data schema. Examine each of these.
5. Click Model Provenance under Resources. Model provenance is documentation that helps you improve the model's reproducibility and auditability.
6. Click the Model training source code tab. It gives the Git reference to the training source code for the model you created.
7. Under Resources, click Model Taxonomy. Model taxonomy allows you to describe the model you are saving to the model catalog. You can use pre-allocated fields or create custom attributes.
8. Under Resources, click Model Introspection. Model introspection captures the results of the tests run on the client side before the model is saved to the model catalog.
9. Under Resources, click Input and output data schema. The input schema describes the data the model expects as input; the output schema specifies what data results are returned and how. These can be visualized from the input and output data schema tabs, respectively.
10. A simple deployment configuration is created, which results in an active deployment. This takes some time to become active.
11. To invoke your deployed model, under Resources, click Invoking Your Model.
12. Click Copy next to your model's HTTP endpoint URI.
13.
Go back to the notebook DataScience_Lab02.ipynb and paste the URI in the notebook cell (under the header Invoking your Model with the Predict from Console Model Deployment) where the URI variable is assigned.
14. Run the notebook again. You will successfully invoke a new model endpoint!

Purge Instructions

Deleting Model Deployments

1. From the navigation menu, select Analytics & AI and click Data Science.
2. You are now on the Projects page.
3. Select the Data Science project with the model deployment (to be deleted).
4. Click Model deployments under the Resources section.
5. Click the Actions icon, and then select Terminate.
6. When prompted for confirmation, provide the deployment name (as entered) and click Delete.

Deleting Models

1. From the navigation menu, select Analytics & AI and click Data Science.
2. You are now on the Projects page.
3. Select the Data Science project with the model (to be deleted).
4. Click Models under the Resources section.
5. Click the Actions icon, and then select Delete.
6. When prompted for confirmation, provide the model's name (as entered) and click Delete.

Deactivating Notebook Sessions

1. From the navigation menu, under Analytics & AI, click Data Science.
2. You are now on the Projects page.
3. Select the Data Science project with the notebook sessions (to be deactivated).
4. Click Notebook sessions under the Resources section.
5. Click the Actions icon, and then select Deactivate.

Note: It is a best practice to deactivate the notebook session to save cost. You can keep the notebook session deactivated until it is required in subsequent labs.

This completes the cleanup of the artifacts for this lab.
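As a companion to the Invoking Your Model steps earlier in this lab: the notebook cell you pasted the URI into sends a JSON payload to the deployed endpoint. Real calls to an OCI Model Deployment endpoint must be signed (ADS or the OCI SDK handles this in the notebook), so the sketch below only assembles, and does not send, such a request with the standard library; the URI placeholder and feature names are hypothetical:

```python
# Assemble (but do not send) an HTTP request for a deployed model endpoint.
# The URI and feature names are placeholders; real OCI Model Deployment
# calls additionally require request signing, which this sketch omits.
import json
import urllib.request

uri = "https://modeldeployment.<region>.oci.customer-oci.com/<deployment-ocid>/predict"

# Hypothetical input matching an employee-attrition model's input schema,
# e.g. [overtime_hours_per_week, years_at_company].
payload = {"data": [[12.5, 3]]}

request = urllib.request.Request(
    uri,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# In the notebook, ADS or a signed session executes the request; the JSON
# response shape (e.g. a list of predictions) depends on your model.
```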
