Walkthrough Introduction To Vertex AI Pipelines

This document provides instructions for setting up a machine learning pipeline using Vertex AI Pipelines. It outlines the following key steps: 1. Setting up the cloud environment by activating Cloud Shell, finding the project ID, and enabling necessary APIs. 2. Creating a Google Cloud Storage bucket to store pipeline artifacts. 3. Configuring the environment by enabling the Vertex AI API and launching a Vertex AI Workbench notebook. 4. Cloning an example notebook for pipeline creation. 5. Using the notebook to define and run a basic pipeline that analyzes training data to demonstrate pipeline concepts.



Introduction to Vertex AI Pipelines


Lab Walkthrough
Workflow: Introduction to Vertex AI Lab

Setup Steps

1. Cloud Setup: set up your Google Cloud environment.
2. Storage: create a Google Cloud Storage bucket.
3. Environment Setup: set up your environment and launch notebooks.

Pipeline Creation

4. Launch Notebook: launch a Vertex AI Workbench notebook.
5. Clone Notebook: clone a notebook within your instance.
6. Create Pipeline: create a basic pipeline to understand pipeline concepts.
Task 1:
Cloud Environment Setup
Activate Cloud Shell
Find the Google Cloud Project ID
Enable the APIs
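The steps above can be sketched as Cloud Shell commands. This is a sketch, not the lab's exact script: the specific API list is an assumption and may differ from your lab instructions.

```shell
# Find the current Google Cloud project ID (Cloud Shell is already authenticated).
PROJECT_ID=$(gcloud config get-value project)
echo "Project ID: ${PROJECT_ID}"

# Enable the services the lab needs.
# (Assumption: this lab typically uses Compute Engine, Container Registry,
# and the Vertex AI API; check your lab instructions for the exact list.)
gcloud services enable \
  compute.googleapis.com \
  containerregistry.googleapis.com \
  aiplatform.googleapis.com
```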
Task 2:
Create a GCS Bucket
Create a Cloud Storage Bucket
Grant access to the bucket

Next, we give our compute service account access to this bucket. This ensures that Vertex Pipelines has the necessary permissions to write files to it.
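The bucket creation and the service-account grant can be sketched as follows. The bucket name, region, and role are placeholders and assumptions; your lab may specify different values.

```shell
# Create a Cloud Storage bucket for pipeline artifacts.
# (Assumption: bucket named after the project, in us-central1.)
PROJECT_ID=$(gcloud config get-value project)
BUCKET_NAME="gs://${PROJECT_ID}-bucket"
gsutil mb -l us-central1 ${BUCKET_NAME}

# Grant the default Compute Engine service account object access,
# so Vertex Pipelines can write files to the bucket.
PROJECT_NUMBER=$(gcloud projects describe ${PROJECT_ID} \
  --format='value(projectNumber)')
SVC_ACCOUNT="${PROJECT_NUMBER}-compute@developer.gserviceaccount.com"
gsutil iam ch serviceAccount:${SVC_ACCOUNT}:roles/storage.objectAdmin ${BUCKET_NAME}
```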
Task 3:
Your Environment Setup
Navigate to the Vertex AI section of your Cloud Console and click Enable
Vertex AI API.
Select Vertex AI, then Workbench.
Task 4:
Launch Vertex AI Workbench
TensorFlow 2.6 – LTS is “Long Term Supported”
Here are the parameters of your new Workbench notebook. Your region may show "us-central" - do not change your region. It takes a few minutes to spin up your notebook.

Here is your new notebook. Select "Open JupyterLab."

You will see a "Build recommended" pop-up; click Build. If the build fails, ignore it. You may also see another message - just dismiss it.
Task 5:
Clone the Notebook
Clone the training-data-analyst notebook in your JupyterLab instance. In JupyterLab, click the Terminal icon to open a new terminal. At the command-line prompt, run the clone command. After you run it, you will see a new folder.
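The clone command is along these lines. The repository URL is an assumption based on the directory name the walkthrough shows (GoogleCloudPlatform's public training-data-analyst repo); use the URL given in your lab instructions if it differs.

```shell
# Clone the course repository into the JupyterLab instance.
# (Assumption: the public GoogleCloudPlatform/training-data-analyst repo.)
git clone https://github.com/GoogleCloudPlatform/training-data-analyst
```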

To confirm that you have cloned the repository, double-click the training-data-analyst directory and ensure that you can see its contents. The files for all the Jupyter notebook-based labs throughout this course are available in this directory.
Task 6:
Create the Vertex AI Pipeline
Here is the notebook!
After installing these packages, you'll need to restart the kernel. Then check that the packages are installed.
https://arxiv.org/ftp/arxiv/papers/1602/1602.07637.pdf
https://kubeflow-pipelines.readthedocs.io/en/latest/source/kfp.dsl.html
Selecting one of the nodes shows its information here. You can also see the JSON for the entire run when you select the run from the pipeline interface.
