
Lab Summary: Google ML Path

Predicting Visitor Purchases with BigQuery ML

 Use BigQuery to find public datasets
 Query and explore the ecommerce dataset
 Create a training and evaluation dataset to be used for batch prediction
 Create a classification (logistic regression) model in BigQuery ML
 Evaluate the performance of your machine learning model
 Predict and rank the probability that a visitor will make a purchase

In BigQuery, some columns are repeated fields (ARRAYs). The UNNEST operator flattens them and can be given an alias, like this:
UNNEST(hits) AS h,
UNNEST(h.product) AS p

There is also a SELECT * EXCEPT (...) clause for dropping columns from the result.
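For reference, a minimal sketch combining the two (the table is the lab's public ecommerce dataset; the dropped column is just an example):

-- Flatten the repeated hits and product fields, and drop one column from the output.
SELECT * EXCEPT (fullVisitorId)
FROM `data-to-insights.ecommerce.web_analytics`,
  UNNEST(hits) AS h,
  UNNEST(h.product) AS p
LIMIT 10;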

Entity and Sentiment Analysis with the Natural Language API

 Creating a Natural Language API request and calling the API with cURL
 Extracting entities and running sentiment analysis on text
 Performing linguistic analysis on text
 Creating a Natural Language API request in a different language
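A minimal sketch of such a request with cURL (assumes an API key exported as API_KEY; the sentence is arbitrary):

curl "https://language.googleapis.com/v1/documents:analyzeEntities?key=${API_KEY}" \
  -s -X POST -H "Content-Type: application/json" \
  --data '{
    "document": {
      "type": "PLAIN_TEXT",
      "content": "Google, headquartered in Mountain View, unveiled the new Android phone."
    },
    "encodingType": "UTF8"
  }'

Swapping analyzeEntities for analyzeSentiment or analyzeSyntax runs the sentiment and linguistic analyses.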
Vertex AI: Predicting Loan Risk with AutoML

 Upload a dataset to Vertex AI.
 Train a machine learning model with AutoML.
 Evaluate the model performance.
 Deploy the model to an endpoint.
 Get predictions.

Get Started with Generative AI Studio

 Create prompts with free-form and structured mode.
 Create conversations.
 Explore the prompt gallery.

Improving Data Quality

 Resolve missing values.
 Convert the Date feature column to a datetime format.
 Rename a feature column, remove a value from a feature column.
 Create one-hot encoding features.
 Understand temporal feature conversions.
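A small pandas sketch of two of these steps, datetime conversion and one-hot encoding (the toy dataframe is illustrative, not the lab's data):

import pandas as pd

df = pd.DataFrame({"Date": ["2023-01-05", "2023-02-10"], "color": ["red", "blue"]})
df["Date"] = pd.to_datetime(df["Date"])      # convert the Date feature to datetime64
df["month"] = df["Date"].dt.month            # temporal feature conversion
df = pd.get_dummies(df, columns=["color"])   # one-hot encode the categorical column
print(df.dtypes)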

Exploratory Data Analysis using Python and BigQuery

 Analyze a Pandas Dataframe
 Create Seaborn plots for Exploratory Data Analysis in Python
 Write a SQL query to pick up specific fields from a BigQuery dataset
 Exploratory Analysis in BigQuery
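A sketch of the round trip, assuming the google-cloud-bigquery client in an authenticated notebook (the natality sample table stands in for the lab's table):

import seaborn as sns
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT weight_pounds, mother_age
FROM `bigquery-public-data.samples.natality`
WHERE weight_pounds IS NOT NULL
LIMIT 1000
"""
df = client.query(sql).to_dataframe()   # BigQuery result -> Pandas Dataframe
sns.scatterplot(data=df, x="mother_age", y="weight_pounds")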

Introduction to Linear Regression

 Analyze a Pandas dataframe.
 Create Seaborn plots for exploratory data analysis.
 Train a linear regression model using Scikit-Learn.
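A minimal scikit-learn sketch (the seaborn tips dataset stands in for the lab's data):

import seaborn as sns
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

df = sns.load_dataset("tips")
X, y = df[["total_bill"]], df["tip"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LinearRegression().fit(X_train, y_train)   # ordinary least squares fit
print(model.coef_, model.intercept_)
print(model.score(X_test, y_test))                 # R^2 on held-out data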

Using BigQuery ML to Predict Penguin Weight

 Create a linear regression model using the CREATE MODEL statement with BigQuery ML.
 Evaluate the ML model with the ML.EVALUATE function.
 Make predictions using the ML model with the ML.PREDICT function.
 Explain with ML.EXPLAIN_PREDICT
 Explain Globally with ML.GLOBAL_EXPLAIN
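The core statements look roughly like this (the bqml_tutorial dataset name is an assumption; the penguins table is a BigQuery public dataset):

CREATE OR REPLACE MODEL `bqml_tutorial.penguins_model`
OPTIONS (model_type = 'linear_reg',
         input_label_cols = ['body_mass_g'],
         enable_global_explain = TRUE) AS  -- needed later by ML.GLOBAL_EXPLAIN
SELECT * FROM `bigquery-public-data.ml_datasets.penguins`
WHERE body_mass_g IS NOT NULL;

SELECT * FROM ML.EVALUATE(MODEL `bqml_tutorial.penguins_model`);

SELECT * FROM ML.PREDICT(MODEL `bqml_tutorial.penguins_model`,
  (SELECT * FROM `bigquery-public-data.ml_datasets.penguins`
   WHERE body_mass_g IS NOT NULL));

SELECT * FROM ML.GLOBAL_EXPLAIN(MODEL `bqml_tutorial.penguins_model`);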
Text Classification using reusable embeddings

 Use pre-trained TF Hub text modules to generate sentence vectors.
 Incorporate a pre-trained TF-Hub module into a Keras model.
 Deploy and use a text model on CAIP (Cloud AI Platform).
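A sketch of wiring a TF-Hub sentence-embedding module into Keras (the nnlm-en-dim50 module is an assumption; the lab may use a different one):

import tensorflow as tf
import tensorflow_hub as hub

# The hub layer maps a batch of raw strings to fixed-size sentence vectors.
embed = hub.KerasLayer("https://tfhub.dev/google/nnlm-en-dim50/2",
                       input_shape=[], dtype=tf.string, trainable=False)

model = tf.keras.Sequential([
    embed,
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classifier head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
print(embed(tf.constant(["hello world"])).shape)     # (1, 50)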

Keras for Text Classification using Vertex AI

 Learn how to tokenize and integerize a corpus of text for training in Keras
 Learn how to do one-hot-encodings in Keras
 Learn how to use embedding layers to represent words in Keras
 Learn about the bag-of-words representation for sentences
 Learn how to use DNN/CNN/RNN models to classify text in Keras
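A sketch of the tokenize/integerize step with the classic Keras Tokenizer (the two-sentence corpus is a toy):

from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

corpus = ["the cat sat on the mat", "the dog barked at the cat"]
tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)                    # build the word -> integer vocabulary
sequences = tokenizer.texts_to_sequences(corpus)  # sentences -> lists of integer ids
padded = pad_sequences(sequences, maxlen=6)       # pad/truncate to a fixed length
print(tokenizer.word_index)
print(padded)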

Exploring the Dialogflow API

You'll learn how to create a Dialogflow account and your first Dialogflow agent, which lets you define a natural language understanding model.

Could not do the lab because link was broken.

Text classification using reusable embeddings

 Use pre-trained TF Hub text modules to generate sentence vectors.
 Incorporate a pre-trained TF-Hub module into a Keras model.
 Deploy and use a text model on CAIP (Cloud AI Platform).

Text Classification Using Vertex AI

 Learn how to tokenize and integerize a corpus of text for training in Keras
 Learn how to do one-hot-encodings in Keras
 Learn how to use embedding layers to represent words in Keras
 Learn about the bag-of-words representation for sentences
 Learn how to use DNN/CNN/RNN models to classify text in Keras

Could not do the lab because the query failed:

ERROR:
404 Not found: Table bigquery-public-data:hacker_news.stories was not found in location US

Location: US
Job ID: f79fb493-587c-4051-a147-e172545ba3d9

Encoder-Decoder

 Create a tf.data.Dataset for a seq2seq problem.
 Train an encoder-decoder model in Keras for a translation task.
 Save the encoder and the decoder as separate models.
 Merge the trained encoder and decoder into a translation function.
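A sketch of the teacher-forcing dataset, with placeholder token ids (1 stands in for a start token, 0 for padding):

import tensorflow as tf

src = tf.constant([[5, 7, 2, 0], [9, 3, 4, 8]])  # tokenized source sentences
tgt = tf.constant([[1, 6, 2, 0], [1, 5, 7, 2]])  # tokenized target sentences

# Inputs are (encoder_input, decoder_input); the label is the decoder input
# shifted one step to the left (teacher forcing).
ds = tf.data.Dataset.from_tensor_slices(((src, tgt[:, :-1]), tgt[:, 1:]))
ds = ds.shuffle(100).batch(2).prefetch(tf.data.AUTOTUNE)

for (enc_in, dec_in), dec_target in ds.take(1):
    print(enc_in.shape, dec_in.shape, dec_target.shape)  # (2, 4) (2, 3) (2, 3)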

Text Classification using AutoML

 Import a text dataset to AutoML.
 Train the ML model for text classification.
 Evaluate the model performance.
 Deploy the model to an endpoint.
 Get predictions.

TensorFlow Dataset API

 Use tf.data to read data from memory.
 Use tf.data in a training loop.
 Use tf.data to read data from disk.
 Write production input pipelines with feature engineering (batching, shuffling, etc.).
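A minimal in-memory sketch of the pattern (random arrays stand in for real features):

import numpy as np
import tensorflow as tf

features = np.random.rand(100, 3).astype(np.float32)
labels = np.random.randint(0, 2, size=100)

ds = (tf.data.Dataset.from_tensor_slices((features, labels))
      .shuffle(buffer_size=100)      # shuffle within a 100-example buffer
      .batch(16)                     # mini-batches for the training loop
      .prefetch(tf.data.AUTOTUNE))   # overlap input prep with training

for x, y in ds.take(1):
    print(x.shape, y.shape)          # (16, 3) (16,)

Reading from disk swaps from_tensor_slices for something like tf.data.TextLineDataset over CSV files; the rest of the pipeline stays the same.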
Classifying Structured Data using Keras Preprocessing Layers

 Load a CSV file using Pandas.
 Build an input pipeline to batch and shuffle the rows using tf.data.
 Map from columns in the CSV to features used to train the model using Keras preprocessing layers.
 Build, train, and evaluate a model using Keras.
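A sketch of the column-mapping idea with two preprocessing layers (the column values are toy data):

import tensorflow as tf

# Numeric column: learn mean/variance, then standardize.
norm = tf.keras.layers.Normalization()
norm.adapt(tf.constant([[1.0], [2.0], [3.0]]))

# Categorical column: learn the vocabulary, then one-hot encode.
lookup = tf.keras.layers.StringLookup(output_mode="one_hot")
lookup.adapt(tf.constant(["cat", "dog", "cat"]))

print(norm(tf.constant([[2.0]])))    # standardized numeric feature
print(lookup(tf.constant(["dog"])))  # one-hot categorical feature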
Introducing the Keras Sequential API on Vertex AI Platform

 Build a DNN model using the Keras Sequential API.
 Learn how to use feature columns in a Keras model.
 Learn how to train a model with Keras.
 Learn how to save/load and deploy a Keras model on GCP.
 Learn how to deploy and make predictions with the Keras model.
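A minimal Sequential sketch including save/load (layer sizes are arbitrary):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                         # regression output
])
model.compile(optimizer="adam", loss="mse")

model.save("model.keras")                             # save to disk...
reloaded = tf.keras.models.load_model("model.keras")  # ...and load it back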

Training at Scale with Vertex AI Training Service

 Learn how to organize your training code into a Python package.
 Train your model using cloud infrastructure via the Google Cloud Training Service.
 (Optional) Learn how to run your training package using Docker containers and push training Docker images to a Docker registry.

Performing Basic Feature Engineering in BQML

 Create SQL statements to evaluate the model
 Extract temporal features
 Perform a feature cross on temporal features

CONCAT(CAST(EXTRACT(DAYOFWEEK FROM pickup_datetime) AS STRING),
       CAST(EXTRACT(HOUR FROM pickup_datetime) AS STRING)) AS hourofday,

# ML.TRAINING_INFO returns per-iteration training metrics; RMSE is the square root of the loss
SELECT *, SQRT(loss) AS rmse FROM ML.TRAINING_INFO(MODEL feat_eng.baseline_model)

# Here, the ML.EVALUATE function is used to evaluate model metrics
SELECT * FROM ML.EVALUATE(MODEL feat_eng.baseline_model)
Performing Basic Feature Engineering in Keras

 Create an input pipeline using tf.data.
 Engineer features to create categorical, crossed, and numerical feature columns.

Exploring and Creating an Ecommerce Analytics Pipeline with Cloud Dataprep v1.5

 Connect BigQuery datasets to Cloud Dataprep
 Explore dataset quality with Cloud Dataprep
 Create a data transformation pipeline with Cloud Dataprep
 Schedule transformation jobs that output to BigQuery

Vertex AI Workbench Notebook: Qwik Start

 Create a TensorFlow 2.x training application and validate it locally.
 Run your training job on a single worker instance in the cloud.
 Deploy a model to support prediction.
 Request an online prediction and see the response.

Vertex AI: Hyperparameter Tuning

 Modify training application code for hyperparameter tuning.
 Launch a hyperparameter tuning job from the Vertex AI UI.

Monitoring Vertex AI Models

 Deploy a pre-trained model.
 Configure model monitoring.
 Generate some artificial traffic.
 Interpret the data reported by the model monitoring feature.

Very complex – revisit if required

Introduction to Vertex Pipelines

 Use the Kubeflow Pipelines SDK to build scalable ML pipelines.
 Create and run a 3-step intro pipeline that takes text input.

Running Pipelines on Vertex AI 2.5

 Set up the Project Environment.
 Inspect and Configure Pipeline Code.
 Execute the AI Pipeline.

Structured data prediction using Vertex AI Platform

 Launch Vertex AI notebook instance
 Create a BigQuery Dataset and GCS Bucket
 Export from BigQuery to CSVs in GCS
 Training on Cloud AI Platform
 Deploy trained model

Introduction to TensorFlow Data Validation

 Review TFDV methods.
 Generate statistics.
 Visualize statistics.
 Infer a schema.
 Update a schema.

[method for method in dir(tfdv)]
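A sketch of the main calls, assuming tensorflow_data_validation is installed and running in a notebook (the dataframe is a toy):

import pandas as pd
import tensorflow_data_validation as tfdv

df = pd.DataFrame({"age": [25, 32, 41], "city": ["NYC", "SF", "NYC"]})

stats = tfdv.generate_statistics_from_dataframe(df)  # compute descriptive statistics
tfdv.visualize_statistics(stats)                     # Facets view (notebook only)
schema = tfdv.infer_schema(stats)                    # infer a schema from the statistics
tfdv.display_schema(schema)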

Advanced Visualizations with TensorFlow Data Validation

 Install TFDV
 Compute and visualize statistics
 Infer a schema
 Check evaluation data for errors
 Check for evaluation anomalies and fix them
 Check for drift and skew
 Freeze the schema

Distributed Training with Keras

 Define a distribution strategy and set an input pipeline.
 Create the Keras model.
 Define the callbacks.
 Train and evaluate the model.
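A sketch of the strategy-scope pattern (the model and sizes are illustrative):

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # one replica per visible GPU
print("Replicas:", strategy.num_replicas_in_sync)

with strategy.scope():                       # variables created here are mirrored
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"])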

TPU Speed Data Pipelines

 Use the tf.data.Dataset API to load training data.
 Use the TFRecord format to load training data efficiently from Cloud Storage.
Detecting Labels, Faces, and Landmarks in Images with the Cloud Vision API

 Create a Vision API request and call the API with curl.
 Use the label, face, and landmark detection methods of the Vision API.

You've looked at the Vision API's label, face, and landmark detection methods, but there
are three others you haven't explored. Dive into the docs to learn about the other three:

 Logo detection: identify common logos and their location in an image.
 Safe search detection: determine whether or not an image contains explicit content. This is useful for any application with user-generated content. You can filter images based on four factors: adult, medical, violent, and spoof content.
 Text detection: run OCR to extract text from images. This method can even identify the language of the text present in an image.
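For reference, a hedged sketch of the request shape (the bucket path and API_KEY are placeholders):

cat > request.json <<EOF
{
  "requests": [{
    "image": {"source": {"gcsImageUri": "gs://my-bucket/demo-image.jpg"}},
    "features": [{"type": "LABEL_DETECTION", "maxResults": 10}]
  }]
}
EOF

curl -s -X POST -H "Content-Type: application/json" \
  "https://vision.googleapis.com/v1/images:annotate?key=${API_KEY}" \
  --data-binary @request.json

Changing the feature type to FACE_DETECTION, LANDMARK_DETECTION, LOGO_DETECTION, SAFE_SEARCH_DETECTION, or TEXT_DETECTION exercises the other methods.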

Extracting Text from the Images using the Google Cloud Vision API

 Write and deploy several Background Cloud Functions.
 Upload images to Cloud Storage.
 Extract, translate and save text contained in uploaded images.

gcloud functions deploy ocr-extract \
  --runtime python39 \
  --trigger-bucket image_bucket_qwiklabs-gcp-02-0d0010f22e78 \
  --entry-point process_image \
  --set-env-vars "^:^GCP_PROJECT=qwiklabs-gcp-02-0d0010f22e78:TRANSLATE_TOPIC=translate_lab:RESULT_TOPIC=result_lab:TO_LANG=es,en,fr,ja"

gcloud functions deploy ocr-translate \
  --runtime python39 \
  --trigger-topic translate_lab \
  --entry-point translate_text \
  --set-env-vars "GCP_PROJECT=qwiklabs-gcp-02-0d0010f22e78,RESULT_TOPIC=result_lab"

gcloud functions deploy ocr-save \
  --runtime python39 \
  --trigger-topic result_lab \
  --entry-point save_result \
  --set-env-vars "GCP_PROJECT=qwiklabs-gcp-02-0d0010f22e78,RESULT_BUCKET=result_bucket_qwiklabs-gcp-02-0d0010f22e78"

Identifying Damaged Car Parts with Vertex AI for AutoML Vision Users

 Upload a labeled dataset to Cloud Storage using a CSV file and connect it to Vertex AI as a Managed Dataset.
 Inspect uploaded images to ensure there are no errors in your dataset.
 Review your trained model and evaluate its accuracy.
Classifying Images with a Linear Model

 Examine and understand the data.
 Implement a linear model using the Keras API.
 Plot the predictions.

Classifying Images with a NN and DNN Model

 Define Helper Functions.
 Train and evaluate a Neural Network (NN) model.
 Train and evaluate a Deep Neural Network model.
