AIA 6600 Module 5

Discussion on NXU AIA 6600

Module 5: Introduction

Welcome to Module 5, Use of TensorFlow, Boosting, and Bagging: Advanced
Pattern Analysis Algorithms in AI.

This is the third module focused on tools and techniques, which is a clear
indication of how important tools are to implementation and use. If you use
the incorrect tools, the system will not work optimally, opportunities will be
missed, and the organization may come to depend on misleading data. This
module focuses on more advanced applications of TensorFlow, which was
briefly introduced in Module 1. By the end of the module, you will have an
improved understanding of, and greater comfort with, using TensorFlow as
part of an ML/AI implementation.

Recognizing the value of an AI implementation and the elements that ensure
its success is pertinent; however, the right tools must also be available to
achieve your goals and the organization’s. We introduced some of these
tools in previous modules. Two of them are QUEST and C5.0, both used in
various capacities for advanced pattern analysis.

 Introduction
 Overview
 TensorFlow
 Boosting
 Bagging
 Module Wrap-up

Module 5: TensorFlow and Pattern Recognition

With AI and advanced pattern analysis, several different and unique
algorithms are available to implement. TensorFlow, boosting, and bagging
will be analyzed, and the optimal use case explored for each.
Let’s look at a couple of videos that will help lay a foundation for this
module. The first video is an introduction to pattern recognition. Pattern
recognition is not just for stop signs and vehicle cameras making sense of
their environment; it also applies to something as simple as handwriting. It
may be used with classification for spam, regression analysis with stock
prices, or clustering for customer segmentation.
This video is applicable because pattern recognition is one of the functions
of TensorFlow, which we will focus on in our next topic. As you review the
video, consider the following questions:
1. From the recording, what is scikit-learn used for, and what makes it
important for this module?
2. For pattern recognition, a common use is to decipher handwriting. What
are two industries that use this, and how?
https://www.youtube.com/watch?v=zb4J8_weas0
The next video is an introductory recording focusing on patterns within
pictures. This is applicable because image pattern analysis is one of the
functions of TensorFlow. As you review the video, consider the following
questions:

1. With letter or handwriting pattern recognition, how does the number of lines with
each character increase or decrease the complexity and processing needs?
2. How would curves make this more or less complex?

https://www.youtube.com/watch?v=mWYUx_HJeSM&t=1s

Module 5: Neural Networks and Image Recognition

Image recognition is used in the mainstream by organizations across the
nation. It uses AI algorithms to identify objects, people, places, and other
items as needed for the use case. A program can also label images by their
content and guide equipment (e.g., robots, autonomous vehicles, and
driver-assistance systems on expressways).
To work optimally, image recognition uses convolutional neural networks
(CNNs). Much as the human visual cortex does, a CNN divides the visual
data into pieces and analyzes the image piece by piece. This is a core
component of computer vision.
Commercial applications abound, including e-commerce, gaming,
automotive, manufacturing, education, and law enforcement. In
e-commerce, image recognition is used to process, categorize, and name
product images automatically. The gaming industry uses it to place a digital
layer on top of a real-world image; an example is Pokemon Go. Within the
automotive industry, it has many uses, primarily facilitating autonomous
driving: detecting and recognizing objects in the road such as moving
objects, vehicles, trucks, people, roadways, sidewalks, and street signs. In
manufacturing, it may be used to decrease the number of defects
encountered during production. An example of this, processed with
TensorFlow, can be examined in the following article:

Neural Networks for Image Recognition: Methods, Best Practices,
Applications

Use of Convolutional Neural Network for Image Classification
A recent application involves a forward-looking camera attached to a
helmet, with visualizations appearing on the face shield. Curiously, it also
has a thermal function to measure body temperatures, along with facial
recognition capabilities. Vehicle applications are fascinating; you have
probably seen these on the news or in advertising. As the vehicle drives
down the road, the camera views the vehicle’s environment and categorizes
the images into humans, other vehicles, etc. This is a great use case because
CNNs are engineered to process efficiently, correlate images, and make
sense of massive amounts of data. From this description, you can see how
vitally important this is.
Present-day image recognition in computer science operates much like a
human eye. At the back of the eye are cells that pick up pieces of the image
after it enters through the front. The computer version does not have
individual cells lined up at the back of an eye; instead, it takes the 2D image
and splits it into very small squares. Each cell in the eye is analogous to a
square in the computer system. The system works by applying an algorithm
to the image, which first simplifies the image and focuses on the most
important information.
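The splitting into small squares described above can be sketched in a few lines. This is an illustrative sketch in plain Python (not TensorFlow itself), with a made-up 4x4 grayscale "image" divided into 2x2 tiles, the same way a vision pipeline partitions an image before filtering:

```python
def split_into_squares(image, size):
    """Split a 2D grid of pixel values into size x size tiles,
    scanning left to right, top to bottom."""
    tiles = []
    for row in range(0, len(image), size):
        for col in range(0, len(image[0]), size):
            tile = [r[col:col + size] for r in image[row:row + size]]
            tiles.append(tile)
    return tiles

# A hypothetical 4x4 grayscale image (0 = black, 255 = white).
image = [
    [0,   0,   255, 255],
    [0,   0,   255, 255],
    [90,  90,  30,  30],
    [90,  90,  30,  30],
]

tiles = split_into_squares(image, 2)
print(len(tiles))   # 4 tiles
print(tiles[0])     # [[0, 0], [0, 0]] -- the all-black corner
```

Each tile then becomes the unit a filter or node operates on, just as each retinal cell responds to one piece of the visual field.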
Once the image is prepared, the CNN begins analyzing it. Neural networks
are built from interconnected nodes, often termed perceptrons. Each
perceptron analyzes a piece of the overall image (e.g., one pixel region),
computes a weighted sum of its inputs, and applies a simple function, the
activation function, which generates a result.
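The perceptron computation just described can be sketched directly. The pixel intensities, weights, and bias below are invented numbers for illustration; in a real network they would be learned during training.

```python
import math

def perceptron(inputs, weights, bias):
    """One node: a weighted sum of the inputs plus a bias,
    passed through a sigmoid activation function."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # squashes the result into (0, 1)

# Hypothetical pixel intensities (scaled to 0-1) and learned weights.
pixels = [0.0, 0.5, 1.0]
weights = [0.4, -0.2, 0.9]
output = perceptron(pixels, weights, bias=-0.3)
print(round(output, 3))  # sigmoid(0.5), approximately 0.622
```

The activation output is the "result" the text refers to; in a full network it is passed along as input to the next layer of nodes.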

Module 5: TensorFlow Reflection Questions


Pattern recognition is used with vehicle systems and their sensors (e.g.,
cameras, radar, LiDAR, and others). These systems accept data from the
sensors about the environment; the data is analyzed to determine whether
an object is a person crossing the street, a stop sign, another vehicle, or
some other environmental asset. Consider the following:
1. What do you see as potential issues with this?
2. Does the analysis need to be timely for the vehicle to operate on the road?
3. What could happen if a street sign was mistaken for a vehicle?

Module 5: TensorFlow
TensorFlow is a tool that helps both beginner and advanced users create
pattern analysis models and algorithms. This section explores its more
advanced applications.
TensorFlow assists the user in developing and training ML models. The tool
is comprehensive in its functionality yet flexible. It can quickly and easily
generate models using high-level APIs (e.g., Keras), a quality feature in that
there is no substantial wait to test models, work through workflow issues,
and debug. The tool is also a quality addition because a model may be
trained and deployed in the cloud, on premises, or on a device, depending
on that device’s processing ability and memory, regardless of platform.
TensorFlow was designed to take an idea from early concept to code faster
than other tools or protocols.

Using TensorFlow for Time-Series Data


Now let’s review a video that shows how to use time-series data in a
predictive manner. In this case, the data was weather related. This shows
another industry use for TensorFlow. Consider the following questions:

1. With what other data sets might you use this time-series approach?
2. In your opinion, how much data would be a minimum for analysis of material costs
and availability?

https://www.youtube.com/watch?v=6f67zrH-_IE
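Time-series models like the one in the video are typically trained on sliding windows: several consecutive observations as the input, and the next observation as the target. Here is a minimal sketch of that windowing step in plain Python, using invented temperature readings:

```python
def make_windows(series, window):
    """Turn a series into (input window, next value) training pairs."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

# Hypothetical daily temperature readings.
temps = [21.0, 22.5, 23.1, 22.8, 24.0, 25.2]
pairs = make_windows(temps, window=3)
print(len(pairs))   # 3 training pairs
print(pairs[0])     # ([21.0, 22.5, 23.1], 22.8)
```

A model is then fit to predict the second element of each pair from the first; the same windowing idea applies whether the series is weather, material costs, or stock prices.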

How TensorFlow Teaches Computers to Think Like Humans

How TensorFlow, which Google just made free, teaches computers to think
like humans was published just after TensorFlow was made open source by
Google. The article is older; however, it provides a baseline for the software.
You can read it to appreciate where the software started compared to where
it is now. Consider the following questions:

1. How has the functionality of TensorFlow increased over the years?


2. What is the most important improvement from the readings and your experience,
and why?

TensorFlow Independent Reflection


Consider the following questions in the use of TensorFlow:

1. If you were in a manufacturing firm producing toys, how would you use the
programmable interface?
2. Which visualizations do you think you would want to use, for which potential data
sets, and why?

Module 5: Examples of TensorFlow in Industry

TensorFlow is also meant to be used by the full range of users, from
beginners to experts. Beginners can start their projects with the
user-friendly Sequential API, which allows the user to piece a model
together from building blocks of code. This is one end of the spectrum and
may be where you start, depending on your experience. On the other end
are the experts: TensorFlow is well-purposed for advanced research and
applications. The user can subclass the model and customize layers,
activations, and training loops, among other activities, with relative ease.
As you can see, model building, integral to ML and AI, is flexible and not
overly taxing.
TensorFlow has been used to assist with practical problems in many areas.
These vary per use case; however, they may be grouped by use or industry.
There are too many industries to cover them all, but the following
interactive gives information on several relevant ones. Select each tab to
learn more.

Healthcare
This has been used for detecting respiratory diseases and assisting the
medical staff in prescribing the best antibiotics for the patient. GE Healthcare
uses TensorFlow to train a neural network to identify anatomy from brain
MRIs.
Hospitality
Airbnb uses TensorFlow to improve the guest experience via classifying
images and detecting objects at scale.

Aeronautics
Airbus uses TensorFlow to pull information from its satellite images for
analysis and delivery to its clients.

Retail
Coca-Cola uses this to enable mobile proof-of-purchase for easy
transactions.

IT
Intel uses TensorFlow to optimize inference for their different models in use.
Lenovo uses this to assist in improving its deep learning models. Qualcomm
uses this on their Snapdragon models. Twitter uses this for ranking tweets
and building the ranked timeline.

Module 5: TensorFlow and AI Value Chain


TensorFlow has many capabilities, which are described in the following
video. One of these is text classification, which assists with assigning
categories to text documents. As you review the video, consider the
following questions:

 Does the language of the text matter? Why or why not?
 If part of the text were in a different format (e.g., mixed languages, bold, or
italics), would this create a significant issue for the ML model, and if so, how
would you correct it?
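Before a model can assign a category, text must be converted into numeric features. A minimal bag-of-words sketch follows; this is plain Python rather than TensorFlow's own text pipeline, and the vocabulary and sentences are invented for illustration:

```python
def bag_of_words(text, vocabulary):
    """Count how often each vocabulary word appears in the text."""
    words = text.lower().split()
    return [words.count(v) for v in vocabulary]

vocab = ["free", "offer", "meeting", "report"]
print(bag_of_words("FREE offer free gift", vocab))      # [2, 1, 0, 0]
print(bag_of_words("Quarterly report meeting", vocab))  # [0, 0, 1, 1]
```

Lowercasing handles the mixed-capitalization case; a real pipeline would add similar normalization for punctuation, markup such as bold or italics, and possibly per-language tokenization.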

With the application of TensorFlow, let’s examine the AI value chain. As a reminder,
the phases are Project, Produce, Promote, and Provide. Select each item below to
learn more.

Project
The use of TensorFlow allows models to be tested before being implemented.
This is a large step for research and development, saving time and resources
(labor, capital, etc.). It is also an exceptional tool for forecasting: with
modeling, the team can move the forecast from hopeful guessing to a
data-based forecast.
Produce
With TensorFlow modeling used for the forecast function, the proposed
production of the goods the firm manufactures can be optimized. This moves
the work from spreadsheets to predictive models engineered to narrow the
forecast considerably. Modeling can also drive maintenance schedules to
ensure the organization’s equipment does not break down before being
maintained; TensorFlow’s models and processing power are adept at this
preventive maintenance.

Promote
A TensorFlow model may be used to forecast the best or optimal uses for
marketing dollars before they are spent, focusing the funds on the target
markets most likely to spend money.

Provide
Enhancing the customer experience follows from the prior steps, allowing
the operations and sales functions to move much more smoothly.

Module 5: TensorFlow and Machine Learning

Review the following articles for additional applications of TensorFlow.

Smart Homes
Intelligent smart home energy efficiency model using artificial TensorFlow
engine examines the lack of home and IoT operating systems integrated
into a central system due to independent platforms. The researchers used
TensorFlow for the analysis. This shows another area in which to use
TensorFlow. Consider the following questions:

1. With the lack of data or integrated data, what would the system or you do (before
submitting the data to be analyzed) for this to be “clean” and usable?
2. How would you analyze the two separate data sets to build a thorough analysis if
both could not be integrated?

Deep Learning Model and Snakemake Workflow

TensorFlow based deep learning model and snakemake workflow for
peptide-protein binding predictions reviews using a TensorFlow deep
learning model to predict the binding of peptides. This provides another use
of TensorFlow for you to consider. Consider the following questions:

1. We have seen TensorFlow used in many industries and instances. What instances or
industries would this not work well in?
2. When might the data provide a less accurate prediction in this use case with the
peptides?

Reflection
You are managing the construction of a new bridge to replace an aging one.
You have read about ML and AI and how these can assist with business. You
ask the data science staff member to investigate this.
The activity is to apply the use case to the value chain and respond for each
of the four phases in a robust, meaningful way. The four phases in the AI
value chain to elaborate on are Project, Produce, Promote, and Provide.

Module 5: Boosting Overview


The boosting algorithm is engineered to increase the accuracy of other
algorithms. This section will focus on its background, use cases, and
peculiarities.
The focus with ML models overall has been on a predictive value that can be
trusted. Rather than using one model, this is done by training a set of
models, each improving on the prior generation, to improve the algorithm’s
predictive ability. The first method we’ll discuss is boosting.
Boosting has proven to be one of the more popular and effective ML
approaches, used in both regression and classification. Boosting is an
iterative process: each iteration builds on the prior one as the algorithm
learns. The iterations change and adapt the training samples so that the
process focuses more on the samples the system finds difficult to classify.
Unlike bagging, which will be discussed in greater detail in the next topic,
boosting also assigns a weight to each sample. These weights may change
from iteration to iteration as the algorithm adapts; this is known as
re-weighting. Re-evaluating this weighted average at each iteration is what
reduces the bias.
Bagging and Boosting Example
The following video reviews the two methods, bagging and boosting, among
other topics. It covers the what, why, where, and when of their use. As you
review the video, consider the following questions:

1. How does ensemble learning work?


2. How does combining these provide better predictions than using only one?

https://www.youtube.com/watch?v=m-S9Hojj1as

Let’s use spam email as a practical example. You can have certain criteria to
define an email as spam or not. Suppose the criteria or rules you determine
are not strong enough on their own to classify an email as spam or not;
such criteria/rules are called weak learners.
We want strong learners, meaning criteria/rules that can identify a spam
email correctly. To accomplish this, the system computes a weighted
average at each iteration and takes the prediction with the most votes. For
instance, imagine there are seven weak learners. Five indicate the email is
spam, and two indicate it is not. Since more weak learners voted or
predicted spam, the system classifies the email as spam. Through this
iterative process, the weak learners combine into a strong learner as
prediction accuracy increases.
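The seven-learner vote above can be sketched as a weighted majority vote. Here each weak learner is just a stored vote with a weight; in real boosting (e.g., AdaBoost) the weights are derived from each learner's error rate, but the numbers below are invented for illustration.

```python
def weighted_vote(votes):
    """votes: list of (prediction, weight) pairs.
    Returns the label with the larger total weight."""
    totals = {}
    for label, weight in votes:
        totals[label] = totals.get(label, 0.0) + weight
    return max(totals, key=totals.get)

# Seven weak learners with equal weight: five say spam, two say ham.
equal = [("spam", 1.0)] * 5 + [("ham", 1.0)] * 2
print(weighted_vote(equal))    # spam

# With unequal weights, two confident learners can outvote five weak ones.
unequal = [("spam", 0.2)] * 5 + [("ham", 1.5)] * 2
print(weighted_vote(unequal))  # ham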
Several algorithms are used with boosting. A sample of these includes
AdaBoost (the most popular and widely used), LPBoost, BrownBoost,
XGBoost, MadaBoost, LogitBoost, and many others.
One area this may be applied to is facial recognition, which became
controversial in 2020 as police departments and government units
implemented it; the issue pivots on a person’s right to privacy.
Boosting has also been applied to palmprint identification, used, among
other purposes, for authentication into secure facilities. In this method, the
palm texture/print may be converted to grayscale to accentuate the print
into more of a binary feature (print or no print) within a selected block of
the palmprint.
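The grayscale-to-binary step described for palmprints is simple thresholding. A sketch with made-up pixel values follows; real systems use adaptive thresholds and local binary patterns, but the core idea is the same.

```python
def to_binary(gray_block, threshold=128):
    """Map each grayscale pixel (0-255) to 1 (print) or 0 (no print)."""
    return [[1 if px < threshold else 0 for px in row] for row in gray_block]

# Hypothetical 3x3 block of a palmprint scan; dark pixels are ridge lines.
block = [
    [ 30, 200,  40],
    [210,  25, 190],
    [ 50, 180,  60],
]
print(to_binary(block))  # [[1, 0, 1], [0, 1, 0], [1, 0, 1]]
```

The resulting 0/1 grid is the kind of binary feature a boosted classifier can then vote on, block by block.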
Module 5: Boosting Applications

Review the following articles for additional applications of boosting.

Palmprint Identification
Palmprint identification using boosting local binary pattern provides a new
method for you to review. You will be able to apply this or a similar method
to your research. Consider the following questions:

1. What makes using the grayscale so important in this research?


2. If the palm print scan were not of sufficient quality, how would this affect the
algorithm’s functioning?

Emotion Recognition
Emotion recognition with boosted tree classifiers focuses on a new system
to recognize emotions on a subject’s face using video datasets and boosting.
This will expose you to another example of boosting based on image
analysis. Consider the following questions:

1. The researchers’ report indicated the mouth held the most information for
the algorithm. What area of the face would you say holds the next greatest
amount of data, and what makes it useful for the algorithm?
2. How would you factor in the context of the images being analyzed by the
algorithm?

Boosting Image Retrieval


Boosting image retrieval created a new method for computing large
numbers of features. The new method still works with the AdaBoost
algorithm. You will gain more knowledge of how this applies to other work
areas. Consider the following questions:

1. The researchers used 46,000 features. How does this number affect the algorithm?
2. The researchers also selected a sample of a few images for their testing. How would
many more images affect the algorithm’s effectiveness?

Marquette University
Algorithms; Findings from Marquette University broaden understanding of
algorithms exposes you to another application of boosting, this time
language recognition. Consider the following questions:

1. How would the acoustic models affect the algorithmic analysis?


2. Using acoustic information, per the article, was more costly. With boosting, what
aspects of the acoustic information might the researchers have used, or what would
you recommend?

University of California Algorithms


In Algorithms; Findings from University of California broaden understanding
of algorithms, a new boosting algorithm was proposed: fast cascade
boosting (FCBoost). This provides insight into creating new algorithms.
Consider the following questions:

1. The researchers note the algorithm may be adjusted to accommodate cost-sensitive
circumstances. When would this be an issue for an organization?
2. The researchers note their new algorithm is able to determine the number of weak
learners needed for the algorithm to work. What factors would you include in this
analysis?

Reflective Question
You are working with the quality control working group. There has recently
been an issue with the quality of the whistles the organization
manufactures.
Please walk through the steps you would use with the boosting technique
(weak learners to strong learners, factors considered, etc.).

Module 5: Bagging Algorithm


The bagging algorithm also has its specific use case: increasing model
stability and accuracy. This section will focus on the background, use cases,
and peculiarities.
Bagging is also known as bootstrap aggregating. The bootstrap refers to
estimating a statistical quantity from a data sample. The technique
repeatedly draws samples from a data set with replacement; each bootstrap
sample is the same size as the original. Because sampling is with
replacement, the same data points may appear repeatedly while others may
not appear at all, and each point has the same probability of being drawn
for the next sample. Remember, the difference from boosting is that in
boosting the data points are weighted.
Bagging is intended to reduce variance and prevent the researcher from
overfitting the data. With more sample data sets, there will be less variance
due to the large number of samples. You may use this method for
classification and regression problems.
Many industry examples use this. One involves ozone: the relationship
between ozone and temperature is nonlinear. To measure it, you could draw
100 samples with replacement from the data. Each sample is different, yet
each resembles the distribution and variability of the original. Once the
fitted predictors are graphed, you could draw an averaged predictor line
through the data points.
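The ozone example can be sketched as follows: draw bootstrap samples with replacement, fit a trivially simple "predictor" to each (here just the sample mean, a stand-in for the real nonlinear fit), and average the predictors. All data values below are invented.

```python
import random

def bootstrap_sample(data, rng):
    """Draw a sample of the same size as data, with replacement."""
    return [rng.choice(data) for _ in data]

def bagged_mean(data, n_samples, seed=0):
    """Average a simple per-sample predictor (the mean)
    over many bootstrap samples."""
    rng = random.Random(seed)
    estimates = [sum(s) / len(s)
                 for s in (bootstrap_sample(data, rng)
                           for _ in range(n_samples))]
    return sum(estimates) / len(estimates)

# Hypothetical ozone readings.
ozone = [3.1, 4.2, 2.8, 5.0, 3.7, 4.4]
estimate = bagged_mean(ozone, n_samples=100)
print(round(estimate, 2))  # close to the plain mean of about 3.87
```

The averaging step is what smooths out the variance of any single sample; with a real nonlinear predictor in place of the mean, the averaged curve is the predictor line described above.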

Additional Articles About Bagging

A theoretical analysis of bagging as a linear combination of classifiers
provides another example of bagging, applied to a theoretical set of data.
Consider the following question:

1. The researchers note a trade-off in computational requirements regarding
memory size and CPU time. Given the advances of the last five years, is this
really still a concern for large organizations?

Bagging nonparametric forecasts with constraints (Lee, T., Tu, Y., & Ullah,
A., 2010) researched the use of economic information. This provides a new
area for you to consider applying the algorithms. Consider the following
questions:

1. The economic data used for the analysis is important, and the analysis is much
more important to the nation. What are two things researchers could do to reduce
the risk of error?
2. The article notes the benefits of using bagging. What issues could you see as having
held this back from more use in economics?

Extreme learning machine ensemble using bagging for facial expression
recognition explores using the bagging algorithm for facial expression
recognition. The article may be compared to the earlier article on a similar
topic, allowing you to compare the two articles and styles. Consider the
following questions:

1. The proposed method divides the picture into small cells for analysis. How small is
too small for bagging, or at what point would the square size be workable for
optimal results with the algorithm?
2. The researchers used two datasets for the images. How would the researchers
clearly show they did not pick the data sets so they would work best with their
algorithm?

Module 3: Milestone 1 Artificial Intelligence


Product Portfolio in the Healthcare Industry

This assignment is the second milestone in working toward your final
project for this class.

Assignment Description
You are the Chief Technology Officer for the Midwest Cardiology Institute,
the largest cardiology practice in the Midwest United States. With the rise of
heart disease in the American population, ever more patients with heart
health issues require monitoring. This significant increase in cardiology
patients requiring monitoring has stretched the institute’s resources.
You have already demonstrated to the executive partners at Midwest
Cardiology Institute that AI can be applied in AI-powered wellness wearables.
The executive partners are seeking your consultation to propose a workable
solution to resolve the issue with resources and demonstrate how AI can
increase the Midwest Cardiology Institute’s operations (growth) and profits
(revenue maximization).

Assignment Instructions
You will prepare a memorandum for the Cardiology Institute. The
memorandum needs to be 4 pages, double-spaced, in 12-point Times New
Roman font. You are required to include at least two references; you can
use external references in addition to those from our class in your reference
list. References must be formatted in NXU style at the end of the memo,
with appropriate in-text citations. You should include compelling
justifications throughout the memo. The memorandum should include the
following headers:
Create a plan that you can present to the partners as a memorandum. Your
memo should contain, at the least, the following six sections:

 Executive Summary
 Background
 Revenue Maximization. Assess the various profit maximization techniques
generally used in the healthcare industry.
 TensorFlow Application. Note: The TensorFlow analysis uses various models
which apply to this case.
 Image Recognition. Note: Apply neural networks and convolutional neural
networks. Cardiologists use various images of the heart in their diagnosis.
 Time Series. Note: Apply recurrent neural networks to monitor cases of heart
failure.
 Recommendations
