Notes Class 09 AI Project Cycle

The AI Project Cycle consists of five key elements: Problem Scoping, Data Acquisition, Data Exploration, Modeling, and Evaluation, which differ from traditional IT project cycles due to the learning aspect of AI systems. Each stage involves specific tasks such as identifying problems, acquiring relevant data, exploring data trends, building models, and continuously evaluating performance. The document emphasizes the importance of authentic data and iterative improvements throughout the project cycle.


AI Project Cycle

• Artificial Intelligence (AI) is one of the interdisciplinary branches of computer science. This technology, as we already know, deals with making computer systems “smarter”.
• Contrary to popular understanding, artificial intelligence is not a recent development. The technology was developed over a period spanning decades before it was commercialized.
• The AI project cycle is different from the normal IT project cycle. An important distinction of AI systems is that they learn; this means the AI project cycle becomes cyclic!
AI Project Cycle Elements

1. Problem Scoping
2. Data Acquisition
3. Data Exploration
4. Modeling
5. Evaluation
1. Problem Scoping
• We start with the first stage of the AI Project Cycle, that is, Problem Scoping. As we have understood, Problem Scoping means selecting a problem which we might want to solve using our AI knowledge.
• You can select a theme from those given below, but delve deep into the theme to find out topics where problems exist.
• After listing down the problems, go further down till the root cause of the problem is found. We can use the Ishikawa tool for this purpose.
1. Problem Scoping

Ishikawa (fishbone) or “5 whys” root-cause analysis is a tool intended to reveal the key relationships between the various underlying variables of an effect; the possible causes provide additional insight into process behaviour.
1. Problem Scoping
We will use another tool called the 4W Problem Canvas. This canvas helps us identify the four critical parameters we need to know for solving a problem.
The 4Ws refer to Who, What, Where and Why.
Who
The “Who” block helps us analyze the people who are getting affected, directly or indirectly, by the problem. Stakeholders are the people who face this problem and would benefit from the solution.
Who are the stakeholders? Are they the people of an organization?
What do you know about them and their needs?
1. Problem Scoping
What
Under the “What” section, you need to look into the actual problem you have on hand. At this stage, you need to determine the exact nature of the problem. What is the problem, and how do you know that it is really a problem for your stakeholders? Supporting evidence to show that this problem exists could be print media, TV news, government reports, etc.
What is the exact problem? Define it.
How do you know it is a problem? (Is there any evidence?) Show your sources.
1. Problem Scoping
Where
Now that you know who is affected by the problem and what the problem actually is, you need to focus on the context/location/frequency, etc. of the problem. This section will help you look into the problem with some specificity and move you closer to the solution.
Let us fill in the “Where” canvas!
What is the context/situation/frequency of the problem that the stakeholders experience?
Where is the problem exactly located?
1. Problem Scoping
Why
After we have figured out who is affected, what the exact nature of the problem is and where it occurs, we need to understand why this problem needs to be solved and what value the solution would bring. Think of the benefits of this solution to the stakeholders and to society at large.
Let us fill in the “Why” canvas!
Why will this solution be of use to the stakeholders?
How will the solution improve their condition?
Problem statement template with blanks to fill in the details according to your goal:
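As a small illustration of how the four blocks fit together, a filled-in canvas can also be captured as a simple Python structure. The problem (employee late-coming) and every answer below are hypothetical, chosen only to show the shape of a completed canvas.

# A hypothetical, filled-in 4Ws problem canvas (illustrative values only).
problem_canvas = {
    "Who":   ["Employees of the office", "HR department", "Team managers"],
    "What":  "Employees frequently arrive late, disrupting daily schedules "
             "(evidence: biometric attendance reports).",
    "Where": "At the office, mostly on Mondays and during the monsoon season.",
    "Why":   "Predicting late arrivals would help managers plan the day and "
             "help HR address the root causes.",
}

for block, details in problem_canvas.items():
    print(block, "->", details)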
2. Data Acquisition
• As we move ahead in the AI Project Cycle, we come across the second element, called Data Acquisition. Data can be a piece of information or facts collected together for reference or analysis. Whenever we want an AI project to be able to predict an output, we first need to train it using data, called the training data set.
• For example, suppose you want to make an AI-powered system which, based on historical data, can predict whether an employee will come late the next day or not. The past attendance records are the data with which the machine can be trained. Once it is ready, it will efficiently predict whether the employee will come late or not. The past late-coming records here are known as historical data or training data, while the data set used for the next late-coming prediction is known as the testing data.
2. Data Acquisition
For better efficiency of an AI project, the training data needs to be relevant and authentic. In the above example, if the training data had not been about late coming but about salary, the machine would not have predicted the next late-coming event correctly, since the training itself would have gone wrong. Similarly, if the previous attendance data was not authentic, that is, if it was not correct, then the prediction would also have gone wrong.
2. Data Acquisition

Data Features
• For any AI project to be efficient, the training data should be authentic and
relevant to the problem statement scoped.
2. Data Acquisition

Data Features
• We have now come to the stage of data acquisition, but how do we know what data to get based on the problem statement? We need to visualize the factors which affect the problem statement. For this, we need to extract the Data Features for the problem scoped. Now try to find out the parameters which affect the problem statement directly or indirectly and list them down.
• Look at your problem statement once again and try to find out the data features required to address the issue. A data feature refers to the type of data you want to collect. In our previous example, the data features would be the day, in-time, out-time, biometric machine attendance data, leave records, etc.
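One way to picture these data features is as columns of a table, with one row per employee per day. The column names below simply restate the features listed above, and every value is made up for illustration.

# Hypothetical rows illustrating the data features named above.
import pandas as pd

features = pd.DataFrame(
    {
        "day":        ["Mon", "Tue", "Wed"],
        "in_time":    ["09:05", "09:40", "09:02"],
        "out_time":   ["17:30", "17:45", "17:10"],
        "attendance": ["present", "present", "present"],   # biometric record
        "on_leave":   [False, False, False],                # leave record
        "came_late":  [False, True, False],                 # value to predict
    }
)
print(features)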
3. Data Exploration

• Why do you think we need to explore and visualize data before jumping to the AI model? When we pick up a library book, we tend to look at the book cover, read the back cover and skim through the contents of the book prior to choosing it, as this helps us understand if the book is appropriate for our needs and interests.
3. Data Exploration

• Similarly, when we get a data set in our hands, spending time exploring it will help us get a sense of the trends, relationships and patterns present in the data. It will also help us better decide which model or models to use in the subsequent AI Project Cycle stage. We use data visualization as a method because it makes it much easier to comprehend information quickly and communicate the story to others.
3. Data Exploration

In the previous modules you set the goal of your project and also found ways to acquire data. While acquiring data, you must have noticed that the data consists of numbers, words, media, etc. For example, if you go to a library and pick up a random book, you first go through its contents quickly by turning the pages and reading the description before borrowing it, because this helps you understand whether the book is appropriate for your needs and interests.
3. Data Exploration

To analyze the data, you need to visualize it in some user-friendly format so that you can:
• Quickly get a sense of the trends, relationships and patterns contained within the data.
• Define a strategy for which model to use at a later stage.
• Communicate the same to others effectively.
To visualize data, we can use various types of visual representation (like charts and graphs).
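As a small sketch of such a visual representation, the chart below plots made-up counts of late arrivals per weekday using matplotlib; a real project would, of course, plot its own acquired data.

# Visualization sketch: late arrivals per weekday (made-up numbers).
import matplotlib.pyplot as plt

days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
late_counts = [12, 7, 5, 6, 14]   # hypothetical counts

plt.bar(days, late_counts)
plt.xlabel("Day of the week")
plt.ylabel("Number of late arrivals")
plt.title("Exploring the trend of late arrivals")
plt.show()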
4. Modeling

To build an AI-based project, you need to work with models or algorithms that predict the output for a given set of inputs. This can be done either by designing your own model or by using pre-existing AI models. Before jumping to modeling, let us clarify the definitions of Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL).
4. Modeling

Artificial Intelligence or AI refers to any technique that enables computers to mimic human intelligence. An artificially intelligent machine works on algorithms and data fed into it and gives the desired output.
4. Modeling

Machine Learning or ML enables machines to improve upon tasks with experience. The machine here learns from the new data fed into it while testing and uses it for the next iteration. It also takes into account the times when it went wrong and considers the exceptions too.
4. Modeling

Deep Learning or DL enables the software to train itself to perform tasks with a vast amount of data. Since the system has a huge set of data, it is able to train itself with the help of multiple machine learning algorithms working together to perform a specific task.
4. Modeling

As you can see in the diagram, Artificial Intelligence is the umbrella term which covers both Machine Learning and Deep Learning; Deep Learning comes under Machine Learning. It is a funnel-type approach: there are a lot of applications of AI, out of which a few come under ML, and out of those, very few go into DL.
4. Modeling

[Diagram: AI Models branch into a Rule-based approach and a Learning-based approach; the Learning-based approach includes Machine Learning and Deep Learning.]

The Rule-based approach refers to AI modeling where the relationships or patterns in the data are pre-defined. The algorithm just follows the rules or instructions mentioned and performs its tasks accordingly. For example, suppose you have a data set comprising 1,000 images of onions and as many of carrots.
4. Modeling

To train your machine, you feed this data into the machine and label each image as either an onion or a carrot. This is your training data. Now, if you test the machine with the image of an onion (testing data), it will compare the image with the training data and, as per the rules, identify the test image as an onion. This is known as the Rule-based approach.
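A rule-based model can be sketched as nothing more than hand-written conditions. The toy example below does not process real images; it assumes each image has already been reduced to an average colour (a hypothetical pre-processing step), and the thresholds are invented purely to show the idea of fixed, pre-defined rules.

# Toy rule-based classifier. The "average colour" inputs and the thresholds
# are hypothetical; the point is that the rules are written by hand.
def classify_vegetable(avg_red, avg_green, avg_blue):
    # Hand-written rule: carrots tend towards strong orange tones,
    # onions towards white/purple tones.
    if avg_red > 180 and avg_green > 80 and avg_blue < 80:
        return "carrot"
    return "onion"

print(classify_vegetable(210, 120, 40))    # -> carrot
print(classify_vegetable(200, 180, 190))   # -> onion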
4. Modeling

The Learning-based approach refers to AI modeling where the underlying relationships or patterns in the data are not defined. In this approach, random data is fed into the machine and the algorithm needs to derive a relationship from the data. Generally, this approach is followed when the data is unlabeled or too random for a human to make sense of it. The machine analyzes the data, tries to extract similar features out of it and clusters similar data points together. In the end, as output, the machine gives us the broad trends observed in the data sets.
4. Modeling

For example, imagine you have a data set of 10,000 images of people in your city and you have to understand which of them are sick. Obviously, you would not have any ready reference for how to identify, from a picture alone, that someone is not well. You might use their facial expressions and other cues to cluster them into groups and try to understand what attributes in the images define a sick person.
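Clustering is one common learning-based technique for this kind of unlabeled data. The sketch below groups made-up numeric summaries of images (again, a hypothetical pre-processing step) into two clusters with scikit-learn's KMeans; it is only an illustration of the idea, not a prescribed method.

# Learning-based sketch: clustering unlabeled data points into groups.
# Each row is a hypothetical numeric summary of one image
# (say, a "tiredness score" and a "paleness score" - both made up).
import numpy as np
from sklearn.cluster import KMeans

image_summaries = np.array(
    [[0.90, 0.80], [0.85, 0.90], [0.10, 0.20],
     [0.20, 0.15], [0.88, 0.75], [0.15, 0.10]]
)

# Ask the algorithm to find 2 groups on its own - no labels are given.
model = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = model.fit_predict(image_summaries)
print(labels)   # e.g. [1 1 0 0 1 0]: which cluster each image fell into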
5. Evaluation

After you have completed all the stages of the AI Project Cycle, it is very important to keep evaluating your model to ensure that it works well with new data. If there are variances between the results on the training data set and the test data set, keep iterating on the model to improve it.
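A minimal evaluation sketch is shown below: train a simple model, then compare its accuracy on the training set with its accuracy on the held-back testing set. The tiny data set and the choice of a decision tree are assumptions made only for illustration; a large gap between the two accuracies is the signal to keep iterating.

# Evaluation sketch: compare training accuracy with testing accuracy.
# The data set below is tiny and made up purely for illustration.
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X = [[0, 9.0], [1, 9.5], [2, 9.1], [3, 9.8],
     [4, 9.0], [5, 9.6], [6, 9.2], [7, 9.7]]   # [day index, arrival hour]
y = [0, 1, 0, 1, 0, 1, 0, 1]                   # 1 = came late, 0 = on time

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = DecisionTreeClassifier(random_state=0)
model.fit(X_train, y_train)

train_acc = accuracy_score(y_train, model.predict(X_train))
test_acc = accuracy_score(y_test, model.predict(X_test))
print(f"Training accuracy: {train_acc:.2f}")
print(f"Testing accuracy:  {test_acc:.2f}")
# If training accuracy is much higher than testing accuracy, keep iterating.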
