
See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/361525297

Artificial Intelligence & Machine Learning Unit 6: Applications Question bank and its solution

Presentation · June 2022
DOI: 10.13140/RG.2.2.35031.96162

Author: Abhishek D. Patange, College of Engineering, Pune


All content following this page was uploaded by Abhishek D. Patange on 25 June 2022.



Artificial Intelligence & Machine Learning
Course Code: 302049

Unit 6: Applications
Third Year Bachelor of Engineering (Choice Based Credit System)
Mechanical Engineering (2019 Course)
Board of Studies – Mechanical and Automobile Engineering, SPPU, Pune
(With Effect from Academic Year 2021-22)

Question bank and its solution


by

Abhishek D. Patange, Ph.D.


Department of Mechanical Engineering
College of Engineering Pune (COEP)
QUESTION BANK FOR UNIT 6: APPLICATIONS

Unit 6: Applications
Syllabus:
Content (theory):
Human Machine Interaction
Predictive Maintenance and Health Management
Fault Detection
Dynamic System Order Reduction
Image based part classification
Process Optimization
Material Inspection
Tuning of control algorithms

Type of question and marks:


Type: Theory
Marks: 2, 4 or 6

Dr. Abhishek D. Patange, Mechanical Engineering, College of Engineering Pune (COEP)



Topic: Human Machine Interaction

1. What is human-machine interaction?

 Human-machine interaction (HMI) is all about how people and automated systems interact and communicate with each other. That has long ceased to be confined to traditional machines in industry and now also covers computers, digital systems and devices for the Internet of Things (IoT).
 More and more devices are connected and carry out tasks automatically. Operating all of these machines, systems and devices needs to be intuitive and must not place excessive demands on users.
 HMI now plays a major role in industry and everyday life. A user interface that is as intuitive as possible is therefore needed to enable smooth operation of these machines, and that interface can take very different forms.

2. How does human-machine interaction work?

 Smooth communication between people and machines requires interfaces: The place
where or action by which a user engages with the machine.
 Simple examples are light switches or the pedals and steering wheel in a car: An action is
triggered when you flick a switch, turn the steering wheel or step on a pedal.
 However, a system can also be controlled by text being keyed in, a mouse, touch screens,
voice or gestures.
 The devices are either controlled directly: Users touch the smartphone’s screen or issue a
verbal command. Or the systems automatically identify what people want: Traffic lights
change color on their own when a vehicle drives over the inductive loop in the road’s
surface.
 Other technologies are not so much there to control devices as to complement our sensory organs. One example is virtual reality glasses.
 There are also digital assistants: Chatbots, for instance, reply automatically to requests
from customers and keep on learning.
 User interfaces in HMI are the places where or actions by which the user engages with
the machine.
 A system can be operated by means of buttons, a mouse, touch screens, voice or
gesture, for instance.


 One simple example is a light switch – the interface between the machine "light" and a human being.
 It is also possible to differentiate further between direct control, such as tapping a touch
screen, and automatic control.
 In the latter case, the system itself identifies what people want.
 Think of traffic lights which change color as soon as a vehicle drives over the inductive
loop in the road’s surface.

3. What human-machine systems are there?

 For a long time, machines were mainly controlled by switches, levers, steering wheels or
buttons; these were joined later by the keyboard and mouse.
 Now we are in the age of the touch screen. Body sensors in wearables that automatically
collect data are also modern interfaces.
 Voice control is also making rapid advances: Users can already control digital assistants,
such as Amazon Alexa or Google Assistant, by voice.
 That entails far less effort. Chatbots are also used in such systems and their ability to
communicate with people is improving more and more thanks to artificial intelligence.

4. What trends are there in human-machine interaction?

 Gesture control is at least as intuitive as voice control. That means robovacs, for example,
could be stopped by a simple hand signal in the future.
 Google and Infineon have already developed a new type of gesture control by the name of "Soli": with the aid of radar technology, devices can also be operated in the dark or remotely.
 Technologies that augment reality now also act as an interface. Virtual reality glasses
immerse users in an artificially created 3D world, while augmented reality glasses
superimpose virtual elements in the real environment.
 Mixed reality glasses combine both technologies, thus enabling scenarios to be
presented realistically thanks to their high resolution.

5. What opportunities and challenges arise from human-machine interaction?

 Modern HMI helps people to use even very complex systems with ease. Machines also
keep on getting better at interpreting signals – and that is important in particular in
autonomous driving.
 Human needs are identified even more accurately, which means robots can be used in
caring for people, for instance. One potential risk is the fact that hackers might obtain
information on users via the machines’ sensors.


 Last but not least, security is vital in human-machine interaction. Some critics also fear
that self-learning machines may become a risk by taking actions autonomously.
 It is also necessary to clarify the question of who is liable for accidents caused by HMI.

6. Where is human-machine interaction headed?

 Whether voice and gesture control or virtual, augmented and mixed reality, HMI
interaction is far from reaching the end of the line.
 In future, data from different sensors will also increasingly be combined to capture and
control complex processes optimally.
 The human senses will be replicated better and better with the aid of, for example, gas
sensors, 3D cameras and pressure sensors, thus expanding the devices’ capabilities.
 In contrast, there will be fewer of the input devices that are customary at present, such as
remote controllers.

7. What are the opportunities and challenges of HMI?

 Even complex systems will become easier to use thanks to modern human-machine
interaction. To enable that, machines will adapt more and more to human habits and
needs. Virtual reality, augmented reality and mixed reality will also allow them to be
controlled remotely. As a result, humans expand their realm of experience and field of
action.
 Machines will also keep on getting better at interpreting signals in future – and that's also necessary: a fully autonomous car must respond correctly to hand signals from a police officer at an intersection. Robots used in care must likewise be able to "assess" the needs of people who are unable to express these themselves.
 The more complex the contribution made by machines is, the more important it is to
have efficient communication between them and users. Does the technology also
understand the command as it was meant? If not, there’s the risk of misunderstandings –
and the system won’t work as it should. The upshot: A machine produces parts that don’t
fit, for example, or the connected car strays off the road.
 People, with their abilities and limitations, must always be taken into account in the
development of interfaces and sensors. Operating a machine must not be overly complex
or require too much familiarization. Smooth communication between human and
machine also needs the shortest possible response time between command and action,
otherwise users won’t perceive the interaction as being natural.
 One potential risk arises from the fact that machines are highly dependent on sensors to be controlled or to respond automatically. If hackers gain access to the data, they obtain details of the user's actions and interests. Some critics also fear that learning machines might act autonomously and subjugate people. One question that has also not been clarified so far is who is liable for accidents caused by errors in human-machine interaction, and who is responsible for them.
Reference: https://www.infineon.com/cms/en/discoveries/human-machine-interaction/

Topic: Fault Detection / Predictive Maintenance / Health Management

8. List the types of maintenance and explain each in brief. Discuss the scope of AI/ML.

Predictive maintenance: Predictive maintenance is used to:

 identify anomalies in the process, which helps in preventive maintenance;
 estimate the demand for products, raw materials, etc., based on historical data and the current scenario;
 forecast possible outcomes based on data obtained from the process.
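As a concrete illustration of the anomaly-identification step, here is a minimal sketch in plain Python. The sensor values, the 2-sigma threshold, and the function name `detect_anomalies` are hypothetical choices for illustration; real predictive-maintenance systems use far richer models than a z-score rule.

```python
from statistics import mean, stdev

def detect_anomalies(readings, k=2.0):
    """Flag readings more than k standard deviations from the mean
    (a simple z-score rule, often used as a first-pass anomaly check)."""
    mu, sigma = mean(readings), stdev(readings)
    return [x for x in readings if abs(x - mu) > k * sigma]

# Hypothetical vibration amplitudes from a machine, with one spike
vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49, 2.40]
print(detect_anomalies(vibration))  # only the 2.40 spike is flagged
```

Flagged anomalies like this would feed the preventive-maintenance decision described above.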
Prescriptive maintenance: Prescriptive maintenance is used to identify ways in which an industrial process can be improved. While predictive maintenance tells you when a component/asset could fail, prescriptive analytics tells you what action to take to avoid the failure. You can therefore use the results of prescriptive analysis to plan the maintenance schedule, review your suppliers, etc. Prescriptive maintenance also helps you manage complex problems in the production process using relevant information.
Descriptive maintenance: The core purpose of descriptive maintenance is to describe the problem by diagnosing the symptoms. This method also helps discover trends and patterns in historical data. The results of descriptive maintenance are usually shown in the form of charts and graphs. These data visualization tools make it easy for all stakeholders, even non-technical ones, to understand the problems in the manufacturing process.
Diagnostic maintenance: Diagnostic maintenance is also referred to as root cause analysis. While descriptive maintenance can tell you what happened based on historical data, diagnostic maintenance tells you why it happened. Data mining, data discovery, correlation, and drill-down and drill-through methods are used in diagnostic analytics. Diagnostic maintenance can be used to identify the cause of an equipment malfunction or the reason for a drop in product quality.

9. Explain fault diagnosis (of any suitable machine element) using ML.

Refer to the following articles and explain the procedure they adopted.
 Sakthivel, N. R., Sugumaran, V., & Babudevasenapati, S. (2010). Vibration based fault
diagnosis of monoblock centrifugal pump using decision tree. Expert Systems with
Applications, 37(6), 4040-4049.
https://www.sciencedirect.com/science/article/pii/S0957417409008689

 Sugumaran, V., Muralidharan, V., & Ramachandran, K. I. (2007). Feature selection using decision tree and classification through proximal support vector machine for fault diagnostics of roller bearing. Mechanical Systems and Signal Processing, 21(2), 930-942.
https://www.sciencedirect.com/science/article/pii/S0888327006001142
 Patange, A. D., & Jegadeeshwaran, R. (2021). A machine learning approach for
vibration-based multipoint tool insert health prediction on vertical machining centre
(VMC). Measurement, 173, 108649.
https://www.sciencedirect.com/science/article/pii/S0263224120311659
 Patange, A. D., & Jegadeeshwaran, R. (2020). Application of bayesian family classifiers
for cutting tool inserts health monitoring on CNC milling. International Journal of
Prognostics and Health Management, 11(2).
http://papers.phmsociety.org/index.php/ijphm/article/view/2929
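The procedure in these papers follows a common pattern: extract statistical features from vibration signals, then train a classifier (decision trees, SVMs, Bayesian classifiers) on those features. A minimal sketch of the feature-extraction step is below; the signal values are made up for illustration, and the feature set here is far smaller than what the papers actually use.

```python
from statistics import mean, stdev

def extract_features(signal):
    """Simple statistical features commonly used in vibration-based
    fault diagnosis (the cited papers use a richer feature set)."""
    return {
        "mean": mean(signal),
        "std": stdev(signal),
        "rms": (sum(x * x for x in signal) / len(signal)) ** 0.5,
        "peak": max(abs(x) for x in signal),
    }

# Hypothetical vibration snippets: a healthy and a faulty condition
healthy = [0.02, -0.01, 0.03, -0.02, 0.01, -0.03]
faulty = [0.30, -0.25, 0.40, -0.35, 0.28, -0.45]

# A faulty component typically shows higher RMS and peak amplitude;
# feature vectors like these are what gets fed to the classifier.
print(extract_features(healthy)["rms"] < extract_features(faulty)["rms"])
```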

Topic: Image based part classification

10. Explain an intelligent approach for the classification of Nuts, Bolts, Washers and Locating Pins.

An intelligent approach to classify Nuts, Bolts, Washers and Locating Pins (our equivalent of the classic Cats vs. Dogs problem) is explained here.

Bolt or Nut or Locating Pin or Washer? Will the AI be able to tell?


So how does it work? An algorithm is able to classify images efficiently by using a Machine Learning method called a Convolutional Neural Network (CNN), which is used in Deep Learning. We will be using a simple version of this model, called Sequential, to let our model distinguish the images into four classes: Nuts, Bolts, Washers and Locating Pins. The model will learn by "observing" a set of training images. After learning, we will see how accurately it can predict what an image it has not seen is.


A flowchart of a Machine Learning algorithm trained on Images of Nuts and Bolts using a
Neural Network Model.
Dataset
We downloaded 238 parts for each of the 4 classes (238 x 4 = 952 in total) from various part libraries available on the internet. Then we took 8 different isometric images of each part. This was done to augment the available data, as only 238 images per class would not be enough to train a good neural network. A single class now has 1904 images (8 isometric images of 238 parts), for a total of 7616 images. Each image is 224 x 224 pixels.

Images of the 4 classes. One part has 8 images, and each image is treated as a single data point. We then have our labels with the numbers 0, 1, 2, 3; each number corresponds to a particular image and indicates which class it belongs to:
#Integers and their corresponding classes
{0: 'locatingpin', 1: 'washer', 2: 'bolt', 3: 'nut'}
After training on the above images we will see how well our model predicts a random image it has not seen.


Methodology
The process took place in 7 steps. We will get to the details later; in brief:
1. Data Collection: The data for each class was collected from various standard part libraries on the internet.
2. Data Preparation: 8 isometric-view screenshots were taken of each part and reduced to 224 x 224 pixels.
3. Model Selection: A Sequential CNN model was selected as it is simple and good for image classification.
4. Train the Model: The model was trained on our data of 7616 images with an 80/20 train-test split.
5. Evaluate the Model: The results of the model were evaluated. How well did it predict the classes?
6. Hyperparameter Tuning: This is done to tune the hyperparameters for better results. We have already tuned our model in this case.
7. Make Predictions: Check how well it predicts real-world data.
Data Collection
We downloaded the part data of various nuts and bolts from the different part libraries on
the internet. These websites have numerous 3D models for standard parts from various
makers in different file formats. Since we will be using FreeCAD API to extract the images we
downloaded the files in neutral format (STEP).

Flowchart of the CAD model download


As already mentioned earlier, 238 parts from each of the 4 classes were downloaded, for a total of 952 parts.


Data Preparation
Then we ran a program using FreeCAD API that automatically took 8 isometric screenshots
of 224 x 224 pixels of each part. FreeCAD is a free and open-source general-purpose
parametric 3D computer-aided design modeler which is written in Python.

A flowchart of how the data was created


As already mentioned above, each part yields 8 images of 224 x 224 pixels. So we now have a total of 1904 images from each of the 4 classes, thus 7616 images in total. Each image is treated as a separate data point even though 8 images come from the same part.

8 isometric images of 2 bolts. Each row represents a different part.


The images were kept in separate folders according to their class, i.e. we have four folders: Nut, Bolt, Washer and Locating Pin.
Next, each of these images was converted into an array of its pixel values in grayscale. Pixel values range from 0 (black) to 255 (white), so there are 256 shades of gray.


Example of an Image converted to an array of pixel values. (Source: openframeworks.cc)


Now each of our images becomes a 224 x 224 array, so our entire dataset is a 3D array of dimensions 7616 x 224 x 224:
7616 (number of images) x 224 x 224 (pixel values of each image)
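The array bookkeeping described above can be sketched with NumPy. The random images below are placeholders for the real screenshots (and only 10 of them instead of all 7616); the shapes are the point here.

```python
import numpy as np

# Stand-in for the real image loader: each "image" is a 224 x 224
# grayscale array with pixel values in the range 0-255.
images = [np.random.randint(0, 256, size=(224, 224), dtype=np.uint8)
          for _ in range(10)]            # 10 images instead of all 7616

X = np.stack(images)                     # shape: (n_images, 224, 224)
print(X.shape)                           # (10, 224, 224)

# Scaling pixel values to 0-1 is a common preprocessing step for CNNs
X_scaled = X.astype(np.float32) / 255.0
```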

Visualization of our pixel array using matplotlib


Similarly, we create the label dataset by assigning one of the following integers to the corresponding index in the dataset. If the 5th (index) entry in the dataset (X) is a locating pin, the 5th entry in the label set (Y) will have the value 0.
#integers and the corresponding classes as already mentioned above
{0: 'locatingpin', 1: 'washer', 2: 'bolt', 3: 'nut'}
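In code, the mapping and label set might look like this. The y values below are made-up examples; only index 5 matters for the locating-pin illustration.

```python
# Integer-to-class mapping from the text
classes = {0: 'locatingpin', 1: 'washer', 2: 'bolt', 3: 'nut'}

# Hypothetical label set: y[i] is the class index of the i-th image in X
y = [3, 1, 2, 0, 2, 0, 1]

# The 5th (index) data point is a locating pin, so y[5] is 0
print(classes[y[5]])   # locatingpin
```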
Model Selection
Since this is an image recognition problem we will be using a Convolutional Neural Network (CNN). A CNN is a type of Neural Network that handles image data especially well. A Neural Network is a type of Machine Learning algorithm that learns in a manner loosely inspired by the human brain.


A basic neural network.

A Convolutional Neural Network: a basic visualization of how our algorithm will work.
The following summary shows what our CNN looks like. Don't worry if you don't understand it. The idea is that the 224 x 224 features from each data point go through this network, which spits out an answer. The model adjusts its weights accordingly and after many iterations will be able to predict a random image's class.
#Model description
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_1 (Conv2D)            (None, 222, 222, 128)     1280
_________________________________________________________________
activation_1 (Activation)    (None, 222, 222, 128)     0
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 111, 111, 128)     0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 109, 109, 128)     147584
_________________________________________________________________
activation_2 (Activation)    (None, 109, 109, 128)     0
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 54, 54, 128)       0
_________________________________________________________________
flatten_1 (Flatten)          (None, 373248)            0
_________________________________________________________________
dense_1 (Dense)              (None, 64)                23887936
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 260
_________________________________________________________________
activation_3 (Activation)    (None, 4)                 0
=================================================================
Total params: 24,037,060
Trainable params: 24,037,060
Non-trainable params: 0
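The parameter counts in the summary above can be checked by hand. Assuming 3 x 3 kernels with stride 1 and no padding, and 2 x 2 max-pooling (inferred from the 224 to 222 to 111 to 109 to 54 shape progression), a short sketch reproduces every number:

```python
def conv_params(kernel, in_channels, filters):
    """Weights (kernel*kernel*in_channels per filter) plus one bias per filter."""
    return (kernel * kernel * in_channels + 1) * filters

def dense_params(n_in, n_out):
    """A weight per input-output pair plus one bias per output."""
    return (n_in + 1) * n_out

p_conv1 = conv_params(3, 1, 128)      # grayscale input  -> 1280
p_conv2 = conv_params(3, 128, 128)    # 128-channel maps -> 147584
n_flat = 54 * 54 * 128                # flatten_1        -> 373248 features
p_dense1 = dense_params(n_flat, 64)   #                  -> 23887936
p_dense2 = dense_params(64, 4)        # 4 output classes -> 260

total = p_conv1 + p_conv2 + p_dense1 + p_dense2
print(total)                          # 24037060, matching "Total params"
```

Note that almost all of the parameters live in dense_1, which is typical when a large feature map is flattened directly into a dense layer.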
Model Training
Now the time has finally come to train the model using our dataset of 7616 images. Our [X] is a 3D array of 7616 x 224 x 224 and our [y] label set is a 7616 x 1 array. For training purposes the data must be split into at least two parts: a training set and a validation (test) set ("test" and "validation" are used interchangeably when only two sets are involved).

Data being split into training and test set.


The training set is the data the model sees and trains on; it is the data from which it adjusts its weights and learns. The accuracy of our model on this set is the training accuracy, which is generally higher than the validation accuracy.


The validation data usually comes from the same distribution as the training set and is data the model has not seen. After the model has trained on the training set, it will try to predict the data in the validation set. How accurately it predicts this is our validation accuracy. This is more important than the training accuracy, as it shows how well the model generalizes. In real-life applications it is common to split the data into three parts: train, validation and test. For our case we will only split it into a training and a test set, using an 80-20 split: 80% of the images will be used for training and 20% for testing. That is, we train on 6092 samples and test on 1524 samples out of the total 7616.
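A minimal sketch of such a split, assuming a shuffled index split (the function name and seed are illustrative choices; library helpers such as scikit-learn's train_test_split do the same arithmetic):

```python
import random

def split_indices(n_samples, train_frac=0.8, seed=42):
    """Shuffle sample indices and cut them into train/test index lists."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    cut = int(train_frac * n_samples)
    return idx[:cut], idx[cut:]

train_idx, test_idx = split_indices(7616)
print(len(train_idx), len(test_idx))   # 6092 1524
```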

Visualization of the model training.


For our model we trained for 15 epochs with a batch size of 64.
The number of epochs is a hyperparameter that defines the number of times the learning algorithm will work through the entire training dataset.
One epoch means that each sample in the training dataset has had an opportunity to update the internal model parameters. An epoch is comprised of one or more batches.
You can think of a for-loop over the number of epochs where each loop proceeds over the training dataset. Within this for-loop is another nested for-loop that iterates over each batch of samples, where one batch has the specified "batch size" number of samples. [2]
That is, our model will go through the entire 7616 samples 15 times (epochs) in total and adjust its weights each time so that the prediction becomes more accurate. In each epoch, it will go through the 7616 samples 64 at a time (the batch size).
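The nested loops described above, in sketch form (only the update counting is real here; an actual training step would sit inside the inner loop):

```python
import math

n_samples, batch_size, epochs = 7616, 64, 15
batches_per_epoch = math.ceil(n_samples / batch_size)   # 7616/64 = 119

updates = 0
for epoch in range(epochs):                 # outer loop: 15 epochs
    for batch in range(batches_per_epoch):  # inner loop: one batch at a time
        # here the model would see 64 samples and perform one weight update
        updates += 1

print(batches_per_epoch, updates)   # 119 batches per epoch, 1785 updates total
```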
Evaluate the model
The model keeps updating its weights so as to minimize the cost (loss), thus giving us the best accuracy. Cost is a measure of the inaccuracy of the model in predicting the class of an image. Cost functions are used to estimate how badly a model is performing. Put simply, a cost function is a measure of how wrong the model is in terms of its ability to estimate the relationship between X and y. [1]
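For a classification problem like ours, a typical choice of cost is cross-entropy (an assumption; the article does not name its loss function). Its behaviour matches the description: a confident wrong prediction costs far more than a confident correct one.

```python
import math

def cross_entropy(p_true_class):
    """Cross-entropy cost for one prediction: -log of the probability
    the model assigned to the true class."""
    return -math.log(p_true_class)

print(round(cross_entropy(0.97), 3))   # correct and confident: small cost
print(round(cross_entropy(0.05), 3))   # wrong and confident: large cost
```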


If the algorithm predicts incorrectly the cost increases; if it predicts correctly, the cost decreases.
After training for 15 epochs we can see the following graphs of loss and accuracy. (Cost and loss can be used interchangeably for our case.)

Graph generated from matplotlib showing training and validation loss for our model
The loss decreased as the model trained more; it becomes better at classifying the images with each epoch. The model is not able to improve its performance much on the validation set, however.

Graph generated from matplotlib showing training and validation accuracy for our model
The accuracy increased as the model trained each epoch; it becomes better at classifying the images. The accuracy for the validation set is lower than for the training set, as the model has not trained on it directly. The final value is 97.64%, which is not bad.
Hyperparameter Tuning
The next step would be to change the hyperparameters (the learning rate, number of epochs, data size, etc.) to improve our model. In machine learning, a hyperparameter is a parameter whose value is used to control the learning process; by contrast, the values of other parameters (typically node weights) are derived via training. [3] For our purposes we had already tuned these parameters before this article was written, so as to obtain optimum performance for display here: we increased the dataset size and the number of epochs to improve the accuracy.

Hyperparameters affect parameters and eventually the final score (accuracy)


Make Predictions
The final step, after making adjustments to the model, is to make predictions using the actual data this model will be used on. If the model does not perform well on this, further hyperparameter tuning can commence.
Machine Learning is a rather iterative and empirical process, and the tuning of hyperparameters is thus often compared to an art rather than a science: although we have an idea of what will happen when we change certain hyperparameters, we cannot be certain of it.

The machine learning algorithm flowchart


Applications
This ability to classify mechanical parts could enable us to recommend parts from a standard
library based only on an image or a CAD model provided by the customer. Currently to
search for a required part from a standard library, you have to go through a catalogue and be able to tell which part you want based on the available options and your knowledge of the catalogue. There are serial codes to remember, as a change in a single digit or letter might mean a different type of part.

Example of a part number


If an image can be used to get the required part from the standard library, all we will need to do is make a rough CAD model of it and send it through our algorithm. The algorithm will decide which parts fit best and help narrow down our search significantly.

Visualisation of how the recommendation algorithm would work


If the classification method becomes detailed and fine-tuned enough, it should be able to classify in much detail what type of part you want. The narrowed search saves a lot of time; this is especially useful in a library containing thousands of similar parts.

Topic: Material inspection

11. Explain the scope of ML for materials science.

Reference: Five High-Impact Research Areas in Machine Learning for Materials Science by Bryce Meredig. https://pubs.acs.org/doi/10.1021/acs.chemmater.9b04078
Over the past several years, the field of materials informatics has grown dramatically. (1)
Applications of machine learning (ML) and artificial intelligence (AI) to materials science are
now commonplace. As materials informatics has matured from a niche area of research into
an established discipline, distinct frontiers of this discipline have come into focus, and best

Dr. Abhishek D. Patange, Mechanical Engineering, College of Engineering Pune (COEP)


QUESTION BANK FOR UNIT 6: APPLICATIONS

practices for applying ML to materials are emerging. (2) The purpose of this editorial is to
outline five broad categories of research that, in my view, represent particularly high-impact
opportunities in materials informatics today:
 Validation by experiment or physics-based simulation. One of the most common
applications of ML in materials science involves training models to predict materials
properties, typically with the goal of discovering new materials. With the availability of
user-friendly, open-source ML packages such as scikit-learn, (3) keras, (4) and pytorch, (5)
the process of training a model on a materials data set—which requires only a few lines
of python code—has become completely commoditized. Thus, standard practice in
designing materials with ML should include some form of validation, ideally by
experiment (6−8) or, in some cases, by physics-based simulation. (9,10) Of particular
interest are cases in which researchers use ML to identify materials whose properties are
superior to those of any material in the initial training set; (11) such extraordinary results
remain scarce.
 ML approaches tailored for materials data and applications. This category
encapsulates a diverse set of method development activities that make ML more
applicable to and effective for a wider range of materials problems. Materials science as a
field is characterized by small, sparse, noisy, multiscale, and heterogeneous
multidimensional (e.g., a blend of scalar property estimates, curves, images, time series,
etc.) data sets. At the same time, we are often interested in exploring very large, high-
dimensional chemistry and processing design spaces. Some method development
examples to address these challenges include new approaches for uncertainty
quantification (UQ), (12) extrapolation detection, (13) multiproperty optimization, (14)
descriptor development (i.e., the design of new materials representations for ML),
(15−17) materials-specific cross-validation, (18,19) ML-oriented data standards, (20,21)
and generative models for materials design. (22)
 High-throughput data acquisition capabilities. ML is notoriously data-hungry. Given
the typically very high cost of acquiring materials data, both in terms of time and money,
the materials informatics field is well-served by research that accelerates and
democratizes our ability to synthesize, characterize, and simulate materials. Examples
include high-throughput density functional theory calculations of materials properties,
(23−25) applications of robotics, automation, and operations research to materials
science, (26−30) and natural language processing (NLP) to extract materials data from
text corpora. (31,32)
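The NLP idea can be caricatured in a few lines: a regular expression that pulls (material, value) pairs for one property out of free text. The corpus and pattern below are toy assumptions; the cited systems use far more robust pipelines:

```python
# Minimal sketch of text mining for materials data: extract band-gap
# values from sentences with a regular expression.
import re

corpus = (
    "The measured band gap of GaN is 3.4 eV. "
    "In contrast, the band gap of Si is 1.1 eV."
)

pattern = re.compile(r"band gap of (\w+) is ([\d.]+) eV")
records = [(m.group(1), float(m.group(2))) for m in pattern.finditer(corpus)]
print(records)  # [('GaN', 3.4), ('Si', 1.1)]
```

Real literature-mining pipelines must also handle unit conversion, negation, table parsing, and entity disambiguation, which is what makes the cited work nontrivial.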
• ML that makes us better scientists. A popular refrain in the materials informatics
community is that "ML will not replace scientists, but scientists who use ML will replace
those who do not." This bon mot suggests that ML has the potential to make scientists
more effective and enable them to do more interesting and impactful work. We are still
in the nascent stages of creating true ML-based copilots for scientists, but research areas
such as ML model explainability and interpretability (33,34) represent a valuable early
step. Another example is the application of ML to accelerate or simplify materials
characterization. Researchers have used deep learning to efficiently post-process and
understand images generated via existing characterization methods such as scanning
transmission electron microscopy (STEM) (35) and position averaged convergent beam
electron diffraction (PACBED). (36)
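As a deliberately simple stand-in for the model-explainability work mentioned above, permutation importance (available in scikit-learn) scores each descriptor by how much shuffling it degrades the model. The data here are synthetic, with only the first feature informative by construction:

```python
# Permutation importance sketch: shuffle one feature at a time and
# measure how much the model's score drops.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(2)
X = rng.random((300, 3))
y = 5.0 * X[:, 0] + 0.1 * rng.standard_normal(300)  # only feature 0 matters

model = RandomForestRegressor(n_estimators=100, random_state=2).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=2)
print(result.importances_mean)  # feature 0 should dominate
```

For a scientist, the payoff is a sanity check: if the "important" descriptors contradict known physics, the model deserves scrutiny before its predictions do.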
• Integration of physics within ML, and ML with physics-based simulations. The
paucity of data in many materials applications is a strong motivator for formally
integrating known physics into ML models. One approach to embedding physics within
ML is to develop methods that guarantee certain desirable properties by construction,
such as respecting the invariances present in a physical system. (37) Another strategy is
to use ML to model the difference between simulation outputs and experimental results.
For example, Google and collaborators created TossingBot, a robotic system that learned
to throw objects into bins with the aid of a ballistics simulation. (38) The researchers
found that a physics-aware ML approach, wherein ML learned and corrected for the
discrepancy between the simulations and real-world observations, dramatically
outperformed a pure trial-and-error ML training strategy. In a similar vein, ML can enable
us to derive more value from existing physics-based simulations. For example, ML-based
interatomic potentials (39−41) represent a means of capturing some of the physics of
first-principles simulations in a much more computationally efficient model that can
simulate orders of magnitude more atoms. ML can also serve as "glue" to link physics-
based models operating at various fidelities and length scales. (42)
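The residual-learning strategy described for TossingBot can be sketched in a few lines: a cheap "physics" baseline plus an ML model fit only to the discrepancy between baseline and observations. Everything below, including the baseline model and the synthetic "truth", is an illustrative assumption:

```python
# Delta-learning sketch: ML corrects a crude physics baseline rather
# than modeling the observations from scratch.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
x = rng.uniform(0, 2, size=(200, 1))

def physics(x):
    return 2.0 * x  # crude baseline model (assumed)

truth = 2.0 * x + 0.5 * np.sin(4 * x)  # "reality" = baseline + correction

# Fit ML only to the residual between observation and baseline.
features = np.hstack([x, np.sin(4 * x)])
delta_model = Ridge().fit(features, (truth - physics(x)).ravel())

x_new = np.array([[1.0]])
pred = physics(x_new).ravel() + delta_model.predict(
    np.hstack([x_new, np.sin(4 * x_new)])
)
print(f"corrected prediction: {pred[0]:.2f}")  # close to 2.0 + 0.5*sin(4)
```

Because the residual is typically smaller and smoother than the raw target, far less data is needed than for a from-scratch model, which is the core appeal in data-poor materials settings.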
As ML becomes more widely used in materials research, I expect that efforts addressing one
or more of these five themes will have an outsized impact on both the materials informatics
discipline and materials science more broadly.

12. Explain machine learning for materials design and discovery.

Reference: Vasudevan, R., Pilania, G., & Balachandran, P. V. (2021). Machine learning for
materials design and discovery. Journal of Applied Physics, 129(7), 070401.
https://doi.org/10.1063/5.0043300
Liu, Y., Zhao, T., Ju, W., & Shi, S. (2017). Materials discovery and design using machine
learning. Journal of Materiomics, 3(3), 159-177.
https://www.sciencedirect.com/science/article/pii/S2352847817300515

[Figure: An overview of the application of machine learning in materials science.]

[Figure: The fundamental framework for the application of machine learning in material property prediction.]

[Figure: The general process of machine learning in materials science.]

Topic: Process optimization

13. Explain process optimization using machine learning.

Refer to the following paper:


https://www.icheme.org/media/12829/matthew-mcewan-a-hands-on-demonstration-of-process-optimisation-using-machine-learning-techniques.pdf

Feel free to contact me on +91-8329347107 (call) / +91-9922369797 (WhatsApp),

email IDs: [email protected] and [email protected]

*********************

Dr. Abhishek D. Patange, Mechanical Engineering, College of Engineering Pune (COEP)
