
Proceedings of Two Days Workshop

On
Machine Learning for Civil Engineers
Series #3

August 10-11, 2023

RASTA - Center for Road Technology

Theme:
Artificial Neural Networks and
Their Applications in Civil
Engineering

Edited and Compiled by

Dr. M. A. Jayaram
Professor
RASTA - Center for Road Technology
VOLVO - Construction Equipment Campus
Bengaluru
Biological and Artificial Neural Nets
Biological Motivation
To develop a feel for this analogy, let us consider a few facts from
neurobiology.

The human brain is estimated to contain a densely interconnected
network of approximately 10^11 neurons, each connected, on
average, to 10^4 other neurons.

A NEURON IS A SIGNAL-PROCESSING UNIT

The fastest neuron switching times are known to be on the order
of 10^-3 seconds, quite slow compared with computer switching
speeds of 10^-10 seconds. Yet humans are able to make surprisingly
complex decisions, surprisingly quickly. For example, it requires
approximately 10^-1 seconds to visually recognize your mother.

If the biological neural network were stretched out in a line, it
would cover around 5 km.

ANNs are only loosely motivated by biological neural systems. Indeed,
a vague mimicry!!

[Figure: a biological neural network is a massive connection of neurons, with labelled parts: nucleus, dendrites, axons, synapses, and synaptic junctions.]


Neuron: in biological neural nets, electrochemical signals flow through them.
Perceptron / processing unit: in ANNs, linear multiples of weights and inputs flow through them.

[Figure: a perceptron with inputs x1, x2, a bias input x0, weights w0, w1, w2 giving the relative importance of each input, connecting edges (synapses), an activation function, and output o.]
The representational power of a perceptron

With weights w1 = w2 = 0.5, bias weight w0 = -0.3, and a hard-limiting
activation function, the perceptron computes

O = 0.5x1 + 0.5x2 - 0.3

which realizes the OR function:

x1  x2  output
0   0   -1
1   0   1
0   1   1
1   1   1

[Figure: the boundary line 0.5x1 + 0.5x2 - 0.3 = 0 separates (0,0) from (0,1), (1,0), and (1,1).]
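The OR perceptron above can be sketched in a few lines; the weights and bias are the ones on the slide, and the hard-limiting activation returns -1 or 1 as in the truth table.

```python
# Perceptron from the slide: o = step(0.5*x1 + 0.5*x2 - 0.3).
# The hard-limiting activation maps the weighted sum to -1 or 1.

def perceptron(x1, x2, w1=0.5, w2=0.5, w0=-0.3):
    s = w1 * x1 + w2 * x2 + w0
    return 1 if s > 0 else -1

for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(x1, x2, perceptron(x1, x2))  # reproduces the OR truth table
```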
The representational power of a perceptron

With the same weights but bias weight w0 = -0.8, the perceptron computes

O = 0.5x1 + 0.5x2 - 0.8

which realizes the AND function:

x1  x2  output
0   0   -1
1   0   -1
0   1   -1
1   1   1

[Figure: the boundary line 0.5x1 + 0.5x2 - 0.8 = 0 separates (1,1) from (0,0), (1,0), and (0,1).]
XOR

x1  x2  output
1   1   -1
0   0   -1
1   0   1
0   1   1

No single line can separate (1,0) and (0,1) from (0,0) and (1,1), so a
single perceptron cannot represent XOR; a two-layer network can. In the
network shown, hidden units 1 and 2 (weights w11, w12, w21, w22) feed
output unit 3 through weights w13 and w23. The activation functions will
decide the nature of the boundaries.
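A minimal sketch of a two-layer network that realizes XOR. The hidden-unit weights below are hypothetical, chosen only for illustration; they are not the w11...w23 values from the slide.

```python
# Two hidden perceptrons (an OR unit and a NAND unit) feed an AND unit.
# Their combination realizes XOR, which a single perceptron cannot.

def step(s):
    return 1 if s > 0 else -1

def xor_net(x1, x2):
    h1 = step(0.5 * x1 + 0.5 * x2 - 0.3)    # hidden unit 1: OR
    h2 = step(-0.5 * x1 - 0.5 * x2 + 0.8)   # hidden unit 2: NAND
    return step(0.5 * h1 + 0.5 * h2 - 0.3)  # output unit: AND of h1, h2

for x1, x2 in [(1, 1), (0, 0), (1, 0), (0, 1)]:
    print(x1, x2, xor_net(x1, x2))  # reproduces the XOR truth table
```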
The representational power of a perceptron

[Figure: the values of X1 and X2 enter two hidden units through weights w1, w2 and biases b1, b2; a third unit with weights w3, w4 and bias b3 combines them. The hidden units define the locus of the decision boundaries in the (X1, X2) plane.]
Representational Power of a Multilayer Network

[Figure: a multilayer network with four inputs x1, x2, x3, x4, one hidden layer, and two outputs O1 and O2.]
In general, an ANN can represent:

A Boolean value (0, 1) through a Boolean function:

inputs      output
12.2  3     1
11.4  4     1
4.2   1.5   0
3.4   1.2   0
12.8  4     1
17.3  6     1

A continuous-function output:

inputs      output
12.2  3     1.1
11.4  4     1.5
4.2   1.5   0.5
3.4   1.2   0.6
12.8  4     1.7
17.3  6     1.9
Appropriate Problems for ANN Learning

1. Instances are represented by many attribute-value pairs. The target function to be
learned is defined over instances that can be described by a vector of predefined features.
These input attributes may be highly correlated or independent of one another. Input
values can be any real values.

attributes (w/c, cement, aggregate) → strength

w/c   cem  agg   strength
0.4   380  1300  32.5
0.3   278  1245  42.6
0.2   310  1324  55.6
0.4   255  1461  30.6
0.6   245  1500  21.5
0.25  237  1422  57
0.35  228  1385  48

Each row is a feature vector.
2. The target function output may be discrete-valued, real-valued, or
a vector of several real- or discrete-valued attributes:

inputs                        output vector
15    34    16    14    7     1 0 0 1
11    10    14    18    6     0 1 0 0
7     9     11    10    8     0 0 1 0
16    41    22    15    9     1 0 0 0
14.6  33.2  24.6  13.2  11    1 0 0
3. The training examples may contain errors. ANN learning
methods are quite robust to noise in the training data.

4. Long training times are acceptable. Network training algorithms
typically require longer training times than, say, decision-tree
learning algorithms. Training times can range from a few seconds to
many hours, depending on factors such as the number of weights in
the network, the number of training examples considered, and the
settings of various learning-algorithm parameters.

Back Propagation
Preliminary Ideas

[Figure: y plotted against x, with x varying between 0 and 1 and y
responding, typically non-linearly.]

FANCY NOTATIONS

We square each residual: the quantity to be minimized is the sum of the
squared residuals, SSR = sum of (observed - predicted)^2.
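The fancy notation boils down to one line of code. A minimal sketch of the sum of squared residuals:

```python
# Sum of squared residuals: square each residual (observed - predicted)
# and add them up.

def ssr(observed, predicted):
    return sum((o - p) ** 2 for o, p in zip(observed, predicted))

print(ssr([1.1, 0.5, 1.9], [1.0, 0.4, 2.0]))  # three residuals of 0.1
```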

Back Propagation Neural Network: Detailed

The previous figure showed how x values from 0 to 1 have corresponding
y values that are highly non-linear.

We can optimize the last bias term, b3. Now let us just pretend that we
don't know the b3 value. The goal is to learn how the chain rule and
gradient descent apply to multiple parameters, and to introduce some
FANCY NOTATION!!! (the inputs are written xi; for example, input x3 is
the third x value).

When the chain rule is applied to the SSR, terms that do not involve b3
have a zero derivative: zero for the first term, zero for the third
term, so they simply contribute +0 to the gradient.
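The b3 exercise above can be sketched as follows. The frozen part of the network (`net_without_b3`) and the true bias value 0.7 are hypothetical placeholders for illustration, not the workshop's actual network.

```python
# Pretend the last bias b3 is unknown and recover it by gradient descent.
# Chain rule: dSSR/db3 = sum of -2 * (observed - predicted), because
# d(predicted)/db3 = 1 and all other terms contribute zero.

def net_without_b3(x):
    # stand-in for the frozen part of the network (hypothetical)
    return 2.0 * x

xs = [0.0, 0.5, 1.0]
observed = [net_without_b3(x) + 0.7 for x in xs]  # data generated with b3 = 0.7

b3 = 0.0   # initial guess
lr = 0.1   # learning rate (step size)
for _ in range(200):
    grad = sum(-2 * (o - (net_without_b3(x) + b3)) for x, o in zip(xs, observed))
    b3 -= lr * grad  # take a step against the gradient

print(round(b3, 3))  # 0.7
```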

ANNs With Multiple Inputs & Outputs

[Figure: a network with more than one input node and more than one
output node. All it does is take the two values of an entity/data
instance and map them to the corresponding output.]

Later on we will add two other nodes, for y2 and y3. For now, let us put
in values of x1 and x2 that correspond to y1 and plug those values into
the neural network: we multiply both the x1 and x2 values by their
associated weights and then add the bias. With two inputs and one
output, we need a three-dimensional graph. So let us start from the
corner of that graph; the point (1, 1) corresponds to the blue point on
the graph (here 1 refers to the smallest x1 value on the axis, not to a
raw input of 1).

Further steps include:
i. Find the error for each output.
ii. Find the sum of the squares of the errors.
iii. Find the gradient to find the values corresponding to the weights and bias.
iv. Find the optimal surface that corresponds to the minimum error.
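Steps i and ii above can be sketched as follows; the forward function, its weights, and the data rows are hypothetical placeholders, not the workshop's actual network.

```python
# A two-input network: multiply both x1, x2 values by their weights,
# add the bias, then score the predictions against the targets.

def forward(x1, x2, w1=0.3, w2=-0.2, b=0.1):
    return w1 * x1 + w2 * x2 + b

data = [((1.0, 1.0), 0.5), ((0.2, 0.8), 0.1), ((0.6, 0.4), 0.3)]

errors = [y - forward(x1, x2) for (x1, x2), y in data]  # step i: error per example
ssr = sum(e ** 2 for e in errors)                       # step ii: sum of squared errors
print(round(ssr, 3))
```

Steps iii and iv then repeat the gradient-descent idea from the b3 example, one partial derivative per weight and bias.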

Introduction to Deep Learning


Machine learning is a collection of algorithms and tools that help
machines understand patterns within data and use this underlying
structure to perform reasoning about a given task.

Currently, AI is advancing at a great pace, and deep learning is one of
the contributors to that.

To put things in perspective, deep learning is a sub-domain of machine
learning. Deep learning algorithms are able to self-learn hidden
patterns within data to make predictions.

The magic of deep learning starts with the humble perceptron.
Shallow Neural Network

Deep by topology: more than one hidden layer.

Deep by action of each hidden layer: the feature extraction is done in
the hidden layers. In the example given above, the raw image data is
provided to the input layer. This input layer will determine patterns of
local contrast and differentiate on the basis of colours, luminosity,
etc. The 1st hidden layer will determine the facial features, i.e., it
will fixate on the eyes, nose, lips, etc., and then fit those facial
features onto the correct face template. The 2nd hidden layer will
determine the correct face, as can be seen in the above image, after
which the result is sent to the output layer.
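"Deep by topology" can be sketched as a forward pass through two hidden layers; the layer sizes and random weights below are placeholders, only the layered structure matters here.

```python
# A minimal deep feed-forward pass: input -> hidden 1 -> hidden 2 -> output.
import random

def layer(inputs, weights, biases):
    # each unit: weighted sum of all inputs plus bias, then ReLU activation
    return [max(0.0, sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def rand_layer(n_in, n_out):
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [0.0] * n_out)

random.seed(0)
x = [0.5, 0.2, 0.9]          # raw inputs (e.g. local-contrast measures)
w1, b1 = rand_layer(3, 4)    # 1st hidden layer: low-level features
w2, b2 = rand_layer(4, 4)    # 2nd hidden layer: higher-level features
w3, b3 = rand_layer(4, 2)    # output layer

h1 = layer(x, w1, b1)
h2 = layer(h1, w2, b2)
out = layer(h2, w3, b3)
print(out)
```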
Deep Learning Methods/Types

Feed Forward Neural Network

Recurrent Neural Network, used for:
Machine Translation
Robot Control
Time Series Prediction
Speech Recognition
Speech Synthesis
Time Series Anomaly Detection
Rhythm Learning
Music Composition

Convolutional Neural Network: image-related classification, identifying objects.

Restricted Boltzmann Machine: based on energy principles.

Auto-encoders: used in NLP to encode long sentences.

Deep Learning

Recurrent Neural Networks

Although basic, recurrent neural networks are awesome; they are usually
thought of as a stepping stone to understanding much fancier concepts
like Long Short-Term Memory networks and Transformers.

Recurrent Neural Networks → Long Short-Term Memory Networks → Transformers

In general, they are modelled as memory models, vaguely simulating human
memory!!!

We are measuring some entity, say rainfall intensity, runoff, cement
production, vehicle registrations, some kind of distress, or the
stock-market values of a construction company, that changes over time
and is measured on a time scale. And we want to predict what would
happen some time later.
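The memory idea can be sketched as a single recurrent unit that folds yesterday's hidden state into today's measurement; the weights and the series below are hypothetical placeholders.

```python
# One recurrent unit: the new hidden state mixes today's input with
# yesterday's state, and the same weights are reused at every time step.
import math

def rnn_step(x_t, h_prev, w_x=0.8, w_h=0.5, b=0.0):
    return math.tanh(w_x * x_t + w_h * h_prev + b)

series = [0.2, 0.4, 0.9, 0.9, 0.9, 0.9, 0.3]  # e.g. daily market level
h = 0.0
for x in series:
    h = rnn_step(x, h)  # h carries a summary of everything seen so far
print(round(h, 3))
```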

Let us say the stock market went up high for 4 days before going down.

[Figure: a line plot with levels Low / Medium / High on the y-axis and
time intervals 1 to 7 on the x-axis.]
Also, the longer a changing entity is measured over time, the more data
we will have for it. For example, we have more time points for the
measures from the company represented by the blue line than we have for
the company represented by the red line.

[Figure: the blue company's series spans time intervals 1 to 10, the red
company's only 1 to 9, each plotted against Low / Medium / High levels.]
In deep-learning-based networks, a convolutional layer looks at a
portion of the image covered by a 6 x 6 block of pixels, no more, no
less!!, followed by activation functions.

In a recurrent network, we now also put in yesterday's value. Where
would the gradient go?? Is there anything that we can do about them??
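"Where would the gradient go??" can be illustrated with simple arithmetic: an unrolled recurrent net multiplies the gradient by the same recurrent weight at every time step, so the gradient either vanishes or explodes.

```python
# One factor of the recurrent weight w per time step travelled back:
# with w < 1 the gradient vanishes, with w > 1 it explodes.
for w in (0.5, 1.5):
    grad = 1.0
    for _ in range(20):  # 20 unrolled time steps
        grad *= w
    print(w, grad)
```

This is one reason LSTM networks and Transformers are the usual next step after basic recurrent networks.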
THE END
