Seismic Resistant Design of Structures: Random Variables


Seismic Resistant Design of Structures
Lecture 3: Random Variables

M.Sc. in Earthquake Engineering


Institute of Engineering
Random Variables

A random variable is defined as a function of the outcomes of a random experiment. Example: X = the number of spots on the top face of a die.
How to characterize a random variable?
The Probability Density and Cumulative
Distribution Functions
From probability and statistics, given a continuous random variable X, we denote:
• the probability density function, PDF, as f(x) (for a discrete random variable this is the probability mass function);
• the cumulative distribution function, CDF, as F(x).
The PDF and CDF give a complete description of
the probability distribution of a random variable.
If X is a continuous random variable, then the probability density function, PDF, of X is a function f(x) such that for two numbers a and b with a \le b:

P(a \le X \le b) = \int_a^b f(x)\,dx    (1)

That is, the probability that X takes on a value in the interval [a, b] is the area under the density function from a to b.
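Equation (1) can be checked numerically. The sketch below (plain Python; the helper names are illustrative) uses a midpoint rule to approximate the area under a standard normal density between a = -1 and b = 1:

```python
import math

# A minimal sketch checking Eq. (1) numerically for a standard normal X.
def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def prob_between(a, b, pdf, n=100_000):
    # Midpoint-rule approximation of the integral of the PDF from a to b
    dx = (b - a) / n
    return sum(pdf(a + (i + 0.5) * dx) for i in range(n)) * dx

p = prob_between(-1.0, 1.0, normal_pdf)
print(round(p, 4))  # 0.6827, the familiar one-sigma probability
```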
The cumulative distribution function, CDF, is a function F(x) of a random variable X, defined for a number x by:

F(x) = P(X \le x) = \int_{-\infty}^{x} f(u)\,du
That is, for a given value x, F(x) is the probability that the observed value of X will be at most x.
A graphical representation of the relationship between the PDF and CDF.

The mathematical relationship between the PDF and CDF is given by:

f(x) = \frac{dF(x)}{dx}
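This derivative relationship can be illustrated numerically: differentiating the CDF recovers the PDF. The sketch below uses the standard normal for illustration, whose CDF has a closed form in terms of the error function:

```python
import math

# A minimal sketch: f(x) = dF/dx, checked by a central difference on the CDF.
def normal_cdf(x):
    # CDF of the standard normal, via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def normal_pdf(x):
    return math.exp(-x ** 2 / 2) / math.sqrt(2 * math.pi)

x, h = 0.7, 1e-5
deriv = (normal_cdf(x + h) - normal_cdf(x - h)) / (2 * h)  # numerical dF/dx
print(abs(deriv - normal_pdf(x)) < 1e-6)  # True
```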
An example of a probability density function is the well-known normal distribution, whose PDF is given by:

f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

where \mu is the mean and \sigma is the standard deviation. The normal distribution is a two-parameter distribution, i.e. it is fully specified by the two parameters \mu and \sigma.
Probability Distribution Function
(or Cumulative Probability Function)

The density function is always non-negative, f(x) \ge 0, and it integrates to 1 over its full range:

\int_{-\infty}^{\infty} f(x)\,dx = 1
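Both properties can be verified numerically for the normal density; a midpoint-rule sum over a wide interval (here \pm 10\sigma, which captures essentially all of the area) should return 1:

```python
import math

# A minimal sketch verifying the normalization of the normal PDF:
# f(x) >= 0 everywhere, and the full-range integral equals 1.
def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

n, a, b = 200_000, -10.0, 10.0
dx = (b - a) / n
total = sum(normal_pdf(a + (i + 0.5) * dx) for i in range(n)) * dx
print(round(total, 6))  # 1.0
```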
Consider the probability that an observation lies in a narrow interval between x and x + dx. For small dx, this probability is approximately the density times the interval width:

P(x \le X \le x + dx) \approx f(x)\,dx
Expectation
The expectation or expected value of a function g(X) of a random variable is defined as:

E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx
Some common expectations partially characterize a random variable: the moments of the distribution.

1st: Mean or mean value

E[X] = \int_{-\infty}^{\infty} x\,f(x)\,dx

2nd: Mean squared value

E[X^2] = \int_{-\infty}^{\infty} x^2\,f(x)\,dx

These are the first two moments of the distribution. The complete infinite set of moments would completely characterize the random variable.
We also define central moments (moments about the mean). The first non-zero one is the variance:

Var[X] = E[(X - E[X])^2] = \int_{-\infty}^{\infty} (x - E[X])^2\,f(x)\,dx

Its square root is the standard deviation, \sigma[X] = \sqrt{Var[X]}.
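These definitions can be checked on the die example from the start of the lecture, where the discrete analogue replaces the integral by a sum over the six equally likely outcomes:

```python
import math

# A minimal sketch using the die example: X = number of spots, p(x_i) = 1/6.
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6

mean = sum(x * p for x in outcomes)               # 1st moment, E[X]
mean_sq = sum(x ** 2 * p for x in outcomes)       # 2nd moment, E[X^2]
var = sum((x - mean) ** 2 * p for x in outcomes)  # variance, E[(X - E[X])^2]
std = math.sqrt(var)

print(round(mean, 4))  # 3.5
print(abs(var - (mean_sq - mean ** 2)) < 1e-12)   # True: Var = E[X^2] - (E[X])^2
```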
Random Process
• When x(t) represents a random process, it means that the values of x(t)
cannot be precisely predicted in advance.
• The time history of a random process can be used to determine to
calculate the probability density function for x(t).

Calculation of the probability density function p(x) for a random process


PDF for a Random Process
• The figure shows a sample time history for a random process, with the times for which x ≤ x(t) ≤ x + dx identified by the shaded strips.
• During the time interval T, x(t) lies in the band of values x to x + dx for a total time of (dt1 + dt2 + dt3 + dt4).
• If T is long enough, the probability density function p(x) is given by

p(x)\,dx = fraction of the total elapsed time for which x(t) lies in the x to x + dx band
= (dt1 + dt2 + dt3 + dt4) / T

Strictly, this equation is exact only in the limit T → ∞.
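The time-fraction idea can be sketched on a concrete record. The choice x(t) = sin(t) is illustrative (not from the notes); for it the exact first-order density is p(x) = 1 / (π√(1 − x²)) for |x| < 1, which the fraction-of-time estimate should approach for a long record:

```python
import math

# A minimal sketch of the time-fraction estimate of p(x), assuming the
# illustrative sample process x(t) = sin(t).
dt, n = 0.002, 1_000_000          # record length T = n * dt = 2000 s
x0, dx = 0.20, 0.02               # the band x0 <= x(t) <= x0 + dx

in_band = sum(1 for i in range(n) if x0 <= math.sin(i * dt) <= x0 + dx)
p_est = (in_band / n) / dx        # p(x) dx = fraction of time spent in the band

p_exact = 1 / (math.pi * math.sqrt(1 - (x0 + dx / 2) ** 2))
print(abs(p_est - p_exact) < 0.01)  # True: the estimate approaches the exact density
```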


If there are N sample values (as shown in the figure below), and dn of these values lie in the band x to x + dx, then the probability density function is given by

p(x)\,dx = fraction of the total number of samples which lie in the x to x + dx band = dn / N

Sampling a random time history for digital analysis
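The dn/N estimate can be sketched with a simple sampled source. A uniform generator on [0, 1) is used here for illustration, since its exact density p(x) = 1 on that interval makes the check easy:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# A minimal sketch of the dn/N density estimate, assuming an illustrative
# uniform source on [0, 1), whose exact density is p(x) = 1 there.
N = 100_000
samples = [random.random() for _ in range(N)]

x0, dx = 0.30, 0.05                # the band x0 <= value < x0 + dx
dn = sum(1 for s in samples if x0 <= s < x0 + dx)
p_est = (dn / N) / dx              # p(x) dx = dn / N
print(abs(p_est - 1.0) < 0.05)     # True for a large enough N
```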


Gaussian Distribution
It has been well recognized that many naturally occurring random vibrations have the well-known “bell-shaped” probability distribution shown below, where the shape of the bell is given by the equation

p(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

This distribution is extensively used in random vibration theory to approximate the characteristics of random excitation.

Figure: first-order probability density for a normal (or Gaussian) process.
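The bell equation can be checked against sampled data: the density estimated near the peak of a batch of Gaussian samples should approach the bell-curve height 1/(σ√(2π)). The parameter values below are illustrative:

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility

# A minimal sketch: compare the estimated density near the peak of Gaussian
# samples with the bell equation's value at x = mu.
mu, sigma, N = 0.0, 2.0, 200_000
samples = [random.gauss(mu, sigma) for _ in range(N)]

dx = 0.2
dn = sum(1 for s in samples if mu - dx / 2 <= s < mu + dx / 2)
p_est = (dn / N) / dx

p_peak = 1 / (sigma * math.sqrt(2 * math.pi))  # bell-curve height at x = mu
print(abs(p_est - p_peak) < 0.01)              # True
```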
Calculation of Averages
If the probability density function p(x) is available for a random process, it can be used to calculate certain statistics of the random process x(t).

Mean value of x, usually denoted by E[x] (the expectation): for the random process considered, the mean value of the time history of x over the interval T is defined so that

E[x] \cdot T = total area under the x(t) curve during the interval T (with area below the zero line subtracting from the total)

= \int_0^T x(t)\,dt
And hence the mean value of the random process is

E[x] = \frac{1}{T}\int_0^T x(t)\,dt = \int_{-\infty}^{\infty} x\,p(x)\,dx

which is really the fundamental definition of the mean value E[x].

The mean square value of x, E[x^2], is defined as the average value of x^2, which is given by

E[x^2] = \frac{1}{T}\int_0^T x^2(t)\,dt = \int_{-\infty}^{\infty} x^2\,p(x)\,dx
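These time averages can be sketched on an illustrative record x(t) = c + a·sin(ωt) (an assumption, not from the notes); averaged over whole periods, E[x] = c and E[x²] = c² + a²/2, which the Riemann sums below reproduce:

```python
import math

# A minimal sketch of the time averages E[x] and E[x^2] for the
# illustrative record x(t) = c + a*sin(w*t).
c, a, w = 1.0, 2.0, 3.0
T = 10 * (2 * math.pi / w)        # exactly ten full periods
n = 100_000
dt = T / n

mean = sum(c + a * math.sin(w * i * dt) for i in range(n)) * dt / T
mean_sq = sum((c + a * math.sin(w * i * dt)) ** 2 for i in range(n)) * dt / T

print(round(mean, 6))     # 1.0
print(round(mean_sq, 6))  # 3.0  (= c^2 + a^2 / 2)
```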
Variance and Standard Deviation
Variance of the random variable x:

Var[x] = E[(x - E[x])^2]

Standard deviation: \sigma[x] = \sqrt{Var[x]}

Variance = (standard deviation)^2 = mean square - (mean)^2, i.e. Var[x] = E[x^2] - (E[x])^2.
Probability Distribution Function

The shaded area gives the value of the probability distribution function F(x)
Statistical Characterizations
• Expectation (Mean Value, First Moment):

\mu_X = E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx

• Second Moment:

E[X^2] = \int_{-\infty}^{\infty} x^2\,f_X(x)\,dx
• Variance of X:

Var[X] = \sigma_X^2 = E[(X - \mu_X)^2]

• Standard Deviation of X:

\sigma_X = \sqrt{Var[X]}
Expectation
• Discrete case:

\mu_X = E[X] = \sum_{\text{all } i} x_i\,p_X(x_i)

• Continuous case:

\mu_X = E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx

• According to the definition of a random variable, any function of a random variable is itself a random variable. If h(X) is an arbitrary function of X, the expected value (or expectation) of h(X) is defined as:

E[h(X)] = \sum_i h(x_i)\,p_X(x_i)    if X is discrete

E[h(X)] = \int_{-\infty}^{\infty} h(x)\,f_X(x)\,dx    if X is continuous
Moments of Random Variables
• The nth moment of the random variable X, denoted by E[X^n], is defined by

E[X^n] = \sum_i x_i^n\,p_X(x_i)    (X discrete)

E[X^n] = \int_{-\infty}^{\infty} x^n\,f_X(x)\,dx    (X continuous)

for n = 1, 2, 3, …
• The first moment, E[X], is the expected value of X.
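The discrete form of the nth moment can be sketched with the die example from the start of the lecture, where each outcome has probability 1/6:

```python
# A minimal sketch of the discrete n-th moment, E[X^n] = sum_i x_i**n * p(x_i),
# using the die example: outcomes 1..6, each with probability 1/6.
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6

def moment(n):
    return sum(x ** n * p for x in outcomes)

print(round(moment(1), 4))  # 3.5      (the expected value of X)
print(round(moment(2), 4))  # 15.1667  (the mean square value)
print(round(moment(3), 4))  # 73.5
```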
Central Moments (or Moments about the Mean)
• These are the moments of the difference between a random variable and its expected value. The nth central moment is defined by

E[(X - \mu_X)^n] = \sum_i (x_i - \mu_X)^n\,p_X(x_i)    (X discrete)

E[(X - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n\,f_X(x)\,dx    (X continuous)
Central Moments (Variance)
• The central moment for the case n = 2 is the variance:

Var[X] = \sigma_X^2 = E[(X - \mu_X)^2] = \sum_i (x_i - \mu_X)^2\,p_X(x_i)    (X discrete)

Var[X] = \sigma_X^2 = E[(X - \mu_X)^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2\,f_X(x)\,dx    (X continuous)
