Lectut CSN 102 PDF DS - Part1

This document discusses performance analysis of algorithms, specifically time and space complexity. It provides the following key points: 1. Time complexity is a measure of how long an algorithm takes to run based on the size of the input. Space complexity measures the amount of memory required. 2. The time complexity of an algorithm is expressed as a function relating the running time to the size of the input. Common time complexities include O(1), O(log n), O(n), O(n log n), O(n^2), and O(2^n). 3. The space complexity of an algorithm includes the memory required for instructions, data, and the call stack. It is also expressed as a function of the input size.


Data Structures

Performance Analysis
Fundamental Concepts

Some fundamental concepts that you should know:


– Dynamic memory allocation.
– Recursion.
– Performance analysis.
Analysis of Algorithms

Efficiency of an algorithm can be measured in terms of:


– Execution time (time complexity)
– The amount of memory required (space complexity)
Which measure is more important?
– Answer often depends on the limitations of the technology
available at time of analysis
Performance Analysis

Two criteria are used to judge algorithms: (i) time complexity (ii) space
complexity.
Space Complexity of an algorithm is the amount of memory it needs to
run to completion.
Time Complexity of an algorithm is the amount of CPU time it needs to
run to completion.
Space Complexity

Memory space S(P) needed by a program P consists of two components:


– A fixed part: needed for instruction space (byte code), constant space, etc. → c
– A variable part: dependent on the particular instance of input and output data → Sp(instance)
S(P) = c + Sp(instance)
Space Complexity: Example 1

1. Algorithm abc (a, b, c)


2. {
3. return a+b+b*c+(a+b-c)/(a+b)+4.0;
4. }
For every instance the computer needs to store the variables a, b, and c.
Therefore Sp() = 3, and S(P) = 3 (ignoring the fixed part c).
Space Complexity: Example 2

1. Algorithm Sum(a[], n)
2. {
3. s:= 0.0;
4. for i = 1 to n do
5. s := s + a[i];
6. return s;
7. }
Space Complexity: Example 2.

Every instance needs to store array a[] & n.


– Space needed to store n = 1 word.
– Space needed to store a[ ] = n floating point
words (or at least n words)
– Space needed to store i and s = 2 words
Sp(n) = (n + 3). Hence S(P) = (n + 3).
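The algorithm above can be sketched in C++ as follows (an illustrative sketch, not from the slides). Beyond the array a[] (n words) and n itself (1 word), the function needs two extra words for s and i, matching Sp(n) = n + 3:

```cpp
#include <cstddef>

// Sum the n elements of a[]. Space beyond the input: one word for s,
// one word for i, giving Sp(n) = n + 3 under the slide's word model.
double Sum(const double a[], std::size_t n) {
    double s = 0.0;                       // 1 word for s
    for (std::size_t i = 0; i < n; i++)   // 1 word for i
        s += a[i];
    return s;
}
```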
Space Complexity

Components of Space Complexity


– Instruction space
– Data Space
– Environmental Stack Space
Instruction space

• Space needed to store the compiled version of the


program instructions.

The compiler used to compile the program into machine code also matters:


– Given an expression a+b+c*d, the compiler may compute it as
(c*d)+a+b and generate shorter, more time-efficient code.
Data Space

The space needed to store all the constants and variables.


Data space has two components
1. Space needed by constants and simple variables.
2. Space needed by dynamically allocated objects such as arrays and class
instances
Environment Stack Space
The environment stack is used to save information needed to resume
execution of partially completed functions and methods.
Beginning performance analysts often ignore the space needed by the
environment stack because they don’t understand how functions are invoked
and what happens on termination.
Each time a function is invoked the following data are saved on the
environment stack:
– The return address
– The values of all local variables and formal parameters in the functions being
invoked (necessary for recursive functions only).
Time Complexity

For most of the algorithms associated with this course, time complexity
comparisons are more interesting than space complexity comparisons
Time complexity: A measure of the amount of time required to execute an
algorithm
Time Complexity

Factors that should not affect time complexity analysis:


– The programming language chosen to implement the algorithm
– The quality of the compiler
– The speed of the computer on which the algorithm is to be
executed
So what makes an algorithm “good”, independent of these factors?
Time Complexity

Time complexity analysis for an algorithm is independent of the programming


language and the machine used
Objectives of time complexity analysis:
– To determine the feasibility of an algorithm by estimating an upper
bound on the amount of work performed
– To compare different algorithms before deciding on which one to
implement
Time Complexity

Analysis is based on the amount of work done by the algorithm


Time complexity expresses the relationship between the size of the input
and the run time for the algorithm
Usually expressed as a proportionality, rather than an exact function
Time Complexity

Time required T(P) to run a program P also consists of two components:


– A fixed part: compile time, which is independent of the problem
instance → c.
– A variable part: run time, which depends on the problem instance
→ tp(instance)
T(P) = c + tp(instance)
Time Complexity

How to measure T(P)?


– Measure experimentally, using a “stop watch” → T(P) obtained in secs, msecs.
– Count program steps → T(P) obtained as a step count.
The fixed part is usually ignored; only the variable part tp() is measured.
Measuring the Running Time

The C standard library provides a function called clock (in header


file time.h; <ctime> in C++) that can sometimes be used in a simple
way to time computations:
clock_t start, finish;
start = clock();
sort(x.begin(), x.end());   // call to STL generic sort algorithm
finish = clock();
cout << "Time for sort (seconds): "
     << ((double)(finish - start)) / CLOCKS_PER_SEC;
How to Analyze Time Complexity?

Assume a simple machine model:

- single processor
- 32 bit
- sequential execution
- 1 unit of time per arithmetic or logical operation
- 1 unit of time per assignment and return
Time Complexity

To simplify analysis, we sometimes ignore work that takes a constant


amount of time, independent of the problem input size
When comparing two algorithms that perform the same task, we often just
concentrate on the differences between algorithms
Example: Polynomial Evaluation
Suppose that exponentiation is carried out using
multiplications. Two ways to evaluate the
polynomial
p(x) = 4x^4 + 7x^3 – 2x^2 + 3x + 6
are:
Brute force method:
p(x) = 4*x*x*x*x + 7*x*x*x – 2*x*x + 3*x + 6
Horner’s method:
p(x) = (((4*x + 7) * x – 2) * x + 3) * x + 6
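The two strategies can be sketched in C++ with explicit multiplication counters (an illustrative sketch; the coefficient-array layout coef[0] = a0 … coef[n] = an is an assumption of this example, not from the slides):

```cpp
// Evaluate a degree-n polynomial two ways, counting multiplications.
struct EvalResult { double value; int mults; };

// Brute force: compute each x^i by repeated multiplication.
EvalResult bruteForce(const double coef[], int n, double x) {
    double p = coef[0];
    int mults = 0;
    for (int i = 1; i <= n; i++) {
        double term = coef[i];
        for (int j = 0; j < i; j++) { term *= x; mults++; }  // i multiplications
        p += term;
    }
    return {p, mults};  // mults = 1 + 2 + ... + n = n(n+1)/2
}

// Horner: one multiplication per coefficient.
EvalResult horner(const double coef[], int n, double x) {
    double p = coef[n];
    int mults = 0;
    for (int i = n - 1; i >= 0; i--) { p = p * x + coef[i]; mults++; }
    return {p, mults};  // mults = n
}
```

For p(x) = 4x^4 + 7x^3 – 2x^2 + 3x + 6 both return the same value, but the brute-force counter reads n(n+1)/2 while Horner's reads n.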
Example: Polynomial Evaluation
Method of analysis:
• Basic operations are multiplication, addition, and
subtraction
•We’ll examine the general form of a polynomial of
degree n, and express our result in terms of n
• We’ll look at the worst case (max number of
multiplications) to get an upper bound on the work
Example: Polynomial Evaluation

The general form of a polynomial of degree n is


p(x) = a_n x^n + a_(n-1) x^(n-1) + a_(n-2) x^(n-2) + … + a_1 x + a_0
where a_n ≠ 0 and n >= 0
Example: Polynomial Evaluation
Analysis for the Brute Force Method:
p(x) = a_n * x * x * … * x * x        n multiplications
     + a_(n-1) * x * x * … * x        n-1 multiplications
     + a_(n-2) * x * x * … * x        n-2 multiplications
     + …
     + a_2 * x * x                    2 multiplications
     + a_1 * x                        1 multiplication
     + a_0
Example: Polynomial Evaluation
The number of multiplications needed in the worst case is
T(n) = n + (n-1) + (n-2) + … + 3 + 2 + 1
     = n(n + 1)/2        (result from high school math)
     = n^2/2 + n/2
This is an exact formula for the maximum number of
multiplications. In general though, analyses yield
upper bounds rather than exact formulae. We say that
the number of multiplications is on the order of n^2, or
O(n^2). (Think of this as being proportional to n^2.)
Example: Polynomial Evaluation
Analysis for Horner’s Method:
p(x) = ( … (((a_n * x        1 multiplication
       + a_(n-1)) * x        1 multiplication
       + a_(n-2)) * x        1 multiplication
       + …                   (n times in all)
       + a_2) * x            1 multiplication
       + a_1) * x            1 multiplication
       + a_0
T(n) = n, so the number of multiplications is O(n)
Example: Polynomial Evaluation
Growth of the brute-force count T(n) = n^2/2 + n/2 compared with n^2
(Horner’s method needs only n multiplications):

n        n^2/2 + n/2        n^2
5             15              25
10            55             100
20           210             400
100         5050           10000
1000      500500         1000000
Example: Polynomial Evaluation
[Graph: number of multiplications vs. n (degree of polynomial);
f(n) = n^2 and T(n) = n^2/2 + n/2 grow quadratically while g(n) = n
grows linearly.]
Time Complexity

What is a program step?


– a+b+b*c+(a+b)/(a-b) → one step;
– comments → zero steps;
– while (<expr>) do → step count equal to the number of times
<expr> is executed;
– for i=<expr> to <expr1> do → step count equal to the number of
times <expr1> is checked.
Important Guidelines for finding Time
Complexity in a code

1. Loops
2. Nested Loops
3. Consecutive Statements
4. if –else statements
Few Examples

int Add(int a, int b)
{ return a+b; }

T_Add = 1 + 1 = 2 units of time
      = constant time
Time Complexity: Example 1

Statements                       Steps

1  Algorithm Sum(a[], n)           0
2  {                               0
3    s = 0.0;                      1
4    for i = 1 to n do             n+1
5      s = s + a[i];               n
6    return s;                     1
7  }                               0
                           Total: 2n+3
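The 2n+3 total can be checked mechanically by instrumenting the algorithm with a step counter (a sketch following the slide's counting convention: loop header checked n+1 times, body executed n times):

```cpp
// Count the program steps of Sum(a[], n) under the slide's convention.
int countSteps(const double a[], int n) {
    int steps = 0;
    double s = 0.0; steps++;        // line 3: 1 step
    for (int i = 1; ; i++) {
        steps++;                    // line 4: header checked n+1 times
        if (i > n) break;
        s += a[i - 1]; steps++;     // line 5: body executed n times
    }
    steps++;                        // line 6: return, 1 step
    return steps;                   // total: 2n + 3
}
```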
Time Complexity: Example 2

Statements                          Steps

1  Algorithm Sum(a[], n, m)           0
2  {                                  0
3    for i = 1 to n do                n+1
4      for j = 1 to m do              n(m+1)
5        s = s + a[i][j];             nm
6    return s;                        1
7  }                                  0
                              Total: 2nm+2n+2
Few Examples

T_Sum_of_list = 1 + 3(n+1) + 2n + 1
              = 5n + 5
              = cn + c’
T_Sum_of_Matrices = a*n^2 + b*n + c

T_Add = O(1)
T_Sum_of_list = O(n)
T_Sum_of_Matrices = O(n^2)

Asymptotic Notation
Performance Measurement

Which is better?
– T(P1) = (n+1) or T(P2) = (n^2 + 5)?
– T(P1) = log(n^2 + 1)/n! or T(P2) = n^n (n log n)/n^2?
Complex step count functions are difficult to compare.
For comparison, the ‘rate of growth’ of time and space complexity functions is
easy and sufficient.
Asymptotic Notations

O, Ω, Θ, o, ω
Defined for functions over the natural numbers.
– Ex: f(n) = Θ(n^2).
– Describes how f(n) grows in comparison to n^2.
Each notation defines a set of functions; in practice they are
used to compare the sizes of two functions.
The notations describe different rate-of-growth
relations between the defining function and the
defined set of functions.
Asymptotic Notations

Big Oh notation (with a capital letter O, not a zero), also


called Landau's symbol, is a symbolism used in complexity
theory, computer science, and mathematics to describe the
asymptotic behavior of functions. Basically, it tells you how
fast a function grows or declines.

Landau's symbol comes from the name of the German


number theoretician Edmund Landau who invented the
notation. The letter O is used because the rate of growth of a
function is also called its order.
Big O Notation

Big O of a function gives us ‘rate of growth’ of the


step count function f(n), in terms of a simple
function g(n), which is easy to compare.
Definition: [Big O] The function f(n) = O(g(n)) (read “big
oh of g of n”) iff there exist positive constants c
and n0 such that f(n) <= c*g(n) for all n >= n0.
See graph on next slide.
Example: f(n) = 3n+2 is O(n) because 3n+2 <= 4n
for all n >= 2. Here c = 4, n0 = 2, and g(n) = n.
Big O Notation

[Graph: f(n) and c*g(n) vs. n; c*g(n) dominates f(n) for all n >= n0.]
Big-Oh Notation (Formal Definition)
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive
constants c and n0 such that
f(n) <= c*g(n) for n >= n0
Example: 2n + 10 is O(n)
2n + 10 <= cn
(c - 2) n >= 10
n >= 10/(c - 2)
Pick c = 3 and n0 = 10
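The chosen constants can be sanity-checked numerically over a finite range (a small illustrative check; the algebra above is the actual proof for all n):

```cpp
// Check that f(n) = 2n + 10 <= c*g(n) = 3n holds for every n in [n0, upTo].
bool witnessHolds(long n0, long upTo) {
    for (long n = n0; n <= upTo; n++)
        if (2 * n + 10 > 3 * n) return false;  // inequality violated
    return true;
}
```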
Big O Notation

Example: f(n) = 10n^2+4n+2 is O(n^2) because 10n^2+4n+2 <= 11n^2 for all
n >= 5.
Example: f(n) = 6*2^n+n^2 is O(2^n) because 6*2^n+n^2 <= 7*2^n for all n >= 4.
Algorithms can be: O(1) → constant; O(log n) → logarithmic; O(n log n);
O(n) → linear; O(n^2) → quadratic; O(n^3) → cubic; O(2^n) →
exponential.
Big O Notation

Now it is easy to compare time or space complexities of algorithms.


Which algorithm complexity is better?
– T(P1) = O(n) or T(P2) = O(n^2)?
– T(P1) = O(1) or T(P2) = O(log n)?
– T(P1) = O(2^n) or T(P2) = O(n^10)?
Some Results

Sum of two functions: If f(n) = f1(n) + f2(n), and
f1(n) is O(g1(n)) and f2(n) is O(g2(n)), then
f(n) = O(max(|g1(n)|, |g2(n)|)).

Product of two functions: If f(n) = f1(n) * f2(n), and
f1(n) is O(g1(n)) and f2(n) is O(g2(n)), then
f(n) = O(g1(n) * g2(n)).
Big-Oh Example
Example: the function n^2 is not O(n)
n^2 <= cn
n <= c
The above inequality cannot be satisfied, since c must be a
constant.
n^2 is O(n^2).
Big-Oh Example
7n-2
7n-2 is O(n)
need c > 0 and n0 >= 1 such that 7n-2 <= c*n for n >= n0
this is true for c = 7 and n0 = 1

3n^3 + 20n^2 + 5
3n^3 + 20n^2 + 5 is O(n^3)
need c > 0 and n0 >= 1 such that 3n^3 + 20n^2 + 5 <= c*n^3 for n >= n0
this is true for c = 4 and n0 = 21

3 log n + 5
3 log n + 5 is O(log n)
need c > 0 and n0 >= 1 such that 3 log n + 5 <= c*log n for n >= n0
this is true for c = 8 and n0 = 2
Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the growth rate of a function
The statement “f(n) is O(g(n))” means that the growth rate of f(n) is no more than
the growth rate of g(n)
Useful to find the worst case of an algorithm
Find the output of the following code
int n=32;
steps=0;
for (int i=1; i<=n;i*=2)
steps++;
cout<<steps;
Example of O(log n) algorithm.
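Binary search is another standard O(log n) algorithm, since each comparison halves the remaining search range (added here for illustration, not from the slides):

```cpp
// Search a sorted array: at most about log2(n) iterations run, because
// the range [lo, hi] halves on every comparison.
int binarySearch(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1; // discard the left half
        else hi = mid - 1;              // discard the right half
    }
    return -1;  // not found
}
```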
Example
Ω(g(n)) = {f(n) : there exist positive constants c and n0, such
that for all n >= n0, we have 0 <= c*g(n) <= f(n)}

n = Ω(log n). Choose c and n0:


for c = 1 and n0 = 16,

c * log n <= n for all n >= 16
Omega
Omega gives us a LOWER BOUND on a function.

Big-Oh says, "Your algorithm is at least this good."


Omega says, "Your algorithm is at least this bad."
Q-notation
For a function g(n), we define Θ(g(n)),
big-Theta of g of n, as the set:
Θ(g(n)) = {f(n) :
there exist positive constants c1, c2, and n0,
such that for all n >= n0,
we have 0 <= c1*g(n) <= f(n) <= c2*g(n)
}
Intuitively: the set of all functions that
have the same rate of growth as g(n).

g(n) is an asymptotically tight bound for f(n).


f(n) and g(n) are nonnegative for large n.
Example

Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0,


such that for all n >= n0, 0 <= c1*g(n) <= f(n) <= c2*g(n)}

10n^2 - 3n = Θ(n^2)
What constants for n0, c1, and c2 will work?
Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
To compare orders of growth, look at the leading term.

Exercise: Prove that n^2/2 - 3n = Θ(n^2)


Examples

3n^2 + 17

Ω(1), Ω(n), Ω(n^2) → lower bounds

O(n^2), O(n^3), … → upper bounds

Θ(n^2) → exact bound


Relations Between O, Ω, Θ
o-notation

For a given function g(n), the set little-o:


o(g(n)) = {f(n) : for all c > 0, there exists n0 > 0 such that
for all n >= n0, we have 0 <= f(n) < c*g(n)}.
f(n) becomes insignificant relative to g(n) as n approaches
infinity:
lim (n→∞) [f(n) / g(n)] = 0

g(n) is an upper bound for f(n) that is not asymptotically


tight.
Observe the difference in this definition from the previous
ones.
o-notation

f(n) = 3n+2 is o(n^2)

f(n) = 17n^3 + n^2 log n is o(n^4)


w -notation

For a given function g(n), the set little-omega:

ω(g(n)) = {f(n) : for all c > 0, there exists n0 > 0 such that


for all n >= n0, we have 0 <= c*g(n) < f(n)}.

f(n) becomes arbitrarily large relative to g(n) as n


approaches infinity:
lim (n→∞) [f(n) / g(n)] = ∞.

g(n) is a lower bound for f(n) that is not asymptotically


tight.
w -notation
f(n) = 3n+2 is ω(1)
f(n) = 17n^3 + n^2 is ω(n^2)
Comparison of Functions

fg  ab

f (n) = O(g(n))  a  b
f (n) = W(g(n))  a  b
f (n) = Q(g(n))  a = b
f (n) = o(g(n))  a < b
f (n) = w (g(n))  a > b
Limits

lim (n→∞) [f(n) / g(n)] = 0        ⇒  f(n) ∈ o(g(n))
lim (n→∞) [f(n) / g(n)] < ∞        ⇒  f(n) ∈ O(g(n))
0 < lim (n→∞) [f(n) / g(n)] < ∞    ⇒  f(n) ∈ Θ(g(n))
0 < lim (n→∞) [f(n) / g(n)]        ⇒  f(n) ∈ Ω(g(n))
lim (n→∞) [f(n) / g(n)] = ∞        ⇒  f(n) ∈ ω(g(n))
lim (n→∞) [f(n) / g(n)] undefined  ⇒  can’t say
Time Complexity

We analyze time complexity for


a) A very large input size
b) Worst case scenario
Rules
a) Drop lower order terms
b) Drop Constant multipliers
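The rules can be seen numerically: for T(n) = n^2/2 + n/2 from the polynomial example, the ratio T(n)/n^2 settles toward the constant 1/2 as n grows, which is why only the leading term matters (an illustrative sketch):

```cpp
// Ratio of the exact count T(n) = n^2/2 + n/2 to the leading term n^2.
// As n grows, the n/2 term becomes negligible and the ratio approaches 1/2.
double ratio(double n) {
    double T = n * n / 2.0 + n / 2.0;
    return T / (n * n);
}
```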
Time Complexity Calculation in a fragment of a code
int a; // Example 1
a = 5;
a++;
for (int i = 0; i < n; i++)
{ Simple statements; }               // O(n)
for (int j = 0; j < n; j++)
{
    for (int k = 0; k < n; k++)
    { Simple statements; }
}                                    // O(n^2)
Overall: O(n^2)
int main() // Example 2
{ const int n = 100;
  int arr[n];
  for (int i = 0; i < n; i++)
    for (int j = 0; j < i; j++)
    {
      some statements;
    }
  return 0;
}                                    // O(n^2)
//Example 3
sum=0;
for(i=1; i<=n; i++)
for(j=1; j<=n;j*=2)
sum+=1;
O(n log n)
const int n = 100; // Example 4
int main()
{
  for (int i = 0; i < n; i++)
    f();
  return 0;
}
void f()
{ int a[n];
  for (j = 0; j < n; j++)
  {
    some statements;
  }
}                                    // O(n^2): f() is O(n), called n times
Classification
O(k) or O(1) -----> Desirable (more than excellent)
O(log n) -----> Excellent
O(n) -----> Very good
O(n2) -----> Not So good
O(n3) -----> Pretty Bad
O(dn) -----> Disaster where d>1
Examples

int main()
{ int i, sum = 0;
  int n;
  cin >> n;
  for (i = 0; i < n; i++)
    sum += i;
}
Time Complexity - O(n)
Space complexity - O(1)
Examples
int main()
{ int n;
  cin >> n;
  int *arr;
  arr = new int[n];
  for (int i = 0; i < n; i++)
    for (int j = 0; j < i; j++)
    {
      some statements;
    }
  return 0;
}
Time Complexity - O(n^2)
Space complexity - O(n)
const int n = 100;
int main()
{
  for (int i = 0; i < n; i++)
    f();
  return 0;
}
void f()
{ int a[n];
  for (j = 0; j < n; j++)
  {
    some statements;
  }
}
Time Complexity - O(n^2)
Space complexity - O(n)
int n = 64;
steps = 0;
for (int i = 1; i <= n; i *= 2)
  steps++;
cout << steps;
int* A;
A = new int[steps];
….
….
Time Complexity - O(log n)
Space complexity - O(log n)
Programming Contest Sites
http://icpc.baylor.edu/public/worldMap/World-Finals-2015 (ACM ICPC)
http://www.codechef.com (Online Contest)
http://www.topcoder.com
http://www.spoj.com
http://www.interviewstreet.com
