Week 02 - Algorithm Complexity - Design Analysis of Algorithm

This document discusses algorithm complexity and growth rates. It explains that algorithms can be analyzed in terms of their time and space complexity, both of which depend on the input size. Common constructs such as loops and nested loops are examined, with examples showing how their running time relates to the input size. Finally, different growth rates such as constant, logarithmic, linear, and quadratic are defined.


Design & Analysis of Algorithm

Week – 2
Algorithm Complexity

The Islamia University of Bahawalpur [email protected]


Algorithm Complexity
 Suppose X is an algorithm and n is the size of the input data. The time and space used by algorithm X are the two main factors that determine the efficiency of X.
 Time Factor − Time is measured by counting the number of key operations, such as comparisons in a sorting algorithm.
 Space Factor − Space is measured by counting the maximum memory space required by the algorithm.

 The complexity of an algorithm, f(n), gives the running time and/or the storage space required by the algorithm in terms of n, the size of the input data.
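
As a concrete illustration (my own example, not from the slides), consider linear search over an array of n elements: it performs at most n key comparisons, so its time complexity is f(n) = n, while it uses only a constant amount of extra memory. A minimal sketch in C:

#include <stdio.h>

/* Linear search: at most n key comparisons, so the time complexity is
   f(n) = n; only a few extra variables are used, so extra space is constant. */
int linear_search(const int a[], int n, int key)
{
    for (int i = 0; i < n; i++) {
        if (a[i] == key)      /* key operation: one comparison per element */
            return i;
    }
    return -1;                /* not found after n comparisons */
}

int main(void)
{
    int a[] = {4, 8, 15, 16, 23, 42};
    printf("%d\n", linear_search(a, 6, 23));   /* prints 4 */
    return 0;
}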



The Execution Time of Algorithms
 Each operation in an algorithm (or a program) has a cost.
 Each operation takes a certain amount of time.

count = count + 1;   takes a certain amount of time, but that time is constant

A sequence of operations:

count = count + 1;     Cost: c1
sum = sum + count;     Cost: c2

 Total Cost = c1 + c2
The Execution Time of Algorithms (cont.)

Example: Simple If-Statement


                   Cost   Times
if (n < 0)         c1     1
   absval = -n;    c2     1
else
   absval = n;     c3     1

Total Cost <= c1 + max(c2, c3)



The Execution Time of Algorithms (cont.)
Example: Simple Loop

                    Cost   Times
i = 1;              c1     1
sum = 0;            c2     1
while (i <= n) {    c3     n+1
   i = i + 1;       c4     n
   sum = sum + i;   c5     n
}

Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
 The time required for this algorithm is proportional to n
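
As a hedged illustration (not part of the original slides), the same loop can be instrumented with a counter that is incremented once for every executed statement; running it for growing n confirms that the total operation count, 3n + 3, grows linearly with n.

#include <stdio.h>

/* Count the basic operations executed by the simple loop analysed above. */
long count_simple_loop_ops(int n)
{
    long ops = 0;
    int i, sum;

    i = 1;             ops++;   /* c1: executed once        */
    sum = 0;           ops++;   /* c2: executed once        */
    while (1) {
        ops++;                  /* c3: loop test, n+1 times */
        if (!(i <= n)) break;
        i = i + 1;     ops++;   /* c4: executed n times     */
        sum = sum + i; ops++;   /* c5: executed n times     */
    }
    return ops;                 /* total = 3n + 3           */
}

int main(void)
{
    for (int n = 10; n <= 1000; n *= 10)
        printf("n = %4d  ops = %ld\n", n, count_simple_loop_ops(n));
    return 0;
}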



The Execution Time of Algorithms (cont.)
Example: Nested Loop

                        Cost   Times
i = 1;                  c1     1
sum = 0;                c2     1
while (i <= n) {        c3     n+1
   j = 1;               c4     n
   while (j <= n) {     c5     n*(n+1)
      sum = sum + i;    c6     n*n
      j = j + 1;        c7     n*n
   }
   i = i + 1;           c8     n
}

Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5 + n*n*c6 + n*n*c7 + n*c8
 The time required for this algorithm is proportional to n²
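
The same instrumentation idea, again as a hedged sketch of my own, applied to the nested loop: the count works out to 3n² + 4n + 3, so doubling n roughly quadruples the number of executed operations, the signature of quadratic growth.

#include <stdio.h>

/* Operation count for the nested loop:
   2 + (n+1) + n + n*(n+1) + n*n + n*n + n  =  3n^2 + 4n + 3  */
long count_nested_loop_ops(int n)
{
    long ops = 0;
    int i = 1, sum = 0;
    ops += 2;                            /* c1, c2                 */
    while (ops++, i <= n) {              /* c3: n+1 tests          */
        int j = 1;          ops++;       /* c4: n times            */
        while (ops++, j <= n) {          /* c5: n*(n+1) tests      */
            sum = sum + i;  ops++;       /* c6: n*n times          */
            j = j + 1;      ops++;       /* c7: n*n times          */
        }
        i = i + 1;          ops++;       /* c8: n times            */
    }
    return ops;
}

int main(void)
{
    printf("n = 100: %ld ops\n", count_nested_loop_ops(100));   /* 30403  */
    printf("n = 200: %ld ops\n", count_nested_loop_ops(200));   /* 120803 */
    return 0;
}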
General Rules for Estimation
 Loops: The running time of a loop is at most the running time of the statements inside the loop multiplied by the number of iterations.
 Nested Loops: The running time of a statement inside nested loops is its own running time multiplied by the product of the sizes of all the enclosing loops.
 Consecutive Statements: Just add the running times of the consecutive statements.
 If/Else: Never more than the running time of the test plus the larger of the running times of the two branches S1 and S2. (A short sketch applying these rules follows below.)
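
A minimal sketch (my own illustration, not from the slides) applying these rules to a fragment that combines consecutive statements, an if/else, and a loop; the estimate is worked out in the comments.

#include <stdio.h>

/* Rules applied: consecutive statements add up; if/else costs the test plus
   the larger branch; a loop costs its body times the number of iterations. */
int rules_example(const int a[], int n)   /* assumes n >= 1 */
{
    int sum = 0;                   /* c1: once                          */
    int max = a[0];                /* c2: once                          */

    for (int i = 0; i < n; i++) {  /* loop body runs n times            */
        if (a[i] > max)            /* test: c3 per iteration            */
            max = a[i];            /* branch 1: c4                      */
        else
            sum = sum + a[i];      /* branch 2: c5                      */
    }
    /* Total <= c1 + c2 + n*(c3 + max(c4, c5))  =>  proportional to n   */
    return sum + max;
}

int main(void)
{
    int a[] = {3, 1, 4, 1, 5};
    printf("%d\n", rules_example(a, 5));   /* prints 10 */
    return 0;
}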

Algorithm Growth Rates

 We measure an algorithm’s time requirement as a function of the problem size.
 Problem size depends on the application: e.g. the number of elements in a list for a sorting algorithm.
 So, for instance, we say that (if the problem size is n):
 Algorithm A requires 5*n² time units to solve a problem of size n.
 Algorithm B requires 7*n time units to solve a problem of size n.
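
To make the comparison concrete (my own worked numbers, not from the slides): for n = 10, algorithm A needs 5*10² = 500 time units while algorithm B needs 7*10 = 70; for n = 100, A needs 50,000 while B needs only 700. B's advantage keeps widening as n grows, even though for n = 1 A is actually cheaper (5 versus 7 time units).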



Algorithm Growth Rates
 The most important thing to learn is how quickly the algorithm’s time requirement grows as a function of the problem size.
 Algorithm A requires time proportional to n².
 Algorithm B requires time proportional to n.
 An algorithm’s proportional time requirement is known as its growth rate.
 We can compare the efficiency of two algorithms by comparing their growth rates.



Algorithm Growth Rates (cont.)

[Figure: time requirements as a function of the problem size n]



Common Growth Rates

Function      Growth Rate Name

c             Constant
log N         Logarithmic
log² N        Log-squared
N             Linear
N²            Quadratic
N³            Cubic
2^N           Exponential
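
As a hedged sketch (illustrative code of my own, not from the slides) of what some of these growth rates typically look like in practice: a single array access is constant, binary search is logarithmic, one pass over an array is linear, and examining every pair of elements is quadratic.

/* O(1) — constant: a single array access, independent of n. */
int get_first(const int a[]) { return a[0]; }

/* O(log N) — logarithmic: binary search halves the remaining range each
   step (assumes the array is sorted in ascending order).               */
int binary_search(const int a[], int n, int key)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;
}

/* O(N) — linear: one pass over the array. */
long sum_all(const int a[], int n)
{
    long s = 0;
    for (int i = 0; i < n; i++) s += a[i];
    return s;
}

/* O(N^2) — quadratic: every pair of elements is examined. */
int count_equal_pairs(const int a[], int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j]) count++;
    return count;
}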



Running times for moderate inputs

Design & Analysis of Algorithm

End
Week – 2

The Islamia University of Bahawalpur [email protected]
