Lecture-03 - Growth of Functions

This document discusses algorithm analysis and asymptotic growth rates. It introduces key concepts:
1. Algorithms must be analyzed to determine how efficiently they scale as the input size increases.
2. The order of growth describes how an algorithm's behavior changes as the input size grows.
3. Common notations for asymptotic efficiency are Big O (upper bound), Big Omega (lower bound), and Big Theta (tight bound).
Asymptotic Growth Rate
Algorithm Analysis
An algorithm is a step-by-step procedure for solving a problem. Many different algorithms can be written to solve a single problem, so to pick the best (optimum) solution we need to analyze and compare them.
Module Code and Module Title Title of Slides
Algorithm Analysis
Any algorithm is expected to work fast for any input size n. Generally, when the input size is small, the algorithm works fine. So how do we analyze the behavior of an algorithm as the input grows? By its order of growth.
How do we analyze an algorithm?

The size of n has to be increased to analyze how well the algorithm works. Consider the values 25, 39, 40, 15, 20 and sort them in ascending order. This works fine for n = 5. What about n = 500, or n = 5,000? It may or may not still be fast.
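As a sketch of this idea (the slides do not name a specific sorting algorithm, so selection sort is assumed here), we can count comparisons directly and watch the cost grow with n:

```python
def selection_sort(values):
    """Sort values in ascending order, counting element comparisons."""
    a = list(values)
    comparisons = 0
    for i in range(len(a)):
        smallest = i
        for j in range(i + 1, len(a)):
            comparisons += 1
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, comparisons

# The slide's example: n = 5 is handled almost instantly.
print(selection_sort([25, 39, 40, 15, 20]))  # ([15, 20, 25, 39, 40], 10)

# The comparison count grows as n(n - 1)/2, so larger n costs far more.
for n in (5, 500, 5000):
    _, c = selection_sort(range(n, 0, -1))
    print(n, c)
```

Selection sort always performs n(n - 1)/2 comparisons, which makes the growth easy to see: multiplying n by 10 multiplies the work by roughly 100.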
Order of growth
The change in the behavior of an algorithm as the input size grows is called its order of growth.
Why must the size of n be high?

When we find the order of growth we choose high values of n, because:
1. An algorithm may slow down only for high values of n.
2. Real-world applications use large values of n.
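A minimal illustration of point 1, using two hypothetical cost functions (100n and n², not taken from the slides): the quadratic algorithm actually looks cheaper until n passes the crossover point.

```python
# Hypothetical cost functions (illustrative only, not from the slides):
# a linear algorithm with a large constant factor vs. a quadratic one.
def linear_cost(n):
    return 100 * n

def quadratic_cost(n):
    return n * n

for n in (10, 50, 100, 1000):
    print(n, linear_cost(n), quadratic_cost(n))

# Below the crossover at n = 100 the quadratic algorithm is cheaper;
# beyond it, the linear one wins -- the slowdown appears only at high n.
```

This is why testing only small inputs can be misleading: both functions look fine at n = 10, and only large n reveals the difference in order of growth.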
Efficiencies of an Algorithm

1. Best case
2. Average case
3. Worst case
Example: Best Case
25, 31, 42, 71, 105
If the element to be searched for is 25, it is found at the first position itself, i.e. only one comparison is done.

Cbest(n) = 1
Example: Worst Case
25, 31, 42, 71, 105
If the element to be searched for is 105, it is found only at the last position, or not found at all; i.e. n comparisons are done.

Cworst(n) = n
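The best and worst cases above can be reproduced with a short sketch (the comparison-counting helper below is hypothetical, assuming a plain sequential search over the slide's list):

```python
def sequential_search(a, key):
    """Hypothetical helper: return (index, comparisons); index is -1 if absent."""
    for i, x in enumerate(a):
        if x == key:
            return i, i + 1        # found after i + 1 comparisons
    return -1, len(a)              # key absent: all n comparisons used

data = [25, 31, 42, 71, 105]
print(sequential_search(data, 25))   # (0, 1)  -> Cbest(n) = 1
print(sequential_search(data, 105))  # (4, 5)  -> Cworst(n) = n
print(sequential_search(data, 60))   # (-1, 5) -> unsuccessful: n comparisons
```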
Example: Average Case
The situation considered is that the search element is neither in the first position nor the last (i.e. no best case or worst case), but somewhere in the middle of the list. The probability of a successful search is p, and the probability of an unsuccessful search is 1 - p, where 0 ≤ p ≤ 1. Assuming a successful search is equally likely to end at each of the n positions, the average number of comparisons is:

CAverage(n) = p(n + 1)/2 + n(1 - p)    ... (Eq. 1)
Case 1: The element is available (i.e. a successful search), so p = 1. Substituting in Eq. 1:

CAverage(n) = 1·(n + 1)/2 + n(1 - 1) = (n + 1)/2

So, on average, about half of the list is examined before a successful search finds an element.
Case 2: The element is not available (i.e. an unsuccessful search), so p = 0. Substituting in Eq. 1:

CAverage(n) = 0·(n + 1)/2 + n(1 - 0) = n

An unsuccessful search therefore costs as much as the worst case.
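Both substitutions can be checked empirically. The sketch below (assuming a successful search with the key equally likely at every position) averages the comparison counts over all n positions and recovers (n + 1)/2:

```python
def comparisons_to_find(a, key):
    """Comparisons a sequential search makes before stopping."""
    for i, x in enumerate(a):
        if x == key:
            return i + 1
    return len(a)

n = 101
a = list(range(n))

# Successful search (p = 1), key equally likely at each position:
avg = sum(comparisons_to_find(a, k) for k in a) / n
assert avg == (n + 1) / 2   # matches CAverage(n) = (n + 1)/2
print(avg)
```

The unsuccessful case needs no averaging: every miss scans all n elements, matching CAverage(n) = n for p = 0.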
Asymptotic Running Time

The running time of an algorithm as the input size approaches infinity is called its asymptotic running time. We study notations for asymptotic efficiency (orders of growth); in particular, tight bounds, upper bounds, and lower bounds.
Notations for Order of Growth

1. Big O (Oh) (upper bound)
2. Big Ω (Omega) (lower bound)
3. Big Θ (Theta) (tight bound)

Consider an algorithm whose running time is described by a function f(n), compared against a reference function g(n).
Big O (Oh)
Big O describes the maximum (longest) time an algorithm takes to complete its execution: the growth rate of f(n) will not be greater than the growth rate of g(n).

f(n) ∈ O(g(n))
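A numeric check of this definition, using a hypothetical function not taken from the slides: f(n) = 3n + 5 is in O(n), witnessed by the constants c = 4 and n0 = 5.

```python
# Hypothetical example: f(n) = 3n + 5 is in O(n), witnessed by the
# constants c = 4 and n0 = 5, since 3n + 5 <= 4n for every n >= 5.
f = lambda n: 3 * n + 5
g = lambda n: n
c, n0 = 4, 5
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) <= c*g(n) holds for all checked n >= n0")
```

Note that the bound is only required to hold beyond n0: at n = 4, f(4) = 17 already exceeds 4·4 = 16.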
Equivalently, if f(n) ∈ O(g(n)), then g(n) ∈ Ω(f(n)).
Big Ω (Omega)

Big Ω describes the minimum (least) time an algorithm takes to complete its execution: the growth rate of f(n) will not be smaller than the growth rate of g(n).

f(n) ∈ Ω(g(n))
Equivalently, if f(n) ∈ Ω(g(n)), then g(n) ∈ O(f(n)).
Big Θ (Theta)

Big Θ gives a tight bound on the time an algorithm takes to complete its execution: f(n) is bounded both above and below by constant multiples of g(n), c1·g(n) and c2·g(n). The growth rate of f(n) lies between the growth rates of c2·g(n) and c1·g(n):

c2·g(n) ≤ f(n) ≤ c1·g(n)

Since Θ is symmetric, f(n) ∈ Θ(g(n)) if and only if g(n) ∈ Θ(f(n)).
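A quick sandwich check, using the hypothetical f(n) = 3n + 5 with g(n) = n: the constants c2 = 3 and c1 = 4 satisfy c2·g(n) ≤ f(n) ≤ c1·g(n) for every n ≥ 5, so f(n) ∈ Θ(n).

```python
# Hypothetical example: f(n) = 3n + 5 with g(n) = n. The constants
# c2 = 3 and c1 = 4 give c2*g(n) <= f(n) <= c1*g(n) for every n >= 5,
# so f(n) is in Theta(n).
f = lambda n: 3 * n + 5
c1, c2, n0 = 4, 3, 5
assert all(c2 * n <= f(n) <= c1 * n for n in range(n0, 10_000))
print("c2*g(n) <= f(n) <= c1*g(n) holds for all checked n >= n0")
```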
Relations Between Θ, O, and Ω