Lecture - 08 - Asymptotic Analysis - Part 2


DESIGN AND ANALYSIS OF ALGORITHMS

Dr. Kashif Ayyub


1
The RAM Model of Computation
• Two common tools used for algorithm analysis are the RAM model of
computation and the asymptotic analysis of worst-case complexity.
• The RAM (Random Access Machine) model is used for analyzing algorithms
without running them on a physical machine.
• A simple operation (+, -, *, /, =, if) takes one time step.
• Loops and subroutines are composed of simple operations; their cost is the total number of simple operations they execute.
• Memory is unlimited and each access takes one time step (the RAM model does not distinguish between data in cache and data on disk).
• Using the RAM model, the running time of an algorithm is measured by counting the number of steps it takes on a given input, as the sketch below illustrates.
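
As a concrete illustration of step counting under the RAM model, here is a minimal Python sketch (the function name and the exact bookkeeping are my own; textbooks charge operations slightly differently, so the counts are indicative rather than canonical):

```python
def ram_step_count(arr):
    """Sum an array while tallying RAM-model 'simple operations'.

    Assumes each assignment, comparison, addition and array access
    costs one step; this accounting is illustrative, not canonical.
    """
    steps = 0
    total = 0
    steps += 1            # total = 0
    i = 0
    steps += 1            # i = 0
    while i < len(arr):
        steps += 1        # loop comparison i < len(arr)
        total += arr[i]
        steps += 2        # array access + addition/assignment
        i += 1
        steps += 1        # increment of i
    steps += 1            # final failed loop comparison
    return total, steps

if __name__ == "__main__":
    print(ram_step_count([3, 1, 4, 1, 5]))  # roughly 4n + 3 steps for n elements
```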

2
Big-O: Asymptotic Upper Bound
• Big-O notation, O(g(n)), is the formal way to express an upper bound on an algorithm's running time. It is the most commonly used notation and captures the worst-case time complexity of an algorithm.
• Asymptotically less than or equal.
• For a function g(n), we define O(g(n)), big-O of g(n), as:
• The set of all functions whose rate of growth is the same as or lower than that of g(n).
• g(n) is an asymptotic upper bound for f(n).
• Formally, f(n) = O(g(n)) if there exist constants c > 0 and n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
3
Big-O: Example
• 20n³ + 10n lg n + 5 = O(n³)
• 3 lg n + lg(lg n) = O(lg n)
• 2¹⁰⁰ = O(1)
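
To connect the first example with the definition on the previous slide, one possible (non-unique) choice of witnesses is:

  20n³ + 10n lg n + 5 ≤ 20n³ + 10n³ + 5n³ = 35n³ for all n ≥ 1,

so c = 35 and n₀ = 1 certify that 20n³ + 10n lg n + 5 = O(n³).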

4
Big-O: Properties

5
Big-: Asymptotic Tight Bound
• Asymptotic Equality
• For function g(n), we define (g(n)), big-Theta
of n, as:
• Set of all functions that have the same rate of
growth as g(n).
• g(n) is an asymptotically tight bound for f(n).
• f(n) = g(n) old concept
• f(n)  g(n) new concept

6
Big-: Example
Θ(n3): n3
5n3+ 4n
105n3+ 4n2 + 6n
Θ(n2): n2
5n2+ 4n + 6
n2 + 5
Θ(log n): log n
log n2
log (n + n3)
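
An informal way to see that each function grows at the same rate as its representative is to watch the ratio f(n)/g(n) as n grows; for a Θ relationship it stays between two positive constants. The Python sketch below (an illustration, not a proof; the function names are mine) checks this for 105n³ + 4n² + 6n against n³:

```python
# Illustrative check (not a proof): when f(n) = Theta(g(n)), the ratio
# f(n)/g(n) stays bounded between two positive constants as n grows.
def f(n):
    return 105 * n**3 + 4 * n**2 + 6 * n

def g(n):
    return n**3

for n in (10, 100, 1_000, 10_000):
    print(n, f(n) / g(n))   # ratios approach 105, so c1 = 105 and c2 = 106 work for large n
```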

7
Big-: Asymptotic Lower Bound
• Asymptotic Lower Bound
• For function g(n), we define (g(n)), Omega of
n, as:
• Set of all functions whose rate of growth is the
same as or higher than that of g(n).
• g(n) is an asymptotically lower bound for f(n).
• f(n)  g(n)

8
Big-: Example
Θ(n3): n3
5n3+ 4n
105n3+ 4n2 + 6n
Θ(n2): n2
5n2+ 4n + 6
n2 + 5
Θ(log n): log n
log n2
log (n + n3)

9
Time and Space Complexity of Insertion Sort
• To sort an array of size N in ascending order, iterate over the array and compare the current element (the key) with its predecessor. If the key is smaller than its predecessor, compare it with the elements before it.
• Move the greater elements one position up to make space for the key, then insert the key into the gap (see the sketch below).
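
A minimal Python rendering of the procedure described above (an illustrative sketch; the variable names are my own):

```python
def insertion_sort(arr):
    """Sort arr in ascending order, in place, using insertion sort."""
    for i in range(1, len(arr)):
        key = arr[i]          # current element to be placed
        j = i - 1
        # Shift elements greater than key one position to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key      # insert key into its correct position
    return arr

if __name__ == "__main__":
    print(insertion_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```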

10
Time Complexity of Insertion Sort
• Best Case: O(N)
• The best case occurs when the input array is already sorted.
• Each element is compared only with its immediate predecessor and no shifting is needed, resulting in linear time complexity.
• Average Case: O(N²)
• On average, each element is compared with (and shifted past) about half of the elements that precede it.
• Although the exact number of comparisons and shifts varies with the input, the average-case time complexity remains quadratic.
• Worst Case: O(N²)
• The worst case occurs when the input array is in reverse sorted order.
• In this scenario, each element is compared with, and shifted past, every preceding element, resulting in quadratic time complexity (see the counting sketch below).
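
To make the contrast between the best and worst cases concrete, the Python sketch below (an illustration with my own naming) counts the key comparisons on an already-sorted and a reverse-sorted array of the same size; the counts are N − 1 and N(N − 1)/2 respectively:

```python
def insertion_sort_count(arr):
    """Insertion sort that also returns the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against arr[j]
            if arr[j] > key:
                arr[j + 1] = arr[j]   # shift the larger element right
                j -= 1
            else:
                break
        arr[j + 1] = key
    return comparisons

n = 100
print(insertion_sort_count(list(range(n))))          # sorted input: n - 1 = 99 comparisons
print(insertion_sort_count(list(range(n, 0, -1))))   # reverse input: n(n-1)/2 = 4950 comparisons
```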
11
