
Introduction & Asymptotic Analysis

Prof. Sonu Gupta


Data Structures

 Representation of data and the operations allowed on that data
 Way to store and organize data to facilitate access and modifications
 Method of representing logical relationships between individual data elements related to the solution of a given problem

[Figure: examples of data structures – a matrix and a tree]
Types of Data Structure



Choice of Data Structure

 Its structure should be able to represent the relationships between data elements
 It should be simple enough to effectively do the required processing on the data
 No single data structure works well for all purposes


Abstract Data Type

 Collection of data and associated operations for manipulating that data
 The specification of an ADT tells what operations it supports, not how they are implemented
 ADTs support abstraction, encapsulation and data hiding
 Implementing an ADT involves choosing a data structure
 Core operations on an ADT
 Add an item
 Remove an item
 Find, retrieve or access an item
ADT vs Data Structures

[Figure: a program issues requests (Add, Remove, Find, Display) through the ADT interface and receives results; the "wall" of ADT operations isolates the program from the underlying data structure]
Example of Abstract Data Type

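The example on the original slide did not survive extraction. As an illustrative sketch (not the slide's own example), a stack ADT in C++ might expose only its operations while hiding the underlying data structure:

#include <stdexcept>

// A minimal stack ADT: the interface exposes operations only;
// the choice of data structure (here, a fixed-size array) stays hidden.
class IntStack {
public:
    void push(int value) {                // add an item
        if (count == MAX) throw std::overflow_error("stack full");
        data[count++] = value;
    }
    int pop() {                           // remove an item
        if (count == 0) throw std::underflow_error("stack empty");
        return data[--count];
    }
    int top() const {                     // access an item
        if (count == 0) throw std::underflow_error("stack empty");
        return data[count - 1];
    }
    bool empty() const { return count == 0; }

private:
    static const int MAX = 100;           // implementation detail, invisible to callers
    int data[MAX];
    int count = 0;
};

A caller works only with push, pop, top and empty; switching the internal array for a linked list would not change the interface.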


Algorithm

A well-defined, finite, step-by-step procedure, independent of any programming language, to achieve a required result. It has the following properties:
• Input
 Zero or more values
• Output
 At least one value
• Definiteness
 Each instruction is precise and unambiguous
• Finiteness
 Should terminate after a finite number of steps
• Feasibility
 Should be feasible with the available resources
Examples of an Algorithm

 Problem − Design an algorithm to add two numbers and display the result.

Version 1:
step 1 − START
step 2 − declare three integers a, b & c
step 3 − define values of a & b
step 4 − add values of a & b
step 5 − store output of step 4 to c
step 6 − print c
step 7 − STOP

Version 2:
step 1 − START ADD
step 2 − get values of a & b
step 3 − c ← a + b
step 4 − display c
step 5 − STOP


Pseudocode

 Pseudocode is a description of an algorithm that is more structured than usual prose but less formal than a programming language.
 Example: finding the maximum element of an array.
Algorithm arrayMax(A, n):
Input: An array A storing n integers.
Output: The maximum element in A.
1 largest ← A[0]
2 for i ← 1 to n − 1 do
2.1 if largest < A[i] then
2.1.1 largest ← A[i]
3 return largest
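For comparison, a direct C++ rendering of this pseudocode (the surrounding main is only illustrative):

#include <iostream>

// Direct translation of the arrayMax pseudocode above.
int arrayMax(const int A[], int n) {
    int largest = A[0];                  // step 1
    for (int i = 1; i <= n - 1; ++i)     // step 2
        if (largest < A[i])              // step 2.1
            largest = A[i];              // step 2.1.1
    return largest;                      // step 3
}

int main() {
    int A[] = {3, 9, 2, 7, 5};
    std::cout << arrayMax(A, 5) << '\n'; // prints 9
    return 0;
}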
Performance Analysis of Algorithms
Algorithms to Find Biggest of 3 Numbers

Algorithm 1:
big = a
if (b > big)
    big = b
if (c > big)
    big = c
return big

Algorithm 2:
if (a > b) {
    if (a > c)
        return a
    else
        return c
} else {
    if (b > c)
        return b
    else
        return c
}
Algorithmic Efficiency

 More than one algorithm may exist for solving a problem
 One algorithm might be more efficient than the others


Performance Analysis of Algorithm

 Space Complexity

 Time Complexity



Space Complexity

 Amount of memory needed by a program to run to completion
 Components of space complexity (see the sketch below)
• Instruction space
• Data space
 Space needed by constants, variables and dynamically allocated objects
• Environment stack space
 Information needed to resume execution of partially completed functions, e.g. return address, values of local variables and formal parameters
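As an illustrative sketch (not from the slides), the two sum functions below use the same data space but differ in environment stack space: the recursive version pushes a new activation record for every call.

// Iterative sum: data space is a few scalars; no extra stack frames are needed.
int sumIterative(const int a[], int n) {
    int s = 0;
    for (int i = 0; i < n; ++i)
        s += a[i];
    return s;
}

// Recursive sum: each call adds a frame (return address, parameters, locals)
// to the environment stack, so stack space grows proportionally to n.
int sumRecursive(const int a[], int n) {
    if (n == 0) return 0;
    return a[n - 1] + sumRecursive(a, n - 1);
}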
Time Complexity

 Amount of time a program needs to run to completion
 Actual running time varies from system to system
 Two ways to calculate time complexity
 Operation count: identify one or more key operations and determine how many times they are performed (pick the operations that contribute most to the running time)
 Step count: determine the total number of steps executed by the program


Example to Calculate Time Complexity

algo sum()
{
    s = 0                ---------  1
    for i = 1 to n       ---------  n + 1
        s = s + a[i]     ---------  n
    return s             ---------  1
}

Total number of steps: 2n + 3
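A minimal C++ equivalent of this step count (the array a and size n are assumed to be supplied by the caller; the per-line counts mirror the slide):

// Step-count annotation of the sum routine above.
int sum(const int a[], int n) {
    int s = 0;                    // 1 step
    for (int i = 1; i <= n; ++i)  // loop test executes n + 1 times
        s += a[i - 1];            // body executes n times (a is 0-indexed in C++)
    return s;                     // 1 step
}                                 // total: 2n + 3 steps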


Example to Calculate Time Complexity

algo sum()
{
    for i = 1 to n           ---------  n + 1
        for j = 1 to n       ---------  n(n + 1)
            cout << i*j;     ---------  n * n
}

Total number of steps: 2n² + 2n + 1


Asymptotic Analysis
Growth of Function with Input Size

 Rate of growth of the running time − a function grows with its input size
 Ex. A program for input size n takes 6n² + 100n + 300 time units. As n increases, 6n² becomes much larger than 100n + 300
 Thus, it is important to analyze the performance of an algorithm as the input size increases
n     n²     n² − n
1     1      0
2     4      2
3     9      6
4     16     12
5     25     20
6     36     30
7     49     42
8     64     56
9     81     72
10    100    90
11    121    110
12    144    132
13    169    156
14    196    182
15    225    210
16    256    240
17    289    272
18    324    306
19    361    342
20    400    380

As n increases, n² becomes much, much larger than n.
Worst, Average and Best Cases

 Worst Case Analysis
 Maximum time required for program execution
 Calculates an upper bound on the running time of an algorithm
 Ex. in linear search, the element is not present
 Usually done
 Average Case Analysis
 Average time required for program execution
 Sometimes done
 Best Case Analysis
 Minimum time required for program execution
 Calculates a lower bound on the running time of an algorithm
 Ex. in linear search, the element is present in the first location
 Never done
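A linear search sketch in C++ (illustrative, not from the slides): the best case returns after one comparison, the worst case scans all n elements.

// Linear search: returns the index of key in a[0..n-1], or -1 if absent.
// Best case:  key == a[0]      -> 1 comparison.
// Worst case: key not present  -> n comparisons.
int linearSearch(const int a[], int n, int key) {
    for (int i = 0; i < n; ++i)
        if (a[i] == key)
            return i;
    return -1;
}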
Worst, Average and Best Cases

[Chart: running time (1–5 ms) for inputs A–G, with worst-case, average-case and best-case curves]


Asymptotic Analysis

 In asymptotic analysis, we evaluate the performance of an algorithm in terms of its input size
 It describes how the time (or space) taken by an algorithm grows with the input size
 Asymptotic notations are mostly used to represent time complexity
 O − Big Oh
 Ω − Omega
 Θ − Theta


O-Notation

 For functions f(n) and g(n), we say that f(n) = O(g(n)) if and only if there are positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
 g(n) should be as small as possible
 Used for worst case analysis (upper bound)

O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }


Ω-Notation (Lower Bound)

 For functions f(n) and g(n), we say that f(n) = Ω(g(n)) if and only if there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0
 g(n) should be as large as possible
 Used for best case analysis (lower bound)

Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }


Θ-Notation

 For functions f(n) and g(n), we say that f(n) = Θ(g(n)) if there exist positive constants n0, c1 and c2 such that f(n) always lies between c1·g(n) and c2·g(n) for n ≥ n0
 g(n) is both an upper and a lower bound of f(n)
 Used for average case analysis

Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }


 f(n) = 4n + 200
f(n) is O(n) since 4n + 200 ≤ 5n for all n ≥ 200 (c = 5, n0 = 200)

n      4n+200   5n
1      204      5
2      208      10
3      212      15
4      216      20
5      220      25
…      …        …
199    996      995
200    1000     1000
201    1004     1005
202    1008     1010

 f(n) = 10n² + 4n + 2
f(n) is O(n²) since 10n² + 4n + 2 ≤ 11n² for all n ≥ 5 (c = 11, n0 = 5)


Calculation of Time Complexity

 Drop lower order terms and constant factors
 Remove the coefficient of the highest order term
 Examples
 7n − 3 is O(n)
 10n³ + 100n² + 11n + 900 is O(n³)


Common Time Complexities

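The table on this slide did not survive extraction; the complexities usually listed, in increasing order of growth, are:

O(1)        constant        e.g. array indexing
O(log n)    logarithmic     e.g. binary search
O(n)        linear          e.g. linear search
O(n log n)  linearithmic    e.g. merge sort
O(n²)       quadratic       e.g. bubble sort
O(2ⁿ)       exponential     e.g. exhaustive subset enumeration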


Disadvantages of Asymptotic Analysis

 Can mislead when problem instances are small
 Cannot distinguish two algorithms with the same tight bound (e.g. 1000n² and 2n² are both O(n²))
 We often make simplifying assumptions when analyzing, and these do not always hold


Algorithm Design Methods

 Divide and Conquer
 Backtracking Method
 Dynamic Programming
 Greedy Method
 Brute Force
 Branch and Bound


Divide and Conquer

 Based on dividing the problem into sub-problems
 Approach
• Divide the problem into smaller sub-problems
 Sub-problems must be of the same type
 Sub-problems do not need to overlap
• Solve each sub-problem recursively
• Combine the solutions to solve the original problem
 Usually contains two or more recursive calls
 Ex. merge sort, quick sort, binary search (see the sketch below)
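As a small illustration of divide and conquer (not from the slides), recursive binary search splits the sorted range in half and recurses into one side; it is the simplest of the slide's examples, though unlike merge sort it makes only one recursive call per level.

// Recursive binary search in a sorted array a[lo..hi].
// Divide: pick the middle element. Conquer: recurse into one half.
// Returns the index of key, or -1 if key is not present.
int binarySearch(const int a[], int lo, int hi, int key) {
    if (lo > hi) return -1;               // empty range: not found
    int mid = lo + (hi - lo) / 2;         // avoids overflow for large lo + hi
    if (a[mid] == key) return mid;
    if (key < a[mid])
        return binarySearch(a, lo, mid - 1, key);   // search left half
    return binarySearch(a, mid + 1, hi, key);       // search right half
}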




Dynamic Programming

 Similar to divide and conquer, but sub-problems must overlap
 Based on remembering past results
 Approach
• Divide the problem into smaller sub-problems
 Sub-problems must be of the same type
 Sub-problems must overlap
• Solve each sub-problem recursively
 Can reuse a stored solution
• Combine the solutions to solve the original problem (see the sketch below)
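A minimal sketch of the "remembering past results" idea (illustrative, not from the slides): Fibonacci with memoization stores each sub-problem's answer so overlapping sub-problems are solved only once.

#include <vector>

// Fibonacci with memoization: overlapping sub-problems fib(k) are
// computed once, stored in memo, and reused on later calls.
long long fib(int n, std::vector<long long>& memo) {
    if (n <= 1) return n;                 // base cases
    if (memo[n] != -1) return memo[n];    // reuse a stored solution
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo);
    return memo[n];
}

long long fib(int n) {
    std::vector<long long> memo(n + 1, -1);
    return fib(n, memo);
}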


Backtracking Method

 Considers searching every possible combination in order to solve an optimization problem
 Approach
• Make any possible move
• If a solution is found, return it
• Else backtrack and select another move
• If no move remains, return failure
 Ex. N-Queens problem, maze problem (see the sketch below)
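An illustrative backtracking sketch (not from the slides): counting N-Queens solutions by placing one queen per row and undoing a placement when it leads nowhere.

#include <cstdlib>
#include <vector>

// Backtracking N-Queens: col[r] is the column of the queen in row r.
// Place a queen in each row; if every column in a row is blocked,
// backtrack to the previous row and try its next column.
static bool safe(const std::vector<int>& col, int r, int c) {
    for (int i = 0; i < r; ++i)
        if (col[i] == c || (r - i) == std::abs(col[i] - c))
            return false;                 // same column or same diagonal
    return true;
}

int countSolutions(std::vector<int>& col, int r, int n) {
    if (r == n) return 1;                 // all queens placed: one solution
    int count = 0;
    for (int c = 0; c < n; ++c) {
        if (safe(col, r, c)) {
            col[r] = c;                   // make a move
            count += countSolutions(col, r + 1, n);
            col[r] = -1;                  // backtrack: undo the move
        }
    }
    return count;
}

For example, with col initialized to eight entries of −1, countSolutions(col, 0, 8) returns 92.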




Greedy Method

 Based on trying the best current (local) choice
 Approach
• At each step of the algorithm, choose the best local solution
• Avoid backtracking
 Examples − minimum spanning tree algorithms (Kruskal, Prim), Dijkstra's shortest path, Huffman coding (see the sketch below)
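The slide's examples (MST, Dijkstra, Huffman) are too long to sketch here; a smaller greedy example with the same flavor is activity selection, which repeatedly makes the locally best choice (the compatible activity that finishes earliest) and never backtracks.

#include <algorithm>
#include <vector>

struct Activity { int start, finish; };

// Greedy activity selection: sort by finish time, then repeatedly take
// the earliest-finishing activity compatible with the last one chosen.
int maxNonOverlapping(std::vector<Activity> acts) {
    std::sort(acts.begin(), acts.end(),
              [](const Activity& a, const Activity& b) { return a.finish < b.finish; });
    int count = 0, lastFinish = -1;
    for (const Activity& a : acts) {
        if (a.start >= lastFinish) {      // locally best compatible choice
            ++count;
            lastFinish = a.finish;
        }
    }
    return count;
}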


Brute Force

 Based on trying all possible solutions
 Approach
• Generate and evaluate possible solutions until
 a satisfactory solution is found,
 the best solution is found (if it can be determined), or
 all possible solutions have been examined
 Return the best solution
 Return failure if there is no satisfactory solution
 Generally the most expensive approach


Branch and Bound

 Based on limiting the search using the best solution found so far
 Approach
• Track the best current solution found
• Eliminate partial solutions that cannot improve upon the best current solution
• Reduces the amount of backtracking


Heuristic

 Based on trying to guide the search for a solution
 Heuristic ⇒ "rule of thumb"
 Approach
• Generate and evaluate possible solutions
 using a "rule of thumb"
 stop if a satisfactory solution is found
 Can reduce complexity
