
Algorithms

Performance Analysis
Searching Techniques
Sorting Techniques
Algo Design Techniques
Algorithm

An algorithm is a step-by-step procedure for performing some task in a finite amount of time.
Algorithm

Algorithm arrayMax(A, n):
Input: An array A storing n >= 1 integers.
Output: The maximum element in A.

currentMax ← A[0]
for i ← 1 to n − 1 do
    if currentMax < A[i] then
        currentMax ← A[i]
return currentMax
Methodologies for Analyzing Algorithms

Running time
depends on the input size, the particular input, the hardware environment (processor, clock speed, memory, disk type) and the software environment (operating system, programming language, compiler, interpreter).
Methodologies for Analyzing Algorithms

1. Pseudo-Code
A mixture of natural language and high-level programming constructs that describes the main ideas behind a generic implementation of a data structure or algorithm.
Methodologies for Analyzing Algorithms

2. The Random Access Machine ( RAM) Model


Here we define a set of high-level primitive operations
that are largely independent from the programming
language.
Primitive operation includes:
Assigning a value to a variable
Calling a method
Performing an arithmetic operation
Comparing two numbers
Indexing into an array
Following an object reference
Returning from a method
Methodologies for Analyzing Algorithms
3. Counting Primitive Operations

currentMax ← A[0]              // 2 (indexing and assigning)
for i ← 1 to n − 1 do          // 1 (initializing i)
                               // n (condition checks i < n)
    if currentMax < A[i] then  // 4 (indexing, comparing, incrementing i, assigning i)
        currentMax ← A[i]      // 2 (indexing and assigning)
return currentMax              // 1 (return)

Primitive operation count:
2 + 1 + n + 4(n − 1) + 1 = 5n       (best case: the if-branch never executes)
2 + 1 + n + 6(n − 1) + 1 = 7n − 2   (worst case: the if-branch executes on every iteration)
Methodologies for Analyzing Algorithms

4. Analyzing Recursive Algorithms

Algorithm recursiveMax(A, n):
Input: An array A storing n >= 1 integers.
Output: The maximum element in A.

if n = 1 then
    return A[0]
return max { A[n − 1], recursiveMax(A, n − 1) }
Methodologies for Analyzing Algorithms

5. Analyzing Recursive Algorithms

Recurrence equation for recursiveMax:

T(n) = 3              if n = 1
T(n) = T(n − 1) + 7   otherwise

Unrolling gives T(n) = 7(n − 1) + 3 = 7n − 4, so recursiveMax runs in O(n) time.
Asymptotic Notation

Asymptotic notation is a language that allows us to analyse an algorithm's running time by describing its behaviour as the input size increases. This is known as the algorithm's growth rate.
Asymptotic Notation

Types of Asymptotic Notation:


1. Big- O
2. Small- o
3. Big-Omega Ω
4. Small-omega ω
5. Theta Θ
Asymptotic Notation

Types of Asymptotic Notation:

1. Big-O: upper bound (conventionally used for the worst case)
2. Small-o: strict upper bound
3. Big-Omega Ω: lower bound (conventionally used for the best case)
4. Small-omega ω: strict lower bound
5. Theta Θ: tight bound (conventionally used for the average case)

Strictly speaking, these notations bound a function's growth rate; best-, worst- and average-case running times can each be described with any of them, but the pairings above are the common convention.
Types of Algorithm complexity

Two types of Complexity:


1. Space Complexity
2. Time Complexity
Searching Techniques

Two types of Searching Techniques:


1. Linear Search
2. Binary Search
Linear Search

3 , 2, 6, 1, 9, 0, 8, 7, 5, 4
Linear Search
3 , 2, 6, 1, 9, 0, 8, 7, 5, 4

#define MAX 10

void lSearch(int *p, int num)
{
    int i;
    for (i = 0; i < MAX; i++) {
        if (p[i] == num) {
            printf("element %d is present at position %d.\n", num, i + 1);
            return;
        }
    }
    printf("element %d is not present in the array\n", num);
}
Linear Search

3 , 2, 6, 1, 9, 0, 8, 7, 5, 4

Time complexity: Ω(1) (best case)
                 O(n) (worst case)
                 Θ(n) (average case)
Binary Search

10 , 11, 12, 13, 14, 15, 16, 17, 18, 19


Binary Search
int bSearch(int *p, int start, int end, int x)
{
while (start <= end)
{
int mid = start + (end-start)/2;
if (p[mid] == x)
return mid;
if (p[mid] < x)
start = mid + 1;
else
end = mid - 1;
}

return -1;
}
Binary Search
int binarySearch(int arr[], int start, int end, int x)
{
if (end >= start)
{
int mid = start + (end - start)/2;

if (arr[mid] == x) return mid;

if (arr[mid] > x) return binarySearch(arr, start, mid-1, x);


return binarySearch(arr, mid+1, end, x);
}
return -1;
}
Binary Search
0 , 1, 2, 3, 4, 5, 6, 7, 8, 9

Time complexity: Ω(1) (best case)
                 O(log n) (worst case)
                 Θ(log n) (average case)
Sorting Techniques

Two types of Sorting Techniques:


1. Internal Sorting
2. External Sorting
Sorting Techniques
Internal Sorting : All the data to sort is stored in memory
at all the times while sorting is in progress.

Examples:
Bubble Sort
Selection Sort
Insertion Sort
Quick Sort
Heap Sort
Bucket Sort or Bin Sort
Radix Sort
Sorting Techniques
External Sorting : In external sorting data is stored
outside main memory (like disk) and only loaded in to
memory in small chunks

Examples:
Merge Sort
Multiway Merge Sort
Polyphase Sorting
Bubble sort

3 , 2, 6, 1, 9, 8
Bubble sort
3 , 2, 6, 1, 9, 8

Space complexity: O(1)
Time complexity: O(n²) (worst case)
                 Θ(n²) (average case)
                 Ω(n) (best case)
Selection sort

3 , 2, 6, 1, 9, 8
Selection sort
3 , 2, 6, 1, 9, 8

Space complexity: O(1)
Time complexity: O(n²) (worst case)
                 Θ(n²) (average case)
                 Ω(n²) (best case)
Insertion sort

3 , 2, 6, 1, 9, 4
Insertion sort
3 , 2, 6, 1, 9, 4

Space complexity: O(1)
Time complexity: O(n²) (worst case)
                 Θ(n²) (average case)
                 Ω(n) (best case)
Merge sort
3 , 2, 6, 1, 9 0, 8, 7, 5, 4

1 , 2, 3, 6, 9 0, 4, 5, 7, 8
Merge Sort

3, 2, 6, 1, 9, 0, 8, 7
Merge Sort
3, 2, 6, 1, 9, 0, 8, 7
/* recursively split arr[low..high] into halves; mergeSort()
   merges the two sorted halves back together */
void partition(int arr[], int low, int high)
{
    int mid;

    if (low < high) {
        mid = (low + high) / 2;
        partition(arr, low, mid);
        partition(arr, mid + 1, high);
        mergeSort(arr, low, mid, high);
    }
}
Merge Sort
Complexity

Space complexity: O(n)
Time complexity: O(n log n) (worst case)
                 Θ(n log n) (average case)
                 Ω(n log n) (best case)
Quick sort

3 , 2, 6, 1, 9, 4
Quick sort
3 , 2, 6, 1, 9, 4

void quicksort ( int a[ ], int lower, int upper )


{
int i ;
if ( upper > lower )
{
i = split ( a, lower, upper ) ;
quicksort ( a, lower, i - 1 ) ;
quicksort ( a, i + 1, upper ) ;
}
}
Quick sort
3 , 2, 6, 1, 9, 4

Complexity:
Time complexity: O(n²) (worst case)
                 Θ(n log n) (average case)
                 Ω(n log n) (best case)
Space complexity: in-place, but the recursion uses O(log n) auxiliary stack space on average (O(n) in the worst case)
Bin sort or Bucket sort

.70, .53, .23, .42, .21, .29, .61, .51, .64, .91, .22
Bin sort or Bucket sort
.70, .53, .23, .42, .21, .29, .61, .51, .64, .91, .22

Complexity:
Space complexity: O(n + k)
Time complexity: Θ(n + k) (average case)
                 O(n²) (worst case: all keys fall into one bucket)
                 Ω(n + k) (best case)
Radix Sort

333 , 011, 010, 001, 300, 312, 414, 111


Radix Sort
333 , 011, 010, 001, 300, 312, 414, 111
Complexity:
Space complexity: O(n + k)
Time complexity: Θ(nk) (average case)
                 O(nk) (worst case)
                 Ω(nk) (best case)
where k is the number of digits in the keys.
Multi-way Merge Sorting
3, 12, 6, 15, 1, 9, 2, 0, 8, 7, 23, 5, 4 , 13
(assume that we have 3 tapes and we can process 3 records at a time)
Multi-way Merge Sorting
3, 12, 6, 15, 1, 9, 2, 0, 8, 7, 23, 5, 4 , 13
(assume that we have 3 tapes and we can process 3 records at a time)

Time complexity: O(N log(N/M)) (worst case)
where N is the total number of records and M is the number of records that fit in memory.
Heap Sort

3 , 2, 6, 1, 9, 4
Heap Sort
3 , 2, 6, 1, 9, 4

Space complexity: O(1)
Time complexity: Θ(n log n) (average case)
                 O(n log n) (worst case)
                 Ω(n log n) (best case)
Algorithm Design Paradigms

Types of Design Paradigms:

1. Brute force
2. Greedy algorithms
3. Divide-and-conquer or decrease-and-conquer
4. Dynamic programming
5. Transform-and-conquer
6. Backtracking and branch-and-bound
Algorithm Design Paradigms
Brute force: a straightforward approach to solving a problem, based directly on the problem's statement and the definitions of the concepts involved. It is considered one of the easiest approaches to apply and is useful for solving small-size instances of a problem.

Computing a^n (a > 0, n a nonnegative integer) by multiplying a * a * … * a
Computing n!
Selection sort
Bubble sort
Linear search
Algorithm Design Paradigms
Greedy Algorithms: based on the "take what you can get now" strategy. The solution is constructed through a sequence of steps, each expanding the partially constructed solution obtained so far. At each step the choice must be locally optimal; this is the central point of the technique.

Minimal spanning tree


Shortest distance in graphs
Algorithm Design Paradigms
Divide-and-Conquer: with the divide-and-conquer method the size of the problem instance is reduced by a factor (e.g. to half the input size), while with the decrease-and-conquer method the size is reduced by a constant.

Binary search in a sorted array (recursion)


Mergesort algorithm, Quicksort algorithm (recursion)
Algorithm Design Paradigms
Decrease-and-conquer:

Insertion sort
Binary Tree traversals: inorder, preorder and postorder
(recursion)
Computing the length of the longest path in a binary tree
(recursion)
Computing Fibonacci numbers (recursion)
Reversing a queue (recursion)
Warshall’s algorithm (recursion)
Algorithm Design Paradigms
Dynamic Programming: a bottom-up technique in which the smallest sub-instances are explicitly solved first, and their results are used to construct solutions to progressively larger sub-instances.

In contrast, divide-and-conquer is a top-down technique which logically progresses from the initial instance down to the smallest sub-instance via intermediate sub-instances.

Fibonacci numbers computed by iteration
Warshall's algorithm implemented by iteration
Algorithm Design Paradigms
Transform-and-Conquer: these methods work as two-stage procedures. First, the problem is modified to be more amenable to solution. In the second stage the problem is solved.

Types of problem modifications


1. Problem simplification, e.g. presorting
Example: consider the problem of finding the two closest numbers in an array of numbers.
Brute force solution: compare every pair, O(n²)
Algorithm Design Paradigms
Transform-and-conquer solution: O(n log n)
Presort the array: O(n log n)
Scan the array comparing adjacent differences: O(n)

2. Change in the representation


Example: AVL trees guarantee O(log n) search time

3. Problem reduction
Example: least common multiple
lcm(m,n) = (m*n)/ gcd(m,n)
Algorithm Design Paradigms
Backtracking: the method is used for state-space search problems. State-space search problems are problems whose representation consists of:
- an initial state
- goal state(s)
- a set of intermediate states
- a set of operators that transform one state into another; each operator has preconditions and postconditions
- a cost function that evaluates the cost of the operations (optional)
- a utility function that evaluates how close a given state is to the goal state (optional)
Algorithm Design Paradigms

Example: The following problems can be solved using


state-space search techniques:
Algorithm Design Paradigms
Problem 1:
A farmer has to move a goat, a cabbage and a wolf from one side of a river to the other using a small boat. The boat can carry only the farmer and one more object (either the goat, or the cabbage, or the wolf). If the farmer leaves the goat alone with the wolf, the wolf will kill the goat. If the goat is alone with the cabbage, it will eat the cabbage. How can the farmer move all his property safely to the other side of the river?
Algorithm Design Paradigms

Problem 2:
You are given two jugs, a 4-gallon one and a 3-gallon
one. Neither has any measuring markers on it. There is a
tap that can be used to fill the jugs with water. How can
you get exactly 2 gallons of water into the 4-gallon jug?
Algorithm Design Paradigms

Backtracking uses depth-first search


Algorithm Design Paradigms

Branch and bound: branch-and-bound is used when we can evaluate each node using the cost and utility functions. At each step we choose the best node to proceed further. Branch-and-bound algorithms are implemented using a priority queue, and the state-space tree is built in a breadth-first manner.
Algorithm Design Paradigms

Example: the 8-puzzle problem. The cost function is the number of moves. The utility function evaluates how close a given state of the puzzle is to the goal state, e.g. by counting how many tiles are out of place.
Depth First Search
Breadth First Search
