
Data Structures and Algorithms

CS2002

Algorithms
Algorithm Types
• Algorithm types we will consider include:
– Simple recursive algorithms
– Backtracking algorithms
– Divide and conquer algorithms
– Dynamic programming algorithms
– Greedy algorithms
– Branch and bound algorithms
– Brute force algorithms
– Randomized algorithms
Divide and Conquer
• A divide and conquer algorithm consists of
two parts:
– Divide the problem into smaller subproblems
of the same type, and solve these subproblems
recursively
– Combine the solutions to the subproblems into
a solution to the original problem
• Traditionally, an algorithm is only called
divide and conquer if it contains two or
more recursive calls
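As a small warm-up illustration (not from the slides), the sketch below finds the maximum of an array with exactly the divide, conquer, combine structure just described, using two recursive calls. The name maxOf is illustrative.

// Divide-and-conquer warm-up: maximum of a[left..right].
static int maxOf(int[] a, int left, int right) {
    if (left == right) return a[left];          // base case: a single element
    int mid = (left + right) / 2;               // divide the range in half
    int leftMax = maxOf(a, left, mid);          // conquer the left half
    int rightMax = maxOf(a, mid + 1, right);    // conquer the right half
    return Math.max(leftMax, rightMax);         // combine the two sub-solutions
}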
Divide & Conquer Example: Quick Sort
(ref 7.3 Data Structures in C++)
• The quick sort scheme developed by C. A. R. Hoare has the best average
behavior among all the sorting methods we shall study.
• Given records (R0, R1, …, Rn-1), let Ki denote the key chosen as the pivot.
• If Ki is placed in position s(i), then Kj ≤ Ks(i) for all j < s(i), and
Kj ≥ Ks(i) for all j > s(i).
• After the pivot has been positioned, the original file is partitioned into
two subfiles, {R0, …, Rs(i)-1} and {Rs(i)+1, …, Rn-1}, separated by Rs(i),
and these subfiles are then sorted independently.
Quick Sort
• Quick Sort Concept
– select a pivot key
– move the elements so that all keys smaller than the pivot come before it
and all larger keys come after it
– the original file is thereby partitioned into two subfiles, which are then
sorted independently
Quick Sort Program
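The program itself is not reproduced on this slide. Below is a minimal Java sketch of the partition-based quicksort described above; the method name quickSort and the choice of the leftmost key as the pivot are illustrative, not prescribed by the text.

// A minimal quicksort sketch following the scheme described above.
public static void quickSort(int[] a, int left, int right) {
    if (left >= right) return;                   // zero or one element: already sorted
    int pivot = a[left];                         // take the leftmost key as the pivot
    int i = left, j = right + 1;
    while (true) {
        do { i++; } while (i <= right && a[i] < pivot);  // scan right for a key >= pivot
        do { j--; } while (a[j] > pivot);                // scan left for a key <= pivot
        if (i >= j) break;
        int t = a[i]; a[i] = a[j]; a[j] = t;             // exchange out-of-place keys
    }
    int t = a[left]; a[left] = a[j]; a[j] = t;           // place the pivot at position s(i)
    quickSort(a, left, j - 1);                   // sort the left subfile
    quickSort(a, j + 1, right);                  // sort the right subfile
}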
Trace of the (left, right) boundaries of the successive calls on a
10-element list:
(left = 0, right = 9) → (0, 4), (0, 1), (3, 4), (6, 9), (6, 7), (9, 9)
• Analysis for Quick Sort
– Assume that each time a record is positioned, the list is divided into
two parts of roughly equal size.
– Positioning the pivot in a list of n elements takes O(n) time.
– Let T(n) be the time taken to sort n elements. Then, for some constant c,
T(n) ≤ cn + 2T(n/2)
     ≤ cn + 2(cn/2 + 2T(n/4)) = 2cn + 4T(n/4)
     ...
     ≤ cn log2 n + nT(1) = O(n log n)
• Time complexity
– Average case and best case: O(n log n)
– Worst case: O(n²)
– The best internal sorting method with respect to average-case behavior
• Unstable
• Lemma 7.1:
– Let Tavg(n) be the expected time for quicksort to sort a file with n
records. Then there exists a constant k such that Tavg(n) ≤ k·n·loge n
for n ≥ 2.
• Stack space complexity:
– Average case and best case: O(log n)
– Worst case: O(n)
Homework
1. Show how Quick sort would work for the following sequence of numbers:
19 7 32 2 68 13 58 17 46 17
Heap Sort
• The challenges of merge sort
– The merge sort requires additional storage
proportional to the number of records in the file
being sorted.
– By using an O(1)-space merge algorithm, the space requirement can be
reduced to O(1), but the resulting sort is significantly slower than the
original.
Heap Sort
• adjust
– adjust the binary tree to establish the heap
– This function takes a binary tree T, whose left and right subtrees satisfy
the heap property but whose root may not, and adjusts T so that the entire
binary tree satisfies the heap property (a sketch follows below)
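A minimal Java sketch of such an adjust routine is given below. It uses 0-based array indexing, with the children of node i at positions 2i+1 and 2i+2; this differs from the textbook's 1-based indexing, and the names are illustrative.

// Sift the key at index root down until the subtree rooted there satisfies
// the max-heap property. Assumes both subtrees of root are already heaps.
// n is the number of elements currently in the heap (0-based indexing).
static void adjust(int[] a, int root, int n) {
    int key = a[root];
    int child = 2 * root + 1;                    // left child of root
    while (child < n) {
        if (child + 1 < n && a[child] < a[child + 1])
            child++;                             // pick the larger of the two children
        if (key >= a[child]) break;              // heap property already holds here
        a[(child - 1) / 2] = a[child];           // move the larger child up one level
        child = 2 * child + 1;                   // continue one level down
    }
    a[(child - 1) / 2] = key;                    // drop the root key into the vacated slot
}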
Heap Sort
To sort a list in ascending order, we first create a max heap (in a bottom-up
manner) by applying the adjust function repeatedly. Next we make n-1 passes
over the list, each time exchanging the first record in the heap with the last.
Since the first record always contains the largest key, that record is now in
its sorted position. We then decrement the heap size and readjust the heap,
as sketched below.
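A heap sort driver built on the adjust sketch above might look like this; it is an illustrative sketch, not the textbook's program.

// Heap sort driver using the adjust sketch above (0-based indexing).
static void heapSort(int[] a) {
    int n = a.length;
    // Phase 1: build a max heap bottom-up, starting at the last internal node.
    for (int i = n / 2 - 1; i >= 0; i--)
        adjust(a, i, n);
    // Phase 2: repeatedly move the largest remaining key to the end of the array.
    for (int heapSize = n; heapSize > 1; heapSize--) {
        int t = a[0]; a[0] = a[heapSize - 1]; a[heapSize - 1] = t;  // swap first and last
        adjust(a, 0, heapSize - 1);              // restore the heap on the shortened prefix
    }
}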
Heap Sort Example:
Input list : 26, 5, 77, 1, 61, 11, 59, 15, 48, 19
Heap Sort Analysis
– O(n log n)
– In place
– Unstable
Merge Sort
(ref 7.5 Data Structures in C++)

• Before looking at the merge sort algorithm to sort n records, let us see
how one may merge two sorted lists to get a single sorted list.
• Merging
– The first merging method, Program 7.7, uses O(n) additional space.
– It merges the sorted lists (list[i], …, list[m]) and (list[m+1], …, list[n])
into a single sorted list, (sorted[i], …, sorted[n]).
// Merge the sorted runs initList[left..m] and initList[m+1..n]
// into mergedList[left..n]. Uses O(n) additional space.
public void merge(int[] initList, int[] mergedList, int left, int m, int n)
{
  int i1 = left, iResult = left, i2 = m + 1;
  for ( ; i1 <= m && i2 <= n; iResult++)
  {
    if (initList[i1] <= initList[i2])
    {
      mergedList[iResult] = initList[i1];
      i1++;
    }
    else
    {
      mergedList[iResult] = initList[i2];
      i2++;
    }
  }
  // Copy whichever run still has records left.
  if (i1 > m)
  {
    for (int t = i2; t <= n; t++)
      mergedList[iResult + t - i2] = initList[t];
  }
  else
  {
    for (int t = i1; t <= m; t++)
      mergedList[iResult + t - i1] = initList[t];
  }
}
Iterative Merge Sort
1. We assume that the input sequence has n
sorted lists, each of length 1.
2. We merge these lists pairwise to obtain n/2
lists of size 2.
3. We then merge the n/2 lists pairwise, and so
on, until a single list remains.
4. It works bottom-up
void MergePass(Element *initList, Element *resultList, const int n, const int l);

void MergeSort(Element *list, const int n)
{
  // Positions 1..n of the arrays are used, so allocate n+1 slots.
  Element *tempList = new Element[n + 1];
  // l is the length of the sublists currently being merged.
  for (int l = 1; l < n; l *= 2)
  {
    MergePass(list, tempList, n, l);
    l *= 2;
    MergePass(tempList, list, n, l);   // interchange the roles of list and tempList
  }
  delete [] tempList;
}

// One pass of merge sort: adjacent pairs of sorted sublists of length l
// are merged from initList[1..n] into resultList[1..n].
void MergePass(Element *initList, Element *resultList, const int n, const int l)
{
  int i;
  for (i = 1; i <= n - 2*l + 1; i += 2*l)
  {
    merge(initList, resultList, i, i + l - 1, i + 2*l - 1);
  }
  // Merge the remaining records, whose total length is less than 2*l.
  if ((i + l - 1) < n)
    merge(initList, resultList, i, i + l - 1, n);
  else
    for (int t = i; t <= n; t++)
      resultList[t] = initList[t];
}
(Figure omitted: successive merge passes, with initList and resultList
exchanging roles after each pass.)
• Analysis
– The total number of passes is ⌈log2 n⌉
– Two sorted lists can be merged in linear time: O(n)
– The total computing time is O(n log n) in all cases
– Stable (depending on the underlying merge)
– Not in-place
Recursive Merge Sort (arrays)
(works top down)
similar to the one given in Program 7.11 for linked lists

void rmergesort(Element *a, int left, int right)
{
  if (right <= left) return;          // zero or one element: already sorted
  int mid = (right + left) / 2;
  rmergesort(a, left, mid);           // sort the left half
  rmergesort(a, mid + 1, right);      // sort the right half
  merge(a, left, mid, right);         // merge the two sorted halves
}
Homework
• Show how the basic mergesort that we saw
in class would work on the following array
of numbers
15 8 7 2 3 6 9 10 1 11 12 0 4 5 13 14
Merge (linked lists)
similar to the one given in Program 7.12
Node ListMerge(Node a, Node b)  // a and b are the first nodes of two sorted linked lists
{
  Node dummy = new Node();      // dummy header node simplifies the splice logic
  Node head = dummy, c = head;  // c is the last node of the merged list built so far

  while (a != null && b != null)
  {
    if (a.item <= b.item)
    {
      c.next = a; c = a; a = a.next;
    }
    else
    {
      c.next = b; c = b; b = b.next;
    }
  }
  // Append whichever list still has nodes left.
  if (a == null) c.next = b;
  else c.next = a;

  head = head.next;             // drop the dummy header
  return head;
}
MergeSort (linked lists)
similar to the one given in Program 7.11 for linked lists

Node ListMergesort(Node c)
{
  if (c == null || c.next == null) return c;   // zero or one node: already sorted

  // Divide the list in half (different mechanism in the book):
  // b advances two nodes for every node c advances,
  // so c stops at the last node of the first half.
  Node a = c, b = c.next;
  while (b != null && b.next != null)
  {
    c = c.next; b = (b.next).next;
  }
  b = c.next;          // b is the first node of the second half
  c.next = null;       // terminate the first half

  return ListMerge(ListMergesort(a), ListMergesort(b));
}
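The Node class itself is not shown on the slides. A minimal version sufficient to run the two routines above might look like this; the field names item and next are taken from the code above, the rest is illustrative.

// Minimal singly linked list node matching the fields used above.
class Node {
    int item;
    Node next;
    Node() {}
    Node(int item) { this.item = item; }
}

// Example use (assuming the routines above are methods of the same class):
// Node head = new Node(26);
// head.next = new Node(5);
// head.next.next = new Node(77);
// head.next.next.next = new Node(1);
// Node sorted = ListMergesort(head);   // yields 1 -> 5 -> 26 -> 77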
Greedy algorithms
• An optimization problem is one in which you want to find, not just a
solution, but the best solution
• A “greedy algorithm” sometimes works well for optimization
problems
• A greedy algorithm works in phases. At each phase:
– You take the best you can get right now, without regard for future
consequences
– You hope that by choosing a local optimum at each step, you will
end up at a global optimum
Example: Counting money
• Suppose you want to count out a certain amount of money,
using the fewest possible bills and coins
• A greedy algorithm for this would be: at each step, take the largest
possible bill or coin that does not overshoot (a code sketch follows
this example)
– Example: To make $6.39, you can choose:
• a $5 bill
• a $1 bill, to make $6
• a 25¢ coin, to make $6.25
• A 10¢ coin, to make $6.35
• four 1¢ coins, to make $6.39
• For US money, the greedy algorithm always gives the
optimum solution
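A minimal Java sketch of this greedy change-making strategy is shown below. It works in cents to avoid floating-point issues; the denominations array is illustrative and lists only common US bills and coins.

// Greedy change-making: repeatedly take the largest denomination that
// does not overshoot the amount still owed. Amounts are in cents.
static java.util.List<Integer> makeChange(int amountInCents) {
    int[] denominations = {2000, 1000, 500, 100, 25, 10, 5, 1};  // $20, $10, $5, $1, 25¢, 10¢, 5¢, 1¢
    java.util.List<Integer> used = new java.util.ArrayList<>();
    for (int d : denominations) {
        while (amountInCents >= d) {     // take this denomination as long as it fits
            used.add(d);
            amountInCents -= d;
        }
    }
    return used;                         // e.g. 639 -> [500, 100, 25, 10, 1, 1, 1, 1]
}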
