
The Binary Search, Quick Sort, and Merge algorithms

In the following section, we take a brief look at three important and efficient algorithms and their complexity analysis.

Example: The binary search algorithm

Write an algorithm to find the index of a given key in an array a that is sorted in ascending order, and find the complexity of your algorithm.

input parameters
a: an array to search in (sorted in ascending order)
l: an integer index to the first element of the array
r: an integer index to the last element of the array
key: the element to search for

output through return


index: the location of the key in the array, or -1 if not found

Idea behind the binary search algorithm

We make use of the fact that the array is sorted by comparing the key

with the element in the middle of the array: if they are equal we are done, else

if the key < this element then, we search in the left half of the array, else we

search in the right half of the array. Binary search is one of the well-known

divide and conquer algorithms.

[figure: the array a with l, m, and r marking the left, middle, and right indices of the current search range]
int binarySearch ( a [ ], l, r, key )
    if l <= r
        m = ( l + r ) / 2;
        if ( key == a[m] ) return m;
        else if ( key < a[m] )   // search the left half of the array
            return binarySearch ( a, l, m - 1, key );
        else                     // key > a[m]: search the right half of the array
            return binarySearch ( a, m + 1, r, key );
    else
        return -1;               // key not found

Time complexity

Let W(n) be the number of comparisons done to search an array of n elements in the worst case.

To simplify the math, we assume that n = 2^k for some integer k >= 0.

It can be easily shown that W(n) is governed by the following recurrence relation:

W(n) = W(n/2) + 1,  W(1) = 1

Its solution is given by: W(n) = log2(n) + 1 = O(log n).

This shows that the binary search algorithm is much more efficient than the sequential search algorithm, whose complexity is O(n).

Example: The quick sort algorithm

Write an algorithm to sort an array a in ascending order, and find the

complexity of your algorithm.

input parameters
a: an array to be sorted
l: an integer index to the first element of the array
r: an integer index to the last element of the array

output parameter
a: the array after sorting

Idea behind the quick sort algorithm

Quick sort applies divide and conquer by dividing the array into 2 sub

arrays (hopefully of equal size) and then applies the same idea to the left and

right sub arrays over and over until we have arrays of size 1, at which point

we are finished.

To sort the array, we partition (divide) it so that small elements (compared to a pivotal element x) are moved to the left and large ones are moved to the right, as shown next:

[figure: the partitioned array, with elements <= x in positions l..i-1, the pivot x at position i, and elements >= x in positions i+1..r]

That is:
a[k] <= x for l <= k <= i-1,
a[i] = x, and
a[k] >= x for i+1 <= k <= r.

The element x is called the pivot. Now, to sort the array a, we proceed as

follows:

void quick_sort ( a [ ], l, r )
    if l < r
        partition ( a, l, r, i );     // i is the index of the pivot
        quick_sort ( a, l, i - 1 );   // sort the left sub-array
        quick_sort ( a, i + 1, r );   // sort the right sub-array

The partition algorithm, shown next, is the part that does the actual work of comparing and moving the array elements. Notice that the integer parameter i is passed by reference (an output parameter).

void partition ( a [ ], l, r, &i )
    i = l; j = r; x = a[i];   // x is the pivot. Location a[i] is now free
    while i < j
        while ( ( a[j] >= x ) and ( j > i ) ) j--;   // looking for a small
                                                     // element from the right
        if ( j > i )
            a[i] = a[j];      // a[j] is now a free location
            i++;
        while ( ( a[i] <= x ) and ( i < j ) ) i++;   // looking for a large
                                                     // element from the left
        if ( i < j )
            a[j] = a[i];      // a[i] is now a free location
            j--;
    // now i = j, and a[i] is free: put the pivot x into its final location
    a[i] = x;

This algorithm will do n-1 comparisons to partition an array of n elements.

Time complexity of the quick sort algorithm

Let B(n) be the number of comparisons done to sort an array of n elements in the best case. The best case happens if we are so lucky that the partition algorithm divides the array into two equal sub-arrays.

To simplify the math, we assume that n = 2^k. It can be easily shown that B(n) is governed by the following recurrence relation:

B(n) = 2 B(n/2) + n - 1,  B(1) = 0

Its solution is given by: B(n) = O(n log n). It can also be shown that the average complexity is about 1.39 n log2(n) = O(n log n). This shows that, on average, the quick sort algorithm is much more efficient than many other sorting algorithms, such as the selection sort algorithm, whose complexity is O(n^2).

Example: The merge algorithm

Write an algorithm to merge two sorted arrays a and b to form a new (sorted) array c, and find the complexity of your algorithm.

input parameters
a, b: two arrays, each of which is sorted in increasing order
na, nb: the number of elements in a and b

output parameters
c: an array, the result of the merge in increasing order
nc: the number of elements in c

Idea of the merge algorithm

In the general step, we compare ai with bj and copy the smaller element

into ck. If 2 elements are equal we copy one and only one of them to ck.

[figure: arrays a, b, and c with running indices i, j, and k]

void merge ( a [ ], b [ ], c [ ], na, nb, &nc )
    i = 1; j = 1; k = 1;    // i, j, k are running indices through a, b, c
    while ( i <= na and j <= nb )
        if ( a[i] < b[j] )
            c[k] = a[i];
            i++;
            k++;
        else if ( b[j] < a[i] )
            c[k] = b[j];
            j++;
            k++;
        else    // equal elements: we save one and only one of them
            c[k] = a[i];
            k++;
            i++;
            j++;
    // end while
    // copy the remaining elements from a or b, if any, into c
    for i = i to na
        c[k] = a[i];
        k++;
    for j = j to nb
        c[k] = b[j];
        k++;
    nc = k - 1;    // k is one past the last element written
// end merge
Complexity of the merge algorithm

We find the number of comparisons done between the elements during the merge operation in the worst case. Each time we go through the while loop, one element from a or b is copied into c, depending on which is smaller. If two elements are equal, one of them is copied into c. The worst case happens when there are no equal elements and the ends of the two arrays are reached at the same time inside the while loop. Hence the total number of comparisons = na + nb - 1 = O(na + nb).
