Algorithms, Design and Analysis: Types of Formulas For Basic Operation Count

The document discusses asymptotic analysis and big-O notation. It defines asymptotic growth rates like O(n), Ω(n), and Θ(n) and provides examples of determining the growth rate of functions. Brute force algorithms are introduced as straightforward approaches based directly on the problem statement, with examples like computing factorials or performing sequential search. The strengths of brute force include wide applicability, simplicity, and reasonability for some problems, but it may have poor time efficiency.


Algorithms, Design and Analysis
Big-Oh analysis, Brute Force, Divide and Conquer intro

Types of formulas for basic operation count

• Exact formula
  e.g., C(n) = n(n−1)/2

• Formula indicating order of growth with specific multiplicative constant
  e.g., C(n) ≈ 0.5 n²

• Formula indicating order of growth with unknown multiplicative constant
  e.g., C(n) ≈ cn²
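As a concrete illustration (not from the slides, function name and loop structure are illustrative), the double loop below examines every pair of elements exactly once, so its basic-operation count is exactly C(n) = n(n−1)/2:

def count_pair_comparisons(a):
    """Count comparisons made when every pair of elements is examined once."""
    n = len(a)
    comparisons = 0
    for i in range(n - 1):          # i = 0 .. n-2
        for j in range(i + 1, n):   # j = i+1 .. n-1
            comparisons += 1        # one basic operation per pair
            _ = a[i] < a[j]         # e.g., a comparison of two elements
    return comparisons              # equals n*(n-1)//2

# Example: for n = 5 the count is 5*4/2 = 10
assert count_pair_comparisons([7, 2, 1, 6, 4]) == 10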


Order of growth

• Most important: order of growth within a constant multiple as n → ∞

• Example:
  – How much faster will the algorithm run on a computer that is twice as fast?
  – How much longer does it take to solve a problem of double input size?

• See Table 2.1 (values of common growth functions for selected n; table not reproduced here)


Asymptotic growth rate

• A way of comparing functions that ignores constant factors and small input sizes

• O(g(n)): class of functions f(n) that grow no faster than g(n)

• Θ(g(n)): class of functions f(n) that grow at the same rate as g(n)

• Ω(g(n)): class of functions f(n) that grow at least as fast as g(n)

Big-oh, Big-omega, Big-theta (slides show the defining diagrams; figures not reproduced here)
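For reference, the standard formal definitions behind these three classes (the slides state them only informally; c, c1, c2 and n0 denote the usual constants and threshold):

f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0:\ f(n) \le c\,g(n) \text{ for all } n \ge n_0
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0:\ f(n) \ge c\,g(n) \text{ for all } n \ge n_0
f(n) \in \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 \ge 0:\ c_1\,g(n) \le f(n) \le c_2\,g(n) \text{ for all } n \ge n_0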


Establishing rate of growth: Method 1 – using limits

lim(n→∞) T(n)/g(n) =
  • 0     : order of growth of T(n) < order of growth of g(n)
  • c > 0 : order of growth of T(n) = order of growth of g(n)
  • ∞     : order of growth of T(n) > order of growth of g(n)

Examples:
• 10n vs. 2n²
• n(n+1)/2 vs. n²
• log_b n vs. log_c n

L'Hôpital's rule

If
• lim(n→∞) f(n) = lim(n→∞) g(n) = ∞
• and the derivatives f′, g′ exist,

then
  lim(n→∞) f(n)/g(n) = lim(n→∞) f′(n)/g′(n)

• Example: log n vs. n
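A worked instance of that last example (a standard computation, not shown on the slide), using the limit criterion together with L'Hôpital's rule:

\lim_{n\to\infty} \frac{\log_2 n}{n}
  = \lim_{n\to\infty} \frac{(\log_2 n)'}{(n)'}
  = \lim_{n\to\infty} \frac{1/(n \ln 2)}{1}
  = 0

so the limit is 0 and log n has a strictly smaller order of growth than n.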


Establishing rate of growth: Method 2 – using definition

• f(n) is O(g(n)) if order of growth of f(n) ≤ order of growth of g(n) (within constant multiple)
• There exist a positive constant c and a non-negative integer n0 such that
  f(n) ≤ c·g(n) for every n ≥ n0

Examples:
• 10n is O(2n²)
• 5n + 20 is O(10n)

Basic Asymptotic Efficiency classes

  1        constant
  log n    logarithmic
  n        linear
  n log n  n log n
  n²       quadratic
  n³       cubic
  2^n      exponential
  n!       factorial

More Big-Oh Examples

• 7n − 2
  7n − 2 is O(n)
  need c > 0 and n0 ≥ 1 such that 7n − 2 ≤ c·n for n ≥ n0
  this is true for c = 7 and n0 = 1

• 3n³ + 20n² + 5
  3n³ + 20n² + 5 is O(n³)
  need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0
  this is true for c = 4 and n0 = 21

• 3 log n + log log n
  3 log n + log log n is O(log n)
  need c > 0 and n0 ≥ 1 such that 3 log n + log log n ≤ c·log n for n ≥ n0
  this is true for c = 4 and n0 = 2

Big-Oh Rules

• If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
  1. Drop lower-order terms
  2. Drop constant factors

• Use the smallest possible class of functions
  – Say "2n is O(n)" instead of "2n is O(n²)"

• Use the simplest expression of the class
  – Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
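A quick numeric sanity check of the second example (a sketch, not part of the slides): for the witness pair c = 4 and n0 = 21, the inequality 3n³ + 20n² + 5 ≤ 4n³ holds from n0 onward and fails just below it.

def holds(n):
    """Check the witnesses c = 4, n0 = 21 for 3n^3 + 20n^2 + 5 in O(n^3)."""
    return 3*n**3 + 20*n**2 + 5 <= 4 * n**3

assert all(holds(n) for n in range(21, 1000))   # true for every n >= 21
assert not holds(20)                            # fails just below the threshold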

Intuition for Asymptotic Notation

• Big-Oh
  – f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
• big-Omega
  – f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
• big-Theta
  – f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
• little-oh
  – f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
• little-omega
  – f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)

Brute Force

A straightforward approach usually based on the problem statement and definitions

Examples (sketches of 1, 2 and 4 follow below):
1. Computing a^n (a > 0, n a nonnegative integer)
2. Computing n!
3. Selection sort
4. Sequential search
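Hedged sketches of three of these brute-force computations in Python (illustrative code, not taken from the course materials):

def power(a, n):
    """Brute-force a^n: multiply a by itself n times (a > 0, n >= 0)."""
    result = 1
    for _ in range(n):
        result *= a
    return result

def factorial(n):
    """Brute-force n!: multiply 1 * 2 * ... * n."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

def sequential_search(a, key):
    """Scan the list left to right; return the index of key or -1."""
    for i, x in enumerate(a):
        if x == key:
            return i
    return -1

assert power(2, 10) == 1024
assert factorial(5) == 120
assert sequential_search([7, 2, 1, 6, 4], 6) == 3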

More brute force algorithm examples:

• Closest pair
  – Problem: find the closest pair among n points in k-dimensional space
  – Algorithm: compute the distance between each pair of points
  – Efficiency:

• Convex hull
  – Problem: find the smallest convex polygon enclosing n points on the plane
  – Algorithm: for each pair of points p1 and p2, determine whether all other points lie to the same side of the straight line through p1 and p2
  – Efficiency:

Brute force strengths and weaknesses

• Strengths:
  – wide applicability
  – simplicity
  – yields reasonable algorithms for some important problems
    • searching
    • string matching
    • matrix multiplication
  – yields standard algorithms for simple computational tasks
    • sum/product of n numbers
    • finding max/min in a list

• Weaknesses:
  – rarely yields efficient algorithms
  – some brute force algorithms are unacceptably slow
  – not as constructive/creative as some other design techniques
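A minimal Python sketch of the brute-force closest-pair idea described above (2-D for simplicity; function and variable names are illustrative, not from the slides):

from math import dist, inf

def closest_pair_brute_force(points):
    """Compare every pair of 2-D points; return the smallest distance and that pair."""
    best = (inf, None, None)
    n = len(points)
    for i in range(n - 1):
        for j in range(i + 1, n):
            d = dist(points[i], points[j])        # Euclidean distance
            if d < best[0]:
                best = (d, points[i], points[j])
    return best                                   # n(n-1)/2 distance computations

print(closest_pair_brute_force([(0, 0), (3, 4), (1, 1), (5, 5)]))
# -> (1.414..., (0, 0), (1, 1))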

Divide and Conquer

The most well known algorithm design strategy:
1. Divide instance of problem into two or more smaller instances
2. Solve smaller instances recursively
3. Obtain solution to original (larger) instance by combining these solutions

Divide-and-conquer technique (diagram): a problem of size n is split into subproblem 1 and subproblem 2, each of size n/2; a solution to each subproblem is obtained, and the two are combined into a solution to the original problem.


Divide and Conquer Examples

• Sorting: mergesort and quicksort
• Tree traversals
• Binary search
• Matrix multiplication: Strassen's algorithm
• Convex hull: QuickHull algorithm

General Divide and Conquer recurrence:

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^k)

1. a < b^k : T(n) ∈ Θ(n^k)
2. a = b^k : T(n) ∈ Θ(n^k lg n)
3. a > b^k : T(n) ∈ Θ(n^(log_b a))

Note: the same results hold with O instead of Θ.
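A worked application (standard textbook example, not spelled out on this slide): mergesort divides into two halves and merges in linear time, so a = 2, b = 2, k = 1:

T(n) = 2\,T(n/2) + \Theta(n), \qquad a = 2,\; b^k = 2^1 = 2,\; a = b^k
\;\Longrightarrow\; T(n) \in \Theta(n^k \lg n) = \Theta(n \log n).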

Mergesort

Algorithm:
• Split array A[1..n] in two and make copies of each half in arrays B[1..n/2] and C[1..n/2]
• Sort arrays B and C
• Merge sorted arrays B and C into array A as follows:
  – Repeat the following until no elements remain in one of the arrays:
    • compare the first elements in the remaining unprocessed portions of the arrays
    • copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array
  – Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A.

Mergesort Example: trace on the array 7 2 1 6 4 (diagram not reproduced here)
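A compact Python rendering of the algorithm above (a sketch following that description; names are illustrative). It copies the two halves into B and C and merges them back into A, matching the Θ(n) extra space noted on the next slide:

def mergesort(A):
    """Sort list A in place: top-down mergesort, Theta(n log n) comparisons."""
    if len(A) <= 1:
        return
    mid = len(A) // 2
    B, C = A[:mid], A[mid:]                  # copy each half
    mergesort(B)
    mergesort(C)
    i = j = k = 0
    while i < len(B) and j < len(C):         # merge step
        if B[i] <= C[j]:
            A[k] = B[i]; i += 1
        else:
            A[k] = C[j]; j += 1
        k += 1
    A[k:] = B[i:] if i < len(B) else C[j:]   # copy the remaining unprocessed elements

a = [7, 2, 1, 6, 4]
mergesort(a)
assert a == [1, 2, 4, 6, 7]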

Efficiency of mergesort

• All cases have the same efficiency: Θ(n log n)
• Number of comparisons is close to the theoretical minimum for comparison-based sorting:
  – ⌈log₂ n!⌉ ≈ n lg n − 1.44n
• Space requirement: Θ(n) (NOT in-place)
• Can be implemented without recursion (bottom-up)

Quicksort

• Select a pivot (partitioning element)
• Rearrange the list so that all the elements in the positions before the pivot are smaller than or equal to the pivot and those after the pivot are larger than the pivot (see algorithm Partition in section 4.2)
• Exchange the pivot with the last element in the first (i.e., ≤) sublist; the pivot is now in its final position
• Sort the two sublists

(diagram: A[i] ≤ p | p | A[i] > p)

The partition algorithm (pseudocode not reproduced here; see section 4.2)

Quicksort Example: trace on the array 15 22 13 27 12 10 20 25 (diagram not reproduced here)
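Since the partition pseudocode itself is not reproduced, here is a hedged Python sketch of a Hoare-style partition and quicksort along the lines of the previous slide (index conventions are illustrative; the section 4.2 version may differ in small details):

def hoare_partition(A, lo, hi):
    """Partition A[lo..hi] around pivot A[lo]; return the pivot's final index."""
    p = A[lo]
    i, j = lo, hi + 1
    while True:
        i += 1
        while i <= hi and A[i] < p:   # scan right for an element >= pivot
            i += 1
        j -= 1
        while A[j] > p:               # scan left for an element <= pivot
            j -= 1
        if i >= j:
            break
        A[i], A[j] = A[j], A[i]
    A[lo], A[j] = A[j], A[lo]         # put the pivot into its final position
    return j

def quicksort(A, lo=0, hi=None):
    if hi is None:
        hi = len(A) - 1
    if lo < hi:
        s = hoare_partition(A, lo, hi)
        quicksort(A, lo, s - 1)       # sort the <= sublist
        quicksort(A, s + 1, hi)       # sort the > sublist

a = [15, 22, 13, 27, 12, 10, 20, 25]
quicksort(a)
assert a == [10, 12, 13, 15, 20, 22, 25, 27]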


Efficiency of quicksort

• Best case (split in the middle): Θ(n log n)
• Worst case (sorted array!): Θ(n²)
• Average case (random arrays): Θ(n log n)

• Improvements:
  – better pivot selection: median-of-three partitioning avoids the worst case in sorted files
  – switch to insertion sort on small subfiles
  – elimination of recursion
  these combine to a 20-25% improvement
• Considered the method of choice for internal sorting for large files (n ≥ 10000)

QuickHull Algorithm

Inspired by Quicksort, compute the Convex Hull:
• Assume points are sorted by x-coordinate values
• Identify extreme points P1 and P2 (part of the hull)
• Compute upper hull:
  – find point Pmax that is farthest away from line P1P2
  – compute the hull of the points to the left of line P1Pmax
  – compute the hull of the points to the left of line PmaxP2
• Compute lower hull in a similar manner

(diagram: P1, Pmax, P2; not reproduced here)
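A hedged Python sketch of this recursive scheme (a simplified illustration, not the course's reference implementation; it uses the signed cross product both to pick the farthest point from a line and to decide which points lie to its left):

def cross(o, a, b):
    """Twice the signed area of triangle o-a-b; > 0 when b is left of line o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def hull_side(pts, p1, p2):
    """Hull vertices strictly left of the directed line p1 -> p2."""
    left = [p for p in pts if cross(p1, p2, p) > 0]
    if not left:
        return []
    pmax = max(left, key=lambda p: cross(p1, p2, p))   # farthest from line p1-p2
    return hull_side(left, p1, pmax) + [pmax] + hull_side(left, pmax, p2)

def quickhull(points):
    pts = sorted(set(points))            # sort by x (then y); P1, P2 are the extremes
    if len(pts) < 3:
        return pts
    p1, p2 = pts[0], pts[-1]
    upper = hull_side(pts, p1, p2)       # upper hull
    lower = hull_side(pts, p2, p1)       # lower hull
    return [p1] + upper + [p2] + lower   # hull vertices in order around the polygon

print(quickhull([(0, 0), (4, 0), (2, 3), (1, 1), (3, 1), (2, -2)]))
# -> [(0, 0), (2, 3), (4, 0), (2, -2)]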

Efficiency of QuickHull algorithm

• Finding the point farthest away from line P1P2 can be done in linear time
• This gives the same efficiency as quicksort:
  – Worst case: Θ(n²)
  – Average case: Θ(n log n)

• If points are not initially sorted by x-coordinate value, this can be accomplished in Θ(n log n), with no increase in asymptotic efficiency class

• Other algorithms for convex hull:
  – Graham's scan
  – DCHull
  also in Θ(n log n)

