Algorithms Rosen
Orders of Growth
Analysis of Algorithms
• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
  – To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.).
• What do we mean by running time analysis?
  – Determine how the running time increases as the size of the problem increases.
Example: Searching
• Problem of searching an ordered list:
  – Given a list L of n elements that are sorted into a definite order (e.g., numeric, alphabetical),
  – and given a particular element x,
  – determine whether x appears in the list, and if so, return its index (position) in the list.
Search alg. #1: Linear Search
procedure linear search(x: integer, a1, a2, …, an: distinct integers)
i := 1
while (i ≤ n ∧ x ≠ ai)
  i := i + 1
if i ≤ n then location := i
else location := 0
return location {index of x in the list, or 0 if not found}
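For reference, a minimal sketch of the same idea in C (my own illustration, not from the slides; it assumes 0-based array indexing and returns -1 instead of 0 for "not found"):

/* Linear search: scan a[0..n-1] left to right for x. */
#include <stdio.h>

int linear_search(const int a[], int n, int x)
{
    for (int i = 0; i < n; i++) {     /* examine each element in turn */
        if (a[i] == x)
            return i;                 /* found: report its position */
    }
    return -1;                        /* reached the end without finding x */
}

int main(void)
{
    int a[] = {2, 3, 5, 7, 11, 13};
    printf("%d\n", linear_search(a, 6, 7));   /* prints 3 */
    printf("%d\n", linear_search(a, 6, 8));   /* prints -1 */
    return 0;
}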
Search alg. #2: Binary Search
• Basic idea: on each step, look at the middle element of the remaining list to eliminate half of it, and quickly zero in on the desired element.
Search alg. #2: Binary Search
procedure binary search(x: integer, a1, a2, …, an: distinct integers)
i := 1 {left endpoint of search interval}
j := n {right endpoint of search interval}
while i < j begin {while interval has > 1 item}
  m := ⌊(i+j)/2⌋ {midpoint}
  if x > am then i := m+1 else j := m
end
if x = ai then location := i else location := 0
return location {index of x in the list, or 0 if not found}
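A minimal C sketch of the same strategy (my own illustration, not from the slides; it assumes a sorted ascending array, 0-based indexing, and returns -1 for "not found"):

/* Binary search, mirroring the slide's variant: shrink the interval
   [lo, hi] until one candidate remains, then test that candidate. */
#include <stdio.h>

int binary_search(const int a[], int n, int x)
{
    if (n == 0)
        return -1;
    int lo = 0, hi = n - 1;            /* current search interval */
    while (lo < hi) {                  /* more than one candidate left */
        int mid = lo + (hi - lo) / 2;  /* midpoint (avoids overflow) */
        if (x > a[mid])
            lo = mid + 1;              /* x can only be in the right half */
        else
            hi = mid;                  /* x, if present, is in the left half */
    }
    return (a[lo] == x) ? lo : -1;     /* one candidate left: test it */
}

int main(void)
{
    int a[] = {2, 3, 5, 7, 11, 13};
    printf("%d\n", binary_search(a, 6, 11));  /* prints 4 */
    printf("%d\n", binary_search(a, 6, 4));   /* prints -1 */
    return 0;
}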
Is Binary Search more efficient?
• Number of iterations:
  – For a list of n elements, the Binary Search loop executes at most ⌈log₂ n⌉ times! (See the rough check below.)
  – Linear Search, on the other hand, can loop up to n times!
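A rough check of this gap (my own example, not from the slides), counting how many halvings it takes to shrink a list of n = 1,000,000 items down to a single candidate:

/* Compare the worst-case comparison counts for n = 1,000,000. */
#include <stdio.h>

int main(void)
{
    long n = 1000000;                         /* example list size */
    int halvings = 0;
    for (long m = n; m > 1; m = (m + 1) / 2)  /* ceiling of m/2 each step */
        halvings++;
    printf("binary search: about %d comparisons\n", halvings);  /* prints 20 */
    printf("linear search: up to %ld comparisons\n", n);        /* 1,000,000 */
    return 0;
}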
How do we analyze algorithms?
Example (# of statements)
[Listings of Algorithm 1 and Algorithm 2]
Computing running time
• Associate a "cost" with each statement and find the
"total cost“ by finding the total number of times
each statement is executed.
• Express running time in terms of the size of the
problem.
Algorithm 1 Algorithm 2
Cost Cost
arr[0] = 0; c1 for(i=0; i<N; i++) c2
arr[1] = 0; c1 arr[i] = 0; c1
arr[2] = 0; c1
...
arr[N-1] = 0; c1
----------- -------------
c1+c1+...+c1 = c1 x N (N+1) x c2 + N x c1 =
(c2 + c1) x N + c2
12
Computing running time (cont.)
                               Cost
sum = 0;                       c1
for(i=0; i<N; i++)             c2   {outer loop test runs N+1 times}
    for(j=0; j<N; j++)         c2   {inner loop test runs N x (N+1) times}
        sum += arr[i][j];      c3   {body runs N x N times}
------------------------------------
Total: c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N x N
     = (c2 + c3) x N x N + 2 x c2 x N + (c1 + c2)
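To see where these counts come from, here is a small instrumented check (my own sketch, not from the slides) that tallies how often each part of the nested loop executes for N = 10:

/* Count loop tests and body executions; they match
   (N+1), N x (N+1), and N x N respectively. */
#include <stdio.h>

enum { N = 10 };

int main(void)
{
    int arr[N][N] = {{0}};                                /* all zeros */
    long sum = 0;
    long outer_tests = 0, inner_tests = 0, body_runs = 0;

    for (int i = 0; (outer_tests++, i < N); i++)          /* tested N+1 times */
        for (int j = 0; (inner_tests++, j < N); j++) {    /* tested N x (N+1) times */
            sum += arr[i][j];
            body_runs++;                                  /* runs N x N times */
        }

    printf("sum = %ld\n", sum);                           /* prints 0 */
    printf("outer tests: %ld (N+1     = %d)\n", outer_tests, N + 1);
    printf("inner tests: %ld (N*(N+1) = %d)\n", inner_tests, N * (N + 1));
    printf("body runs:   %ld (N*N     = %d)\n", body_runs, N * N);
    return 0;
}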
Comparing Functions Using Rate of Growth
• Consider the example of buying elephants and goldfish:
  Cost: cost_of_elephants + cost_of_goldfish
  Cost ~ cost_of_elephants (approximation)
• The low order terms in a function are relatively insignificant for large n:
  n⁴ + 100n² + 10n + 50 ~ n⁴
  i.e., n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.
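A quick numerical illustration of why the low order terms stop mattering (my own sketch, not from the slides): the ratio f(n)/n⁴ for f(n) = n⁴ + 100n² + 10n + 50 approaches 1 as n grows.

/* Print f(n)/n^4 for a few values of n. */
#include <stdio.h>

int main(void)
{
    long long ns[] = {10, 100, 1000, 10000};
    for (int k = 0; k < 4; k++) {
        long long n = ns[k];
        double n4 = (double)n * n * n * n;
        double f  = n4 + 100.0 * n * n + 10.0 * n + 50.0;
        printf("n = %-6lld  f(n)/n^4 = %.6f\n", n, f / n4);  /* 2.015, 1.010, 1.0001, 1.000001 */
    }
    return 0;
}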
Rate of Growth ≡ Asymptotic Analysis
Example
• Suppose you are designing a web site to process user data (e.g., financial records).
• Suppose program A takes fA(n) = 30n + 8 microseconds to process any n records, while program B takes fB(n) = n² + 1 microseconds to process the n records.
• Which program would you choose, knowing you'll want to support millions of users? (A rough crossover check follows below.)
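One way to answer this (my own check, not from the slides): find the crossover point where fB(n) = n² + 1 starts to exceed fA(n) = 30n + 8.

/* Find the smallest n where program B becomes slower than program A. */
#include <stdio.h>

int main(void)
{
    for (long n = 1; ; n++) {
        long fA = 30 * n + 8;
        long fB = n * n + 1;
        if (fB > fA) {
            printf("fB(n) exceeds fA(n) from n = %ld onward\n", n);  /* prints 31 */
            break;
        }
    }
    return 0;
}

So beyond a few dozen records program A is the faster choice, and the gap only widens as n grows, which is exactly what comparing the growth rates 30n + 8 vs. n² + 1 predicts.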
Visualizing Orders of Growth
• On a graph, as you go to the right, a faster-growing function eventually becomes larger...
[Graph: value of the function vs. increasing n, showing fB(n) = n² + 1 eventually overtaking fA(n) = 30n + 8]
Questions
• Find the best big-O notation to describe the complexity of the following algorithms:
  – A binary search algorithm.
  – A linear search algorithm.
  – An algorithm that finds the maximum element in a finite sequence.