DSweek3 Algo

The document discusses algorithms including their definition, importance, and analysis. It describes how algorithms are used in applications like genome sequencing and search engines. It also covers algorithm design goals like correctness and efficiency, and analyzing asymptotic runtime using techniques like counting operations and Big O notation.


Algorithms

Outline
• Algorithms
• Importance of Algorithms
• Algorithms as a Technology
• Asymptotic Growth
Algorithms
• An algorithm is a computational procedure that
– takes some value (or set of values) as input and
– produces some value (or set of values) as output.
• It must also satisfy the properties of
– Finiteness
– Definiteness
– Effectiveness
Strategy
• Use the techniques of design, analysis and
experimentation
• Design: create algorithms
• Analysis: examine algorithms and problems
mathematically
– "Algorithm A1 is more efficient than Algorithm A2"
– "Problem P1 is not solvable; problem P2 is solvable"
– "Problem P3 is solvable but intractable" (i.e., requires
too much time/resources to be of practical value)
• Experimentation: implement systems and study
the resulting behavior
Describing Algorithms
• Format
– Title
– Input
– Output
– Body
• Write in pseudocode or actual code
• Use of indentation
• Keywords: If-then-else, while, return
• Special symbols (next page)
Describing Algorithms
• Special symbols:
– “” (assignment) vs “=“ (checking for equality)
– “[ ]” (array indexing, normally starting at 1)
– “” or “//” (comments)
– “” or “{ }” (empty set)
– “” (there exists; for some) , “” (for all)
– “ ” (floor function), “ ” (ceiling function), “ ” (absolute value, or set cardinality)
– Other math symbols, e.g.: ≤, ≥, ≠, ∈, ∪, ∩
Describing Algorithms
ALGORITHM MaxArrayElement
INPUT: Array A of n integers
OUTPUT: Integer k such that A[i] = k for some i ∈ {1, 2, …, n}
        and A[j] ≤ k for all j ∈ {1, 2, …, n}

max ← A[1]
For i ← 2 to n
    If A[i] > max then
        max ← A[i]
Return max
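For comparison, here is a direct Python rendering of the pseudocode above (a sketch; the function name is chosen here, and Python lists are 0-indexed, unlike the 1-indexed arrays in the pseudocode):

```python
def max_array_element(a):
    """Return the largest element of a non-empty list of integers.

    Mirrors the MaxArrayElement pseudocode, adjusted for 0-based indexing.
    """
    maximum = a[0]              # max <- A[1]
    for i in range(1, len(a)):  # For i <- 2 to n
        if a[i] > maximum:      # If A[i] > max then
            maximum = a[i]      #     max <- A[i]
    return maximum              # Return max

print(max_array_element([7, 3, 9, 1, 4]))  # prints 9
```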
Try This!
• Write an algorithm that returns the third
maximum element of an array

ALGORITHM: ThirdMax
INPUT: Array A of n distinct integers
OUTPUT: // How do we state this?
// code here…
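One possible way to state the OUTPUT: integer k in A such that exactly two elements of A are greater than k. A sketch of a solution in Python follows, assuming the n integers are distinct and n ≥ 3; the function name and single-pass approach are choices made here, not part of the slide:

```python
def third_max(a):
    """Return the third largest element of a list of distinct integers.

    OUTPUT (one way to state it): integer k in A such that exactly two
    elements of A are greater than k. Assumes len(a) >= 3.
    Keeps track of the top three values seen so far in one pass.
    """
    first = second = third = None
    for x in a:
        if first is None or x > first:
            first, second, third = x, first, second
        elif second is None or x > second:
            second, third = x, second
        elif third is None or x > third:
            third = x
    return third

print(third_max([5, 1, 9, 7, 3]))  # prints 5
```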
Importance of Algorithms
• The Human Genome Project involves sophisticated algorithms, enabling it to
– identify all the (approx. 100,000) genes in human DNA
– determine the sequences of the (approx. 3 billion) chemical base pairs that make up the human DNA
– store all of the derived information in databases
– develop tools for data analysis
Importance of Algorithms

• The Internet enables people all around the world to quickly access and retrieve (even large amounts of) information
• Finding good routes from the source host to the destination
host uses a clever algorithm
• Search engines that quickly find pages about a particular topic, and rank the retrieved results, make use of well-studied algorithms
Importance of Algorithms
• Market Forecasting
• Weather Prediction
• Logistics Set-up
• Fraud Detection
• Encryption
• Cancer Detection
• Games
• A lot of real-world applications!
Algorithms as a Technology
• Computer-related technology is normally
associated with hardware
– Processor speed
– Memory size
• Suppose computers are extremely fast and with
extremely large memory. Would a study of
algorithms still be important? YES!
– Show an algorithm is correct
– Make an algorithm more efficient
Algorithms as a Technology
• Making an algorithm more efficient is important:
– Illustration 1: Suppose a complicated algorithm for n objects requires n² operations
• If each op requires 1 sec, how long will the algorithm run when n = 1000?
– n = 1000 ⇒ n² = 1 000 000
– Run-time = 1 000 000 ops * (1 sec / op) = 1 000 000 sec
– Approx. 11 ½ days!
• If a twice-as-fast processor is developed so that each operation requires only 0.5 sec, how long will the algorithm run?
• A "faster" algorithm requiring only 3n operations will finish in how much time?
– n = 1000 ⇒ 3n = 3000
– Run-time = 3000 ops * (1 sec / op) = 3000 sec, or 50 min only!
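A quick script to check the arithmetic above, including the question left open about the twice-as-fast processor (a sketch; the 1-second-per-operation cost is the one assumed in the slide):

```python
n = 1000
sec_per_op = 1.0

quadratic_ops = n ** 2   # 1 000 000 operations for the n^2 algorithm
linear_ops = 3 * n       # 3 000 operations for the 3n algorithm

print(quadratic_ops * sec_per_op / 86400)        # ~11.57 days at 1 sec/op
print(quadratic_ops * (sec_per_op / 2) / 86400)  # ~5.79 days on a twice-as-fast machine
print(linear_ops * sec_per_op / 60)              # 50.0 minutes for the 3n algorithm
```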
Algorithms as a Technology
• Making an algorithm more efficient is important:
– Illustration 2: Suppose you have 30 days (e.g., the last month of thesis work) to run your n² algorithm. What is the largest n that you can study / solve?
• 30 days = 2 592 000 sec
• Find n such that n² ops * (1 sec / op) ≤ 2 592 000 sec
• n² ≤ 2 592 000
• n = 1609
– What if you use a twice-as-fast processor? Will the largest n double?
• n² ≤ 2 592 000 / 0.5 ⇒ n = 2276
– What if you, instead, develop a 3n algorithm?
• 3n ≤ 2 592 000 ⇒ n = 864 000 (Cool!)
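The same style of check for Illustration 2, solving for the largest n that fits the 30-day budget (a sketch, again assuming 1 second per operation as in the slide):

```python
import math

budget_sec = 30 * 24 * 60 * 60   # 2 592 000 seconds in 30 days

# n^2 algorithm at 1 sec per operation: n^2 <= budget
print(math.isqrt(budget_sec))        # 1609

# same algorithm on a twice-as-fast processor: 0.5 * n^2 <= budget
print(math.isqrt(2 * budget_sec))    # 2276

# 3n algorithm at 1 sec per operation: 3n <= budget
print(budget_sec // 3)               # 864000
```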
Algorithm Design Goals
• Proving Correctness
– Counter-example (if incorrect)
– Proof by Contradiction
– Proof by Induction
• Efficiency
– Best-case, Worst-case, Average-case
– O(f (n)), Ω(f (n)), ϴ(f (n))
– o(f (n)), ω(f (n))
Algorithm Efficiency
• For a correct algorithm, the efficiency is a major design
goal.
– Space utilization
– Time efficiency
– Execution time is the amount of time spent in executing
instructions of a given algorithm.
• Analyzing an algorithm has come to mean predicting the
resources, especially computational time, that the
algorithm requires.
– Benchmarking
– Counting primitive operations
– Analyzing growth relative to input size
Benchmarking
• Actually run algorithms and get elapsed
(running) time.
• Pitfalls:
– Requires implementation of algorithm
• Presence of bugs
• Inefficient implementation
– Requires exactly the same machine set-up
– Interruption from software features
– Results are valid only for the test cases used
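For concreteness, a minimal benchmarking sketch using Python's standard library; the numbers it prints depend entirely on the machine, the Python version, and the test data, which is exactly the pitfall noted above:

```python
import random
import time

def benchmark(func, data, repeats=5):
    """Run func on a copy of data several times; return the best elapsed time in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        func(list(data))                 # copy, so each run sees the same input
        best = min(best, time.perf_counter() - start)
    return best

data = [random.randint(0, 10**6) for _ in range(10_000)]
print(f"sorted(): {benchmark(sorted, data):.6f} s")
```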
Counting Primitive Operations
• Define which operations are primitive, then
count the number of primitive ops the algorithm
will perform.
• Pitfalls:
– Selecting primitive operations
• assignment, comparison, addition, multiplication, etc
• A[i] is 1 op or 2 ops?
• Dependent on instruction set of the machine
– Actual execution path
• Worst case
• Best case
• Average case
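As an illustration of operation counting, and of how the actual execution path matters, here is the MaxArrayElement loop from earlier instrumented to count comparisons and assignments (a sketch; which operations count as "primitive" is a modelling choice, as the slide notes):

```python
def max_with_counts(a):
    """Return (max, comparisons, assignments) for the MaxArrayElement loop."""
    comparisons = 0
    assignments = 1            # max <- A[1]
    maximum = a[0]
    for x in a[1:]:
        comparisons += 1       # If A[i] > max
        if x > maximum:
            assignments += 1   # max <- A[i]
            maximum = x
    return maximum, comparisons, assignments

# Comparisons are always n - 1; assignments depend on the execution path:
# worst case is ascending input (n assignments), best case descending (1).
print(max_with_counts([1, 2, 3, 4, 5]))   # (5, 4, 5)
print(max_with_counts([5, 4, 3, 2, 1]))   # (5, 4, 1)
```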
Analyzing Growth
• Observe how running time (or number of
operations) increase relative to the input size.

(Figure: running-time growth of two implementations plotted against input size; source: www.artima.com)
Example: Bubble Sort
• Bubble Sort requires (n² – n) / 2 "steps"
• Suppose the actual numbers of instructions on different CPUs are
– Pentium III CPU: 62 * (n² – n) / 2
– Pentium IV CPU: 56 * (n² – n) / 2
– Motorola CPU: 84 * (n² – n) / 2
• Some observations
– As n increases, the values of the other terms become insignificant
• Pentium III CPU: approx. 31n²
• Pentium IV CPU: approx. 28n²
• Motorola CPU: approx. 42n²
– As processors change, the coefficients change but not the exponent of n
• We say Bubble Sort runs in O(n²) time
• How about if we measure speed?
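As a sanity check on the (n² – n) / 2 step count, here is a sketch of a straightforward Bubble Sort instrumented to count comparisons (this simple version always performs every pass, so the count is the same for any input):

```python
def bubble_sort_steps(a):
    """Sort a in place and return the number of comparisons performed."""
    n = len(a)
    comparisons = 0
    for i in range(n - 1):              # n - 1 passes
        for j in range(n - 1 - i):      # pass i makes n - 1 - i comparisons
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons

for n in (10, 100, 1000):
    data = list(range(n, 0, -1))        # reversed input
    assert bubble_sort_steps(data) == (n * n - n) // 2
    assert data == sorted(data)
print("comparison count matches (n^2 - n) / 2")
```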
Big Oh Notation
• Asymptotic Efficiency
• f(n) is O(g(n))
– if there exist constants c > 0 and n0 ≥ 1
– such that f(n) ≤ c · g(n)
– for all n ≥ n0.

(Figure: plot of f(n) and c · g(n) showing f(n) bounded above by c · g(n) for all n ≥ n0; source: www.dgp.toronto.edu)
Big Oh Notation
• Prove that:
– 5n + 3 is O(n)
– 3n² + 2n + 35 is O(n²)
– 4 + 6 log n is O(log n)
• Goal: find a pair (c, n0) such that f(n) ≤ c · g(n)
for all n ≥ n0.
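For instance, for the first claim: 5n + 3 ≤ 5n + 3n = 8n for all n ≥ 1, so the pair (c, n0) = (8, 1) works; many other pairs do too, e.g. (6, 3).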
Big Oh Notation
• Some rules of thumb can applied to determine
the running time of some algorithms
– Straightforward (Pseudo) Code
– Loops
– If-Else Statements
– Recursive Functions
Big Oh Notation
• O(n):
For i ← 1 to n
    sum ← sum + i
• O(n²):
For i ← 1 to n
    For j ← i to n
        sum ← sum + j
• O(1):
return n * (n + 1) / 2
• O(log n):
While n > 1
    count ← count + 1
    n ← n / 2
Asymptotic Notation
• Big-Oh Notation
– O(g(n)) is actually a class of functions.
– O(g(n)) = { f(n) | ∃ c, n0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n0 }
• O(n²) = { 1.3n² + 35, 4007n² + 11.74, … }
– By its definition, BigOh describes a function that is an
"upperbound"
• 3n + 5 is O(n²)
• 3n + 5 is O(n^2010)
• Big-Omega Notation: (a "lowerbound" notation)
– Ω(g(n)) = { f(n) | ∃ c, n0 such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0 }
• 5n⁴ + 99 is Ω(n⁴)
• 5n⁴ + 99 is Ω(n)
• 5n⁴ + 99 is NOT Ω(n⁵)
Asymptotic Notation
• Big-Theta Notation
– f(n) is ϴ(g(n)) if and only if f(n) is both O(g(n)) and
Ω(g(n)).
– 3n + 5 is ϴ(n)
– 3n + 5 is NOT ϴ(n²)
– 3n + 5 is NOT ϴ(1)
Asymptotic Notation
• Small Oh (an upperbound that is not tight)
– o(g(n)) = { f(n) | ∀ c > 0, ∃ n0 such that 0 ≤ f(n) < c·g(n) ∀ n ≥ n0 }
– 3n + 5 is o(n²)
– 3n + 5 is NOT o(n)
• Small Omega
– f(n) is ω(g(n)) if and only if g(n) is o(f(n))
– 3n + 5 is NOT ω(n²)
– 3n + 5 is NOT ω(n)
– 3n + 5 IS ω(1)
• How about Small Theta???
Final Remarks
• Big Oh is most often used among the 5 asymptotic
notations
– Easier to prove than Big Theta
– Info on upperbound of required resources is often more
important than a lowerbound
• Common BigOh classes:
– O(1) ⊂ O(log n) ⊂ O(n^0.5) ⊂ O(n) ⊂ O(n log n) ⊂ O(n^2)
– ⊂ O(n^c) ⊂ O(2^n) ⊂ O(n!) ⊂ O(n^n)
• Although BigOh describes an upperbound, it is ok to say
– This algorithm has a worst-case run-time of O(n^c).
– This algorithm has a best-case run-time of O(n^c).
– This algorithm has an average run-time of O(n^c).
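To get a feel for the ordering of the common BigOh classes listed above, here is a small sketch that tabulates a few of them for increasing n (the 2^n column pulls away from the rest almost immediately):

```python
import math

print(f"{'n':>6} {'log n':>8} {'n log n':>10} {'n^2':>10} {'2^n':>16}")
for n in (10, 20, 30, 40):
    print(f"{n:>6} {math.log2(n):>8.1f} {n * math.log2(n):>10.1f} "
          f"{n ** 2:>10} {2 ** n:>16}")
```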
References
• Cormen, Thomas H., et al., Introduction to Algorithms (2nd edition), The MIT Press, 2001.
• Zelle, John M., Python Programming: An Introduction to Computer Science, Franklin, Beedle & Associates, 2004.
