Lecture 1
Algorithms
• We have already seen many algorithms:

Sorting
- Insertion Sort
- Bubble Sort
- Merge Sort
- Quick Sort
- Heap Sort
- Radix Sort
- Counting Sort

Graph Searching
- Breadth First Search
- Depth First Search
- Tree Traversal

Graph Algorithms
- Shortest Path
- Minimum Spanning Tree

Searching
- Linear Search
- Binary Search

And many more.
Key Questions
Given a problem:
1st Question
Does a solution/algorithm exist?
Do we know of any problem for which no algorithm exists?
2nd Question
If a solution exists, is there an alternative, better solution?
3rd Question
What is the least time required to solve the problem?
- lower bound results
4th Question
Does there exist an algorithm that solves the problem in the least time?
Key Questions
5th Question
Is the known solution polynomial time?
What about primality?
6th Question
If the known solution is not polynomial time, does (or will) there exist a polynomial time solution?
7th Question
Can we prove that no polynomial time solution will
ever exist?
8th Question
If we don't know a polynomial time solution and the answer to the 7th Question is no, then what?
Sorting Problem
• Input: a sequence of n numbers ⟨a₁, a₂, …, aₙ⟩
• Output: a permutation (reordering) ⟨a′₁, a′₂, …, a′ₙ⟩ of the input sequence such that a′₁ ≤ a′₂ ≤ … ≤ a′ₙ
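As an illustration, here is a minimal Python sketch of one sorting algorithm (insertion sort) that meets this specification; the function name and sample values are our own, not from the slides.

def insertion_sort(a):
    # Return a new list with the elements of `a` in nondecreasing order.
    result = list(a)  # work on a copy, so the input sequence is unchanged
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:  # shift larger elements right
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]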
Asymptotic Analysis
The main idea of asymptotic analysis is to have a measure of the efficiency of algorithms that doesn't depend on machine-specific constants and doesn't require the algorithms to be implemented.
Asymptotic notations are mathematical tools to represent the time complexity of algorithms.
Asymptotic Notation
Asymptotic notation allows us to analyze an algorithm's running time by describing its behaviour as the input size increases. This is also known as the algorithm's growth rate.
We are usually interested in an upper bound on the running time of an algorithm, i.e., its worst case.
The asymptotic notation for this is Big-O.
Big O
Formal mathematical definition of Big O.
Let T(n) and f(n) be two positive functions.
We write T(n) = O(f(n)), and say that T(n) is of order f(n),
if there are positive constants M and n₀ such that
T(n) ≤ M·f(n) for all n ≥ n₀.
Example
Suppose T(n) = 6n² + 8n + 10
Then we know for n ≥ 1
6n² ≤ 6n²
8n ≤ 8n²
& 10 ≤ 10n²
Therefore T(n) ≤ 24n² for all n ≥ 1
& T(n) = O(n²)
In this case M = 24 & n₀ = 1
Note: M & n₀ are not unique
Can we find different M & n₀ for the above example?
M = 10 & n₀ = 4 also works: 6n² + 8n + 10 ≤ 10n² exactly when 8n + 10 ≤ 4n², which holds for all n ≥ 4.
We would say that the running time of this algorithm grows as n².
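A quick numeric sanity check of both constant pairs (a Python sketch of ours, not from the slides):

def T(n):
    return 6 * n**2 + 8 * n + 10

# Check T(n) <= M * n^2 for each (M, n0) pair over a sample range.
for M, n0 in [(24, 1), (10, 4)]:
    assert all(T(n) <= M * n**2 for n in range(n0, 10000))
print("Both (M, n0) pairs satisfy the Big-O inequality on the sampled range.")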
Asymptotic Notation
By dropping the less significant terms and the constant coefficients, we can focus on the important part of an algorithm's running time, its rate of growth, without getting mired in details that complicate our understanding.
Sloppy Notation
The notation T(n) ∊ O(f(n)) can be used even when f(n) grows much faster than T(n).
For example, if T(n) = 6n² + 8n + 10, we may write T(n) = O(n³).
This is indeed true, but not very useful.
Example: Comparing Algorithms
Comparison between algorithms can be done easily. Let's take an example to understand this:
Suppose the time required by two algorithms is given by the following expressions:
Expression 1: 20n² + 3n - 4
Expression 2: n³ + 100n - 2
As per asymptotic notation, we should only worry about how each function grows as the value of n (the input size) grows,
and that depends entirely on n² for Expression 1 and on n³ for Expression 2.
Hence we can clearly say that the running time represented by Expression 2 grows faster than the other one, simply by comparing the highest powers of n.
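A small sketch (our own illustration) that tabulates both expressions and shows where the cubic one overtakes:

def t1(n):
    return 20 * n**2 + 3 * n - 4   # Expression 1

def t2(n):
    return n**3 + 100 * n - 2      # Expression 2

for n in [1, 10, 25, 100, 1000]:
    print(n, t1(n), t2(n))
# t2 briefly dips below t1 around n = 10, but its n³ term dominates for
# all larger n: by n = 25 it has overtaken t1 for good.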
Good & Bad Solutions
When analysing algorithms we will often come across the following time complexities.

Good news:
- O(1)
- O(log n)
- O(n)
- O(n log n)
- O(nᵏ), k ≥ 2

Bad news:
- O(kⁿ), k ≥ 2
- O(n!)
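To see why the last two rows are bad news, here is a small sketch (ours) printing each complexity at a few input sizes:

import math

for n in [10, 20, 30]:
    print(f"n={n}: n log n={n * math.log2(n):.0f}, n^2={n**2}, "
          f"2^n={2**n}, n!={math.factorial(n):.2e}")
# 2^n and n! explode long before the polynomial terms become large.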
Other Asymptotic Notations
Big Omega (Ω)
Used to give a lower bound.
It is important to know how many computations are necessary to solve the problem;
equivalently, any solution to the problem has to perform that many computations.
Lower bounds are generally not easy to prove; we will return to them later.
Big Theta (Θ)
Used to indicate that a function is bounded both from above and from below.
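For reference, the standard definitions, mirroring the Big-O definition given earlier (textbook definitions, not stated on these slides):
T(n) = Ω(f(n)) if there are positive constants c and n₀ such that T(n) ≥ c·f(n) for all n ≥ n₀.
T(n) = Θ(f(n)) if T(n) = O(f(n)) and T(n) = Ω(f(n)).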
How to Determine Complexities
How do we determine the running time of a piece of code?
• The answer is that it depends on what kinds of statements are used.

Sequence of statements
statement 1;
statement 2;
...
statement k;
The total time is found by adding the times for all statements:
total time = time(statement 1) + time(statement 2) + ... + time(statement k)
If each statement is "simple" (only involves basic operations), then the time for each statement is constant and the total time is also constant: O(1)
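For instance, a straight-line sequence of simple statements (a Python sketch of ours) runs in constant time:

def midpoint(x1, y1, x2, y2):
    # Three simple statements, each O(1); total = O(1) + O(1) + O(1) = O(1).
    mx = (x1 + x2) / 2
    my = (y1 + y2) / 2
    return (mx, my)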
How to Determine Complexities
If-Then-Else
if (cond) then
  block 1 (sequence of statements)
else
  block 2 (sequence of statements)
end if;
Here, either block 1 will execute or block 2 will execute.
Therefore, the worst-case time is the slower of the two possibilities: max(time(block 1), time(block 2))
If block 1 takes O(1) and block 2 takes O(N), the if-then-else statement would be O(N).
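A concrete sketch (ours) where one branch is O(1) and the other is O(N), so the worst case is O(N):

def first_or_total(nums, want_total):
    if not want_total:
        return nums[0]   # block 1: O(1)
    else:
        total = 0        # block 2: O(N), touches every element
        for x in nums:
            total += x
        return total
# Worst case: max(O(1), O(N)) = O(N).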
How to Determine Complexities
Loops
for I in 1 .. N loop
sequence of statements
end loop;
• The loop executes N times, so the sequence of
statements also executes N times.
• If we assume the statements are O(1), then the total time for the for loop is N * O(1), which is O(N) overall.
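The same pattern in Python (a sketch of ours): an O(1) body executed N times gives O(N):

def sum_first_n(N):
    total = 0
    for i in range(1, N + 1):  # executes N times
        total += i             # O(1) body
    return total               # N * O(1) = O(N) overall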
How to Determine Complexities
Nested loops
for I in 1 .. N loop
for J in 1 .. M loop
sequence of statements
end loop;
end loop;
• The outer loop executes N times.
• Every time the outer loop executes, the inner loop executes
M times.
• As a result, the statements in the inner loop execute a total of N * M times, so the nested loops take O(N * M) time overall.
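A matching Python sketch (ours) that counts how often the inner body runs:

def count_pairs(N, M):
    count = 0
    for i in range(N):       # outer loop: N iterations
        for j in range(M):   # inner loop: M iterations each time
            count += 1       # executes N * M times in total
    return count

assert count_pairs(3, 4) == 12  # 3 * 4, i.e., O(N * M)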
Strong Belief