What Is an Algorithm? A Clearly Specified Set of Instructions to Be Followed to Solve a Problem

1. An algorithm is a clearly specified set of instructions to solve a problem by taking inputs and producing outputs. Algorithms use data structures to organize data.
2. It is important to analyze algorithms to determine their efficiency, since a program may be inefficient when run on large data sets. The resources considered include time, memory, bandwidth, and disk usage.
3. Big-O notation describes how fast a function grows relative to the size of its input. It provides an upper bound on the growth rate of an algorithm's running time. Common time complexities include constant O(1), linear O(N), and quadratic O(N²) time.


Introduction

• What is Algorithm?
– a clearly specified set of simple instructions to be followed to solve a
problem
• Takes a set of values as input and
• produces a value, or set of values, as output
– May be specified
• In English
• As a computer program
• As pseudocode
• Data structures
– Methods of organizing data
• Program = algorithms + data structures

Lecture 1 Algorithms Analysis 1


Introduction

• Why do we need algorithm analysis?


– writing a working program is not good enough
– The program may be inefficient!
– If the program is run on a large data set, then the
running time becomes an issue



Example: Selection Problem
• Given a list of N numbers, determine the kth
largest, where k ≤ N.
• Algorithm 1:
(1) Read N numbers into an array
(2) Sort the array in decreasing order by some simple
algorithm
(3) Return the element in position k
Let N = 10, k = 5
A[] = {2, 4, 6, 7, 46, 8, 1, 3, 9, 5}
Sorted (decreasing): 46 9 8 7 6 5 4 3 2 1
The 5th largest is 6 (A[4] with 0-based indexing)
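Algorithm 1 is a one-liner once a sort is available; a minimal Python sketch, assuming 1-based k as on the slide:

```python
def kth_largest_sort(a, k):
    # Algorithm 1: sort the whole array in decreasing order,
    # then return the element in position k (1-based).
    return sorted(a, reverse=True)[k - 1]
```

The sort dominates the cost, so with the "simple algorithm" the slide assumes (an O(N²) sort), the whole procedure is quadratic.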
Example: Selection Problem…
• Algorithm 2:
(1) Read the first k elements into an array and sort
them in decreasing order
(2) Each remaining element is read one by one
• If smaller than the kth element, then it is ignored
• Otherwise, it is placed in its correct spot in the array,
bumping one element out of the array.
(3) The element in the kth position is returned as
the answer.
Let N = 10, k = 5
A[] = {2, 4, 6, 7, 46, 8, 1, 3, 9, 5}
Sort the first 5 elements (decreasing): 46 7 6 4 2
Read 8: larger than the 5th element (2), insert → 46 8 7 6 4
Read 1, 3: smaller than the 5th element (4), ignored
Read 9: larger than 4, insert → 46 9 8 7 6
Read 5: smaller than 6, ignored
The 5th largest is 6
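A sketch of Algorithm 2 in Python. One assumption: the working array is kept in increasing order (the slide's descending array, mirrored) so the current kth-largest candidate sits at the front and `bisect` can place new elements in their correct spot:

```python
import bisect

def kth_largest_partial(a, k):
    # Algorithm 2: keep only the k largest elements seen so far.
    buf = sorted(a[:k])              # first k elements, increasing order
    for x in a[k:]:                  # each remaining element, one by one
        if x > buf[0]:               # larger than the current kth element?
            bisect.insort(buf, x)    # place it in its correct spot
            buf.pop(0)               # bump the smallest out of the array
    return buf[0]                    # the element in the kth position
```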



Example: Selection Problem…

• Which algorithm is better when


– N =100 and k = 100?
– N =100 and k = 1?
• What happens when N = 1,000,000 and k =
500,000?
• There exist better algorithms



Algorithm Analysis
• We only analyze correct algorithms
• An algorithm is correct
– If, for every input instance, it halts with the correct output
• Incorrect algorithms
– Might not halt at all on some input instances
– Might halt with other than the desired answer
• Analyzing an algorithm
– Predicting the resources that the algorithm requires
– Resources include
• Memory
• Communication bandwidth
• Computational time (usually most important)



Algorithm Analysis…
• Factors affecting the running time
– computer
– compiler
– algorithm used
– input to the algorithm
• The content of the input affects the running time
• typically, the input size (number of items in the input) is the main
consideration
– E.g. sorting problem → the number of items to be sorted
– E.g. multiplying two matrices → the total number of elements
in the two matrices
• Machine model assumed
– Instructions are executed one after another, with no concurrent
operations → not a parallel computer



Example

• Calculate ∑(i=1..N) i³

1  sum = 0                      1
2  for i = 1 to N               2N + 2
3      sum = sum + i * i * i    4N
4  return sum                   1

• Lines 1 and 4 count for one unit each
• Line 3: executed N times, each time four units
• Line 2: 1 for the initialization, N + 1 for all the tests, N for all the
increments; total 2N + 2
• Total cost: 6N + 4 → O(N)
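The counted loop translates directly into a runnable Python version; the slide's unit counts are noted in the comments:

```python
def cubes_sum(n):
    # Computes the sum of i**3 for i = 1..N in O(N) time.
    total = 0                  # line 1: 1 unit
    for i in range(1, n + 1):  # line 2: 2N + 2 units (init, tests, increments)
        total += i * i * i     # line 3: 4N units (2 multiplies, 1 add, 1 assign)
    return total               # line 4: 1 unit
```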
Worst- / average- / best-case
• Worst-case running time of an algorithm
– The longest running time for any input of size n
– An upper bound on the running time for any input
→ a guarantee that the algorithm will never take longer
– Example: Sort a set of numbers in increasing order; and the data is in
decreasing order
– The worst case can occur fairly often
• E.g. in searching a database for a particular piece of information
• Best-case running time
– sort a set of numbers in increasing order; and the data is already in
increasing order
• Average-case running time
– May be difficult to define what “average” means



Growth Rate

• The idea is to establish a relative order among functions for large n


• ∃ c, n0 > 0 such that f(N) ≤ c g(N) when N ≥ n0
• f(N) grows no faster than g(N) for “large” N



Big O notation

• Big O notation (with a capital letter O, not a


zero), also called Landau's symbol, is a
symbolism used in complexity theory,
computer science, and mathematics to
describe the asymptotic behavior of functions.
• Basically, it tells you how fast a function grows
or declines.



Big O notation



• log N = a ⇔ N = 2^a
• log 16 = 4, since 16 = 2^4
• log 32 = 5, since 32 = 2^5



Big O notation

How efficient is an algorithm or piece of code?


Efficiency covers lots of resources, including:
• CPU (time) usage
• memory usage
• disk usage
• network usage
All are important but we will mostly talk about
time complexity (CPU usage).
Big O notation

• Be careful to differentiate between:


1. Performance: how much time/memory/disk/... is
actually used when a program is run. This depends on the
machine, compiler, etc. as well as the code.
2. Complexity: how do the resource requirements of a
program or algorithm scale, i.e., what happens as the size
of the problem being solved gets larger?
• Complexity affects performance but not the other way
around.



Big O notation
The time required by a function/procedure is proportional
to the number of "basic operations" that it performs. Here
are some examples of basic operations:
• one arithmetic operation (e.g., +, *).
• one assignment (e.g. x := 0)
• one test (e.g., x = 0)
• one read (of a primitive type: integer, float, character,
boolean)
• one write (of a primitive type: integer, float, character,
boolean)
We express complexity using big-O notation.
For a problem of size N:
• a constant-time algorithm is "order 1": O(1)
• a linear-time algorithm is "order N": O(N)
• a quadratic-time algorithm is "order N
squared": O(N²)



Big-Oh Notation

• f(N) = O(g(N))
• There are positive constants c and n0 such that
f(N) ≤ c g(N) when N ≥ n0

• The growth rate of f(N) is less than or equal to


the growth rate of g(N)
• g(N) is an upper bound on f(N)



Example

• f(N) = 3N² + 5.
• We can show that f(N) is O(N²) by finding constants
with f(N) ≤ c g(N):
3N² + 5 ≤ c N²
• Try c = 4:
n = 0: 5 ≤ 0 false
n = 1: 8 ≤ 4 false
n = 2: 17 ≤ 16 false
n = 3: 32 ≤ 36 true
n = 10: 305 ≤ 400 true
• So c = 4 and n0 = 3 work.
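The table of trials can be checked mechanically; a small sketch, where `bound_holds` is a hypothetical helper (not from the slides):

```python
def bound_holds(n, c=4):
    # Does 3n^2 + 5 <= c * n^2 hold, as required for f(N) = O(N^2)?
    return 3 * n * n + 5 <= c * n * n
```

With c = 4 the inequality first holds at n = 3 and stays true afterwards, so n0 = 3 is a valid choice.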
• f(N) is not O(N), because whatever constant c
and value n0 you choose, there is always
a value of N > n0 such that
3N² + 5 > c N



Big-Oh: example

• Let f(N) = 2N². Then

– f(N) = O(N⁴)   (c = 2, n0 = 0)
– f(N) = O(N³)   (c = 2, n0 = 0)
– f(N) = O(N²)   (best answer, asymptotically tight; c = 2)

• O(N²): read “order N-squared” or “Big-Oh N-squared”



Big Oh: more examples
• N²/2 − 3N = O(N²)
• 1 + 4N = O(N)   (c = 5, n0 = 1)
• 7N² + 10N + 3 = O(N²) = O(N³)
• log10 N = log2 N / log2 10 = O(log2 N) = O(log N)
• sin N = O(1); 10 = O(1); 10¹⁰ = O(1)
• ∑(i=1..N) i ≤ N · N = O(N²)
• ∑(i=1..N) i² ≤ N · N² = O(N³)
• log N + N = O(N)
• log^k N = O(N) for any constant k
• N = O(2^N), but 2^N is not O(N)
• 2^(10N) is not O(2^N)
Some rules

When considering the growth rate of a function using Big-Oh


• Ignore the lower order terms and the coefficients of the
highest-order term
• No need to specify the base of logarithm
– Changing the base from one constant to another changes the value of
the logarithm by only a constant factor

• If f1(N) = O(f(N)) and f2(N) = O(g(N)), then

– f1(N) + f2(N) = max(O(f(N)), O(g(N)))
– f1(N) * f2(N) = O(f(N) * g(N))
Big-Omega

• ∃ c, n0 > 0 such that f(N) ≥ c g(N) when N ≥ n0

• f(N) grows no slower than g(N) for “large” N
Big-Omega

• f(N) = Ω(g(N))
• There are positive constants c and n0 such that
f(N) ≥ c g(N) when N ≥ n0

• The growth rate of f(N) is greater than or


equal to the growth rate of g(N).



Big-Omega: examples

• Let f(N) = 2N². Then

– f(N) = Ω(N)
(c = 1: N = 1 gives 2 ≥ 1, N = 2 gives 8 ≥ 2)
– f(N) = Ω(N²) (best answer)



f(N) = Θ(g(N))

• the growth rate of f(N) is the same as the growth rate of g(N)



Big-Theta

• f(N) = Θ(g(N)) if
f(N) = O(g(N)) and f(N) = Ω(g(N))
• The growth rate of f(N) equals the growth rate of
g(N)
• Example: let f(N) = N², g(N) = 2N²
– Since f(N) = O(g(N)) and f(N) = Ω(g(N)),
thus f(N) = Θ(g(N)).
• Big-Theta means the bound is the tightest
possible.
Some rules

• If f(N) is a polynomial of degree k, then

f(N) = Θ(N^k).

• For logarithmic functions,

log_m N = Θ(log N) for any constant base m.



Typical Growth Rates





Determine Complexity

• In general, how can you determine the


running time of a piece of code?
• The answer is that it depends on what kinds of
statements are used.



Determine Complexity
• Sequence of statements
statement 1;
statement 2;
...
statement k;
The total time is found by adding the times for all
statements:
Total time = time(statement 1) + time(statement 2) + ...
+ time(statement k)
Determine Complexity

• If each statement is "simple" (only involves


basic operations) then the time for each
statement is constant and the total time is
also constant: O(1).



Determine Complexity
If-Then-Else
if (cond) then
block 1 (sequence of statements)
else
block 2 (sequence of statements)
end if;
Here, either block 1 will execute, or block 2 will execute.
Therefore, the worst-case time is the slower of the two
possibilities: max(time(block 1), time(block 2))
If block 1 takes O(1) and block 2 takes O(N), the if-then-else
statement would be O(N).
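As an illustration (not from the slides), a function whose two branches have different costs; the worst case is the slower branch, O(N):

```python
def first_or_sum(a, take_first):
    # If-then-else: block 1 is O(1), block 2 is O(N),
    # so the statement as a whole is O(N) in the worst case.
    if take_first:
        return a[0]        # block 1: constant time
    total = 0
    for x in a:            # block 2: linear time
        total += x
    return total
```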
Determine Complexity

Loops
for I in 1 .. N loop
sequence of statements
end loop;
The loop executes N times, so the sequence of
statements also executes N times. If we assume
the statements are O(1), the total time for the
for loop is N * O(1), which is O(N) overall.
Determine Complexity

Nested loops
for I in 1 .. N loop
for J in 1 .. M loop
sequence of statements
end loop;
end loop;



Determine Complexity

• The outer loop executes N times. Every time the


outer loop executes, the inner loop executes M
times. As a result, the statements in the inner loop
execute a total of N * M times. Thus, the complexity
is O(N * M).
• In the common special case where the stopping
condition of the inner loop is J < N instead of J < M
(so the inner loop also executes N times), the total
complexity for the two loops is O(N²).
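Counting inner-statement executions confirms the N * M figure; a small sketch:

```python
def inner_executions(n, m):
    # The nested loops above: the inner statement runs N * M times.
    count = 0
    for i in range(n):       # outer loop: N iterations
        for j in range(m):   # inner loop: M iterations each time
            count += 1       # the "sequence of statements"
    return count
```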



Examples

True or False, and why?

n² = O(n³)
n³ = O(n²)
n/3 = O(n)
3ⁿ = O(2ⁿ)



Calculate the number of operations and find
O(g(n)):
1) Algorithm printArray(A, n)
i ← 0                        (1 assignment)
while i < n do               (n + 1 comparisons)
    cout << A[i] << endl     (n outputs)
    i++                      (n increments)
So: 1 + (n + 1) + n + n = 3n + 2 operations → O(n)
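The same count can be reproduced in code; a Python sketch that prints the array and tallies operations the way the slide does (one assignment, then comparisons, outputs, and increments):

```python
def print_array_ops(a):
    # Returns the operation count for the printArray loop: 3n + 2.
    n = len(a)
    ops = 1                # i <- 0: one assignment
    i = 0
    while True:
        ops += 1           # one comparison i < n (n + 1 in total)
        if not i < n:
            break
        print(a[i])
        ops += 1           # one output (n in total)
        i += 1
        ops += 1           # one increment (n in total)
    return ops
```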
Adding matrices: C = A + B, dim = n × n
for i ← 1 .. n do                      (n + 1)
    for j ← 1 .. n do                  (n(n + 1))
        C[i, j] ← A[i, j] + B[i, j]    (n²)
Total: 2n² + 2n + 1
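The matrix-addition loop nest, as a runnable sketch:

```python
def add_matrices(a, b):
    # C = A + B for n x n matrices: n^2 element additions -> O(n^2).
    n = len(a)
    c = [[0] * n for _ in range(n)]
    for i in range(n):                      # outer loop: n + 1 tests
        for j in range(n):                  # inner loops: n(n + 1) tests overall
            c[i][j] = a[i][j] + b[i][j]     # n^2 additions and assignments
    return c
```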



For this program, assume that for n = 100 the code took
0.6 sec. How long will it take to run for n = 200?
Speed = #Steps / Time
Steps = 2n² + 2n + 1

Speed = (2·100² + 2·100 + 1) / 0.6 = 33668 steps/sec

Time for n = 200
= (2·200² + 2·200 + 1) / 33668 ≈ 2.4 sec
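The extrapolation above is just the step formula applied twice; a sketch (the function name is illustrative, not from the slides):

```python
def extrapolate_time(n_old, t_old, n_new):
    # Estimate running time at n_new from one measurement at n_old,
    # assuming the program executes 2n^2 + 2n + 1 steps.
    def steps(n):
        return 2 * n * n + 2 * n + 1
    speed = steps(n_old) / t_old      # steps per second
    return steps(n_new) / speed       # seconds
```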
Speed = #Steps / Time

Time = #Steps / Speed

#Steps = Time × Speed

