AoA Basics


1. What is an Algorithm?

An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for
obtaining a required output for any legitimate input in a finite amount of time.

2. What are Sequential Algorithms?


The central assumption of the RAM model is that instructions are executed one after
another, one operation at a time. Accordingly, algorithms designed to be executed on such machines
are called Sequential algorithms.

3. What is Algorithm Design Technique?


An algorithm design technique is a general approach to solving problems algorithmically
that is applicable to a variety of problems from different areas of computing.

4. Define Pseudo code.


Pseudocode is a mixture of a natural language and programming-language-like constructs.
Pseudocode is usually more precise than a natural language, and its usage often yields more
succinct algorithm descriptions.

5. Define Flowchart.
A method of expressing an algorithm by a collection of connected geometric shapes
containing descriptions of the algorithm’s steps.

6. Explain Algorithm’s Correctness


To prove an algorithm's correctness is to prove that it yields the required result for every
legitimate input in a finite amount of time.
Example: The correctness of Euclid's algorithm for computing the greatest common divisor
stems from the correctness of the equality gcd(m, n) = gcd(n, m mod n).
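
As an illustration (a sketch added here, not part of the original notes), the Python fragment below implements Euclid's algorithm; it relies on exactly the equality above, gcd(m, n) = gcd(n, m mod n), together with gcd(m, 0) = m, and terminates because the second argument strictly decreases.

    def gcd(m, n):
        # Repeatedly apply gcd(m, n) = gcd(n, m mod n) until n becomes 0.
        while n != 0:
            m, n = n, m % n
        # gcd(m, 0) = m, so m now holds the greatest common divisor.
        return m

    print(gcd(60, 24))   # 12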

7. What is the efficiency of an algorithm?


The efficiency of an algorithm can be precisely defined and investigated with mathematical
rigor. There are two kinds of algorithm efficiency:
1) Time efficiency – indicates how fast the algorithm runs.
2) Space efficiency – indicates how much extra memory the algorithm needs.

8. What is the generality of an algorithm?


It is a desirable characteristic of an algorithm. Generality refers both to the generality of the
problem the algorithm solves and to the set of inputs it accepts; it is sometimes easier to design an
algorithm for a problem posed in more general terms.

9. What is an algorithm's optimality?


Optimality concerns the complexity of the problem the algorithm solves: it asks for the
minimum amount of effort any algorithm will need to exert to solve the problem in question. An
algorithm that achieves this minimum is called optimal.

10. What do you mean by "worst-case efficiency" of an algorithm?


The worst-case efficiency of an algorithm is its efficiency for the worst-case input of size n,
which is an input (or inputs) of size n for which the algorithm runs the longest among all possible
inputs of that size.
Example: sorting a list of numbers into ascending order when the numbers are given in
descending order; in this case the running time will be the longest.

11. What do you mean by "best-case efficiency" of an algorithm?


The best-case efficiency of an algorithm is its efficiency for the best-case input of size n,
which is an input (or inputs) of size n for which the algorithm runs the fastest among all possible
inputs of that size.
Example: sorting a list of numbers into ascending order when the numbers are already given in
ascending order; in this case the running time will be the smallest.
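
As an illustration of these two cases (a sketch added here, not part of the original notes), the Python fragment below counts the key comparisons made by a simple insertion sort that produces ascending order: an already ascending list gives the best case (n − 1 comparisons) and a descending list gives the worst case (n(n − 1)/2 comparisons).

    def insertion_sort_comparisons(a):
        # Sort a copy of the list in ascending order and count key comparisons.
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0:
                comparisons += 1              # one key comparison
                if a[j] > key:
                    a[j + 1] = a[j]           # shift the larger element right
                    j -= 1
                else:
                    break
            a[j + 1] = key
        return comparisons

    n = 100
    print(insertion_sort_comparisons(range(1, n + 1)))   # best case: 99
    print(insertion_sort_comparisons(range(n, 0, -1)))   # worst case: 4950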

12. Define the "average-case efficiency" of an algorithm.


The average-case efficiency of an algorithm is its efficiency for a "typical" or "random"
input of size n; it is obtained by averaging the running time over all possible inputs of that size
(under some assumption about how likely each input is) and lies between the best-case and
worst-case efficiencies.

13. How to measure the algorithm’s efficiency?


It is logical to investigate the algorithm’s efficiency as a function of some parameter n
indicating the algorithm’s input size.
Example: It will be the size of the list for problems of sorting, searching, finding the list’s smallest
element, and most other problems dealing with lists.

14. What is called the basic operation of an algorithm?


The algorithm's most important operation, the one contributing the most to the total
running time, is called the basic operation of the algorithm.
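
For instance, in sequential search the basic operation is the comparison of the search key with a list element; the sketch below (added for illustration, not from the original notes) counts exactly that operation.

    def sequential_search(a, key):
        # Return (index of key or -1, number of key comparisons performed).
        comparisons = 0
        for i, item in enumerate(a):
            comparisons += 1                  # the basic operation: key comparison
            if item == key:
                return i, comparisons
        return -1, comparisons

    print(sequential_search([9, 4, 7, 1], 7))   # (2, 3): found after 3 comparisons
    print(sequential_search([9, 4, 7, 1], 5))   # (-1, 4): unsuccessful search, n comparisons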

15. How to measure an algorithm’s running time?


Let c_op be the time of execution of an algorithm's basic operation on a particular computer
and let C(n) be the number of times this operation needs to be executed for this algorithm. Then
we can estimate the running time T(n) of a program implementing this algorithm on that computer
by the formula
T(n) ≈ c_op · C(n)
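
As a worked illustration (assuming, hypothetically, a basic-operation count of C(n) = ½n(n − 1)), the constant c_op cancels when two input sizes are compared, so T(2n)/T(n) ≈ C(2n)/C(n) ≈ 4, i.e., doubling the input size roughly quadruples the running time.

    def C(n):
        # Hypothetical basic-operation count C(n) = n(n - 1) / 2.
        return n * (n - 1) / 2

    n = 1000
    # c_op cancels in the ratio, so the speed of the machine does not matter here.
    print(C(2 * n) / C(n))   # ~4.0: doubling n roughly quadruples T(n)
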
16. Define order of growth.
The efficiency analysis framework concentrates on the order of growth of an algorithm’s
basic operation count as the principal indicator of the algorithm’s efficiency. To compare and rank
such orders of growth we use three notations:
1) O (Big oh) notation
2) Ω (Big Omega) notation &
3) Θ (Big Theta) notation

17. Define Big oh notation. (May/June 2006, April/May 2008)


A function t(n) is said to be in O(g(n)), denoted t(n) ∈ O(g(n)), if t(n) is bounded above by
some constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some
non-negative integer n0 such that
t(n) ≤ c·g(n) for all n ≥ n0.

18. Prove that 100n + 5 ∈ O(n²).


Clearly 100n + 5 ≤ 100n + n (for all n ≥ 5) = 101n ≤ 101n².
By choosing n0 = 5 and c = 101 we find that 100n + 5 ∈ O(n²).
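
A quick numeric spot check of this bound (an illustration only, not a substitute for the proof): with c = 101 and n0 = 5, the inequality 100n + 5 ≤ c·n² holds for every n tested.

    c, n0 = 101, 5
    # Spot-check 100n + 5 <= c * n^2 for a range of n >= n0 (not a proof).
    assert all(100 * n + 5 <= c * n * n for n in range(n0, 10001))
    print("bound holds for n =", n0, "through 10000")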

19. Define Ω notation


A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈ Ω(g(n)), if t(n) is bounded below by
some positive constant multiple of g(n) for all large n, i.e., if there exist some positive constant c
and some non-negative integer n0 such that
t(n) ≥ c·g(n) for all n ≥ n0.

20. Prove that n³ ∈ Ω(n²).
Clearly n³ ≥ n² for all n ≥ 0, i.e., we can select c = 1 and n0 = 0.

21. Define Θ-notation


A function t(n) is said to be in Θ(g(n)), denoted t(n) ∈ Θ(g(n)), if t(n) is bounded both
above and below by some positive constant multiples of g(n) for all large n, i.e., if there exist some
positive constants c1 and c2 and some non-negative integer n0 such that
c2·g(n) ≤ t(n) ≤ c1·g(n) for all n ≥ n0.

22. Prove that ½n(n − 1) ∈ Θ(n²).


½n(n − 1) = ½n² − ½n ≤ ½n² for all n ≥ 0 (this proves the upper bound).
½n(n − 1) = ½n² − ½n ≥ ½n² − ½n · ½n (for all n ≥ 2) = ¼n² (this proves the lower bound).
Hence we can select c2 = ¼, c1 = ½ and n0 = 2.
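
A quick spot check of these constants (an illustration only, not part of the proof): with c2 = ¼, c1 = ½ and n0 = 2, both inequalities hold for every n tested, e.g., for n = 10 we get 25 ≤ 45 ≤ 50.

    c1, c2, n0 = 0.5, 0.25, 2
    t = lambda n: 0.5 * n * (n - 1)
    g = lambda n: n * n
    # Spot-check c2*g(n) <= t(n) <= c1*g(n) for a range of n >= n0 (not a proof).
    assert all(c2 * g(n) <= t(n) <= c1 * g(n) for n in range(n0, 10001))
    print("Theta bounds hold for n =", n0, "through 10000")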

23. What is the use of Asymptotic Notations?


The notations O, Ω and Θ are used to indicate and compare the asymptotic orders of
growth of functions expressing algorithm efficiencies.
