CSC 305 Lec1 - Analysis of Algorithms

The document discusses the analysis of algorithms, focusing on their time and space complexity as measures of efficiency. It explains the importance of asymptotic analysis, including worst-case, best-case, and average-case scenarios, and introduces concepts like rate of growth and correctness proofs for algorithms. Additionally, it highlights the use of Big O notation for estimating algorithm performance and the need for choosing appropriate algorithms for specific problems.

Analysis of Algorithms

Introduction

In the theoretical analysis of algorithms, it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity function for arbitrarily large inputs. The term "analysis of algorithms" was coined by Donald Knuth.

Algorithm analysis is an important part of computational complexity theory, which provides theoretical estimates of the resources an algorithm requires to solve a specific computational problem. Most algorithms are designed to work with inputs of arbitrary length. Analysis of algorithms is the determination of the amount of time and space resources required to execute an algorithm.

Algorithm complexity is a measure of the efficiency of an algorithm in terms of the resources it consumes as the input size grows. It provides insight into how the algorithm's performance scales with larger input data.
There are tools that measure how fast a program runs. These tools, called profilers, measure running time in milliseconds and can help us optimise our code by spotting bottlenecks, but they are not relevant to algorithm complexity. Complexity analysis is a tool that allows us to explain how an algorithm behaves as the input grows larger. If our algorithm takes 1 second to run for an input of size 1000, how will it behave if we double the input size? Will it run just as fast, half as fast, or four times slower? Complexity analysis allows us to predict how our algorithm will behave when the input data becomes larger.
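As a rough sketch (the two functions below are illustrative, not from the lecture), we can count the steps two hypothetical algorithms take and watch how each responds when the input size doubles:

```python
def linear_steps(n):
    """One pass over the input: steps grow proportionally to n."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_steps(n):
    """A nested pass (e.g. comparing every pair): steps grow with n squared."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

print(linear_steps(1000), linear_steps(2000))      # doubling n doubles the steps
print(quadratic_steps(100), quadratic_steps(200))  # doubling n quadruples the steps
```

Doubling the input doubles the work of the first algorithm but quadruples the work of the second, which is exactly the kind of prediction complexity analysis makes without running the code at all.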
Algorithm complexity analysis studies an algorithm at the idea level (the idea of how something is computed), ignoring details such as the programming language used, the hardware the algorithm runs on, or the instruction set of the given CPU.
The complexity analysis of algorithms can be studied in two aspects:
• Time Complexity: The processing time of the algorithm.
• Space Complexity: Memory space required to run the algorithm.

Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps it takes, known as time complexity, or to the amount of memory it uses, known as space complexity. An algorithm is said to be efficient when this function's values are small, or grow slowly compared to the growth in the size of the input. Different inputs of the same size may cause the algorithm to behave differently:

• Worst-case − the maximum number of steps taken on any instance of size n.
• Best-case − the minimum number of steps taken on any instance of size n.
• Average-case − the average number of steps taken over all instances of size n.
• Amortized − the cost of a sequence of operations applied to an input of size n, averaged over the sequence.
Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity), so best-, worst- and average-case descriptions might all be of practical interest. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst-case inputs to the algorithm.
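Linear search over a list is a standard illustration of these cases (the comparison-counting version below is a sketch, not the lecture's code):

```python
def linear_search(arr, target):
    """Return (index, comparisons); comparisons is our cost measure."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))  # best case: target first, 1 comparison
print(linear_search(data, 5))  # worst case (present): target last, 5 comparisons
print(linear_search(data, 8))  # worst case (absent): all n = 5 comparisons
```

The same n = 5 input size yields anywhere from 1 to 5 comparisons depending on where (or whether) the target appears, which is why the worst case is quoted as the default upper bound.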

Big O notation, Big-omega notation and Big-theta notation are used to express such asymptotic estimates. For instance, binary search is said to run in a number of steps proportional to the logarithm of the size n of the sorted list being searched, or in O(log n), colloquially "in logarithmic time". Asymptotic estimates are usually used because different implementations of the same algorithm may differ in efficiency. However, the efficiencies of any two "reasonable" implementations of a given algorithm are related by a constant multiplicative factor called a hidden constant.
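A minimal sketch of binary search, with a step counter added purely for illustration, shows the logarithmic bound in action:

```python
def binary_search(sorted_list, target):
    """Halve the search range each step: about log2(n) iterations."""
    low, high = 0, len(sorted_list) - 1
    steps = 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if sorted_list[mid] == target:
            return mid, steps
        elif sorted_list[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, steps

data = list(range(1024))            # n = 1024, log2(n) = 10
print(binary_search(data, 777))     # found in at most ~log2(n) + 1 steps
```

Even for a million-element list, the loop runs only about 20 times, whereas a linear search could need a million comparisons.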

Exact (not asymptotic) measures of efficiency can sometimes be computed, but they usually require certain assumptions about the particular implementation of the algorithm, called a model of computation.

The Need for Analysis


Analysis of algorithms is needed in order to choose a better algorithm for a particular problem, since one computational problem can be solved by different algorithms. By considering an algorithm for a specific problem, we can begin to develop pattern recognition, so that similar types of problems can be solved with the help of this algorithm.

Algorithms are often quite different from one another, even though their objectives are the same. For example, we know that a set of numbers can be sorted using different algorithms. The number of comparisons performed by one algorithm may differ from another's for the same input; hence, the time complexity of those algorithms may differ. At the same time, we need to calculate the memory space required by each algorithm.
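As a sketch of this point (the comparison counters are added for illustration), two standard sorting algorithms can perform very different numbers of comparisons on the same input:

```python
def insertion_sort_comparisons(arr):
    """Insertion sort, returning (sorted_list, comparison_count)."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1                 # compare a[j-1] with a[j]
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, comparisons

def selection_sort_comparisons(arr):
    """Selection sort, returning (sorted_list, comparison_count)."""
    a = list(arr)
    comparisons = 0
    for i in range(len(a)):
        min_idx = i
        for j in range(i + 1, len(a)):
            comparisons += 1                 # compare a[j] with current minimum
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]
    return a, comparisons

data = [1, 2, 3, 4, 5]                       # already sorted
print(insertion_sort_comparisons(data)[1])   # 4: insertion sort adapts to the input
print(selection_sort_comparisons(data)[1])   # 10: selection sort always does n(n-1)/2
```

Both produce the same sorted output, but their comparison counts on this input differ, which is precisely why we analyse each algorithm separately.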

Rate of Growth
Rate of growth is defined as the rate at which the running time of the algorithm is increased
when the input size is increased.

The growth rate can be broadly categorized into two types: linear and exponential. If the running time of the algorithm increases linearly with an increase in input size, it has a linear growth rate. If the running time increases exponentially with the increase in input size, it has an exponential growth rate.
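One way to see the two growth rates side by side (a sketch using Fibonacci, not an example from the lecture) is to count the work done by a linear-time iterative version against the calls made by the exponential-time naive recursive version:

```python
def fib_calls_naive(n):
    """Count the calls made by naive recursive Fibonacci: grows exponentially."""
    if n < 2:
        return 1
    return 1 + fib_calls_naive(n - 1) + fib_calls_naive(n - 2)

def fib_steps_iterative(n):
    """Iterative Fibonacci does one loop pass per value: grows linearly."""
    steps = 0
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
        steps += 1
    return steps

for n in (10, 20, 30):
    print(n, fib_steps_iterative(n), fib_calls_naive(n))
```

Going from n = 10 to n = 30 triples the iterative step count but multiplies the recursive call count by thousands; that explosion is what "exponential growth rate" means in practice.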

Proving Correctness of an Algorithm


Once an algorithm is designed to solve a problem, it becomes very important that the algorithm
always returns the desired output for every input given. So, there is a need to prove the
correctness of an algorithm designed. This can be done using various methods −
Proof by Counterexample: Search for an input on which the algorithm fails. If no counterexample can be found, the algorithm is taken to be correct. If a counterexample is found, the algorithm is incorrect, and another algorithm that also handles that input must be designed.
Proof by Induction: Using mathematical induction, we can prove an algorithm is correct for all inputs by proving it is correct for a base-case input, say 1, then assuming it is correct for an arbitrary input k and proving it is correct for k+1.
Proof by Loop Invariant: Find a loop invariant, a property that holds before and after every iteration of the loop. Prove that it holds before the loop starts (the base case), then apply mathematical induction to show that each iteration preserves it, so that when the loop terminates the invariant implies the algorithm is correct.
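A small sketch of the loop-invariant style of argument, with the invariant recorded in comments (the function is illustrative, not from the lecture):

```python
def array_sum(arr):
    """Sum an array; invariant: before iteration i, total == arr[0] + ... + arr[i-1]."""
    total = 0                      # initialization: before i = 0, total equals the
                                   # sum of the empty prefix, so the invariant holds
    for i in range(len(arr)):
        total += arr[i]            # maintenance: adding arr[i] re-establishes the
                                   # invariant for i + 1
    return total                   # termination: i == len(arr), so the invariant
                                   # says total is the sum of the whole array

print(array_sum([2, 4, 6, 8]))  # 20
```

The three labelled steps (initialization, maintenance, termination) are exactly the base case and inductive step of the mathematical-induction argument above.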

When carrying out the analysis, we will concentrate on the computation instructions and not
things such as networking tasks or user input and output. We will assume that the following
computation operations are executed as one instruction each:
• Assigning a value to a variable
• Looking up the value of a particular element in an array
• Comparing two values
• Incrementing a value
• Basic arithmetic operations such as addition, multiplication, etc.
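Under this unit-cost model we can tally the instructions for a simple loop that sums an n-element array. The bookkeeping below is one plausible count, for illustration only; the exact constant depends on which operations you choose to charge for, but the total is always linear in n:

```python
def sum_with_count(arr):
    """Sum an array while tallying the unit-cost operations listed above."""
    count = 1                # total = 0 : one assignment
    total = 0
    for x in arr:            # per element:
        count += 1           #   loop bookkeeping (comparison/increment), charged as 1
        count += 1           #   looking up the element
        total += x
        count += 2           #   one addition and one assignment for total += x
    count += 1               # final loop check that exits
    return total, count      # about 4n + 2 instructions overall

print(sum_with_count([5, 5, 5]))  # (15, 14)
```

Whatever charging convention is chosen, the count is c·n + d for some small constants, which is why asymptotic analysis can safely ignore these details and call the loop linear.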
