
Algorithm Complexity

An algorithm is a well-defined sequential computational technique that accepts a value or a collection of values as input and produces the output(s) needed to solve a problem.
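For instance (a minimal sketch of my own, not from the text), the Python function below is an algorithm in this sense: it accepts a collection of values as input and produces the output needed to solve a small problem, namely finding the largest value.

def find_max(values):
    """Return the largest element of a non-empty collection of values."""
    largest = values[0]
    for value in values[1:]:          # inspect each remaining input value once
        if value > largest:
            largest = value
    return largest

print(find_max([3, 41, 7, 19]))       # output: 41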

Complexity in algorithms refers to the amount of resources (such as time or memory) required to solve a problem or perform a task. The most common measure of complexity is time complexity, which refers to the amount of time an algorithm takes to produce a result as a function of the size of the input. Space (memory) complexity refers to the amount of memory used by an algorithm. Algorithm designers strive to develop algorithms with the lowest possible time and memory complexities, since this makes them more efficient and scalable.

The complexity of an algorithm is a function describing the efficiency of the algorithm in terms of the amount of data the algorithm must process. Usually there are natural units for the domain and range of this function.

An algorithm is analyzed using Time Complexity and Space Complexity. Writing an efficient algorithm helps to keep the time needed to process the logic to a minimum. An algorithm A is judged on the basis of two parameters for an input of size n:

i. Time Complexity: Time taken by the algorithm to solve the problem. It is measured by counting the iterations of loops, the number of comparisons, etc. (see the sketch after this list).
• Time complexity is a function describing the amount of time an algorithm takes in terms
of the amount of input to the algorithm.
• “Time” can mean the number of memory accesses performed, the number of
comparisons between integers, the number of times some inner loop is executed, or some
other natural unit related to the amount of real time the algorithm will take.
ii. Space Complexity: Space taken by the algorithm to solve the problem. It includes the space used by the necessary input variables and any extra space (excluding the space taken by the inputs) used by the algorithm. For example, if we use a hash table (a kind of data structure), we need an array to store values.
• This array is extra space occupied, and hence counts towards the space complexity of the algorithm. This extra space is known as Auxiliary Space.
• Space complexity is a function describing the amount of memory (space) an algorithm takes in terms of the amount of input to the algorithm.
• Space complexity is sometimes ignored because the space used is minimal and/or obvious, but sometimes it becomes as much of an issue as time.
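As a hedged illustration of both measures (the linear-search function and sample values are my own, not from the text), the sketch below counts comparisons as a stand-in for time, so the count grows with the input size n, while the handful of loop variables is the only auxiliary space and stays constant.

def linear_search(values, target):
    """Return the index of target in values, or -1 if it is absent."""
    comparisons = 0              # "time": number of comparisons performed
    for index, value in enumerate(values):
        comparisons += 1
        if value == target:
            print(f"n = {len(values)}, comparisons = {comparisons}")
            return index
    print(f"n = {len(values)}, comparisons = {comparisons}")
    return -1

# The loop variables (index, value, comparisons) are the only auxiliary space,
# so space usage stays constant no matter how large the input list grows.
linear_search([4, 8, 15, 16, 23, 42], 23)   # 5 comparisons for n = 6
linear_search(list(range(1000)), -1)        # 1000 comparisons for n = 1000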
Cases in complexities:
There are three commonly studied cases of complexity in algorithms:

i. Best-case complexity: The best-case scenario for an algorithm is the scenario in which the algorithm performs the minimum amount of work (e.g. takes the shortest amount of time, uses the least amount of memory, etc.).
ii. Worst-case complexity: The worst-case scenario for an algorithm is the scenario in which the algorithm performs the maximum amount of work (e.g. takes the longest amount of time, uses the most memory, etc.).
iii. Average-case complexity: By considering all possible inputs and their likelihood of occurrence, average-case analysis offers a more realistic perspective on an algorithm's performance.

In analyzing the complexity of an algorithm, it is often more informative to study the worst-case scenario, as this gives a guaranteed upper bound on the performance of the algorithm. Best-case analysis is sometimes performed, but it is generally less important, as it provides a lower bound that is often trivial to achieve.
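To make the three cases concrete (this continues the hypothetical linear-search sketch above rather than anything in the original text): searching for a value at the front of the list is the best case, searching for a missing value is the worst case, and averaging the comparison counts over many random targets approximates the average case.

import random

def count_comparisons(values, target):
    """Return how many comparisons a linear search makes for this input."""
    for comparisons, value in enumerate(values, start=1):
        if value == target:
            return comparisons
    return len(values)            # target absent: every element was compared

data = list(range(100))

best = count_comparisons(data, 0)       # best case: target is the first element
worst = count_comparisons(data, -1)     # worst case: target is not present
average = sum(count_comparisons(data, random.choice(data))
              for _ in range(1000)) / 1000   # rough average over random targets

print(best, worst, round(average, 1))   # roughly: 1 100 ~50.5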

Time Space Tradeoff

A tradeoff is a situation where one thing increases and another decreases. It is a way to solve a problem:
• either in less time, by using more space, or
• in very little space, by spending a longer amount of time.
The best algorithm is one that solves the problem while requiring less space in memory and also taking less time to generate the output. In general, however, it is not always possible to achieve both at the same time. The classic case is an algorithm that uses a lookup table: the answers to some questions can be written down in advance for every possible value. One way of solving such a problem is to write down the entire lookup table, which lets you find answers very quickly but uses a lot of space. Another way is to calculate the answers without writing anything down, which uses very little space but might take a long time. In general, the more time-efficient an algorithm is, the less space-efficient it tends to be.
Types of Space-Time Trade-off
• Compressed or Uncompressed data
• Re-rendering or Stored images
• Smaller code or Loop unrolling
• Lookup tables or Recalculation

i. Compressed or Uncompressed data: A space-time trade-off can be applied to the problem of data storage. If data is stored uncompressed, it takes more space but less time to access. If the data is stored compressed, it takes less space but more time, because the decompression algorithm must be run. There are also instances where it is possible to work directly with compressed data, such as compressed bitmap indices, where working with compression can be faster than working without it. A small sketch of the basic trade-off follows this paragraph.
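The snippet below is a minimal sketch of this trade-off using Python's standard zlib module (the sample data is made up for illustration): the compressed form occupies fewer bytes, but reading it back requires an extra decompression step.

import zlib

# Highly repetitive sample data compresses well; real data will vary.
uncompressed = b"algorithm complexity " * 1000

compressed = zlib.compress(uncompressed)          # spend time to save space
print(len(uncompressed), "bytes uncompressed")
print(len(compressed), "bytes compressed")

# To use the data again we must spend time decompressing it.
restored = zlib.decompress(compressed)
assert restored == uncompressed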

ii. Re-rendering or Stored images: Storing only the source and re-rendering the image whenever it is needed takes less space but more time; storing the rendered image in a cache is faster than re-rendering but requires more space in memory.

iii. Smaller code or Loop Unrolling: Smaller code occupies less space in memory but requires more computation time, because the program must jump back to the beginning of the loop at the end of each iteration. Loop unrolling can optimize execution speed at the cost of increased binary size: the unrolled code occupies more space in memory but requires less loop-control work, as sketched below.
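A conceptual sketch of unrolling (my own example; in interpreted Python the actual speedup is usually negligible, since the idea matters mainly in compiled code): the unrolled version performs the loop-control jump about a quarter as often, at the cost of longer code.

def sum_simple(values):
    """Small code: one addition per pass, with a loop-control jump each time."""
    total = 0
    for value in values:
        total += value
    return total

def sum_unrolled(values):
    """Unrolled by four: longer code, but the loop jump happens 1/4 as often."""
    total = 0
    i = 0
    n = len(values)
    while i + 4 <= n:
        total += values[i] + values[i + 1] + values[i + 2] + values[i + 3]
        i += 4
    while i < n:                      # handle any leftover elements
        total += values[i]
        i += 1
    return total

data = list(range(10))
assert sum_simple(data) == sum_unrolled(data) == 45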

iv. Lookup tables or Recalculation: An implementation can include the entire lookup table, which reduces computing time but increases the amount of memory needed. Alternatively, it can recalculate, i.e. compute table entries as needed, which increases computing time but reduces memory requirements. A sketch of both options follows.
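As a hedged sketch of the two options (the function names and the primality example are my own), the precomputed table answers each query with a single dictionary lookup at the cost of storing every entry, while the recalculation version stores nothing and redoes the work on every call.

def is_prime(n):
    """Recalculate the answer on every call: little space, more time."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

LIMIT = 100_000
# Precompute the whole table once: more memory, but each query is a lookup.
PRIME_TABLE = {n: is_prime(n) for n in range(LIMIT)}

def is_prime_lookup(n):
    return PRIME_TABLE[n]

print(is_prime(99_991), is_prime_lookup(99_991))   # both True; 99991 is prime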
