Algorithm Complexity
An algorithm is analyzed using time complexity and space complexity. Writing an efficient
algorithm helps to consume the minimum amount of time for processing the logic. An
algorithm A is judged on the basis of two parameters for an input of size n:
i. Time Complexity: Time taken by the algorithm to solve the problem. It is measured by
counting the number of loop iterations, the number of comparisons, etc.
• Time complexity is a function describing the amount of time an algorithm takes in terms
of the amount of input to the algorithm.
• “Time” can mean the number of memory accesses performed, the number of
comparisons between integers, the number of times some inner loop is executed, or some
other natural unit related to the amount of real time the algorithm will take.
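As a rough sketch of counting such units, the hypothetical `linear_search` helper below (illustrative, not from the text) tallies one comparison per loop iteration:

```python
# A minimal sketch: counting comparisons as a concrete "unit" of time.
# linear_search is a hypothetical helper written for illustration.
def linear_search(items, target):
    comparisons = 0
    for value in items:
        comparisons += 1           # one comparison per loop iteration
        if value == target:
            return comparisons     # found: report the work done so far
    return comparisons             # not found: n comparisons for n items

print(linear_search([3, 1, 4, 1, 5], 4))  # target mid-list: 3 comparisons
print(linear_search([3, 1, 4, 1, 5], 9))  # target absent: 5 comparisons
```

The comparison count grows linearly with the input size, so this search takes O(n) time.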
ii. Space Complexity: Space taken by the algorithm to solve the problem. It includes the
space used by necessary input variables and any extra space (excluding the space taken by
the inputs) that is used by the algorithm. For example, if we use a hash table (a kind of data
structure), we need an array to store values.
• This array is extra space occupied, and hence counts towards the space complexity of the
algorithm. This extra space is known as Auxiliary Space.
• Space complexity is a function describing the amount of memory (space) an algorithm
takes in terms of the amount of input to the algorithm.
• Space complexity is sometimes ignored because the space used is minimal and/or
obvious, but sometimes it can become as serious an issue as time.
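To make the auxiliary-space idea concrete, here is a minimal sketch (both helpers are illustrative, not from the text) of reversing a list with and without extra space:

```python
# Two ways to reverse a list, differing only in auxiliary space.
# Both functions are hypothetical helpers written for illustration.

def reversed_copy(items):
    # Builds a second n-element list: O(n) auxiliary space.
    return [items[i] for i in range(len(items) - 1, -1, -1)]

def reverse_in_place(items):
    # Swaps elements pairwise using two index variables: O(1) auxiliary space.
    left, right = 0, len(items) - 1
    while left < right:
        items[left], items[right] = items[right], items[left]
        left += 1
        right -= 1
    return items
```

Both produce the same result; only the extra memory they need differs.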
Cases in complexity:
There are three commonly studied cases of complexity in algorithms:
i. Best case complexity: The best-case scenario for an algorithm is the scenario in which the
algorithm performs the minimum amount of work (e.g. takes the shortest amount of time,
uses the least amount of memory, etc.).
ii. Worst case complexity: The worst-case scenario for an algorithm is the scenario in which
the algorithm performs the maximum amount of work (e.g. takes the longest amount of time,
uses the most amount of memory, etc.).
iii. Average case complexity: By considering all possible inputs and their likelihood of
occurrence, average-case analysis offers a more realistic perspective on an
algorithm’s performance.
In analysing the complexity of an algorithm, it is often more informative to study the worst-
case scenario, as this gives a guaranteed upper bound on the performance of the algorithm.
Best-case scenario analysis is sometimes performed, but is generally less important as it
provides a lower bound that is often trivial to achieve.
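The contrast can be seen in a linear search, sketched below with an illustrative `count_steps` helper (not from the text): the best case hits the target immediately, the worst case scans the whole input.

```python
# Best vs worst case for linear search, measured in loop steps.
# count_steps is a hypothetical helper written for illustration.
def count_steps(items, target):
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            break
    return steps

data = list(range(10))         # [0, 1, ..., 9]
best = count_steps(data, 0)    # target at the front: minimum work
worst = count_steps(data, -1)  # target absent: maximum work
print(best, worst)             # 1 10
```

The worst-case count of 10 is the guaranteed upper bound for this input size; the best-case count of 1 is the trivially achievable lower bound.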
A trade-off is a situation where one thing increases while another decreases. It is a way to
solve a problem:
• either in less time, by using more space, or
• in very little space, by spending more time.
The best algorithm is one that solves a problem using less space in memory and also less
time to generate the output. In general, however, it is not always possible to achieve both
conditions at the same time. The classic example is an algorithm that uses a lookup table:
the answers to some question can be written down in advance for every possible input. One
way of solving the problem is to store the entire lookup table, which lets you find answers
very quickly but uses a lot of space. Another way is to calculate each answer without storing
anything, which uses very little space but might take a long time. In general, the more
time-efficient an algorithm is, the less space-efficient it tends to be.
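The trade-off can be sketched with Fibonacci numbers: a plain recursion uses almost no extra space but recomputes subproblems, while a cached version stores every answer to save time (both functions are illustrative, not from the text).

```python
from functools import lru_cache

def fib_slow(n):
    # Little space, lots of time: recomputes subproblems exponentially often.
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # More space, less time: every computed answer is stored for reuse.
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_slow(20), fib_fast(20))  # 6765 6765: same answer, different costs
```

The cached version is a direct space-for-time trade: its table grows with n, but each subproblem is computed only once.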
Types of Space-Time Trade-off
• Compressed or Uncompressed data
• Re-rendering or Stored images
• Smaller code or Loop unrolling
• Lookup tables or Recalculation
i. Compressed or Uncompressed data: Storing data in compressed form takes less space in
memory but requires extra computation time to decompress it before use; storing it
uncompressed occupies more space but allows the data to be used directly.
ii. Re-rendering or Stored images: Storing only the source and re-rendering it as an image
each time would take less space but more time; storing the rendered image in a cache is
faster than re-rendering but requires more space in memory.
iii. Smaller code or Loop unrolling: Smaller code occupies less space in memory but incurs
extra computation time for jumping back to the beginning of the loop at the end of each
iteration. Loop unrolling can optimize execution speed at the cost of increased binary size:
the unrolled code occupies more space in memory but requires less computation time.
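A rough sketch of unrolling, assuming for simplicity that the list length is a multiple of four (both functions are illustrative, not from the text):

```python
# Loop unrolling: fewer loop-control steps at the cost of more code.
# Both functions are hypothetical helpers written for illustration.

def sum_rolled(values):
    total = 0
    for v in values:                # one loop-control step per element
        total += v
    return total

def sum_unrolled(values):
    # Assumes len(values) is a multiple of 4; processes four elements
    # per iteration, so the loop jumps back a quarter as often.
    total = 0
    for i in range(0, len(values), 4):
        total += values[i]
        total += values[i + 1]
        total += values[i + 2]
        total += values[i + 3]
    return total

print(sum_rolled(list(range(8))) == sum_unrolled(list(range(8))))  # True
```

In an interpreted language like Python the gain is usually negligible; the technique matters most in compiled code, where the compiler often applies it automatically.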
iv. Lookup tables or Recalculation: An implementation can include the entire lookup table,
which reduces computing time but increases the amount of memory needed. Alternatively, it
can recalculate, i.e., compute table entries as needed, which increases computing time but
reduces memory requirements.
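A classic illustration is counting set bits (popcount): a 256-entry table answers each byte in one lookup, while the table-free version loops over every bit (names below are illustrative, not from the text).

```python
# Lookup table vs recalculation for counting set bits.
# Names are hypothetical, written for illustration.

# Precomputed once: 256 extra entries of memory.
BIT_COUNT = [bin(i).count("1") for i in range(256)]

def popcount_table(x):
    # Fast: one table access per byte of x.
    total = 0
    while x:
        total += BIT_COUNT[x & 0xFF]
        x >>= 8
    return total

def popcount_recalc(x):
    # No table: loops over every bit, trading time for memory.
    total = 0
    while x:
        total += x & 1
        x >>= 1
    return total

print(popcount_table(0b101101), popcount_recalc(0b101101))  # 4 4
```

Both give the same answer; the table version does one step per byte at the cost of 256 stored entries, the other does one step per bit with no stored table.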