1. Introduction to Complexity in Algorithms
An algorithm is like a recipe for solving a problem. It has clear steps that tell the computer
exactly what to do, in the right order, to get the desired result.
Example of an Algorithm:
1. Start with the first number in the list and call it the current largest.
2. Look at the next number in the list:
   o If it is bigger than the current largest, replace the current largest with this new number.
   o If it's not bigger, move to the next number.
3. Continue until you have checked all the numbers.
4. The final "current largest" number is the largest number in the list.
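The steps above translate almost line for line into code; here is a minimal Python sketch (the function name `find_largest` is our own):

```python
def find_largest(numbers):
    """Return the largest number in a non-empty list, following the steps above."""
    largest = numbers[0]       # step 1: call the first number the current largest
    for n in numbers[1:]:      # steps 2-3: look at each remaining number
        if n > largest:        # bigger than the current largest?
            largest = n        # then it becomes the current largest
    return largest             # step 4: the final current largest

print(find_largest([3, 7, 2, 9, 4]))  # 9
```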
Algorithm Complexity measures how much time and memory an algorithm uses, depending on
the size of the input. It helps to know whether an algorithm will work efficiently as the problem
grows.
• Time Complexity: Measures how the running time increases as the input size increases.
• Space Complexity: Measures how much extra memory an algorithm uses as the input
size increases.
Example:
If an algorithm takes one step for each number, then 5 numbers take 5 steps and 1000 numbers
take 1000 steps. The running time grows in direct proportion to the input size n, so the time
complexity is O(n).
Understanding algorithm complexity helps to choose the best algorithm for a problem. Here are
the most common types of complexities:
• O(1): Constant Time. The time it takes doesn't depend on the input size.
   o Example: Accessing an element of an array by its index.
• O(n): Linear Time. The time it takes grows directly with the input size.
   o Example: A linear search through a list to find a number.
• O(n²): Quadratic Time. The time grows as the square of the input size. This often happens
in algorithms that involve nested loops.
   o Example: Bubble Sort, where you compare each pair of numbers.
• O(log n): Logarithmic Time. The time grows slowly as the input size increases. This
happens when the algorithm divides the problem in half with each step.
   o Example: Binary Search, where you keep cutting the list in half to find a number.
• O(n log n): Linearithmic Time. A combination of linear and logarithmic growth,
commonly seen in efficient sorting algorithms.
   o Example: Merge Sort, and Quick Sort on average.
• O(2ⁿ): Exponential Time. The time grows very fast as the input size increases, making it
impractical for large inputs.
   o Example: Calculating Fibonacci numbers with naive recursion.
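To make the gap between O(n) and O(log n) concrete, here is a small Python sketch (our own, not from the original text) that counts the steps a linear search and a binary search take on the same sorted list:

```python
def linear_search_steps(items, target):
    """Count comparisons a linear search makes: up to n steps."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(sorted_items, target):
    """Count comparisons a binary search makes: about log2(n) steps."""
    steps, lo, hi = 0, 0, len(sorted_items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            break
        elif sorted_items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return steps

data = list(range(1000))
print(linear_search_steps(data, 999))   # 1000 steps: O(n)
print(binary_search_steps(data, 999))   # 10 steps: O(log n)
```

Doubling the list size adds roughly 1000 more steps for the linear search but only one more step for the binary search.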
Space complexity tells us how much extra memory is needed by an algorithm to complete its task.
• O(1): Constant Space. The memory used does not depend on the input size.
   o Example: Sorting an array in place.
• O(n): Linear Space. The memory grows directly with the input size.
   o Example: Storing a copy of the input in a new array.
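A short Python illustration of the two cases (our own example): reversing a list in place needs only O(1) extra memory, while building a reversed copy needs O(n):

```python
def reverse_in_place(arr):
    """O(1) extra space: only two index variables, no copy of the input."""
    i, j = 0, len(arr) - 1
    while i < j:
        arr[i], arr[j] = arr[j], arr[i]  # swap the ends, move inward
        i, j = i + 1, j - 1
    return arr

def reverse_copy(arr):
    """O(n) extra space: builds a new list as large as the input."""
    return [arr[k] for k in range(len(arr) - 1, -1, -1)]

print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]
print(reverse_copy([1, 2, 3, 4]))      # [4, 3, 2, 1]
```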
1.4. Types of Algorithms
Different algorithms are designed to solve problems in different ways. Here are some types of
algorithms:
1.4.1. Divide and Conquer
These algorithms divide a big problem into smaller subproblems, solve the smaller problems, and
then combine the solutions.
• Example: Merge Sort and Quick Sort. These are used to sort large lists efficiently.
   o Merge Sort splits the list into smaller lists, sorts each, and merges them back together.
1.4.2. Dynamic Programming
This approach solves problems by breaking them into overlapping subproblems and solving each
one just once. Results are stored to avoid repeating work.
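A classic illustration is the Fibonacci sequence: naive recursion recomputes the same subproblems exponentially many times, while storing each result once brings the cost down to linear. A minimal Python sketch using memoization:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # cache stores each result so it is computed only once
def fib(n):
    """Memoized Fibonacci: O(n) instead of the naive O(2^n)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155, instant; naive recursion would take ~10^9 calls
```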
1.4.3. Greedy Algorithms
Greedy algorithms make the locally best choice at each step, hoping that these choices lead to
the best overall solution.
• Example: Dijkstra's Algorithm for finding the shortest path in a graph. It always
processes the closest unvisited node next; with non-negative edge weights, this greedy
choice is guaranteed to produce shortest paths.
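Here is a compact Python sketch of Dijkstra's Algorithm using a priority queue (our own code; the graph format, a dict mapping each node to a list of (neighbor, weight) pairs, is an assumption for illustration):

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start; graph maps node -> [(neighbor, weight), ...]."""
    dist = {start: 0}
    heap = [(0, start)]                      # greedy: always pop the closest node
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                         # stale entry, already found a shorter path
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 2)], "C": []}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3}
```

Note how A→B→C (cost 3) beats the direct edge A→C (cost 4), which the algorithm discovers automatically.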
1.4.4. Backtracking
Backtracking incrementally builds candidate solutions and abandons (backtracks from) any
partial solution that cannot be completed, then tries a different option.
• Example: Solving a Sudoku puzzle. The algorithm tries to fill the grid, and if it reaches
a dead end, it backtracks and tries a different approach.
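The same try-and-backtrack pattern appears in the classic N-Queens puzzle, which is easier to show compactly than Sudoku. This Python sketch (our own, not from the original text) counts the valid placements:

```python
def solve_queens(n, cols=()):
    """Place one queen per row; backtrack when a column or diagonal clashes."""
    row = len(cols)             # cols[r] = column of the queen in row r
    if row == n:
        return 1                # every row filled: one complete solution
    count = 0
    for col in range(n):
        # Safe if no earlier queen shares this column or a diagonal.
        if all(col != c and abs(col - c) != row - r for r, c in enumerate(cols)):
            count += solve_queens(n, cols + (col,))  # try this option
        # otherwise: dead end for this column, move on (backtrack)
    return count

print(solve_queens(8))  # 92 solutions to the 8-queens puzzle
```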
1.4.5. Brute Force
These algorithms try all possible solutions to find the best one. This is simple but inefficient for
large problems.
• Example: Trying all possible password combinations to break into a locked account.
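A toy Python sketch of this idea (our own; the `check` function and the 4-digit PIN are hypothetical): enumerate every candidate until one passes.

```python
from itertools import product

def crack_pin(check, digits="0123456789", length=4):
    """Brute force: try all 10^length digit combinations until check() accepts one."""
    for attempt in product(digits, repeat=length):
        guess = "".join(attempt)
        if check(guess):
            return guess
    return None  # no combination passed the check

secret = "0427"
print(crack_pin(lambda g: g == secret))  # '0427'
```

At 4 digits this is only 10,000 attempts, but each extra digit multiplies the work by 10, which is why brute force quickly becomes impractical.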
1.5. Applications of Algorithms
• Data Sorting and Searching: Algorithms like Quick Sort and Binary Search are used
to find or sort data quickly.
• Machine Learning: Algorithms like Gradient Descent help computers learn from data.
• Web and App Development: Algorithms handle everything from loading a webpage to
recommending products on an e-commerce site.
• Networking: Algorithms like Dijkstra's Algorithm help in routing data across
networks.
1.6. Advantages of Algorithms
• Efficiency: A well-designed algorithm can make a program run faster and use fewer
resources.
• Scalability: Good algorithms work well with both small and large datasets.
• Optimization: Algorithms that are designed efficiently can save time and memory,
especially for large inputs.
• Predictability: Knowing the complexity of an algorithm helps predict its performance on
different problem sizes.
1.7. Disadvantages of Algorithms
• Overhead: Writing complex algorithms can take time and might add unnecessary
complexity to a simple task.
• Not Always Accurate: Theoretical time complexity doesn't always match real-world
performance due to factors like hardware limitations and the programming language.
• Space Complexity: Some algorithms, even if they are fast, might use a lot of memory,
which can be problematic in resource-constrained environments.
1.8. Examples of Algorithms and Complexity
• Algorithm: Suppose you are looking for a specific book in a library with shelves
arranged alphabetically.
   o Brute Force (Linear Search): Go through every book on every shelf until you
   find the book you need.
      Time Complexity: O(n), where n is the total number of books.
   o Efficient Search (Binary Search): If the books on each shelf are sorted, start in
   the middle, check if the book is before or after the middle, and repeat until you
   find the book.
      Time Complexity: O(log n).
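The library example above is exactly binary search on a sorted sequence; a minimal Python sketch (the book titles are made up for illustration):

```python
def binary_search(sorted_titles, target):
    """Return the index of target in a sorted list, or -1; halves the range each step."""
    lo, hi = 0, len(sorted_titles) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_titles[mid] == target:
            return mid
        elif sorted_titles[mid] < target:
            lo = mid + 1   # the book comes after the middle shelf
        else:
            hi = mid - 1   # the book comes before the middle shelf
    return -1              # book not in the library

books = ["Dune", "Emma", "Hamlet", "It", "Ulysses"]  # alphabetical order
print(binary_search(books, "Hamlet"))  # 2
```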
• Algorithm: You have a list of items to buy, and you want to organize it for quick
shopping.
   o Brute Force: Go through your list randomly and search for the items one by one
   in the store.
      Time Complexity: O(n²), since you might go back and forth many times.
   o Optimized Sorting (Merge Sort): Arrange your shopping list based on the layout
   of the store (e.g., all fruits together, then vegetables, etc.).
      Time Complexity: O(n log n).
4. Scheduling Appointments
• Algorithm: You have several appointments to attend in a day, and you want to minimize
travel time.
   o Brute Force: Try every possible order of appointments to see which one gives the
   shortest travel time.
      Time Complexity: O(n!), where n is the number of appointments.
   o Optimized Greedy Algorithm: Start with the closest appointment first, then go
   to the next closest, and so on.
      Time Complexity: O(n²).
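The greedy strategy above can be sketched in a few lines of Python (our own toy model: appointments are points along a one-dimensional street). Unlike the O(n!) brute-force search, this nearest-first heuristic is fast but not guaranteed to find the truly shortest route:

```python
def greedy_route(start, stops):
    """Visit the closest remaining stop each time (nearest-neighbor heuristic)."""
    remaining = dict(stops)   # name -> position on a 1-D street (toy model)
    here, route = start, []
    while remaining:
        # Greedy choice: pick the stop closest to the current position.
        name = min(remaining, key=lambda s: abs(remaining[s] - here))
        here = remaining.pop(name)
        route.append(name)
    return route

print(greedy_route(0, {"dentist": 5, "bank": 2, "gym": 9}))
# ['bank', 'dentist', 'gym']
```

Each of the n picks scans up to n remaining stops, giving the O(n²) bound quoted above.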
9. Managing Finances
• Algorithm: You want to distribute your monthly budget among rent, food, utilities, and
savings.
   o Dynamic Programming: Allocate amounts to each category based on the most
   efficient use of your money (e.g., paying off high-interest loans first).
      Example: The Knapsack Problem, where you maximize value with limited resources.
      Time Complexity: O(n·W), where n is the number of categories and W is the
      total budget.
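The budget example maps onto the classic 0/1 Knapsack dynamic program. A minimal Python sketch (the values and weights are made-up illustration data):

```python
def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack DP: best[w] = max value with total weight <= w.
    Runs in O(n * W) time for n items and capacity W."""
    best = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        # Iterate weights downward so each item is used at most once.
        for w in range(capacity, wt - 1, -1):
            best[w] = max(best[w], best[w - wt] + v)
    return best[capacity]

# Toy budget: values = benefit of funding a category, weights = its cost.
print(knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))  # 220
```

Here the best choice is the 100- and 120-value items (total weight 50), not a greedy pick of the highest-value-per-cost item first.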