DAA Definitions
The hiring problem is a classic algorithmic problem where you need to hire the best possible candidate from
a stream of applicants. The challenge is that you interview applicants one by one, and after each interview
you must decide whether to hire the current applicant or move on to the next. Each interview incurs a cost,
and so does each hire. The goal is to minimize the total cost of interviewing and hiring while ensuring you
end up with the best possible candidate (or a high-quality one).
Dynamic programming relies on two key properties:
o Optimal Substructure: An optimal solution to the problem contains optimal solutions to subproblems.
o Overlapping Subproblems: The same subproblems are solved repeatedly by a recursive algorithm.
Dynamic programming solves each subproblem only once and stores its solution to avoid recomputation.
3. Differentiate Greedy strategy and Dynamic programming.
| Aspect | Greedy Strategy | Dynamic Programming |
| --- | --- | --- |
| Decision | Makes locally optimal choices at each step. | Explores all possible solutions to subproblems. |
| Optimality | Does not always guarantee a globally optimal solution. | Guarantees a globally optimal solution. |
| Problem Type | Best for problems where a local optimum leads to the global optimum. | Best for problems with optimal substructure and overlapping subproblems. |
| Example | Kruskal's, Prim's, Huffman coding | Fibonacci sequence, shortest path, knapsack problem |
The principle of optimality states that an optimal solution to a problem contains optimal solutions to its
subproblems. This means that if you have an optimal way to solve a larger problem, then the way you've
solved any part of that larger problem must also be optimal for that specific part, considered independently.
Common orders of growth describe how the running time or space requirements of an algorithm increase
with the input size (n).
o O(1) - Constant: The time/space remains constant regardless of n. (e.g., accessing an array element by
index).
o O(log n) - Logarithmic: The time/space grows very slowly with n. (e.g., binary search).
o O(n) - Linear: The time/space grows directly proportional to n. (e.g., linear search).
o O(n log n) - Linearithmic/Log-linear: Common in efficient sorting algorithms. (e.g., Merge Sort, Heap
Sort).
o O(n^2) - Quadratic: The time/space grows proportional to the square of n. (e.g., Bubble Sort, Insertion
Sort in the worst case).
o O(n^k) - Polynomial: The time/space grows proportionally to a polynomial of n.
o O(2^n) - Exponential: The time/space grows very rapidly with n. Often indicates brute-force solutions.
(e.g., Tower of Hanoi).
o O(n!) - Factorial: Extremely rapid growth, usually in highly inefficient algorithms. (e.g., brute-force for
Traveling Salesperson Problem).
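As a quick illustration of one of these classes, here is a minimal Python sketch of binary search: halving the search range on each step is what produces the O(log n) growth noted above (the function name and test values here are our own):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent.

    Each iteration halves the remaining search range, so the loop
    runs O(log n) times -- an example of logarithmic growth."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3
```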
7. Arrange the given notations in the increasing order of their values.
The increasing order of their values (from slowest growing to fastest growing, i.e., smallest complexity to
largest complexity) is:
log n < n < n log n < n^2 < n^3 < 2^n < n!
1. Amortized Analysis: Amortized analysis is a method for analyzing the running time or space complexity of
an algorithm over a sequence of operations. Instead of looking at the cost of a single operation, which might
be very high in certain cases, amortized analysis considers the total cost of a sequence of operations and then
averages it over the number of operations. This approach often provides a more realistic and tighter bound
on the performance of certain data structures or algorithms where occasional expensive operations are "paid
for" by a large number of cheap operations.
2. Complexity of Algorithm: The complexity of an algorithm refers to the resources (primarily time and
space) required for the algorithm to execute as a function of the input size. It quantifies how the algorithm's
performance scales with increasing input.
o Time Complexity: Measures the number of operations an algorithm performs.
o Space Complexity: Measures the amount of memory an algorithm uses.
Both are typically expressed using Big O notation to describe their asymptotic behavior.
3. Hiring Problem: The Hiring Problem (also known as the Secretary Problem or the Optimal Stopping
Problem) is a classic problem in probability and online algorithms. It describes a scenario where you need to
hire the best candidate from a sequence of candidates, who are presented one at a time. You must make an
immediate decision to either hire or reject a candidate, and once rejected, they cannot be recalled. The goal
is to maximize the probability of hiring the single best candidate. A common strategy involves observing a
certain number of initial candidates without hiring any, and then hiring the first candidate encountered
thereafter who is better than all previously seen candidates.
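A minimal simulation of this "observe, then leap" strategy, assuming candidates arrive in uniformly random order and using the classic n/e cutoff (function names and parameters here are our own):

```python
import math
import random

def hire(ranks, k):
    """Reject the first k candidates, then hire the first one better
    than everything seen so far (hire the last one if none qualifies)."""
    best_seen = max(ranks[:k]) if k > 0 else float('-inf')
    for r in ranks[k:]:
        if r > best_seen:
            return r
    return ranks[-1]

def success_rate(n=100, trials=10000):
    """Estimate how often the n/e rule hires the single best candidate."""
    k = round(n / math.e)                  # the classic optimal cutoff
    wins = 0
    for _ in range(trials):
        ranks = list(range(n))
        random.shuffle(ranks)
        if hire(ranks, k) == n - 1:        # best candidate has rank n-1
            wins += 1
    return wins / trials

print(success_rate())   # roughly 0.37
```

With n = 100 candidates, the estimated success rate comes out near 1/e ≈ 0.37, matching the known optimum for this strategy.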
4. Time Efficiency: Time efficiency, often referred to as time complexity, measures the amount of
computational time an algorithm takes to complete its task as a function of the input size. It quantifies how
the running time grows with the input size, typically expressed using Big O notation (e.g., O(n), O(n log n),
O(n^2)). Lower time complexity generally indicates a more efficient algorithm.
5. Strassen's Algorithm: Strassen's algorithm is a divide-and-conquer algorithm for matrix multiplication. It
was developed by Volker Strassen in 1969 and provides a more efficient approach than the naive matrix
multiplication algorithm for large matrices. While the naive algorithm has a time complexity of O(n^3) for
multiplying two n×n matrices, Strassen's algorithm achieves a complexity of O(n^(log2 7)), which is
roughly O(n^2.807). It does this by cleverly reducing the number of recursive multiplications from 8 to 7.
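A compact sketch of the seven-product scheme for n×n matrices where n is a power of two (this NumPy version is illustrative only; a practical implementation would fall back to ordinary multiplication below some block size):

```python
import numpy as np

def strassen(A, B):
    """Multiply two n x n matrices (n a power of two) using 7 recursive
    products instead of 8 -- a minimal sketch of Strassen's scheme."""
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # The seven Strassen products
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # Reassemble the four quadrants of the product
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.vstack([np.hstack([C11, C12]), np.hstack([C21, C22])])

A = np.random.randint(0, 10, (4, 4))
B = np.random.randint(0, 10, (4, 4))
print(np.array_equal(strassen(A, B), A @ B))   # True
```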
6. Backtracking Method: Backtracking is a general algorithmic technique that builds a solution
incrementally. It explores the space of possible solutions by systematically trying to extend a partial
solution. If, at any point, the partial solution cannot lead to a valid complete solution (i.e., it violates
constraints), the algorithm "backtracks" (undoes its last choice) and tries a different option. This method is
often used for problems like finding paths in a maze, solving Sudoku, or finding all permutations of a set.
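For instance, here is a minimal backtracking sketch that generates all permutations of a set; the helper names are our own, and the key step is the `partial.pop()` that undoes the most recent choice:

```python
def permutations(items):
    """Generate all permutations of items by backtracking: extend a
    partial solution one element at a time, and undo ('backtrack')
    each choice after exploring it."""
    result = []

    def extend(partial, remaining):
        if not remaining:                    # partial solution is complete
            result.append(partial[:])
            return
        for i in range(len(remaining)):
            partial.append(remaining[i])     # make a choice
            extend(partial, remaining[:i] + remaining[i + 1:])
            partial.pop()                    # backtrack: undo the choice

    extend([], list(items))
    return result

print(permutations("abc"))
# [['a','b','c'], ['a','c','b'], ['b','a','c'], ['b','c','a'], ...]
```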
7. Dynamic Programming: Dynamic programming is an algorithmic technique for solving complex problems
by breaking them down into simpler, overlapping subproblems. The key idea is to solve each subproblem
only once and store its solution (memoization or tabulation) so that it can be reused later if the same
subproblem arises. This avoids redundant computations and can significantly improve efficiency, especially
for problems with optimal substructure (an optimal solution to the problem contains optimal solutions to its
subproblems) and overlapping subproblems. Examples include the Fibonacci sequence, shortest path
problems, and the knapsack problem.
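Both flavors of dynamic programming can be shown on the Fibonacci example mentioned above; this sketch (function names are our own) contrasts top-down memoization with bottom-up tabulation:

```python
from functools import lru_cache

@lru_cache(maxsize=None)            # memoization: cache each subproblem
def fib_memo(n):
    """Top-down DP: each fib(k) is computed once, giving O(n) time
    instead of the O(2^n) of naive recursion."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    """Bottom-up DP (tabulation): fill a table from small subproblems up."""
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_memo(50), fib_tab(50))    # both print 12586269025
```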
Definitions of additional terms:
i. Space Complexity: Space complexity measures the amount of memory space an algorithm requires to run
to completion as a function of the input size. It includes both the space used by the input itself and the
auxiliary space used by the algorithm for variables, data structures, and the call stack. Like time efficiency,
it's often expressed using Big O notation (e.g., O(n), O(log n), O(1)).
ii. Feasible Solution: In the context of optimization or decision problems, a feasible solution is a solution
that satisfies all the given constraints or conditions of the problem. It might not necessarily be the best or
optimal solution, but it is a valid one that adheres to all the rules and requirements.
iii. Directed Acyclic Graph (DAG): A Directed Acyclic Graph (DAG) is a directed graph that contains no
directed cycles. This means that for any vertex v, there is no path that starts and ends at v by traversing
along the directed edges. DAGs are commonly used to represent tasks with dependencies (e.g., project
scheduling), hierarchical structures, or causal relationships.
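One common operation on DAGs, in line with the task-scheduling use mentioned above, is topological sorting. A minimal sketch of Kahn's algorithm, which also detects cycles (the example graph and task names here are our own):

```python
from collections import deque

def topological_order(graph):
    """Kahn's algorithm: repeatedly remove vertices with no incoming
    edges. A complete ordering exists iff the directed graph is
    acyclic, so this doubles as a DAG check."""
    indegree = {v: 0 for v in graph}
    for v in graph:
        for w in graph[v]:
            indegree[w] += 1
    queue = deque(v for v in graph if indegree[v] == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)
    return order if len(order) == len(graph) else None   # None => cycle

# Task dependencies: edges point from prerequisite to dependent task.
tasks = {"design": ["code"], "code": ["test"], "test": [], "docs": ["test"]}
print(topological_order(tasks))   # e.g. ['design', 'docs', 'code', 'test']
```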
iv. Minimum Spanning Tree (MST): In an undirected, connected, and weighted graph, a Minimum
Spanning Tree (MST) is a subgraph that is a tree (contains no cycles), connects all the vertices of the
original graph, and has the minimum possible total sum of edge weights among all such spanning trees.
Algorithms like Prim's algorithm and Kruskal's algorithm are commonly used to find an MST.
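As a sketch of one of these, here is a minimal Kruskal's algorithm using a simple union-find structure (the edge list and function names are our own):

```python
def kruskal(n, edges):
    """Kruskal's algorithm: sort edges by weight and add each edge that
    joins two different components (tracked with union-find)."""
    parent = list(range(n))

    def find(x):                        # find component root (path halving)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):       # edges given as (weight, u, v)
        ru, rv = find(u), find(v)
        if ru != rv:                    # different components: no cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(kruskal(4, edges))   # ([(0, 1, 1), (1, 3, 2), (1, 2, 3)], 6)
```

Sorting the edges dominates the running time, so Kruskal's algorithm runs in O(E log E) overall.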
v. Activity Selection Problem: The Activity Selection Problem is a classic optimization problem where the
goal is to select the maximum number of non-overlapping activities from a given set of activities, each with
a start time and a finish time. All activities share a common resource (e.g., a classroom, a machine), and
only one activity can use the resource at a time. The greedy approach of sorting activities by their finish
times and then repeatedly picking the earliest-finishing compatible activity is a simple and provably optimal
way to solve this problem.
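A minimal sketch of that greedy strategy (the activity tuples and names below are our own):

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then take each
    activity whose start is no earlier than the last chosen finish."""
    chosen = []
    last_finish = float('-inf')
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:        # compatible with what we've picked
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))   # [(1, 4), (5, 7), (8, 11)]
```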
vii. Optimal Solution: An optimal solution (or optimum solution) to a problem is the best possible solution
among all feasible solutions. In the context of optimization problems, "best" typically means maximizing a
desirable quantity (e.g., profit) or minimizing an undesirable quantity (e.g., cost, time, error). Finding an
optimal solution often involves exploring various possibilities and selecting the one that meets the defined
optimization criteria.