DAA Q&A


PANDIAN SARASWATHI YADAV ENGINEERING COLLEGE- 630562

DEPARTMENT OF COMPUTER SCIENCE AND TECHNOLOGY


AD3351 DESIGN AND ANALYSIS OF ALGORITHMS

1. State the principle of Optimality:


The principle of optimality is the basic principle of dynamic programming, developed by
Richard Bellman: an optimal path (or policy) has the property that, whatever the initial
conditions and the choices made over some initial period, the decisions chosen over the
remaining period must be optimal for the remaining problem, with the state resulting from
the early decisions taken as the initial condition.
2. Define Traveling Salesman Problem:
The traveling salesman problem (TSP) is the algorithmic problem of finding the shortest
route through a set of points or locations that must all be visited. In the problem
statement, the points are the cities a salesperson might visit. The salesperson's goal is to
keep both the travel cost and the distance traveled as low as possible.
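
As an illustration (not an efficient method), here is a brute-force Python sketch that enumerates every tour; the distance matrix `d` below is a hypothetical instance:

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustively try every tour; dist is an n x n matrix of pairwise costs.
    Feasible only for small n, but it illustrates the problem statement."""
    n = len(dist)
    cities = range(1, n)          # fix city 0 as the start/end of the tour
    best_cost, best_tour = float("inf"), None
    for order in permutations(cities):
        tour = (0,) + order + (0,)
        cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

# Hypothetical instance: 4 cities with symmetric distances
d = [[0, 10, 15, 20],
     [10, 0, 35, 25],
     [15, 35, 0, 30],
     [20, 25, 30, 0]]
print(tsp_brute_force(d))   # (80, (0, 1, 3, 2, 0))
```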
3. State the 0/1 Knapsack problem:
The 0-1 knapsack problem in Computer Science refers to a scenario where a set of
objects with specific values and weights need to be placed in a knapsack with limited
capacity to maximize profit. Each object can either be selected or not selected, making it a
combinatorial optimization problem.
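
A minimal dynamic-programming sketch in Python, assuming integer weights and a hypothetical instance:

```python
def knapsack_01(values, weights, capacity):
    """Classic 0/1 knapsack DP: dp[c] = best value achievable with capacity c.
    Each item is either taken whole or skipped (no fractions)."""
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities downwards so each item is used at most once
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Hypothetical instance: values [6, 10, 12], weights [1, 2, 3], capacity 5
print(knapsack_01([6, 10, 12], [1, 2, 3], 5))   # 22 (take the last two items)
```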

4. Define Optimal Binary Search Tree:


An optimal binary search tree (OBST) is a binary search tree that minimizes the expected
search time for a given sequence of accesses or access probabilities. It's also known as a
weight-balanced binary tree.
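
A sketch of the classic O(n³) dynamic program for the expected search cost, assuming only successful-search probabilities `p` for keys given in sorted order (it returns the cost, not the tree itself):

```python
def obst_cost(p):
    """Expected search cost of an optimal BST, where p[i] is the probability
    of searching for the i-th smallest key (unsuccessful searches ignored)."""
    n = len(p)
    # cost[i][j] holds the minimal expected cost of a BST built on keys i..j;
    # cells with i > j stay 0 and act as the empty-subtree base case.
    cost = [[0.0] * (n + 1) for _ in range(n + 1)]
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            total = sum(p[i:j + 1])          # every key in the range adds one level
            best = min(
                (cost[i][r - 1] if r > i else 0.0) +
                (cost[r + 1][j] if r < j else 0.0)
                for r in range(i, j + 1)      # try each key r as the root
            )
            cost[i][j] = best + total
    return cost[0][n - 1]

# Hypothetical access probabilities for three keys in sorted order
print(obst_cost([0.5, 0.25, 0.25]))   # 1.75
```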
5. What is Greedy Algorithm:
A greedy algorithm, as the name suggests, always makes the choice that seems to be the
best at that moment. This means that it makes a locally-optimal choice in the hope that this
choice will lead to a globally-optimal solution.
6. Differentiate between subset paradigm & ordering paradigm
Subset paradigm
In this version, the greedy method generates a subset of available choices to solve a
problem. The selection procedure is based on an optimization measure, such as an
objective function. This version can result in suboptimal solutions. Examples of problems
that use the subset paradigm include the knapsack problem and job sequencing with
deadlines.
Ordering paradigm
In this version, the greedy method makes decisions by considering inputs in a specific
order. Each decision is made using an optimization criterion that can be computed using
decisions already made.
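
As an illustration of the ordering paradigm, here is a fractional-knapsack sketch in Python that considers items in decreasing value/weight order; the values, weights, and capacity below are hypothetical:

```python
def fractional_knapsack(values, weights, capacity):
    """Greedy by value/weight ratio (ordering paradigm): consider items in a
    fixed order and take as much of each as still fits."""
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for v, w in items:
        if capacity == 0:
            break
        take = min(w, capacity)       # take the whole item, or a fraction of it
        total += v * (take / w)
        capacity -= take
    return total

print(fractional_knapsack([60, 100, 120], [10, 20, 30], 45))   # 220.0
```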
7. What is the drawback of Greedy Algorithms?
The biggest drawback of greedy algorithms is that the locally optimal choice made at each
step does not always lead to the globally optimal solution.
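
A small Python sketch makes this concrete: greedy coin change with the (hypothetical) denominations {1, 3, 4} picks locally optimal coins but misses the global optimum for amount 6:

```python
def greedy_coin_change(coins, amount):
    """Always take the largest coin that still fits (the locally optimal choice)."""
    used = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            used.append(c)
            amount -= c
    return used if amount == 0 else None

# Greedy returns [4, 1, 1] (3 coins), but the optimal answer is [3, 3] (2 coins):
# a locally optimal choice that misses the global optimum.
print(greedy_coin_change([1, 3, 4], 6))   # [4, 1, 1]
```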
8. How to calculate the efficiency of Dijkstra’s Algorithm?
Analyzing the time complexity of Dijkstra's Algorithm is crucial for understanding its
efficiency. In the worst-case scenario, where all vertices and edges are explored, the time
complexity is O(|V|^2), where |V| represents the number of vertices in the graph.
9. State the principle of duality:
The principle of duality in Boolean algebra states that if a Boolean postulate or equation is
true, then its dual is also true. The dual of a Boolean statement is obtained by interchanging
AND with OR and 0 with 1 (complements are left unchanged). For example, the dual of
x + 0 = x is x · 1 = x.
10. State the planar graph coloring problem
Graph coloring is the assignment of colors to the vertices of a graph such that no two
adjacent vertices share the same color. The minimum number of colors required to color a
graph is called its chromatic number. For planar graphs, the four color theorem guarantees
that four colors always suffice.
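
A minimal greedy-coloring sketch in Python (it uses at most max-degree + 1 colors, which is not necessarily the chromatic number); the adjacency-list graph below is a hypothetical 4-cycle:

```python
def greedy_coloring(adj):
    """Assign each vertex the smallest color not used by an already-colored neighbor."""
    color = {}
    for v in adj:                      # adj: dict mapping vertex -> list of neighbors
        taken = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in taken:
            c += 1
        color[v] = c
    return color

# Hypothetical 4-cycle a-b-c-d-a: two colors suffice
print(greedy_coloring({'a': ['b', 'd'], 'b': ['a', 'c'],
                       'c': ['b', 'd'], 'd': ['a', 'c']}))
# {'a': 0, 'b': 1, 'c': 0, 'd': 1}
```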
11. Write the algorithm for Warshall's and Floyd's algorithms:

WARSHALL'S ALGORITHM

Warshall(A[1...n, 1...n]) // A is the adjacency matrix
    R(0) ← A
    for k ← 1 to n do
        for i ← 1 to n do
            for j ← 1 to n do
                R(k)[i, j] ← R(k-1)[i, j] or (R(k-1)[i, k] and R(k-1)[k, j])
    return R(n)
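
Below is a minimal Python sketch of the same recurrence, assuming a 0/1 adjacency matrix given as a list of lists (the matrix `r` is updated in place and plays the role of R(k)):

```python
def warshall(adj):
    """Transitive closure via Warshall's algorithm.
    adj is an n x n 0/1 adjacency matrix; returns the reachability matrix."""
    n = len(adj)
    r = [row[:] for row in adj]            # R(0) = A
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# Hypothetical graph: 0 -> 1 -> 2, so 2 becomes reachable from 0
a = [[0, 1, 0],
     [0, 0, 1],
     [0, 0, 0]]
print(warshall(a))   # [[0, 1, 1], [0, 0, 1], [0, 0, 0]]
```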

FLOYD'S ALGORITHM

let dist be a |V| × |V| array of minimum distances initialized to ∞ (infinity)
for each edge (u, v) do
    dist[u][v] ← w(u, v) // the weight of the edge (u, v)
for each vertex v do
    dist[v][v] ← 0
for k from 1 to |V| do
    for i from 1 to |V| do
        for j from 1 to |V| do
            if dist[i][j] > dist[i][k] + dist[k][j] then
                dist[i][j] ← dist[i][k] + dist[k][j]
            end if
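
A compact Python version of the same pseudocode, assuming the graph is given as an n × n weight matrix with float('inf') for missing edges:

```python
def floyd(graph):
    """All-pairs shortest paths (Floyd's algorithm).
    graph is an n x n matrix of edge weights, with float('inf') where no edge exists."""
    n = len(graph)
    dist = [row[:] for row in graph]
    for v in range(n):
        dist[v][v] = 0
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][j] > dist[i][k] + dist[k][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Hypothetical 3-vertex graph: 0->1 (3), 1->2 (1), 2->0 (2)
INF = float('inf')
g = [[0, 3, INF],
     [INF, 0, 1],
     [2, INF, 0]]
print(floyd(g))   # [[0, 3, 4], [3, 0, 1], [2, 5, 0]]
```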
11 b) Solve the following using Floyd's algorithm:
12. Explain the Dijkstra shortest path algorithm and its efficiency.
### Dijkstra's Shortest Path Algorithm

Dijkstra's algorithm is a greedy algorithm used to find the shortest path from a starting node to
all other nodes in a weighted graph. It works on graphs with non-negative edge weights and can
be applied to both directed and undirected graphs.

### Steps of the Algorithm:

1. **Initialization:**
- Mark the distance to the source node as 0, and set the distance to all other nodes to infinity.
- Mark all nodes as unvisited; the source node will be the first node selected for processing.
- Create a priority queue (or a min-heap) to efficiently fetch the next node with the smallest
tentative distance.

2. **Processing:**
- While there are unvisited nodes:
- Select the unvisited node with the smallest tentative distance from the priority queue.
- Update the distances to its neighboring nodes. For each neighbor \(v\) of the current node
\(u\), if the path through \(u\) provides a shorter path to \(v\) than previously known, update the
distance to \(v\).
- After updating, mark the current node as visited.
- Repeat this process until all nodes have been visited.

3. **Termination:**
- Once all nodes have been processed, the algorithm terminates. At this point, the shortest
distance from the source node to each of the other nodes is known.

### Example:

Consider the following graph:

```
        A
       / \
      1   2
     /     \
    B---3---C
     \     /
      4   5
       \ /
        D
```

- Start with node **A** and set its distance to 0. Set distances for all other nodes to infinity.
- Process node **A**, update the distances to nodes **B** and **C**.
- Continue the process until all nodes are visited, updating the shortest path estimates as you go.
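
As a minimal sketch, the walk-through above can be coded in Python with a binary heap (`heapq`); the adjacency list below encodes the example graph (A-B weight 1, A-C weight 2, B-C weight 3, B-D weight 4, C-D weight 5):

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra with a binary heap, O((V + E) log V); graph maps each node to
    a list of (neighbor, weight) pairs with non-negative weights."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    pq = [(0, source)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph[u]:
            if d + w < dist[v]:        # a shorter path to v through u
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {'A': [('B', 1), ('C', 2)],
     'B': [('A', 1), ('C', 3), ('D', 4)],
     'C': [('A', 2), ('B', 3), ('D', 5)],
     'D': [('B', 4), ('C', 5)]}
print(dijkstra(g, 'A'))   # {'A': 0, 'B': 1, 'C': 2, 'D': 5}
```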

### Efficiency Analysis:

- **Time Complexity:**
- Using a **naive array-based implementation** (with linear search for the smallest unvisited
node), the time complexity of Dijkstra’s algorithm is **O(V²)**, where \(V\) is the number of
vertices (nodes) in the graph.
- If you use a **min-heap** (priority queue) for efficiently finding the smallest unvisited node,
the time complexity improves to **O((V + E) log V)**, where \(E\) is the number of edges.
- This is because:
- Extracting the minimum node takes \(O(\log V)\).
- Each edge update operation takes \(O(\log V)\), and there are \(E\) edges.

- **Space Complexity:**
The space complexity is **O(V + E)**, as you need to store the graph structure (vertices and
edges) and maintain data structures like the distance array and the priority queue.

### Summary of Dijkstra's Algorithm:

- **Purpose:** Finds the shortest path from a source node to all other nodes in a weighted graph.
- **Assumptions:** The graph must have non-negative edge weights.
- **Time Complexity:**
- With simple arrays: \(O(V^2)\)
- With priority queues (min-heaps): \(O((V + E) \log V)\)
- **Space Complexity:** \(O(V + E)\)

This makes Dijkstra's algorithm efficient for sparse graphs (graphs with relatively few edges)
when implemented with a priority queue. However, Dijkstra's algorithm cannot handle negative
edge weights; in that case other algorithms such as Bellman-Ford must be used, while A* can be
preferred when searching for a single destination with a good heuristic.
