Exercises in Artificial Intelligence
These exercises
cover various AI topics, including search algorithms, machine learning, and logic.
---
**Problem:**
Consider the following graph where each node represents a city, and the edges represent roads
with associated costs. The heuristic values (h(n)) represent the estimated cost from each city to
the goal city (G).
```
Cities: A, B, C, D, G
Edges and Costs:
- A -> B (cost = 4)
- A -> C (cost = 2)
- B -> D (cost = 5)
- C -> D (cost = 8)
- D -> G (cost = 3)
```
Use the **A* search algorithm** to find the shortest path from city **A** to city **G**. Show the
steps and the final path.
---
**Solution:**
1. **Initialization:**
- Start at node **A**.
- **f(n) = g(n) + h(n)**, where:
- **g(n)** is the cost from the start node to the current node.
- **h(n)** is the heuristic cost from the current node to the goal.
2. **Expansion:**
- A* repeatedly expands the frontier node with the lowest f(n). With an admissible heuristic (one that never overestimates the true remaining cost), A* is guaranteed to return an optimal path.
- The only two routes from **A** to **G** are A -> B -> D -> G (cost 4 + 5 + 3 = 12) and A -> C -> D -> G (cost 2 + 8 + 3 = 13), so A* settles on the cheaper one.
3. **Final Path:**
- **A -> B -> D -> G** with total cost **12** (see the runnable sketch below).
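For reference, here is a minimal Python sketch of this A* search on the graph above. Because the problem statement does not list the individual \(h(n)\) values, the `h` dictionary below uses illustrative admissible values (an assumption, not given in the problem); any admissible heuristic leads A* to the same optimal path.

```python
import heapq

def a_star(graph, h, start, goal):
    """A* search: always expand the frontier node with the lowest f = g + h."""
    # Each entry is (f, g, node, path); g breaks ties in favor of cheaper paths.
    frontier = [(h[start], 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for neighbor, cost in graph.get(node, []):
            new_g = g + cost
            if new_g < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = new_g
                heapq.heappush(
                    frontier,
                    (new_g + h[neighbor], new_g, neighbor, path + [neighbor]),
                )
    return None, float("inf")

# Graph from the problem statement.
graph = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("D", 8)],
    "D": [("G", 3)],
}
# Assumed admissible heuristic values (not given in the problem):
# each one underestimates the true remaining cost to G.
h = {"A": 7, "B": 5, "C": 8, "D": 3, "G": 0}

path, cost = a_star(graph, h, "A", "G")
print(path, cost)  # ['A', 'B', 'D', 'G'] 12
```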
---
**Problem:**
You are given the following dataset:
| X (Feature) | Y (Target) |
|-------------|------------|
| 1           | 2          |
| 2           | 4          |
| 3           | 6          |
| 4           | 8          |
Use **linear regression** to predict the value of **Y** when **X = 5**. Assume the linear model
is of the form:
\[
Y = \theta_0 + \theta_1 X
\]
Use **gradient descent** to find the values of \(\theta_0\) and \(\theta_1\). Initialize \(\theta_0 = 0\) and \(\theta_1 = 0\), and use a learning rate (\(\alpha\)) of **0.1**.
---
**Solution:**
1. **Initialize Parameters:**
- \(\theta_0 = 0\), \(\theta_1 = 0\)
- Learning rate (\(\alpha\)) = 0.1
2. **Iteration 1:**
- Compute predictions:
- For \(X = 1\): \(h(1) = 0 + 0 \times 1 = 0\)
- For \(X = 2\): \(h(2) = 0 + 0 \times 2 = 0\)
- For \(X = 3\): \(h(3) = 0 + 0 \times 3 = 0\)
- For \(X = 4\): \(h(4) = 0 + 0 \times 4 = 0\)
- Compute gradients (for the squared-error cost \(J = \frac{1}{2m} \sum_{i=1}^{m} (h(x_i) - y_i)^2\), these are \(\frac{\partial J}{\partial \theta_0} = \frac{1}{m} \sum_i (h(x_i) - y_i)\) and \(\frac{\partial J}{\partial \theta_1} = \frac{1}{m} \sum_i (h(x_i) - y_i) x_i\)):
- \(\frac{\partial}{\partial \theta_0} = \frac{1}{4} [(0-2) + (0-4) + (0-6) + (0-8)] = -5\)
- \(\frac{\partial}{\partial \theta_1} = \frac{1}{4} [(0-2) \times 1 + (0-4) \times 2 + (0-6) \times 3
+ (0-8) \times 4] = -15\)
- Update parameters:
- \(\theta_0 := 0 - 0.1 \times (-5) = 0.5\)
- \(\theta_1 := 0 - 0.1 \times (-15) = 1.5\)
3. **Iteration 2:**
- Compute predictions:
- For \(X = 1\): \(h(1) = 0.5 + 1.5 \times 1 = 2\)
- For \(X = 2\): \(h(2) = 0.5 + 1.5 \times 2 = 3.5\)
- For \(X = 3\): \(h(3) = 0.5 + 1.5 \times 3 = 5\)
- For \(X = 4\): \(h(4) = 0.5 + 1.5 \times 4 = 6.5\)
- Compute gradients:
- \(\frac{\partial}{\partial \theta_0} = \frac{1}{4} [(2-2) + (3.5-4) + (5-6) + (6.5-8)] = -0.75\)
- \(\frac{\partial}{\partial \theta_1} = \frac{1}{4} [(2-2) \times 1 + (3.5-4) \times 2 + (5-6) \times 3 + (6.5-8) \times 4] = -2.5\)
- Update parameters:
- \(\theta_0 := 0.5 - 0.1 \times (-0.75) = 0.575\)
- \(\theta_1 := 1.5 - 0.1 \times (-2.5) = 1.75\)
4. **Final Model:**
- After many further iterations, the parameters converge to:
- \(\theta_0 \approx 0\), \(\theta_1 \approx 2\)
- The linear model is:
\[
Y = 2X
\]
- For \(X = 5\), the predicted value is:
\[
Y = 2 \times 5 = 10
\]
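The two hand-computed iterations above, and the claimed convergence, can be reproduced with a short batch gradient-descent loop. This is a minimal Python sketch using the same initialization and learning rate; the iteration count of 1000 is an arbitrary choice that is sufficient for convergence here, not part of the problem:

```python
# Dataset from the problem.
X = [1, 2, 3, 4]
Y = [2, 4, 6, 8]

theta0, theta1 = 0.0, 0.0  # initial parameters
alpha = 0.1                # learning rate
m = len(X)

for it in range(1, 1001):
    # Prediction errors h(x) - y, with h(x) = theta0 + theta1 * x.
    errors = [(theta0 + theta1 * x) - y for x, y in zip(X, Y)]
    # Gradients of J = (1/2m) * sum(error^2).
    grad0 = sum(errors) / m
    grad1 = sum(e * x for e, x in zip(errors, X)) / m
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1
    if it <= 2:
        print(f"iteration {it}: theta0={theta0:.3f}, theta1={theta1:.3f}")
# iteration 1: theta0=0.500, theta1=1.500
# iteration 2: theta0=0.575, theta1=1.750

print(f"final: theta0={theta0:.3f}, theta1={theta1:.3f}")  # ~0.000 and ~2.000
print("prediction for X = 5:", theta0 + theta1 * 5)        # ~10.0
```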
---
**Problem:**
Given the following logical statements:
1. If it rains, then the ground will be wet. (\(R \rightarrow W\))
2. The ground is wet. (\(W\))
Can we conclude that it rained? Use logical reasoning to justify your answer.
---
**Solution:**
1. **Logical Statements:**
- \(R \rightarrow W\): If it rains, then the ground will be wet.
- \(W\): The ground is wet.
2. **Logical Reasoning:**
- The statement \(R \rightarrow W\) means that rain is a sufficient condition for the ground to
be wet, but it is not the only possible cause. The ground could be wet for other reasons (e.g.,
someone spilled water).
- Therefore, knowing that the ground is wet (\(W\)) does not necessarily imply that it rained
(\(R\)).
3. **Conclusion:**
- We **cannot** conclude that it rained based solely on the given statements; inferring \(R\) from \(R \rightarrow W\) and \(W\) is the classic fallacy of **affirming the consequent**.
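Since \(R\) and \(W\) are the only propositions involved, this can also be checked mechanically by enumerating all truth assignments and looking for one that satisfies both premises while \(R\) is false. A minimal Python sketch (the loop below is illustrative, not part of the exercise):

```python
from itertools import product

# Enumerate every truth assignment for R and W.
for R, W in product([False, True], repeat=2):
    premises_hold = (not R or W) and W  # (R -> W) is "not R or W"; second premise is W
    if premises_hold and not R:
        print(f"Counterexample: R={R}, W={W} makes both premises true.")
# Prints the assignment R=False, W=True, so the premises do not entail R.
```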
---
These exercises cover fundamental AI concepts and provide step-by-step solutions to help
reinforce understanding.