2 Marks Questions
5. **State the KKT conditions and explain their significance in convex optimization:**
- **KKT Conditions:** For the problem \(\min f(x)\) subject to \(g_i(x) \leq 0\), \(i = 1, \ldots, m\), and \(h_j(x) = 0\), \(j = 1, \ldots, p\):
1. Stationarity: \(\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) + \sum_{j=1}^{p} \mu_j \nabla h_j(x^*) = 0\)
2. Primal Feasibility: \(g_i(x^*) \leq 0, \quad i = 1, \ldots, m\) and \(h_j(x^*) = 0, \quad j = 1, \ldots, p\)
3. Dual Feasibility: \(\lambda_i \geq 0, \quad i = 1, \ldots, m\)
4. Complementary Slackness: \(\lambda_i g_i(x^*) = 0, \quad i = 1, \ldots, m\)
- **Significance:** For a convex problem that satisfies a constraint qualification such as Slater's condition, the KKT conditions are both necessary and sufficient for global optimality. They therefore characterize the optimal solution exactly, and they underpin duality theory and many algorithms for solving convex optimization problems.
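- **Numerical check (illustrative):** A minimal sketch verifying the four conditions on an assumed toy problem, \(\min (x - 2)^2\) subject to \(x - 1 \leq 0\) (this example is not from the original text); the constrained minimizer is \(x^* = 1\) with multiplier \(\lambda^* = 2\).

```python
import numpy as np

# Assumed toy problem: minimize f(x) = (x - 2)^2 subject to g(x) = x - 1 <= 0.
# Candidate KKT point: x* = 1 with multiplier lambda* = 2.
grad_f = lambda x: 2.0 * (x - 2.0)
g = lambda x: x - 1.0
grad_g = lambda x: 1.0

x_star, lam = 1.0, 2.0

stationarity = np.isclose(grad_f(x_star) + lam * grad_g(x_star), 0.0)
primal_feasible = g(x_star) <= 1e-12   # g(x*) <= 0
dual_feasible = lam >= 0.0
comp_slackness = np.isclose(lam * g(x_star), 0.0)

# All four checks print True, so x* = 1 satisfies the KKT conditions
# and (by convexity) is the global minimizer.
print(stationarity, primal_feasible, dual_feasible, comp_slackness)
```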
2. **Subgradient:**
- **Definition:** A subgradient of a convex function \(f\) at a point \(x\) is any vector \(g\) that
satisfies \(f(y) \geq f(x) + g^\top (y - x)\) for all \(y\).
- **Usefulness:** Subgradients extend the concept of gradients to points where a convex function is not differentiable, which is common in non-smooth convex optimization (e.g., the \(\ell_1\) norm or the hinge loss). For example, \(f(x) = |x|\) is not differentiable at \(x = 0\), but every \(g \in [-1, 1]\) is a subgradient there, and subgradient methods can use any such \(g\) in place of the gradient.
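- **Sketch (illustrative):** A minimal subgradient-method sketch on the assumed example \(f(x) = |x - 3|\) with diminishing step size \(1/k\) (the example and step-size rule are illustrative choices, not from the original text).

```python
import numpy as np

# Subgradient method on the assumed example f(x) = |x - 3|,
# which is convex but not differentiable at its minimizer x = 3.
def subgrad(x):
    # sign(x - 3) is a subgradient everywhere; at the kink x = 3,
    # np.sign returns 0, which is also a valid subgradient.
    return np.sign(x - 3.0)

x = 0.0
for k in range(1, 1001):
    x -= (1.0 / k) * subgrad(x)  # diminishing step size 1/k

print(x)  # approaches the minimizer 3
```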
3. **Monotone Operators:**
- **Definition:** An operator \(T\) on a real Hilbert space is monotone if \(\langle T(x) - T(y),\, x - y \rangle \geq 0\) for all \(x, y\) in its domain.
- **Relevance in Optimization:** Monotone operators play a crucial role in variational inequalities and in operator-splitting methods for optimization, especially for non-smooth convex functions: the subdifferential of a convex function is a (maximal) monotone operator, so monotonicity generalizes the gradient to non-differentiable functions, and \(x^*\) minimizes \(f\) exactly when \(0 \in \partial f(x^*)\).
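- **Numerical check (illustrative):** A minimal sketch verifying monotonicity for the assumed operator \(T(x) = Ax\) with \(A\) positive semidefinite, i.e., the gradient of the convex quadratic \(\tfrac{1}{2} x^\top A x\) (the example is not from the original text).

```python
import numpy as np

# Assumed example: T(x) = A x with A = M^T M positive semidefinite,
# the gradient of the convex quadratic 0.5 * x^T A x.
# Then <T(x) - T(y), x - y> = (x - y)^T A (x - y) >= 0.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M  # positive semidefinite by construction
T = lambda x: A @ x

for _ in range(1000):
    x, y = rng.standard_normal(5), rng.standard_normal(5)
    assert (T(x) - T(y)) @ (x - y) >= -1e-10  # monotonicity (up to roundoff)

print("monotonicity verified on random samples")
```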
4. **Douglas–Rachford Splitting Technique:**
- **Explanation:** The Douglas–Rachford splitting technique is an iterative algorithm for minimizing the sum of two convex functions, each of which has an easy-to-evaluate proximal operator. It is particularly useful for problems with separable or composite structure, and neither function needs to be smooth.
- **Solution Process:** To minimize \(f(x) + g(x)\) with step size \(t > 0\), iterate (a sketch follows after the next bullet):
1. \(x^{k} = \operatorname{prox}_{tf}(z^{k})\)
2. \(y^{k} = \operatorname{prox}_{tg}(2x^{k} - z^{k})\), where \(2x^{k} - z^{k}\) is the reflection step
3. \(z^{k+1} = z^{k} + y^{k} - x^{k}\)
4. Repeat until convergence; \(x^{k}\) converges to a minimizer of \(f + g\).
- **Applications:** It is commonly used in signal processing, image reconstruction, and
machine learning problems where the objective function can be expressed as the sum of two
simpler functions.
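- **Sketch (illustrative):** A minimal Python sketch of the iteration above on an assumed toy problem, \(\min_x \tfrac{1}{2}\|x - b\|^2 + \lambda \|x\|_1\) (the problem, data, and step size are illustrative choices, not from the original text); both proximal operators have closed forms.

```python
import numpy as np

# Assumed toy problem: minimize f(x) + g(x) with
#   f(x) = 0.5 * ||x - b||^2   (prox is a weighted average with b)
#   g(x) = lam * ||x||_1       (prox is soft thresholding)
b, lam, t = np.array([3.0, -0.2, 1.5]), 1.0, 1.0

def prox_f(v, t):
    # argmin_x 0.5*||x - b||^2 + (1/(2t))*||x - v||^2
    return (v + t * b) / (1.0 + t)

def prox_g(v, t):
    # argmin_x lam*||x||_1 + (1/(2t))*||x - v||^2: soft threshold at t*lam
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

z = np.zeros_like(b)
for _ in range(200):
    x = prox_f(z, t)          # step 1: prox of f
    y = prox_g(2 * x - z, t)  # step 2: prox of g at the reflection 2x - z
    z = z + y - x             # step 3: averaging update

print(x)  # converges to the soft threshold of b: approximately [2., 0., 0.5]
```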