
ONLA

INSTRUCTIONS FOR QUALIFYING EXAMS

Start each problem on a new sheet of paper.


Write your university identification number at the top of each sheet of paper.

DO NOT WRITE YOUR NAME!


Complete this sheet and staple it to your answers. Read the exam directions very
carefully.

STUDENT ID NUMBER ___________________________________________________

DATE: ________________________________________________________________

EXAMINEES: DO NOT WRITE BELOW THIS LINE


****************************************************************************************************
1. __________________ 6. __________________

2. __________________ 7. __________________

3. __________________ 8. __________________

4. __________________ 9. __________________

5. __________________

Recommend pass/fail on this form.

Total score: ____________________


Qualifying Exam, Fall 2024

Optimization / Numerical Linear Algebra (ONLA)


DO NOT FORGET TO WRITE YOUR SID NO. ON YOUR EXAM. PLEASE USE THE BLANK
PAGES AT THE END FOR ADDITIONAL SPACE.

1. (10 points) Let A be a unitary matrix.


a) Prove that the condition number of A is equal to 1.
b) Prove that A is orthogonally diagonalizable.
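A quick numerical sanity check of part (a), assuming NumPy is available (an illustration with a randomly generated orthogonal matrix, not a proof; "condition number" is taken in the 2-norm here):

```python
import numpy as np

# Illustration (not a proof): the 2-norm condition number of an
# orthogonal/unitary matrix is 1, since all its singular values equal 1.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # Q is orthogonal
cond_Q = np.linalg.cond(Q, 2)
sigmas = np.linalg.svd(Q, compute_uv=False)        # singular values of Q
```
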

2. (10 points) Let A be a real square matrix that has eigendecomposition A = VΛV⁻¹ (here Λ is the diagonal
matrix of eigenvalues and V is the nonsingular eigenvector matrix). Suppose that a perturbation A + δA has
eigenvalue µ. Prove that there exists some eigenvalue λ of A such that |λ − µ| ≤ κ(V)‖δA‖, where κ(V) denotes
the condition number of V and ‖·‖ the spectral norm.
Hint: You may wish to first prove that if µ is not an eigenvalue of A, then −1 is an eigenvalue of
(Λ − µI)⁻¹V⁻¹(δA)V.
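The bound can be checked numerically before proving it. The following sketch (NumPy assumed; the matrices and perturbation size are arbitrary choices) verifies that every eigenvalue of the perturbed matrix lies within the claimed distance of some eigenvalue of A:

```python
import numpy as np

# Numerical illustration of the bound, not a proof. A is built from an
# arbitrary (almost surely nonsingular) V and a fixed diagonal Lambda.
rng = np.random.default_rng(1)
V = rng.standard_normal((4, 4)) + np.eye(4)
Lam = np.diag([1.0, 2.0, 3.0, 4.0])
A = V @ Lam @ np.linalg.inv(V)
dA = 1e-3 * rng.standard_normal((4, 4))            # small perturbation

bound = np.linalg.cond(V, 2) * np.linalg.norm(dA, 2)

# Largest distance from any eigenvalue mu of A + dA to the nearest
# eigenvalue lambda of A; the claim says this never exceeds the bound.
mus = np.linalg.eigvals(A + dA)
lams = np.diag(Lam)
worst_gap = max(min(abs(l - mu) for l in lams) for mu in mus)
```
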

3. (10 points) Assume the fundamental axiom of floating point arithmetic is in place.
a) Prove or disprove that backward substitution is backward stable for a 2 × 2 upper triangular system.
b) Prove or disprove that the addition of 1 is backward stable (i.e., the algorithm defined by f̃(x) = fl(x) ⊕ 1,
where ⊕ denotes floating point addition and fl(x) denotes the floating point representation of x).
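One data point for part (b), using standard IEEE double precision (a numerical hint only; the actual argument must proceed from the floating point axioms):

```python
# For tiny x, fl(x) + 1 rounds to exactly 1.0, so the computed result
# equals the exact result for the perturbed input x~ = 0. The backward
# error |x~ - x| then has relative size 1, far from O(machine epsilon).
x = 1e-20
result = x + 1.0          # rounds to exactly 1.0 in double precision
x_tilde = result - 1.0    # the input the computed result "came from"
rel_backward_err = abs(x_tilde - x) / abs(x)
```
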

4. (10 points) Consider Ax = b with A ∈ R^{2×2} solved with Gauss-Seidel and Jacobi iterations.
a) Derive the spectral radius of the iteration matrix for each method.
b) Prove that Gauss-Seidel converges if and only if Jacobi converges.
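A suggestive numerical check (NumPy assumed; the entries of A below are arbitrary sample values): forming both iteration matrices explicitly, the Gauss-Seidel spectral radius comes out as the square of the Jacobi one, which is the 2 × 2 relationship underlying part (b).

```python
import numpy as np

# Sample 2x2 splitting A = D + L + U and the two iteration matrices.
A = np.array([[4.0, 1.0],
              [2.0, 5.0]])
D = np.diag(np.diag(A))
L = np.tril(A, -1)
U = np.triu(A, 1)

M_jac = np.linalg.inv(D) @ (-(L + U))   # Jacobi iteration matrix
M_gs = np.linalg.inv(D + L) @ (-U)      # Gauss-Seidel iteration matrix

rho_jac = max(abs(np.linalg.eigvals(M_jac)))
rho_gs = max(abs(np.linalg.eigvals(M_gs)))
```
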

5. (10 points) Consider Ax = b:

    [ 2  −1   0 ] [x1]   [4]
    [−1   2  −1 ] [x2] = [0]
    [ 0  −1   2 ] [x3]   [0]

and the Conjugate Gradient (CG) algorithm

    r0 = b − Ax0,  p0 = r0
    for i = 0, 1, 2, . . .
        αi = (riᵀ ri)/(piᵀ A pi)
        xi+1 = xi + αi pi
        ri+1 = ri − αi A pi
        βi = (ri+1ᵀ ri+1)/(riᵀ ri)
        pi+1 = ri+1 + βi pi

a) With kth Krylov subspace Kk = span{r0 , Ar0 , . . . , Ak−1 r0 }, determine the vectors defining the Krylov
spaces for k ≤ 3, taking initial approximation x0 = 0.
b) Solve Ax = b with CG using zero initial guess x0 = 0.
c) Verify that r0 , . . . , rk−1 form an orthogonal basis for Kk for k = 1, 2, 3.
d) Verify that p0 , . . . , pk−1 form an A-orthogonal basis for Kk for k = 1, 2, 3.
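Hand computations for parts (b)-(d) can be checked against a direct NumPy transcription of the iteration above (three steps reach the exact solution for this 3 × 3 system):

```python
import numpy as np

# CG on the given system with zero initial guess, storing the residuals
# r_i and search directions p_i for the orthogonality checks.
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
b = np.array([4.0, 0.0, 0.0])

x = np.zeros(3)
r = b - A @ x
p = r.copy()
rs, ps = [r.copy()], [p.copy()]
for _ in range(3):
    alpha = (r @ r) / (p @ (A @ p))
    x = x + alpha * p
    r_new = r - alpha * (A @ p)
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    rs.append(r_new.copy()); ps.append(p.copy())
    r = r_new
```
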

6. (10 points) Recall that the Lanczos iteration tridiagonalizes a Hermitian A by building towards

    Tn = [ α1  β1                  ]
         [ β1  α2  β2              ]
         [     β2  α3   ⋱          ]
         [          ⋱   ⋱   βn−1   ]
         [             βn−1   αn   ]

The Lanczos algorithm is given by

    β0 = 0,  q0 = 0,  b = arbitrary,  q1 = b/‖b‖
    for n = 1, 2, 3, . . .
        v = Aqn
        αn = qnᵀ v
        v = v − βn−1 qn−1 − αn qn
        βn = ‖v‖
        qn+1 = v/βn

Assuming exact arithmetic, prove that during Lanczos iterations qj+1 is orthogonal to q1 , q2 , . . . , qj .
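A numerical companion to the proof (NumPy assumed; the symmetric test matrix and step count are arbitrary choices): running the iteration above and forming the Gram matrix of the Lanczos vectors shows they are orthonormal up to rounding error.

```python
import numpy as np

# Lanczos on a small random symmetric matrix; the columns of Q are the
# Lanczos vectors q_1, ..., q_{n_steps+1}.
rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
A = (M + M.T) / 2                      # symmetric (real Hermitian)
b = rng.standard_normal(6)

n_steps = 4
Q = np.zeros((6, n_steps + 1))
Q[:, 0] = b / np.linalg.norm(b)
beta_prev, q_prev = 0.0, np.zeros(6)
for n in range(n_steps):
    v = A @ Q[:, n]
    alpha = Q[:, n] @ v
    v = v - beta_prev * q_prev - alpha * Q[:, n]
    beta_prev = np.linalg.norm(v)
    q_prev = Q[:, n]
    Q[:, n + 1] = v / beta_prev

gram = Q.T @ Q   # identity in exact arithmetic
```
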

7. (10 points) Consider the problem

extremize x1x2 + x1² subject to x1² − 2 ≤ x2 ≤ −x1² + 2.

(a) Write down the KKT conditions for this problem and find all points that satisfy them.
(b) Determine whether or not the points in part (a) satisfy the second order necessary conditions for being local
maximizers or minimizers.
(c) Determine whether or not the points that satisfy the necessary conditions in part (b) also satisfy the second
order sufficient conditions for being local maximizers or minimizers.
Hint: Draw a rough sketch of the objective function and the constraints. Maximization of a function f can be
treated as minimization of −f .
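A crude grid search over the feasible region (NumPy assumed; grid bounds and spacing are arbitrary) gives approximate extreme values to compare against the KKT analysis. It is a sanity check only, accurate to roughly the grid spacing:

```python
import numpy as np

# Enumerate a grid, keep only feasible points, and record the extreme
# objective values. The feasible set forces |x1| <= sqrt(2).
x1 = np.linspace(-1.5, 1.5, 601)
x2 = np.linspace(-2.0, 2.0, 801)
X1, X2 = np.meshgrid(x1, x2)
feasible = (X1**2 - 2 <= X2) & (X2 <= -X1**2 + 2)
F = X1 * X2 + X1**2                  # objective x1*x2 + x1^2
f_max = F[feasible].max()
f_min = F[feasible].min()
```
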

8. (10 points) Let X ⊆ Rn be a convex set and let f, g1 , . . . , gm be convex functions over X. Assume there exists
x̂ ∈ X such that
g1 (x̂) < 0, . . . , gm (x̂) < 0.
Let c ∈ R. Show that the following are equivalent (nonlinear Farkas lemma).

(a) The following implication holds:

x ∈ X, g1 (x) ≤ 0, . . . , gm (x) ≤ 0 ⇒ f (x) ≥ c.

(b) There exist λ1 , . . . , λm ≥ 0 such that

    min_{x ∈ X} { f(x) + Σ_{i=1}^{m} λi gi(x) } ≥ c.


Hint: One direction is easy. For the other direction you may use the following result on the separation of two
convex sets: Let C1 , C2 ⊆ Rn be two nonempty convex sets with C1 ∩ C2 = ∅. Then there exists a ∈ Rn , a ≠ 0,
with aᵀx ≤ aᵀy for any x ∈ C1 , y ∈ C2 .

9. (10 points) Let f ∈ C_L^{1,1}(Rn) and assume that ∇²f(x) ⪰ 0 (positive semi-definite) for all x ∈ Rn. Suppose that
the optimal value of the problem min_{x∈Rn} f(x) is f*. Let {xk}k≥0 be the sequence generated by the gradient
descent method with constant stepsize 1/L. Show that if {xk}k≥0 is bounded, then f(xk) → f* as k → ∞.
Hint: C_L^{1,1}(Rn) denotes the set of continuously differentiable functions on Rn whose gradient satisfies
‖∇f(x) − ∇f(y)‖₂ ≤ L‖x − y‖₂ for all x, y ∈ Rn.
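The claimed behavior can be observed on a simple convex quadratic. The sketch below (NumPy assumed; the singular PSD Hessian is a hand-picked example) shows f(xk) reaching f* = 0 even though the minimizer is non-unique:

```python
import numpy as np

# Gradient descent with stepsize 1/L on f(x) = 0.5 x^T H x, where H is
# PSD and singular (eigenvalues 0 and 2), so L = 2 and f* = 0.
H = np.array([[1.0, 1.0],
              [1.0, 1.0]])
L_const = 2.0
f = lambda x: 0.5 * x @ H @ x

x = np.array([5.0, 3.0])                 # arbitrary starting point
for _ in range(50):
    x = x - (1.0 / L_const) * (H @ x)    # grad f(x) = Hx
```
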
