Hessian Matrix

This document covers the second order conditions for optimization problems in intermediate microeconomics. It defines the Hessian matrix and the bordered Hessian matrix, which are used to determine whether a solution of the first order conditions actually maximizes the objective function. For an unconstrained problem, the Hessian must be negative semidefinite: its leading principal minors weakly alternate in sign, starting non-positive. For a constrained problem, the bordered Hessian must be negative semidefinite. A worked example checks the bordered Hessian for a problem with two choice variables and one constraint.


WENDNER: INTERMEDIATE MICROECONOMICS
Second Order Conditions

1. Notation and Basic Concepts

n = number of choice variables, m = number of constraints, f(x_1, ..., x_n) = objective function, g^i(x_1, ..., x_n) = i-th (non)linear constraint.

Hessian matrix of f(x_1, ..., x_n):

H = \begin{pmatrix}
f_{11} & f_{12} & \cdots & f_{1n} \\
f_{21} & f_{22} & \cdots & f_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
f_{n1} & f_{n2} & \cdots & f_{nn}
\end{pmatrix}    (1)

Leading principal minors of the Hessian matrix of f(x_1, ..., x_n):

|H_1| = |f_{11}| = f_{11},
|H_2| = \begin{vmatrix} f_{11} & f_{12} \\ f_{21} & f_{22} \end{vmatrix}, ...,
|H_k| = \begin{vmatrix}
f_{11} & f_{12} & \cdots & f_{1k} \\
f_{21} & f_{22} & \cdots & f_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
f_{k1} & f_{k2} & \cdots & f_{kk}
\end{vmatrix}, ...,
|H_n| = |H|,

where |H_i| denotes the i-th leading principal minor.

Bordered Hessian matrix of the Lagrange function L = f(x_1, ..., x_n) + \lambda_1 g^1(x_1, ..., x_n) + \lambda_2 g^2(x_1, ..., x_n) + ... + \lambda_m g^m(x_1, ..., x_n):

Hb = \begin{pmatrix}
L_{\lambda_1 \lambda_1} & \cdots & L_{\lambda_1 \lambda_m} & L_{\lambda_1 x_1} & \cdots & L_{\lambda_1 x_n} \\
\vdots & \ddots & \vdots & \vdots & & \vdots \\
L_{\lambda_m \lambda_1} & \cdots & L_{\lambda_m \lambda_m} & L_{\lambda_m x_1} & \cdots & L_{\lambda_m x_n} \\
L_{x_1 \lambda_1} & \cdots & L_{x_1 \lambda_m} & L_{x_1 x_1} & \cdots & L_{x_1 x_n} \\
\vdots & & \vdots & \vdots & \ddots & \vdots \\
L_{x_n \lambda_1} & \cdots & L_{x_n \lambda_m} & L_{x_n x_1} & \cdots & L_{x_n x_n}
\end{pmatrix}
= \begin{pmatrix}
0 & \cdots & 0 & g_1^1 & \cdots & g_n^1 \\
\vdots & \ddots & \vdots & \vdots & & \vdots \\
0 & \cdots & 0 & g_1^m & \cdots & g_n^m \\
g_1^1 & \cdots & g_1^m & f_{11} + \sum_{i=1}^m \lambda_i g_{11}^i & \cdots & f_{1n} + \sum_{i=1}^m \lambda_i g_{1n}^i \\
\vdots & & \vdots & \vdots & \ddots & \vdots \\
g_n^1 & \cdots & g_n^m & f_{n1} + \sum_{i=1}^m \lambda_i g_{n1}^i & \cdots & f_{nn} + \sum_{i=1}^m \lambda_i g_{nn}^i
\end{pmatrix}.    (2)

Ronald Wendner, Notes V-1, v1.1

If the constraints are linear, so that g_{ij}^k = 0 for i, j = 1, ..., n and k = 1, ..., m, the bordered Hessian becomes:

Hb = \begin{pmatrix}
0 & \cdots & 0 & g_1^1 & \cdots & g_n^1 \\
\vdots & \ddots & \vdots & \vdots & & \vdots \\
0 & \cdots & 0 & g_1^m & \cdots & g_n^m \\
g_1^1 & \cdots & g_1^m & f_{11} & \cdots & f_{1n} \\
\vdots & & \vdots & \vdots & \ddots & \vdots \\
g_n^1 & \cdots & g_n^m & f_{n1} & \cdots & f_{nn}
\end{pmatrix}.    (3)

Notice that quadrants 1 to 3 represent the border, and quadrant 4 equals the (un-bordered) Hessian matrix.

Leading principal minors of the bordered Hessian. As the second quadrant of the bordered Hessian matrix (= first m rows and columns) is a null matrix, the first leading principal minor of interest concerns the square matrix of the first m+2 rows and columns! I.e., the first leading principal minor we are looking at is |Hb_{m+2}|, and NOT |Hb_{m+1}|.
|Hb_{m+2}| = \begin{vmatrix}
0 & \cdots & 0 & g_1^1 & g_2^1 \\
\vdots & \ddots & \vdots & \vdots & \vdots \\
0 & \cdots & 0 & g_1^m & g_2^m \\
g_1^1 & \cdots & g_1^m & f_{11} & f_{12} \\
g_2^1 & \cdots & g_2^m & f_{21} & f_{22}
\end{vmatrix},

|Hb_{m+3}| = \begin{vmatrix}
0 & \cdots & 0 & g_1^1 & g_2^1 & g_3^1 \\
\vdots & \ddots & \vdots & \vdots & \vdots & \vdots \\
0 & \cdots & 0 & g_1^m & g_2^m & g_3^m \\
g_1^1 & \cdots & g_1^m & f_{11} & f_{12} & f_{13} \\
g_2^1 & \cdots & g_2^m & f_{21} & f_{22} & f_{23} \\
g_3^1 & \cdots & g_3^m & f_{31} & f_{32} & f_{33}
\end{vmatrix}, ...,

|Hb_{m+n}| = |Hb|.

2. Concavity and Negative (Semi)Definiteness of H

Suppose m = 0. Concavity of f(x_1, ..., x_n) is a sufficient (second order) condition for the first order conditions to actually represent a maximum for unconstrained maximization problems. Concavity of f(x_1, ..., x_n) implies (and is implied by) negative semi-definiteness of the Hessian matrix. The Hessian matrix is negative semi-definite if (i) the sign of the first leading principal minor is non-positive, i.e., f_{11} <= 0, and (ii) the signs of the further leading principal minors alternate. I.e., sign |H_i| = sign (-1)^i or |H_i| = 0. If all inequalities hold strictly, the Hessian matrix is (strictly) negative definite, f(x_1, ..., x_n) is strictly concave, and the maximizer (if it exists) is unique.
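The strict (negative definite) case of criteria (i) and (ii) can be checked mechanically: (-1)^k |H_k| must be strictly positive for every k. A minimal NumPy sketch; both quadratic objectives are illustrative choices, not from the notes:

```python
import numpy as np

def is_negative_definite(H):
    """Strict case: sign |H_k| = sign (-1)^k for every leading principal minor."""
    n = H.shape[0]
    for k in range(1, n + 1):
        minor = np.linalg.det(H[:k, :k])
        if minor * (-1) ** k <= 0:  # (-1)^k |H_k| must be strictly positive
            return False
    return True

# f(x, y) = -x^2 - y^2 + x*y is strictly concave: |H_1| = -2 < 0, |H_2| = 3 > 0
H_concave = np.array([[-2.0, 1.0], [1.0, -2.0]])
# f(x, y) = x^2 + y^2 is strictly convex, so the check fails at |H_1| = 2 > 0
H_convex = np.array([[2.0, 0.0], [0.0, 2.0]])

print(is_negative_definite(H_concave))  # True
print(is_negative_definite(H_convex))   # False
```

Note this sketch covers only the strict case; the semi-definite case with zero minors requires checking all principal minors, not just the leading ones.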


3. Negative (Semi)Definiteness of Hb

Suppose m > 0. Negative (semi)definiteness of Hb is a sufficient (second order) condition for the first order conditions to actually represent a maximum for constrained maximization problems. The bordered Hessian matrix is negative semi-definite if (i) the sign of the first leading principal minor |Hb_{m+2}| equals the sign of (-1)^{m+1} (or equals zero), and (ii) the signs of the further leading principal minors alternate. I.e., sign |Hb_{m+i}| = sign (-1)^{m+i-1} (or equal zero). If all inequalities hold strictly, the bordered Hessian matrix is (strictly) negative definite, and the maximizer (if it exists) is unique.

4. Example: n = 2, m = 1, linear constraint

The bordered Hessian matrix is:

Hb = \begin{pmatrix}
0 & g_1^1 & g_2^1 \\
g_1^1 & f_{11} & f_{12} \\
g_2^1 & f_{21} & f_{22}
\end{pmatrix},    (4)

and the only leading principal minor we are interested in is |Hb_{1+2}| = |Hb|. I.e.,

|Hb| = \begin{vmatrix}
0 & g_1^1 & g_2^1 \\
g_1^1 & f_{11} & f_{12} \\
g_2^1 & f_{21} & f_{22}
\end{vmatrix}.    (5)

If we are looking for a (unique) maximum, we need to verify that Hb is negative definite, i.e., sign |Hb| = sign (-1)^{1+1}, which is positive: |Hb| > 0. Notice: For this linear constraint, considering that f_1/(-g_1^1) = f_2/(-g_2^1) from the first order conditions, we exactly get our condition for quasiconcavity (with n = 2, m = 1).

Good luck for your Final!
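As a numerical check of this n = 2, m = 1 case, consider the illustrative problem max f(x, y) = xy subject to the linear constraint g(x, y) = 10 - x - y = 0 (this objective and constraint are example choices, not from the notes). The first order conditions give x = y = 5; the sketch below builds Hb as in equation (4) and evaluates |Hb|:

```python
import numpy as np

# Illustrative problem: max f(x, y) = x*y  s.t.  g(x, y) = 10 - x - y = 0
g1, g2 = -1.0, -1.0                       # constraint gradient (g_1, g_2)
f11, f12, f21, f22 = 0.0, 1.0, 1.0, 0.0  # Hessian of f (constraint is linear)

# Bordered Hessian as in equation (4)
Hb = np.array([[0.0, g1,  g2],
               [g1,  f11, f12],
               [g2,  f21, f22]])

det_Hb = np.linalg.det(Hb)
print(det_Hb)  # |Hb| = 2 > 0, matching sign (-1)^{1+1}: a maximum
```

Since |Hb| = 2 > 0 has the sign of (-1)^{1+1}, the candidate x = y = 5 from the first order conditions is indeed a maximum.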
