
Microeconomic Theory

by Nicholson & Snyder

1 Chapter 2 (2/2)
Mathematics for Microeconomics
2 Constrained Maximization

In many cases, decision making (optimization) is done with some constraints
- Consumers' choices are limited by the amount of their budget
One method used to solve constrained maximization problems is the Lagrangian multiplier method
3 Lagrangian Multiplier Method

Suppose that we wish to find the values of $x_1, x_2$ that maximize
$y = f(x_1, x_2)$
subject to a constraint
$g(x_1, x_2) = 0$
The Lagrangian multiplier method starts with setting up the expression
$\mathcal{L} = f(x_1, x_2) + \lambda g(x_1, x_2)$
$\lambda$ is called a Lagrangian multiplier
4 Lagrangian Multiplier Method

First-Order Conditions:
$\partial \mathcal{L}/\partial x_1 = f_1 + \lambda g_1 = 0$
$\partial \mathcal{L}/\partial x_2 = f_2 + \lambda g_2 = 0$
$\partial \mathcal{L}/\partial \lambda = g(x_1, x_2) = 0$
From these equations, we obtain solutions $x_1^*, x_2^*, \lambda^*$
5 Lagrangian Multiplier Method

The solution will have two properties:
- $x_1^*, x_2^*$ will obey the constraint: $g(x_1^*, x_2^*) = 0$
- $x_1^*, x_2^*$ will make the value of $f$ (and therefore $\mathcal{L}$) as large as possible
The Lagrangian multiplier ($\lambda$) has an important economic interpretation
6 Lagrangian Multiplier Method

The first-order conditions imply that
$\lambda = \frac{f_1}{-g_1} = \frac{f_2}{-g_2}$
- the numerators ($f_i$) measure the additional benefit of one more unit of $x_i$ (marginal benefit of $x_i$)
- the denominators ($-g_i$) reflect the additional burden on the constraint of using more $x_i$ (marginal cost of $x_i$)
7 Lagrangian Multiplier Method

At the optimum, the ratio of the marginal benefit to the marginal cost of $x_i$ should be the same for every $x_i$
- $\lambda$ is the common cost-benefit ratio for all $x_i$
- $\lambda$ has another important interpretation (later)
8 Constrained Maximization

【Exercise】
Suppose that a farmer had a certain length of fence ($P$) and wished to enclose the largest possible rectangular area
- let $x$ and $y$ be the lengths of the sides
- Problem: choose $x$ and $y$ to maximize the area ($A = xy$) subject to the constraint that the perimeter is fixed at $P = 2x + 2y$
9 Constrained Maximization

Setting up the Lagrangian:
$\mathcal{L} = xy + \lambda(P - 2x - 2y)$
The first-order conditions for a maximum are
$\partial \mathcal{L}/\partial x = y - 2\lambda = 0$
$\partial \mathcal{L}/\partial y = x - 2\lambda = 0$
$\partial \mathcal{L}/\partial \lambda = P - 2x - 2y = 0$
10 Constrained Maximization

From the first and second equations, we obtain $x = 2\lambda$ and $y = 2\lambda$, and therefore $x = y$ (the field should be square)
Substituting $x = y = 2\lambda$ into the third equation,
$P - 8\lambda = 0$
$\lambda = P/8$, and $x = y = P/4$
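The algebra above is easy to cross-check; here is a minimal sympy sketch (my addition, not part of the lecture) that solves the same first-order conditions symbolically:

```python
# Verify the fence problem: maximize A = x*y subject to P - 2x - 2y = 0.
import sympy as sp

x, y, lam, P = sp.symbols('x y lam P', positive=True)
L = x*y + lam*(P - 2*x - 2*y)  # the Lagrangian from slide 9

# First-order conditions: all partial derivatives of L equal zero
focs = [sp.diff(L, v) for v in (x, y, lam)]
print(sp.solve(focs, (x, y, lam), dict=True)[0])
# solution: x = P/4, y = P/4, lam = P/8 -> the field is square
```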
11 Interpretation of the Lagrangian multiplier

Another interpretation of the Lagrangian multiplier:
$\lambda$ indicates the additional loss (gain) of area ($A$) when the length of fence ($P$) decreases (increases) by an extra meter
Because $A = xy = P^2/16$ and therefore $\partial A/\partial P = P/8 = \lambda$, a one-unit decrease of fence reduces the area by $\lambda = P/8$
12 Interpretation of the Lagrangian multiplier

Thus, the Lagrangian multiplier provides information about the marginal cost (benefit) of the constraint
- If the constraint were tightened slightly, we would lose the amount $\lambda$ (= cost of the constraint)
- In technical terms, $\lambda$ is called the "shadow price" of the constraint
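A small numeric illustration of the shadow-price reading (my sketch; the value $P = 400$ is chosen arbitrarily):

```python
# Check that dA*/dP equals the multiplier lam = P/8.
# The maximized area from slide 10 is A*(P) = (P/4)**2 = P**2/16.
P = 400.0                        # illustrative fence length
A = lambda p: (p / 4) ** 2       # maximized area as a function of P
dA_dP = (A(P + 1e-4) - A(P - 1e-4)) / 2e-4   # central finite difference
print(round(dA_dP, 6), P / 8)    # both are 50.0: one extra meter of
                                 # fence adds about lam = 50 units of area
```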
13 Duality

Consider the previous problem from the opposite point of view:
Choose $x$ and $y$ to minimize the length of fence required to surround the field (dual problem)
- Minimize $P = 2x + 2y$
- subject to $A - xy = 0$
14 Duality

Setting up the Lagrangian:
$\mathcal{L}^D = 2x + 2y + \lambda^D (A - xy)$
First-order conditions:
$\partial \mathcal{L}^D/\partial x = 2 - \lambda^D y = 0$
$\partial \mathcal{L}^D/\partial y = 2 - \lambda^D x = 0$
$\partial \mathcal{L}^D/\partial \lambda^D = A - xy = 0$
15 Duality

Solving, we get
$x = 2/\lambda^D$, $y = 2/\lambda^D$
so $x = y = \sqrt{A}$ and $\lambda^D = 2/\sqrt{A}$
Thus, we obtain the same solution as in the previous problem: the field should be square
This property is called duality
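The dual can be verified the same way; a sympy sketch (again my addition) mirroring the setup on slide 14:

```python
# Dual problem: minimize P = 2x + 2y subject to a fixed area A - x*y = 0.
import sympy as sp

x, y, lamD, A = sp.symbols('x y lam_D A', positive=True)
LD = 2*x + 2*y + lamD*(A - x*y)  # the dual Lagrangian

focs = [sp.diff(LD, v) for v in (x, y, lamD)]
print(sp.solve(focs, (x, y, lamD), dict=True)[0])
# solution: x = sqrt(A), y = sqrt(A), lam_D = 2/sqrt(A) -> a square field again
```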
16 Inequality Constraints

In some economic problems, the constraints need not hold exactly
Suppose we seek to maximize $y = f(x_1, x_2)$ subject to
$g(x_1, x_2) \geq 0$
17 Inequality Constraints

One way to solve this problem is to introduce a new variable ($a$) that converts the inequality into an equality:
$g(x_1, x_2) - a^2 = 0$
The square (the 2 in the superscript) is to ensure the slack term is non-negative ($a^2 \geq 0$)
18 Inequality Constraints

Any solution that obeys the equality constraint will also obey the inequality constraint
We can set up the Lagrangian:
$\mathcal{L} = f(x_1, x_2) + \lambda [g(x_1, x_2) - a^2]$
There will be 4 first-order conditions, because there are 4 variables ($x_1$, $x_2$, $a$, $\lambda$)
19 Inequality Constraints

The four first-order conditions are
$\partial \mathcal{L}/\partial x_1 = f_1 + \lambda g_1 = 0$
$\partial \mathcal{L}/\partial x_2 = f_2 + \lambda g_2 = 0$
$\partial \mathcal{L}/\partial a = -2a\lambda = 0$
$\partial \mathcal{L}/\partial \lambda = g(x_1, x_2) - a^2 = 0$
20 Inequality Constraints

According to the third condition, either $a = 0$ or $\lambda = 0$
- If $a = 0$, then $a^2 = 0$ and the constraint binds: $g(x_1, x_2) = 0$
- If $\lambda = 0$ ($a \neq 0$ and $a^2 > 0$), the constraint is slack: $g(x_1, x_2) > 0$
21 Inequality Constraints

These results are called Kuhn-Tucker conditions
Kuhn-Tucker conditions are usually expressed as
$\lambda \, g(x_1, x_2) = 0$, with $\lambda \geq 0$ and $g(x_1, x_2) \geq 0$
- Because $g(x_1, x_2) = a^2$ and $a\lambda = 0$, we have $\lambda g(x_1, x_2) = \lambda a^2 = 0$
Thus, solutions to problems involving inequality constraints will differ from those involving equality constraints
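A compact sympy sketch of the slack-variable trick (the one-variable example is mine; the method is the one on slides 17-19):

```python
# Maximize f(x) = -(x - 2)**2 subject to g(x) = c - x >= 0,
# using the slack variable a so that c - x - a**2 = 0.
import sympy as sp

x, a, lam, c = sp.symbols('x a lam c', real=True)
L = -(x - 2)**2 + lam*(c - x - a**2)

focs = [sp.diff(L, v) for v in (x, a, lam)]
for sol in sp.solve(focs, (x, a, lam), dict=True):
    print(sol)
# One branch: lam = 0, x = 2, a**2 = c - 2 (constraint slack when c > 2).
# Other branch: a = 0, x = c, lam = 4 - 2*c (constraint binds when c < 2).
```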
22 Re-interpretation of the Lagrangian multiplier

Remember that $\lambda$ is the marginal cost (benefit) of tightening (relaxing) the constraint
Therefore, $\lambda = 0$ implies that relaxing the constraint brings no benefit to the objective function (= the constraint is not binding)
Graphically …
23 Re-interpretation of the Lagrangian multiplier

[Figure: a single-peaked objective function with unconstrained maximum at $x^*$]
- If the constraint lies to the right of $x^*$: relaxing it does not affect the maximum ($x^*$), and $\lambda = 0$
- If the constraint lies to the left of $x^*$: relaxing it does affect the maximum ($x^{**} \to x^{***}$), and $\lambda > 0$
24 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of One Variable

Let $y = f(x)$
A necessary (but not a sufficient) condition for a maximum is that
$f'(x) = 0$
To ensure that the point is a maximum (NOT a minimum), the second-order condition
$f''(x) < 0$
must be satisfied
25 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of One Variable

Let $y = f(x)$
Using the Taylor approximation at any point $x = a$:
$f(x) = f(a) + f'(a)(x - a) + \tfrac{1}{2} f''(a)(x - a)^2 + \text{higher-order terms}$
Assuming that the higher-order terms are zero, we have
$f(x) = f(a) + f'(a)(x - a) + \tfrac{1}{2} f''(a)(x - a)^2$
26 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of One Variable

Thus, when the second-order condition $f''(a) \leq 0$ holds, we have
$f(x) \leq f(a) + f'(a)(x - a)$
This indicates that the function is always below (or on) any tangent to it
[Figure: such a function has a concave shape to the origin]
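The "below any tangent" property is easy to check numerically; a small sketch (my example, using $f(x) = \ln x$, which is concave):

```python
# A concave function lies on or below the tangent drawn at any point a.
import numpy as np

f = np.log                     # ln(x) is concave
fprime = lambda x: 1.0 / x
a = 2.0                        # arbitrary tangency point
xs = np.linspace(0.5, 5.0, 200)
tangent = f(a) + fprime(a) * (xs - a)
print(np.all(f(xs) <= tangent + 1e-12))  # True everywhere on the grid
```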
27 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of One Variable

The functions that have a concave shape to the origin are called concave functions
Therefore, functions that obey the second-order condition are concave functions
28 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

Suppose that $y = f(x_1, x_2)$
First-order conditions for a maximum are
$f_1 = 0$ and $f_2 = 0$
These conditions ensure $dy = f_1 \, dx_1 + f_2 \, dx_2 = 0$
29 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

To ensure that the point is a maximum (NOT a minimum), $f_1$ and $f_2$ must be diminishing for movements in any direction at the critical point
$d^2y < 0$ at the critical point (second-order condition)
30 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

To see the SOCs mathematically, let's consider the total differential of $y$:
$dy = f_1 \, dx_1 + f_2 \, dx_2$
The differential of that function is
$d^2y = f_{11} \, dx_1^2 + f_{12} \, dx_2 \, dx_1 + f_{21} \, dx_1 \, dx_2 + f_{22} \, dx_2^2$
By Young's theorem, $f_{12} = f_{21}$ and
$d^2y = f_{11} \, dx_1^2 + 2 f_{12} \, dx_1 \, dx_2 + f_{22} \, dx_2^2$
31 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

To satisfy $d^2y < 0$, $f_{11}$ and $f_{22}$ must be negative:
- When $dx_2 = 0$, $d^2y = f_{11} \, dx_1^2$
- Since $dx_1^2 > 0$, $d^2y < 0$ requires $f_{11} < 0$
An identical argument can be made for $f_{22}$ by setting $dx_1 = 0$
32 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

Then, let's consider the general case where $dx_1$ and $dx_2$ can be zero or non-zero values
To satisfy $d^2y < 0$, we must have:
$f_{11} < 0$, $f_{22} < 0$, and $f_{11} f_{22} - f_{12}^2 > 0$
Proof:
$d^2y = f_{11} \, dx_1^2 + 2 f_{12} \, dx_1 \, dx_2 + f_{22} \, dx_2^2$
33 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

Proof (cont.): completing the square,
$d^2y = f_{11} \left( dx_1 + \frac{f_{12}}{f_{11}} dx_2 \right)^2 + \frac{f_{11} f_{22} - f_{12}^2}{f_{11}} \, dx_2^2$
34 SOCs and Their Meaning:
- Unconstrained Problems
- Functions of Two Variables

Proof (cont.): with $f_{11} < 0$, the first term is non-positive
Therefore, if $f_{11} f_{22} - f_{12}^2 > 0$, the second term is negative and $d^2y < 0$ for any $dx_1$ and $dx_2$
Thus, $f_{11} < 0$, $f_{22} < 0$, and $f_{11} f_{22} - f_{12}^2 > 0$ together ensure a maximum
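These conditions are mechanical to verify; a sympy sketch with an assumed example, $f(x_1, x_2) = -x_1^2 + x_1 x_2 - x_2^2$:

```python
# Check the two-variable SOCs: f11 < 0, f22 < 0, f11*f22 - f12**2 > 0.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = -x1**2 + x1*x2 - x2**2

f11 = sp.diff(f, x1, 2)
f22 = sp.diff(f, x2, 2)
f12 = sp.diff(f, x1, x2)
print(f11, f22, f11*f22 - f12**2)
# -2 -2 3: both own second partials are negative and the determinant
# condition is positive, so d2y < 0 and the critical point is a maximum
```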
35 SOCs and Their Meaning:
- Constrained Problems
- Functions of Two Variables

Suppose we want to choose $x_1$ and $x_2$ to maximize
$y = f(x_1, x_2)$
subject to the linear constraint
$c - b_1 x_1 - b_2 x_2 = 0$
We can set up the Lagrangian:
$\mathcal{L} = f(x_1, x_2) + \lambda (c - b_1 x_1 - b_2 x_2)$
36 SOCs and Their Meaning:
- Constrained Problems
- Functions of Two Variables

The first-order conditions are
$f_1 - \lambda b_1 = 0$, $f_2 - \lambda b_2 = 0$, $c - b_1 x_1 - b_2 x_2 = 0$
To ensure we have a maximum, the second-order condition must be satisfied:
$d^2y = f_{11} \, dx_1^2 + 2 f_{12} \, dx_1 \, dx_2 + f_{22} \, dx_2^2 < 0$
37 SOCs and Their Meaning:
- Constrained Problems
- Functions of Two Variables

In addition, in the case of constrained maximization problems, $dx_1$ and $dx_2$ must change while satisfying the constraint
To illustrate the relationship between $dx_1$ and $dx_2$, we use the total differential of the constraint:
$-b_1 \, dx_1 - b_2 \, dx_2 = 0$, so $dx_2 = -\frac{b_1}{b_2} dx_1$
This gives the allowable relative changes in $x_1$ and $x_2$
38 SOCs and Their Meaning:
- Constrained Problems
- Functions of Two Variables

Because the FOCs imply that $b_1/b_2 = f_1/f_2$, we get
$dx_2 = -\frac{f_1}{f_2} dx_1$
Substituting this equality into the SOC, we get
$d^2y = f_{11} \, dx_1^2 - 2 f_{12} \frac{f_1}{f_2} dx_1^2 + f_{22} \frac{f_1^2}{f_2^2} dx_1^2 < 0$
39 SOCs and Their Meaning:
- Constrained Problems
- Functions of Two Variables

Combining terms and rearranging, we get
$d^2y = \left( f_{11} f_2^2 - 2 f_{12} f_1 f_2 + f_{22} f_1^2 \right) \frac{dx_1^2}{f_2^2}$
Therefore, it must be true that
$f_{11} f_2^2 - 2 f_{12} f_1 f_2 + f_{22} f_1^2 < 0$
40 SOCs and Their Meaning:
- Constrained Problems
- Functions of Two Variables

This condition characterizes a set of functions termed quasi-concave functions
Thus, a constrained maximization problem can be solved by FOCs if
- the second-order condition ($f_{11} f_2^2 - 2 f_{12} f_1 f_2 + f_{22} f_1^2 < 0$) holds
- that is, if the objective function is a quasi-concave function
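A sympy sketch of the quasi-concavity condition, using $f(x_1, x_2) = x_1 x_2$ (my example; it is quasi-concave on the positive orthant but not concave):

```python
# Evaluate f11*f2**2 - 2*f12*f1*f2 + f22*f1**2 for f = x1*x2.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
f = x1 * x2
f1, f2 = sp.diff(f, x1), sp.diff(f, x2)
f11, f22 = sp.diff(f, x1, 2), sp.diff(f, x2, 2)
f12 = sp.diff(f, x1, x2)

print(sp.simplify(f11*f2**2 - 2*f12*f1*f2 + f22*f1**2))
# -2*x1*x2: negative for x1, x2 > 0, so f is quasi-concave there,
# even though f11*f22 - f12**2 = -1 < 0 means f is not concave
```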
41 Key points

Unconstrained problems: maximizing $y = f(x_1, x_2)$
- FOCs: $f_1 = f_2 = 0$ ($dy = 0$)
- SOCs: $f_{11} < 0$, $f_{22} < 0$, and $f_{11} f_{22} - f_{12}^2 > 0$
  → (strictly) concave function
Constrained problems: maximizing $y = f(x_1, x_2)$ subject to a constraint
- FOCs: $f_1 = \lambda b_1$, $f_2 = \lambda b_2$ ($dy = 0$) & the constraint
- SOCs: $f_{11} f_2^2 - 2 f_{12} f_1 f_2 + f_{22} f_1^2 < 0$
  → (strictly) quasi-concave function
42 SOCs and Their Meaning:
- Functions of K Variables

In the case of functions of $k$ ($k > 2$) variables, the second-order conditions are derived by using matrix algebra
In particular, the Hessian matrix for unconstrained problems and the bordered Hessian matrix for constrained problems are used
43 SOCs and Their Meaning:
- Functions of K Variables

Since this needs advanced mathematics and is somewhat difficult, we skip the many-variables case
- If you are interested in it, see E2.1 (pp. 83-85) of the textbook
As in the case of two-variable functions, k-variable functions that obey the SOCs are concave (or quasi-concave) functions
44 Concave and Quasi-concave
Functions of Two Variables

What is the difference between concave and quasi-concave functions?
The following figure is an example of a concave (and quasi-concave) function
45 Concave and Quasi-concave
Functions of Two Variables

[Figure: a dome-shaped surface, also shown looking from straight above]
- You will find a solution (maximum) regardless of constraints
- Without constraints, the maximum can be solved by FOCs (the tangency rule) → concave
- With a constraint, the solution depends on the constraint but FOCs can be applied → quasi-concave
46 Concave and Quasi-concave
Functions of Two Variables

The figure below is an example of a convex (non-concave) and quasi-concave function
47 Concave and Quasi-concave
Functions of Two Variables

[Figure: a convex (non-concave) but quasi-concave surface, also shown looking from straight above]
- This function has an interior solution (maximum) if there is a constraint
- Without constraints, the maximum cannot be solved by FOCs → not concave
- With a constraint, FOCs can be applied → quasi-concave
48 Concave and Quasi-concave
Functions of Two Variables

Important points:
- All concave functions are quasi-concave functions
- But not all quasi-concave functions are concave functions
- The maximum value of a concave function can be found by FOCs
- When a constraint exists, the maximum of a convex function can be found by FOCs if the function is a quasi-concave function
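A sympy sketch contrasting the two properties, with $g = (x_1 x_2)^2$ as an assumed example of a quasi-concave but non-concave function:

```python
# g = (x1*x2)**2 fails the concavity test but passes the
# quasi-concavity test on the positive orthant.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
g = (x1 * x2) ** 2
g1, g2 = sp.diff(g, x1), sp.diff(g, x2)
g11, g22 = sp.diff(g, x1, 2), sp.diff(g, x2, 2)
g12 = sp.diff(g, x1, x2)

print(sp.simplify(g11*g22 - g12**2))   # -12*x1**2*x2**2 < 0: not concave
print(sp.simplify(g11*g2**2 - 2*g12*g1*g2 + g22*g1**2))
# -16*x1**4*x2**4 < 0: quasi-concave
```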
49 Homogeneous Functions

A function $f(x_1, x_2, \ldots, x_k)$ is said to be homogeneous of degree $m$ if
$f(t x_1, t x_2, \ldots, t x_k) = t^m f(x_1, x_2, \ldots, x_k)$
- When $m = 1$, a $t$-fold increase in all of its arguments increases the value of the function by $t$-fold
- When $m = 0$, a $t$-fold increase in all of its arguments leaves the value of the function unchanged
50 Homogeneous Functions

If a function is homogeneous of degree $m$, the partial derivatives of the function will be homogeneous of degree $m - 1$
Proof: differentiate both sides of
$f(t x_1, t x_2, \ldots, t x_k) = t^m f(x_1, x_2, \ldots, x_k)$
with respect to $x_1$:
$t \, f_1(t x_1, t x_2, \ldots, t x_k) = t^m f_1(x_1, x_2, \ldots, x_k)$
Dividing both sides by $t$:
$f_1(t x_1, t x_2, \ldots, t x_k) = t^{m-1} f_1(x_1, x_2, \ldots, x_k)$
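A sympy check (my example: a Cobb-Douglas function, homogeneous of degree $m = 1$):

```python
# The partial of a degree-1 homogeneous function is homogeneous of degree 0.
import sympy as sp

x1, x2, t = sp.symbols('x1 x2 t', positive=True)
f = x1**sp.Rational(1, 3) * x2**sp.Rational(2, 3)  # degree m = 1

f1 = sp.diff(f, x1)
scaled = f1.subs({x1: t*x1, x2: t*x2}, simultaneous=True)
print(sp.simplify(scaled / f1))  # 1, i.e. t**(m-1) = t**0
```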
51 Integration (Anti-derivative)

Integration is the inverse of differentiation:
- let $F(x)$ be the integral of $f(x)$
- then $f(x)$ is the derivative of $F(x)$: $F'(x) = f(x)$
We denote an integral as
$F(x) = \int f(x) \, dx$
52 Integration (Anti-derivative)

Example: if $f(x) = x$, then
$F(x) = \int x \, dx = \frac{x^2}{2} + C$
$C$ is an arbitrary constant of integration
How to calculate an anti-derivative:
- There is no explicit formula/rule
- By guesswork, using differentiation to check your answer
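A minimal sympy illustration of the "differentiate to check" idea (note that sympy omits the constant $C$):

```python
# Integrate f(x) = x, then differentiate to confirm the anti-derivative.
import sympy as sp

x = sp.Symbol('x')
F = sp.integrate(x, x)
print(F)              # x**2/2 (the arbitrary constant C is omitted)
print(sp.diff(F, x))  # x: differentiation recovers the original f(x)
```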
53 Definite Integrals

We can also use integration to sum up the area under a function over some defined interval:
$\int_a^b f(x) \, dx = F(b) - F(a)$
This is called a definite integral
54 Definite Integrals

[Figure: the definite integral $\int_a^b f(x) \, dx$ is the area under $f(x)$ between $x = a$ and $x = b$]
55 Differentiating a Definite Integral

A definite integral with fixed limits has a constant value
- its derivative is zero:
$\frac{d}{dx} \int_a^b f(t) \, dt = 0$
56 Differentiating a Definite Integral

Changing the upper bound of integration will change the value of a definite integral:
$\frac{d}{dx} \int_a^x f(t) \, dt = f(x)$
57 Differentiating a Definite Integral

Suppose we want to integrate $f(x, y)$ with respect to $x$: $\int_a^b f(x, y) \, dx$
How will this be affected by changes in $y$?
$\frac{d}{dy} \int_a^b f(x, y) \, dx = \int_a^b f_y(x, y) \, dx$
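A sympy sketch covering all three cases on slides 55-57, with $f(t) = t^2$ as an assumed example:

```python
# Differentiating definite integrals: fixed limits, a variable upper
# limit, and a parameter under the integral sign.
import sympy as sp

x, t, a, b, y = sp.symbols('x t a b y')

print(sp.diff(sp.integrate(t**2, (t, a, b)), x))    # 0: a constant
print(sp.diff(sp.integrate(t**2, (t, a, x)), x))    # x**2, i.e. f(x)
print(sp.diff(sp.integrate(y*t**2, (t, a, b)), y))  # -a**3/3 + b**3/3
print(sp.integrate(sp.diff(y*t**2, y), (t, a, b)))  # the same value:
# differentiating under the integral sign gives an identical answer
```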
