Mathematical Knowledge

1 Functions of a Single Variable
• First derivative:
$$f'(x) = \frac{df}{dx}.$$
• Second derivative:
$$f''(x) = \frac{d^2 f}{dx^2}.$$
It gives the rate at which the slope of f changes. It is thus related to the curvature of the function f.
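As a quick numerical illustration (a minimal sketch: the example function $f(x) = x^3$ and the step size h are my own choices, not from the notes), sympy computes both derivatives, and a finite difference approximates $f'$:

```python
import sympy as sp

x = sp.symbols('x')
f = x**3                       # example function (my choice, not from the notes)

fp = sp.diff(f, x)             # f'(x)  = 3*x**2: the slope of f
fpp = sp.diff(f, x, 2)         # f''(x) = 6*x: how fast the slope changes (curvature)

# finite-difference check of f'(2): [f(2+h) - f(2)]/h -> f'(2) as h -> 0
h = 1e-6
approx = (f.subs(x, 2 + h) - f.subs(x, 2)) / h
print(fp.subs(x, 2), fpp.subs(x, 2), approx)   # 12, 12, ~12
```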
• Rules of differentiation
— For constants $\alpha$:
$$\frac{d}{dx}\,\alpha = 0,$$
— For sums:
$$\frac{d}{dx}\,[f(x) \pm g(x)] = f'(x) \pm g'(x),$$
— Power rule:
$$\frac{d}{dx}\,(\alpha x^n) = n\alpha x^{n-1},$$
— Product rule:
$$\frac{d}{dx}\,[f(x)g(x)] = f'(x)g(x) + f(x)g'(x),$$
— Quotient rule:
$$\frac{d}{dx}\left[\frac{f(x)}{g(x)}\right] = \frac{f'(x)g(x) - f(x)g'(x)}{[g(x)]^2},$$
— Chain rule:
$$\frac{d}{dx}\,[f(g(x))] = f'(g(x))\,g'(x).$$
• Examples: differentiate the following.
1. $y = 5x^{-2}$
2. $f(x) = 2x^2 + 3x + 4$
3. $g(x) = 3x - 1$
4. $(f(x)g(x))'$?
5. $\left(\frac{f(x)}{g(x)}\right)'$?
6. $(f(g(x)))'$?
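A short sympy sketch that works through these examples (the functions are exactly those listed above; the values in the comments are what sympy returns):

```python
import sympy as sp

x = sp.symbols('x')
y = 5*x**(-2)
f = 2*x**2 + 3*x + 4
g = 3*x - 1

print(sp.diff(y, x))                         # -10/x**3           (power rule)
print(sp.diff(f, x))                         # 4*x + 3            (sum and power rules)
print(sp.diff(g, x))                         # 3
print(sp.expand(sp.diff(f*g, x)))            # 18*x**2 + 14*x + 9 (product rule)
print(sp.simplify(sp.diff(f/g, x)))          # quotient rule
print(sp.expand(sp.diff(f.subs(x, g), x)))   # 36*x - 3           (chain rule)
```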
• For a twice continuously differentiable function f, the following statements are equivalent:
1. f is concave.
2. $f''(x) \le 0 \; \forall x$.
• Moreover, if $f''(x) < 0 \; \forall x$, then f is strictly concave.
• Examples: are the following functions concave or convex?
1. $f(x) = 3x^2 - 2x + 5$
2. $g(x) = -5x^3 + 9x + 3$
3. $h(x) = 3x - 3$
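A small sympy check of these three functions via the sign of the second derivative (a sketch; the conclusions in the comments follow directly from the criterion above):

```python
import sympy as sp

x = sp.symbols('x')
funcs = {
    'f': 3*x**2 - 2*x + 5,    # f''(x) = 6 > 0 everywhere: strictly convex
    'g': -5*x**3 + 9*x + 3,   # g''(x) = -30*x: concave for x > 0, convex for x < 0
    'h': 3*x - 3,             # h''(x) = 0: (weakly) both concave and convex
}
for name, expr in funcs.items():
    print(name, sp.diff(expr, x, 2))
```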
2 Functions of Several Variables

• For a function $y = f(x_1, x_2)$, the partial derivatives are
$$\frac{\partial f}{\partial x_i} \quad \text{for } i = 1, 2.$$
• The total differential is
$$dy = \frac{\partial f}{\partial x_1}\, dx_1 + \frac{\partial f}{\partial x_2}\, dx_2.$$
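A sympy sketch of the partial derivatives and the total differential; the example function $f(x_1, x_2) = x_1^2 x_2$ is my own choice, not from the notes:

```python
import sympy as sp

x1, x2, dx1, dx2 = sp.symbols('x1 x2 dx1 dx2')
f = x1**2 * x2                 # example function (assumption, not from the notes)

f1 = sp.diff(f, x1)            # partial derivative w.r.t. x1: 2*x1*x2
f2 = sp.diff(f, x2)            # partial derivative w.r.t. x2: x1**2

dy = f1*dx1 + f2*dx2           # total differential
print(f1, f2, dy)
```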
3 Optimization
• Function $y = f(x)$ is differentiable.
• The function achieves a local maximum (respectively global maximum) at $x^*$ if $f(x^*) \ge f(x)$ for all x in some neighborhood of $x^*$ (respectively for all x).
• The function achieves a local minimum (respectively global minimum) at $\tilde{x}$ if $f(\tilde{x}) \le f(x)$ for all x in some neighborhood of $\tilde{x}$ (respectively for all x).
• Necessary conditions for local interior optima in the single-variable case: f(x) is twice continuously differentiable. It reaches a local interior
1. maximum at $\tilde{x}$ $\Rightarrow$ $f'(\tilde{x}) = 0$ (FOC) and $f''(\tilde{x}) \le 0$ (SOC);
2. minimum at $\tilde{x}$ $\Rightarrow$ $f'(\tilde{x}) = 0$ (FOC) and $f''(\tilde{x}) \ge 0$ (SOC).
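A sympy sketch applying the FOC and SOC; the example function $f(x) = -(x-1)^2 + 4$ is my own choice, not from the notes:

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = -(x - 1)**2 + 4            # example function (assumption, not from the notes)

candidates = sp.solve(sp.Eq(sp.diff(f, x), 0), x)   # FOC: f'(x) = 0
for xc in candidates:
    soc = sp.diff(f, x, 2).subs(x, xc)              # sign of f'' at the candidate
    print(xc, soc)   # x = 1, f'' = -2 <= 0: consistent with a local maximum
```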
4 Constrained optimization
• Consider the following problem:
$$\max_{x_1, x_2} f(x_1, x_2) \quad \text{subject to } g(x_1, x_2) = 0.$$
— Solve the constraint for $x_2$ as a function of $x_1$, say $x_2 = \tilde{g}(x_1)$, and substitute into the objective to obtain an unconstrained problem in $x_1$ alone.
— $x_1^*$ is then defined by the first-order condition
$$\frac{d}{dx_1} f(x_1, \tilde{g}(x_1)) = 0,$$
— and $x_2^*$ by
$$x_2^* = \tilde{g}(x_1^*).$$
• Example:
$$\min_{K, L}\; wL + rK \quad \text{subject to } Q = L^{\alpha} K^{\beta}.$$
— The constraint $Q = L^{\alpha} K^{\beta}$ becomes $L = Q^{1/\alpha} K^{-\beta/\alpha}$.
— Unconstrained minimization:
$$\min_{K}\; w Q^{1/\alpha} K^{-\beta/\alpha} + rK.$$
— FOC:
$$\frac{dC}{dK} = -\frac{\beta}{\alpha}\, w Q^{1/\alpha} K^{-\beta/\alpha - 1} + r = 0.$$
— SOC:
$$\frac{d^2 C}{dK^2} = \frac{\beta}{\alpha}\left(\frac{\beta}{\alpha} + 1\right) w Q^{1/\alpha} K^{-\beta/\alpha - 2} \ge 0.$$
— Solving the FOC for K gives
$$K^* = \left[\frac{w}{r}\,\frac{\beta}{\alpha}\, Q^{1/\alpha}\right]^{\frac{\alpha}{\alpha+\beta}}.$$
— Substituting $K^*$ into $L = Q^{1/\alpha} K^{-\beta/\alpha}$ gives
$$L^* = \left[\frac{r}{w}\,\frac{\alpha}{\beta}\, Q^{1/\beta}\right]^{\frac{\beta}{\alpha+\beta}}.$$
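The substitution argument can also be checked symbolically. The sketch below verifies, at illustrative parameter values of my own choosing, that the closed forms for $K^*$ and $L^*$ above satisfy the FOC and the constraint substitution, and that the SOC holds:

```python
import sympy as sp

w, r, Q, K, a, b = sp.symbols('w r Q K alpha beta', positive=True)

L = Q**(1/a) * K**(-b/a)             # constraint Q = L^alpha K^beta solved for L
C = w*L + r*K                        # cost after substituting L out: C depends on K only

# closed forms stated above
K_star = ((w/r)*(b/a)*Q**(1/a))**(a/(a + b))
L_star = ((r/w)*(a/b)*Q**(1/b))**(b/(a + b))

# illustrative parameter values (my own choice, not from the notes)
vals = {a: sp.Rational(1, 2), b: sp.Rational(1, 2), w: 2, r: 3, Q: 10}
Kv = K_star.subs(vals)

print(sp.simplify(sp.diff(C, K).subs(vals).subs(K, Kv)))          # 0: FOC holds at K*
print(sp.simplify(L_star.subs(vals) - L.subs(vals).subs(K, Kv)))  # 0: L* = Q^(1/alpha) K*^(-beta/alpha)
print(sp.diff(C, K, 2).subs(vals).subs(K, Kv) > 0)                # True: SOC holds, so K* is a minimum
```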