Manecon Module 3 Notes
Linear Demand, Elasticity, and Revenue

- When -∞ < E < -1, then MR > 0.
- When E = -1, then MR = 0.
- When -1 < E < 0, then MR < 0.

Perfectly Elastic and Inelastic Demand

Demand and Marginal Revenue

- Cross-price elasticity is important for firms selling multiple products.
  o Price changes for one product impact demand for other products.
- If 𝐸𝑄𝑥𝑑,𝑃𝑦 > 0, then X and Y are substitutes.
- If 𝐸𝑄𝑥𝑑,𝑃𝑦 < 0, then X and Y are complements.
- Assessing the overall change in revenue from a price change for one good when a firm sells two goods:

  ∆𝑅 = [𝑅𝑥(1 + 𝐸𝑄𝑥𝑑,𝑃𝑥) + 𝑅𝑦𝐸𝑄𝑦𝑑,𝑃𝑥] × %∆𝑃𝑥

E.g.,
- Suppose it is estimated that the cross-price elasticity of demand between clothing and food is -0.18. If the price of food is projected to increase by 10%, by how much will demand for clothing change?

  -0.18 = %∆𝑄𝐶𝑙𝑜𝑡ℎ𝑖𝑛𝑔𝑑 / 10 ⇒ %∆𝑄𝐶𝑙𝑜𝑡ℎ𝑖𝑛𝑔𝑑 = -1.8

  Demand for clothing will fall by 1.8%.

Elasticities for Linear Demand Functions

- From a linear demand function, we can easily compute various elasticities.
- Given a linear demand function:

  𝑄𝑥𝑑 = 𝑎0 + 𝑎𝑥𝑃𝑥 + 𝑎𝑦𝑃𝑦 + 𝑎𝑀𝑀 + 𝑎𝐻𝑃𝐻

  o Own price elasticity: 𝑎𝑥(𝑃𝑥/𝑄𝑥𝑑)
  o Cross-price elasticity: 𝑎𝑦(𝑃𝑦/𝑄𝑥𝑑)
  o Income elasticity: 𝑎𝑀(𝑀/𝑄𝑥𝑑)

E.g.,

  𝑄𝑥𝑑 = 100 − 3𝑃𝑥 + 4𝑃𝑦 − 0.01𝑀 + 2𝑃𝐴𝑥

- Suppose good X sells at $25 a pair, good Y sells at $35, the company utilizes 50 units of advertising, and average consumer income is $20,000. Calculate the own price, cross-price, and income elasticities of demand.

  𝑄𝑥𝑑 = 100 − 3($25) + 4($35) − 0.01($20,000) + 2(50) = 65 units

- Own price elasticity: −3(25/65) = −1.15
- Cross-price elasticity: 4(35/65) = 2.15
- Income elasticity: −0.01(20,000/65) = −3.08

- How does one obtain information on the demand function?
  o Published studies
  o Hire a consultant
  o A statistical technique called regression analysis, using data on quantity, price, income, and other important variables.
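As a quick arithmetic check, the elasticity example above can be reproduced in a few lines of Python. The function name `linear_demand` is just an illustrative label; the coefficients and values are the ones in the notes.

```python
def linear_demand(Px, Py, M, A):
    """Estimated demand from the notes: Qxd = 100 - 3*Px + 4*Py - 0.01*M + 2*A."""
    return 100 - 3 * Px + 4 * Py - 0.01 * M + 2 * A

Px, Py, M, A = 25, 35, 20_000, 50
Q = linear_demand(Px, Py, M, A)   # 65 units

own_price = -3 * Px / Q           # a_x * (Px / Qxd)
cross_price = 4 * Py / Q          # a_y * (Py / Qxd)
income = -0.01 * M / Q            # a_M * (M / Qxd)

print(Q, round(own_price, 2), round(cross_price, 2), round(income, 2))
```

Each elasticity is simply the demand coefficient scaled by the ratio of the relevant variable to the quantity demanded, matching the hand calculations above.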
E.g.,
- An analyst for a major apparel company estimates that, for its raincoats, the elasticity of demand with respect to rainfall is 𝐸𝑄𝑥𝑑,𝑅 = 𝛽𝑅 = 3. So,

  𝐸𝑄𝑥𝑑,𝑅 = %∆𝑄𝑥𝑑 / %∆𝑅 ⇒ 3 = %∆𝑄𝑥𝑑 / 10

  A 10% increase in rainfall will lead to a 30% increase in the demand for raincoats.

Regression Analysis

- True (or population) regression model:

  𝑌 = 𝑎 + 𝑏𝑋 + 𝑒

- Least squares regression line:

  𝑌 = 𝑎̂ + 𝑏̂𝑋

  o 𝑎̂ is the least squares estimate of the unknown parameter 𝑎.
  o 𝑏̂ is the least squares estimate of the unknown parameter 𝑏.
- The parameter estimates 𝑎̂ and 𝑏̂ represent the values of 𝑎 and 𝑏 that result in the smallest sum
of squared errors between the line and the actual data.

MANECON – MODULE 3: QUANTITATIVE DEMAND ANALYSIS
Professor: Prof. Cecilia Flores
Transcribed by: Tyrone Villena

Evaluating Statistical Significance

- Standard error
  o A measure of how much each parameter estimate varies in regressions based on the same true demand model using different data.
- 95 percent confidence interval rule of thumb:
  o 𝑎̂ ± 2𝜎𝑎̂
  o 𝑏̂ ± 2𝜎𝑏̂
- t-statistic rule of thumb:
  o When |𝑡| > 2, we are 95 percent confident the true parameter in the regression is not zero.
  o When the t-statistic is large, we are confident the parameter is not zero; the standard error is small relative to the absolute value of the parameter estimate.
  o 𝑡𝑎̂ = |6.69| > 2, so the intercept is statistically different from zero.
  o 𝑡𝑏̂ = |−4.89| > 2, so the slope coefficient is statistically different from zero.
- P-values are a much more precise measure of statistical significance.
  o A p-value of 0.0012 means there is only a 12 in 10,000 chance of obtaining an estimate at least as big as −2.6 in absolute value if the true coefficient is actually zero.
  o A p-value of 0.05 means the estimated coefficient is statistically significant at the 5% level.

Evaluating the Overall Fit of the Regression Line

- R-square
  o Also called the coefficient of determination.
  o The fraction of the total variation in the dependent variable that is explained by the regression.

  𝑅² = Explained Variation / Total Variation = 𝑆𝑆𝑅𝑒𝑔𝑟𝑒𝑠𝑠𝑖𝑜𝑛 / 𝑆𝑆𝑇𝑜𝑡𝑎𝑙

  o Ranges between 0 and 1.
    ▪ Values closer to 1 indicate "better" fit.
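The least squares and R-square ideas above can be made concrete with a short numeric sketch. The data set here is invented purely for illustration; the formulas are the standard least squares ones.

```python
# Least squares fit and R-square on a small invented data set.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# b_hat = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b_hat = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
a_hat = y_bar - b_hat * x_bar        # the fitted line passes through the means

fitted = [a_hat + b_hat * x for x in xs]

# R^2 = explained variation / total variation
ss_regression = sum((f - y_bar) ** 2 for f in fitted)
ss_total = sum((y - y_bar) ** 2 for y in ys)
r_square = ss_regression / ss_total

print(round(a_hat, 3), round(b_hat, 3), round(r_square, 4))
```

Because these points lie almost exactly on a line, the computed R-square is close to 1, i.e. nearly all of the variation in Y is explained by the regression.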
- Adjusted R-square
  o A version of the R-square that penalizes researchers for having few degrees of freedom.

  𝑅̄² = 1 − (1 − 𝑅²) × (𝑛 − 1)/(𝑛 − 𝑘)
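Plugging numbers into the adjusted R-square formula shows the penalty at work. The values of R², n, and k below are invented for illustration only.

```python
def adjusted_r_square(r_square, n, k):
    """R_bar^2 = 1 - (1 - R^2) * (n - 1) / (n - k)."""
    return 1 - (1 - r_square) * (n - 1) / (n - k)

# The same R^2 looks worse when estimated from fewer observations:
print(adjusted_r_square(0.90, n=50, k=2))   # ≈ 0.8979
print(adjusted_r_square(0.90, n=10, k=2))   # ≈ 0.8875
```

With few observations relative to the number of estimated parameters k, (n − 1)/(n − k) grows, so the adjustment pulls the reported fit down.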
- The F-statistic
  o A measure of the total variation explained by the regression relative to the total unexplained variation.
    ▪ The greater the F-statistic, the better the overall regression fit.
    ▪ Equivalently, the p-value associated with the F-statistic is another measure of overall fit.
      • Lower p-values are associated with better overall regression fit.
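The F-statistic above can be sketched numerically. This assumes the usual degrees-of-freedom convention (k estimated parameters including the intercept, n observations), which the notes do not spell out, and the sums of squares are invented for illustration.

```python
def f_statistic(ss_regression, ss_error, n, k):
    """F = (SS_Regression / (k - 1)) / (SS_Error / (n - k)).

    Explained variation per regression degree of freedom relative to
    unexplained variation per residual degree of freedom.
    """
    return (ss_regression / (k - 1)) / (ss_error / (n - k))

# Most of the variation is explained, so F comes out large:
F = f_statistic(ss_regression=39.6, ss_error=0.11, n=5, k=2)
print(F)
```

A large F (equivalently, a small p-value for it) indicates that the regression explains far more variation than it leaves unexplained.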