HANDBOOK OF
APPLIED ECONOMETRICS
AND STATISTICAL INFERENCE
STATISTICS: Textbooks and Monographs

D. B. Owen, Founding Editor, 1972-1991

1. The Generalized Jackknife Statistic, H. L. Gray and W. R. Schucany
2. Multivariate Analysis, Anant M. Kshirsagar
3. Statistics and Society, Walter T. Federer
4. Multivariate Analysis: A Selected and Abstracted Bibliography, 1957-1972, Kocherlakota Subrahmaniam and Kathleen Subrahmaniam
5. Design of Experiments: A Realistic Approach, Virgil L. Anderson and Robert A. McLean
6. Statistical and Mathematical Aspects of Pollution Problems, John W. Pratt
7. Introduction to Probability and Statistics (in two parts), Part I: Probability; Part II: Statistics, Narayan C. Giri
8. Statistical Theory of the Analysis of Experimental Designs, J. Ogawa
9. Statistical Techniques in Simulation (in two parts), Jack P. C. Kleijnen
10. Data Quality Control and Editing, Joseph I. Naus
11. Cost of Living Index Numbers: Practice, Precision, and Theory, Kali S. Banerjee
12. Weighing Designs: For Chemistry, Medicine, Economics, Operations Research, Statistics, Kali S. Banerjee
13. The Search for Oil: Some Statistical Methods and Techniques, edited by D. B. Owen
14. Sample Size Choice: Charts for Experiments with Linear Models, Robert E. Odeh and Martin Fox
15. Statistical Methods for Engineers and Scientists, Robert M. Bethea, Benjamin S. Duran, and Thomas L. Boullion
16. Statistical Quality Control Methods, Irving W. Burr
17. On the History of Statistics and Probability, edited by D. B. Owen
18. Econometrics, Peter Schmidt
19. Sufficient Statistics: Selected Contributions, Vasant S. Huzurbazar (edited by Anant M. Kshirsagar)
20. Handbook of Statistical Distributions, Jagdish K. Patel, C. H. Kapadia, and D. B. Owen
21. Case Studies in Sample Design, A. C. Rosander
22. Pocket Book of Statistical Tables, compiled by R. E. Odeh, D. B. Owen, Z. W. Birnbaum, and L. Fisher
23. The Information in Contingency Tables, D. V. Gokhale and Solomon Kullback
24. Statistical Analysis of Reliability and Life-Testing Models: Theory and Methods, Lee J. Bain
25. Elementary Statistical Quality Control, Irving W. Burr
26. An Introduction to Probability and Statistics Using BASIC, Richard A. Groeneveld
27. Basic Applied Statistics, B. L. Raktoe and J. J. Hubert
28. A Primer in Probability, Kathleen Subrahmaniam
29. Random Processes: A First Look, R. Syski
30. Regression Methods: A Tool for Data Analysis, Rudolf J. Freund and Paul D. Minton
31. Randomization Tests, Eugene S. Edgington
32. Tables for Normal Tolerance Limits, Sampling Plans and Screening, Robert E. Odeh and D. B. Owen
33. Statistical Computing, William J. Kennedy, Jr., and James E. Gentle
34. Regression Analysis and Its Application: A Data-Oriented Approach, Richard F. Gunst and Robert L. Mason
35. Scientific Strategies to Save Your Life, I. D. J. Bross
36. Statistics in the Pharmaceutical Industry, edited by C. Ralph Buncher and Jia-Yeong Tsay
37. Sampling from a Finite Population, J. Hajek
38. Statistical Modeling Techniques, S. S. Shapiro and A. J. Gross
39. Statistical Theory and Inference in Research, T. A. Bancroft and C.-P. Han
40. Handbook of the Normal Distribution, Jagdish K. Patel and Campbell B. Read
41. Recent Advances in Regression Methods, Hrishikesh D. Vinod and Aman Ullah
42. Acceptance Sampling in Quality Control, Edward G. Schilling
43. The Randomized Clinical Trial and Therapeutic Decisions, edited by Niels Tygstrup, John M. Lachin, and Erik Juhl
44. Regression Analysis of Survival Data in Cancer Chemotherapy, Walter H. Carter, Jr., Galen L. Wampler, and Donald M. Stablein
45. A Course in Linear Models, Anant M. Kshirsagar
46. Clinical Trials: Issues and Approaches, edited by Stanley H. Shapiro and Thomas H. Louis
47. Statistical Analysis of DNA Sequence Data, edited by B. S. Weir
48. Nonlinear Regression Modeling: A Unified Practical Approach, David A. Ratkowsky
49. Attribute Sampling Plans, Tables of Tests and Confidence Limits for Proportions, Robert E. Odeh and D. B. Owen
50. Experimental Design, Statistical Models, and Genetic Statistics, edited by Klaus Hinkelmann
51. Statistical Methods for Cancer Studies, edited by Richard G. Cornell
52. Practical Statistical Sampling for Auditors, Arthur J. Wilburn
53. Statistical Methods for Cancer Studies, edited by Edward J. Wegman and James G. Smith
54. Self-Organizing Methods in Modeling: GMDH Type Algorithms, edited by Stanley J. Farlow
55. Applied Factorial and Fractional Designs, Robert A. McLean and Virgil L. Anderson
56. Design of Experiments: Ranking and Selection, edited by Thomas J. Santner and Ajit C. Tamhane
57. Statistical Methods for Engineers and Scientists: Second Edition, Revised and Expanded, Robert M. Bethea, Benjamin S. Duran, and Thomas L. Boullion
58. Ensemble Modeling: Inference from Small-Scale Properties to Large-Scale Systems, Alan E. Gelfand and Crayton C. Walker
59. Computer Modeling for Business and Industry, Bruce L. Bowerman and Richard T. O'Connell
60. Bayesian Analysis of Linear Models, Lyle D. Broemeling
61. Methodological Issues for Health Care Surveys, Brenda Cox and Steven Cohen
62. Applied Regression Analysis and Experimental Design, Richard J. Brook and Gregory C. Arnold
63. Statpal: A Statistical Package for Microcomputers-PC-DOS Version for the IBM PC and Compatibles, Bruce J. Chalmer and David G. Whitmore
64. Statpal: A Statistical Package for Microcomputers-Apple Version for the II, II+, and IIe, David G. Whitmore and Bruce J. Chalmer
65. Nonparametric Statistical Inference: Second Edition, Revised and Expanded, Jean Dickinson Gibbons
66. Design and Analysis of Experiments, Roger G. Petersen
67. Statistical Methods for Pharmaceutical Research Planning, Sten W. Bergman and John C. Gittins
68. Goodness-of-Fit Techniques, edited by Ralph B. D'Agostino and Michael A. Stephens
69. Statistical Methods in Discrimination Litigation, edited by D. H. Kaye and Mikel Aickin
70. Truncated and Censored Samples from Normal Populations, Helmut Schneider
71. Robust Inference, M. L. Tiku, W. Y. Tan, and N. Balakrishnan
72. Statistical Image Processing and Graphics, edited by Edward J. Wegman and Douglas J. DePriest
73. Assignment Methods in Combinatorial Data Analysis, Lawrence J. Hubert
74. Econometrics and Structural Change, Lyle D. Broemeling and Hiroki Tsurumi
75. Multivariate Interpretation of Clinical Laboratory Data, Adelin Albert and Eugene K. Harris
76. Statistical Tools for Simulation Practitioners, Jack P. C. Kleijnen
77. Randomization Tests: Second Edition, Eugene S. Edgington
78. A Folio of Distributions: A Collection of Theoretical Quantile-Quantile Plots, Edward B. Fowlkes
79. Applied Categorical Data Analysis, Daniel H. Freeman, Jr.
80. Seemingly Unrelated Regression Equations Models: Estimation and Inference, Virendra K. Srivastava and David E. A. Giles
81. Response Surfaces: Designs and Analyses, André I. Khuri and John A. Cornell
82. Nonlinear Parameter Estimation: An Integrated System in BASIC, John C. Nash and Mary Walker-Smith
83. Cancer Modeling, edited by James R. Thompson and Barry W. Brown
84. Mixture Models: Inference and Applications to Clustering, Geoffrey J. McLachlan and Kaye E. Basford
85. Randomized Response: Theory and Techniques, Arijit Chaudhuri and Rahul Mukerjee
86. Biopharmaceutical Statistics for Drug Development, edited by Karl E. Peace
87. Parts per Million Values for Estimating Quality Levels, Robert E. Odeh and D. B. Owen
88. Lognormal Distributions: Theory and Applications, edited by Edwin L. Crow and Kunio Shimizu
89. Properties of Estimators for the Gamma Distribution, K. O. Bowman and L. R. Shenton
90. Spline Smoothing and Nonparametric Regression, Randall L. Eubank
91. Linear Least Squares Computations, R. W. Farebrother
92. Exploring Statistics, Damaraju Raghavarao
93. Applied Time Series Analysis for Business and Economic Forecasting, Sufi M. Nazem
94. Bayesian Analysis of Time Series and Dynamic Models, edited by James C. Spall
95. The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Raj S. Chhikara and J. Leroy Folks
96. Parameter Estimation in Reliability and Life Span Models, A. Clifford Cohen and Betty Jones Whitten
97. Pooled Cross-Sectional and Time Series Data Analysis, Terry E. Dielman
98. Random Processes: A First Look, Second Edition, Revised and Expanded, R. Syski
99. Generalized Poisson Distributions: Properties and Applications, P. C. Consul
100. Nonlinear Lp-Norm Estimation, René Gonin and Arthur H. Money
101. Model Discrimination for Nonlinear Regression Models, Dale S. Borowiak
102. Applied Regression Analysis in Econometrics, Howard E. Doran
103. Continued Fractions in Statistical Applications, K. O. Bowman and L. R. Shenton
104. Statistical Methodology in the Pharmaceutical Sciences, Donald A. Berry
105. Experimental Design in Biotechnology, Perry D. Haaland
106. Statistical Issues in Drug Research and Development, edited by Karl E. Peace
107. Handbook of Nonlinear Regression Models, David A. Ratkowsky
108. Robust Regression: Analysis and Applications, edited by Kenneth D. Lawrence and Jeffrey L. Arthur
109. Statistical Design and Analysis of Industrial Experiments, edited by Subir Ghosh
110. U-Statistics: Theory and Practice, A. J. Lee
111. A Primer in Probability: Second Edition, Revised and Expanded, Kathleen Subrahmaniam
112. Data Quality Control: Theory and Pragmatics, edited by Gunar E. Liepins and V. R. R. Uppuluri
113. Engineering Quality by Design: Interpreting the Taguchi Approach, Thomas B. Barker
114. Survivorship Analysis for Clinical Studies, Eugene K. Harris and Adelin Albert
115. Statistical Analysis of Reliability and Life-Testing Models: Second Edition, Lee J. Bain and Max Engelhardt
116. Stochastic Models of Carcinogenesis, Wai-Yuan Tan
117. Statistics and Society: Data Collection and Interpretation, Second Edition, Revised and Expanded, Walter T. Federer
118. Handbook of Sequential Analysis, B. K. Ghosh and P. K. Sen
119. Truncated and Censored Samples: Theory and Applications, A. Clifford Cohen
120. Survey Sampling Principles, E. K. Foreman
121. Applied Engineering Statistics, Robert M. Bethea and R. Russell Rhinehart
122. Sample Size Choice: Charts for Experiments with Linear Models: Second Edition, Robert E. Odeh and Martin Fox
123. Handbook of the Logistic Distribution, edited by N. Balakrishnan
124. Fundamentals of Biostatistical Inference, Chap T. Le
125. Correspondence Analysis Handbook, J.-P. Benzécri
126. Quadratic Forms in Random Variables: Theory and Applications, A. M. Mathai and Serge B. Provost
127. Confidence Intervals on Variance Components, Richard K. Burdick and Franklin A. Graybill
128. Biopharmaceutical Sequential Statistical Applications, edited by Karl E. Peace
129. Item Response Theory: Parameter Estimation Techniques, Frank B. Baker
130. Survey Sampling: Theory and Methods, Arijit Chaudhuri and Horst Stenger
131. Nonparametric Statistical Inference: Third Edition, Revised and Expanded, Jean Dickinson Gibbons and Subhabrata Chakraborti
132. Bivariate Discrete Distributions, Subrahmaniam Kocherlakota and Kathleen Kocherlakota
133. Design and Analysis of Bioavailability and Bioequivalence Studies, Shein-Chung Chow and Jen-pei Liu
134. Multiple Comparisons, Selection, and Applications in Biometry, edited by Fred M. Hoppe
135. Cross-Over Experiments: Design, Analysis, and Application, David A. Ratkowsky, Marc A. Evans, and J. Richard Alldredge
136. Introduction to Probability and Statistics: Second Edition, Revised and Expanded, Narayan C. Giri
137. Applied Analysis of Variance in Behavioral Science, edited by Lynne K. Edwards
138. Drug Safety Assessment in Clinical Trials, edited by Gene S. Gilbert
139. Design of Experiments: A No-Name Approach, Thomas J. Lorenzen and Virgil L. Anderson
140. Statistics in the Pharmaceutical Industry: Second Edition, Revised and Expanded, edited by C. Ralph Buncher and Jia-Yeong Tsay
141. Advanced Linear Models: Theory and Applications, Song-Gui Wang and Shein-Chung Chow
142. Multistage Selection and Ranking Procedures: Second-Order Asymptotics, Nitis Mukhopadhyay and Tumulesh K. S. Solanky
143. Statistical Design and Analysis in Pharmaceutical Science: Validation, Process Controls, and Stability, Shein-Chung Chow and Jen-pei Liu
144. Statistical Methods for Engineers and Scientists: Third Edition, Revised and Expanded, Robert M. Bethea, Benjamin S. Duran, and Thomas L. Boullion
145. Growth Curves, Anant M. Kshirsagar and William Boyce Smith
146. Statistical Bases of Reference Values in Laboratory Medicine, Eugene K. Harris and James C. Boyd
147. Randomization Tests: Third Edition, Revised and Expanded, Eugene S. Edgington
148. Practical Sampling Techniques: Second Edition, Revised and Expanded, Ranjan K. Som
149. Multivariate Statistical Analysis, Narayan C. Giri
150. Handbook of the Normal Distribution: Second Edition, Revised and Expanded, Jagdish K. Patel and Campbell B. Read
151. Bayesian Biostatistics, edited by Donald A. Berry and Dalene K. Stangl
152. Response Surfaces: Designs and Analyses, Second Edition, Revised and Expanded, André I. Khuri and John A. Cornell
153. Statistics of Quality, edited by Subir Ghosh, William R. Schucany, and William B. Smith
154. Linear and Nonlinear Models for the Analysis of Repeated Measurements, Edward F. Vonesh and Vernon M. Chinchilli
155. Handbook of Applied Economic Statistics, Aman Ullah and David E. A. Giles
156. Improving Efficiency by Shrinkage: The James-Stein and Ridge Regression Estimators, Marvin H. J. Gruber
157. Nonparametric Regression and Spline Smoothing: Second Edition, Randall L. Eubank
158. Asymptotics, Nonparametrics, and Time Series, edited by Subir Ghosh
159. Multivariate Analysis, Design of Experiments, and Survey Sampling, edited by Subir Ghosh
160. Statistical Process Monitoring and Control, edited by Sung H. Park and G. Geoffrey Vining
161. Statistics for the 21st Century: Methodologies for Applications of the Future, edited by C. R. Rao and Gábor J. Székely
162. Probability and Statistical Inference, Nitis Mukhopadhyay
163. Handbook of Stochastic Analysis and Applications, edited by D. Kannan and V. Lakshmikantham
164. Testing for Normality, Henry C. Thode, Jr.
165. Handbook of Applied Econometrics and Statistical Inference, edited by Aman Ullah, Alan T. K. Wan, and Anoop Chaturvedi

Additional Volumes in Preparation

Visualizing Statistical Models and Concepts, R. W. Farebrother and Michael Schyns
HANDBOOK OF
APPLIED ECONOMETRICS
AND STATISTICAL INFERENCE

EDITED BY

AMAN ULLAH
University of California, Riverside
Riverside, California

ALAN T. K. WAN
City University of Hong Kong
Kowloon, Hong Kong

ANOOP CHATURVEDI
University of Allahabad
Allahabad, India

MARCEL DEKKER, INC.
NEW YORK · BASEL
ISBN: 0-8247-0652-8
This book is printed on acid-free paper

Headquarters
Marcel Dekker, Inc.
270 Madison Avenue, New York, NY 10016
tel: 212-696-9000; fax: 212-685-4540

Eastern Hemisphere Distribution


Marcel Dekker AG
Hutgasse 4, Postfach 812, CH-4001 Basel, Switzerland
tel: 41-61-261-8482; fax: 41-61-261-8896

World Wide Web


http://www.dekker.com

The publisher offers discounts on this book when ordered in bulk quantities.
For more information, write to Special Sales/Professional Marketing at the
headquarters address above.

Copyright © 2002 by Marcel Dekker, Inc. All Rights Reserved.

Neither this book nor any part may be reproduced or transmitted in any
form or by any means, electronic or mechanical, including photocopying,
microfilming, and recording, or by any information storage and retrieval
system, without permission in writing from the publisher.

Current printing (last digit):
10 9 8 7 6 5 4 3 2 1

PRINTED IN THE UNITED STATES OF AMERICA


To the memory of Viren K. Srivastava
Prolific researcher,
Stimulating teacher,
Dear friend
Preface

This Handbook contains thirty-one chapters by distinguished econometricians
and statisticians from many countries. It is dedicated to the memory of
Professor Viren K. Srivastava, a profound and innovative contributor in the
fields of econometrics and statistical inference. Viren Srivastava was most
recently a Professor and Chairman in the Department of Statistics at
Lucknow University, India. He had taught at Banaras Hindu University
and had been a visiting professor or scholar at various universities, including
Western Ontario, Concordia, Monash, Australian National, New South
Wales, Canterbury, and Munich. During his distinguished career, he published
more than 150 research papers in various areas of statistics and
econometrics (a selected list is provided). His most influential contributions
are in finite sample theory of structural models and improved methods of
estimation in linear models. These contributions have provided a new direction
not only in econometrics and statistics but also in other areas of applied
sciences. Moreover, his work on seemingly unrelated regression models,
particularly his book Seemingly Unrelated Regression Equations Models:
Estimation and Inference, coauthored with David Giles (Marcel Dekker,

Inc., 1987), has laid the foundation of much subsequent work in this area.
Several topics included in this volume are directly or indirectly influenced by
his work.
In recent years there have been many major developments associated
with the interface between applied econometrics and statistical inference.
This is true especially for censored models, panel data models, time series
econometrics, Bayesian inference, and distribution theory. The common
ground at the interface between statistics and econometrics is of considerable
importance for researchers, practitioners, and students of both subjects,
and it is also of direct interest to those working in other areas of applied
sciences. The crucial importance of this interface has been reflected in several
ways. For example, this was part of the motivation for the establishment
of the journal Econometric Theory (Cambridge University Press); the
Handbook of Statistics series (North-Holland), especially Vol. 11; the North-
Holland publication Handbook of Econometrics, Vol. I-IV, where the
emphasis is on econometric methodology; and the recent Handbook of
Applied Economic Statistics (Marcel Dekker, Inc.), which contains contributions
from applied economists and econometricians. However, there
remains a considerable range of material and recent research results that
are of direct interest to both of the groups under discussion here, but are
scattered throughout the separate literatures.
This Handbook aims to disseminate significant research results in econometrics
and statistics. It is a consolidated and comprehensive reference
source for researchers and students whose work takes them to the interface
between these two disciplines. This may lead to more collaborative research
between members of the two disciplines. The major recent developments in
both the applied econometrics and statistical inference techniques that have
been covered are of direct interest to researchers, practitioners, and graduate
students, not only in econometrics and statistics but in other applied
fields such as medicine, engineering, sociology, and psychology. The book
incorporates reasonably comprehensive and up-to-date reviews of recent
developments in various key areas of applied econometrics and statistical
inference, and it also contains chapters that set the scene for future research
in these areas. The emphasis has been on research contributions with accessibility
to practitioners and graduate students.
The thirty-one chapters contained in this Handbook have been divided
into seven major parts, viz., Statistical Inference and Sample Design,
Nonparametric Estimation and Testing, Hypothesis Testing, Pretest and
Biased Estimation, Time Series Analysis, Estimation and Inference in
Econometric Models, and Applied Econometrics. Part I consists of five
chapters dealing with issues related to parametric inference procedures
and sample design. In Chapter 1, Barry Arnold, Enrique Castillo, and
José María Sarabia give a thorough overview of the available results on
Bayesian inference using conditionally specified priors. Some guidelines
are given for choosing the appropriate values for the priors' hyperparameters,
and the results are elaborated with the aid of a numerical example.
Helge Toutenburg, Andreas Fieger, and Burkhard Schaffrin, in Chapter 2,
consider minimax estimation of regression coefficients in a linear regression
model and obtain a confidence ellipsoid based on the minimax estimator.
Chapter 3, by Pawel Pordzik and Götz Trenkler, derives necessary and
sufficient conditions for the best linear unbiased estimator of the linear
parametric function of a general linear model, and characterizes the subspace
of linear parametric functions which can be estimated with full efficiency.
In Chapter 4, Ahmad Parsian and Syed Kirmani extend the concepts
of unbiased estimation, invariant estimation, Bayes and minimax estimation
for the estimation problem under the asymmetric LINEX loss function.
These concepts are applied in the estimation of some specific probability
models. Subir Ghosh, in Chapter 5, gives an overview of a wide array of
issues relating to the design and implementation of sample surveys over
time, and utilizes a particular survey application as an illustration of the
ideas.
The four chapters of Part II are concerned with nonparametric estimation
and testing methodologies. Ibrahim Ahmad in Chapter 6 looks at the
problem of estimating the density, distribution, and regression functions
nonparametrically when one gets only randomized responses. Several
asymptotic properties, including weak, strong, uniform, mean square, integrated
mean square, and absolute error consistencies as well as asymptotic
normality, are considered in each estimation case. Multinomial choice models
are the theme of Chapter 7, in which Jeff Racine proposes a new
approach to the estimation of these models that avoids the specification
of a known index function, which can be problematic in certain cases.
Radhey Singh and Xuewen Lu in Chapter 8 consider a censored nonparametric
additive regression model, which admits continuous and categorical
variables in an additive manner. The concepts of marginal integration and
local linear fits are extended to nonparametric regression analysis with censoring
to estimate the low dimensional components in an additive model. In
Chapter 9, Mezbahur Rahman and Aman Ullah consider a combined parametric
and nonparametric regression model, which improves both the (pure)
parametric and nonparametric approaches in the sense that the combined
procedure is less biased than the parametric approach while simultaneously
reducing the magnitude of the variance that results from the nonparametric
approach. Small sample performance of the estimators is examined via a
Monte Carlo experiment.
In Part III, the problems related to hypothesis testing are addressed in
three chapters. Anil Bera and Aurobindo Ghosh in Chapter 10 give a comprehensive
survey of the developments in the theory of Neyman's smooth
test with an emphasis on its merits, and put the case for the inclusion of this
test in mainstream econometrics. Chapter 11 by Bill Farebrother outlines
several methods for evaluating probabilities associated with the distribution
of a quadratic form in normal variables and illustrates the proposed technique
in obtaining the critical values of the lower and upper bounds of the
Durbin-Watson statistic. It is well known that the Wald test for autocorrelation
does not always have the most desirable properties in finite samples
owing to such problems as the lack of invariance to equivalent formulations
of the null hypothesis, local biasedness, and power nonmonotonicity. In
Chapter 12, to overcome these problems, Max King and Kim-Leng Goh
consider the use of bootstrap methods to find more appropriate critical
values and modifications to the asymptotic covariance matrix of the estimates
used in the test statistic. In Chapter 13, Jan Magnus studies the
sensitivity properties of a "t-type" statistic based on a normal random
variable with zero mean and nonscalar covariance matrix. A simple expression
for the even moments of this t-type random variable is given, as are the
conditions for the moments to exist.
Part IV presents a collection of papers relevant to pretest and biased
estimation. In Chapter 14, David Giles considers pretest and Bayes estimation
of the normal location parameter with the loss structure given by a
"reflected normal" penalty function, which has the particular merit of being
bounded. In Chapter 15, Akio Namba and Kazuhiro Ohtani consider a
linear regression model with multivariate t errors and derive the finite sample
moments and predictive mean squared error of a pretest double k-class
estimator of the regression coefficients. Shalabh, in Chapter 16, considers a
linear regression model with trended explanatory variable using three different
formulations for the trend, viz., linear, quadratic, and exponential,
and studies large sample properties of the least squares and Stein-rule estimators.
Emphasizing a model involving orthogonality of explanatory variables
and the noise component, Ron Mittelhammer and George Judge in
Chapter 17 demonstrate a semiparametric empirical likelihood data based
information theoretic (ELDBIT) estimator that has finite sample properties
superior to those of the traditional competing estimators. The ELDBIT
estimator exhibits robustness with respect to ill-conditioning implied by
highly correlated covariates and sample outcomes from nonnormal,
thicker-tailed sampling processes. Some possible extensions of the
ELDBIT formulations have also been outlined.
Time series analysis forms the subject matter of Part V. Judith Giles in
Chapter 18 proposes tests for two-step noncausality in a trivariate
VAR model when the information set contains variables that are not
directly involved in the test. An issue that often arises in the approximation
of an ARMA process by a pure AR process is the lack of appraisal of the
quality of the approximation. John Galbraith and Victoria Zinde-Walsh
address this issue in Chapter 19, emphasizing the Hilbert distance as a
measure of the approximation's accuracy. Chapter 20 by Anoop
Chaturvedi, Alan Wan, and Guohua Zou adds to the sparse literature on
Bayesian inference on dynamic regression models, with allowance for the
possible existence of nonnormal errors through the Gram-Charlier distribution.
Robust in-sample volatility analysis is the substance of the contribution
of Chapter 21, in which Xavier Yew, Michael McAleer, and Shiqing
Ling examine the sensitivity of the estimated parameters of the GARCH
and asymmetric GARCH models through recursive estimation to determine
the optimal window size. In Chapter 22, Koichi Maekawa and Hiroyuki
Hisamatsu consider a nonstationary SUR system and investigate the asymptotic
distributions of OLS and the restricted and unrestricted SUR estimators.
A cointegration test based on the SUR residuals is also proposed.
Part VI comprises five chapters focusing on estimation and inference of
econometric models. In Chapter 23, Gordon Fisher and Marcel-Christian
Voia consider the estimation of stochastic coefficients regression (SCR)
models with missing observations. Among other things, the authors present
a new geometric proof of an extended Gauss-Markov theorem. In estimating
hazard functions, the negative exponential regression model is commonly
used, but previous results on estimators for this model have been
mostly asymptotic. Along the lines of their other ongoing research in this
area, John Knight and Stephen Satchell, in Chapter 24, derive some exact
properties for the log-linear least squares and maximum likelihood estimators
for a negative exponential model with a constant and a dummy variable.
Minimum variance unbiased estimators are also developed. In Chapter
25, Murray Smith examines various aspects of double-hurdle models, which
are used frequently in demand analysis. Smith presents a thorough review of
the current state of the art on this subject, and advocates the use of the
copula method as the preferred technique for constructing these models.
Rick Vinod in Chapter 26 discusses how the popular techniques of generalized
linear models and generalized estimating equations in biometrics can be
utilized in econometrics in the estimation of panel data models. Indeed,
Vinod's paper spells out the crucial importance of the interface between
econometrics and other areas of statistics. This section concludes with
Chapter 27 in which William Griffiths, Chris Skeels, and Duangkamon
Chotikapanich take up the important issue of sample size requirement in
the estimation of SUR models. One broad conclusion that can be drawn
from this paper is that the usually stated sample size requirements often
understate the actual requirement.
The last part includes four chapters focusing on applied econometrics.
The panel data model is the substance of Chapter 28, in which Aman Ullah
and Kusum Mundra study the so-called immigrants' home-link effect on
U.S. producer trade flows via a semiparametric estimator which the authors
introduce. Human development is an important issue faced by many developing
countries. Having been at the forefront of this line of research,
Anirudh Nagar, in Chapter 29, along with Sudip Basu, considers estimation
of human development indices and investigates the factors in determining
human development. A comprehensive survey of the recent
developments of structural auction models is presented in Chapter 30, in
which Samita Sareen emphasizes the usefulness of Bayesian methods in the
estimation and testing of these models. Markov-switching models are often
used in business cycle research. In Chapter 31, Baldev Raj provides a thorough
review of the theoretical knowledge on this subject. Raj's extensive
survey includes analysis of the Markov-switching approach and generalizations
to a multivariate setup with some empirical results being presented.
Needless to say, in preparing this Handbook, we owe a great debt to the
authors of the chapters for their marvelous cooperation. Thanks are also
due to the authors, who were not only devoted to their task of writing
exceedingly high quality papers but had also been willing to sacrifice
much time and energy to review other chapters of the volume. In this
respect, we would like to thank John Galbraith, David Giles, George
Judge, Max King, John Knight, Shiqing Ling, Koichi Maekawa, Jan
Magnus, Ron Mittelhammer, Kazuhiro Ohtani, Jeff Racine, Radhey
Singh, Chris Skeels, Murray Smith, Rick Vinod, Victoria Zinde-Walsh,
and Guohua Zou. Also, Chris Carter (Hong Kong University of Science
and Technology), Hikaru Hasegawa (Hokkaido University), Wai-Keong Li
(City University of Hong Kong), and Nilanjana Roy (University of
Victoria) have refereed several papers in the volume. Acknowledged also
is the financial support for visiting appointments for Aman Ullah and
Anoop Chaturvedi at the City University of Hong Kong during the summer
of 1999 when the idea of bringing together the topics of this Handbook was
first conceived. We also wish to thank Russell Dekker and Jennifer Paizzi of
Marcel Dekker, Inc., for their assistance and patience with us in the process
of preparing this Handbook, and Carolina Juarez and Alec Chan for secretarial
and clerical support.

Aman Ullah
Alan T. K. Wan
Anoop Chaturvedi

Contents

Preface v
Contributors xv
Selected Publications of V. K. Srivastava xix

Part 1 Statistical Inference and Sample Design

1. Bayesian Inference Using Conditionally Specified Priors 1
   Barry C. Arnold, Enrique Castillo, and José María Sarabia
2. Approximate Confidence Regions for Minimax-Linear Estimators 27
   Helge Toutenburg, A. Fieger, and Burkhard Schaffrin
3. On Efficiently Estimable Parametric Functionals in the General Linear Model with Nuisance Parameters 45
   Pawel R. Pordzik and Götz Trenkler
4. Estimation under LINEX Loss Function 53
   Ahmad Parsian and S. N. U. A. Kirmani
5. Design of Sample Surveys across Time 77
   Subir Ghosh

Part 2 Nonparametric Estimation and Testing

6. Kernel Estimation in a Continuous Randomized Response Model 97
   Ibrahim A. Ahmad
7. Index-Free, Density-Based Multinomial Choice 115
   Jeffrey S. Racine
8. Censored Additive Regression Models 143
   R. S. Singh and Xuewen Lu
9. Improved Combined Parametric and Nonparametric Regressions: Estimation and Hypothesis Testing 159
   Mezbahur Rahman and Aman Ullah

Part 3 Hypothesis Testing

10. Neyman's Smooth Test and Its Applications in Econometrics 177
    Anil K. Bera and Aurobindo Ghosh
11. Computing the Distribution of a Quadratic Form in Normal Variables 231
    R. W. Farebrother
12. Improvements to the Wald Test 251
    Maxwell L. King and Kim-Leng Goh
13. On the Sensitivity of the t-Statistic 277
    Jan R. Magnus

Part 4 Pretest and Biased Estimation

14. Preliminary-Test and Bayes Estimation of a Location Parameter under "Reflected Normal" Loss 287
    David E. A. Giles
15. MSE Performance of the Double k-Class Estimator of Each Individual Regression Coefficient under Multivariate t-Errors 305
    Akio Namba and Kazuhiro Ohtani
16. Effects of a Trended Regressor on the Efficiency Properties of the Least-Squares and Stein-Rule Estimation of Regression Coefficients 327
    Shalabh
17. Endogeneity and Biased Estimation under Squared Error Loss 347
    Ron C. Mittelhammer and George G. Judge

Part 5 Time Series Analysis

18. Testing for Two-step Granger Noncausality in Trivariate VAR Models 371
    Judith A. Giles
19. Measurement of the Quality of Autoregressive Approximation, with Econometric Applications 401
    John W. Galbraith and Victoria Zinde-Walsh
20. Bayesian Inference of a Dynamic Linear Model with Edgeworth Series Disturbances 423
    Anoop Chaturvedi, Alan T. K. Wan, and Guohua Zou
21. Determining an Optimal Window Size for Modeling Volatility 443
    Xavier Chee Hoong Yew, Michael McAleer, and Shiqing Ling
22. SUR Models with Integrated Regressors 469
    Koichi Maekawa and Hiroyuki Hisamatsu

Part 6 Estimation and Inference in Econometric Models

23. Estimating Systems of Stochastic Coefficients Regressions When Some of the Observations Are Missing 491
    Gordon Fisher and Marcel-Christian Voia
24. Efficiency Considerations in the Negative Exponential Failure Time Model 513
    John L. Knight and Stephen E. Satchell
25. On Specifying Double-Hurdle Models 535
    Murray D. Smith
26. Econometric Applications of Generalized Estimating Equations for Panel Data and Extensions to Inference 553
    H. D. Vinod
27. Sample Size Requirements for Estimation in SUR Models 575
    William E. Griffiths, Christopher L. Skeels, and Duangkamon Chotikapanich

Part 7 Applied Econometrics

28. Semiparametric Panel Data Estimation: An Application to Immigrants' Homelink Effect on U.S. Producer Trade Flows 591
    Aman Ullah and Kusum Mundra
29. Weighting Socioeconomic Indicators of Human Development: A Latent Variable Approach 609
    A. L. Nagar and Sudip Ranjan Basu
30. A Survey of Recent Work on Identification, Estimation, and Testing of Structural Auction Models 643
    Samita Sareen
31. Asymmetry of Business Cycles: The Markov-Switching Approach 687
    Baldev Raj

Index 711
Contributors

Ibrahim A. Ahmad  Department of Statistics, University of Central Florida, Orlando, Florida
Barry C. Arnold  Department of Statistics, University of California, Riverside, Riverside, California
Sudip Ranjan Basu  National Institute of Public Finance and Policy, New Delhi, India
Anil K. Bera  Department of Economics, University of Illinois at Urbana-Champaign, Champaign, Illinois
Enrique Castillo  Department of Applied Mathematics and Computational Sciences, Universities of Cantabria and Castilla-La Mancha, Santander, Spain
Anoop Chaturvedi  Department of Statistics, University of Allahabad, Allahabad, India
Duangkamon Chotikapanich  School of Economics and Finance, Curtin University of Technology, Perth, Australia
R. W. Farebrother  Department of Economic Studies, Faculty of Social Studies and Law, Victoria University of Manchester, Manchester, England
A. Fieger  Service Barometer AG, Munich, Germany
Gordon Fisher  Department of Economics, Concordia University, Montreal, Quebec, Canada
John W. Galbraith  Department of Economics, McGill University, Montreal, Quebec, Canada
Aurobindo Ghosh  Department of Economics, University of Illinois at Urbana-Champaign, Champaign, Illinois
Subir Ghosh  Department of Statistics, University of California, Riverside, Riverside, California
David E. A. Giles  Department of Economics, University of Victoria, Victoria, British Columbia, Canada
Judith A. Giles  Department of Economics, University of Victoria, Victoria, British Columbia, Canada
Kim-Leng Goh  Department of Applied Statistics, Faculty of Economics and Administration, University of Malaya, Kuala Lumpur, Malaysia
William E. Griffiths  Department of Economics, University of Melbourne, Melbourne, Australia
Hiroyuki Hisamatsu  Faculty of Economics, Kagawa University, Takamatsu, Japan
George G. Judge  Department of Agricultural Economics, University of California, Berkeley, Berkeley, California
Maxwell L. King  Faculty of Business and Economics, Monash University, Clayton, Victoria, Australia
S. N. U. A. Kirmani  Department of Mathematics, University of Northern Iowa, Cedar Falls, Iowa
John L. Knight  Department of Economics, University of Western Ontario, London, Ontario, Canada
Shiqing Ling  Department of Mathematics, Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong, China
Xuewen Lu  Food Research Program, Agriculture and Agri-Food Canada, Guelph, Ontario, Canada
Koichi Maekawa  Department of Economics, Hiroshima University, Hiroshima, Japan
Jan R. Magnus  Center for Economic Research (CentER), Tilburg University, Tilburg, The Netherlands
Michael McAleer  Department of Economics, University of Western Australia, Nedlands, Perth, Western Australia, Australia
Ron C. Mittelhammer  Department of Statistics and Agricultural Economics, Washington State University, Pullman, Washington
Kusum Mundra  Department of Economics, San Diego State University, San Diego, California
A. L. Nagar  National Institute of Public Finance and Policy, New Delhi, India
Akio Namba  Faculty of Economics, Kobe University, Kobe, Japan
Kazuhiro Ohtani  Faculty of Economics, Kobe University, Kobe, Japan
Ahmad Parsian  School of Mathematical Sciences, Isfahan University of Technology, Isfahan, Iran
Pawel R. Pordzik  Department of Mathematical and Statistical Methods, Agricultural University of Poznan, Poznan, Poland
Jeffrey S. Racine  Department of Economics, University of South Florida, Tampa, Florida
Mezbahur Rahman  Department of Mathematics and Statistics, Minnesota State University, Mankato, Minnesota
Baldev Raj  School of Business and Economics, Wilfrid Laurier University, Waterloo, Ontario, Canada
José María Sarabia  Department of Economics, University of Cantabria, Santander, Spain
Samita Sareen  Financial Markets Division, Bank of Canada, Ontario, Canada
Stephen E. Satchell  Faculty of Economics and Politics, Cambridge University, Cambridge, England
Burkhard Schaffrin  Department of Civil and Environmental Engineering and Geodetic Science, Ohio State University, Columbus, Ohio
Shalabh  Department of Statistics, Panjab University, Chandigarh, India
R. S. Singh  Department of Mathematics and Statistics, University of Guelph, Guelph, Ontario, Canada
Christopher L. Skeels  Department of Statistics and Econometrics, Australian National University, Canberra, Australia
Murray D. Smith  Department of Econometrics and Business Statistics, University of Sydney, Sydney, Australia
Helge Toutenburg  Department of Statistics, Ludwig-Maximilians-Universität München, Munich, Germany
Götz Trenkler  Department of Statistics, University of Dortmund, Dortmund, Germany
Aman Ullah  Department of Economics, University of California, Riverside, Riverside, California
H. D. Vinod  Department of Economics, Fordham University, Bronx, New York
Marcel-Christian Voia  Department of Economics, Concordia University, Montreal, Quebec, Canada
Alan T. K. Wan  Department of Management Sciences, City University of Hong Kong, Hong Kong S.A.R., China
Xavier Chee Hoong Yew  Department of Economics, Trinity College, University of Oxford, Oxford, England
Victoria Zinde-Walsh  Department of Economics, McGill University, Montreal, Quebec, Canada
Guohua Zou  Institute of System Science, Chinese Academy of Sciences, Beijing, China
Selected Publications of Professor Viren
K. Srivastava

Books:
1. Seemingly Unrelated Regression Equations Models: Estimation and Inference (with D. E. A. Giles), Marcel Dekker, New York, 1987.
2. The Econometrics of Disequilibrium Models (with B. Rao), Greenwood Press, New York, 1990.

Research papers:
1. On the Estimation of Generalized Linear Probability Model Involving Discrete Random Variables (with A. R. Roy), Annals of the Institute of Statistical Mathematics, 20, 1968, 457-467.
2. The Efficiency of Estimating Seemingly Unrelated Regression Equations, Annals of the Institute of Statistical Mathematics, 22, 1970, 493.
3. Three-Stage Least-Squares and Generalized Double k-Class Estimators: A Mathematical Relationship, International Economic Review, 12, 1971, 312-316.
4. Disturbance Variance Estimation in Simultaneous Equations by k-Class Method, Annals of the Institute of Statistical Mathematics, 23, 1971, 437-449.
5. Disturbance Variance Estimation in Simultaneous Equations when Disturbances Are Small, Journal of the American Statistical Association, 67, 1972, 164-168.
6. The Bias of Generalized Double k-Class Estimators (with A. R. Roy), Annals of the Institute of Statistical Mathematics, 24, 1972, 495-508.
7. The Efficiency of an Improved Method of Estimating Seemingly Unrelated Regression Equations, Journal of Econometrics, 1, 1973, 341-350.
8. Two-Stage and Three-Stage Least Squares Estimation of Dispersion Matrix of Disturbances in Simultaneous Equations (with R. Tiwari), Annals of the Institute of Statistical Mathematics, 28, 1976, 411-428.
9. Evaluation of Expectation of Product of Stochastic Matrices (with R. Tiwari), Scandinavian Journal of Statistics, 3, 1976, 135-138.
10. Optimality of Least Squares in Seemingly Unrelated Regression Equation Models (with T. D. Dwivedi), Journal of Econometrics, 7, 1978, 391-395.
11. Large Sample Approximations in Seemingly Unrelated Regression Equations (with S. Upadhyay), Annals of the Institute of Statistical Mathematics, 30, 1978, 89-96.
12. Efficiency of Two Stage and Three Stage Least Square Estimators, Econometrica, 46, 1978, 1495-1498.
13. Estimation of Seemingly Unrelated Regression Equations: A Brief Survey (with T. D. Dwivedi), Journal of Econometrics, 8, 1979, 15-32.
14. Generalized Two Stage Least Squares Estimators for Structural Equation with Both Fixed and Random Coefficients (with B. Raj and A. Ullah), International Economic Review, 21, 1980, 61-65.
15. Estimation of Linear Single-Equation and Simultaneous Equation Model under Stochastic Linear Constraints: An Annotated Bibliography, International Statistical Review, 48, 1980, 79-82.
16. Finite Sample Properties of Ridge Estimators (with T. D. Dwivedi and R. L. Hall), Technometrics, 22, 1980, 205-212.
17. The Efficiency of Estimating a Random Coefficient Model (with B. Raj and S. Upadhyaya), Journal of Econometrics, 12, 1980, 285-299.
18. A Numerical Comparison of Exact, Large-Sample and Small-Disturbance Approximations of Properties of k-Class Estimators (with T. D. Dwivedi, M. Belinski and R. Tiwari), International Economic Review, 21, 1980, 249-252.
19. Dominance of Double k-Class Estimators in Simultaneous Equations (with B. S. Agnihotri and T. D. Dwivedi), Annals of the Institute of Statistical Mathematics, 32, 1980, 387-392.
20. Estimation of the Inverse of Mean (with S. Bhatnagar), Journal of Statistical Planning and Inference, 5, 1981, 329-334.
21. A Note on Moments of k-Class Estimators for Negative k (with A. K. Srivastava), Journal of Econometrics, 21, 1983, 257-260.
22. Estimation of Linear Regression Model with Autocorrelated Disturbances (with A. Ullah, L. Magee and A. K. Srivastava), Journal of Time Series Analysis, 4, 1983, 127-135.
23. Properties of Shrinkage Estimators in Linear Regression when Disturbances Are not Normal (with A. Ullah and R. Chandra), Journal of Econometrics, 21, 1983, 389-402.
24. Some Properties of the Distribution of an Operational Ridge Estimator (with A. Chaturvedi), Metrika, 30, 1983, 227-237.
25. On the Moments of Ordinary Least Squares and Instrumental Variables Estimators in a General Structural Equation (with G. H. Hillier and T. W. Kinal), Econometrica, 52, 1984, 185-202.
26. Exact Finite Sample Properties of a Pre-Test Estimator in Ridge Regression (with D. E. A. Giles), Australian Journal of Statistics, 26, 1984, 323-336.
27. Estimation of a Random Coefficient Model under Linear Stochastic Constraints (with B. Raj and K. Kumar), Annals of the Institute of Statistical Mathematics, 36, 1984, 395-401.
28. Exact Finite Sample Properties of Double k-Class Estimators in Simultaneous Equations (with T. D. Dwivedi), Journal of Econometrics, 23, 1984, 263-283.
29. The Sampling Distribution of Shrinkage Estimators and Their F-Ratios in the Regression Models (with A. Ullah and R. A. L. Carter), Journal of Econometrics, 25, 1984, 109-122.
30. Properties of Mixed Regression Estimator When Disturbances Are not Necessarily Normal (with R. Chandra), Journal of Statistical Planning and Inference, 11, 1985, 15-21.
31. Small-Disturbance Asymptotic Theory for Linear-Calibration Estimators (with N. Singh), Technometrics, 31, 1989, 373-378.
32. Unbiased Estimation of the MSE Matrix of Stein Rule Estimators, Confidence Ellipsoids and Hypothesis Testing (with R. A. L. Carter, M. S. Srivastava and A. Ullah), Econometric Theory, 6, 1990, 63-74.
33. An Unbiased Estimator of the Covariance Matrix of the Mixed Regression Estimator (with D. E. A. Giles), Journal of the American Statistical Association, 86, 1991, 441-444.
34. Moments of the Ratio of Quadratic Forms in Non-Normal Variables with Econometric Examples (with Aman Ullah), Journal of Econometrics, 62, 1994, 129-141.
35. Efficiency Properties of Feasible Generalized Least Squares Estimators in SURE Models under Non-Normal Disturbances (with Koichi Maekawa), Journal of Econometrics, 66, 1995, 99-121.
36. Large Sample Asymptotic Properties of the Double k-Class Estimators in Linear Regression Models (with H. D. Vinod), Econometric Reviews, 14, 1995, 75-100.
37. The Coefficient of Determination and Its Adjusted Version in Linear Regression Models (with Anil K. Srivastava and Aman Ullah), Econometric Reviews, 14, 1995, 229-240.
38. Moments of the Function of Non-Normal Vector with Application to Econometric Estimators and Test Statistics (with A. Ullah and N. Roy), Econometric Reviews, 14, 1995, 459-471.
39. The Second Order Bias and Mean Squared Error of Non-linear Estimators (with P. Rilstone and A. Ullah), Journal of Econometrics, 75, 1996, 369-395.
HANDBOOK OF
APPLIED ECONOMETRICS
AND STATISTICAL INFERENCE
1
Bayesian Inference Using Conditionally
Specified Priors
BARRY C. ARNOLD  University of California, Riverside, Riverside, California

ENRIQUE CASTILLO  Universities of Cantabria and Castilla-La Mancha, Santander, Spain

JOSÉ MARÍA SARABIA  University of Cantabria, Santander, Spain

1. INTRODUCTION
Suppose we are given a sample of size n from a normal distribution with
known variance and unknown mean μ, and that, on the basis of sample
values x₁, x₂, ..., xₙ, we wish to make inference about μ. The Bayesian
solution of this problem involves specification of an appropriate representation
of our prior beliefs about μ (summarized in a prior density for μ) which
will be adjusted by conditioning to obtain a relevant posterior density for μ
(the conditional density of μ, given X = x). Proper informative priors are
most easily justified but improper (nonintegrable) and noninformative
(locally uniform) priors are often acceptable in the analysis and may be
necessary when the informed scientist insists on some degree of ignorance
about unknown parameters. With a one-dimensional parameter, life for the
Bayesian analyst is relatively straightforward. The worst that can happen is
that the analyst will need to use numerical integration techniques to normalize
and to quantify measures of central tendency of the resulting posterior
density.
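For instance, with a conjugate N(μ₀, 1/τ₀) prior for μ and known data precision τ = 1/σ², the posterior is again normal,

    μ | x₁, ..., xₙ  ~  N( (τ₀μ₀ + nτx̄)/(τ₀ + nτ), 1/(τ₀ + nτ) ),

a precision-weighted compromise between the prior mean μ₀ and the sample mean x̄.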

Moving to higher dimensional parameter spaces immediately compli-


cates matters. The “curse of dimensionality” begins to manifest some of
its implications even in the bivariate case. The use of conditionally
specified priors, as advocated in this chapter, will not in any sense
eliminate the “curse” but it will ameliorate some difficulties in, say,
two and three dimensions and is even practical for some higher dimen-
sional problems.
Conjugate priors, that is to say, priors which combine analytically
with the likelihood to give recognizable and analytically tractable poster-
ior densities, have been and continue to be attractive. More properly,
we should perhaps speak of priors with convenient posteriors, for their
desirability hinges mostly on the form of the posterior and there is no
need to insist on the prior and posterior density being of the same form
(the usual definition of conjugacy). It turns out that the conditionally
specified priors that are discussed in this chapter are indeed conjugate
priors in the classical sense. Not only do they have convenient poster-
iors but also the posterior densities will be of the same form as the
priors. They will prove to be more flexible than the usually recom-
mended conjugate priors for multidimensional parameters, yet still man-
ageable in the sense that simulation of the posteriors is easy to program
and implement.
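As a minimal sketch of this point, consider the normal problem with both the mean μ and the precision τ unknown (taken up below) under the independence prior μ ~ N(m₀, 1/p₀), τ ~ Gamma(a, b), where the hyperparameters m₀, p₀, a, b are purely illustrative; this prior is the no-interaction special case of the conditionally specified families introduced in Section 3. Both posterior full conditionals are then standard distributions, so simulating the posterior takes only a few lines:

import numpy as np

def gibbs_normal_mean_precision(x, n_iter=5000, m0=0.0, p0=0.01, a=1.0, b=1.0, seed=0):
    # Gibbs sampler for (mu, tau) given x_1, ..., x_n ~ N(mu, 1/tau),
    # under the illustrative independence prior mu ~ N(m0, 1/p0), tau ~ Gamma(a, rate=b).
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n, xbar = x.size, x.mean()
    mu, tau = xbar, 1.0 / x.var()                      # crude starting values
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # tau | mu, x ~ Gamma(a + n/2, rate = b + 0.5*sum((x_i - mu)^2))
        rate = b + 0.5 * np.sum((x - mu) ** 2)
        tau = rng.gamma(shape=a + 0.5 * n, scale=1.0 / rate)
        # mu | tau, x ~ N((p0*m0 + n*tau*xbar)/(p0 + n*tau), 1/(p0 + n*tau))
        prec = p0 + n * tau
        mu = rng.normal((p0 * m0 + n * tau * xbar) / prec, 1.0 / np.sqrt(prec))
        draws[t] = mu, tau
    return draws

# e.g. gibbs_normal_mean_precision(data)[1000:].mean(axis=0) approximates the posterior means.

The same two-block structure survives when the prior includes interaction terms, since the full conditionals remain normal and gamma; only their conditional parameters change.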
Let us return for a moment to our original problem involving a sample
of size n from a normal distribution. This time, however, we will assume
that both the mean and the variance are unknown: a classic setting for
statistical inference. Already, in this setting, the standard conjugate prior
analysis begins to appear confining. There is a generally accepted conju-
gate prior for ( p , t) (the mean p and the precision (reciprocal of the
variance) t). It will be discussed in Section 3.1, where it will be contrasted
with a more flexible conditionally conjugate prior. A similar situation
exists for samples from a Pareto distribution with unknown inequality
and scale parameters. Here too a conditionally conjugate prior will be
compared to the usual conjugate priors. Again, increased flexibility at little
cost in complexity of analysis will be encountered. In order to discuss these
issues, a brief review of conditional specification will be useful. It will be
provided in Section 2. In Section 3 conditionally specified priors will be
introduced; the normal and Pareto distributions provide representative
examples here. Section 4 illustrates application of the conditionally speci-
fied prior technique to a number of classical inference problems. In Section
5 we will address the problem of assessing hyperparameters for condition-
ally specified priors. The closing section (Section 6) touches on the possi-
bility of obtaining even more flexibility by considering mixtures of
conditionally specified priors.

2. CONDITIONALLY SPECIFIED DISTRIBUTIONS


In efforts to describe a two-dimensional density function, it is undeniably
easier to visualize conditional densities than it is to visualize marginal den-
sities. Consequently, it may be argued that joint densities might best be
determined by postulating appropriate behavior for conditional densities.
For example, following Bhattacharyya [1], we might specify that the joint
density of a random vector (X,Y ) have every conditional density of X given
Y = y of the normal form (with mean and variance that might depend on y )
and, in addition, have every conditional density of Y given X = x of the
normal form (with mean and variance which might depend on x). In other
words, we seek all bivariate densities with bell-shaped cross sections (where
cross sections are taken parallel to the x and y axes). The class of such
densities may be represented as

    f(x, y) = exp{ (1, x, x²) A (1, y, y²)ᵀ }                                (1)

where A = (aᵢⱼ), i, j = 0, 1, 2, is a 3 × 3 matrix of parameters. Actually, a₀₀ is a norming
constant, a function of the other aᵢⱼ's chosen to make the density integrate
to 1. The class (1) of densities with normal conditionals includes, of
course, classical bivariate normal densities. But it includes other densities,
some of which are bimodal and some trimodal!
This normal conditional example is the prototype of conditionally speci-
fied models. The more general paradigm is as follows.
Consider an ℓ₁-parameter family of densities on ℝ with respect to μ₁, a
measure on ℝ (often a Lebesgue or counting measure), denoted by {f₁(x; θ) :
θ ∈ Θ} where Θ ⊆ ℝ^ℓ₁. Consider a possibly different ℓ₂-parameter family of
densities on ℝ with respect to μ₂ denoted by {f₂(y; τ) : τ ∈ T} where
T ⊆ ℝ^ℓ₂. We are interested in all possible bivariate distributions which
have all conditionals of X given Y in the family f₁ and all conditionals of
Y given X in the family f₂. Thus we demand that

    f_{X|Y}(x | y) = f₁(x; θ(y)),   x ∈ S(X), y ∈ S(Y)                       (2)

and

    f_{Y|X}(y | x) = f₂(y; τ(x)),   x ∈ S(X), y ∈ S(Y)                       (3)

Here S(X) and S(Y) denote, respectively, the sets of possible values of X
and Y.
In order that (2) and (3) should hold there must exist marginal densities
f_X and f_Y for X and Y such that

    f_Y(y) f₁(x; θ(y)) = f_X(x) f₂(y; τ(x))                                  (4)

To identify the possible bivariate densities with conditionals in the two
prescribed families, we will need to solve the functional equation (4). This is
not always possible. For some choices of f₁ and f₂ no solution exists except
for the trivial solution with independent marginals.
One important class of examples in which the functional equation (4) is
readily solvable are those in which the parametric families of densities f₁ and
f₂ are both exponential families. In this case the class of all densities with
conditionals in given exponential families is readily determined and is itself
an exponential family of densities. First, recall the definition of an exponen-
tial family.

Definition 1 (Exponential family). An ℓ₁-parameter family of densities
{f₁(x; θ) : θ ∈ Θ}, with respect to μ₁ on S(X), of the form

f₁(x; θ) = r₁(x) β₁(θ) exp{ Σ_{j=1}^{ℓ₁} θⱼ q₁ⱼ(x) }    (5)

is called an exponential family of distributions.

Here Θ is the natural parameter space and the q₁ⱼ(x)s are assumed to be
linearly independent. Frequently, μ₁ is a Lebesgue measure or counting mea-
sure and often S(X) is some subset of Euclidean space of finite dimension.

Note that Definition 1 continues to be meaningful if we underline x in
equation (5) to emphasize the fact that x can be multidimensional.
In addition to the exponential family (5), we will let {f₂(y; τ) : τ ∈ T}
denote another ℓ₂-parameter exponential family of densities with respect
to μ₂ on S(Y), of the form

f₂(y; τ) = r₂(y) β₂(τ) exp{ Σ_{j=1}^{ℓ₂} τⱼ q₂ⱼ(y) }    (6)

where T is the natural parameter space and, as is customarily done, the
q₂ⱼ(y)s are assumed to be linearly independent.
The general class of the bivariate distributions with conditionals in the
two exponential families (5) and (6) is provided by the following theorem
due to Arnold and Strauss [2].

Theorem 1 (Conditionals in exponential families). Let f(x, y) be a bivariate
density whose conditional densities satisfy

f(x|y) = f₁(x; θ(y))    (7)

and

f(y|x) = f₂(y; τ(x))    (8)

for some functions θ(y) and τ(x), where f₁ and f₂ are defined by (5) and (6).
Then f(x, y) is of the form

f(x, y) = r₁(x) r₂(y) exp{ q⁽¹⁾(x)ᵀ M q⁽²⁾(y) }    (9)

in which q⁽¹⁾(x) = (q₁₀(x), q₁₁(x), …, q_{1ℓ₁}(x))ᵀ and q⁽²⁾(y) = (q₂₀(y), q₂₁(y), …, q_{2ℓ₂}(y))ᵀ,
where q₁₀(x) = q₂₀(y) = 1 and M is a matrix of parameters of appropriate
dimensions (i.e., (ℓ₁ + 1) × (ℓ₂ + 1)) subject to the requirement that the
resulting density integrates to 1.
For convenience we can partition the matrix M so that the element m₀₀, the
first row, and the first column (the terms multiplying the constant functions q₁₀
and q₂₀) are displayed separately from the ℓ₁ × ℓ₂ submatrix M̃ of interaction
terms. Note that the case of independence is included; it corresponds to the choice
M̃ ≡ 0.

Note that densities of the form (9) form an exponential family with (ℓ₁ +
1)(ℓ₂ + 1) − 1 parameters (since m₀₀ is a normalizing constant determined by
the other mᵢⱼs).
Bhattacharyya's [1] normal conditionals density can be represented in the
form (9) by suitable identification of the qs. A second example, which will
arise again in our Bayesian analysis of normal data, involves normal-gamma
conditionals (see Castillo and Galambos [3]). Thus we, in this case, are
interested in all bivariate distributions with X|Y = y having a normal dis-
tribution for each y, and with Y|X = x having a gamma distribution for each
x. These densities will be given by (9) with the following choices for the rs
and qs: r₁(x) = r₂(y) = 1, q₁₁(x) = x, q₁₂(x) = x², q₂₁(y) = y, and q₂₂(y) = log y.
The joint density is then given by

f(x, y) ∝ exp[m₁₀x + m₂₀x² + m₁₂x log y + m₂₂x² log y
        + m₀₁y + m₀₂ log y + m₁₁xy + m₂₁x²y] I(y > 0)    (12)
Certain constraints must be placed on the mᵢⱼs, the elements of M,
appearing in (12) and more generally in (9) to ensure integrability of the
resulting density, i.e., to identify the natural parameter space. However, in a
Bayesian context, where improper priors are often acceptable, no con-
straints need be placed on the mᵢⱼs. If the joint density is given by (12),
then the specific forms of the conditional distributions are as follows:

where

and

A typical density with normal-gamma conditionals is shown in Figure 1.


It should be remarked that multiple modes can occur (as in the distribution
with normal conditionals).
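Since both conditionals of (12) are standard distributions, densities of this form are
easy to simulate with a Gibbs sampler. The sketch below is illustrative only: the
hyperparameter values are hypothetical and were chosen so that the conditional
precision, shape parameter, and intensity parameter above remain positive for every
y > 0 and every real x.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical m's satisfying the positivity constraints for a proper density (12)
m10, m20, m12, m22 = 1.0, -0.5, 0.0, 0.0
m01, m02, m11, m21 = -2.0, 2.0, 0.5, -0.5

def gibbs_normal_gamma(n_iter=5000, x=0.0, y=1.0):
    """Draw (x, y) pairs from the normal-gamma conditionals density (12)."""
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # x | y is normal
        prec = -2.0 * (m20 + m21 * y + m22 * np.log(y))
        mean = (m10 + m11 * y + m12 * np.log(y)) / prec
        x = rng.normal(mean, 1.0 / np.sqrt(prec))
        # y | x is gamma(shape, rate)
        shape = 1.0 + m02 + m12 * x + m22 * x * x
        rate = -(m01 + m11 * x + m21 * x * x)
        y = rng.gamma(shape, 1.0 / rate)
        draws[t] = (x, y)
    return draws

sample = gibbs_normal_gamma()
print("mean of x:", sample[:, 0].mean(), " mean of y:", sample[:, 1].mean())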
Another example that will be useful in a Bayesian context is one with
conditionals that are beta densities (see Arnold and Strauss [2]). This corre-
sponds to the following choices for rs and qs in (9):

r₁(x) = [x(1 − x)]⁻¹, q₁₁(x) = log x, q₁₂(x) = log(1 − x),
r₂(y) = [y(1 − y)]⁻¹, q₂₁(y) = log y, q₂₂(y) = log(1 − y)
Figure 1. Example of a normal-gamma conditionals distribution, showing (left) the
probability density function and (right) the contour plot.

This yields a joint density of the form

f(x, y) = [x(1 − x)y(1 − y)]⁻¹ exp{m₁₁ log x log y + m₁₂ log x log(1 − y)
        + m₂₁ log(1 − x) log y + m₂₂ log(1 − x) log(1 − y)
        + m₁₀ log x + m₂₀ log(1 − x)
        + m₀₁ log y + m₀₂ log(1 − y)
        + m₀₀} I(0 < x, y < 1)    (18)

In this case the constraints on the mᵢⱼs to ensure integrability of (18) are
quite simple:

m₁₀, m₂₀, m₀₁, m₀₂ > 0  and  m₁₁, m₁₂, m₂₁, m₂₂ ≤ 0    (19)
There are some non-exponential family cases in which the functional
equation (4) can be solved. For example, it is possible to identify all joint
densities with zero median Cauchy conditionals (see Arnold et al. [4]). They
are of the form

f(x, y) ∝ (m₀₀ + m₁₀x² + m₀₁y² + m₁₁x²y²)⁻¹    (20)
Extensions to higher dimensions are often possible (see Arnold et al. [5]).
Assume that X is a k-dimensional random vector with coordinates (X₁,
X₂, …, X_k). For each coordinate random variable Xᵢ of X we define the
vector X₍ᵢ₎ to be the (k − 1)-dimensional vector obtained from X by deleting
Xᵢ. We use the same convention for real vectors, i.e., x₍ᵢ₎ is obtained from x
by deleting xᵢ. We concentrate on conditional specifications of the form “Xᵢ
given X₍ᵢ₎.”
Consider k parametric families of densities on ℝ defined by

{fᵢ(x; θ⁽ⁱ⁾) : θ⁽ⁱ⁾ ∈ Θᵢ},  i = 1, 2, …, k    (21)

where θ⁽ⁱ⁾ is of dimension ℓᵢ and where the ith density is understood as being
with respect to the measure μᵢ. We are interested in k-dimensional densities
that have all their conditionals in the families (21). Consequently, we require
that for certain functions θ⁽ⁱ⁾(·) we have, for i = 1, 2, …, k,

f(xᵢ|x₍ᵢ₎) = fᵢ(xᵢ; θ⁽ⁱ⁾(x₍ᵢ₎))    (22)

If these equations are to hold, then there must exist marginal densities for
the X₍ᵢ₎s such that

f_{X₍₁₎}(x₍₁₎) f₁(x₁; θ⁽¹⁾(x₍₁₎)) = f_{X₍₂₎}(x₍₂₎) f₂(x₂; θ⁽²⁾(x₍₂₎))
        = ⋯ = f_{X₍ₖ₎}(x₍ₖ₎) f_k(x_k; θ⁽ᵏ⁾(x₍ₖ₎))    (23)

Sometimes the array of functional equations (23) will be solvable. Here,
as in two dimensions, an assumption that the families of densities (21) are
exponential families will allow for straightforward solution.
Suppose that the k families of densities f₁, f₂, …, f_k in (21) are ℓ₁, ℓ₂,
…, ℓ_k parameter exponential families of the form

fᵢ(t; θ⁽ⁱ⁾) = rᵢ(t) βᵢ(θ⁽ⁱ⁾) exp{ Σ_{j=1}^{ℓᵢ} θᵢⱼ qᵢⱼ(t) },  i = 1, 2, …, k    (24)

(here θᵢⱼ denotes the jth coordinate of θ⁽ⁱ⁾ and, by convention, qᵢ₀(t) ≡ 1, ∀i).
We wish to identify all joint distributions for X such that (22) holds with the
fᵢs defined as in (24) (i.e., with conditionals in the prescribed exponential
families).
By taking logarithms in (23) and differencing with respect to x₁, x₂, …,
x_k we may conclude that the joint density must be of the following form:

f(x₁, x₂, …, x_k) = [ ∏_{i=1}^{k} rᵢ(xᵢ) ] exp{ Σ_{j₁=0}^{ℓ₁} Σ_{j₂=0}^{ℓ₂} ⋯ Σ_{jₖ=0}^{ℓₖ}
        m_{j₁j₂…jₖ} ∏_{i=1}^{k} q_{i,jᵢ}(xᵢ) }    (25)

The dimension of the parameter space of this exponential family is
[ ∏_{i=1}^{k} (ℓᵢ + 1) ] − 1 since m_{00…0} is a function of the other m’s, chosen to ensure
that the density integrates to 1. Determination of the natural parameter
space may be very difficult and we may well be enthusiastic about accepting
non-integrable versions of (25) to avoid the necessity of identifying which m’s
correspond to proper densities [4].
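As a concrete illustration of the structure of (25), the following sketch (illustrative
only; the choice k = 3, ℓᵢ = 2, the shared q functions, and the random hyperparameter
values are all assumptions) evaluates the exponent of (25) as a tensor contraction.

import numpy as np

# Assumed setup: k = 3 coordinates, each with l_i = 2, so M has shape (3, 3, 3).
# For simplicity every coordinate uses the same illustrative q functions (1, x, x^2).
def q(x):
    """Return the vector (q_0(x), q_1(x), q_2(x)) = (1, x, x^2)."""
    return np.array([1.0, x, x * x])

rng = np.random.default_rng(1)
M = rng.normal(size=(3, 3, 3))   # hypothetical hyperparameters m_{j1 j2 j3}
M[0, 0, 0] = 0.0                 # m_{000} is fixed by normalization, not free

def log_kernel(x1, x2, x3):
    """Exponent of (25): sum over (j1, j2, j3) of m_{j1 j2 j3} q_{j1}(x1) q_{j2}(x2) q_{j3}(x3)."""
    return float(np.einsum("abc,a,b,c->", M, q(x1), q(x2), q(x3)))

print("log kernel at (0.5, -1.0, 2.0):", log_kernel(0.5, -1.0, 2.0))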
3. CONDITIONALLY SPECIFIED PRIORS


Suppose that we have data, X, whose distribution is governed by the family
of densities {f(x; θ) : θ ∈ Θ} where θ is k-dimensional (k > 1). Typically, the
informative prior used in this analysis is a member of a convenient family of
priors (often chosen to be a conjugate family).

Definition 2 (Conjugate family). A family ℱ of priors for θ is said to be a
conjugate family if any member of ℱ, when combined with the likelihood of the
data, leads to a posterior density which is again a member of ℱ.

One approach to constructing a family of conjugate priors is to consider
the possible posterior densities for θ corresponding to all possible samples of
all possible sizes from the given distribution, beginning with a locally uni-
form prior on Θ. More often than not, a parametric extension of this class is
usually considered. For example, suppose that X₁, …, Xₙ are independent
identically distributed random variables with possible values 1, 2, 3. Suppose
that

P(Xᵢ = 1) = θ₁,  P(Xᵢ = 2) = θ₂,  P(Xᵢ = 3) = 1 − θ₁ − θ₂

Beginning with a locally uniform prior for (θ₁, θ₂) and considering all
possible samples of all possible sizes leads to a family of Dirichlet(α₁, α₂,
α₃) posteriors where α₁, α₂, α₃ are positive integers. This could be used as a
conjugate prior family but it is more natural to use the augmented family of
Dirichlet densities with α₁, α₂, α₃ ∈ ℝ⁺.
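A quick numerical illustration of this conjugate updating (the prior parameters and
the data counts below are hypothetical) is as follows.

import numpy as np

# Hypothetical Dirichlet prior for (theta1, theta2, theta3) and observed counts
alpha_prior = np.array([2.0, 2.0, 2.0])
counts = np.array([14, 5, 11])        # numbers of observations equal to 1, 2, 3

# Conjugacy: the posterior is again Dirichlet, with parameters alpha + counts
alpha_post = alpha_prior + counts
posterior_mean = alpha_post / alpha_post.sum()

print("posterior parameters:", alpha_post)
print("posterior mean of (theta1, theta2, theta3):", posterior_mean)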
If we apply this approach to i.i.d. normal random variables with
unknown mean μ and unknown precision (the reciprocal of the variance)
τ, we are led to a conjugate prior family of the following form (see, e.g.,
deGroot [6]):

f(μ, τ) ∝ exp(a log τ + bτ + cμτ + dμ²τ)    (26)

where a > 0, b < 0, c ∈ ℝ, d < 0. Densities such as (26) have a gamma
marginal density for τ and a conditional distribution for μ, given τ, that is
normal with precision depending on τ.
It is not clear why we should force our prior to accommodate to the
particular kind of dependence exhibited by (26). In this case, if τ were
known, it would be natural to use a normal conjugate prior for μ. If μ
were known, the appropriate conjugate prior for τ would be a gamma
density. Citing ease of assessment arguments, Arnold and Press [7] advo-
cated use of independent normal and gamma priors for μ and τ (when both
are unknown). They thus proposed prior densities of the form
f(μ, τ) ∝ exp(a log τ + bτ + cμ + dμ²)    (27)

where a > 0, b < 0, c ∈ ℝ, d < 0.
The family (27) is not a conjugate family and in addition it, like (26),
involves a specific assumption about the (lack of) dependence between prior
beliefs about μ and τ.
It will be noted that densities (26) and (27) both have normal and gamma
conditionals. Indeed both can be embedded in the full class of normal-
gamma conditionals densities introduced earlier (see equation (12)). This
family is a conjugate prior family. It is richer than (26) and (27) and it can
be argued that it provides us with desirable additional flexibility at little cost
since the resulting posterior densities (on which our inferential decisions will
be made) will continue to have normal and gamma conditionals and thus are
not difficult to deal with. This is a prototypical example of what we call a
conditionally specified prior. Some practical examples, together with their
corresponding hyperparameter assessments, are given in Arnold et al. [8].
If the possible densities for X are given by {f(x; θ) : θ ∈ Θ} where
Θ ⊆ ℝᵏ, k > 1, then specification of a joint prior for θ involves describing
a k-dimensional density. We argued in Section 2 that densities are most
easily visualized in terms of conditional densities. In order to ascertain an
appropriate prior density for θ it would then seem appropriate to question
the informed scientific expert regarding prior beliefs about θ₁, given specific
values of the other θᵢs. Then, we would ask about prior beliefs about θ₂
given specific values of θ₍₂₎ (the other θᵢs), etc. One clear advantage of this
approach is that we are only asking about univariate distributions, which
are much easier to visualize than multivariate distributions.
Often we can still take advantage of conjugacy concepts in our effort to
pin down prior beliefs using a conditional approach. Suppose that for each
coordinate θᵢ of θ, if the other θⱼs (i.e., θ₍ᵢ₎) were known, a convenient con-
jugate prior family, say fᵢ(θᵢ|η), η ∈ Aᵢ, is available. In this notation the ηs
are “hyperparameters” of the conjugate prior families. If this is the case, we
propose to use, as a conjugate prior family for θ, all densities which have the
property that, for each i, the conditional density of θᵢ, given θ₍ᵢ₎, belongs to
the family fᵢ. It is not difficult to verify that this is a conjugate prior family
so that the posterior densities will also have conditionals in the prescribed
families.

3.1 Exponential Families

If, in the above scenario, each of the prior families fᵢ (the prior for θᵢ, given
θ₍ᵢ₎) is an ℓᵢ-parameter exponential family, then, from Theorem 1, the result-
ing conditionally conjugate prior family will itself be an exponential family.
It will have a large number of hyperparameters (namely ∏_{i=1}^{k} (ℓᵢ + 1) − 1),
providing flexibility for matching informed or vague prior beliefs about θ.
Formally, if for each i a natural conjugate prior for θᵢ (assuming θ₍ᵢ₎ is
known) is an ℓᵢ-parameter exponential family of the form

fᵢ(θᵢ) ∝ exp{ Σ_{j=1}^{ℓᵢ} ηᵢⱼ Tᵢⱼ(θᵢ) }    (28)

then a convenient family of priors for the full parameter vector θ will be of
the form

f(θ) ∝ exp{ Σ_{j₁=0}^{ℓ₁} Σ_{j₂=0}^{ℓ₂} ⋯ Σ_{jₖ=0}^{ℓₖ} m_{j₁j₂…jₖ} ∏_{i=1}^{k} T_{i,jᵢ}(θᵢ) }    (29)

where, for notational convenience, we have introduced the constant func-
tions Tᵢ₀(θᵢ) ≡ 1, i = 1, 2, …, k.
This family of densities includes all densities for θ with conditionals (for θᵢ
given θ₍ᵢ₎, for each i) in the given exponential families (28). The proof is based
on a simple extension of Theorem 1 to the k-dimensional case.
Because each fᵢ is a conjugate prior for θᵢ (given θ₍ᵢ₎) it follows that f(θ),
given by (29), is a conjugate prior family and that all posterior densities have
the same structure as the prior. In other words, a priori and a posteriori, θᵢ
given θ₍ᵢ₎ will have a density of the form (28) for each i. They provide
particularly attractive examples of conditionally conjugate (conditionally
specified) priors.
As we shall see, it is usually the case that the posterior hyperparameters
are related to the prior hyperparameters in a simple way. Simulation of
realizations from the posterior distribution corresponding to a conditionally
conjugate prior will be quite straightforward using rejection or Markov
Chain Monte Carlo (MCMC) simulation methods (see Tanner [9]) and, in
particular, the Gibbs sampler algorithm, since the simulation will only
involve one-dimensional conditional distributions which are themselves
exponential families. Alternatively, it is often possible to use some kind of
rejection algorithm to simulate realizations from a density of the form (29).
Note that the family (29) includes the “natural” conjugate prior family
(obtained by considering all possible posteriors corresponding to all possible
samples, beginning with a locally uniform prior). In addition, (29) will
include priors with independent marginals for the θᵢs, with the density for
θᵢ selected from the exponential family (28), for each i. Both classes of these
more commonly encountered priors can be recognized as subclasses of (29)
obtained by setting some of the hyperparameters (the m_{j₁j₂…jₖ}s) equal
to zero.
Consider again the case in which our data consist of n independent
identically distributed random variables each having a normal distribution
with mean μ and precision τ (the reciprocal of the variance). The corre-
sponding likelihood has the form

L(μ, τ) = ∏_{i=1}^{n} (τ/2π)^{1/2} exp{−τ(xᵢ − μ)²/2}    (30)

If τ is known, the conjugate prior family for μ is the normal family. If μ is
known, the conjugate prior family for τ is the gamma family. We are then
led to consider, as our conditionally conjugate family of prior densities for
(μ, τ), the set of densities with normal-gamma conditionals given above in
(12). We will rewrite this in an equivalent but more convenient form as
follows:

f(μ, τ) ∝ exp[m₁₀μ + m₂₀μ² + m₁₂μ log τ + m₂₂μ² log τ]
        × exp[m₀₁τ + m₀₂ log τ + m₁₁μτ + m₂₁μ²τ]    (31)
For such a density we have:
1. The conditional density of μ given τ is normal with mean

E(μ|τ) = −(m₁₀ + m₁₁τ + m₁₂ log τ) / [2(m₂₀ + m₂₁τ + m₂₂ log τ)]    (32)

and precision

1/var(μ|τ) = −2(m₂₀ + m₂₁τ + m₂₂ log τ)    (33)

2. The conditional density of τ given μ is gamma with shape parameter
α(μ) = 1 + m₀₂ + m₁₂μ + m₂₂μ² and intensity parameter
λ(μ) = −(m₀₁ + m₁₁μ + m₂₁μ²), i.e.,

f(τ|μ) ∝ τ^{α(μ)−1} e^{−λ(μ)τ}    (34)

with mean and variance

E(τ|μ) = (1 + m₀₂ + m₁₂μ + m₂₂μ²) / [−(m₀₁ + m₁₁μ + m₂₁μ²)]    (35)

var(τ|μ) = (1 + m₀₂ + m₁₂μ + m₂₂μ²) / (m₀₁ + m₁₁μ + m₂₁μ²)²    (36)

In order to have a proper density, certain constraints must be placed on
the mᵢⱼs in (31) (see for example Castillo and Galambos [10]). However, if we
are willing to accept improper priors we can allow each of them to range
over ℝ.
In order to easily characterize the posterior density which will arise when
a prior of the form (31) is combined with the likelihood (30), it is convenient
to rewrite the likelihood as follows:

L(μ, τ) ∝ exp[(n/2) log τ − (Σᵢ xᵢ²/2)τ + (Σᵢ xᵢ)μτ − (n/2)μ²τ]    (37)

A prior of the form (31) combined with the likelihood (37) will yield a
posterior density again in the family (31) with prior and posterior hyper-
parameters related as shown in Table 1. From Table 1 we may observe that
four of the hyperparameters are unaffected by the data. They are the four
hyperparameters appearing in the first factor in (31). Their influence on the
prior is eventually “swamped” by the data but, by adopting the condition-
ally conjugate prior family, we do not force them arbitrarily to be zero as
would be done if we were to use the “natural” conjugate prior (26).
Reiterating, the choice m₁₀ = m₂₀ = m₁₂ = m₂₂ = 0 yields the natural
conjugate prior. The choice m₁₁ = m₁₂ = m₂₁ = m₂₂ = 0 yields priors with
independent normal and gamma marginals. Thus both of the commonly
proposed prior families are subsumed by (31) and in all cases we end up
with a posterior with normal-gamma conditionals.
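The updating summarized in Table 1 amounts to adding the coefficients of the matching
terms in (31) and (37). The sketch below is illustrative only: the prior hyperparameter
values and the data are hypothetical, and the update rules shown are the ones implied by
multiplying (31) by (37); posterior draws are then obtained with a Gibbs sampler built
from the conditionals (32)-(36).

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data and prior hyperparameters for the family (31);
# the prior values are chosen to satisfy the propriety constraints.
x = rng.normal(loc=2.0, scale=1.5, size=25)
prior = dict(m10=0.0, m20=-0.05, m12=0.0, m22=0.0,
             m01=-0.5, m02=1.0, m11=0.1, m21=-0.05)

def posterior_hyperparameters(prior, x):
    """Add the coefficients of tau, log tau, mu*tau and mu^2*tau from (37) to those in (31)."""
    n = len(x)
    post = dict(prior)                      # m10, m20, m12, m22 are unaffected by the data
    post["m02"] += n / 2.0                  # coefficient of log tau
    post["m01"] += -np.sum(x ** 2) / 2.0    # coefficient of tau
    post["m11"] += np.sum(x)                # coefficient of mu * tau
    post["m21"] += -n / 2.0                 # coefficient of mu^2 * tau
    return post

def gibbs(m, n_iter=5000, mu=0.0, tau=1.0):
    """Gibbs sampler using the normal and gamma conditionals (32)-(36)."""
    out = np.empty((n_iter, 2))
    for t in range(n_iter):
        prec = -2.0 * (m["m20"] + m["m21"] * tau + m["m22"] * np.log(tau))
        mean = (m["m10"] + m["m11"] * tau + m["m12"] * np.log(tau)) / prec
        mu = rng.normal(mean, 1.0 / np.sqrt(prec))
        shape = 1.0 + m["m02"] + m["m12"] * mu + m["m22"] * mu ** 2
        rate = -(m["m01"] + m["m11"] * mu + m["m21"] * mu ** 2)
        tau = rng.gamma(shape, 1.0 / rate)
        out[t] = (mu, tau)
    return out

post = posterior_hyperparameters(prior, x)
draws = gibbs(post)[1000:]                  # discard burn-in
print("posterior mean of mu :", draws[:, 0].mean())
print("posterior mean of tau:", draws[:, 1].mean())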

Table 1. Adjustments in the hyperparameters in the prior family (31), combined with
likelihood (37)
Parameter    Prior value    Posterior value

3.2 Non-exponential Families

Exponential families of priors play a very prominent role in Bayesian sta-
tistical analysis. However, there are interesting cases which fall outside the
exponential family framework. We will present one such example in this
section. Each one must be dealt with on a case-by-case basis because
there will be no analogous theorem in non-exponential family cases that is
a parallel to Theorem 1 (which allowed us to clearly identify the joint
densities with specified conditional densities).
Our example involves classical Pareto data. The data take the form of a
sample of size n from a classical Pareto distribution with inequality para-
meter a and precision parameter (the reciprocal of the scale parameter) r.
The likelihood is then

f_X(x; a, r) = ∏_{i=1}^{n} a r (r xᵢ)^{−(a+1)} I(r xᵢ > 1)    (38)

which can be conveniently rewritten in the form

L(a, r) ∝ exp[n log a − n a log r − (a + 1) Σᵢ log xᵢ] I(r x₍₁₎ > 1)    (39)

where x₍₁₎ denotes the smallest observation.
If r were known, the natural conjugate prior family for a would be the
gamma family. If a were known, the natural conjugate prior family for r
would be the Pareto family. We are then led to consider the conditionally
conjugate prior family which will include the joint densities for (a, r) with
gamma and Pareto conditionals. It is not difficult to verify that this is a six
(hyper)parameter family of priors of the form

f(a, r) ∝ exp[m₀₁ log r + m₂₁ log a log r]
        × exp[m₁₀a + m₂₀ log a + m₁₁a log r] I(rc > 1)    (40)
It will be obvious that this is not an exponential family of priors. The
support depends on one of the hyperparameters. In (40), the hyperpara-
meters in the first factor are those which are unchanged in the posterior.
The hyperparameters in the second factor are the ones that are affected by
the data. If a density of the form (40) is used as a prior in conjunction with
the likelihood (39), it is evident that the resulting posterior density is again in
the family (40). The prior and posterior hyperparameters are related in the
manner shown in Table 2.
The density (40), having gamma and Pareto conditionals, is readily simu-
lated using a Gibbs sampler approach. The family (40) includes the two
most frequently suggested families of joint priors for (a, r), namely:
Table 2. Adjustments in the parameters in the prior (40) when combined with the
likelihood (39)
Parameter    Prior value    Posterior value

1. The “classical” conjugate prior family. This was introduced by Lwin
[11]. It corresponded to the case in which m₀₁ and m₂₁ were both
arbitrarily set equal to 0.
2. The independent gamma and Pareto priors. These were suggested by
Arnold and Press [7] and correspond to the choice m₁₁ = m₂₁ = 0.
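A Gibbs cycle for a density of the form (40) only requires a gamma draw and a Pareto-type
draw, as in the following sketch. It is illustrative only: the hyperparameter values and the
data are hypothetical and were chosen so that the conditional gamma shape and rate and the
Pareto index stay positive, and the posterior updating used is the one implied by multiplying
(40) by the likelihood (39).

import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: classical Pareto sample with a = 2 and scale 1 (so r = 1)
data = 1.0 + rng.pareto(2.0, 30)
prior = dict(m01=-3.0, m21=0.0, m10=-1.0, m20=2.0, m11=-0.5, c=1.0)

def posterior_hyperparameters(prior, x):
    """Update implied by combining (40) with the likelihood (39)."""
    n = len(x)
    post = dict(prior)                    # m01 and m21 (the first factor) are unchanged
    post["m20"] += n                      # coefficient of log a
    post["m10"] += -np.log(x).sum()       # coefficient of a
    post["m11"] += -n                     # coefficient of a * log r
    post["c"] = min(prior["c"], x.min())  # support: r > 1 / min(c, x_(1))
    return post

def gibbs(m, n_iter=5000, r=2.0):
    """Alternate a | r (gamma) and r | a (Pareto on (1/c, inf)) draws under (40)."""
    lower = 1.0 / m["c"]
    out = np.empty((n_iter, 2))
    for t in range(n_iter):
        shape = m["m20"] + m["m21"] * np.log(r) + 1.0
        rate = -(m["m10"] + m["m11"] * np.log(r))
        a = rng.gamma(shape, 1.0 / rate)
        index = -(m["m01"] + m["m21"] * np.log(a) + m["m11"] * a) - 1.0
        u = 1.0 - rng.random()
        r = lower * u ** (-1.0 / index)   # inverse-CDF draw from a Pareto(index) on (lower, inf)
        out[t] = (a, r)
    return out

draws = gibbs(posterior_hyperparameters(prior, data))[1000:]
print("posterior means of (a, r):", draws.mean(axis=0))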

4. SOME CLASSICAL PROBLEMS

4.1 The Behrens-Fisher Problem

In this setting we wish to compare the means of two or more normal popu-
lations with unknown and possibly different precisions. Thus our data con-
sist of k independent samples from normal populations with

Xᵢⱼ ~ N(μᵢ, τᵢ),  i = 1, 2, …, k;  j = 1, 2, …, nᵢ    (41)

Our interest is in the values of the μᵢs. The τᵢs (the precisions) are here
classic examples of (particularly pernicious) nuisance parameters. Our like-
lihood will involve 2k parameters. If all the parameters save μⱼ are known,
then a natural conjugate prior for μⱼ would be a normal density. If all the
parameters save τⱼ are known, then a natural conjugate prior for τⱼ will be a
gamma density. Consequently, the general conditionally conjugate prior for
(μ, τ) will be one in which the conditional density of each μⱼ, given the other
2k − 1 parameters, is normal and the conditional density of τⱼ, given the
other 2k − 1 parameters, is of the gamma type. The resulting family of joint
priors is then given by
f(μ, τ) ∝ exp{ Σ m_{j₁j₂…j₂ₖ} ∏_{i=1}^{k} q_{i,jᵢ}(μᵢ) ∏_{i=1}^{k} q*_{i,j_{k+i}}(τᵢ) },
        the sum running over all (j₁, …, j₂ₖ) ∈ {0, 1, 2}^{2k},    (42)

where

qᵢ₀(μᵢ) = 1,   qᵢ₁(μᵢ) = μᵢ,   qᵢ₂(μᵢ) = μᵢ²,
q*ᵢ₀(τᵢ) = 1,   q*ᵢ₁(τᵢ) = τᵢ,   q*ᵢ₂(τᵢ) = log τᵢ

There are thus 3^{2k} − 1 hyperparameters. Many of these will be unaffected
by the data. The traditional Bayesian prior is a conjugate prior in which
only the 4k hyperparameters that are affected by the data are given non-zero
values. An easily assessed joint prior in the family (42) would be one in
which the μs are taken to be independent of the τs and independent normal
priors are used for each μⱼ, and independent gamma priors are used for the
τⱼs. This kind of prior will also have 4k of the hyperparameters in (42) not
equal to zero.
Perhaps the most commonly used joint prior in this setting is one in
which the μⱼs are assumed to have independent locally uniform densities
and, independent of the μⱼs, the τⱼs (or their logarithms) are assumed to have
independent locally uniform priors. All three of these types of prior will lead
to posterior densities in the family (42). We can then use a Gibbs sampler
algorithm to generate posterior realizations of (μ, τ). The approximate pos-
terior distribution of Σᵢ₌₁ᵏ (μᵢ − μ̄)² can be perused to identify evidence for
differences among the μᵢs. A specific example of this program is described in
Section 5.1 below.
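As an illustration of this program, the sketch below uses the easily assessed special case
of (42) with independent normal priors on the μᵢs and independent gamma priors on the τᵢs
(all prior settings and data are hypothetical); the full conditionals are then the usual
normal and gamma ones, and the posterior draws of Σᵢ(μᵢ − μ̄)² can be inspected for
evidence of differences among the means.

import numpy as np

rng = np.random.default_rng(4)

# Hypothetical data: k = 3 groups with different precisions
data = [rng.normal(0.0, 1.0, 20), rng.normal(0.5, 2.0, 15), rng.normal(0.2, 0.5, 25)]

# Hypothetical independent priors: mu_i ~ N(0, precision m0), tau_i ~ Gamma(a0, rate b0)
m0, a0, b0 = 0.01, 1.0, 1.0

def gibbs(data, n_iter=6000):
    k = len(data)
    n = np.array([len(x) for x in data])
    xbar = np.array([x.mean() for x in data])
    mu, tau = xbar.copy(), np.ones(k)
    keep = np.empty((n_iter, k, 2))
    for t in range(n_iter):
        for i in range(k):
            # mu_i | tau_i, data: normal prior (mean 0) times normal likelihood
            prec = m0 + n[i] * tau[i]
            mean = n[i] * tau[i] * xbar[i] / prec
            mu[i] = rng.normal(mean, 1.0 / np.sqrt(prec))
            # tau_i | mu_i, data: gamma prior times normal likelihood
            shape = a0 + n[i] / 2.0
            rate = b0 + 0.5 * np.sum((data[i] - mu[i]) ** 2)
            tau[i] = rng.gamma(shape, 1.0 / rate)
        keep[t, :, 0] = mu
        keep[t, :, 1] = tau
    return keep[1000:]

draws = gibbs(data)
mus = draws[:, :, 0]
spread = ((mus - mus.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)
print("posterior mean of sum_i (mu_i - mubar)^2:", spread.mean())
print("posterior 95th percentile:", np.quantile(spread, 0.95))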

4.2 2 × 2 Contingency Tables

In comparing two medical treatments, we may submit nᵢ subjects to treat-
ment i, i = 1, 2, and observe the number of successes (survival to the end of
the observation period) for each treatment. Thus our data consist of two
independent random variables (X₁, X₂) where Xᵢ ~ binomial(nᵢ, pᵢ).
The odds ratio

ψ = [p₁/(1 − p₁)] / [p₂/(1 − p₂)]    (43)

is of interest in this setting. A natural conjugate prior for p₁ (if p₂ is known)
is a beta distribution. Analogously, a beta prior is natural for p₂ (if p₁ is
known). The corresponding conditionally conjugate prior for (p₁, p₂) will
have beta conditionals (cf. equation (18)) and is given by

f(p₁, p₂) = [p₁(1 − p₁)p₂(1 − p₂)]⁻¹
        × exp[m₁₁ log p₁ log p₂ + m₁₂ log p₁ log(1 − p₂)
        + m₂₁ log(1 − p₁) log p₂ + m₂₂ log(1 − p₁) log(1 − p₂)
        + m₁₀ log p₁ + m₂₀ log(1 − p₁)
        + m₀₁ log p₂ + m₀₂ log(1 − p₂) + m₀₀]
        × I(0 < p₁ < 1) I(0 < p₂ < 1)    (44)
When such a prior is combined with the likelihood corresponding to the
two independent binomial Xᵢs, the posterior density is again in the family
(44). Only some of the hyperparameters are affected by the data.
The usual prior in this situation involves independent beta priors for p₁
and p₂. Priors of this form are of course included as special cases in (44) but
it is quite reasonable to expect non-independent prior beliefs about the
efficacy of the two treatment regimes. Observe that simulated realizations
from a posterior of the form (44) are readily generated using a Gibbs sam-
pler algorithm (with beta conditionals). This permits ready simulation of the
posterior distribution of the parametric function of interest (the odds ratio
(43)).
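A sketch of that simulation follows. It is illustrative only: the prior hyperparameters,
sample sizes, and success counts are hypothetical, the interaction terms are taken
non-positive so that the conditional beta parameters stay positive, and the posterior
updating used is the one implied by multiplying (44) by the two binomial likelihoods.

import numpy as np

rng = np.random.default_rng(5)

# Hypothetical trial data: x_i successes out of n_i on treatments 1 and 2
n1, x1 = 40, 26
n2, x2 = 35, 15

# Hypothetical prior hyperparameters for (44); non-positive interactions keep
# the conditional beta parameters positive for every value of the other p.
m = dict(m10=1.0, m20=1.0, m01=1.0, m02=1.0,
         m11=-0.2, m12=-0.2, m21=-0.2, m22=-0.2)

# Posterior updating: the binomial likelihoods add to the "main effect" terms only
post = dict(m, m10=m["m10"] + x1, m20=m["m20"] + n1 - x1,
               m01=m["m01"] + x2, m02=m["m02"] + n2 - x2)

def gibbs(m, n_iter=8000, p2=0.5):
    """Beta-conditionals Gibbs sampler for the family (44)."""
    out = np.empty((n_iter, 2))
    for t in range(n_iter):
        a1 = m["m10"] + m["m11"] * np.log(p2) + m["m12"] * np.log(1.0 - p2)
        b1 = m["m20"] + m["m21"] * np.log(p2) + m["m22"] * np.log(1.0 - p2)
        p1 = rng.beta(a1, b1)
        a2 = m["m01"] + m["m11"] * np.log(p1) + m["m21"] * np.log(1.0 - p1)
        b2 = m["m02"] + m["m12"] * np.log(p1) + m["m22"] * np.log(1.0 - p1)
        p2 = rng.beta(a2, b2)
        out[t] = (p1, p2)
    return out[1000:]

p1, p2 = gibbs(post).T
odds_ratio = (p1 / (1.0 - p1)) / (p2 / (1.0 - p2))
print("posterior median odds ratio:", np.median(odds_ratio))
print("posterior P(odds ratio > 1):", np.mean(odds_ratio > 1.0))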

4.3 Regression

The conditionally specified prior approach can also be used in other classical
situations. We will describe an example involving simple linear regression,
but analogous ideas can be developed in more complex settings.
Assume that we have n independent random variables X₁, X₂, …, Xₙ
whose marginal distributions follow a linear regression model. Thus

Xᵢ ~ N(α + βtᵢ, σ²),  i = 1, 2, …, n    (45)

where the tᵢs are known quantities and the parameters α, β, and σ² are
unknown. Here α ∈ ℝ, β ∈ ℝ and σ² ∈ ℝ⁺. Often α and β are the parameters
of interest whereas σ² is a nuisance parameter. As we have done in previous
sections, we reparameterize in terms of the precision τ (= 1/σ²). If β and τ were
known, we would use a normal prior for α. If α and τ were known, a normal
prior for β would be used. If α and β were known, a routinely used prior for
τ would be a gamma distribution. Our conditional specification route would
then lead to a joint prior with normal, normal, and gamma conditionals.
Thus we would use

f(α, β, τ) ∝ exp{ Σ' m_{j₁j₂j₃} q_{1j₁}(α) q_{2j₂}(β) q_{3j₃}(τ) }
        × exp{m₀₀₁τ + m₀₀₂ log τ + m₁₀₁ατ + m₀₁₁βτ + m₂₀₁α²τ + m₀₂₁β²τ + m₁₁₁αβτ}    (46)

with q₁₀(α) = q₂₀(β) = q₃₀(τ) = 1, q₁₁(α) = α, q₁₂(α) = α², q₂₁(β) = β,
q₂₂(β) = β², q₃₁(τ) = τ, q₃₂(τ) = log τ, where the first sum Σ' runs over the
remaining triples (j₁, j₂, j₃) ∈ {0, 1, 2}³.

The twofactors on theright-handside of (46) involve, respectively,


hyperparameters which arenotandare affected by the data. The seven
hyperparameters that are affected by the data are changed in value as dis-
played in Table 3.
The classical Bayesian approach would set all hyperparameters in the first
factor of (46) equalto zero.This7-parameterconjugatepriormight be
adequate but it clearly lacks flexibility when compared with the full prior

I Table 3. Adjustments in theparameters in the


prior family (46), combined with the likelihood
corresponding to the model (45)
Parameter
Prior value Posterior
value
I ’ I . . ,
Bayesian Inference Using Conditionally Specified Priors 19

family (46). It is not easy to justify the dependence structure that is implicitly
assumed when using such a restrictive prior.
Another possibility would involve independentpriorsfor a, p, and r.
Whether we pick a prior in the full family (46) or from some subfamily,
we will still be able to use a simple Gibbs sampler approach to simulating
realizations from the posterior density with its normal, normal, and gamma
conditionals.
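A sketch of such a Gibbs sampler, for the independent-priors subfamily just mentioned
(all prior settings and data below are hypothetical), alternates the normal, normal, and
gamma full conditionals for α, β, and τ.

import numpy as np

rng = np.random.default_rng(6)

# Hypothetical regression data
t = np.linspace(0.0, 10.0, 40)
x = 1.0 + 0.5 * t + rng.normal(0.0, 0.8, size=t.size)

# Hypothetical independent priors:
# alpha ~ N(0, precision a0), beta ~ N(0, precision b0), tau ~ Gamma(c0, rate d0)
a0, b0, c0, d0 = 0.01, 0.01, 1.0, 1.0

def gibbs(n_iter=8000, alpha=0.0, beta=0.0, tau=1.0):
    n = t.size
    out = np.empty((n_iter, 3))
    for it in range(n_iter):
        # alpha | beta, tau: normal
        prec_a = a0 + n * tau
        mean_a = tau * np.sum(x - beta * t) / prec_a
        alpha = rng.normal(mean_a, 1.0 / np.sqrt(prec_a))
        # beta | alpha, tau: normal
        prec_b = b0 + tau * np.sum(t ** 2)
        mean_b = tau * np.sum(t * (x - alpha)) / prec_b
        beta = rng.normal(mean_b, 1.0 / np.sqrt(prec_b))
        # tau | alpha, beta: gamma
        resid = x - alpha - beta * t
        tau = rng.gamma(c0 + n / 2.0, 1.0 / (d0 + 0.5 * np.sum(resid ** 2)))
        out[it] = (alpha, beta, tau)
    return out[1000:]

draws = gibbs()
print("posterior means (alpha, beta, tau):", draws.mean(axis=0))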

5. HYPERPARAMETER ASSESSMENT STRATEGIES

If we have agreed to use a conditionally specified prior such as (29), (40),
etc., we will be faced with the problem of selecting suitable values of the
hyperparameters to match as closely as possible the prior beliefs of our
informed expert who is supplying the a priori information. It must be
emphasized that use of conditionally specified priors (as is the case with
use of other convenient flexible families of priors) does not imply that we
believe that our expert’s beliefs will precisely match some distribution in the
conditionally specified family. What we hope to be the case is that the
conditionally specified family will be flexible enough to contain a member
which will approximate the informed expert’s belief quite adequately.
In order to select a conditionally specified prior to represent our expert’s
prior beliefs about a multidimensional parameter θ, it will be necessary to
elicit quite a bit of information.
The information provided by a human expert will typically be inconsistent.
That is to say, there probably will not exist any distribution which matches
exactly all of the many prior pieces of probabilistic information provided by
the expert. What we will try to do is to find a conditionally specified prior that
is, in some sense, least at variance with the given information.
There will be some arbitrariness in how we measure such discrepancies
but we take comfort in the fact that eventually the data will outweigh any
unwarranted prior assumptions.
Our knowledge of the one-dimensional conditional distributions involved
in our conditionally specified prior will usually allow us to compute a variety
of conditional moments and percentiles explicitly as functions of the hyper-
parameters in the conditionally specified prior. Armed with this informa-
tion, we will then elicit from our informed expert the subjective evaluations
of the true prior values of a selection of conditional moments and percen-
tiles. We will usually ask for more values of conditional moments and
percentiles than there are hyperparameters in the model. We reiterate that
we don’t expect to find a choice of hyperparameters that will match the
expert’s elicited moments and percentiles exactly (they probably won’t be
mutually consistent).
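One simple way to operationalize "least at variance" is to substitute the elicited
conditional standard deviations into (32), (33), (35), and (36), which turns each elicited
conditional mean or standard deviation into an equation that is linear in the
hyperparameters, and then solve the resulting overdetermined system by least squares.
The sketch below is illustrative only: the elicited numbers, the conditioning points, and
the use of the normal-gamma family (31) are assumptions, and other discrepancy measures
could equally well be used.

import numpy as np

# Hypothetical elicited prior judgments for the normal-gamma family (31):
# conditional means/sds of mu given tau at a few tau values, and of tau given mu
# at a few mu values.
tau_pts = np.array([0.5, 1.0, 2.0])
e_mu = np.array([1.0, 1.1, 1.2])     # elicited E(mu | tau)
s_mu = np.array([1.4, 1.0, 0.7])     # elicited sd(mu | tau)
mu_pts = np.array([0.0, 1.0, 2.0])
e_tau = np.array([0.9, 1.0, 0.9])    # elicited E(tau | mu)
s_tau = np.array([0.6, 0.6, 0.6])    # elicited sd(tau | mu)

# Using (32), (33), (35), (36), each elicited value gives an equation linear in the m's:
#   m20 + m21*tau + m22*log(tau) = -1 / (2 sd(mu|tau)^2)
#   m10 + m11*tau + m12*log(tau) =  E(mu|tau) / sd(mu|tau)^2
#   m02 + m12*mu  + m22*mu^2     = (E(tau|mu)/sd(tau|mu))^2 - 1
#   m01 + m11*mu  + m21*mu^2     = -E(tau|mu) / sd(tau|mu)^2
# Unknown vector: (m10, m20, m12, m22, m01, m02, m11, m21)
rows, rhs = [], []
for t, e, s in zip(tau_pts, e_mu, s_mu):
    rows.append([0, 1, 0, np.log(t), 0, 0, 0, t]); rhs.append(-0.5 / s ** 2)
    rows.append([1, 0, np.log(t), 0, 0, 0, t, 0]); rhs.append(e / s ** 2)
for u, e, s in zip(mu_pts, e_tau, s_tau):
    rows.append([0, 0, u, u ** 2, 0, 1, 0, 0]); rhs.append((e / s) ** 2 - 1.0)
    rows.append([0, 0, 0, 0, 1, 0, u, u ** 2]); rhs.append(-e / s ** 2)

m_hat = np.linalg.lstsq(np.array(rows, float), np.array(rhs), rcond=None)[0]
names = ["m10", "m20", "m12", "m22", "m01", "m02", "m11", "m21"]
print({k: round(v, 3) for k, v in zip(names, m_hat)})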
way, she was serious to the bottom of her soul. She was not going
to risk any chance comer, not she! One might take a lover almost at
any moment, but a man who should beget a child on one ... wait!
wait! it's a very different matter.—"Go ye into the streets and byways
of Jerusalem...." It was not a question of love; it was a question of a
man. Why, one might even rather hate him, personally. Yet if he was
the man, what would one's personal hate matter? This business
concerned another part of oneself.
It had rained as usual, and the paths were too sodden for Clifford's
chair, but Connie would go out. She went out alone every day now,
mostly in the wood, where she was really alone. She saw nobody
there.
This day, however, Clifford wanted to send a message to the keeper,
and as the boy was laid up with influenza,—somebody always
seemed to have influenza at Wragby,—Connie said she would call at
the cottage.
The air was soft and dead, as if all the world were slowly dying. Grey
and clammy and silent, even from the shuffling of the collieries, for
the pits were working short time, and today they were stopped
altogether. The end of all things!
In the wood all was utterly inert and motionless, only great drops fell
from the bare boughs, with a hollow little crash. For the rest, among
the old trees was depth within depth of grey, hopeless inertia,
silence, nothingness.
Connie walked dimly on. From the old wood came an ancient
melancholy, somehow soothing to her, better than the harsh
insentience of the outer world. She liked the inwardness of the
remnant of forest, the unspeaking reticence of the old trees. They
seemed a very power of silence, and yet a vital presence. They, too,
were waiting: obstinately, stoically waiting, and giving off a potency
of silence. Perhaps they were only waiting for the end; to be cut
down, cleared away, the end of the forest, for them the end of all
things. But perhaps their strong and aristocratic silence, the silence
of strong trees, meant something else.
As she came out of the wood on the north side, the keeper's
cottage, a rather dark, brown stone cottage, with gables and a
handsome chimney, looked uninhabited, it was so silent and alone.
But a thread of smoke rose from the chimney, and the little railed-in
garden in the front of the house was dug and kept very tidy. The
door was shut.
Now she was here she felt a little shy of the man, with his curious
far-seeing eyes. She did not like bringing him orders, and felt like
going away again. She knocked softly, no one came. She knocked
again, but still not loudly. There was no answer. She peeped through
the window, and saw the dark little room, with its almost sinister
privacy, not wanting to be invaded.
She stood and listened, and it seemed to her she heard sounds from
the back of the cottage. Having failed to make herself heard, her
mettle was roused, she would not be defeated.
So she went round the side of the house. At the back of the cottage
the land rose steeply, so the backyard was sunken, and enclosed by
a low stone wall. She turned the corner of the house and stopped.
In the little yard two paces beyond her, the man was washing
himself, utterly unaware. He was naked to the hips, his velveteen
breeches slipping down over his slender loins. And his white slim
back was curved over a big bowl of soapy water, in which he ducked
his head, shaking his head with a queer, quick little motion, lifting his
slender white arms, and pressing the soapy water from his ears,
quick, subtle as a weasel playing with water, and utterly alone.
Connie backed away round the corner of the house, and hurried
away to the wood. In spite of herself, she had had a shock. After all,
merely a man washing himself; common-place enough, Heaven
knows!
Yet in some curious way it was a visionary experience: it had hit her
in the middle of the body. She saw the clumsy breeches slipping
down over the pure, delicate, white loins, the bones showing a little,
and the sense of aloneness, of a creature purely alone, overwhelmed
her. Perfect, white, solitary nudity of a creature that lives alone, and
inwardly alone. And beyond that, a certain beauty of a pure
creature. Not the stuff of beauty, not even the body of beauty, but a
lambency, the warm, white flame of a single life, revealing itself in
contours that one might touch: a body!
Connie had received the shock of vision in her womb, and she knew
it; it lay inside her. But with her mind she was inclined to ridicule. A
man washing himself in a backyard! No doubt with evil-smelling
yellow soap!—She was rather annoyed; why should she be made to
stumble on these vulgar privacies?
So she walked away from herself, but after a while she sat down on
a stump. She was too confused to think. But in the coil of her
confusion, she was determined to deliver her message to the fellow.
She would not be balked. She must give him time to dress himself,
but not time to go out. He was probably preparing to go out
somewhere.
So she sauntered slowly back, listening. As she came near, the
cottage looked just the same. A dog barked, and she knocked at the
door, her heart beating in spite of herself.
She heard the man coming lightly downstairs. He opened the door
quickly, and startled her. He looked uneasy himself, but instantly a
laugh came on his face.
"Lady Chatterley!" he said. "Will you come in?"
His manner was so perfectly easy and good, she stepped over the
threshold into the rather dreary little room.
"I only called with a message from Sir Clifford," she said in her soft,
rather breathless voice.
The man was looking at her with those blue, all-seeing eyes of his,
which made her turn her face aside a little. He thought her comely,
almost beautiful, in her shyness, and he took command of the
situation himself at once.
"Would you care to sit down?" he asked, presuming she would not.
The door stood open.
"No thanks! Sir Clifford wondered if you would ..." and she delivered
her message, looking unconsciously into his eyes again. And now his
eyes looked warm and kind, particularly to a woman, wonderfully
warm, and kind, and at ease.
"Very good, your Ladyship. I will see to it at once."
Taking an order, his whole self had changed, glazed over with a sort
of hardness and distance. Connie hesitated, she ought to go. But
she looked round the clean, tidy, rather dreary little sitting-room with
something like dismay.
"Do you live here quite alone?" she asked.
"Quite alone, your Ladyship."
"But your mother...?"
"She lives in her own cottage in the village."
"With the child?" asked Connie.
"With the child!"
And his plain, rather worn face took on an indefinable look of
derision. It was a face that changed all the time, baffling.
"No," he said, seeing Connie stand at a loss, "my mother comes and
cleans up for me on Saturdays; I do the rest myself."
Again Connie looked at him. His eyes were smiling again, a little
mockingly, but warm and blue, and somehow kind. She wondered at
him. He was in trousers and flannel shirt and a grey tie, his hair soft
and damp, his face rather pale and worn-looking. When the eyes
ceased to laugh they looked as if they had suffered a great deal, still
without losing their warmth. But a pallor of isolation came over him,
she was not really there for him.
She wanted to say so many things, and she said nothing. Only she
looked up at him again, and remarked:
"I hope I didn't disturb you?"
The faint smile of mockery narrowed his eyes.
"Only combing my hair, if you don't mind. I'm sorry I hadn't a coat
on, but then I had no idea who was knocking. Nobody knocks here,
and the unexpected sounds ominous."
He went in front of her down the garden path to hold the gate. In
his shirt, without the clumsy velveteen coat, she saw again how
slender he was, thin, stooping a little. Yet, as she passed him, there
was something young and bright in his fair hair, and his quick eyes.
He would be a man about thirty-seven or eight.
She plodded on into the wood, knowing he was looking after her; he
upset her so much, in spite of herself.
And he, as he went indoors, was thinking: "She's nice, she's real!
she's nicer than she knows."
She wondered very much about him; he seemed so unlike a
gamekeeper, so unlike a working-man anyhow; although he had
something in common with the local people. But also something very
uncommon.
"The gamekeeper, Mellors, is a curious kind of person," she said to
Clifford; "he might almost be a gentleman."
"Might he?" said Clifford. "I hadn't noticed."
"But isn't there something special about him?" Connie insisted.
"I think he's quite a nice fellow, but I know very little about him. He
only came out of the army last year, less than a year ago. From
India, I rather think. He may have picked up certain tricks out there,
perhaps he was an officer's servant, and improved on his position.
Some of the men were like that. But it does them no good, they
have to fall back into their old place when they get home again."
Connie gazed at Clifford contemplatively. She saw in him the peculiar
tight rebuff against anyone of the lower classes who might be really
climbing up, which she knew was characteristic of his breed.
"But don't you think there is something special about him?" she
asked.
"Frankly, no! Nothing I had noticed."
He looked at her curiously, uneasily, half-suspiciously. And she felt he
wasn't telling her the real truth; he wasn't telling himself the real
truth, that was it. He disliked any suggestion of a really exceptional
human being. People must be more or less at his level, or below it.
Connie felt again the tightness, niggardliness of the men of her
generation. They were so tight, so scared of life!

CHAPTER VII

When Connie went up to her bedroom she did what she had not
done for a long time: took off all her clothes, and looked at herself
naked in the huge mirror. She did not know what she was looking
for, or at, very definitely, yet she moved the lamp till it shone full on
her.
And she thought, as she had thought so often ... what a frail, easily
hurt, rather pathetic thing a human body is, naked; somehow a little
unfinished, incomplete!
She had been supposed to have rather a good figure, but now she
was out of fashion: a little too female, not enough like an adolescent
boy. She was not very tall, a bit Scottish and short; but she had a
certain fluent, down-slipping grace that might have been beauty. Her
skin was faintly tawny, her limbs had a certain stillness, her body
should have had a full, down-slipping richness; but it lacked
something.
Instead of ripening its firm, down-running curves, her body was
flattening and going a little harsh. It was as if it had not had enough
sun and warmth; it was a little greyish and sapless.
Disappointed of its real womanhood, it had not succeeded in
becoming boyish, and unsubstantial, and transparent; instead it had
gone opaque.
Her breasts were rather small, and dropping pear-shaped. But they
were unripe, a little bitter, without meaning hanging there. And her
belly had lost the fresh, round gleam it had had when she was
young, in the days of her German boy, who really loved her
physically. Then it was young and expectant, with a real look of its
own. Now it was going slack, and a little flat, thinner, but with a
slack thinness. Her thighs, too, that used to look so quick and
glimpsey in their female roundness, somehow they too were going
flat, slack, meaningless.
Her body was going meaningless, going dull and opaque, so much
insignificant substance. It made her feel immensely depressed and
hopeless. What hope was there? She was old, old at twenty-seven,
with no gleam and sparkle in the flesh. Old through neglect and
denial, yes denial. Fashionable women kept their bodies bright like
delicate porcelain, by external attention. There was nothing inside
the porcelain; but she was not even as bright as that. The mental
life! Suddenly she hated it with a rushing fury, the swindle!
She looked in the other mirror's reflection at her back, her waist, her
loins. She was getting thinner, but to her it was not becoming. The
crumple of her waist at the back, as she bent back to look, was a
little weary; and it used to be so gay-looking. And the longish slope
of her haunches and her buttocks had lost its gleam and its sense of
richness. Gone! Only the German boy had loved it, and he was ten
years dead, very nearly. How time went by! Ten years dead, and she
was only twenty-seven. That healthy boy with his fresh, clumsy
sensuality that she had then been so scornful of! Where would she
find it now? It was gone out of men. They had their pathetic, two-
second spasms like Michaelis; but no healthy human sensuality, that
warms the blood and freshens the whole being.
Still she thought the most beautiful part of her was the long-sloping
fall of the haunches from the socket of the back, and the
slumberous, round stillness of the buttocks. Like hillocks of sand the
Arabs say, soft and downward-slipping with a long slope. Here the
life still lingered hoping. But here too she was thinner, and going
unripe, astringent.
But the front of her body made her miserable. It was already
beginning to slacken, with a slack sort of thinness, almost withered,
going old before it had ever really lived. She thought of the child she
might somehow bear. Was she fit, anyhow?
She slipped into her nightdress, and went to bed, where she sobbed
bitterly. And in her bitterness burned a cold indignation against
Clifford, and his writings and his talk: against all the men of his sort
who defrauded a woman even of her own body.
Unjust! Unjust! The sense of deep physical injustice burned to her
very soul.
But in the morning, all the same, she was up at seven, and going
downstairs to Clifford. She had to help him in all the intimate things,
for he had no man, and refused a woman-servant. The
housekeeper's husband, who had known him as a boy, helped him,
and did any heavy lifting; but Connie did the personal things, and
she did them willingly. It was a demand on her, but she had wanted
to do what she could.
So she hardly ever went away from Wragby, and never for more
than a day or two; when Mrs. Betts, the housekeeper, attended to
Clifford. He, as was inevitable in the course of time, took all the
service for granted. It was natural he should.
And yet, deep inside herself, a sense of injustice, of being
defrauded, began to burn in Connie. The physical sense of injustice
is a dangerous feeling, once it is awakened. It must have outlet, or it
eats away the one in whom it is aroused. Poor Clifford, he was not
to blame. His was the greater misfortune. It was all part of the
general catastrophe.
And yet was he not in a way to blame? This lack of warmth, this lack
of the simple, warm, physical contact, was he not to blame for that?
He was never really warm, nor even kind, only thoughtful,
considerate, in a well-bred, cold sort of way! But never warm as a
man can be warm to a woman, as even Connie's father could be
warm to her, with the warmth of a man who did himself well, and
intended to, but who still could comfort a woman with a bit of his
masculine glow.
But Clifford was not like that. His whole race was not like that. They
were all inwardly hard and separate, and warmth to them was just
bad taste. You have to get on without it, and hold your own; which
was all very well if you were of the same class and race. Then you
could keep yourself cold and be very estimable, and hold your own,
and enjoy the satisfaction of holding it. But if you were of another
class and another race it wouldn't do; there was no fun merely
holding your own, and feeling you belonged to the ruling class. What
was the point, when even the smartest aristocrats had really nothing
positive of their own to hold, and their rule was really a farce, not
rule at all? What was the point? It was all cold nonsense.
A sense of rebellion smouldered in Connie. What was the good of it
all? What was the good of her sacrifice, her devoting her life to
Clifford? What was she serving, after all? A cold spirit of vanity, that
had no warm human contacts, and that was as corrupt as any low-
born Jew, in craving for prostitution to the bitch-goddess, Success.
Even Clifford's cool and contactless assurance that he belonged to
the ruling class didn't prevent his tongue lolling out of his mouth, as
he panted after the bitch-goddess. After all, Michaelis was really
more dignified in the matter, and far, far more successful. Really, if
you looked closely at Clifford, he was a buffoon, and a buffoon is
more humiliating than a bounder.
As between the two men, Michaelis really had far more use for her
than Clifford had. He had even more need of her. Any good nurse
can attend to crippled legs! And as for the heroic effort, Michaelis
was a heroic rat, and Clifford was very much of a poodle showing
off.
There were people staying in the house, among them Clifford's Aunt
Eva, Lady Bennerley. She was a thin woman of sixty, with a red
nose, a widow, and still something of a "grande dame." She
belonged to one of the best families, and had the character to carry
it off. Connie liked her, she was so perfectly simple and frank, as far
as she intended to be frank, and superficially kind. Inside herself she
was a past-mistress in holding her own, and holding other people a
little lower. She was not at all a snob: far too sure of herself. She
was perfect at the social sport of coolly holding her own, and making
other people defer to her.
She was kind to Connie, and tried to worm into her woman's soul
with the sharp gimlet of her well-born observations.
"You're quite wonderful, in my opinion," she said to Connie. "You've
done wonders for Clifford. I never saw any budding genius myself,
and there he is all the rage."—Aunt Eva was quite complacently
proud of Clifford's success. Another feather in the family cap! She
didn't care a straw about his books, but why should she?
"Oh, I don't think it's my doing," said Connie.
"It must be! Can't be anybody else's. And it seems to me you don't
get enough out of it."
"How?"
"Look at the way you are shut up here. I said to Clifford: If that child
rebels one day you'll have yourself to thank!"
"But Clifford never denies me anything," said Connie.
"Look here, my dear child,"—and Lady Bennerley laid her thin hand
on Connie's arm. "A woman has to live her life, or live to repent not
having lived it. Believe me!" And she took another sip of brandy,
which maybe was her form of repentance.
"But I do live my life, don't I?"
"Not in my idea! Clifford should bring you to London, and let you go
about. His sort of friends are all right for him, but what are they for
you? If I were you I should think it wasn't good enough. You'll let
your youth slip by, and you'll spend your old age, and your middle
age too, repenting it."
Her ladyship lapsed into contemplative silence, soothed by the
brandy.
But Connie was not keen on going to London, and being steered into
the smart world by Lady Bennerley. She didn't feel really smart, it
wasn't interesting. And she did feel the peculiar, withering coldness
under it all; like the soil of Labrador, which has gay little flowers on
its surface, and a foot down is frozen.
Tommy Dukes was at Wragby, and another man, Harry Winterslow,
and Jack Strangeways with his wife Olive. The talk was much more
desultory than when only the cronies were there, and everybody
was a bit bored, for the weather was bad, and there was only
billiards, and the pianola to dance to.
Olive was reading a book about the future, when babies would be
bred in bottles, and women would be "immunised."
"Jolly good thing too!" she said. "Then a woman can live her own
life." Strangeways wanted children, and she didn't.
"How'd you like to be immunised?" Winterslow asked her, with an
ugly smile.
"I hope I am; naturally," she said. "Anyhow the future's going to
have more sense, and a woman needn't be dragged down by her
functions."
"Perhaps she'll float off into space altogether," said Dukes.
"I do think sufficient civilization ought to eliminate a lot of the
physical disabilities," said Clifford. "All the love-business for example,
it might just as well go. I suppose it would if we could breed babies
in bottles."
"No!" cried Olive. "That might leave all the more room for fun."
"I suppose," said Lady Bennerley, contemplatively, "if the love-
business went, something else would take its place. Morphia
perhaps. A little morphine in all the air. It would be wonderfully
refreshing for everybody."
"The government releasing ether into the air on Saturdays, for a
cheerful weekend!" said Jack. "Sounds all right, but where should we
be by Wednesday?"
"So long as you can forget your body you are happy," said Lady
Bennerley. "And the moment you begin to be aware of your body,
you are wretched. So, if civilization is any good, it has to help us to
forget our bodies, and then time passes happily without our knowing
it."
"Help us to get rid of our bodies altogether," said Winterslow. "It's
quite time man began to improve on his own nature, especially the
physical side of it."
"Imagine if we floated like tobacco smoke," said Connie.
"It won't happen," said Dukes. "Our old show will come flop; our
civilization is going to fall. It's going down the bottomless pit, down
the chasm. And believe me, the only bridge across the chasm will be
the phallus!"
"Oh do! do be impossible, General!" cried Olive.
"I believe our civilization is going to collapse," said Aunt Eva.
"And what will come after it?" asked Clifford.
"I haven't the faintest idea, but something, I suppose," said the
elderly lady.
"Connie says people like wisps of smoke, and Olive says immunised
women, and babies in bottles, and Dukes says the phallus is the
bridge to what comes next. I wonder what it will really be?" said
Clifford.
"Oh, don't bother! let's get on with today," said Olive. "Only hurry up
with the breeding bottle, and let us poor women off."
"There might even be real men, in the next phase," said Tommy.
"Real, intelligent, wholesome men, and wholesome nice women!
Wouldn't that be a change, an enormous change from us? We're not
men, and the women aren't women. We're only cerebrating make-