Statistical Inference Based On Estimating Functions in Exact and Misspecified Models
Janicki, Ryan Louis
Kagan, Abram M
Estimating functions, introduced by Godambe, are a useful tool for constructing estimators. The classical maximum likelihood estimator and the method of moments estimator are special cases of estimators generated as solutions to certain estimating equations. The main advantage of this method is that it does not require knowledge of the full model, but only of some functionals, such as a number of moments. We define an estimating function Ψ to be a Fisher estimating function if it satisfies E_θ(ΨΨᵀ) = −E_θ(dΨ/dθ). The motivation for considering this class of estimating functions is that a Fisher estimating function behaves much like the Fisher score, and the estimators generated as solutions to these estimating equations behave much like maximum likelihood estimators. The estimating functions in this class share some of the optimality properties of the Fisher score function, and they have applications to estimation in submodels, elimination of nuisance parameters, and combination of independent samples. We give some applications of estimating functions to estimation of a location parameter in the presence of a nuisance scale parameter. We also consider the behavior of estimators generated as solutions to estimating equations under model misspecification, when the misspecification is small and can be parameterized. A problem related to model misspecification is distinguishing between a finite number of competing parametric families. We construct an estimator that is consistent and efficient regardless of which family contains the true distribution.
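The defining identity E_θ(ΨΨᵀ) = −E_θ(dΨ/dθ) can be checked numerically in the simplest case: the Fisher score of the N(θ, 1) location model, Ψ(x, θ) = x − θ, for which both sides equal 1 (the Fisher information). The Monte Carlo sketch below is purely illustrative and not taken from the dissertation; all variable names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(loc=theta, scale=1.0, size=100_000)

# Fisher score of the N(theta, 1) location model: psi(x, theta) = x - theta
psi = x - theta
dpsi_dtheta = -np.ones_like(x)  # d/dtheta (x - theta) = -1

# Monte Carlo estimates of the two sides of the Fisher condition
lhs = np.mean(psi * psi)      # E_theta[psi psi^T] (scalar parameter, so a scalar)
rhs = -np.mean(dpsi_dtheta)   # -E_theta[d psi / d theta]

print(lhs, rhs)  # both should be close to 1, the Fisher information of N(theta, 1)
```

Any estimating function violating this identity (e.g. a rescaled score cΨ with c ≠ 1) would make the two sides differ, which is what distinguishes Fisher estimating functions within the larger class of unbiased estimating functions.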