Inference and Identification Issues in Econometrics

Paper Session

Sunday, Jan. 7, 2018 8:00 AM - 10:00 AM

Marriott Philadelphia Downtown, Meeting Room 410
Hosted By: Econometric Society
  • Chair: Frank Kleibergen, University of Amsterdam

Asymptotic Efficiency in Estimation with Moment Restrictions Including Latent Variables

Eric Michel Renault
Brown University


Following Schennach (2014), we consider a general method to convert a model defined by moment conditions that involve both observed and unobserved variables into equivalent moment conditions that involve only observable variables. By contrast with Schennach (2014), however, we focus on the case where the unknown parameters are point identified. We show that the least favorable exponentially tilted probability distribution of the observable variables given the latent ones is the right tool for characterizing a semi-parametric efficiency bound for consistent, asymptotically normal estimators of these parameters. A suitably devised version of Schennach's ELVIS (Entropic Latent Variable Integration via Simulation) estimator therefore reaches this efficiency bound. Even more importantly, the user does not need to know whether the parameters are point identified: either Schennach's partial-identification results apply, or efficient point estimation is achieved. This ability to efficiently exploit a true, albeit possibly unknown, point identification property is illustrated with an errors-in-variables model that is studied both analytically and via Monte Carlo experiments.
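The errors-in-variables example can be previewed with a minimal Monte Carlo sketch (an illustration of the identification problem, not the paper's ELVIS implementation): classical measurement error in the regressor attenuates the naive OLS slope toward zero, which is what moment conditions involving the latent regressor must overcome.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta = 1.0

x_star = rng.normal(size=n)        # latent (unobserved) regressor
u = rng.normal(size=n)             # classical measurement error
x = x_star + u                     # observed, error-ridden regressor
y = beta * x_star + rng.normal(size=n)

# OLS of y on observed x has probability limit
# beta * Var(x*) / (Var(x*) + Var(u)) = 0.5 here, not beta = 1
b_ols = np.cov(x, y)[0, 1] / np.var(x)
print(round(b_ols, 2))
```

With equal variances for the latent regressor and the measurement error, the attenuation factor is exactly one half, so the naive estimate converges to 0.5 rather than the true slope of 1.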

Estimation and Inference with a (Nearly) Singular Jacobian

Sukjin Han
University of Texas-Austin
Adam McCloskey
Brown University


This paper develops extremum estimation and inference results for nonlinear models with very general forms of potential identification failure when the source of this identification failure is known. We examine models that may have a general deficient rank Jacobian in certain parts of the parameter space. When identification fails in one of these models, it becomes under-identified and the identification status of individual parameters is not generally straightforward to characterize. We provide a systematic reparameterization procedure that leads to a reparameterized model with straightforward identification status. Using this reparameterization, we determine the asymptotic behavior of standard extremum estimators and Wald statistics under a comprehensive class of parameter sequences characterizing the strength of identification of the model parameters, ranging from non-identification to strong identification. Using the asymptotic results, we propose hypothesis testing methods that make use of a standard Wald statistic and data-dependent critical values, leading to tests with correct asymptotic size regardless of identification strength and good power properties. Importantly, this allows one to directly conduct uniform inference on low-dimensional functions of the model parameters, including one-dimensional subvectors. The paper illustrates these results in three examples: a sample selection model, a triangular threshold crossing model and a collective model for household expenditures.

Mis-classified, Binary, Endogenous Regressors: Identification and Inference

Francis DiTraglia
University of Pennsylvania
Camilo Garcia-Jimeno
University of Pennsylvania


This paper studies identification and inference for the effect of a mis-classified, binary, endogenous regressor when a discrete-valued instrumental variable is available. We begin by showing that the only existing point identification result for this model is incorrect. We go on to derive the sharp identified set under mean independence assumptions for the instrument and measurement error, and find that these fail to point identify the effect of interest. This motivates us to consider alternative and slightly stronger assumptions: we show that adding second and third moment independence assumptions suffices to identify the model. We then turn our attention to inference. We show that both our model and related models from the literature that assume regressor exogeneity suffer from weak identification when the effect of interest is small. To address this difficulty, we exploit the inequality restrictions that emerge from our derivation of the sharp identified set under mean independence only. These restrictions remain informative irrespective of the strength of identification. Combining these with the moment equalities that emerge from our identification result, we propose a robust inference procedure using tools from the moment inequality literature. Our method performs well in simulations.
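A quick textbook-style simulation (an illustration of why mis-classification matters, not the authors' procedure) shows the bias: when a binary regressor is flipped with probabilities α₀ and α₁, the usual IV (Wald) estimand is inflated by the factor 1/(1 − α₀ − α₁).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
beta, a0, a1 = 1.0, 0.1, 0.1       # true effect and flip probabilities

z = rng.integers(0, 2, size=n)              # binary instrument
t_star = rng.random(n) < 0.3 + 0.4 * z      # true binary regressor, shifted by z
flip = rng.random(n) < np.where(t_star, a1, a0)
t = np.where(flip, ~t_star, t_star).astype(float)  # mis-classified regressor
y = beta * t_star + rng.normal(size=n)

# Cov(t, z) = (1 - a0 - a1) * Cov(t*, z), so the Wald estimand is
# beta / (1 - a0 - a1) = 1.25 here, not beta = 1
b_iv = np.cov(y, z)[0, 1] / np.cov(t, z)[0, 1]
print(round(b_iv, 2))
```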

A More Powerful Subvector Anderson Rubin Test in Linear Instrumental Variable Regression

Patrik Guggenberger
Pennsylvania State University
Frank Kleibergen
University of Amsterdam
Sophocles Mavroeidis
University of Oxford


We study subvector inference in the linear instrumental variables model assuming homoskedasticity but allowing for weak instruments. The subvector Anderson and Rubin (1949) test proposed by Guggenberger et al. (2012), which uses chi-square critical values with degrees of freedom reduced by the number of parameters not under test, controls size but is generally conservative. We propose a conditional subvector Anderson and Rubin test that uses data-dependent critical values adapting to the strength of identification of the parameters not under test. This test has correct size and strictly higher power than the subvector Anderson and Rubin test of Guggenberger et al. (2012). We provide tables of conditional critical values, so the new test is quick and easy to use.
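A stylized sketch of the conservative subvector test of Guggenberger et al. (2012) that this paper improves upon (an illustrative simulation under assumed data-generating choices, not the authors' conditional test): with k instruments and one nuisance coefficient, the AR statistic is minimized over the nuisance coefficient and compared to a chi-square critical value with k − 1 degrees of freedom.

```python
import numpy as np
from scipy.stats import chi2
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
n, k = 2000, 4                       # sample size, number of instruments
Z = rng.normal(size=(n, k))

# two endogenous regressors; test H0: beta1 = 1 with beta2 as nuisance
Pi = rng.normal(size=(k, 2))
V = rng.normal(size=(n, 2))
X = Z @ Pi + V
eps = 0.5 * V[:, 0] + rng.normal(size=n)   # endogeneity via correlated errors
beta = np.array([1.0, -1.0])
y = X @ beta + eps

PZ = Z @ np.linalg.solve(Z.T @ Z, Z.T)     # projection onto the instruments

def ar_stat(b1, b2):
    e = y - X @ np.array([b1, b2])
    num = e @ PZ @ e
    den = (e @ e - num) / (n - k)          # homoskedastic variance estimate
    return num / den

# subvector AR: minimize over the nuisance coefficient b2
ar_min = minimize_scalar(lambda b2: ar_stat(1.0, b2),
                         bounds=(-10, 10), method="bounded").fun
crit = chi2.ppf(0.95, df=k - 1)            # chi-square critical value, k - 1 df
print(round(ar_min, 2), "vs critical value", round(crit, 2))
```

Because this critical value ignores how well identified the nuisance coefficient is, the test is conservative; the paper's conditional critical values adapt to that identification strength and restore power.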
JEL Classifications
  • C12 - Hypothesis Testing: General
  • C26 - Instrumental Variables (IV) Estimation