Applied Financial Econometrics
Chapter 6 Linear Regression with Multiple Regressors
6.8 Control Variables and Conditional Mean Independence
A control variable is not the object of interest in the study; rather, it is a
regressor included to hold constant factors that, if neglected, could lead
the estimated causal effect of interest to suffer from omitted variable bias.
The reason for including control variables in multiple regression is to make
the variables of interest no longer correlated with the error term once the
control variables are held constant. This idea is made precise by replacing
the first assumption of the multiple regression model with an assumption
called conditional mean independence. Conditional mean independence
requires that the conditional expectation of ui, given the variables of
interest and the control variables, does not depend on (is independent of)
the variables of interest, although it can depend on the control variables.
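Written out in the same notation used in these notes (with the X's the
variables of interest and the W's the control variables), the condition is

E(ui | X1i, ..., Xki, W1i, ..., Wri) = E(ui | W1i, ..., Wri)

that is, the conditional mean of the error term given the X's and W's
depends only on the W's.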
The idea of conditional mean independence is that once you control for
the W's, the X's can be treated as if they were randomly assigned, in the
sense that the conditional mean of the error term no longer depends on X.
Controlling for W makes the X's uncorrelated with the error term, so OLS
can estimate the causal effect on Y of a change in each of the X's. The
control variables, however, remain correlated with the error term, so the
coefficients on the control variables are subject to omitted variable bias
and do not have a causal interpretation. In other words, if conditional
mean independence holds, then the OLS estimators of the coefficients on
the X's are unbiased estimators of the causal effects of the X's on Y, but
the OLS estimators of the coefficients on the W's are in general biased.
This bias does not pose a problem, because we are interested in the
coefficients on the X's, not on the W's. Under the additional assumptions
2-4 of the multiple regression model, the OLS estimators are consistent
and have a normal distribution in large samples.
Chapter 12 Instrumental Variables Regression
Several problems can arise in a regression that make the error term
correlated with the regressor, including omitted variables, errors in
variables, and simultaneous causality. Omitted variable bias can be
addressed directly by including the omitted variable in a multiple
regression, but this is only feasible if you have data on the omitted
variable. And sometimes, such as when causality runs both from X to Y
and from Y to X so that there is simultaneous causality bias, multiple
regression simply cannot eliminate the bias. Instrumental variables (IV)
regression is a general way to obtain a consistent estimator of the
unknown causal coefficients when the regressor, X, is correlated with the
error term, u. To understand how IV regression works, think of the
variation in X as having two parts: one part that, for whatever reason, is
correlated with u (this is the part that causes the problems) and a second
part that is uncorrelated with u. If you had information that allowed you to
isolate the second part, you could focus on those variations in X that are
uncorrelated with u and disregard the variations in X that bias the OLS
estimates. This is, in fact, what IV regression does. The information about
the movements in X that are uncorrelated with u is gleaned from one or
more additional variables, called instrumental variables or
instruments.
12.1 The IV Estimator with a Single Regressor and a Single Instrument
If X and u are correlated, the OLS estimator is inconsistent; that is, it may
not be close to the true value of the causal coefficient even when the
sample is very large.
The two conditions for a valid instrument:
1. Instrument relevance: corr(Zi, Xi) ≠ 0
2. Instrument exogeneity: corr(Zi, ui) = 0
An instrument that is relevant and exogenous can capture
movements in Xi that are exogenous.
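A quick way to see why both conditions matter (a standard derivation,
sketched here in the same notation): exogeneity means cov(Zi, ui) = 0, so in
the population regression Yi = β0 + β1Xi + ui,

cov(Zi, Yi) = cov(Zi, β0 + β1Xi + ui) = β1 cov(Zi, Xi).

Relevance means cov(Zi, Xi) ≠ 0, so dividing through gives
β1 = cov(Zi, Yi) / cov(Zi, Xi). Replacing the population covariances with
sample covariances gives the IV estimator discussed below.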
If the instrument Z satisfies the conditions of instrument relevance
and exogeneity, the coefficient β1 can be estimated using an IV
estimator called two stage least squares (TSLS).
The first stage decomposes X into two components: a problematic
component that may be correlated with the regression error and another,
problem-free component that is uncorrelated with the error. The first stage
begins with a population regression linking X and Z.
Xi = π0 + π1Zi + vi
The component π0 + π1Zi is the part of Xi that is uncorrelated with the error term.
The other component vi is the problematic component of Xi that is
correlated with the error term.
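In the second stage, the first-stage regression is estimated by OLS and
the fitted values of Xi (π̂0 + π̂1Zi, with π̂0 and π̂1 the first-stage OLS
estimates) are computed; these fitted values isolate the problem-free
component of Xi. Yi is then regressed by OLS on the fitted values, and
the coefficient on the fitted Xi from this second-stage regression is the
TSLS estimator of β1.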
Although the two stages of TSLS make the estimator seem complicated,
when there is a single X and a single instrument Z, as we assume in this
section, there is a simple formula for the TSLS estimator. Let sZY be the
sample covariance between Z and Y and let sZX be the sample covariance
between Z and X. The TSLS estimator of β1 is then the ratio of these two
sample covariances, sZY / sZX.
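A minimal numerical sketch of this formula (not from the text; the
data-generating values below are made-up assumptions chosen only so that X
is correlated with u and Z is a valid instrument), written in Python with NumPy:

import numpy as np

# Hypothetical simulated data: u is correlated with X (endogeneity),
# Z is relevant (correlated with X) and exogenous (uncorrelated with u).
rng = np.random.default_rng(0)
n = 100_000
z = rng.normal(size=n)                        # instrument
u = rng.normal(size=n)                        # regression error
x = 0.8 * z + 0.5 * u + rng.normal(size=n)    # endogenous regressor
beta0, beta1 = 1.0, 2.0                       # assumed true causal coefficients
y = beta0 + beta1 * x + u

# OLS slope cov(X, Y) / var(X) is inconsistent because corr(X, u) != 0.
beta1_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# Single-instrument TSLS: ratio of sample covariances, sZY / sZX.
beta1_tsls = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"OLS estimate:  {beta1_ols:.3f}")   # biased away from 2.0
print(f"TSLS estimate: {beta1_tsls:.3f}")  # close to 2.0

With these made-up values, the OLS slope is pulled away from the true
coefficient of 2.0 by the correlation between X and u, while the
covariance-ratio TSLS estimate is close to 2.0.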