Time-Series Misspecification / independent variable is measured with error

Referring to CFAI Vol 1, p. 388:

If we consider a univariate regression (only one independent variable) and the independent variable is measured with error, the estimated coefficient will be biased toward 0. Could anybody explain why the estimated beta will be biased toward zero?

If the independent variable is measured with error, the set of values for the independent variable will tend to be more spread out (i.e., have a larger standard deviation) than the values measured without error. If you have the same y values but your x values are more spread out, the regression line will be flatter; i.e., the absolute value of the slope will be smaller (closer to zero). Put another way, the OLS slope is cov(x, y) / var(x): classical measurement error inflates var(x) without adding anything to cov(x, y), so the ratio shrinks toward zero.
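A quick simulation makes this concrete. The following is a minimal sketch, assuming classical measurement error that is independent of the true regressor; the true slope of 2, the unit variances, and the variable names are all illustrative, not from the curriculum.

```python
import numpy as np

# Simulate attenuation bias from classical errors-in-variables.
# True model: y = 2 * x_true + noise; we then regress y on a noisy x.
rng = np.random.default_rng(42)
n = 100_000

x_true = rng.normal(0.0, 1.0, n)             # true regressor, sd = 1
y = 2.0 * x_true + rng.normal(0.0, 1.0, n)   # true slope = 2

meas_err = rng.normal(0.0, 1.0, n)           # measurement error, sd = 1
x_obs = x_true + meas_err                    # observed (mismeasured) regressor

# OLS slope = cov(x, y) / var(x)
slope_clean = np.cov(x_true, y)[0, 1] / np.var(x_true, ddof=1)
slope_noisy = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)

print(f"slope using true x:  {slope_clean:.3f}")   # close to 2.0
print(f"slope using noisy x: {slope_noisy:.3f}")   # close to 1.0, pulled toward zero
```

With these illustrative numbers, the attenuation factor is var(x_true) / (var(x_true) + var(error)) = 1 / (1 + 1) = 1/2, so the estimated slope using the noisy regressor comes out near 1 instead of the true 2.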

This makes a lot of sense.

Thanks S2000!

Sometimes I do that. It’s usually accidental.

You’re quite welcome.