If we consider a univariate regression (only one independent variable) in which the independent variable is measured with error, the estimated coefficient will be biased toward 0. Could anybody explain why the estimated beta is biased toward zero?
If the independent variable is measured with error, the observed x values will tend to be more spread out (i.e., have a larger variance) than the values measured without error. The measurement error, however, is unrelated to y, so it inflates Var(x) without adding anything to Cov(x, y). Since the estimated slope in a univariate regression is Cov(x, y) / Var(x), the same y values paired with more spread-out x values give a flatter regression line; i.e., the absolute value of the slope is smaller (closer to zero). Under classical measurement error the slope is attenuated by the factor Var(x_true) / (Var(x_true) + Var(error)).
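Here is a small simulation that illustrates the attenuation. It is only a sketch: the true slope, the noise levels, and the helper ols_slope are arbitrary choices for illustration, assuming classical measurement error (noise independent of the true x and of y).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100_000
beta = 2.0       # true slope (illustrative)
sigma_x = 1.0    # sd of the true regressor
sigma_u = 1.0    # sd of the measurement error (classical: independent of x and y)

x_true = rng.normal(0.0, sigma_x, n)          # regressor without error
y = beta * x_true + rng.normal(0.0, 1.0, n)   # outcome generated from the true x
x_obs = x_true + rng.normal(0.0, sigma_u, n)  # regressor measured with error

def ols_slope(x, y):
    """Univariate OLS slope: Cov(x, y) / Var(x)."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

print("slope using true x:   ", ols_slope(x_true, y))   # close to 2.0
print("slope using noisy x:  ", ols_slope(x_obs, y))    # close to 1.0
print("predicted attenuation:", beta * sigma_x**2 / (sigma_x**2 + sigma_u**2))
```

With equal variances for the true x and the measurement error, the attenuation factor is 1/2, so the slope estimated from the noisy x comes out near 1.0 instead of the true 2.0, matching the Cov/Var argument above.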