Standard error for different variables

I just finished Reading 9 and I am having trouble wrapping my head around the fact that you can have different standard errors for different variables within the same regression. For instance, you can have a standard error for the intercept coefficient that is different from the standard error for the slope coefficient. The formula given to calculate the standard error involves both variables of the regression. Can someone please set me straight before I pull out all of my hair? Thanks!

If you’re referring to the specific calculation, I wouldn’t worry about understanding it to any real level of detail for the CFA exams. If you want to understand it better for your own purposes, do a search for the derivation of the variances or standard errors of those coefficients. Walking through the derivation will let you see why these specific standard errors are different. However, the most I think you need for the CFA exam is to understand that each estimate has its own standard error (which likely has a different calculation).

If you were just asking why each estimated coefficient can have its own standard error, then the answer is very straightforward. Each parameter is estimated, and these estimates have uncertainty surrounding them. They each have a standard error to quantify the uncertainty/precision of the estimate. In other words, a different parameter estimate means a different standard error for that estimate.
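For anyone who wants to see the point above concretely, here's a quick sketch of a simple linear regression worked by hand. The data are made up purely for illustration. Note how the slope and intercept each get their own standard error formula, and both are distinct from the standard error of estimate (SEE), which describes the model as a whole:

```python
import math

# Hypothetical sample data (made up for illustration)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Sums of squares and cross-products
sxx = sum((xi - x_bar) ** 2 for xi in x)
sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

b1 = sxy / sxx            # slope estimate
b0 = y_bar - b1 * x_bar   # intercept estimate

# Residual variance is SSE / (n - 2); its square root is the
# standard error of estimate (SEE) -- a model-level measure.
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))

# Each coefficient has its own standard error formula:
se_b1 = s / math.sqrt(sxx)
se_b0 = s * math.sqrt(1.0 / n + x_bar ** 2 / sxx)

print(f"SEE    = {s:.4f}")
print(f"SE(b0) = {se_b0:.4f}")
print(f"SE(b1) = {se_b1:.4f}")
```

The two coefficient standard errors come out different because they are built from different pieces of the data (the intercept's formula involves the mean of x, while the slope's does not), even though both are scaled by the same residual standard error s.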

Thanks, ticker. I think I've got it now. I was getting near the end of a particularly long day of studying and my brain had quit processing information. I see now that the standard error of the estimate reflects the uncertainty of the regression model as a whole, and then there is the uncertainty of the estimate of a particular parameter within the model. For some reason my brain was trying to tell me these should be the same thing.

Glad to help!