Coefficients for Alpha and Beta for Hypothesis Testing

Ok, example 7 from Elan (pretty similar to the CFAI example). I don’t understand why suddenly there are coefficients and standard errors for alpha and beta. They just give them to us in the question…how did they get calculated? Where are they from and what do they mean?

I don’t understand why we go from having the one coefficient with the SSE and the SEE to two. What am I missing here?

Could you be more specific, or could you provide the CFAI page for the example?

The calculations are fairly complex and well beyond the scope of the CFA curriculum.

Suffice it to say that if you have a software package that does linear regression it’ll give you all of those numbers, but it’s up to you to interpret them.

Ok well…I guess where I'm getting confused is: what is the difference between the SEE and the standard errors for the intercept and slope? I thought that the SEE would be the standard error for the slope, but clearly that's not true.

Are we always going to be given full regression outputs?

The SEE is the standard error of the estimate: think of it as an estimated standard deviation of the error term in the regression.

The standard error of a coefficient is an estimate for the standard deviation of the sampling distribution pertaining to that estimator. In other words, the standard error of the slope estimator, for example, gives us an idea of how precise our estimate of the slope is in our regression (larger s.e. of coefficient, more uncertainty).

Also, these standard errors incorporate the SEE into their calculation. For example, the standard error of the slope in simple linear regression is actually SEE / √(SSTx), where SEE is the standard error of the estimate and SSTx = Σ(x − x̄)² is the total sum of squares for the independent variable x. The standard error of the intercept is a little different, but you get the idea.

Long story short, SEE gives us insight into the accuracy of our model predictions, and the standard error for an estimated coefficient gives us information on how precisely we have estimated the coefficient.
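To make that concrete, here's a quick sketch in Python (the data are made up purely for illustration) that computes both the SEE and the standard error of the slope from scratch:

```python
import math

# Made-up toy data
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n

# OLS slope and intercept
sxx = sum((xi - xbar) ** 2 for xi in x)  # SSTx: sum of (x - xbar)^2
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar

# Residuals, SSE, and SEE (note the n - 2 degrees of freedom)
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
sse = sum(e ** 2 for e in residuals)
see = math.sqrt(sse / (n - 2))

# Standard error of the slope: SEE / sqrt(SSTx)
se_slope = see / math.sqrt(sxx)
print(f"slope={b1:.4f}, SEE={see:.4f}, SE(slope)={se_slope:.4f}")
```

So the SEE summarizes the scatter of the points around the fitted line, and the slope's standard error rescales that scatter by how spread out the x values are.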

Hopefully, this helps clear things up a little bit.

It’s clearing up a little. Let me see how close I am.

The SEE is the error term that shows the standard deviation of a data point from the line of best fit. This means that if the correlation coefficient is 1, then the error term must be 0, because all the data points lie on the line? (Fingers crossed.)

The standard errors of the slope and intercept coefficients represent how accurately those coefficients have been estimated, then? So if I had a very large SEE, that would mean the data points are often far away from the line of best fit, but maybe I could still have a low standard error for the coefficient: we're confident that the line has a certain slope, but the data still have a large variance around that line. But then again, you said SEE is used in the calculation of the standard errors of the coefficients, so if I had a large SEE, shouldn't I also have a large standard error?

I don’t recall ever seeing that standard error formula in this reading. Am I missing it, or is the standard error of the coefficients always provided in the question?

Edit: And just when I think I have it all figured out they throw another error term in there.

The variance of the prediction error…what is this? Is it a combination of the uncertainty in the error term as well as the uncertainty in the parameters? Do I have to memorize that big formula or is there a way to reconstruct it conceptually?

Yes. Similarly if ρ = −1.

Yes: the standard errors for the regression coefficients tell us, in a manner of speaking, how much confidence we have that the coefficient numbers are correct.
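And your intuition about a large SEE with a small coefficient standard error is sound: because SE(slope) = SEE / √(Σ(x − x̄)²), the denominator grows with the number and spread of observations, so very noisy data can still pin down the slope precisely. A quick simulation sketch (all numbers here are made up for illustration):

```python
import math
import random

random.seed(0)

def slope_se(x, y):
    """Return (SEE, standard error of the slope) for a simple OLS fit."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    see = math.sqrt(sse / (n - 2))
    return see, see / math.sqrt(sxx)

# Very noisy data (error sd = 5), but lots of points
n = 2000
x = [random.uniform(0, 10) for _ in range(n)]
y = [1 + 2 * xi + random.gauss(0, 5) for xi in x]

see, se_b1 = slope_se(x, y)
# see comes out large (near 5), while se_b1 is tiny: the SEE is
# divided by sqrt(sum of squared x-deviations), which grows with n.
```

So both statements hold at once: the points scatter widely around the line, yet the slope itself is estimated quite precisely.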

As I wrote above, it’s beyond the scope of the CFA curriculum: you’re not missing it; it’s not there.

The standard errors of the coefficients will be given to you. μη φοβου (fear not).

Yes: it’s a bit of both.

I don’t know of a way to reconstruct it, but I haven’t really tried.

As for having to memorize it: only if it appears on the exam in June.
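For the record, the simple-regression version of that formula is s_f² = SEE² × [1 + 1/n + (X − X̄)² / ((n − 1)s_x²)], and the pieces line up with the "bit of both" answer: the 1 is the uncertainty from the error term itself, while 1/n and the (X − X̄)² term come from the uncertainty in the estimated intercept and slope. A sketch in Python (data made up for illustration):

```python
import math

# Made-up data and forecast point, just to show the pieces of the formula
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_f = 6.0  # value of X at which we forecast Y

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar

# SEE^2: estimated variance of the regression's error term
see2 = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2)

# Variance of the forecast error:
#   SEE^2 * [ 1                      <- uncertainty in the error term
#           + 1/n + (x_f - xbar)^2 / sxx ]  <- uncertainty in b0 and b1
# Note (n - 1) * (sample variance of x) = sxx, so both common forms agree.
var_forecast = see2 * (1 + 1 / n + (x_f - xbar) ** 2 / sxx)
se_forecast = math.sqrt(var_forecast)
```

One takeaway: the forecast variance is always at least SEE², and it grows the farther x_f is from x̄, which is why predictions far from the center of your data are less reliable.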

Thank you so much guys! I greatly appreciate your lengthy answers. You’ve saved this reading for me, lol.

This reading is difficult because I do my best when I can understand the concepts and reconstruct any formulas from there. But this reading has so many frickin acronyms (SEE, SSE, MSE, RSS, etc.) that it’s difficult for me to understand what each one means and, more importantly, how they’re different from each other. I think I’ve got it now though.

S2000 I know you’re with Elan now. I’m doing an Elan question for this reading where the slope coefficient falls within the 99% confidence interval, and the answer is that the analyst is most likely to conclude that the coefficient is equal to 1. Would I be right to say that the answer should really be that the analyst cannot rule out that the coefficient is 1 at the 1% significance level?

You’re correct: that’s the right way to interpret it.