SEE

Could anybody confirm the following?

SEE = [SSE/(n-2)]^0.5 = MSE^0.5 = [sum from i = 1 to n of (y_i - Y_i hat)^2 / (n-2)]^0.5

I would add that MSE^0.5 = RMSE

So, SEE = RMSE?

Someone correct me if I'm wrong, but I think the difference here is the number of variables in the regression. To complicate your formula further:

[SSE/(n-k-1)]^0.5 = MSE^0.5 = RMSE = SEE

If you have a simple linear regression (where k = 1), then SSE/(n-2) is applicable. Why they even introduce SEE as [SSE/(n-2)]^0.5 instead of the general form seems counterproductive to me.

Yes indeed, it certainly is.

So:

SEE = [SSE/(n-k-1)]^0.5 = MSE^0.5 = [sum from i = 1 to n of (y_i - Y_i hat)^2 / (n-k-1)]^0.5 = RMSE
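A quick numeric sketch of the identity above, using made-up data and NumPy (the data and the `polyfit` fit are just for illustration):

```python
import numpy as np

# Hypothetical data for a simple linear regression (k = 1 predictor)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n, k = len(y), 1
slope, intercept = np.polyfit(x, y, 1)   # OLS fit
y_hat = slope * x + intercept            # predicted Y values

sse = np.sum((y - y_hat) ** 2)           # sum of squared residuals
mse = sse / (n - k - 1)                  # residual mean square
see = np.sqrt(mse)                       # standard error of estimate
rmse = np.sqrt(sse / (n - k - 1))        # same quantity under a different name

print(see, rmse)                         # identical by construction
```

With k = 1 the divisor n - k - 1 reduces to n - 2, which is why the simple-regression formula is just a special case.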

Now explain what SEE/RMSE and SSE mean :slight_smile:

SSE is the sum of squared errors/residuals, which is: sum from i = 1 to n of (y_i - Y_i hat)^2. Interpretation on its own: can't really say :(

SEE is the standard error of estimate, and it measures the accuracy of the regression - the lower the SEE, the better. I would also say it is a kind of dispersion of the observations around the fitted regression line?!

this is how I interpreted it:

MSE is (roughly) the variance of the errors - SSE divided by the degrees of freedom

SEE is the standard deviation of the errors

The notable difference being that SSE is just the raw sum of the squared differences between the actual Y values and the predicted Y values, with no division by the degrees of freedom

Not sure if that's fully correct, but it's the only way I can make sense of it in my mind.
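That mental model can be checked numerically: if you simulate errors with a known standard deviation, SEE should land close to it, while SSE just keeps growing with n. A sketch with simulated data (the model y = 2 + 3x and the noise level 1.5 are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: y = 2 + 3x plus noise with true error std 1.5
n, k = 500, 1
x = rng.uniform(0, 10, n)
y = 2 + 3 * x + rng.normal(0, 1.5, n)

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

sse = np.sum(residuals ** 2)    # a raw sum: grows with n, not a variance
mse = sse / (n - k - 1)         # estimates the variance of the errors (~ 1.5^2)
see = np.sqrt(mse)              # estimates the std dev of the errors (~ 1.5)
```

So SEE is the dispersion of the observations around the fitted line, in the same units as y, which is what makes it interpretable where SSE is not.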

O.K. - thanks Galli