Just started studying and already confused, not a good sign. Just a quick Q: what is the difference between R squared, SSR, and SST?
R^2 = SSR/SST
SSR = Sum of Squares Regression = Explained Sum of Squares
SST = Total Sum of Squares = sum((yi - ybar)^2)
Here is a good description: http://en.wikipedia.org/wiki/Coefficient_of_determination

Quick summary: R squared is a measure of goodness of fit. It is defined as the portion of the variance of the dependent variable that is explained by the best fit (a linear function of the independent variable). R squared = 1 - SSR/SST, where SSR = residual sum of squares, SST = total sum of squares.
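A minimal numeric sketch of that formula (with made-up data points, using numpy's `polyfit` for the best-fit line), where SSR here means the *residual* sum of squares:

```python
import numpy as np

# Hypothetical data, roughly y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit y = b0 + b1*x by ordinary least squares
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

ssr = np.sum((y - y_hat) ** 2)       # residual sum of squares
sst = np.sum((y - y.mean()) ** 2)    # total sum of squares
r_squared = 1 - ssr / sst            # close to 1 here, since the points lie near a line
```

With different (noisier) data the same code would give a lower R squared.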
I used SSR as Sum of squares regression - which was wrong. Maratikus is right… it is sum of squares residual…
I should have been clearer: I am not looking for mathematical formulae, I want to understand conceptually what the differences are. So R squared is the variation in the dependent variable explained by the independent variable. But isn't that the same definition as SSR?
R^2 - the percent of variation in the dependent var explained by the independent vars
SSR - ??
RSS - the total amount (rather than percent) of variation in the dependent var explained by the independent vars
SSE - the total amount of variation in the dependent var unexplained by the independent vars
SST - the total amount of variation in the dependent var (explained and unexplained)
Therefore it should make sense that R^2 = RSS/SST
SSR and RSS are the same thing, I think (need to check my notes); other than that, that's exactly what I was looking for, marty3. THANKS a lot.
cpk123 Wrote:
> I used SSR as Sum of squares regression - which was wrong. Maratikus is right… it is sum of squares residual…

cpk, both of us are correct. If SSR is the sum of squares regression, then we get your formula R squared = SSR/SST (which is correct). If SSR is the sum of squares residuals, then R squared = 1 - SSR/SST (which is also correct).
I agree with marty3. Also, just a note: I don't recall (at least not in Schweser) seeing the acronym SSR, unless you're referring to the Sum of Squared Errors/Residuals, which uses the acronym SSE, not SSR. RSS is the regression sum of squares.
Those are the acronyms they are referring to…
SSR/RSS - Sum of Squares Regression
SSE - Sum of Squared Errors/Residuals
TSS - Total Sum of Squares
RSS + SSE = TSS
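That decomposition can be checked numerically. A sketch with hypothetical data (numpy, OLS with an intercept, for which RSS + SSE = TSS holds exactly):

```python
import numpy as np

# Hypothetical data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.8, 2.6, 4.1, 5.5, 5.9])

# OLS fit with intercept
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

rss = np.sum((y_hat - y.mean()) ** 2)  # explained variation (regression SS)
sse = np.sum((y - y_hat) ** 2)         # unexplained variation (error/residual SS)
tss = np.sum((y - y.mean()) ** 2)      # total variation

assert np.isclose(rss + sse, tss)      # RSS + SSE = TSS
r_squared = rss / tss                  # the explained fraction
```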
R^2 is RSS/SST… but do we want a high or low R^2?
RSS = explained variation
SSE = unexplained variation
SST = total variation
I'm pretty sure we want a high R^2, is that correct?
A high R^2 indicates that the independent variable explains a larger portion of the variation in the dependent variable than the error term does. Or rather: b1*(X) would comprise a larger portion of Y than e would in the equation Y = b1*(X) + e. So yes, you would want a high R^2 when creating a regression equation.
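One way to see this is to simulate the same underlying line with small versus large error terms (a hedged sketch with invented parameters, Y = 2X + e): when e contributes less of Y's variation, R^2 comes out higher.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

def r_squared(y):
    """R^2 of a simple OLS fit of y on x."""
    b1, b0 = np.polyfit(x, y, 1)
    y_hat = b0 + b1 * x
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

y_low_noise = 2 * x + rng.normal(0, 0.5, x.size)    # small error term
y_high_noise = 2 * x + rng.normal(0, 10.0, x.size)  # large error term

# R^2 is near 1 for the low-noise data and much lower for the high-noise data
print(r_squared(y_low_noise), r_squared(y_high_noise))
```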
[Swahili] Hey man, how's it going? What's up? Where are you from? Kenya? TZ? Or?
I think I’m having a headache again! haha.