What is the difference between R^2 and regression sum of squares?

They both measure the variation in the dependent variable explained by the independent variable, yet they are calculated differently. I feel like this should be easy, but I don't get it.

The regression sum of squares (RSS), when calculated on its own, doesn't explain anything. It is merely a number. You must divide RSS by SST to make some sense out of it. The result is a percentage of SST, which represents the independent variable's explanatory power. I hope this helps.

The RSS is just the absolute amount of explained variation; R squared is RSS/SST, i.e. the explained variation as a proportion of total variation.

It’s just like saying Net income is an absolute figure, whereas ROE is Net Income as a proportion of Equity.
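To make the RSS-versus-R² distinction concrete, here is a minimal sketch that fits a simple OLS line by hand on made-up toy data (the x and y values are hypothetical, not from the thread) and computes SST, RSS, SSE, and R²:

```python
# Hypothetical toy data: x = independent variable, y = dependent variable.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# OLS slope and intercept for a single regressor.
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

SST = sum((yi - y_bar) ** 2 for yi in y)                # total variation
RSS = sum((yh - y_bar) ** 2 for yh in y_hat)            # explained (regression) variation
SSE = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))   # unexplained variation

r_squared = RSS / SST
print(f"RSS={RSS:.4f}  SST={SST:.4f}  R^2={r_squared:.4f}")
```

Note that RSS by itself is in squared units of y and depends on the scale of the data, while R² = RSS/SST is unitless, which is exactly the Net-income-versus-ROE point above.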

So I just want to confirm: I can get to R^2 two ways, by squaring the correlation coefficient or by looking at an ANOVA table that has RSS and SST?

Both valid methods.

A third: (SST - SSE)/SST (where SST - SSE = RSS).

However, you can only square the correlation coefficient when you have a single independent variable, while the other method always works.
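A quick sketch to check that all three routes agree in the single-regressor case (the data are hypothetical), comparing r², RSS/SST, and (SST - SSE)/SST:

```python
import math

# Hypothetical toy data with one independent variable.
x = [1, 2, 3, 4, 5, 6]
y = [1.2, 2.3, 2.9, 4.1, 4.8, 6.3]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

sxx = sum((xi - x_bar) ** 2 for xi in x)
syy = sum((yi - y_bar) ** 2 for yi in y)
sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

r = sxy / math.sqrt(sxx * syy)   # Pearson correlation coefficient

# Fit the OLS line and decompose the variation.
b1 = sxy / sxx
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

SST = syy
SSE = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
RSS = SST - SSE

# All three should print the same value for a single regressor.
print(round(r ** 2, 10), round(RSS / SST, 10), round((SST - SSE) / SST, 10))
```

With multiple independent variables there is no single correlation coefficient to square, which is why only the sums-of-squares routes generalize.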