I am having issues calculating the standard deviation (stdev) from the variance when it is quoted as a percentage rather than as a decimal. The definition is stdev = sqrt(var), but the result comes out very differently depending on whether the input is a percentage figure (>1) or an actual decimal value (<1).
For example, take var = 0.09 (9%). Calculating from the decimal value gives sqrt(0.09) = 0.3 (30%), but calculating from the percentage figure gives sqrt(9) = 3, i.e. 3%. These results are wildly different, which is very confusing.
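To make the discrepancy concrete, here is a minimal sketch in plain Python (the numbers are just the ones from the example above). The underlying issue is one of units: variance carries squared units, so the two computations implicitly assume two different variances.

```python
import math

# Variance quoted as a decimal: 0.09, i.e. 9% when displayed as a percentage.
print(math.sqrt(0.09))        # 0.3  -> 30%

# Taking the square root of the percentage figure "9" instead:
print(math.sqrt(9))           # 3.0  -> read off as "3%"

# Variance carries squared units, so these are two DIFFERENT variances:
# 9 percent-squared is 9 / 100**2 = 0.0009 in decimal units...
print(math.sqrt(9 / 100**2))  # 0.03 -> 3%, the "percentage" answer
# ...while 0.09 in decimal units is 0.09 * 100**2 = 900 percent-squared.
print(math.sqrt(900))         # 30.0 -> 30%, the "decimal" answer
```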
Also, I have found that both conventions appear in the prep material, and this makes no sense to me! I think it is better to use decimal values, but I would really appreciate your opinion and, if possible, an explanation of why both conventions are used.
This is one reason that you should never write 9 when you mean 9%.
Finance people – lazy, sloppy finance people – do this frequently, and it’s infuriating, because they create problems of this sort unnecessarily.
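A minimal sketch of the workflow this implies, reusing the 0.09 example from the question: store every quantity as a decimal, apply stdev = sqrt(var) in decimal units, and convert to a percentage only when formatting the output.

```python
import math

# Keep everything in decimal units internally: a 9% variance is 0.09, never 9.
var = 0.09
stdev = math.sqrt(var)   # 0.3

# Convert to a percentage only at the display step.
print(f"{stdev:.0%}")    # prints "30%"
```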
So what do you think - should I always calculate from actual decimal values, meaning 0.09?
I come from a mathematical background and I know this is the right way, but it is frustrating to see examples that work with percentage values; I got the wrong result because of that. It only adds unnecessary confusion, and I don't want to have to think about it on exam day.