Standard deviation vs. mean absolute deviation

Conceptually, what does standard deviation tell us as a measure of dispersion for a given data set vs. mean absolute deviation? Aren’t they both valuable measures of dispersion? As far as I can see, standard deviation is just a more complicated way of getting to the same conclusion. What am I missing?

Very good question. I guess it's a bit like arithmetic mean versus geometric mean. MAD is the average of the absolute deviations, while standard deviation is the square root of the average of the squared deviations. Standard deviation is probably the more accurate dispersion measure. To check this, you would need a ton of random samples: compute both MAD and standard deviation on each sample, see how stable each estimate is across samples, and conclude which one (most probably standard deviation) is the more accurate measure. Hope this helps; if someone knows better, please add your ideas. Regards, Sreeni
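The simulation Sreeni describes can be sketched directly. This is a hypothetical Monte Carlo experiment (sample size, trial count, and the coefficient-of-variation comparison are my choices, not from the thread): draw many normal samples, compute MAD and standard deviation on each, and compare how much each estimator fluctuates relative to its own mean.

```python
import random
import statistics

def mad(xs):
    # Mean absolute deviation from the sample mean.
    m = statistics.fmean(xs)
    return statistics.fmean(abs(x - m) for x in xs)

random.seed(0)
n, trials = 50, 2000          # assumed sample size and number of repetitions
mads, sds = [], []
for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(n)]
    mads.append(mad(sample))
    sds.append(statistics.stdev(sample))

# Coefficient of variation: spread of the estimates relative to their mean.
# A smaller CV means the estimator is more stable from sample to sample.
cv_mad = statistics.stdev(mads) / statistics.fmean(mads)
cv_sd = statistics.stdev(sds) / statistics.fmean(sds)
print(f"CV of MAD estimates: {cv_mad:.4f}")
print(f"CV of SD estimates:  {cv_sd:.4f}")
```

For normally distributed data, classical theory (the Fisher–Eddington comparison) says the standard deviation is the more efficient estimator, so its CV should come out somewhat smaller here; for heavy-tailed data the comparison can flip.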

I guess conceptually you are right. However, stdev is referenced more often because it is much simpler to manipulate mathematically than MAD: it does not involve absolute values, which are awkward to differentiate.
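The differentiability point can be made concrete. Minimizing the sum of squared deviations over a location parameter $\mu$ gives a smooth derivative:

```latex
\frac{d}{d\mu}\sum_{i=1}^{n}(x_i-\mu)^2 \;=\; -2\sum_{i=1}^{n}(x_i-\mu),
```

which is zero exactly at $\mu=\bar{x}$, so the squared-deviation criterion yields closed-form results. By contrast, $|x_i-\mu|$ has a kink at $\mu=x_i$ where the derivative does not exist, which is why absolute-value criteria are harder to work with analytically.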

I think it's like comparing a weighted mean and a simple mean and asking which one is better. MAD is simple and gives equal weight to every deviation, while stdev gives more weight to larger deviations, and that is why it is often preferred. Work through a couple of examples with large deviations and it will become clearer.
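Here is one such example, with made-up data chosen to isolate the weighting effect: two data sets with the same total absolute deviation, one spread evenly and one concentrated in two large deviations. MAD cannot tell them apart, but the standard deviation grows when the deviations are concentrated, because squaring up-weights the large ones.

```python
import statistics

def mad(xs):
    # Mean absolute deviation from the sample mean.
    m = statistics.fmean(xs)
    return statistics.fmean(abs(x - m) for x in xs)

a = [-1, -1, 1, 1]   # four deviations, each of size 1
b = [-2, 0, 0, 2]    # same total absolute deviation, concentrated in two points

print(mad(a), statistics.pstdev(a))   # 1.0  1.0
print(mad(b), statistics.pstdev(b))   # 1.0  ~1.414
```

Both sets have MAD = 1, but the population standard deviation of `b` is √2 ≈ 1.414 versus 1.0 for `a`, showing how stdev penalizes a few large deviations more than many small ones.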