The standard error formula is SE = std / sqrt(n),
but for autocorrelation the standard error formula is SE = 1 / sqrt(n).
Why are we not considering the standard deviation when calculating the standard error of the autocorrelation?
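For concreteness, this is the comparison I have in mind (a minimal Python sketch; the data and the number of observations are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.standard_normal(200)  # made-up series with n = 200 observations
n = len(x)

se_mean = x.std(ddof=1) / np.sqrt(n)  # usual standard error: std / sqrt(n)
se_acf = 1 / np.sqrt(n)               # standard error quoted for autocorrelations

print(se_mean)  # depends on the sample's standard deviation
print(se_acf)   # 1/sqrt(200), no standard deviation appears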
Thanks
The standard deviation for the autocorrelation is 1, so it is still in the formula. Under the null hypothesis that the series has no autocorrelation (white noise), the sample autocorrelation at a given lag is approximately normally distributed with mean 0 and variance 1/n, which means sqrt(n) times the autocorrelation has standard deviation 1. Plugging std = 1 into std / sqrt(n) gives 1 / sqrt(n).
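You can check this by simulation (a minimal sketch, assuming the white-noise null holds; the function and parameter names are my own): the empirical standard deviation of the lag-1 sample autocorrelation comes out very close to 1/sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500        # length of each simulated series
reps = 10_000  # number of white-noise series to simulate

def lag1_autocorr(x):
    """Standard sample autocorrelation estimator at lag 1."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Lag-1 autocorrelation of many independent white-noise series
r1 = np.array([lag1_autocorr(rng.standard_normal(n)) for _ in range(reps)])

print("empirical std of r1:", r1.std())        # close to 1/sqrt(n)
print("1/sqrt(n):          ", 1 / np.sqrt(n))
```

The two numbers agree up to simulation noise, which is just the statement that the "std" in std / sqrt(n) equals 1 here.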
Does it matter whether the autocorrelation is at one lag or two lags? That is, does the 1 become a 2 in the case of a two-lag autocorrelation?