CFAI says “a series that trends upward or downward over time often has a unit root.”
This statement is not intuitive to me. When I see a time series that trends upward or downward, my first instinct is that the errors are autocorrelated and the series is covariance nonstationary, so it should be tested for serially correlated errors. It would not occur to me that the series has a unit root.
Why should we automatically think a series likely has a unit root if it trends upwards or downwards?
If a series is covariance nonstationary, does it mean that it must have a unit root? Is it possible for a series to be covariance nonstationary while not having a unit root?
If a series has a unit root, is it the case that it must be a random walk? Can a series have a unit root and not be a random walk?
It certainly _can_ have a unit root, but it could also have b1 = 1.1. I see no reason that the former should be more likely than the latter.
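For what it's worth, here's a quick simulation I threw together (my own toy parameters, seed, and intercept b0 = 0.5, not anything from the curriculum): one AR(1) series y_t = b0 + b1*y_(t-1) + e_t with b1 = 1 and another with b1 = 1.1. Both trend upward, so the trend by itself can't tell you whether you're looking at a unit root or an explosive root.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def simulate_ar1(b0, b1, n=100, sigma=1.0):
    """Simulate y_t = b0 + b1 * y_{t-1} + e_t with e_t ~ N(0, sigma^2), y_0 = 0."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = b0 + b1 * y[t - 1] + rng.normal(0.0, sigma)
    return y

unit_root = simulate_ar1(b0=0.5, b1=1.0)   # b1 = 1: random walk with drift (unit root)
explosive = simulate_ar1(b0=0.5, b1=1.1)   # b1 = 1.1: explosive root, not a unit root

# Both series march steadily upward; the trend alone doesn't say which is which.
print("b1 = 1.0, last 5 values:", np.round(unit_root[-5:], 1))
print("b1 = 1.1, last 5 values:", np.round(explosive[-5:], 1))
```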
Why would having a trend induce you to think of nonzero serial correlation? You certainly cannot have all of your errors positive (or all negative), so I don’t see how positive serial correlation would have anything to do with a trend.
No reason that I can imagine.
It’s a possibility, to be sure, but not remotely the only (or even the most likely) possibility, as far as I can tell.
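On your covariance-nonstationarity question: a trend-stationary series is the standard counterexample. Here's a sketch (my own made-up numbers, nothing from CFAI): y_t = a + b*t + e_t with white-noise errors has a mean that grows with t, so it is covariance nonstationary, yet it has no unit root at all; removing the deterministic trend (rather than differencing) leaves ordinary stationary noise.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Trend-stationary series: y_t = a + b*t + e_t, with a = 2, b = 0.3 and
# white-noise errors (parameters chosen purely for illustration).
n = 200
t = np.arange(n)
y = 2.0 + 0.3 * t + rng.normal(0.0, 1.0, size=n)

# The mean clearly depends on t, so the series is covariance nonstationary...
print("mean of first 50 obs:", round(y[:50].mean(), 1))
print("mean of last 50 obs: ", round(y[-50:].mean(), 1))

# ...but subtracting the deterministic trend leaves ordinary stationary noise,
# so the series is nonstationary without having a unit root.
detrended = y - (2.0 + 0.3 * t)
print("detrended mean / std:", round(detrended.mean(), 2), "/", round(detrended.std(), 2))
```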
I think I’m confusing serial correlation with a unit root. They seem fairly similar to me: both seem to indicate nonstationarity (I’ve tried to untangle the two in the sketch below).
When you observe a time series with errors that trend upward or downward, rather than errors that fluctuate around a constant mean, what would be your first thought?
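Here's the sketch I mentioned (my own toy numbers): a stationary AR(1) with b1 = 0.8 has strongly serially correlated observations, yet it keeps fluctuating around its long-run mean, whereas a unit root is the special case b1 = 1, where the series stops mean-reverting altogether. So serial correlation by itself doesn't have to mean nonstationarity, if I'm reading this right.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Stationary AR(1): y_t = 1.0 + 0.8 * y_{t-1} + e_t.  The observations are
# serially correlated, but the series keeps reverting to its long-run mean
# of 1.0 / (1 - 0.8) = 5, so it is covariance stationary (no unit root).
n = 1000
y = np.zeros(n)
for t in range(1, n):
    y[t] = 1.0 + 0.8 * y[t - 1] + rng.normal()

lag1_autocorr = np.corrcoef(y[:-1], y[1:])[0, 1]
print("lag-1 autocorrelation:", round(lag1_autocorr, 2))   # close to 0.8
print("sample mean:", round(y.mean(), 1))                  # hovers near 5
```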