
Autoregressive Moving Average (ARMA): Sunspots data

In [1]:
%matplotlib inline

from __future__ import print_function
import numpy as np
from scipy import stats
import pandas as pd
import matplotlib.pyplot as plt

import statsmodels.api as sm
In [2]:
from statsmodels.graphics.api import qqplot

Sunspots Data

In [3]:
print(sm.datasets.sunspots.NOTE)
::

    Number of Observations - 309 (Annual 1700 - 2008)
    Number of Variables - 1
    Variable name definitions::

        SUNACTIVITY - Number of sunspots for each year

    The data file contains a 'YEAR' variable that is not returned by load.

In [4]:
dta = sm.datasets.sunspots.load_pandas().data
In [5]:
dta.index = pd.Index(sm.tsa.datetools.dates_from_range('1700', '2008'))
del dta["YEAR"]
In [6]:
dta.plot(figsize=(12,8));
In [7]:
fig = plt.figure(figsize=(12,8))
ax1 = fig.add_subplot(211)
fig = sm.graphics.tsa.plot_acf(dta.values.squeeze(), lags=40, ax=ax1)
ax2 = fig.add_subplot(212)
fig = sm.graphics.tsa.plot_pacf(dta.values.squeeze(), lags=40, ax=ax2)
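The ACF plot above is built from the sample autocorrelation function, which is simple to compute directly. A minimal numpy sketch of what `plot_acf` is showing (the helper name `sample_acf` is ours, not a statsmodels API):

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelations r_1..r_nlags, normalized by the lag-0 sum of squares."""
    x = np.asarray(x, dtype=float)
    xd = x - x.mean()                  # work with deviations from the mean
    denom = np.sum(xd ** 2)
    return np.array([np.sum(xd[k:] * xd[:-k]) / denom
                     for k in range(1, nlags + 1)])

# Tiny worked example: for [1, 2, 3, 4], r_1 = 1.25/5 and r_2 = -1.5/5
r = sample_acf([1.0, 2.0, 3.0, 4.0], nlags=2)
```

The PACF is obtained from these same autocorrelations by solving the Yule-Walker equations at each lag.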
In [8]:
arma_mod20 = sm.tsa.ARMA(dta, (2,0)).fit(disp=False)
print(arma_mod20.params)
const                49.659542
ar.L1.SUNACTIVITY     1.390656
ar.L2.SUNACTIVITY    -0.688571
dtype: float64
In [9]:
arma_mod30 = sm.tsa.ARMA(dta, (3,0)).fit(disp=False)
In [10]:
print(arma_mod20.aic, arma_mod20.bic, arma_mod20.hqic)
2622.636338065809 2637.56970317 2628.60672591
In [11]:
print(arma_mod30.params)
const                49.749936
ar.L1.SUNACTIVITY     1.300810
ar.L2.SUNACTIVITY    -0.508093
ar.L3.SUNACTIVITY    -0.129650
dtype: float64
In [12]:
print(arma_mod30.aic, arma_mod30.bic, arma_mod30.hqic)
2619.4036286964474 2638.07033508 2626.8666135
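AIC, BIC, and HQIC are all penalized transformations of the maximized log-likelihood, differing only in how hard they penalize extra parameters. A sketch of the formulas; the parameter count k covers everything estimated, and the spacing of the printed criteria for `arma_mod30` is consistent with k = 5 (three AR coefficients, the constant, and the innovation variance):

```python
import math

def aic(llf, k):
    """Akaike information criterion."""
    return -2.0 * llf + 2.0 * k

def bic(llf, k, nobs):
    """Bayesian (Schwarz) information criterion."""
    return -2.0 * llf + k * math.log(nobs)

def hqic(llf, k, nobs):
    """Hannan-Quinn information criterion."""
    return -2.0 * llf + 2.0 * k * math.log(math.log(nobs))

# Back out the log-likelihood implied by the printed AIC of arma_mod30,
# then reproduce its BIC and HQIC (k = 5 as noted above, nobs = 309).
k, nobs = 5, 309
llf = (2.0 * k - 2619.4036286964474) / 2.0
```

Because the BIC penalty grows with log(nobs), it favors smaller models than the AIC as the sample grows; HQIC sits between the two.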
  • Does our model obey the theory?
In [13]:
sm.stats.durbin_watson(arma_mod30.resid.values)
Out[13]:
1.9564807635787604
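The Durbin-Watson statistic is d = Σ(e_t − e_{t−1})² / Σ e_t², which is approximately 2(1 − r₁), so a value near 2 (as here) indicates little first-order autocorrelation left in the residuals. A minimal numpy sketch (the function name is ours):

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic: squared first differences over the sum of squares."""
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# White noise should give a value close to 2
rng = np.random.RandomState(0)
d = durbin_watson(rng.standard_normal(2000))
```

Values well below 2 suggest positive residual autocorrelation; values near 4 suggest negative autocorrelation (an alternating series like [1, -1, 1, -1] already gives 3).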
In [14]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
ax = arma_mod30.resid.plot(ax=ax);
In [15]:
resid = arma_mod30.resid
In [16]:
stats.normaltest(resid)
Out[16]:
NormaltestResult(statistic=49.845019661107585, pvalue=1.5006917858823576e-11)
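`stats.normaltest` is D'Agostino and Pearson's omnibus test, which combines sample skewness and kurtosis into one chi-square statistic; the tiny p-value above says the residuals are decidedly non-normal. A quick sanity check of the test's behavior on simulated skewed data:

```python
import numpy as np
from scipy import stats

rng = np.random.RandomState(42)
skewed = rng.exponential(size=1000)      # heavily right-skewed sample
stat, pvalue = stats.normaltest(skewed)  # should reject normality decisively
```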
In [17]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
fig = qqplot(resid, line='q', ax=ax, fit=True)
In [18]:
fig = plt.figure(figsize=(12,8))
ax1 = fig.add_subplot(211)
fig = sm.graphics.tsa.plot_acf(resid.values.squeeze(), lags=40, ax=ax1)
ax2 = fig.add_subplot(212)
fig = sm.graphics.tsa.plot_pacf(resid.values.squeeze(), lags=40, ax=ax2)
In [19]:
r,q,p = sm.tsa.acf(resid.values.squeeze(), qstat=True)
data = np.c_[range(1,41), r[1:], q, p]
table = pd.DataFrame(data, columns=['lag', "AC", "Q", "Prob(>Q)"])
print(table.set_index('lag'))
            AC          Q  Prob(>Q)
lag                                
1.0   0.009179   0.026287  0.871202
2.0   0.041793   0.573041  0.750872
3.0  -0.001335   0.573600  0.902448
4.0   0.136089   6.408927  0.170620
5.0   0.092468   9.111841  0.104685
...        ...        ...       ...
36.0 -0.119329  91.248909  0.000001
37.0 -0.036665  91.723876  0.000002
38.0 -0.046193  92.480525  0.000002
39.0 -0.017768  92.592893  0.000003
40.0 -0.006220  92.606716  0.000005

[40 rows x 3 columns]
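The Q column above is the Ljung-Box statistic, Q(m) = n(n+2) Σ_{k=1}^{m} r_k²/(n−k), compared against a chi-square distribution with m degrees of freedom (as `sm.tsa.acf(..., qstat=True)` does; strictly, the degrees of freedom should be reduced by the number of estimated ARMA parameters). A numpy/scipy sketch (the helper name is ours):

```python
import numpy as np
from scipy import stats

def ljung_box(x, nlags):
    """Ljung-Box Q statistics and chi-square p-values for lags 1..nlags."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xd = x - x.mean()
    denom = np.sum(xd ** 2)
    # sample autocorrelations r_1..r_nlags
    r = np.array([np.sum(xd[k:] * xd[:-k]) / denom
                  for k in range(1, nlags + 1)])
    lags = np.arange(1, nlags + 1)
    q = n * (n + 2) * np.cumsum(r ** 2 / (n - lags))  # cumulative: Q is nondecreasing in m
    p = stats.chi2.sf(q, df=lags)
    return q, p

rng = np.random.RandomState(0)
q, p = ljung_box(rng.standard_normal(500), nlags=10)
```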
  • This indicates a lack of fit.
  • In-sample dynamic prediction. How well does our model do?
In [20]:
predict_sunspots = arma_mod30.predict('1990', '2012', dynamic=True)
print(predict_sunspots)
1990-12-31    167.047417
1991-12-31    140.993002
1992-12-31     94.859112
1993-12-31     46.860896
1994-12-31     11.242577
                 ...    
2008-12-31     41.963810
2009-12-31     46.869285
2010-12-31     51.423261
2011-12-31     54.399720
2012-12-31     55.321692
Freq: A-DEC, dtype: float64
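With `dynamic=True`, forecasts beyond the start date feed earlier *forecasts* back into the recursion instead of the observed values. For a pure AR(3) the recursion is easy to write out by hand; `const` in the ARMA output (≈ 49.75, close to the sample mean of the series) is the process mean, so the recursion works on deviations from it. A sketch with the fitted `arma_mod30` coefficients but illustrative starting values (the helper name is ours):

```python
import numpy as np

def dynamic_ar_forecast(history, phi, mu, steps):
    """Recursive (dynamic) forecast for an AR(p) model around mean mu.

    history: observations, oldest first; phi: AR coefficients phi_1..phi_p.
    """
    buf = list(np.asarray(history, dtype=float))
    out = []
    for _ in range(steps):
        # deviation form: yhat_t - mu = sum_i phi_i * (y_{t-i} - mu)
        dev = sum(p_i * (buf[-i - 1] - mu) for i, p_i in enumerate(phi))
        yhat = mu + dev
        out.append(yhat)
        buf.append(yhat)   # feed the forecast back in (the "dynamic" part)
    return np.array(out)

phi = [1.300810, -0.508093, -0.129650]  # ar.L1..ar.L3 from arma_mod30
mu = 49.749936                          # 'const' (process mean)
fc = dynamic_ar_forecast([120.0, 150.0, 160.0], phi, mu, steps=23)
```

Because the fitted AR(3) is stationary, the dynamic forecast is a damped oscillation that decays toward the mean, which is exactly the shape of the printed 1990-2012 path above.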
In [21]:
fig, ax = plt.subplots(figsize=(12, 8))
ax = dta.loc['1950':].plot(ax=ax)
fig = arma_mod30.plot_predict('1990', '2012', dynamic=True, ax=ax, plot_insample=False)
In [22]:
def mean_forecast_err(y, yhat):
    return y.sub(yhat).mean()
In [23]:
mean_forecast_err(dta.SUNACTIVITY, predict_sunspots)
Out[23]:
5.6369602158434047
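The mean forecast error measures bias (here ≈ 5.6, so the model underpredicts on average), but it can be near zero even when individual forecasts are far off. A scale measure such as RMSE is a useful companion (the helper name is ours):

```python
import numpy as np

def rmse(y, yhat):
    """Root mean squared forecast error."""
    y = np.asarray(y, dtype=float)
    yhat = np.asarray(yhat, dtype=float)
    return np.sqrt(np.mean((y - yhat) ** 2))

# Worked example: errors (0, 0, -2) give RMSE sqrt(4/3)
err = rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```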

Exercise: Can you obtain a better fit for the Sunspots model? (Hint: sm.tsa.AR has a method select_order)

Simulated ARMA(4,1): Model Identification is Difficult

In [24]:
from statsmodels.tsa.arima_process import arma_generate_sample, ArmaProcess
In [25]:
np.random.seed(1234)
# coefficient arrays include the zero-th lag and follow the
# lag-polynomial sign convention expected by ArmaProcess
arparams = np.array([1, .75, -.65, -.55, .9])
maparams = np.array([1, .65])

Let's make sure this model is estimable.

In [26]:
arma_t = ArmaProcess(arparams, maparams)
In [27]:
arma_t.isinvertible
Out[27]:
True
In [28]:
arma_t.isstationary
Out[28]:
False
  • What does this mean?
In [29]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
ax.plot(arma_t.generate_sample(nsample=50));
In [30]:
arparams = np.array([1, .35, -.15, .55, .1])
maparams = np.array([1, .65])
arma_t = ArmaProcess(arparams, maparams)
arma_t.isstationary
Out[30]:
True
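`isstationary` and `isinvertible` come down to the roots of the AR and MA lag polynomials: the process is stationary (respectively invertible) when all roots of the AR (respectively MA) polynomial lie outside the unit circle. A numpy check on the two parameter sets used above (the helper name is ours):

```python
import numpy as np

def roots_outside_unit_circle(poly_ascending):
    """True if all roots of a lag polynomial (constant term first) have modulus > 1."""
    # np.roots expects the highest-degree coefficient first, so reverse
    roots = np.roots(np.asarray(poly_ascending, dtype=float)[::-1])
    return bool(np.all(np.abs(roots) > 1.0))

ar_bad = [1, .75, -.65, -.55, .9]  # first AR polynomial above: not stationary
ar_ok = [1, .35, -.15, .55, .1]    # second AR polynomial: stationary
ma = [1, .65]                      # MA root at -1/0.65, outside the unit circle
```

This mirrors what `ArmaProcess` reports: `isstationary` was False for the first parameter set and True for the second, and `isinvertible` was True for the MA polynomial.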
In [31]:
arma_rvs = arma_t.generate_sample(nsample=500, burnin=250, scale=2.5)
In [32]:
fig = plt.figure(figsize=(12,8))
ax1 = fig.add_subplot(211)
fig = sm.graphics.tsa.plot_acf(arma_rvs, lags=40, ax=ax1)
ax2 = fig.add_subplot(212)
fig = sm.graphics.tsa.plot_pacf(arma_rvs, lags=40, ax=ax2)
  • For mixed ARMA processes, the autocorrelation function is a mixture of exponentials and damped sine waves after (q - p) lags.
  • The partial autocorrelation function is a mixture of exponentials and damped sine waves after (p - q) lags.
In [33]:
arma11 = sm.tsa.ARMA(arma_rvs, (1,1)).fit(disp=False)
resid = arma11.resid
r,q,p = sm.tsa.acf(resid, qstat=True)
data = np.c_[range(1,41), r[1:], q, p]
table = pd.DataFrame(data, columns=['lag', "AC", "Q", "Prob(>Q)"])
print(table.set_index('lag'))
            AC           Q      Prob(>Q)
lag                                     
1.0   0.254921   32.687669  1.082216e-08
2.0  -0.172416   47.670733  4.450737e-11
3.0  -0.420945  137.159383  1.548473e-29
4.0  -0.046875  138.271291  6.617736e-29
5.0   0.103240  143.675896  2.958739e-29
...        ...         ...           ...
36.0  0.142724  231.734107  1.923091e-30
37.0  0.095768  236.706149  5.937808e-31
38.0 -0.084744  240.607793  2.890898e-31
39.0 -0.150126  252.878971  3.963021e-33
40.0 -0.083767  256.707729  1.996181e-33

[40 rows x 3 columns]
In [34]:
arma41 = sm.tsa.ARMA(arma_rvs, (4,1)).fit(disp=False)
resid = arma41.resid
r,q,p = sm.tsa.acf(resid, qstat=True)
data = np.c_[range(1,41), r[1:], q, p]
table = pd.DataFrame(data, columns=['lag', "AC", "Q", "Prob(>Q)"])
print(table.set_index('lag'))
            AC          Q  Prob(>Q)
lag                                
1.0  -0.007889   0.031302  0.859569
2.0   0.004132   0.039906  0.980245
3.0   0.018103   0.205415  0.976710
4.0  -0.006760   0.228538  0.993948
5.0   0.018120   0.395024  0.995466
...        ...        ...       ...
36.0  0.041271  21.358847  0.974774
37.0  0.078704  24.716879  0.938948
38.0 -0.029729  25.197056  0.944895
39.0 -0.078397  28.543388  0.891179
40.0 -0.014466  28.657578  0.909268

[40 rows x 3 columns]

Exercise: How good an in-sample prediction can you obtain for another series, say, CPI?

In [35]:
macrodta = sm.datasets.macrodata.load_pandas().data
macrodta.index = pd.Index(sm.tsa.datetools.dates_from_range('1959Q1', '2009Q3'))
cpi = macrodta["cpi"]

Hint:

In [36]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
ax = cpi.plot(ax=ax);
ax.legend();

The p-value of the augmented Dickey-Fuller test is nearly 1, so we cannot reject the null hypothesis that the series has a unit root.

In [37]:
print(sm.tsa.adfuller(cpi)[1])
0.990432818834
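Since the test cannot reject a unit root, the usual remedy is to model the first difference (for CPI, the log difference, i.e. inflation) rather than the level. A self-contained sketch using a simulated random walk with drift as a stand-in for the CPI level, so it runs without the macrodata dataset:

```python
import numpy as np
import pandas as pd

rng = np.random.RandomState(0)
increments = rng.normal(loc=0.5, scale=1.0, size=200)  # stationary shocks
level = pd.Series(np.cumsum(increments))  # random walk with drift, like a price level
diffed = level.diff().dropna()            # first difference recovers the stationary shocks
```

The differenced series is stationary by construction here; on real CPI data you would re-run `sm.tsa.adfuller` on the difference before fitting an ARMA model to it.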

© 2009–2012 Statsmodels Developers
© 2006–2008 Scipy Developers
© 2006 Jonathan E. Taylor
Licensed under the 3-clause BSD License.
http://www.statsmodels.org/stable/examples/notebooks/generated/tsa_arma_0.html