ADF test in statsmodels in Python
Problem description
I am trying to run an Augmented Dickey-Fuller test in statsmodels in Python, but I seem to be missing something.

This is the code that I am trying:
import numpy as np
import statsmodels.tsa.stattools as ts
x = np.array([1,2,3,4,3,4,2,3])
result = ts.adfuller(x)
I get the following error:
Traceback (most recent call last):
File "C:\Users\Akavall\Desktop\Python\Stats_models\stats_models_test.py", line 12, in <module>
result = ts.adfuller(x)
File "C:\Python27\lib\site-packages\statsmodels-0.4.1-py2.7-win32.egg\statsmodels\tsa\stattools.py", line 201, in adfuller
xdall = lagmat(xdiff[:,None], maxlag, trim='both', original='in')
File "C:\Python27\lib\site-packages\statsmodels-0.4.1-py2.7-win32.egg\statsmodels\tsa\tsatools.py", line 305, in lagmat
raise ValueError("maxlag should be < nobs")
ValueError: maxlag should be < nobs
My NumPy version: 1.6.1. My statsmodels version: 0.4.1. I am using Windows.

I am looking at the documentation here but can't figure out what I am doing wrong. What am I missing?
Thanks in advance.
Answer
I figured it out. By default maxlag is set to None, while it should be set to an integer. Something like this works:
import numpy as np
import statsmodels.tsa.stattools as ts
x = np.array([1,2,3,4,3,4,2,3])
result = ts.adfuller(x, 1) # maxlag is now set to 1
Output:
>>> result
(-2.6825663173365015, 0.077103947319183241, 0, 7, {'5%': -3.4775828571428571, '1%': -4.9386902332361515, '10%': -2.8438679591836733}, 15.971188911270618)
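For reference, the elements of the returned tuple are, in order: the test statistic, the p-value, the number of lags used, the number of observations, a dict of critical values, and the maximized information criterion. A minimal sketch of interpreting the output above (the numbers are copied verbatim from the result shown; the null hypothesis of a unit root is rejected when the test statistic falls below a critical value):

```python
# Values copied from the adfuller result printed above.
result = (-2.6825663173365015, 0.077103947319183241, 0, 7,
          {'5%': -3.4775828571428571, '1%': -4.9386902332361515,
           '10%': -2.8438679591836733}, 15.971188911270618)

# Unpack in the documented order of statsmodels' return tuple.
adf_stat, pvalue, usedlag, nobs, crit_values, icbest = result

# Reject the unit-root null at a given level when the statistic
# is more negative than that level's critical value.
for level in ('1%', '5%', '10%'):
    cv = crit_values[level]
    verdict = 'reject' if adf_stat < cv else 'fail to reject'
    print(f"{level}: statistic {adf_stat:.3f} vs critical {cv:.3f} "
          f"-> {verdict} unit root")
```

Here the statistic (-2.68) is above all three critical values and the p-value is about 0.077, so at the conventional 5% level the unit-root hypothesis is not rejected for this tiny sample.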