Error in plot.window(...) : need finite 'xlim' values

Problem description

What should I do about this error? My code is:

library(e1071)
library(hydroGOF)
donnees <- read.csv("F:/new work with shahab/Code-SVR/SVR/MainData.csv")
summary(donnees)

#partitioning into training and testing set
donnees.train <- donnees[donnees$subset=="train",2:ncol(donnees)]
donnees.test <- donnees[donnees$subset=="test",2:ncol(donnees)]

#use the mean of the dependent variable as a predictor
def.pred <- mean(donnees.train$y)

#error sum of squares of the default model on the test set
def.rss <- sum((donnees.test$y-def.pred)^2)
print(def.rss)
plot(donnees.train)
#*****************
#linear regression
#*****************
#Linear Models
reg <- lm(y ~., data = donnees.train)
print(summary(reg))
#error sum of squares of the model on the test set
reg.pred <- predict(reg,newdata = donnees.test)
reg.rss <- sum((donnees.test$y-reg.pred)^2)
print(reg.rss)

#pseudo-r-squared
print(1.0-reg.rss/def.rss)


#**********************************
#rbf epsilon-svr with cost = 1.0
#**********************************
epsilon.svr <- svm(y ~.,data = donnees.train, scale = T, type = "eps-regression",
                   kernel = "radial", cost = 1.0, epsilon=0.1,tolerance=0.001, shrinking=T,
                   fitted=T)
print(epsilon.svr)
#prediction
esvr.pred <- predict(epsilon.svr,newdata = donnees.test)
esvr.rss <- sum((donnees.test$y-esvr.pred)^2)
#pseudo-R2
print(1.0-esvr.rss/def.rss)
esvr.rmse=rmse(donnees.test$y,esvr.pred)
print(esvr.rmse)

#****************************************************
#detect the "best" cost parameter for rbf epsilon-svr
#****************************************************
costs <- seq(from=0.05,to=3.0,by=0.005)
pseudor2 <- double(length(costs))
for (c in 1:length(costs)){
  epsilon.svr <- svm(y ~.,data = donnees.train, scale = T, type = "eps-regression",
                     kernel = "radial", cost = costs[c], epsilon=0.1,tolerance=0.001, shrinking=T,
                     fitted=T)
  #prediction
  esvr.pred <- predict(epsilon.svr,newdata = donnees.test)
  esvr.rss <- sum((donnees.test$y-esvr.pred)^2)
  pseudor2[c] <- 1.0-esvr.rss/def.rss
}

#graphical representation
plot(costs,pseudor2,type="l")
#show the max. of pseudo-r2 and the corresponding cost parameter
print(max(pseudor2))
k <- which.max(pseudor2)
print(costs[k])

And my main data in the Excel worksheet is:

    subset  x1  x2  y       
train   18  1088    9.77        
train   0   831 5.96        
train   0   785 5.36        
train   0   762 5.08        
train   0   749 4.92        
train   0.5 731 4.69        
train   0   727 4.64        
train   2   743 4.84        
train   5   818 5.83        
train   12  942 7.49        
train   13  973 7.98        
train   89.5    1292    12.94       
train   46.5    1086    9.61        
train   5.5 877 6.59        
train   1   826 5.89        
train   0.5 780 5.3     
train   3.5 756 5       
train   4   764 5.1     
train   28.5    851 6.26        
train   10  866 6.45        
train   20.5    839 6.09        
train   7   759 5.03        
train   0.5 722 4.57        
train   0   708 4.4     
train   0   694 4.22        
train   0   689 4.16        
train   0   679 4.03        
train   11  769 5.2     
train   0.5 697 4.26        
train   10.5    702 4.33        
train   1.5 692 4.2     
train   3   743 4.86        
train   16  958 7.98        
train   14  835 6.05        
train   0   713 4.46        
train   0.5 671 3.94        
train   0   659 3.79        
train   0   646 3.63        
train   0.5 636 3.52        
train   0   627 3.43        
train   0   629 3.44        
train   1   682 4.1     
train   8.5 735 4.81        
train   1   729 4.67        
train   0   649 3.66        
train   56  774 5.29        
train   1.5 663 3.84        
train   5.5 787 5.49        
train   50  839 6.14        
train   6.5 699 4.29        
train   1.5 756 5.03        
train   11.5    669 3.91        
train   5   684 4.1     
train   0   653 3.71        
train   0.5 669 3.94        
train   0   638 3.53        
train   0.5 647 3.65        
train   12.5    715 4.56        
train   7.5 921 7.37        
train   50  1149    10.95       
train   10.5    772 5.21        
train   23.5    1205    11.93       
train   23.5    1171    11.01       
train   8.5 927 7.26        
train   0.5 1009    8.45        
train   4   1019    8.62        
train   0   968 7.88        
train   2   862 6.38        
train   22  1349    14.15       
train   16.5    1029    8.74        
train   8.5 846 6.15        
train   0.5 853 6.26        
train   9.5 819 5.81        
train   19.5    775 5.24        
train   23  746 4.88        
train   46.5    723 4.58        
train   1   733 4.72        
train   26.5    731 4.69        
train   34.5    814 5.81        
train   2   743 4.84        
train   0   715 4.49        
train   4   680 4.05        
train   8   816 5.85        
train   20  823 5.91        
train   0.5 824 5.93        
train   2.5 746 4.88        
train   0   817 5.87        
train   0   732 4.7     
train   6   682 4.07        
train   0   685 4.12        
train   1   719 4.56        
train   10.5    701 4.31        
train   23.5    1002    8.74        
train   23.5    947 7.71        
train   8.5 808 5.66        
train   0.5 835 6.06        
train   4   811 5.71        
train   0   709 4.42        
train   2   696 4.25        
train   22  913 7.21        
train   16.5    860 6.42        
train   8.5 902 7.15        
train   0.5 781 5.32        
train   9.5 862 6.45        
train   19.5    833 6.02        
train   23  803 5.63        
train   46.5    903 7.06        
train   1   822 5.86        
train   26.5    1040    9.19        
train   34.5    939 7.55        
train   2   793 5.48        
train   0   730 4.68        
train   4   719 4.53        
train   8   706 4.38        
train   20  829 5.99        
train   0.5 724 4.6     
train   2.5 697 4.26        
train   0   669 3.91        
train   0   657 3.76        
train   6   724 4.66        
train   0   657 3.76        
train   1   676 4.02        
train   23.5    968 8.24        
train   0   696 4.25        
train   12  727 4.73        
train   0.5 651 3.69        
train   3.5 685 4.12        
train   0.5 668 3.9     
train   0   626 3.4     
train   0   619 3.32        
train   1   697 4.34        
train   0.5 624 3.37        
train   13.5    683 4.14        
train   0   651 3.68        
train   0   621 3.33        
train   0   612 3.24        
train   3   668 3.91        
train   0   626 3.39        
train   0.5 614 3.27        
train   0   614 3.26        
train   2.5 630 3.45        
train   0.5 617 3.3     
train   0   616 3.3     
train   8   684 4.14        
train   0.5 612 3.24        
train   0   598 3.09        
train   0   588 2.99        
train   0   590 3       
train   6   648 3.71        
train   0   598 3.1     
train   2   614 3.29        
train   33  804 5.9     
train   0   619 3.32        
train   0   588 2.98        
train   0   577 2.87        
train   0   571 2.81        
train   0.5 572 2.82        
train   4.5 607 3.2     
train   0   579 2.89        
train   0   562 2.72        
train   0   565 2.74        
train   0   554 2.63        
train   0   543 2.51        
train   0   536 2.44        
train   0   531 2.39        
train   0   532 2.4     
train   0.5 529 2.36        
train   0   527 2.35        
train   0   528 2.36        
train   0   523 2.31        
train   0   521 2.29        
train   0   523 2.31        
train   0.5 541 2.49        
train   0   522 2.3     
train   0.5 533 2.42        
train   2   529 2.37        
train   10  638 3.65        
train   0.5 544 2.52        
train   5   627 3.52        
train   0   535 2.43        
train   0   516 2.24        
train   0   520 2.27        
train   32  841 6.55        
train   11.5    838 6.29        
train   0   595 3.06        
train   0.5 592 3.03        
train   0   558 2.67        
train   0   540 2.48        
train   0   534 2.42        
train   2   539 2.46        
train   13  623 3.42        
train   0   553 2.62        
train   0   561 2.71        
train   0   546 2.55        
train   0   512 2.2     
train   2   518 2.26        
train   32  702 4.46        
train   27  731 4.76        
train   1   604 3.15        
train   0   584 2.94        
train   0   548 2.57        
train   0   519 2.26        
train   29.5    735 4.91        
train   0   564 2.74        
train   12  606 3.23        
train   0   542 2.51        
train   0   516 2.24        
train   0   508 2.15        
train   0   500 2.07        
train   0   495 2.03        
train   0   496 2.04        
train   0   492 1.99        
train   0   496 2.04        
train   0   490 1.98        
train   0   494 2.02        
train   0   490 1.99        
train   3   548 2.62        
train   17  546 2.61        
train   9.5 737 4.95        
train   1.5 584 2.96        
train   0   521 2.27        
train   0.5 526 2.34        
train   0   539 2.48        
train   24.5    699 4.45        
train   41  740 4.97        
train   3   569 2.8     
train   1   525 2.32        
train   0   511 2.18        
train   0   498 2.05        
train   2   597 3.22        
train   0.5 520 2.27        
train   66  909 7.77        
train   23  716 4.54        
train   0.5 564 2.74        
train   4.5 582 2.94        
train   0   577 2.88        
train   0   527 2.34        
train   0   512 2.19        
train   0   503 2.09        
train   8.5 561 2.73        
train   0   533 2.4     
train   24.5    640 3.77        
train   0   515 2.21        
train   0   496 2.03        
train   0   485 1.93        
train   0   480 1.88        
train   0   476 1.85        
train   0   480 1.88        
train   24  689 4.34        
train   0   568 2.79        
train   0   506 2.12        
train   8.5 680 4.19        
train   12  657 3.87        
train   5.5 635 3.61        
train   19.5    761 5.18        
train   1.5 567 2.77        
train   3.5 678 4.1     
train   4   574 2.84        
train   7   628 3.5     
train   6   656 3.77        
train   0   551 2.6     
train   0.5 526 2.33        
train   0.5 555 2.64        
train   8.5 666 4.01        
train   1   564 2.74        
train   0   534 2.41        
train   0   521 2.27        
train   7.5 599 3.15        
train   4.5 585 2.96        
train   3   647 3.65        
train   0   547 2.56        
train   0   531 2.38        
train   0   508 2.15        
train   0   500 2.08        
train   0   503 2.09        
train   0   492 1.99        
train   0.5 492 1.99        
train   5   647 3.92        
train   0   513 2.19        
train   6.5 523 2.3     
train   2   527 2.35        
train   2   522 2.3     
train   22.5    817 6.14        
train   18.5    808 5.86        
train   8.5 775 5.37        
train   4.5 705 4.37        
train   58  891 6.96        
train   7   642 3.58        
train   7   614 3.29        
train   10.5    772 5.29        
train   7.5 714 4.54        
train   3.5 613 3.25        
train   6   575 2.85        
train   24.5    680 4.19        
train   18.5    801 5.64        
train   0   640 3.55        
train   6.5 610 3.23        
train   0.5 592 3.03        
train   36.5    835 6.2     
test    0   673 3.97    2.97    2.49
test    0.5 571 2.81    3.74    2.3
test    0   553 2.62    3.56    3.1
test    6   597 3.17    3.52    3.46
test    7   584 2.97    3.75    3.6
test    4.5 649 3.74    3.76    3.5
test    9.5 636 3.56    5.27    5.4
test    14.5    629 3.52    3.69    3.65
test    6.5 648 3.75    3.01    3
test    18  653 3.76    4.07    4.1
test    25.5    767 5.27    3.52    3.46
test    16  650 3.69    5.49    5.1
test    0.5 589 3.01    5.79    5.3
test    18.5    676 4.07    5.29    5.12
test    10  635 3.52    3.4 3.2
test    64  784 5.49    4.11    4.3
test    35.5    812 5.79    2.91    3
test    17.5    775 5.29    2.66    2.9
test    0.5 627 3.4 2.88    2.4
test    7   680 4.11    4.46    4.26
test    0   581 2.91    7.43    6.6
test    0   557 2.66    10.73   9.08
test    0   578 2.88    10.87   9.4
test    21  707 4.46    10.3    9.1
test    40  911 7.43    11.52   10.7
test    61  1151    10.73   11.33   10.4
test    42  1144    10.87   10.61   10.8
test    13  1121    10.3    13.26   13.29
test    6.5 1208    11.52   16.74   15.2
test    7.5 1206    11.33   13.26   12.7
test    0.5 1158    10.61   13.36   12.9
test    30.5    1328    13.26   11.22   11.19
test    84  1529    16.74   10.68   13.1
test    18.5    1332    13.26   13.22   13.8
test    8   1338    13.36   8.68    9.1
test    0.5 1199    11.22   8.13    10.05
test    19.5    1163    10.68   7.51    7.8
test    36.5    1313    13.22   7.05    9.6
test    1.5 1026    8.68    6.99    10.7
test    1   988 8.13    6.39    6.18
test    0   945 7.51    6.71    6.12
test    0   912 7.05    8.51    8.28
test    2   907 6.99    7.69    7.95
test    0.5 864 6.39    7.66    7.2
test    4   887 6.71    6.73    6.9
test    20  1012    8.51    6.86    6.4
test    21.5    957 7.69    8.88    8.1
test    17.5    955 7.66    7.26    7.4
test    1   889 6.73    6.35    6.32
test    11  898 6.86    6.25    6.18
test    9.5 1039    8.88    6.32    6.2
test    2.5 927 7.26    7.46    7.7
test    2.5 859 6.35    5.7 5.4
test    5   853 6.25    7.5 7.9
test    4   858 6.32    6.51    6.3
test    8   936 7.46    7.51    7.39
test    4   811 5.7 9.02    9.01
test    9   937 7.5 6.16    6.12
test    9   871 6.51    5.35    5.6
test    9   943 7.51    5.61    5.9
test    5   1047    9.02    8.56    8.3
test    6.5 846 6.16    7.3 7.1
test    2   784 5.35    6.4 6.2
test    3.5 804 5.61    5.46    5.43
test    0   726 4.63    5.3 5.32
test    37  917 7.3 7.2 7.12
test    12  864 6.4 6.1 6.01

So what should I do now? How can I solve this error?

Error in plot.window(...) : need finite 'xlim' values
In addition: Warning messages:
1: In min(x) : no non-missing arguments to min; returning Inf
2: In max(x) : no non-missing arguments to max; returning -Inf

If possible, please correct my code. I am not very familiar with RStudio and R.

Answer

The problem is that you're (probably) trying to plot a vector that consists exclusively of missing (NA) values. Here's an example:

> x=rep(NA,100)
> y=rnorm(100)
> plot(x,y)
Error in plot.window(...) : need finite 'xlim' values
In addition: Warning messages:
1: In min(x) : no non-missing arguments to min; returning Inf
2: In max(x) : no non-missing arguments to max; returning -Inf

In your example this means that in your line plot(costs,pseudor2,type="l"), one of the two vectors consists entirely of NA values (since costs comes straight from seq(), the likely culprit is pseudor2). You have to figure out why that is, but that's the explanation of your error.
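
A quick way to verify this in the same session (a diagnostic sketch reusing the variable names from the question) is to inspect both vectors right before the plot() call:

summary(costs)            # should show finite values coming from seq()
summary(pseudor2)         # if this is entirely NA/NaN, plot() has nothing to scale the axes with
sum(is.finite(pseudor2))  # 0 confirms that no usable y values were produced
str(donnees.test$y)       # a non-numeric y column (e.g. read in as character) would make every RSS NA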

From the comments:

Scott C Wilson: Another possible cause of this message (not in this case, but in others) is attempting to use character values as x or y data. You can use the class function to check your x and y values to be sure, if you think this might be your issue.
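
For example (a small illustration with made-up data, not the poster's file), a character vector used as x reproduces the same error and warnings, and class() exposes it:

x <- c("low", "mid", "high")   # character values accidentally used as x
y <- c(2.1, 4.3, 6.5)
class(x)                       # "character" -- plot() cannot derive numeric axis limits from this
class(y)                       # "numeric"
# plot(x, y)                   # would fail: Error in plot.window(...) : need finite 'xlim' values
plot(as.numeric(factor(x, levels = c("low", "mid", "high"))), y)   # works once x is recoded to numbers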

stevec: Here is a quick and easy solution to that problem (basically, wrap x in as.factor(x)).
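
A minimal sketch of that idea (again with made-up data, assuming x is a character vector): wrapping x in as.factor() dispatches to plot.factor, which draws a box plot of y per level instead of failing:

x <- c("a", "b", "b", "c", "a")
y <- c(1.2, 3.4, 2.8, 5.1, 0.9)
# plot(x, y)                    # fails: need finite 'xlim' values
plot(as.factor(x), y)           # works: one box of y values per level of x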
