basedir must be absolute: ?/.ivy2/local


Problem description

I'm writing here in a state of full desperation...

I have 2 users:

  • 1 local user, created in Linux. Works 100% fine; the word count runs perfectly. Kerberized cluster. Valid ticket.
  • 1 Active Directory user, who can log in, but the pyspark instruction (the same word count) fails. Same KDC ticket as the one above.

Exception in thread "main" java.lang.IllegalArgumentException: basedir must be absolute: ?/.ivy2/local
    at org.apache.ivy.util.Checks.checkAbsolute(Checks.java:48)
    at org.apache.ivy.plugins.repository.file.FileRepository.setBaseDir(FileRepository.java:135)
    at org.apache.ivy.plugins.repository.file.FileRepository.<init>(FileRepository.java:44)
    at org.apache.spark.deploy.SparkSubmitUtils$.createRepoResolvers(SparkSubmit.scala:943)
    at org.apache.spark.deploy.SparkSubmitUtils$.buildIvySettings(SparkSubmit.scala:1035)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply(SparkSubmit.scala:295)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply(SparkSubmit.scala:295)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The code I'm running. Super simple.

import findspark
findspark.init()  # locate the Spark installation and put pyspark on sys.path
from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("yarn")
sc = SparkContext(conf=conf)  # this is the instruction that fails with the exception above

It fails on the last instruction with the error above (see the exception).

?/.ivy2/local -> This is the problem, but I have no idea what's going on :(.

With the Linux user it works perfectly... but with the AD user, which doesn't exist in the local system but does have /home/userFolder ... I have this problem :(

Please help... I've reached the point of insanity... I've googled every corner of the internet and haven't found any solution to this problem/mistake :( Stack Overflow is my last resort, heeeeeeeeeelp

Recommended answer

Context

Ivy needs a directory called .ivy2, usually located in the home directory. You can also configure where .ivy2 should live by setting a configuration property when Spark starts or when you run spark-submit.
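
For the asker's PySpark setup, one way to do this is to set spark.jars.ivy directly on the SparkConf. A minimal sketch, combining the asker's snippet with the property used in the solutions below (/tmp/.ivy is just an illustrative absolute path):

import findspark
findspark.init()
from pyspark import SparkConf, SparkContext

# Point Ivy at an explicit absolute directory instead of relying on the home directory
conf = SparkConf().setMaster("yarn").set("spark.jars.ivy", "/tmp/.ivy")
sc = SparkContext(conf=conf)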

Where the problem comes from

In IvySettings.java (line 796 in version 2.2.0 of ant-ivy) there is this block:

if (getVariable("ivy.home") != null) {
   setDefaultIvyUserDir(Checks.checkAbsolute(getVariable("ivy.home"), "ivy.home"));
   Message.verbose("using ivy.default.ivy.user.dir variable for default ivy user dir: " + defaultUserDir);
} else {
   setDefaultIvyUserDir(new File(System.getProperty("user.home"), ".ivy2"));
   Message.verbose("no default ivy user dir defined: set to " + defaultUserDir);
}

As you can see, if ivy.home is not set and user.home is also not set, then you will get the error:

Exception in thread "main" java.lang.IllegalArgumentException: basedir must be absolute: ?/.ivy2/local
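
A quick way to check whether the home directory actually resolves for the current user is to look at the environment and the local account database from the same session. A minimal diagnostic sketch, assuming a Linux host where the JVM falls back to a placeholder value for user.home when the local account lookup fails:

import os
import pwd

# What the shell/OS reports as the home directory
print("HOME:", os.environ.get("HOME"))
print("expanduser('~'):", os.path.expanduser("~"))

# Whether the current uid has a local passwd entry at all;
# for an AD user that is missing from the local system this lookup can fail
try:
    print("passwd home dir:", pwd.getpwuid(os.getuid()).pw_dir)
except KeyError:
    print("no local passwd entry for uid", os.getuid())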

Solution 1 (spark-shell or spark-submit)

As Rocke Yang mentioned, you can start spark-shell or spark-submit with the configuration property spark.jars.ivy set. Example:

spark-shell --conf spark.jars.ivy=/tmp/.ivy
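
The same flag works for spark-submit and for the pyspark shell the asker is using; your_app.py below is just a placeholder name:

spark-submit --conf spark.jars.ivy=/tmp/.ivy your_app.py
pyspark --conf spark.jars.ivy=/tmp/.ivy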

Solution 2 (SparkLauncher or yarn-client)

A second solution is to set the configuration property when calling the submit method programmatically:

import org.apache.spark.launcher.SparkLauncher;

SparkLauncher sparkLauncher = new SparkLauncher();
sparkLauncher.setSparkHome("/path/to/SPARK_HOME")
  .setAppResource("/path/to/jar/to/be/executed")
  .setMainClass("MainClassName")
  .setMaster("yarn")                      // e.g. "yarn" or "local[*]"
  .setDeployMode("cluster")               // e.g. "cluster" or "client"
  .setConf("spark.executor.cores", "2")
  .setConf("spark.jars.ivy", "/tmp/.ivy");
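
Once configured, the job can then be started from the launcher, for example with its launch() or startApplication() method.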

Ticket opened

There is a ticket opened by the Spark community for this issue.
