Missing SLF4J logger on Spark workers


Question

I am trying to run a job via spark-submit.

The error that results from this job is:

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2625)
    at java.lang.Class.getMethod0(Class.java:2866)
    at java.lang.Class.getMethod(Class.java:1676)
    at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 6 more

Not sure if it matters, but I am trying to run this job within a Docker container on Mesos. Spark is 1.6.1, Mesos is 0.27.1, Python is 3.5, and Docker is 1.11.2. I am running in client mode.

Here is the gist of my spark-submit statement:

export SPARK_PRINT_LAUNCH_COMMAND=true
./spark-submit \
    --master mesos://mesos-blahblahblah:port \
    --conf spark.mesos.executor.docker.image=docker-registry:spark-docker-image \
    --conf spark.mesos.executor.home=/usr/local/spark \
    --conf spark.executorEnv.MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.dylib \
    --conf spark.shuffle.service.enabled=true \
    --jars ~/spark/lib/slf4j-simple-1.7.21.jar \
    test.py

The gist of test.py is that it loads data from parquet, sorts it by a particular column, and then writes it back to parquet.
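
Here is a minimal sketch of what test.py might look like (the input/output paths and the sort column are assumptions, using the Spark 1.6-era DataFrame API):

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="parquet-sort")
sqlContext = SQLContext(sc)

# Load from parquet, sort by one column, write back to parquet.
# "/data/input.parquet", "/data/output.parquet", and "sort_col" are
# hypothetical placeholders.
df = sqlContext.read.parquet("/data/input.parquet")
df.sort("sort_col").write.parquet("/data/output.parquet")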

I added the --jars line when I kept getting that error. (The error does not appear in my driver; I navigate through the Mesos framework to look at the stderr from each Mesos task to find it.)

I also tried adding --conf spark.executor.extraClassPath=http://some.ip:port/jars/slf4j-simple-1.7.21.jar,

because I noticed that when I ran the spark-submit command above, it would output:

INFO SparkContext: Added JAR file:~/spark/lib/slf4j-simple-1.7.21.jar at http://some.ip:port/jars/slf4j-simple-1.7.21.jar with timestamp 1472138630497

But the error is unchanged. Any ideas?

I found this link, which makes me think it is a bug. But the person hasn't posted any solution.

Solution

I had this exact same problem and was also trying to run Mesos/Spark/Python on Docker.

The thing that finally fixed it for me was to add the output of hadoop classpath to the classpath of the Spark executors, using the spark.executor.extraClassPath configuration option.

The full command I ran to get it to work was:

MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so \
${SPARK_HOME}/bin/pyspark \
    --conf spark.master=mesos://mesos-master:5050 \
    --driver-class-path $(${HADOOP_HOME}/bin/hadoop classpath) \
    --conf spark.executor.extraClassPath=$(${HADOOP_HOME}/bin/hadoop classpath)
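
For what it's worth, the executor half of that setting can also be expressed from Python through SparkConf before the context is created. This is only a sketch and assumes hadoop is on the PATH; the driver classpath still has to be supplied at launch (e.g. via --driver-class-path as above), since in client mode the driver JVM is already running by the time SparkConf is read:

import subprocess
from pyspark import SparkConf, SparkContext

# Capture the output of `hadoop classpath` (assumes hadoop is on the PATH).
hadoop_cp = subprocess.check_output(["hadoop", "classpath"]).decode().strip()

conf = (SparkConf()
        .setMaster("mesos://mesos-master:5050")
        .set("spark.executor.extraClassPath", hadoop_cp))
sc = SparkContext(conf=conf)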
