Spark without Hadoop: Failed to Launch


Problem Description

I'm running Spark 2.1.0, Hive 2.1.1 and Hadoop 2.7.3 on Ubuntu 16.04.

I downloaded the Spark project from GitHub and built the "without hadoop" version:

./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided"

When I run ./sbin/start-master.sh, I get the following exception:

 Spark Command: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp /home/server/spark/conf/:/home/server/spark/jars/*:/home/server/hadoop/etc/hadoop/:/home/server/hadoop/share/hadoop/common/lib/:/home/server/hadoop/share/hadoop/common/:/home/server/hadoop/share/hadoop/mapreduce/:/home/server/hadoop/share/hadoop/mapreduce/lib/:/home/server/hadoop/share/hadoop/yarn/:/home/server/hadoop/share/hadoop/yarn/lib/ -Xmx1g org.apache.spark.deploy.master.Master --host ThinkPad-W550s-Lab --port 7077 --webui-port 8080
 ========================================
 Error: A JNI error has occurred, please check your installation and try again
 Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
     at java.lang.Class.getDeclaredMethods0(Native Method)
     at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
     at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
     at java.lang.Class.getMethod0(Class.java:3018)
     at java.lang.Class.getMethod(Class.java:1784)
     at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
     at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
 Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
     ... 7 more

I edited SPARK_DIST_CLASSPATH according to the post "Where are hadoop jar files in hadoop 2?":

export SPARK_DIST_CLASSPATH=~/hadoop/share/hadoop/common/lib:~/hadoop/share/hadoop/common:~/hadoop/share/hadoop/mapreduce:~/hadoop/share/hadoop/mapreduce/lib:~/hadoop/share/hadoop/yarn:~/hadoop/share/hadoop/yarn/lib

But I'm still getting the same error. I can see the slf4j jar file is under ~/hadoop/share/hadoop/common/lib.
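
Note: the JVM treats a bare directory on the classpath as a source of .class files only; jars inside a directory are ignored unless the entry ends in /*. A minimal sketch of what the export above would have to look like for those jars (slf4j included) to be picked up, assuming the same ~/hadoop layout:

# Each entry ends in /* so the JVM loads the jars inside the directory (paths as in the question)
export SPARK_DIST_CLASSPATH=~/hadoop/share/hadoop/common/lib/*:~/hadoop/share/hadoop/common/*:~/hadoop/share/hadoop/mapreduce/*:~/hadoop/share/hadoop/mapreduce/lib/*:~/hadoop/share/hadoop/yarn/*:~/hadoop/share/hadoop/yarn/lib/*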

How can I fix this error?

Thanks!

Recommended Answer

Hadoop free"构建需要修改 SPARK_DIST_CLASSPATH 以包含 Hadoop 的包 jar.

"Hadoop free" builds need to modify SPARK_DIST_CLASSPATH to include Hadoop’s package jars.

The most convenient way to do this is to add an entry in conf/spark-env.sh:

export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)  
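
Unlike the hand-written directory list in the question, hadoop classpath emits entries ending in /*, which is what makes the jars inside those directories (slf4j included) visible to the JVM. A minimal sketch of the entry, assuming the Hadoop location shown in the question's launch command:

# conf/spark-env.sh
export SPARK_DIST_CLASSPATH=$(/home/server/hadoop/bin/hadoop classpath)

After adding it, re-run ./sbin/start-master.sh; the Spark Command line logged at startup should now contain wildcard entries such as /home/server/hadoop/share/hadoop/common/lib/*.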

Check this: https://spark.apache.org/docs/latest/hadoop-provided.html

