spark-class java no such file or directory

Problem description
I am new to Spark/Scala. I have set up a fully distributed cluster with Spark, Scala, and sbt. When I test and issue the command pyspark, I get the following error:
/home/hadoop/spark/bin/spark-class: line 75: /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java: No such file or directory
My .bashrc contains:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
hadoop-env.sh contains export JAVA_HOME=/usr/lib/jvm/java7-openjdk-amd64/jre/
conf/spark-env.sh contains JAVA_HOME=usr/lib/jvm/java7-openjdk-amd64/jre
spark-class contains:

if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"
else
  if [ "$(command -v java)" ]; then
    RUNNER="java"
...
Can someone assist with what I need to change to get the right path to Java?
Recommended answer
I found the issue: the JAVA_HOME paths were inconsistent. I made them all match, from .bashrc through the files under etc to the Spark conf.
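Concretely, the fix is to point every file at the same, existing JVM directory. Note that the configs quoted in the question mix java-7-openjdk-amd64 with java7-openjdk-amd64, and the spark-env.sh value is missing its leading slash. A minimal sketch, assuming the JDK really is installed at /usr/lib/jvm/java-7-openjdk-amd64 (verify the actual directory name first with ls /usr/lib/jvm):

```shell
# First confirm where the JDK actually lives:
#   ls /usr/lib/jvm
# Then use the SAME absolute path (leading slash, exact directory name)
# in all three files.

# ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

# hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

# conf/spark-env.sh
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

# Sanity check after re-sourcing .bashrc: this must list a real file,
# not print "No such file or directory".
ls -l "${JAVA_HOME}/bin/java"
```

spark-class only prepends ${JAVA_HOME}/bin/java, so once the variable points at a directory that actually contains bin/java, the error goes away.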