DSE 4.6 to DSE 4.7: Failed to find Spark assembly


Problem description


I have a problem with job-server-0.5.0 after upgrading DSE 4.6 to 4.7. If I run server_start.sh, I get the error "Failed to find Spark assembly in /usr/share/dse/spark/assembly/target/scala-2.10 You need to build Spark before running this program."
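As a first check, you can confirm that the directory named in the error message is in fact missing or empty (a quick sketch using the path from the error):

ls -l /usr/share/dse/spark/assembly/target/scala-2.10/ 2>/dev/null \
  || echo "directory does not exist"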

In /usr/share/dse/spark/bin/compute-classpath.sh, I found that this code raises the error:

for f in ${assembly_folder}/spark-assembly*hadoop*.jar; do
  if [[ ! -e "$f" ]]; then
    echo "Failed to find Spark assembly in $assembly_folder" 1>&2
    echo "You need to build Spark before running this program." 1>&2
    exit 1
  fi
  ASSEMBLY_JAR="$f"
  num_jars=$((num_jars+1))
done
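For context on why this loop fails: with bash's default options, a glob pattern that matches nothing is passed through as the literal pattern string, so the loop still runs once with a path that does not exist and the -e test triggers the error. A minimal sketch of that behavior (the temporary directory stands in for the empty assembly folder):

#!/usr/bin/env bash
# Demonstrates the failure mode: with default shell options, an unmatched
# glob expands to the literal pattern string, so the loop body runs once
# and the -e test fails on the non-existent path.
assembly_folder=$(mktemp -d)   # hypothetical empty directory, no jars inside
for f in ${assembly_folder}/spark-assembly*hadoop*.jar; do
  if [[ ! -e "$f" ]]; then
    echo "Loop received the literal pattern: $f"
  fi
done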

If I run /usr/share/dse/spark/bin/spark-submit directly, I get the same error.

Solution

If you are using DSE, you should most likely launch the spark-jobserver without invoking compute-classpath at all. You can try modifying the launch script to use dse spark-submit, as in the following example.

# job server jar needs to appear first so its deps take higher priority
# need to explicitly include app dir in classpath so logging configs can be found
#CLASSPATH="$appdir:$appdir/spark-job-server.jar:$($SPARK_HOME/bin/compute-classpath.sh)"

#exec java -cp $CLASSPATH $GC_OPTS $JAVA_OPTS $LOGGING_OPTS $CONFIG_OVERRIDES $MAIN $conffile 2>&1 &
dse spark-submit --class $MAIN $appdir/spark-job-server.jar --driver-java-options "$GC_OPTS $JAVA_OPTS $LOGGING_OPTS" $conffile 2>&1 &
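A quick way to verify the change took effect (a sketch, assuming server_start.sh is run from the job server's deployment directory):

./server_start.sh
# The command line of the running process should now contain
# spark-job-server.jar launched via dse spark-submit, not a bare java -cp:
ps aux | grep spark-job-server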

https://github.com/spark-jobserver/spark-jobserver/blob/f5406a50406c59f26c878d7cee7334d6b9203312/bin/server_start.sh
