SparkDeploySchedulerBackend Error: Application has been killed. All masters are unresponsive


Problem description

When I start the spark-shell:

bin>./spark-shell

I get the following error:

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Welcome to Spark version 1.3.0
Using Scala version 2.10.4 (Java HotSpot(TM) Server VM, Java 1.7.0_75)
Type in expressions to have them evaluated.
Type :help for more information.
15/05/10 12:12:21 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
15/05/10 12:12:21 ERROR TaskSchedulerImpl: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.

I have installed Spark by following this link: http://www.philchen.com/2015/02/16/how-to-install-apache-spark-and-cassandra-stack-on-ubuntu

Recommended answer

You should supply your Spark cluster's master URL when starting spark-shell.

At a minimum:

bin/spark-shell --master spark://master-ip:7077
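Before launching the shell, it can help to confirm that the master is actually listening on that address, since an unreachable master is exactly what produces the "All masters are unresponsive" error. Below is a minimal sketch; `master-ip` is a placeholder taken from the command above, and you would substitute your cluster's real master host.

```shell
# Sketch: sanity-check the standalone master before launching spark-shell.
# "master-ip" is a placeholder -- substitute your real master host.
MASTER_URL="spark://master-ip:7077"

# Peel the spark:// scheme off, then split host and port.
HOSTPORT="${MASTER_URL#spark://}"
HOST="${HOSTPORT%:*}"
PORT="${HOSTPORT##*:}"

# Probe the master's RPC port; if this connection fails, spark-shell
# will report the master as unresponsive too.
if nc -z -w 5 "$HOST" "$PORT" 2>/dev/null; then
    echo "master reachable at $HOST:$PORT"
else
    echo "master unreachable at $HOST:$PORT"
fi
```

If the probe fails, check that the master process is running on that machine and, in a standalone deployment, that its web UI (by default on port 8080) is up.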

All the options make up a long list; you can find the suitable ones yourself:

bin/spark-shell --help

