SparkContext Error - File not found /tmp/spark-events does not exist


Problem Description

I am running a Python Spark application via an API call. On submitting the application, the response is "Failed SSH into the Worker".

My Python application exists in

/root/spark/work/driver-id/wordcount.py

The error can be found in

/root/spark/work/driver-id/stderr

which shows the following error -

Traceback (most recent call last):
  File "/root/wordcount.py", line 34, in <module>
    main()
  File "/root/wordcount.py", line 18, in main
    sc = SparkContext(conf=conf)
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist.
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:402)
  at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:255)
  at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
  at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
  at py4j.Gateway.invoke(Gateway.java:214)
  at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
  at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
  at py4j.GatewayConnection.run(GatewayConnection.java:209)
  at java.lang.Thread.run(Thread.java:745)

It indicates that /tmp/spark-events does not exist - which is true. However, in wordcount.py:

from pyspark import SparkContext, SparkConf

... few more lines ...

def main():
    conf = SparkConf().setAppName("MyApp").setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()
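The traceback above explains why nothing in the script itself ever runs: the exception is thrown from EventLoggingListener.start during SparkContext construction, before any application logic executes. As a minimal sketch (the directory check below is a workaround assumption, not part of the original question), the driver could make sure the default event-log directory exists before building the context:

import os

from pyspark import SparkContext, SparkConf

def main():
    # Workaround sketch: create the default event-log directory on the
    # driver machine before SparkContext starts EventLoggingListener.
    # (isdir/makedirs is used instead of exist_ok to stay Python 2 compatible.)
    if not os.path.isdir("/tmp/spark-events"):
        os.makedirs("/tmp/spark-events")

    conf = SparkConf().setAppName("MyApp") \
        .setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()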

Recommended Answer

/tmp/spark-events is the location where Spark stores the event logs. Just create this directory on the master machine and you're set.

$ mkdir /tmp/spark-events
$ sudo /root/spark-ec2/copy-dir /tmp/spark-events/
RSYNC'ing /tmp/spark-events to slaves...
ec2-54-175-163-32.compute-1.amazonaws.com
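If creating the directory on every node is inconvenient, the same check can also be satisfied through configuration - a sketch assuming you either have another directory that already exists or do not need the history server; spark.eventLog.dir and spark.eventLog.enabled are the relevant Spark settings:

from pyspark import SparkConf

# Point event logging at a directory that already exists
# ("/existing/log/dir" is illustrative - substitute a real path).
conf = SparkConf().setAppName("MyApp") \
    .set("spark.eventLog.dir", "file:///existing/log/dir")

# Or disable event logging entirely if replaying the application in the
# history server is not needed.
conf = conf.set("spark.eventLog.enabled", "false")

Either way, note that Spark does not create the event-log directory for you; the missing-directory check in EventLoggingListener.start is exactly what failed here.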

