Spark Error: invalid log directory /app/spark/spark-1.6.1-bin-hadoop2.6/work/app-20161018015113-0000/3/


Problem description

My Spark application is failing with the above error.

My Spark program is actually writing its logs to that directory; both stderr and stdout are being written there for all the workers.

My program used to work fine earlier. Yesterday I changed the folder pointed to by SPARK_WORKER_DIR, but today I put the old setting back and restarted Spark.

Can anyone give me a clue as to why I am getting this error?

Recommended answer

In my case the problem was caused by activating SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true" in spark-env.sh. That setting is supposed to remove old application/driver data directories, but it appears to be buggy and deletes the data of running applications as well.

Just comment out that line and see if it helps.
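In spark-env.sh that would look roughly like the sketch below. As an alternative to disabling cleanup entirely, the cleanup interval and application-data TTL of the standalone worker can be tuned so that only long-finished application directories are purged; the values shown are illustrative, not from the original answer:

    # conf/spark-env.sh
    # Option 1: disable the worker cleanup that was removing the log directories
    # export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

    # Option 2 (alternative): keep cleanup enabled, but check every 30 minutes and
    # only purge application data older than 7 days
    # export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=1800 -Dspark.worker.cleanup.appDataTtl=604800"

These spark.worker.cleanup.* properties apply only to the standalone worker, and the workers need to be restarted after the change.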

