Spark Error: invalid log directory /app/spark/spark-1.6.1-bin-hadoop2.6/work/app-20161018015113-0000/3/
Problem description
My Spark application is failing with the error above.
My Spark program is in fact writing logs to that directory; both stderr and stdout are being written on all the workers.
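For reference, a standalone worker lays out its work directory per application and per executor, and the path in the error message fits this pattern (app ID app-20161018015113-0000, executor 3):

```
/app/spark/spark-1.6.1-bin-hadoop2.6/work/
└── app-20161018015113-0000/   # one subdirectory per application
    └── 3/                     # one subdirectory per executor
        ├── stdout             # executor standard output
        └── stderr             # executor standard error
```

The "invalid log directory" message typically means the worker expected this executor directory to exist but could not find or read it.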
My program used to work fine, but yesterday I changed the folder that SPARK_WORKER_DIR points to. Today I put the old setting back and restarted Spark.
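For context, SPARK_WORKER_DIR is usually exported from conf/spark-env.sh. A minimal sketch follows; the /data/spark-work path is a placeholder for illustration, not a value from the question:

```bash
# conf/spark-env.sh
# Scratch space and executor logs for this worker; defaults to SPARK_HOME/work.
# /data/spark-work is a hypothetical path used only for illustration.
export SPARK_WORKER_DIR=/data/spark-work
```

Workers only pick up this change after a restart, which matches the sequence of events described above.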
Can anyone give me a clue as to why I am getting this error?
Recommended answer
In my case the problem was caused by enabling SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true" in spark-env.sh. This is supposed to remove old application/driver data directories, but it appears to be buggy and also removes the data of running applications.
Just comment out that line and see if it helps.
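If you still want periodic cleanup, an alternative is to leave it enabled but tune the documented knobs; per the Spark standalone docs the cleaner is only supposed to touch directories of stopped applications, which is why the behavior described above looks like a bug. A sketch for spark-env.sh (the values shown are the documented defaults):

```bash
# conf/spark-env.sh
# Option 1: disable the cleaner by commenting it out.
# export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

# Option 2: keep it on but control how aggressively it prunes:
#   spark.worker.cleanup.interval   - seconds between cleanup sweeps (default 1800)
#   spark.worker.cleanup.appDataTtl - seconds to retain per-application
#                                     work directories (default 604800 = 7 days)
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=604800"
```

Restart the workers after editing spark-env.sh so the new options take effect.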