Why does Spark application fail with "IOException: (null) entry in command string: null chmod 0644"?

Question

I'm trying to write the dataset results into a single CSV using the Java code below:

dataset.write().mode(SaveMode.Overwrite).option("header",true).csv("C:\\tmp\\csvs");

But it times out and the file is not written.

It throws org.apache.spark.SparkException: Job aborted.

Error:

org.apache.spark.SparkException: Job aborted due to stage failure:
Task 0 in stage 13.0 failed 1 times, most recent failure: Lost task 0.0 in stage 13.0 (TID 16, localhost): java.io.IOException: (null) entry in command string: null chmod 0644 C:\tmp\12333333testSpark\_temporary\0\_temporary\attempt_201712282255_0013_m_000000_0\part-r-00000-229fd1b6-ffb9-4ba1-9dc9-89dfdbd0be43.csv
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:132)
at org.apache.spark.sql.execution.datasources.csv.CsvOutputWriter.<init>(CSVRelation.scala:200)
at org.apache.spark.sql.execution.datasources.csv.CSVOutputWriterFactory.newInstance(CSVRelation.scala:170)
at org.apache.spark.sql.execution.datasources.BaseWriterContainer.newOutputWriter(WriterContainer.scala:131)
at org.apache.spark.sql.execution.datasources.DefaultWriterContainer.writeRows(WriterContainer.scala:247)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(InsertIntoHadoopFsRelationCommand.scala:143)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(InsertIntoHadoopFsRelationCommand.scala:143)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Answer

You might want to narrow down to fixing the following exception:

java.io.IOException: (null) entry in command string: null chmod 0644

Try setting HADOOP_HOME to the directory that contains bin\winutils.exe, as reported in this SO question. If that doesn't help, there is a work-around reported in another SO link.
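Below is a minimal sketch of how that fix can be applied from inside the Java program itself rather than through the Windows environment. It assumes winutils.exe has been placed at C:\hadoop\bin\winutils.exe (that path is an assumption, adjust it to your layout); the hadoop.home.dir system property is read by Hadoop's Shell class as an alternative to the HADOOP_HOME environment variable, and it has to be set before the first Spark/Hadoop call.

// Minimal sketch, not the exact program from the question.
// Assumes winutils.exe is at C:\hadoop\bin\winutils.exe (hypothetical path).
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class CsvWriteExample {
    public static void main(String[] args) {
        // Hadoop's Shell class reads this property before falling back to the
        // HADOOP_HOME environment variable, so set it before any Spark/Hadoop
        // classes are touched.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        SparkSession spark = SparkSession.builder()
                .appName("csv-write-example")
                .master("local[*]")
                .getOrCreate();

        // Any Dataset<Row> will do for the sketch; a tiny one is built here.
        Dataset<Row> dataset = spark.range(10).toDF("id");

        // Same write call as in the question.
        dataset.write().mode(SaveMode.Overwrite).option("header", true).csv("C:\\tmp\\csvs");

        spark.stop();
    }
}

Alternatively, set the environment variable before launching the JVM, for example set HADOOP_HOME=C:\hadoop in the Windows shell, and add %HADOOP_HOME%\bin to PATH so the chmod-style permission calls can find winutils.exe.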
