While writing to an HDFS path, getting error java.io.IOException: Failed to rename

This article describes how to handle the error java.io.IOException: Failed to rename when writing to an HDFS path; it may serve as a useful reference for anyone hitting the same problem.

Problem Description

I am using spark-sql-2.4.1v, which uses hadoop-2.6.5.jar. I need to save my data on HDFS first and move it to Cassandra later, so I am trying to save the data on HDFS as below:

String hdfsPath = "/user/order_items/";
cleanedDs.createOrReplaceTempView("source_tab");

givenItemList.parallelStream().forEach(item -> {
    // Build one aggregation query per item against the registered view.
    String query = "select '" + item + "' as itemCol, avg(" + item + ") as mean "
                 + "from source_tab group by year";
    Dataset<Row> resultDs = sparkSession.sql(query);

    // Every parallel task appends to the same HDFS output directory.
    saveDsToHdfs(hdfsPath, resultDs);
});


public static void saveDsToHdfs(String parquet_file, Dataset<Row> df) {
    df.write()
      .format("parquet")
      .mode("append")
      .save(parquet_file);
    logger.info("Saved parquet file: " + parquet_file + " successfully");
}

When I run my job on the cluster, it fails with this error:

java.io.IOException: Failed to rename FileStatus{path=hdfs:/user/order_items/_temporary/0/_temporary/attempt_20180626192453_0003_m_000007_59/part-00007.parquet; isDirectory=false; length=952309; replication=1; blocksize=67108864; modification_time=1530041098000; access_time=0; owner=; group=; permission=rw-rw-rw-; isSymlink=false} to hdfs:/user/order_items/part-00007.parquet
    at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:415)

Please suggest how to fix this issue.

Recommended Answer

Each iteration of the parallel stream launches its own write job against the same output path, and those jobs all stage their files under the same _temporary directory, so they interfere with each other when the FileOutputCommitter renames the committed files into place. Instead, do all the selects in one single job: build each select, union the results into a single Dataset, and write that one Dataset out.

Dataset<Row> resultDs = givenItemList.parallelStream()
    .map(item -> {
        String query = "select '" + item + "' as itemCol, avg(" + item + ") as mean "
                     + "from source_tab group by year";
        return sparkSession.sql(query);
    })
    .reduce((a, b) -> a.union(b))
    .get();

saveDsToHdfs(hdfsPath, resultDs);
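Putting it together, here is a minimal self-contained sketch of the recommended approach. It assumes the cleanedDs, givenItemList, and sparkSession objects from the question; the OrderItemAggregates class, the writeAggregates method, the year grouping column, and the idea of labelling each row with the item name are illustrative assumptions, not taken from the original code. Labelling each row keeps every per-item query on the same schema (year, itemCol, mean), which is what makes the union valid.

import java.util.List;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OrderItemAggregates {

    // Build one aggregate per item, union them all, and write the result in a single job,
    // so only one FileOutputCommitter touches the output path.
    public static void writeAggregates(SparkSession sparkSession,
                                       Dataset<Row> cleanedDs,
                                       List<String> givenItemList,
                                       String hdfsPath) {
        cleanedDs.createOrReplaceTempView("source_tab");

        Dataset<Row> resultDs = givenItemList.stream()
            .map(item -> sparkSession.sql(
                "select year, '" + item + "' as itemCol, avg(" + item + ") as mean "
                + "from source_tab group by year"))
            .reduce(Dataset::union)
            .orElseThrow(() -> new IllegalArgumentException("givenItemList is empty"));

        // A single write job stages everything under one _temporary directory,
        // so there are no concurrent renames into the same output path.
        resultDs.write()
                .format("parquet")
                .mode("append")
                .save(hdfsPath);
    }
}

Compared with the original parallel loop, this runs one Spark write job instead of one per item, which is what removes the rename conflict; the per-item queries remain logical plans until the final save triggers execution.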

This concludes the article on the error java.io.IOException: Failed to rename when writing to an HDFS path. We hope the recommended answer helps, and we hope you will continue to support IT屋!
