Save each element of an RDD in a text file on HDFS
Problem description
I am using a Spark application. Each element of the RDD contains a large amount of data, and I want to save each element into its own HDFS file. I tried rdd.saveAsTextFile("foo.txt"), but that creates a single file for the whole RDD. The RDD has 10 elements, and I want 10 files in HDFS. How can I achieve this?
Recommended answer
If I understand your question, you can create a custom output format like this:
import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat

class RDDMultipleTextOutputFormat extends MultipleTextOutputFormat[Any, Any] {
  // Suppress the key in the file contents so only the value is written
  override def generateActualKey(key: Any, value: Any): Any = NullWritable.get()
  // Use the key itself as the name of the output file
  override def generateFileNameForKeyValue(key: Any, value: Any, name: String): String = key.asInstanceOf[String]
}
Then convert your RDD into a key/value one where the key is the file path, and use the saveAsHadoopFile function instead of saveAsTextFile, like this:
myRDD.saveAsHadoopFile(OUTPUT_PATH, classOf[String], classOf[String], classOf[RDDMultipleTextOutputFormat])
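For example, here is a minimal sketch of that keying step, assuming an existing SparkContext sc and an RDD[String] named rdd; the output directory /user/foo/out and the file names part-0, part-1, ... are made up for illustration:

import org.apache.spark.rdd.RDD

// Hypothetical input: one large string per desired output file
val rdd: RDD[String] = sc.parallelize(Seq("first big blob", "second big blob"))

// Key each element by a unique file name; RDDMultipleTextOutputFormat above
// uses the key as the name of the file created under the output path
val keyed = rdd.zipWithIndex().map { case (value, idx) => (s"part-$idx", value) }

keyed.saveAsHadoopFile("/user/foo/out", classOf[String], classOf[String], classOf[RDDMultipleTextOutputFormat])

With this, each of the 10 elements should land in its own file under the output directory (e.g. /user/foo/out/part-0, /user/foo/out/part-1, ...) rather than in a single combined output.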