Spark to read a big file as an input stream


Question

I know Spark's built-in methods can partition a huge file, read it in chunks, and distribute it as an RDD using textFile. However, I am reading this file from a custom encrypted filesystem that Spark does not support natively. One approach I can think of is to read an InputStream instead, load a batch of lines at a time, and distribute each batch to the executors, continuing until the whole file has been read, so that no executor blows up with an out-of-memory error. Is it possible to do this in Spark?
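The approach described above can be sketched roughly as follows in Scala. Everything filesystem-specific here is an assumption: openEncryptedStream, the path /data/big.enc, and the batch size are placeholders for whatever the custom encrypted filesystem actually provides. Note also that each batch still passes through driver memory before sc.parallelize ships it to the cluster, so the batch size has to stay well below the driver's heap.

import java.io.{BufferedReader, InputStreamReader}
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession
import scala.collection.mutable.ArrayBuffer

object ChunkedLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("chunked-load").getOrCreate()
    val sc = spark.sparkContext

    // openEncryptedStream is a hypothetical helper standing in for the
    // custom encrypted filesystem; it is assumed to return an InputStream.
    val reader = new BufferedReader(new InputStreamReader(openEncryptedStream("/data/big.enc")))

    val batchSize = 100000                       // lines per batch; tune to available memory
    val batches = ArrayBuffer.empty[RDD[String]]
    val buf = ArrayBuffer.empty[String]

    var line = reader.readLine()
    while (line != null) {
      buf += line
      if (buf.size == batchSize) {
        batches += sc.parallelize(buf.toList)    // ship this slice of lines to the cluster
        buf.clear()
      }
      line = reader.readLine()
    }
    if (buf.nonEmpty) batches += sc.parallelize(buf.toList)
    reader.close()

    val lines: RDD[String] = sc.union(batches.toSeq)  // one logical RDD over all batches
    println(lines.count())

    spark.stop()
  }

  // Hypothetical placeholder; replace with the real encrypted-filesystem API.
  def openEncryptedStream(path: String): java.io.InputStream =
    new java.io.FileInputStream(path)
}

sc.union stitches the batches back into a single RDD, so downstream code can treat the file as one dataset even though it was loaded piecewise.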

Answer

You can try lines.take(n) for different values of n to find the limit of your cluster.

Or:

spark.readStream.option("sep", ";").csv("filepath.csv")
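Note that Structured Streaming's file source needs an explicit schema (or spark.sql.streaming.schemaInference enabled) and watches a directory rather than a single file, so in practice the one-liner above tends to look more like the sketch below. The /data/incoming/ path and the two-column schema are assumptions for illustration only.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{StringType, StructType}

val spark = SparkSession.builder().appName("stream-read").getOrCreate()

// Streaming file sources require an explicit schema (hypothetical two-column layout here).
val schema = new StructType()
  .add("id", StringType)
  .add("payload", StringType)

val df = spark.readStream
  .option("sep", ";")
  .schema(schema)
  .csv("/data/incoming/")            // directory watched by the streaming file source

// Print each micro-batch to the console; sink and trigger depend on the use case.
val query = df.writeStream
  .format("console")
  .option("truncate", "false")
  .start()

query.awaitTermination()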

