Java Apache Beam - save file "LOCALY" by using DataflowRunner

Problem description
I can send the Java code, but it's not necessary at the moment.

I have an issue: when I run the job with DirectRunner (using a Google VM instance) it works fine, as it saves the information to a local file and carries on...
The problem appears when trying to use DataflowRunner, and the error I receive is:
java.nio.file.NoSuchFileException: XXXX.csv
.....
.....
XXXX.csv could not be deleted.
It could not be deleted, as it was never even created.
The question: how can I save the file "LOCALY" when using DataflowRunner??

P.S. Using Apache Beam
Pipeline (part of the code) - reading from BigQuery and storing the data in Google Storage (special-character issue)
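For context, the pipeline shape described above might look roughly like this (a minimal sketch, not the asker's actual code; the project, dataset, table, field name, and output name are placeholder assumptions):

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptors;

public class BigQueryToCsv {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("ReadFromBigQuery",
            // Placeholder table reference.
            BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"))
     .apply("FormatAsCsv",
            MapElements.into(TypeDescriptors.strings())
                       .via((TableRow row) -> row.get("some_field") + ""))
     // Writing to a worker-local path like this is what fails on
     // DataflowRunner: the workers have no shared local filesystem.
     .apply("WriteCsv", TextIO.write().to("XXXX").withSuffix(".csv"));

    p.run().waitUntilFinish();
  }
}
```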
Recommended answer
AFAIK, when the job runs as a Dataflow instance, you have to write the file to the GCS service (a.k.a. a storage bucket) rather than to local disk.
Have you tried that already? To create a storage bucket: https://cloud.google.com/storage/docs/creating-buckets
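In Beam terms, the fix is to point the write at a `gs://` URI instead of a local file name (a minimal sketch; the bucket name and output prefix are placeholder assumptions):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class WriteToGcsSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(Create.of("a,b,c", "d,e,f"))  // stand-in for the real data
     .apply("WriteToGcs",
            TextIO.write()
                  // A GCS path works on DataflowRunner because every worker
                  // can reach the bucket; a local path exists only on the
                  // single VM that happens to run that bundle.
                  .to("gs://my-bucket/output/XXXX")  // placeholder bucket/prefix
                  .withSuffix(".csv")
                  .withoutSharding());  // optional: produce one output file

    p.run().waitUntilFinish();
  }
}
```

If the file really is needed on the VM afterwards, it can be copied down once the job finishes, e.g. with `gsutil cp gs://my-bucket/output/XXXX.csv .`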