Node.js Cloud Function - Stream CSV data directly to Google Cloud Storage file


Problem Description

I have a script that can call a RESTful API and retrieve CSV data from a report in chunks. I'm able to concatenate, parse, and display this data in the console. I am also able to write this CSV data to a local file and store it.

What I am trying to figure out is how to skip creating a file to store this data before uploading it to GCS and instead transfer it directly into Google Cloud Storage to save as a file. Since I am trying to make this a serverless cloud function, I am trying to stream it directly from memory into a Google Cloud Storage file.

I found this 'Streaming Transfers' documentation on Google, but it only references doing this with 'gsutil', and I am struggling to find any examples or documentation on how to do this with Node.js. I also tried to follow this answer on Stack Overflow, but it's from 2013 and the methods seem a little outdated. My script also isn't user-facing, so I don't need to hit any routes.

I am able to upload local files directly to my bucket using the function below, so authentication isn't an issue. I'm just unsure how to convert a CSV blob or object in memory into a file in GCS. I haven't been able to find many examples, so I wasn't sure if anyone else has solved this issue in the past.

const { Storage } = require('@google-cloud/storage');
const storage = new Storage({
  projectId,
  keyFilename
});

async function uploadCSVToGCS() {
  const localFilePath = './test.csv';
  const bucketName = 'Test_Bucket';
  const bucket = storage.bucket(bucketName);

  // bucket.upload() returns a promise; await it so failures surface.
  await bucket.upload(localFilePath);
}
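(For data that is already fully in memory, the same client library also exposes `File#save()`, which skips streams entirely. A minimal sketch, assuming a `file` handle obtained like `bucket.file('test.csv')` above; the helper name is hypothetical:)

```javascript
// Sketch: write an in-memory CSV string straight to a GCS object.
// `file` is assumed to be a File handle from storage.bucket(...).file(...).
// file.save() sends the whole string in one request, which is fine for
// report-sized CSVs but buffers everything in memory.
async function saveCSVStringToGCS(file, csvString) {
  await file.save(csvString, {
    contentType: 'text/csv',
    resumable: false // single-request upload; skip the resumable session
  });
}
```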

I also found a third-party tool that Google references called 'boto' that seems to do what I want, but unfortunately it's for Python, not Node.js.

Recommended Answer

@doug-stevenson thanks for pushing me in the right direction. I was able to get it to work with the following code:

const { Storage } = require('@google-cloud/storage');
const request = require('request');

const storage = new Storage();
const bucketName = 'test_bucket';
const blobName = 'test.csv';
const bucket = storage.bucket(bucketName);
const blob = bucket.file(blobName);

function pipeCSVToGCS(redirectUrl) {
  // Pipe the HTTP response straight into the GCS object's write stream,
  // so the CSV never touches the local filesystem.
  request.get(redirectUrl)
    .pipe(blob.createWriteStream({
      metadata: {
        contentType: 'text/csv'
      }
    }))
    .on('error', (err) => {
      console.error('error occurred', err);
    })
    .on('finish', () => {
      console.info('success');
    });
}

