File watch in S3 and send the particular path to a program


Problem Description


I am new to S3 bucket processing. I run Hive scripts on an EC2 instance, and their results are saved as .csv files in their respective folders in S3, according to the script. My requirement is a file watch that detects whenever a new .csv file is written or overwritten in any of these S3 folders, passes the full path of that .csv to my Python program, and runs the program so it saves its output .csv in the same folder. It would be helpful if anyone could suggest an approach I could pick up and implement.

Recommended Answer

  1. You can use Spark Streaming to monitor a directory and kick off work when new entries are added. This requires you to run a Spark cluster all the time.
  2. You can set up S3 itself to send events through S3 Event Notifications to a queue service (SQS/SNS) or to AWS Lambda.
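For option #2, a Lambda function subscribed to the bucket's notifications receives an event payload listing the affected objects. A minimal sketch of such a handler is below; `my_program.py` stands in for your own processing program, and the actual hand-off (subprocess call, imported function, etc.) is an assumption shown only as a comment:

```python
import urllib.parse


def extract_csv_paths(event):
    """Pull the full s3:// paths of newly written .csv objects
    out of an S3 event notification payload."""
    paths = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded (e.g. spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        if key.endswith(".csv"):
            paths.append(f"s3://{bucket}/{key}")
    return paths


def lambda_handler(event, context):
    for path in extract_csv_paths(event):
        # Hand the path to your program here, e.g.
        # subprocess.run(["python", "my_program.py", path])
        # Shown as a print for illustration.
        print(f"new csv: {path}")
    return {"status": "ok"}
```

Because an overwrite of an existing key is still a PUT, subscribing to `s3:ObjectCreated:*` events covers both new files and overwritten ones.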


Option #2 is going to be the lowest cost and most reliable.
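Wiring this up means attaching a notification configuration to the bucket (via the console, the CLI, or `put_bucket_notification_configuration` in boto3). A sketch of the configuration, restricted to `.csv` objects, might look like the following; the function ARN and account number are placeholders:

```json
{
  "LambdaFunctionConfigurations": [
    {
      "Id": "csv-watch",
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:process-csv",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [
            {"Name": "suffix", "Value": ".csv"}
          ]
        }
      }
    }
  ]
}
```

The suffix filter keeps the Lambda from firing on non-.csv objects, including the output files your own program writes back into the same folders (provided those use a distinguishable name or you guard against re-processing them).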

