Can Azure Data Factory constantly upload data from a Blob to process?


Question

Can Azure Data Factory constantly upload data from a Blob to process?

I noticed that in the link below, the upload of the raw data from the Blob is manual or, at most, scheduled at intervals (Step 5 - upload from Blob).

We would like the data to be available instantly, without delay.

https://docs.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-portal

dsk

Answer

Hi kimdav111,

Thank you for your inquiry, and yes, Azure Data Factory can process data from a Blob as soon as it arrives.

Event-based triggers in Data Factory pipelines currently support events raised from Blob storage containers: the arrival of a file or the deletion of a file in an Azure Storage account.

For a step-by-step implementation of an event-based trigger in an Azure Data Factory pipeline, please refer to this doc: Create a trigger that runs a pipeline in response to an event.
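
For reference, here is a minimal sketch (not from the original answer) of how such a blob event trigger might be created with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, storage account, container, and pipeline names are all placeholders, and exact method names can vary between SDK versions.

```python
# A minimal sketch, assuming the azure-identity and azure-mgmt-datafactory
# packages. All resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

subscription_id = "<subscription-id>"      # placeholder
resource_group = "<resource-group>"        # placeholder
factory_name = "<data-factory-name>"       # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Fire the pipeline as soon as a new blob lands in the container.
trigger = BlobEventsTrigger(
    scope=(
        f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
        "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    ),
    events=["Microsoft.Storage.BlobCreated"],
    blob_path_begins_with="/input-container/blobs/",
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="<pipeline-name>")
        )
    ],
)

client.triggers.create_or_update(
    resource_group, factory_name, "NewBlobTrigger", TriggerResource(properties=trigger)
)

# Triggers are created stopped; start it so blob events begin firing the pipeline.
client.triggers.begin_start(resource_group, factory_name, "NewBlobTrigger").result()
```

The same trigger can also be defined in the Azure portal UI, as the linked doc walks through; the SDK route is just one way to automate it.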

Please do let us know if this helps.

