How to integrate a WebJob within an Azure Data Factory Pipeline
Question
I'm trying to integrate a WebJob into an ADF pipeline. The WebJob is a very simple console application:
using System;

namespace WebJob4
{
    class ReturnTest
    {
        static double CalculateArea(int r)
        {
            double area = r * r * Math.PI;
            return area;
        }

        static void Main()
        {
            int radius = 5;
            double result = CalculateArea(radius);
            Console.WriteLine("The area is {0:0.00}", result);
        }
    }
}
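Since the accepted answer below relies on exposing this logic through an HTTP trigger, one option is to wrap CalculateArea in an HTTP-triggered Azure Function that ADF can call. This is a sketch only, assuming the C# class-library Functions programming model; the function name and return payload are illustrative:

```csharp
using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class CalculateAreaFunction
{
    // HTTP-triggered wrapper around the console app's logic,
    // so an ADF copy task or Azure Function activity can invoke it.
    [FunctionName("CalculateArea")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        int radius = 5;
        double area = radius * radius * Math.PI;
        log.LogInformation($"The area is {area:0.00}");
        // Returning 200 OK gives ADF the success response code to store.
        return new OkObjectResult(area.ToString("0.00"));
    }
}
```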
How do we call this WebJob through an ADF pipeline and store the response code (HTTP 200 in the case of success) in Azure Blob storage?
Answer
Dec 2018 Update:

If you are thinking of doing this with an Azure Function, Azure Data Factory now provides an Azure Function activity. The underlying principle is the same, as you still have to expose the Azure Function with an HTTP trigger. However, this provides better security, since you can restrict your Data Factory instance's access to the Azure Function using ACLs.
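As a rough sketch of that activity (the activity name, linked service name, and function name here are invented for illustration, assuming the ADF v2 pipeline JSON schema):

```json
{
    "name": "Trigger WebJob Logic",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "AzureFunctionLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "functionName": "CalculateArea",
        "method": "GET"
    }
}
```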
Original Answer:

- From the comments posted, I believe you don't want to go the custom-activities route.
- You could try using a copy task for this, even though that is probably not its intended purpose.
- There is an HTTP connector available for copying data from a web source: https://docs.microsoft.com/en-us/azure/data-factory/v1/data-factory-http-connector
- The copy task triggers an HTTP endpoint.
- You can specify a variety of authentication mechanisms, from Basic to OAuth2.
- Below, I use the endpoint to trigger an Azure Function; the output is saved in a Data Lake folder for logging (you can obviously use other sinks, such as Blob storage in your case).
Linked service

{
    "name": "linkedservice-httpEndpoint",
    "properties": {
        "type": "Http",
        "typeProperties": {
            "url": "https://azurefunction.api.com/",
            "authenticationType": "Anonymous"
        }
    }
}
Basic input dataset
{
    "name": "Http-Request",
    "properties": {
        "type": "Http",
        "linkedServiceName": "linkedservice-httpEndpoint",
        "availability": {
            "frequency": "Minute",
            "interval": 30
        },
        "typeProperties": {
            "relativeUrl": "/api/status",
            "requestMethod": "Get",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ","
            }
        },
        "structure": [
            {
                "name": "Status",
                "type": "String"
            }
        ],
        "published": false,
        "external": true,
        "policy": {}
    }
}
Output dataset
{
    "name": "Http-Response",
    "properties": {
        "structure": [ ... ],
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "linkedservice-dataLake",
        "typeProperties": { ... },
        "availability": { ... },
        "external": false,
        "policy": {}
    }
}
Activity
{
    "type": "Copy",
    "name": "Trigger Azure Function or WebJob with Http Trigger",
    "scheduler": {
        "frequency": "Day",
        "interval": 1
    },
    "typeProperties": {
        "source": {
            "type": "HttpSource",
            "recursive": false
        },
        "sink": {
            "type": "AzureDataLakeStoreSink",
            "copyBehavior": "MergeFiles",
            "writeBatchSize": 0,
            "writeBatchTimeout": "00:00:00"
        }
    },
    "inputs": [
        { "name": "Http-Request" }
    ],
    "outputs": [
        { "name": "Http-Response" }
    ],
    "policy": { ... }
}