Azure Data Factory - Bulk Load
Question

I am trying to practice ADF bulk load using a ForEach activity, copying multiple tables into one file per table in Blob. As per the tutorial below, I have created a parameter of type Array named "tableList". Seeking help here.
My source is: SQL Server DB
Sink: Azure Blob
Error message: Activity ExecuteMultipleTablesForCopyPipeline failed: Activity ForEachTable failed: The function 'length' expects its parameter to be an array or a string. The provided value is of type 'Object'.
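This error usually means the ForEach activity's Items field evaluates to a single object rather than an array, either because the pipeline parameter was declared with type Object instead of Array, or because the expression points at one element. A minimal sketch of the ForEach settings, assuming the pipeline parameter is named tableList (as in the question) and declared with type Array:

```json
{
    "name": "ForEachTable",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.tableList",
            "type": "Expression"
        }
    }
}
```

If tableList is declared as Object, ADF wraps the value and the internal length check on the items fails with exactly this message, so the first thing to verify is the parameter's declared type.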
Answer

Hi Naveen Kumar,

Thanks for your inquiry.
I don't see any link in your query, but I have tried the scenario and was able to copy multiple tables into one file per table in Blob.
Here are the steps I followed.

Step 1:
- Create source (on-premises SQL Server) and sink (Blob) datasets.
- Below is the input value I used for the pipeline parameter (customer_table, project_table, data_source_table are the SQL table names):
[
    {
        "TABLE_NAME": "customer_table"
    },
    {
        "TABLE_NAME": "project_table"
    },
    {
        "TABLE_NAME": "data_source_table"
    }
]
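Inside the ForEach, each element of this array is available as @item(), so the table name for the current iteration is @item().TABLE_NAME. A hedged sketch of the Copy activity's source, where the activity name and dataset names are illustrative, not from the original thread:

```json
{
    "name": "CopyTableToBlob",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": {
                "value": "SELECT * FROM @{item().TABLE_NAME}",
                "type": "Expression"
            }
        },
        "sink": {
            "type": "BlobSink"
        }
    }
}
```

To get one file per table, the sink dataset can take a file name parameter and the Copy activity can pass something like @{item().TABLE_NAME}.csv for it; the exact parameter name depends on how the sink dataset is defined.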
Here is the setup in the UI:

Here is the output in Blob:
Hope this helps. If you see any issues, please reply back with the parameter input value and the tutorial link you are following. We will be happy to take a deep dive to unblock you.
Thanks!