Script task to upload zip file to blob storage with Azure Data Factory SSIS
Question
I have an Azure Data Factory project. I need to query some data from my Azure SQL Database, load it into an XML document, zip it, and upload it to blob storage. I don't want to write anything to the file system (because I think the Azure environment doesn't have any local storage), so I am streaming everything in memory.
This Script Task works on my local SSIS installation but not on Azure Data Factory:
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.Collections;
using System.Linq;
using System.Data.OleDb;
using System.IO;
using System.IO.Compression;
using System.Reflection;              // needed for TargetInvocationException
using System.Data.SqlClient;
using Microsoft.Azure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
public void Main()
{
    CloudStorageAccount storageAccount = null;
    CloudBlobContainer cloudBlobContainer = null;
    try
    {
        // Build a DataSet with a nested customer/product relation from
        // the recordsets held in the SSIS object variables.
        DataSet ds = new DataSet("FullList");
        OleDbDataAdapter oleDa = new OleDbDataAdapter();
        DataTable dt = new DataTable("CustomerTable");
        oleDa.Fill(dt, Dts.Variables["User::CustomerSelect"].Value);
        ds.Tables.Add(dt);
        DataTable dt_product = new DataTable("ProductTable");
        oleDa.Fill(dt_product, Dts.Variables["User::ProductSelect"].Value);
        ds.Tables.Add(dt_product);
        DataRelation relation = ds.Relations.Add("relation",
            ds.Tables["CustomerTable"].Columns["id"],
            ds.Tables["ProductTable"].Columns["id"]);
        relation.Nested = true;

        // The connection manager holds the storage account connection string.
        string connstring = Dts.Connections["testolgdev"].AcquireConnection(Dts.Transaction).ToString();
        if (CloudStorageAccount.TryParse(connstring, out storageAccount))
        {
            try
            {
                CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
                cloudBlobContainer = cloudBlobClient.GetContainerReference("flat");
                string fileName = "xml" + DateTime.Now.ToString("yyyyMMddHHmmssfff") + ".zip";
                var blob = cloudBlobContainer.GetBlockBlobReference(fileName);

                // Write the zip archive straight into the blob's upload stream,
                // so nothing touches the local file system.
                using (var stream = new ZipArchive(blob.OpenWrite(), ZipArchiveMode.Create))
                {
                    var entry = stream.CreateEntry("test_dataset_fullresult_onlymem.xml");
                    using (var es = entry.Open())
                    {
                        ds.WriteXml(es);
                    }
                }
            }
            catch (StorageException ex)
            {
                Console.WriteLine("Error returned from the service: {0}", ex.Message);
            }
        }
        else
        {
            Console.WriteLine("Wrong connection string");
        }
    }
    catch (TargetInvocationException)
    {
        throw;
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
This is the Azure Data Factory SSIS error when I deploy and execute it:

Script Task 1: Error: Could not load file or assembly 'Microsoft.WindowsAzure.Storage, Version=4.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
Is it possible to fix this? Can I add the missing DLL to Azure Data Factory?
Answer
Following this guide, I was able to add the missing DLLs to the Azure-SSIS IR:
https://docs.microsoft.com/en-us/azure/data-factory/how-to-configure-azure-ssis-ir-custom-setup
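In short, a standard custom setup means staging the DLLs plus a `main.cmd` entry script in a blob container and pointing the Azure-SSIS IR at that container's SAS URI; the IR runs `main.cmd` on every node when it starts. A minimal sketch of such a `main.cmd`, assuming you have staged `gacutil.exe` and the exact `Microsoft.WindowsAzure.Storage.dll` version your Script Task references alongside it (file names here are illustrative):

```cmd
@echo off
rem Custom setup entry point, executed on each Azure-SSIS IR node.
rem Installs the storage client assembly into the GAC so the
rem Script Task can resolve it at run time.
rem Assumes gacutil.exe and the DLL were uploaded to the same container.
gacutil.exe /if Microsoft.WindowsAzure.Storage.dll
if %errorlevel% neq 0 exit /b %errorlevel%
echo Custom setup finished.
```

The DLL version installed this way must match the assembly version the Script Task was compiled against, otherwise the same manifest-mismatch error will recur.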
Thanks to Sandy Winarko (MSFT)!