Uploading DataTable to Azure blob storage
Problem description
I am trying to serialize a DataTable to XML and then upload it to Azure blob storage.
The code below works, but it seems clunky and memory-hungry. Is there a better way to do this? I'm especially referring to the fact that I dump a memory stream to a byte array and then create a new memory stream from it.
var container = blobClient.GetContainerReference("container");
var blockBlob = container.GetBlockBlobReference("blob");

byte[] blobBytes;
using (var writeStream = new MemoryStream())
{
    using (var writer = new StreamWriter(writeStream))
    {
        // Serialize the table (with schema) into the memory stream.
        table.WriteXml(writer, XmlWriteMode.WriteSchema);
    }
    // ToArray is valid even after the stream has been closed.
    blobBytes = writeStream.ToArray();
}

// Second copy of the data, just to get a readable stream for upload.
using (var readStream = new MemoryStream(blobBytes))
{
    blockBlob.UploadFromStream(readStream);
}
Answer
New answer:
I've since learned of a better approach, which is to open a write stream directly on the blob. For example:
// OpenWrite returns a stream that writes straight to the blob,
// so the serialized XML never has to be held in memory as a whole.
using (var writeStream = blockBlob.OpenWrite())
{
    using (var writer = new StreamWriter(writeStream))
    {
        table.WriteXml(writer, XmlWriteMode.WriteSchema);
    }
}
Per the developer, this does not require the entire table to be buffered in memory, and will probably involve less copying of data around.
Original answer:
You can use the CloudBlockBlob.UploadFromByteArray method and upload the byte array directly, instead of creating a second stream.
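As a sketch, the question's code could drop the second MemoryStream like this (assuming the classic WindowsAzure.Storage SDK, where CloudBlockBlob.UploadFromByteArray(buffer, index, count) is available; blobClient and table come from the question's surrounding code):

using System.Data;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

var container = blobClient.GetContainerReference("container");
var blockBlob = container.GetBlockBlobReference("blob");

byte[] blobBytes;
using (var writeStream = new MemoryStream())
{
    using (var writer = new StreamWriter(writeStream))
    {
        table.WriteXml(writer, XmlWriteMode.WriteSchema);
    }
    blobBytes = writeStream.ToArray();
}

// Upload the buffer directly; no second MemoryStream is needed.
blockBlob.UploadFromByteArray(blobBytes, 0, blobBytes.Length);

Note this still buffers the whole serialized table in memory once; only the OpenWrite approach above avoids that.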