Upload Big ZipArchive-MemoryStream to Azure
Question
TL;DR: Is it possible to upload a big MemoryStream to Azure as chunks on the fly while zipping?
I have files which get saved into a MemoryStream, and I add these files to a ZipArchive in another MemoryStream. I want to upload this MemoryStream to an Azure-BlockBlob-Storage using:
blockBlob.UploadFromStream(zipMemoryStream);
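
A minimal sketch of this setup (assuming the classic WindowsAzure.Storage SDK; filesToZip and container are placeholder names, not from the original post):

using (var zipMemoryStream = new MemoryStream())
{
    using (var archive = new ZipArchive(zipMemoryStream, ZipArchiveMode.Create, leaveOpen: true))
    {
        foreach (var file in filesToZip) // hypothetical collection of name/content pairs
        {
            var entry = archive.CreateEntry(file.Name);
            using (var entryStream = entry.Open())
            {
                file.Content.CopyTo(entryStream); // file.Content is a MemoryStream
            }
        }
    }

    zipMemoryStream.Position = 0; // rewind before uploading
    var blockBlob = container.GetBlockBlobReference("archive.zip");
    blockBlob.UploadFromStream(zipMemoryStream); // the whole archive sits in memory here
}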
So far so good. The problem now is that the zip archive might get bigger than 8GB, which is a problem with the MemoryStream.
Is it possible to upload parts of the MemoryStream as chunks to Azure and remove those bytes from the stream? Or is there a better approach to deal with ZipArchive and Azure?
For the zipping I use the package System.IO.Compression.
Best Regards,
Flo
Answer
It may not be exactly what you are looking for, but did you try to do something like this:
var blob = container.GetBlockBlobReference("zipped.zip");

// Write the zip archive straight into the blob's write stream
// instead of buffering the whole archive in a MemoryStream.
using (var stream = new ZipArchive(blob.OpenWrite(), ZipArchiveMode.Create))
{
    var entry = stream.CreateEntry("entry1");
    using (var es = entry.Open())
    {
        // Fill entry with data
    }
    // Other code
}
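
If you are on the newer Azure.Storage.Blobs (v12) SDK instead, the same streaming idea should carry over; a sketch assuming BlockBlobClient.OpenWrite (containerClient is a placeholder, and GetBlockBlobClient needs "using Azure.Storage.Blobs.Specialized;"):

var blockBlobClient = containerClient.GetBlockBlobClient("zipped.zip");
using (var blobStream = blockBlobClient.OpenWrite(overwrite: true)) // uploads blocks as they fill
using (var archive = new ZipArchive(blobStream, ZipArchiveMode.Create))
{
    var entry = archive.CreateEntry("entry1");
    using (var entryStream = entry.Open())
    {
        // Fill entry with data
    }
}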
When you call OpenWrite of CloudBlockBlob, it creates an instance of CloudBlobStream, which works differently from a MemoryStream: CloudBlobStream sends data to the Azure Storage Service in 4MB chunks and, as far as I remember, it doesn't keep the old chunks in memory.
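
If 4MB blocks feel too small for a multi-gigabyte archive, the block size used by OpenWrite can be tuned through CloudBlockBlob.StreamWriteSizeInBytes before opening the stream; a hedged example (the 8MB value is only illustrative, not a recommendation):

var blob = container.GetBlockBlobReference("zipped.zip");
blob.StreamWriteSizeInBytes = 8 * 1024 * 1024; // raise the per-block size from the 4MB default
using (var archive = new ZipArchive(blob.OpenWrite(), ZipArchiveMode.Create))
{
    // Write entries as in the snippet above
}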