Upload Zip Chunks and Join them on Azure Platform


Problem Description

I want to know: if I make chunks of a big zip file and upload all of the chunks to Azure cloud storage as container blobs, can I join these chunks on the Azure platform? For chunking I am using this code, which also generates a .bat file for rejoining the chunks:

public void SplitFile()
{
    int numericUpDown = 100;                                    // chunk size in MB
    string PathToCopyChunks = "";                               // folder for the chunks and the ( .bat ) rejoin file
    string FilePathMakeChunks = DirectoryNameToPutScannedData;  // path of the file to split into chunks
    try
    {
        int chunkSize = numericUpDown * 1024 * 1024;            // chunk size in bytes
        byte[] buffer = new byte[4096];
        string sourceFileName = Path.GetFileName(FilePathMakeChunks);
        string cmdout = "copy /b ";

        using (FileStream infile = File.OpenRead(FilePathMakeChunks))
        {
            long totalChunks = (infile.Length + chunkSize - 1) / chunkSize;  // round up
            for (long i = 0; i < totalChunks; i++)
            {
                string partName = sourceFileName + "." + chunkSize + "." + i.ToString().PadLeft(4, '0') + ".part";
                string partPath = Path.Combine(PathToCopyChunks, partName);

                // Append this part to the copy /b command that re-joins the parts in order.
                cmdout += "\"" + partName + "\"";
                if (i < totalChunks - 1)
                    cmdout += " + ";

                // Copy up to chunkSize bytes of the source file into this part.
                using (FileStream outfile = File.Create(partPath))
                {
                    long written = 0;
                    while (written < chunkSize)
                    {
                        int toRead = (int)Math.Min(buffer.Length, chunkSize - written);
                        int len = infile.Read(buffer, 0, toRead);
                        if (len == 0)
                            break;                              // end of the source file
                        outfile.Write(buffer, 0, len);
                        written += len;
                    }
                }
            }
        }

        // The batch file re-creates the original file: copy /b "part0" + "part1" + ... "original.zip"
        cmdout += " \"" + sourceFileName + "\"";
        string combinerBatch = Path.Combine(PathToCopyChunks, sourceFileName + "." + chunkSize + ".combine.bat");
        File.WriteAllText(combinerBatch, cmdout);
        MessageBox.Show("Splitting Done...!");
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
    }
}

I am uploading these chunks, along with the batch file, to an Azure storage container, and I want to run this batch file against my Azure container to join the chunks. I hope this helps to clarify my question.

I am using this code for uploading:

// Upload every chunk (and the .bat file) in the local folder, one blob per file.
string[] array1 = Directory.GetFiles(@"D:\Test");
foreach (string name in array1)
{
    string fileName = Path.GetFileName(name);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

    // Dispose the file stream once the upload finishes.
    using (var fileStream = System.IO.File.OpenRead(name))
    {
        blockBlob.UploadFromStream(fileStream);
    }
}

Answer

Two things:

  1. If you are expecting the zipping to happen in blob storage itself, i.e. you upload large files into blob storage and expect blob storage to zip them up there, that is not possible. Blob storage is simply a file store.
  2. If you have a large zip file that you want to upload in chunks and then have blob storage reassemble those chunks to recreate the zip file, that is possible.

Since you didn't mention the technology you want to use, I will use the Azure REST API to describe the process.

What you would need to do is split the file into chunks on the client side (from where you are uploading). Each chunk can't be more than 4 MB in size, and because a block blob's maximum size is 200 GB, you can't have more than 50,000 chunks. You would then upload these chunks using the Put Block API.
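For illustration, here is a minimal sketch of that chunked upload using the same Microsoft.WindowsAzure.Storage client library as the upload code above, where CloudBlockBlob.PutBlock wraps the Put Block REST call. The blob name, local path, and block-ID scheme are hypothetical.

// Sketch: upload a large zip as uncommitted 4 MB blocks of a single block blob.
// Assumes "container" is an existing CloudBlobContainer, as in the upload code above.
// Requires: using System.Text; using Microsoft.WindowsAzure.Storage.Blob;
const int blockSize = 4 * 1024 * 1024;                                 // Put Block size limit mentioned above
var blockIds = new List<string>();
CloudBlockBlob zipBlob = container.GetBlockBlobReference("big.zip");   // hypothetical blob name

using (FileStream source = File.OpenRead(@"D:\Test\big.zip"))          // hypothetical local file
{
    byte[] buffer = new byte[blockSize];
    int blockNumber = 0;
    int read;
    while ((read = source.Read(buffer, 0, blockSize)) > 0)
    {
        // Block IDs must be Base64 strings of equal length within one blob.
        string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
        using (var chunk = new MemoryStream(buffer, 0, read))
        {
            zipBlob.PutBlock(blockId, chunk, null);                    // Put Block: stages one uncommitted block
        }
        blockIds.Add(blockId);
        blockNumber++;
    }
}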

Once all blocks are uploaded, you instruct blob storage to create the zip file using the Put Block List API.
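Continuing the sketch above (same assumed SDK and variables), committing the collected block IDs in order is a single call; PutBlockList wraps the Put Block List REST call.

// Commit the staged blocks in their original order; blob storage concatenates them
// into one committed blob, which is the reassembled zip file.
zipBlob.PutBlockList(blockIds);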
