Issue with Azure chunked upload to fileshare via Azure.Storage.Files.Shares library

Problem description

I'm trying to upload files to an Azure fileshare using the library Azure.Storage.Files.Shares.

If I don't chunk the file (by making a single UploadRange call) it works fine, but for files over 4 MB I haven't been able to get the chunking working. The file is the same size when downloaded, but won't open in a viewer.

I can't set smaller HttpRanges on a large file as I get a 'request body is too large' error, so I'm splitting the file stream into multiple mini streams and uploading the entire HttpRange of each of these:

        ShareClient share = new ShareClient(Common.Settings.AppSettings.AzureStorageConnectionString, ShareName());
        ShareDirectoryClient directory = share.GetDirectoryClient(directoryName);

        ShareFileClient file = directory.GetFileClient(fileKey);
        using(FileStream stream = fileInfo.OpenRead())
        {
            file.Create(stream.Length);

            //file.UploadRange(new HttpRange(0, stream.Length), stream);

            int blockSize = 128 * 1024;

            BinaryReader reader = new BinaryReader(stream);
            while(true)
            {
                byte[] buffer = reader.ReadBytes(blockSize);
                if (buffer.Length == 0)
                    break;

                MemoryStream uploadChunk = new MemoryStream();
                uploadChunk.Write(buffer, 0, buffer.Length);
                uploadChunk.Position = 0;

                file.UploadRange(new HttpRange(0, uploadChunk.Length), uploadChunk);
            }

            reader.Close();
        }

The code above uploads without error, but when downloading the image from Azure it is corrupt.

Does anyone have any ideas? Thanks for any help you can provide.

Cheers

Steve

Recommended answer

I was able to reproduce the issue. Basically the problem is with the following line of code:

new HttpRange(0, uploadChunk.Length)

Essentially you're always writing the content to the same range (starting at offset 0), and that's why the file is getting corrupted.

Please try the code below. It should work. What I did here is define an HTTP range offset and advance it by the number of bytes already written to the file.

        using (FileStream stream = fileInfo.OpenRead())
        {
            file.Create(stream.Length);

            //file.UploadRange(new HttpRange(0, stream.Length), stream);

            int blockSize = 1 * 1024;
            long offset = 0;//Define http range offset
            BinaryReader reader = new BinaryReader(stream);
            while (true)
            {
                byte[] buffer = reader.ReadBytes(blockSize);
                if (buffer.Length == 0)
                    break;

                MemoryStream uploadChunk = new MemoryStream();
                uploadChunk.Write(buffer, 0, buffer.Length);
                uploadChunk.Position = 0;

                HttpRange httpRange = new HttpRange(offset, buffer.Length);
                var resp = file.UploadRange(httpRange, uploadChunk);
                offset += buffer.Length;//Shift the offset by number of bytes already written
            }

            reader.Close();
        }
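For completeness, here is a minimal sketch of the same offset-advancing pattern, assuming the Azure.Storage.Files.Shares and Azure.Core packages; UploadInChunks, localPath and fileKey are hypothetical names, not part of the answer above. It reads the file directly into a reusable buffer instead of going through a BinaryReader and an extra MemoryStream copy, and it uses a 4 MiB block, which is the documented maximum range size for a single UploadRange call.

        // A minimal sketch of chunked upload with an advancing offset (assumed helper, not the answer's exact code).
        using System;
        using System.IO;
        using Azure;                          // HttpRange
        using Azure.Storage.Files.Shares;     // ShareDirectoryClient, ShareFileClient

        static void UploadInChunks(ShareDirectoryClient directory, string fileKey, string localPath)
        {
            const int blockSize = 4 * 1024 * 1024;   // 4 MiB, the maximum size of a single UploadRange call

            ShareFileClient file = directory.GetFileClient(fileKey);

            using FileStream stream = File.OpenRead(localPath);
            file.Create(stream.Length);              // pre-allocate the full file size

            byte[] buffer = new byte[blockSize];
            long offset = 0;
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Wrap only the bytes actually read; the last chunk is usually shorter than blockSize.
                using MemoryStream chunk = new MemoryStream(buffer, 0, read, writable: false);
                file.UploadRange(new HttpRange(offset, read), chunk);
                offset += read;                      // advance the range by the bytes just written
            }
        }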
