Reassembling file chunks produced in a multi-part upload


Problem Description

I'm using the excellent flow.js library to handle file uploads. It's a resumable HTML5 upload that produces a bunch of chunks on the server that must be reassembled. For example, foo.mov might become

timestamp-foomov.1
timestamp-foomov.2
...
timestamp-foomov.n

Uploads are working but I'm having trouble recombining the parts into a single binary. I have the following code from the Node.js server example the library authors provided on Github (https://github.com/flowjs/flow.js/tree/master/samples/Node.js).

  $.write = function(identifier, writableStream, options) {
  options = options || {};
  options.end = (typeof options['end'] == 'undefined' ? true : options['end']);

  // Iterate over each chunk
  var pipeChunk = function(number) {

      var chunkFilename = getChunkFilename(number, identifier);
      fs.exists(chunkFilename, function(exists) {

          if (exists) {
              // If the chunk with the current number exists,
              // then create a ReadStream from the file
              // and pipe it to the specified writableStream.
              var sourceStream = fs.createReadStream(chunkFilename);
              sourceStream.pipe(writableStream, {
                  end: false
              });
              sourceStream.on('end', function() {
                  // When the chunk is fully streamed,
                  // jump to the next one
                  pipeChunk(number + 1);
              });
          } else {
              // When all the chunks have been piped, end the stream
              if (options.end) writableStream.end();
              if (options.onDone) options.onDone();
          }
      });
  }
  pipeChunk(1);
  }
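
(Note: getChunkFilename is defined elsewhere in the linked sample. A rough sketch of such a helper, assuming the chunks live in a tmp folder and are named <identifier>.<chunkNumber> as in the filenames above, might look like this; the real sample may differ.)

  var path = require('path');

  // Hypothetical sketch only: the folder name and the
  // "<identifier>.<chunkNumber>" naming are assumptions based on the
  // example filenames above, not the sample's actual convention.
  function getChunkFilename(chunkNumber, identifier) {
      return path.resolve('tmp', identifier + '.' + chunkNumber);
  }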

I'm invoking this code with the following route and am expecting it to produce a reassembled binary in the tmp directory (that's where I'm saving my chunks). However, nothing is happening. What am I missing?

  exports.download = function(req, res, next) {
      switch (req.method) {
          case 'GET':
              var stream = fs.createWriteStream('foobar');
              flow.write(req.params.identifier, res);
              break;
      }
  };

Solution

Reassembling all the chunks is easy; just call this:

     var stream = fs.createWriteStream(filename);
     r.write(identifier, stream);

And that is it!
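
For example, wiring that into the download route from the question might look roughly like the sketch below, assuming the sample's flow module is required as flow, as in the question's route. The 'tmp/foobar' path is only a placeholder; the key point is that the file write stream, not the response, is what gets passed to flow.write:

  var fs = require('fs');

  exports.download = function(req, res, next) {
      if (req.method === 'GET') {
          // Write the reassembled file to disk; 'tmp/foobar' is a placeholder path.
          var stream = fs.createWriteStream('tmp/foobar');
          flow.write(req.params.identifier, stream, {
              onDone: function() {
                  // onDone comes from the sample's $.write shown above.
                  res.end('reassembled');
              }
          });
      }
  };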

The other question is: when should this method be called? Probably once all the chunks have been uploaded and are present in the tmp folder.

But there is another issue with duplicate calls of the done handler. This can be solved by creating and locking the file once all the chunks exist. Then call

    r.write(identifier, stream);

Then clean up all the chunks, release the lock, and close the file.
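
Put together, the whole "lock, write, clean up" sequence might look roughly like the sketch below. It is only an illustration: assembleOnce, chunkPath, and chunkCount are hypothetical helpers/values, a separate '.lock' file stands in for the lock, and real code would need proper error handling:

  var fs = require('fs');

  function assembleOnce(identifier, chunkCount, targetFilename) {
      // 1. Take the lock with an exclusive create ('wx' fails if the file
      //    already exists), so only one request performs the reassembly.
      fs.open(targetFilename + '.lock', 'wx', function(err, lockFd) {
          if (err) return; // lock already taken: someone else is (or was) assembling

          // 2. Reassemble all chunks into the target file. flow.write ends the
          //    stream itself, since options.end defaults to true in the sample.
          var target = fs.createWriteStream(targetFilename);
          flow.write(identifier, target, {
              onDone: function() {
                  // 3. Clean up the chunks, then close and release the lock.
                  for (var i = 1; i <= chunkCount; i++) {
                      fs.unlink(chunkPath(identifier, i), function() {});
                  }
                  fs.close(lockFd, function() {
                      fs.unlink(targetFilename + '.lock', function() {});
                  });
              }
          });
      });
  }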

The same approach is used in the PHP server-side library: https://github.com/flowjs/flow-php-server/blob/master/src/Flow/File.php#L102
