Play 2.x : Reactive file upload with Iteratees


Problem Description


I will start with the question: how do I use the Scala API's Iteratee to upload a file to cloud storage (Azure Blob Storage in my case, but I don't think that's the most important part right now)?

Background:

I need to chunk the input into blocks of about 1 MB for storing large media files (300 MB+) as Azure BlockBlobs. Unfortunately, my Scala knowledge is still poor (my project is Java-based, and the only use for Scala in it will be an upload controller).

I tried with the code from this question: Why makes calling error or done in a BodyParser's Iteratee the request hang in Play Framework 2.0? (as an input Iteratee) - it works quite well, but each Element I could use has a size of 8192 bytes, which is too small for sending files of several hundred megabytes to the cloud.

I must say this is quite a new approach for me, and most probably I have misunderstood something (I don't want to say that I misunderstood everything ;> ).

I would appreciate any hint or link that helps me with this topic. If there is any sample of similar usage, it would be the best option for me to get the idea.

Solution

Basically, what you need first is to rechunk the input into bigger chunks of 1024 * 1024 bytes.

First, let's have an Iteratee that will consume up to 1 MB of bytes (it is OK for the last chunk to be smaller):

val consumeAMB = 
  Traversable.takeUpTo[Array[Byte]](1024*1024) &>> Iteratee.consume()

Using that, we can construct an Enumeratee (adapter) that will regroup chunks, using an API called grouped:

val rechunkAdapter: Enumeratee[Array[Byte], Array[Byte]] =
  Enumeratee.grouped(consumeAMB)

Here grouped uses an Iteratee to determine how much to put in each chunk. It uses our consumeAMB for that, which means the result is an Enumeratee that rechunks the input into Array[Byte] chunks of 1 MB.
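Play's Enumeratee.grouped is hard to demonstrate outside a running Play application, but the rechunking idea itself can be sketched in plain Scala with no Play dependency. The `rechunk` name and the tiny block size below are illustrative only; in the real adapter the block size would be 1024 * 1024.

```scala
// Sketch of the rechunking idea in plain Scala (no Play dependency):
// concatenate the incoming Array[Byte] chunks and re-split them into
// fixed-size blocks; only the last block may be smaller.
def rechunk(chunks: Seq[Array[Byte]], blockSize: Int): Seq[Array[Byte]] =
  chunks.flatten.grouped(blockSize).map(_.toArray).toSeq

// Example: three 5-byte inputs (15 bytes total) regrouped into 8-byte blocks.
val in  = Seq.fill(3)(Array.fill[Byte](5)(1))
val out = rechunk(in, 8)
// out.map(_.length) == Seq(8, 7)
```

This is exactly what the Enumeratee does incrementally and non-blockingly: downstream only ever sees full-size blocks, plus one possibly smaller trailing block.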

Now we need to write the BodyParser, which will use the Iteratee.foldM method to send each chunk of bytes:

val writeToStore: Iteratee[Array[Byte], ConnectionHandle] =
  Iteratee.foldM[Array[Byte], ConnectionHandle](connectionHandle) { (c, bytes) =>
    // write the bytes and return the next handle, probably in a Future;
    // ConnectionHandle stands for whatever state type your storage client uses
  }

foldM passes a state along and feeds it, together with each chunk, to the function (S, Array[Byte]) => Future[S] it is given, which returns a new Future of the state. foldM will not call the function again until that Future is completed and a new chunk of input is available.
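Iteratee.foldM itself lives in the play-iteratees library, but its state-threading contract can be sketched with plain scala.concurrent.Futures: each step receives the previous state and the next chunk, and the next step does not start until the previous Future has completed. The `foldFuture` helper and the byte-counting state below are illustrative, not part of the Play API.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Sketch of foldM's contract with plain Futures: thread a state S through
// the chunks, starting each step only after the previous Future completes.
def foldFuture[S](chunks: Seq[Array[Byte]], init: S)(
    f: (S, Array[Byte]) => Future[S]): Future[S] =
  chunks.foldLeft(Future.successful(init)) { (acc, bytes) =>
    acc.flatMap(s => f(s, bytes)) // sequential: waits for the previous step
  }

// Example state: the total number of bytes "written" so far.
val chunks = Seq(Array.fill[Byte](3)(0), Array.fill[Byte](4)(0))
val total = Await.result(
  foldFuture(chunks, 0)((count, bytes) => Future(count + bytes.length)),
  5.seconds)
// total == 7
```

In the real writeToStore, the state would be the storage handle rather than a counter, and each step would issue the actual write to Azure.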

And the body parser will rechunk the input and push it into the store:

BodyParser( rh => (rechunkAdapter &>> writeToStore).map(Right(_)))

Returning a Right indicates that body parsing succeeded, and that you are returning a body at the end of the parsing (which here happens to be the storage handle).
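Putting the two pieces together outside Play, the whole pipeline amounts to: regroup the raw chunks into fixed-size blocks, then write them sequentially while threading a state along. The `upload` function below simulates this with plain Scala collections and Futures; the Vector-of-lengths state stands in for the Azure put-block calls and is purely illustrative.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Plain-Scala simulation of the body-parser pipeline:
// rechunk the input, then "upload" each block sequentially.
def upload(raw: Seq[Array[Byte]], blockSize: Int): Future[Vector[Int]] = {
  // rechunk step (what rechunkAdapter does incrementally)
  val blocks = raw.flatten.grouped(blockSize).map(_.toArray).toSeq
  // sequential write step (what writeToStore does with foldM)
  blocks.foldLeft(Future.successful(Vector.empty[Int])) { (acc, block) =>
    // a real writeBlock would call the Azure client; here we record sizes
    acc.flatMap(sizes => Future(sizes :+ block.length))
  }
}

// 20 bytes of input rechunked into 8-byte blocks -> block sizes 8, 8, 4.
val sizes = Await.result(upload(Seq(Array.fill[Byte](20)(1)), 8), 5.seconds)
// sizes == Vector(8, 8, 4)
```

The Play version does the same thing reactively: no block is materialized before the previous write's Future has completed, which is what keeps memory usage bounded for 300 MB+ uploads.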
