Django "chunked uploads" to Amazon s3
Problem description
We're using S3Boto3Storage to upload media files to our s3 storage on Amazon. This works pretty well. Since we're on Cloudflare's free plan, we're limited to a maximum of 100MB per request. This is a big problem: even the Enterprise plan is limited to 500MB.
Is there a way to use a kind of "chunked uploads" to bypass the 100MB-per-request limit?
models.py
class Media(models.Model):
    name = models.CharField(max_length=100, null=True)
    file = models.FileField(upload_to=get_path)
storage.py
from storages.backends.s3boto3 import S3Boto3Storage

class MediaStorage(S3Boto3Storage):
    location = 'media'
    file_overwrite = False
views.py
@api_view(['POST'])
def upload_media(request):
    if request.method == 'POST':
        serializer = MediaSerializer(data=request.data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
Answer
In order to bypass that limit, you'll have to use something like resumable.js on the client side to split the upload into chunks and send them to the server via REST calls. Each chunk then stays well under Cloudflare's 100MB cap. On the server side, you reassemble the file from its chunks before pushing it to s3.
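As a minimal sketch of the server side, the two helpers below store incoming chunks in a temp directory and concatenate them once the last chunk arrives. This is not from the original answer: the function names, the per-upload `identifier`, and the zero-padded chunk filenames are assumptions; resumable.js by default sends fields like `resumableChunkNumber` and `resumableIdentifier` that a view would pass into these helpers.

```python
import os
import tempfile

# Hypothetical staging area for in-progress uploads (assumption, not
# part of the original answer); one subdirectory per upload identifier.
CHUNK_DIR = os.path.join(tempfile.gettempdir(), 'upload_chunks')

def save_chunk(identifier, chunk_number, chunk_data):
    """Persist one chunk to the upload's temp directory.

    Zero-padding the filename makes a plain lexicographic sort
    equal to the numeric chunk order during reassembly.
    """
    upload_dir = os.path.join(CHUNK_DIR, identifier)
    os.makedirs(upload_dir, exist_ok=True)
    with open(os.path.join(upload_dir, f'{chunk_number:08d}'), 'wb') as f:
        f.write(chunk_data)

def assemble(identifier, target_path):
    """Concatenate all received chunks, in order, into one file."""
    upload_dir = os.path.join(CHUNK_DIR, identifier)
    with open(target_path, 'wb') as out:
        for name in sorted(os.listdir(upload_dir)):
            with open(os.path.join(upload_dir, name), 'rb') as part:
                out.write(part.read())
```

A view would call `save_chunk` for each POST and, after the final chunk, call `assemble` and hand the complete file to the `Media` model so S3Boto3Storage uploads it to s3 as usual.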