How to use Django FileField with dynamic Amazon S3 bucket?


Question


I have a Django model with a FileField, and a default storage backed by an Amazon S3 bucket (via the excellent django-storages).


My problem is not uploading files to a dynamic folder path (as covered in many other answers). My problem is deeper and twofold:

  • The files are already in Amazon S3 buckets, and I don't want to download and re-upload them (worse: I only have read access to them).
  • The files are reachable with S3 credentials that can differ from one file to another (i.e. the files can live in different buckets and be accessed with different credentials). So my FileField must have a dynamic storage.

Any idea?


(Django 1.11, Python 3.)

Answer


It turns out it is not so difficult. But the code below isn't thoroughly tested, and I must warn you not to copy-paste it without checking!


I have created a custom FileField subclass:

from django.db import models
from storages.backends.s3boto3 import S3Boto3StorageFile


class DynamicS3BucketFileField(models.FileField):
    attr_class = S3Boto3StorageFile
    descriptor_class = DynamicS3BucketFileDescriptor

    def pre_save(self, model_instance, add):
        # Skip the default behaviour, which would try to
        # re-upload the file to storage on save.
        return getattr(model_instance, self.attname)


Note that attr_class specifically uses the S3Boto3StorageFile class (a File subclass provided by django-storages).


The pre_save overload has only one goal: to avoid the internal file.save call that would attempt to re-upload the file.
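To see why the no-op pre_save matters, here is a small standalone sketch. It is not Django source: FakeFile, stock_pre_save and readonly_pre_save are made-up stand-ins contrasting the default behaviour (which commits any uncommitted file to storage on save) with the override (which leaves storage untouched):

```python
class FakeFile:
    """Illustrative stand-in for a FieldFile with a committed flag."""

    def __init__(self, name, committed):
        self.name = name
        self._committed = committed
        self.uploaded = False

    def save(self):
        self.uploaded = True  # stands in for the upload to storage


def stock_pre_save(file):
    # Mirrors the default behaviour: commit (upload) if not yet committed.
    if file and not file._committed:
        file.save()
    return file


def readonly_pre_save(file):
    # The override's behaviour: never trigger an upload.
    return file


f = FakeFile('image.fits', committed=False)
stock_pre_save(f)
assert f.uploaded        # the default would have re-uploaded

g = FakeFile('image.fits', committed=False)
readonly_pre_save(g)
assert not g.uploaded    # the override leaves storage untouched
```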

The magic happens in the FileDescriptor subclass:

import six  # on Django 1.11 you can also use django.utils.six
from django.db.models.fields.files import FileDescriptor


class DynamicS3BucketFileDescriptor(FileDescriptor):
    def __get__(self, instance, cls=None):
        if instance is None:
            return self

        # Copied from FileDescriptor.
        if self.field.name in instance.__dict__:
            file = instance.__dict__[self.field.name]
        else:
            instance.refresh_from_db(fields=[self.field.name])
            file = getattr(instance, self.field.name)

        # Make sure storage is resolved to a Storage instance (note that
        # this caches the result on the field, shared by all instances).
        if callable(self.field.storage):
            self.field.storage = self.field.storage(instance)

        # The file can be a string here (depending on when/how we access the field).
        if isinstance(file, six.string_types):
            # Instantiate the file following the S3Boto3StorageFile constructor.
            file = self.field.attr_class(file, 'rb', self.field.storage)
            # Cache it the way FileDescriptor does (see the final return line).
            instance.__dict__[self.field.name] = file

        # Copied from FileDescriptor. The difference is that these three
        # properties are set unconditionally.
        file.instance = instance
        file.field = self.field
        file.storage = self.field.storage
        # Also add a very handy url property to the file.
        file.url = self.field.storage.url(file.name)

        return instance.__dict__[self.field.name]


The code above adapts some of FileDescriptor's internal code to my case. Note the if callable(self.field.storage): line, explained below.
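That check can be reduced to a few lines of plain Python. In this sketch, FakeStorage and per_row_storage are made-up stand-ins (not django-storages classes): a storage given as a callable is resolved lazily, with the model instance as its argument, so each row can end up with its own storage:

```python
def resolve_storage(storage, instance):
    # A field's storage may be a concrete object or a callable; in the
    # latter case, call it with the model instance to get a concrete one.
    return storage(instance) if callable(storage) else storage


class FakeStorage:
    """Stand-in for a storage bound to a single S3 bucket."""

    def __init__(self, bucket):
        self.bucket = bucket


def per_row_storage(instance):
    # Hypothetical resolver: pick the bucket from the instance itself.
    return FakeStorage(instance['bucket'])


s = resolve_storage(per_row_storage, {'bucket': 'archive-1'})
assert s.bucket == 'archive-1'
```

Keep in mind that the descriptor above stores the resolved storage back on self.field.storage, which is shared by all instances of the model, so it is resolved only once.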


The key line is file = self.field.attr_class(file, 'rb', self.field.storage), which automatically creates a valid S3Boto3StorageFile instance from the content of the current file attribute (sometimes it's a file, sometimes a plain string; that's part of the FileDescriptor business).
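That string-to-file promotion plus caching is a standard data-descriptor pattern. Here is a minimal generic sketch of the idea (CachingDescriptor, Media and the wrap lambda are illustrative, not Django or django-storages classes):

```python
class CachingDescriptor:
    """On first access, promote a raw string value to a wrapped object
    and cache it in the instance __dict__, as FileDescriptor does."""

    def __init__(self, name, wrap):
        self.name = name
        self.wrap = wrap  # turns a stored string into a file-like object

    def __set__(self, instance, value):
        instance.__dict__[self.name] = value

    def __get__(self, instance, cls=None):
        if instance is None:
            return self
        value = instance.__dict__.get(self.name)
        if isinstance(value, str):
            value = self.wrap(value)
            instance.__dict__[self.name] = value  # cache the wrapper
        return value


class Media:
    # Hypothetical wrapper standing in for S3Boto3StorageFile.
    file = CachingDescriptor('file',
                             wrap=lambda s: {'name': s, 'url': 's3://bucket/' + s})


m = Media()
m.file = 'photo.fits'                          # stored as a raw string
assert m.file['url'] == 's3://bucket/photo.fits'  # promoted on access
assert m.file is m.file                        # later accesses hit the cache
```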


Now the dynamic part comes quite simply: when declaring a FileField, you can pass a function as the storage option. Like this:

class MyMedia(models.Model):
    class Meta:
        app_label = 'appname'

    mediaset = models.ForeignKey(Mediaset, on_delete=models.CASCADE, related_name='media_files')
    file = DynamicS3BucketFileField(null=True, blank=True, storage=get_fits_file_storage)


The function get_fits_file_storage will then be called with a single argument: the MyMedia instance. Hence I can use any property of that object to return the valid storage. In my case, mediaset contains a key that allows me to retrieve an object holding S3 credentials, from which I can build an S3Boto3Storage instance (another class provided by django-storages).

Specifically:

def get_fits_file_storage(instance):
    name = instance.mediaset.archive_storage_name
    return instance.mediaset.archive.bucket_keys.get(name=name).get_storage()

Et voilà!
