Python - Upload an in-memory file (generated by API calls) to FTP in chunks


Problem Description

I need to be able to upload a file through FTP and SFTP in Python, but with some not-so-usual constraints.

  1. The file MUST NOT be written to disk.

The file is generated by calling an API and writing the JSON response to it.

There are multiple calls to the API. It is not possible to retrieve the whole result in a single API call.

I cannot store the full result in a string variable by making the multiple calls and appending to it each time until the whole file is in memory. The file could be huge and there is a memory constraint. Each chunk should be sent and its memory freed.

So here is some sample code of what I would like to do:

import io
import requests
from ftplib import FTP

def chunks_generator():
    # Page through the API, 100 records at a time, yielding each response body
    for offset in range(0, 4000, 100):
        data_chunk = requests.get(url=someurl, params={'offset': offset, 'limit': 100})
        yield data_chunk.content

def upload_file(self):
    for chunk in chunks_generator():
        chunk_io = io.BytesIO(chunk)
        ftp = FTP(self.host)
        ftp.login(user=self.username, passwd=self.password)
        ftp.cwd(self.remote_path)
        # Each iteration issues its own STOR, overwriting the remote file
        ftp.storbinary("STOR " + "myfilename.json", chunk_io)

I want only one file with all the chunks appended. What I already have, and what works, is sending the whole file at once when it is already in memory, like this:

string_io = io.BytesIO(all_chunks_together_in_one_string)
ftp = FTP(self.host)
ftp.login(user=self.username, passwd=self.password)
ftp.cwd(self.remote_path)
ftp.storbinary("STOR " + "myfilename.json", string_io)

Bonus

I need this with ftplib, but I will also need it with Paramiko for SFTP. If there are other libraries with which this would work better, I am open to them.

What if I need to zip the file? Can I zip each chunk and send the zipped chunks one at a time? A sketch of what I have in mind for that is below.
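For the compression part, the idea would be to gzip-compress each chunk incrementally with the standard-library zlib module and stream the compressed pieces out the same way as the plain chunks. An untested sketch (16 + MAX_WBITS selects the gzip container format):

import zlib

def gzipped_chunks(chunks):
    compressor = zlib.compressobj(zlib.Z_DEFAULT_COMPRESSION,
                                  zlib.DEFLATED, 16 + zlib.MAX_WBITS)
    for chunk in chunks:                  # each chunk is a bytes object
        compressed = compressor.compress(chunk)
        if compressed:                    # the compressor may buffer small inputs
            yield compressed
    yield compressor.flush()              # emit whatever is still buffered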

Recommended Answer

You can implement a file-like class that, whenever its .read(blocksize) method is called, retrieves the next piece of data from the API via requests.

Something like this (untested):

class ChunksGenerator:
    i = 0
    requests = None

    def __init__(self, requests):
        self.requests = requests

    def read(self, blocksize):
        # TODO: somehow detect the end of the data and return b'' in that case,
        # so that storbinary knows the upload is finished
        buf = self.requests.get(
                  url=someurl, params={'offset': self.i, 'limit': blocksize})
        self.i += blocksize
        return buf.content

generator = ChunksGenerator(requests)
ftp.storbinary("STOR " + "myfilename.json", generator)
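A more complete, still untested sketch that fills in that TODO is below, assuming the API signals the end of the data by returning an empty body once the offset is past the last record. The class name ApiFileReader and the host, username, password, and remote_path variables are placeholders of my own; someurl and the offset/limit parameters are the ones from the question, and blocksize is treated as the API's limit, as above.

import requests
from ftplib import FTP

class ApiFileReader:
    # File-like reader: storbinary keeps calling read() until it returns b''

    def __init__(self, url):
        self.url = url
        self.offset = 0
        self.exhausted = False

    def read(self, blocksize=8192):
        if self.exhausted:
            return b''
        response = requests.get(
            url=self.url, params={'offset': self.offset, 'limit': blocksize})
        data = response.content
        if not data:                      # assumption: empty body means no more records
            self.exhausted = True
            return b''
        self.offset += blocksize
        return data

# Placeholder connection details, as in the question's snippets
ftp = FTP(host)
ftp.login(user=username, passwd=password)
ftp.cwd(remote_path)
ftp.storbinary("STOR myfilename.json", ApiFileReader(someurl))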


With Paramiko, you can use the same class with the SFTPClient.putfo method.
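A minimal sketch of the Paramiko side, assuming password authentication; host, username, password, and remote_path are placeholders, and ApiFileReader is the reader class sketched above.

import paramiko

with paramiko.SSHClient() as ssh:
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(hostname=host, username=username, password=password)
    sftp = ssh.open_sftp()
    # putfo accepts any file-like object exposing read(); with confirm=True it
    # stat()s the uploaded file afterwards to verify the transferred size
    sftp.putfo(ApiFileReader(someurl), remote_path + '/myfilename.json')
    sftp.close()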
