Streaming POST a large file to CherryPy by Python client
Problem description
I want to POST a large file from a Python client to CherryPy. I'm using the requests library.
This is my client code:
import sys

import requests


def upload(fileName=None):
    url = 'http://localhost:8080/upload'
    files = {'myFile': (fileName, open(fileName, 'rb'))}
    r = requests.post(url, files=files)
    # with open(fileName, 'rb') as payload:
    #     headers = {'content-type': 'multipart/form-data'}
    #     r = requests.post('http://127.0.0.1:8080', data=payload, verify=False, headers=headers)


if __name__ == '__main__':
    upload(sys.argv[1])
The problem is that this loads the whole file into RAM. Is there any way to POST the file in pieces?
import cherrypy


class FileDemo(object):

    @cherrypy.expose
    def upload(self, myFile):
        print myFile.filename
        # size = 0
        # decoder = MultipartDecoder(myFile, 'image/jpeg')
        # for part in decoder.parts:
        #     print(part.header['content-type'])
        # while True:
        #     # advance to the content that hasn't been read yet
        #     myFile.file.seek(size, 0)
        #     # read about 10 MB at a time so it doesn't fill up the RAM
        #     data = myFile.file.read(10240000)
        #     newFile = open("/home/ivo/Desktop/" + str(myFile.filename), 'a+')
        #     newFile.write(data)
        #     newFile.close()
        #     size += len(data)
        #     if len(data) < 10240000:
        #         break


if __name__ == '__main__':
    cherrypy.quickstart(FileDemo())
This is the code on the server side. It has a lot of comments because I've been trying a lot of things. Right now I'm just printing the file name, and the client still transfers the whole file into RAM.
I don't know what else to try. Thank you in advance for your help.
If it's a CherryPy-specific upload, you can skip the multipart/form-data encoding obstacle and just send the file contents as a streaming POST body.
client
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import io
import os
import urllib2


class FileLenIO(io.FileIO):

    def __init__(self, name, mode='r', closefd=True):
        io.FileIO.__init__(self, name, mode, closefd)
        self.__size = os.stat(name).st_size

    def __len__(self):
        # urllib2 uses len() to set the Content-Length header,
        # which lets it stream the file instead of buffering it
        return self.__size


f = FileLenIO('/home/user/Videos/video.mp4', 'rb')

request = urllib2.Request('http://127.0.0.1:8080/upload', f)
request.add_header('Content-Type', 'application/octet-stream')
# you can add a custom header with the filename if you need it

response = urllib2.urlopen(request)
print response.read()
server
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import os
import shutil

import cherrypy


config = {
    'global': {
        'server.socket_host': '127.0.0.1',
        'server.socket_port': 8080,
        'server.thread_pool': 8,
        # remove any limit on the request body size; cherrypy's default is 100MB
        'server.max_request_body_size': 0,
        # increase server socket timeout to 60s; cherrypy's default is 10s
        'server.socket_timeout': 60
    }
}


class App:

    @cherrypy.config(**{'response.timeout': 3600})  # default is 300s
    @cherrypy.expose()
    def upload(self):
        '''Handle non-multipart upload'''
        destination = os.path.join('/home/user/test-upload')
        with open(destination, 'wb') as f:
            shutil.copyfileobj(cherrypy.request.body, f)
        return 'Okay'


if __name__ == '__main__':
    cherrypy.quickstart(App(), '/', config)
Tested on a 1.3 GiB video file. Server-side memory consumption stays under 10 MiB, the client's under 5 MiB.