Get continuous response of POST request in Python


Problem description


I'm writing a script that makes a POST request to a server and then blocks; the server keeps sending a response whenever a specific event is triggered. I have to obtain a cookie for the POST request with an earlier login request and pass it as data to the POST. Each cookie lasts for 10 minutes, after which I have to run a keep-alive request.

Whenever an event is triggered, I want to log that event to a file. I tried async and unirest requests; they generate the POST request, but I don't have control over the output. I tried sessions too, but to no avail. I want to do the following things, in this order:

1] Login (can do only once)

2] Post the request to the server

3] Keep monitoring the output of step 2 eternally; whenever there is some output, log it to a file

4] Keep the session alive with another request to the server.
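The four steps above can be sketched with requests' streaming mode. `LOGIN_URL`, `SUBSCRIBE_URL`, and the assumption that the login response body is the cookie are all hypothetical; `session` is assumed to be a `requests.Session()`:

```python
# Hypothetical endpoints -- adjust to the real server API.
LOGIN_URL = "https://server.example/login"        # hypothetical
SUBSCRIBE_URL = "https://server.example/events"   # hypothetical

def build_subscribe_xml(cookie):
    # Same payload as in the question, with the cookie quoted properly.
    return '<eventSubscribe cookie="%s" />' % cookie

def monitor(session, logfile="events.log"):
    # 1] Login once; here we assume the server answers with the cookie.
    cookie = session.post(LOGIN_URL, verify=False).text.strip()
    # 2] Subscribe with stream=True so requests returns as soon as the
    #    headers arrive, instead of waiting for the (endless) body.
    r = session.post(SUBSCRIBE_URL, data=build_subscribe_xml(cookie),
                     stream=True, verify=False)
    # 3] Monitor eternally, appending each event chunk to the log file.
    with open(logfile, "a") as log:
        for chunk in r.iter_content(chunk_size=1024):
            if chunk:
                log.write(chunk.decode("utf-8", "replace"))
                log.flush()
    # 4] The keep-alive request has to run from a separate thread,
    #    since this loop never returns.
```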

Let me know if you need more explanation.

Below is the code; it does not work, though:

while True:
    try:
        xmldata = '<eventSubscribe cookie="%s" />' % (self.cookie)
        r = requests.post(post_url, data=xmldata, stream=False, verify=False, timeout=10)
        write_to_file('Ok', r.text)
        unsubevents()
        logout()
    except Exception as e:
        print(e)
        self.write_to_file('Ok', "")
    self.login()

So in the code above, the POST call I make is blocking and continuous: it streams the output continuously, so the POST call never really completes. But it receives output in XML format; the server sends these responses every time an event is triggered.
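This is why `r.text` never returns here: with `stream=False`, requests reads the body to its end before decoding, and this body never ends. With `stream=True`, the arriving chunks can instead be re-assembled into complete XML events. A minimal sketch of such a buffer; the `</event>` terminator is an assumption, so use whatever closing tag the server actually emits:

```python
def split_events(buffer, chunk, terminator="</event>"):
    """Append a chunk to the buffer and split off complete XML events.
    Returns (remaining_buffer, list_of_complete_events)."""
    buffer += chunk
    events = []
    while terminator in buffer:
        head, _, buffer = buffer.partition(terminator)
        events.append(head + terminator)
    return buffer, events
```

Feeding each received chunk through this function yields whole events to log, while partial fragments stay in the buffer until the rest arrives.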

PS: I don't want to log out and log in again. This works in curl, where it keeps printing the output to stdout. I have to run this code for several servers, around 200.

Solution

I've fixed this problem with two-level threading and by reading chunks instead of content or read_lines().

1] A first thread is created, which spawns the second thread and runs the keep-alive when the timeout hits.

2] The second thread subscribes to events with the POST request and then keeps listening for chunks of size 1024; every time a response is received, it is parsed and the respective data is updated. Here I used requests with stream=True. This wasn't working for me earlier because the cookie used to expire before the response was read, and the session used to close.
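A sketch of that two-level scheme. The endpoints, the keep-alive payload, and the 30-second refresh margin are assumptions; `session` is assumed to be a `requests.Session()`:

```python
import threading
import time

SUBSCRIBE_URL = "https://server.example/events"     # hypothetical
KEEPALIVE_URL = "https://server.example/keepalive"  # hypothetical
COOKIE_LIFETIME = 10 * 60  # cookies last 10 minutes

def refresh_interval(lifetime=COOKIE_LIFETIME, margin=30):
    # Refresh a little before the cookie actually expires.
    return max(lifetime - margin, 1)

def listener(session, cookie, logfile):
    # Second thread: subscribe, then listen for 1024-byte chunks forever.
    xml = '<eventSubscribe cookie="%s" />' % cookie
    r = session.post(SUBSCRIBE_URL, data=xml, stream=True, verify=False)
    with open(logfile, "a") as log:
        for chunk in r.iter_content(chunk_size=1024):
            if chunk:
                log.write(chunk.decode("utf-8", "replace"))
                log.flush()

def supervisor(session, cookie, logfile="events.log"):
    # First thread: spawn the listener, then send keep-alives so the
    # cookie never expires while the listener is blocked on the stream.
    t = threading.Thread(target=listener, args=(session, cookie, logfile))
    t.daemon = True
    t.start()
    while t.is_alive():
        time.sleep(refresh_interval())
        session.post(KEEPALIVE_URL, data=cookie, verify=False)
```

For ~200 servers, one `supervisor` thread per server can be started the same way; the listeners spend almost all their time blocked on the socket, so the thread count stays manageable.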

If someone has a better way to do this, please update it here.
