Multiple requests using urllib2.urlopen() at the same time


Question

My question is this: is it possible to request two different URLs at the same time?

What I'm trying to do is use a Python script to send requests to two different URLs at the same time, making two PHP scripts run simultaneously (on different servers, running a terminal command). My issue is that I can't run them one after the other, because each one takes a specific amount of time to do its work, and they need to start at the same time and end at the same time.

Is this possible using urllib2.urlopen? If so, how would I go about doing it?

If not, what would be a good method to do so?

Currently I have something like this:

import urllib2
...
if cmd.startswith('!command '):
    cmdTime = cmd.replace('!command ','',1)
    # Each urlopen() call blocks until its response arrives,
    # so the second request only starts once the first finishes.
    urllib2.urlopen('http://example.com/page.php?time='+cmdTime)
    urllib2.urlopen('http://example2.com/page.php?time='+cmdTime)
    print "Finished."

My issue is that they don't run at the same time.

If I run !command 60, it hits site.com for 60 seconds, and only then moves on to site2.com and runs that one for 60 seconds.

Answer

I would suggest wrapping the request in a function, then looping over the list of URLs and starting a thread for each one, so the requests run concurrently. Here is some sample code; please adjust it to your needs.

import threading
import urllib2


def fetch_url(url):
    # One blocking request per thread; running the threads
    # concurrently makes the requests overlap instead of queueing.
    urllib2.urlopen(url)

if cmd.startswith('!command '):
    cmdTime = cmd.replace('!command ','',1)
    urls_list = ['http://example.com/page.php?time=' + cmdTime,
                 'http://example2.com/page.php?time=' + cmdTime]
    threads = []
    for url in urls_list:
        thread = threading.Thread(target=fetch_url, args=(url,))
        thread.setDaemon(True)
        thread.start()
        threads.append(thread)

    # join() waits for every thread, so the script continues only
    # after both requests (started together) have both finished.
    for thread in threads:
        thread.join()
    print "Finished."
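
As a more compact variant (a sketch, not part of the original answer): Python 2's standard multiprocessing.dummy module provides a thread-backed Pool with the same interface as the process pool, so the fan-out and the wait collapse into a single map() call. The URLs below are placeholders standing in for the question's two endpoints.

import urllib2
from multiprocessing.dummy import Pool  # thread-backed Pool from the Python 2 stdlib


def fetch(url):
    # One blocking request per worker thread.
    return urllib2.urlopen(url).read()

# Placeholder URLs standing in for the two PHP endpoints in the question.
urls = ['http://example.com/page.php?time=60',
        'http://example2.com/page.php?time=60']

pool = Pool(len(urls))  # one worker thread per URL
pool.map(fetch, urls)   # blocks until every request has completed
pool.close()
pool.join()
print "Finished."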
