Fetching multiple urls with aiohttp in python


Problem Description

In a previous question, a user suggested the following approach for fetching multiple urls (API calls) with aiohttp:

import asyncio
import aiohttp


url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000',
            'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()['data']


async def fetch_all(session, urls, loop):
    results = await asyncio.gather(*[loop.create_task(fetch(session, url)) for url in urls], return_exceptions= True)
    return results

if __name__=='__main__':
    loop = asyncio.get_event_loop()
    urls = url_list
    with aiohttp.ClientSession(loop=loop) as session:
        htmls = loop.run_until_complete(fetch_all(session, urls, loop))
    print(htmls)

However, this results in only returning Attribute errors:

[AttributeError('__aexit__',), AttributeError('__aexit__',)]

(which I enabled, otherwise it would just break). I really hope there is somebody here who can help with this; it is still kind of hard to find resources for asyncio etc. The returned data is in json format. In the end I would like to put all json dicts in a list.

Recommended Answer

Working example:

import asyncio
import aiohttp
import ssl

url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000',
            'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']


async def fetch(session, url):
    # A bare ssl.SSLContext() skips certificate verification; it works
    # around SSL errors here but is not recommended for production use.
    async with session.get(url, ssl=ssl.SSLContext()) as response:
        # Await the coroutine first; response.json() is not a dict itself.
        return await response.json()


async def fetch_all(urls, loop):
    # ClientSession is an asynchronous context manager, so it must be
    # entered with "async with", not a plain "with".
    async with aiohttp.ClientSession(loop=loop) as session:
        results = await asyncio.gather(*[fetch(session, url) for url in urls], return_exceptions=True)
        return results


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    urls = url_list
    htmls = loop.run_until_complete(fetch_all(urls, loop))
    print(htmls)
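The two changes that matter: aiohttp.ClientSession is an asynchronous context manager, so it has to be entered with async with rather than a plain with, and response.json() returns a coroutine, so it has to be awaited before the result is used. In the question's await response.json()['data'], the subscript binds tighter than await, so it is applied to the coroutine object itself.

If you also want the flat list of json dicts mentioned in the question, you can pull the 'data' field out of each payload and merge the per-URL results. A minimal sketch building on the example above (the fetch_data helper and the flattening step are illustrative additions, and it assumes each pushshift response keeps its 'data' key):

import asyncio
import aiohttp
import ssl

url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000',
            'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']


async def fetch_data(session, url):
    async with session.get(url, ssl=ssl.SSLContext()) as response:
        # Await first, then subscript the resulting dict (not the coroutine).
        payload = await response.json()
        return payload['data']


async def fetch_all_data(urls, loop):
    async with aiohttp.ClientSession(loop=loop) as session:
        return await asyncio.gather(*[fetch_data(session, url) for url in urls],
                                    return_exceptions=True)


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    per_url = loop.run_until_complete(fetch_all_data(url_list, loop))
    # Merge the per-URL lists into one flat list of comment dicts,
    # skipping any URL whose fetch raised an exception.
    all_comments = [comment
                    for result in per_url if not isinstance(result, Exception)
                    for comment in result]
    print(all_comments)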
