How do I implement a task queue for a Python crawler?


Problem description

My crawler needs to fetch http://xxx.com/page.php?id=1 through http://xxx.com/page.php?id=88, but due to machine limits I can only run ten threads. How do I distribute these 88 fetch tasks across ten threads and finish them as quickly as possible? (P.S. I don't want to use a framework; I want to implement this natively myself.)

Solution

There are at least three approaches; a reference implementation of each follows.

A thread pool via multiprocessing.dummy, the thread-backed twin of multiprocessing, so Pool runs threads instead of processes:

from multiprocessing.dummy import Pool as ThreadPool


def worker(n):
    return n + 2

numbers = range(100)

# 10 worker threads; map() splits the tasks among them and
# returns the results in input order.
pool = ThreadPool(processes=10)
result = pool.map(worker, numbers)
pool.close()
pool.join()
print(result)
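If finishing as fast as possible matters more than result order, the same pool's imap_unordered() yields each result as soon as its task completes, so fast tasks are not blocked behind slow ones. A minimal sketch:

```python
from multiprocessing.dummy import Pool as ThreadPool


def worker(n):
    return n + 2

# imap_unordered() hands back results in completion order rather
# than submission order; list() drains the iterator.
with ThreadPool(processes=10) as pool:
    results = list(pool.imap_unordered(worker, range(100)))

# The set of results is the same, only the order differs.
print(sorted(results))
```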

concurrent.futures.ThreadPoolExecutor from the standard library:

import concurrent.futures


def worker(n):
    return n + 2

numbers = range(100)

# The executor caps concurrency at 10 threads; map() returns a
# lazy iterator of results in input order.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    result = executor.map(worker, numbers)
    print(list(result))
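Applied to the crawl described in the question, the same executor pattern distributes the 88 URLs across ten threads. fetch_page below is a placeholder of my own, not part of the original answer; in a real crawler it would issue an HTTP request (e.g. urllib.request.urlopen):

```python
import concurrent.futures

URL_TEMPLATE = "http://xxx.com/page.php?id={}"


def fetch_page(url):
    # Placeholder: a real crawler would download the page here,
    # e.g. urllib.request.urlopen(url).read().
    return url

urls = [URL_TEMPLATE.format(i) for i in range(1, 89)]  # id=1 .. id=88

with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    # submit() + as_completed() lets you handle each page as soon
    # as its fetch finishes, regardless of submission order.
    futures = {executor.submit(fetch_page, u): u for u in urls}
    pages = {}
    for future in concurrent.futures.as_completed(futures):
        pages[futures[future]] = future.result()

print(len(pages))
```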

A hand-rolled worker pool built on queue.Queue and threading, which is the closest to a literal "task queue":

from collections import deque
import queue
import threading


def do_work(n):
    return n + 2


def worker():
    while True:
        item = q.get()
        if item is None:  # sentinel: shut this worker down
            break
        result.append(do_work(item))  # deque.append is thread-safe
        q.task_done()

q = queue.Queue()
result = deque()
num_worker_threads = 10
threads = []
for i in range(num_worker_threads):
    t = threading.Thread(target=worker)
    t.start()
    threads.append(t)

for item in range(100):
    q.put(item)

# block until all tasks are done
q.join()

# stop workers: one sentinel per thread
for i in range(num_worker_threads):
    q.put(None)
for t in threads:
    t.join()

print(result)
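One refinement worth knowing, assuming the task list could grow large: giving the queue a maxsize makes the producer block instead of loading every task into memory at once. A sketch of the same worker loop with a bounded queue (the lock-protected results list and the parameter names are my own choices, not from the original answer):

```python
import queue
import threading


def worker(q, results, lock):
    while True:
        item = q.get()
        if item is None:      # sentinel: no more work
            q.task_done()
            break
        with lock:            # protect the shared list
            results.append(item + 2)
        q.task_done()

# maxsize=20 caps how many pending tasks sit in memory;
# put() blocks once the queue is full.
q = queue.Queue(maxsize=20)
results = []
lock = threading.Lock()
threads = [threading.Thread(target=worker, args=(q, results, lock))
           for _ in range(10)]
for t in threads:
    t.start()

for item in range(100):
    q.put(item)               # blocks while 20 tasks are already queued
for _ in range(10):
    q.put(None)               # one sentinel per worker
for t in threads:
    t.join()

print(sorted(results))
```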

Official documentation: see the standard-library docs for the multiprocessing, concurrent.futures, and queue modules.
