【Question Title】: How to check if webpages are alive with Python using multiprocessing
【Posted】: 2023-04-06 02:36:01
【Question Description】:

I have a list of URLs (around 25k) and I am trying to check whether they are still alive (i.e. return a 200 response). I want to run these checks in parallel using Python's multiprocessing library. I wrote the following (based largely on an example from the Python documentation), but it seems to run very slowly. Is there any way to make this script run faster?

    import urllib2

    from multiprocessing import Process, Queue, current_process, freeze_support

    class HeadRequest(urllib2.Request):
        def get_method(self):
            return "HEAD"
    #
    # Function run by worker processes
    #

    def worker(input, output):
        for args in iter(input.get, 'STOP'):
            result = alive(args) 
            output.put(result)

    #
    # Functions referenced by tasks
    #

    def alive(x):
        x = x.strip()
        try:
            return x, ":", urllib2.urlopen(HeadRequest(x)).getcode()
        except urllib2.HTTPError as e:
            return x, ":", e.code
        except Exception:
            return x, ": Error"

    #
    # Driver: distribute the URLs to worker processes and collect the results
    #

    def check():
        NUMBER_OF_PROCESSES = 500
        text_file = open("url.txt", "r")
        TASKS1 = text_file.readlines()

        # Create queues
        task_queue = Queue()
        done_queue = Queue()

        # Submit tasks
        for task in TASKS1:
            task_queue.put(task)

        # Start worker processes
        for i in range(NUMBER_OF_PROCESSES):
            Process(target=worker, args=(task_queue, done_queue)).start()

        # Get and print results
        for i in range(len(TASKS1)):
            print done_queue.get()

        # Tell child processes to stop
        for i in range(NUMBER_OF_PROCESSES):
            task_queue.put('STOP')


    if __name__ == '__main__':
        freeze_support()
        check()
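
For comparison, below is a minimal sketch of the same check written against Python 3's standard library with a thread pool instead of 500 worker processes; since the check is I/O-bound (the workers spend their time waiting on the network), threads tend to be a lighter fit than full processes. The one-URL-per-line `url.txt` file and the HEAD-request idea are carried over from the code above; the worker count (50) and the 10-second timeout are assumptions to tune, not recommendations:

    # Minimal Python 3 sketch: a thread pool instead of 500 processes.
    # The check is I/O-bound (waiting on the network), so threads are
    # usually a lighter-weight fit than full processes for this workload.
    import urllib.request
    import urllib.error
    from concurrent.futures import ThreadPoolExecutor

    def alive(url):
        url = url.strip()
        # A HEAD request avoids downloading the response body
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return url, resp.status
        except urllib.error.HTTPError as e:
            return url, e.code
        except Exception:
            return url, "Error"

    if __name__ == "__main__":
        with open("url.txt") as f:  # same one-URL-per-line file as above
            urls = f.readlines()
        # 50 workers and the 10 s timeout are assumptions; tune them to your
        # bandwidth and to how hard you want to hit the target hosts.
        with ThreadPoolExecutor(max_workers=50) as pool:
            for url, status in pool.map(alive, urls):
                print(url, ":", status)

Note that `pool.map` yields results in input order; if you would rather see results as they finish, `concurrent.futures.as_completed` is the usual alternative.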

Any help is appreciated.

【Question Discussion】:

    Tags:
    python
    http