Filling a queue and managing multiprocessing in Python

2022-01-12 00:00:00 python queue multiprocessing pool

Problem description

I'm having this problem in Python:

  • a queue of URLs that I need to check from time to time
  • if the queue is filled, I need to process each item in the queue
  • each item in the queue must be processed by a single process (multiprocessing)

So far I managed to achieve this "manually" like this:

while 1:
    self.updateQueue()

    while not self.mainUrlQueue.empty():
        domain = self.mainUrlQueue.get()

        # if we haven't launched any processes yet, we need to do so
        if len(self.jobs) < maxprocess:
            self.startJob(domain)
            #time.sleep(1)
        else:
            # if we already have processes started, we need to clear the old
            # processes in our pool and start new ones
            jobdone = 0

            # we cycle through the processes until we find a free one;
            # only then do we leave the loop
            while jobdone == 0:
                for p in self.jobs:
                    # if the process finished
                    if not p.is_alive() and jobdone == 0:
                        self.jobs.remove(p)
                        self.startJob(domain)
                        jobdone = 1

However, that leads to tons of problems and errors. I wondered whether I would be better off using a pool of processes. What would be the right way to do this?

However, a lot of the time my queue is empty, and then it can be filled with 300 items in a second, so I'm not too sure how to do things here.


Solution

You could use the blocking capabilities of the queue to spawn multiple processes at startup (using multiprocessing.Pool) and let them sleep until some data is available on the queue to process. If you're not familiar with that, you could try to "play" with this simple program:

import multiprocessing
import os
import time

the_queue = multiprocessing.Queue()


def worker_main(queue):
    print os.getpid(),"working"
    while True:
        item = queue.get(True)
        print os.getpid(), "got", item
        time.sleep(1) # simulate a "long" operation

the_pool = multiprocessing.Pool(3, worker_main,(the_queue,))
#                           don't forget the comma here  ^

for i in range(5):
    the_queue.put("hello")
    the_queue.put("world")


time.sleep(10)

Tested with Python 2.7.3 on Linux.
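For readers on Python 3, the same sketch can be adapted as follows. This is not part of the original answer: the `print` calls use function syntax, and the `if __name__ == "__main__"` guard is added because platforms that use the spawn start method (Windows, recent macOS) re-import the module in each child.

```python
import multiprocessing
import os
import time


def worker_main(queue):
    # each worker runs this loop forever, blocking on queue.get()
    print(os.getpid(), "working")
    while True:
        item = queue.get(True)           # blocks until an item is available
        print(os.getpid(), "got", item)
        time.sleep(1)                    # simulate a "long" operation


if __name__ == "__main__":
    the_queue = multiprocessing.Queue()
    # 3 worker processes, each started with worker_main(the_queue)
    the_pool = multiprocessing.Pool(3, worker_main, (the_queue,))

    for i in range(5):
        the_queue.put("hello")
        the_queue.put("world")

    # give the workers time to drain the queue before the script exits;
    # Pool workers are daemonic, so they are killed when the parent exits
    time.sleep(10)
```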

This will spawn 3 processes (in addition to the parent process). Each child executes the worker_main function, which is a simple loop that gets a new item from the queue on each iteration. Workers block if nothing is ready to process.

At startup, all 3 processes sleep until the queue is fed with some data. When data is available, one of the waiting workers gets the item and starts to process it. After that, it tries to get another item from the queue, waiting again if nothing is available...
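If the workers ever need to finish (for example, so the program can exit cleanly), a common pattern is to send one sentinel value per worker. The sketch below is illustrative and not from the original answer: the `None` sentinel, the `task_queue`/`result_queue` names, and the `.upper()` stand-in for real work are all assumptions.

```python
import multiprocessing


def worker(task_queue, result_queue):
    while True:
        item = task_queue.get()            # blocks until an item arrives
        if item is None:                   # sentinel: no more work, exit
            break
        result_queue.put(item.upper())     # stand-in for real processing


if __name__ == "__main__":
    n_workers = 3
    task_queue = multiprocessing.Queue()
    result_queue = multiprocessing.Queue()

    procs = [multiprocessing.Process(target=worker,
                                     args=(task_queue, result_queue))
             for _ in range(n_workers)]
    for p in procs:
        p.start()

    for url in ["a", "b", "c", "d"]:
        task_queue.put(url)
    for _ in procs:                        # one sentinel per worker
        task_queue.put(None)

    for p in procs:
        p.join()                           # workers exit after the sentinels

    results = sorted(result_queue.get() for _ in range(4))
    print(results)                         # -> ['A', 'B', 'C', 'D']
```

With this pattern the main process can decide when the work is over, instead of killing daemonic workers mid-loop.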
