Python - Multiprocessing processes become copies of the main process when run from an executable

2022-01-12 | python, multiprocessing

Problem Description

I just discovered a bizarre bug in my program related to its use of Python's multiprocessing module. Everything works fine when I run the program from source on my machine. But I've been building it into an executable using pyinstaller, and for some reason the behavior of multiprocessing changes drastically when I run the executable built from my code. Specifically, when I try to run the multiprocessing part of my code, rather than doing what it's supposed to, what appears to be a copy of my program's main window pops up, one for each process. Even worse, they reopen if they are closed manually, presumably because they are part of a multiprocessing.Pool. No error messages are printed, and once created, all the windows just sit there doing nothing. What could be causing this?


Solution

On Windows, multiprocessing tries to emulate the Unix fork() system call by starting a new instance of your executable and executing its child-process routine (multiprocessing.forking.main() in the Python 2 era; multiprocessing.spawn in modern Python 3) in it. With the standard Python interpreter (python.exe), multiprocessing can pass the -c parameter to run custom code. For a custom executable, however, this is not possible, since the executable will most probably not support the same command-line options as python.exe.

The freeze_support() function sidesteps this problem by executing the child-process routine explicitly and terminating the interpreter with a call to sys.exit(). If you forget to call freeze_support(), the new process does not know that it is a child process and runs the main application logic instead. In your case, that is what pops up another main GUI window.
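The fix can be sketched as a minimal program layout; `worker` and the pool size here are placeholders, not taken from the asker's code:

```python
import multiprocessing

def worker(x):
    # Runs in a child process.
    return x * x

if __name__ == "__main__":
    # Must be called right after the main guard. In a frozen child
    # process it runs the child routine and exits via sys.exit(),
    # so the main application logic below is never reached twice.
    multiprocessing.freeze_support()

    with multiprocessing.Pool(processes=2) as pool:
        print(pool.map(worker, [1, 2, 3]))  # [1, 4, 9]
```

On a normal (non-frozen) run, freeze_support() is a no-op, so the call is always safe to leave in.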

Since starting yet another child process from the newly created process would cause infinite recursion, multiprocessing tries to prevent this by checking the sys.frozen attribute and raising a RuntimeError if freeze_support() was not called. In your case it seems that user interaction is required to spawn the processes, so there is neither infinite recursion nor a RuntimeError.

By convention, sys.frozen is only set in automatically generated executables such as those created by py2exe or PyInstaller. It is important to understand this logic and to set sys.frozen to True when you want to embed Python in a custom executable that should support multiprocessing under Windows.
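As a sketch of the embedding case, the startup code of such a custom host could set the attribute manually before any multiprocessing machinery runs; the attribute name simply follows the py2exe/PyInstaller convention described above:

```python
import sys

# Mark this interpreter as "frozen" so that multiprocessing takes the
# sys.frozen code path (re-launching the executable itself instead of
# passing -c to python.exe).
if not hasattr(sys, "frozen"):
    sys.frozen = True

import multiprocessing

# Safe no-op in the parent process; handles the child routine in children.
multiprocessing.freeze_support()
```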
