Multiprocessing Logging in Python
When using Python's multiprocessing module, logging needs care: multiple processes writing to the same file handle simultaneously can garble or lose messages. The multiprocessing-aware logger returned by mp.get_logger() uses process-shared locks, so concurrent writes to sys.stderr are not interleaved mid-message.
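For example, the stock multiprocessing logger can be enabled like this (a minimal sketch; the INFO level chosen here is arbitrary):

```python
import logging
import multiprocessing

def work():
    # Child processes fetch the same multiprocessing-aware logger.
    multiprocessing.get_logger().info(
        "working in %s", multiprocessing.current_process().name)

if __name__ == "__main__":
    # Attach a stderr handler to multiprocessing's own logger;
    # each record includes the process name, so output stays attributable.
    logger = multiprocessing.log_to_stderr()
    logger.setLevel(logging.INFO)

    p = multiprocessing.Process(target=work)
    p.start()
    p.join()
```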
However, modules that are not multiprocessing-aware would need to be modified to use this logger. To avoid touching those modules, consider alternative approaches:
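One standard-library alternative, assuming Python 3.2+, is logging.handlers.QueueHandler paired with QueueListener: modules log through the root logger as usual, and a listener thread in the parent performs all file I/O (a minimal sketch; the log file path is illustrative only):

```python
import logging
import logging.handlers
import multiprocessing
import os
import tempfile

# One shared queue; child processes only need a QueueHandler on it.
log_queue = multiprocessing.Queue(-1)

# Illustrative log file location.
logfile = os.path.join(tempfile.mkdtemp(), "app.log")
file_handler = logging.FileHandler(logfile)

# The listener thread in the parent performs all real writes.
listener = logging.handlers.QueueListener(log_queue, file_handler)
listener.start()

# Unmodified modules keep using the plain logging API.
root = logging.getLogger()
root.addHandler(logging.handlers.QueueHandler(log_queue))
root.setLevel(logging.INFO)
logging.info("written through the queue")

listener.stop()   # enqueues a sentinel and waits for pending records
file_handler.close()
```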
Custom Logging Handler
One approach is to create a custom log handler that sends logs to the parent process via a pipe. This allows modules to use the standard logging module while the parent process handles the actual logging. Here's an implementation:
<code class="python">from logging.handlers import RotatingFileHandler
import multiprocessing, threading, logging

class MultiProcessingLog(logging.Handler):
    def __init__(self, name, mode, maxsize, rotate):
        logging.Handler.__init__(self)
        # All real file I/O is delegated to a single RotatingFileHandler
        # owned by the creating (parent) process.
        self._handler = RotatingFileHandler(name, mode, maxsize, rotate)
        # Child processes put pickled LogRecords on this queue.
        self.queue = multiprocessing.Queue(-1)
        # A daemon thread in the parent drains the queue
        # (the receive method is not shown in this excerpt).
        t = threading.Thread(target=self.receive)
        t.daemon = True
        t.start()</code>
A receive thread in the parent process takes log records off the queue and writes them through the wrapped RotatingFileHandler. This centralizes logging without requiring changes to dependent modules, which keep using the standard logging API.
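The snippet above shows only the constructor. A runnable sketch of the full pattern follows; the receive/emit/close method names match the excerpt, while the sentinel-and-join shutdown and the default argument values are assumptions:

```python
import logging
import logging.handlers
import multiprocessing
import os
import tempfile
import threading

class MultiProcessingLog(logging.Handler):
    """Queue-backed handler: all file writes happen in one thread."""

    def __init__(self, name, mode="a", maxsize=0, rotate=0):
        logging.Handler.__init__(self)
        self._handler = logging.handlers.RotatingFileHandler(
            name, mode, maxsize, rotate)
        self.queue = multiprocessing.Queue(-1)
        self._thread = threading.Thread(target=self.receive, daemon=True)
        self._thread.start()

    def receive(self):
        # Runs only in the process that created the handler.
        while True:
            record = self.queue.get()
            if record is None:        # sentinel posted by close()
                break
            self._handler.emit(record)

    def emit(self, record):
        try:
            # Pre-format so the record pickles cleanly across processes.
            record.msg = record.getMessage()
            record.args = None
            record.exc_info = None
            self.queue.put_nowait(record)
        except Exception:
            self.handleError(record)

    def close(self):
        self.queue.put(None)          # stop the receive thread...
        self._thread.join(timeout=5)  # ...after it drains pending records
        self._handler.close()
        logging.Handler.close(self)

# Usage: modules keep calling the standard logging API.
logfile = os.path.join(tempfile.mkdtemp(), "app.log")
mp_handler = MultiProcessingLog(logfile)
log = logging.getLogger("worker")
log.addHandler(mp_handler)
log.warning("hello from %s", "a worker")
mp_handler.close()
```

Pre-formatting the message in emit matters: LogRecord args and exc_info may hold objects that cannot be pickled, so they are resolved into a plain string before the record crosses the queue.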