Detailed explanation of using Python's logging module in a multi-process environment

高洛峰
Release: 2017-02-14 13:51:08

Many applications have a log module that records key information while the system runs, which makes it easier to track the system's state. This article introduces how to use Python's logging module in a multi-process environment.

Preface

As every programmer who writes background tasks in Python knows, you often need logs to record the program's running state and to save the details of any errors for later debugging and analysis. Python's logging module is a great helper in this situation.

The logging module lets you specify a log level: DEBUG, INFO, WARNING, ERROR, or CRITICAL. For example, during development and debugging you can output everything at DEBUG level and above, while in production you output only INFO and above. (If no level is specified, the default is WARNING.)
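For instance, here is a minimal illustration of level filtering using nothing but the stdlib (the messages are arbitrary):

import logging

# output INFO and above, as you might in production
logging.basicConfig(level=logging.INFO)

logging.debug('suppressed: DEBUG is below the configured level')
logging.info('emitted')
logging.warning('emitted: WARNING is also the default when no level is set')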

logging can also send output to the command line or to a file, and it can split log files by time or by size.
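As a quick sketch of both rotation styles with the stdlib handlers (the file names and limits here are arbitrary examples):

import logging
from logging.handlers import RotatingFileHandler, TimedRotatingFileHandler

logger = logging.getLogger('demo')
# split by size: start a new file once app.log exceeds ~1MB, keep 5 backups
logger.addHandler(RotatingFileHandler('app.log', maxBytes=1024 * 1024, backupCount=5))
# split by time: start a new file at midnight, keep the last 7 files
logger.addHandler(TimedRotatingFileHandler('timed.log', when='midnight', backupCount=7))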

I won't go into the details of using logging here; you can refer to the official documentation for a full introduction.

Logging configuration

Normally we want to save logs to a file, and we expect the file to be split automatically so that no single log file grows too large. An example logging configuration is given below.

import logging.config
 
logging.config.dictConfig({
 'version': 1,
 'disable_existing_loggers': True,
 'formatters': {
  'verbose': {
   'format': "[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s",
   'datefmt': "%Y-%m-%d %H:%M:%S"
  },
  'simple': {
   'format': '%(levelname)s %(message)s'
  },
 },
 'handlers': {
  'null': {
   'level': 'DEBUG',
   'class': 'logging.NullHandler',
  },
  'console': {
   'level': 'DEBUG',
   'class': 'logging.StreamHandler',
   'formatter': 'verbose'
  },
  'file': {
   'level': 'DEBUG',
    'class': 'logging.handlers.RotatingFileHandler',
    # split the log file when it reaches 10MB
    'maxBytes': 1024 * 1024 * 10,
    # keep at most 50 backup files
    'backupCount': 50,
   # If delay is true,
   # then file opening is deferred until the first call to emit().
   'delay': True,
   'filename': 'logs/mysite.log',
   'formatter': 'verbose'
  }
 },
 'loggers': {
  '': {
   'handlers': ['file'],
    'level': 'INFO',
  },
 }
})
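One caveat: the file handler will not create the logs/ directory for you, so make sure it exists before the first record is written, for example:

import os

# logging handlers do not create missing directories themselves
os.makedirs('logs', exist_ok=True)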

With this configuration in place, we can record logs in a module like this:

import logging
logger = logging.getLogger(__name__)
 
if __name__ == '__main__':
 logger.info('log info')

Usage in a multi-process environment

According to the official documentation, logging is thread-safe: it is safe for multiple threads within a single process to write logs to the same file at the same time. But (yes, there is a but) it is not safe for multiple processes to write logs to the same file. The official statement is:

Because there is no standard way to serialize access to a single file across multiple processes in Python, logging to a single file from multiple processes is not supported. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)
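For reference, here is a compressed sketch of that approach, modeled on the example in the official Logging Cookbook; the host, port, and file name are illustrative assumptions.

# listener.py -- the single process that owns the log file
import logging
import logging.handlers
import pickle
import socketserver
import struct

logging.basicConfig(filename='logs/mysite.log', level=logging.DEBUG)

class LogRecordStreamHandler(socketserver.StreamRequestHandler):
 def handle(self):
  while True:
   # each record arrives as a 4-byte big-endian length prefix
   # followed by a pickled dict of LogRecord attributes
   chunk = self.connection.recv(4)
   if len(chunk) < 4:
    break
   slen = struct.unpack('>L', chunk)[0]
   data = self.connection.recv(slen)
   while len(data) < slen:
    data += self.connection.recv(slen - len(data))
   record = logging.makeLogRecord(pickle.loads(data))
   # hand the record to the locally configured handlers
   logging.getLogger(record.name).handle(record)

if __name__ == '__main__':
 server = socketserver.ThreadingTCPServer(
  ('localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT),
  LogRecordStreamHandler)
 server.serve_forever()

# worker.py -- every other process only talks to the socket
import logging
import logging.handlers

root = logging.getLogger()
root.setLevel(logging.DEBUG)
root.addHandler(logging.handlers.SocketHandler(
 'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT))

logging.getLogger(__name__).info('logged via the socket listener')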

Some people will say: fine, then I just won't use multiple processes. But Python has the notorious GIL (you can read elsewhere about the disputes around the GIL), which makes it impossible to exploit a multi-core CPU with multiple threads. In most cases multiple processes are used for that instead, so we still cannot get around the problem of logging from multiple processes.

To solve this problem, you can use ConcurrentLogHandler. ConcurrentLogHandler can safely write logs to the same file in a multi-process environment, and it can split the log file when it reaches a given size. Note that while the standard logging module provides a TimedRotatingFileHandler class that splits log files by time, ConcurrentLogHandler unfortunately does not support splitting by time.
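ConcurrentLogHandler is a third-party package (the module it installs is named cloghandler); assuming a standard PyPI setup, it can be installed with pip install ConcurrentLogHandler.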

We then change the handler class in the handlers section accordingly:

logging.config.dictConfig({
 ...
 'handlers': {
  'file': {
   'level': 'DEBUG',
    # without a concurrency-safe handler class, records can go
    # missing when multiple processes write to the same file
    'class': 'cloghandler.ConcurrentRotatingFileHandler',
    # split the log file when it reaches 10MB
    'maxBytes': 1024 * 1024 * 10,
    # keep at most 50 backup files
   'backupCount': 50,
   # If delay is true,
   # then file opening is deferred until the first call to emit().
   'delay': True,
   'filename': 'logs/mysite.log',
   'formatter': 'verbose'
  }
 },
 ...
})

After running it, you will find that a .lock file is created automatically; the handler uses this lock to make writes to the log file safe.
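As a quick sanity check, a sketch like the following (the process and message counts are arbitrary) can hammer the same file from several processes; note that the dictConfig call above should run at module import time so that spawned child processes also pick it up.

import logging
import multiprocessing

# assumes the dictConfig(...) shown above has already run at import time
logger = logging.getLogger(__name__)

def worker(n):
 # all processes share one log file; the handler's .lock file
 # serializes their writes so no records are lost
 for i in range(100):
  logger.info('process %d, message %d', n, i)

if __name__ == '__main__':
 procs = [multiprocessing.Process(target=worker, args=(n,)) for n in range(4)]
 for p in procs:
  p.start()
 for p in procs:
  p.join()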

