This article walks through usage examples of Python multiprocessing, with code. It may be a useful reference for readers who need to run CPU-bound work in parallel.
Python multi-threading is suitable for IO-bound scenarios, but in CPU-bound scenarios it cannot make full use of multiple cores, because the GIL allows only one thread to execute Python bytecode at a time. Coroutines run on a single thread and likewise cannot exploit multiple cores.
CPU-bound workloads therefore call for multiple processes. Python's multiprocessing module closely mirrors the threading API and also supports creating batches of child processes through a process pool.
To start a child process, instantiate the Process class, pass the function to run via the target parameter, and pass its arguments via args. This mirrors the way threads are created with the threading module.
```python
import os
from multiprocessing import Process

# code to be executed in the child process
def task(name):
    print('run child process %s (%s)...' % (name, os.getpid()))

if __name__ == '__main__':
    print('parent process %s.' % os.getpid())
    p = Process(target=task, args=('test',))
    p.start()
    p.join()
    print('process end.')
```
```python
import multiprocessing
import os
from multiprocessing import current_process

class Worker(multiprocessing.Process):
    def run(self):
        name = current_process().name  # get the name of the current process
        print('run child process <%s> (%s)' % (name, os.getpid()))
        print('In %s' % self.name)
        return

if __name__ == '__main__':
    print('parent process %s.' % os.getpid())
    p = Worker()
    p.start()
    p.join()
    print('process end.')
```
```python
import multiprocessing
import time

def worker():
    print('starting worker')
    time.sleep(0.1)
    print('finished worker')

if __name__ == '__main__':
    p = multiprocessing.Process(target=worker)
    print('before start:', p.is_alive())
    p.start()
    print('running:', p.is_alive())
    p.terminate()  # send a stop signal to the child process
    print('terminated:', p.is_alive())
    p.join()
    print('after join:', p.is_alive())
```
```python
import multiprocessing

def worker(num):
    print('Worker: %s' % num)
    return

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()
```
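The loop above starts five processes and then lets the parent fall off the end; the parent implicitly waits for non-daemon children at exit. A minimal sketch of the common explicit pattern, starting all workers first and then joining each one (the names here are illustrative, not from the original article):

```python
import multiprocessing

def worker(num):
    print('Worker: %s' % num)

if __name__ == '__main__':
    jobs = [multiprocessing.Process(target=worker, args=(i,)) for i in range(5)]
    for p in jobs:
        p.start()      # launch all children before joining any of them
    for p in jobs:
        p.join()       # block until each child finishes
    # exitcode is 0 for a child that returned normally
    print('exit codes:', [p.exitcode for p in jobs])
```

Joining after all starts lets the children run concurrently; joining inside the first loop would serialize them.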
A Pool provides a specified number of worker processes for the caller. When a new request is submitted and the pool is not yet full, a new process is created to execute it; once the pool has reached its maximum size, the request waits until one of the pool's processes finishes and becomes available to handle it.
```python
import os
import random
import time
from multiprocessing import Pool
from time import ctime

def task(name):
    print('start task %s (%s)...' % (name, os.getpid()))
    start = time.time()
    time.sleep(random.random() * 3)
    print('end task %s runs %0.2f seconds.' % (name, (time.time() - start)))

if __name__ == '__main__':
    print('parent process %s.' % os.getpid())
    p = Pool()  # initialize the process pool
    for i in range(5):
        # apply_async is asynchronous and non-blocking: the parent does not
        # wait for the task to finish; the OS schedules the processes
        p.apply_async(task, args=(i,))
    p.close()
    # join() waits for all child processes to finish;
    # close() must be called before join()
    p.join()
    print(f'all done at: {ctime()}')
```
```python
import os
import random
import time
from multiprocessing import Pool, current_process
from time import ctime

def task(name):
    print('start task %s (%s)...' % (name, os.getpid()))
    start = time.time()
    time.sleep(random.random() * 3)
    print('end task %s runs %0.2f seconds.' % (name, (time.time() - start)))
    return current_process().name + ' done'

if __name__ == '__main__':
    print('parent process %s.' % os.getpid())
    result = []
    p = Pool()  # initialize the process pool
    for i in range(5):
        # apply_async returns an AsyncResult immediately without blocking
        result.append(p.apply_async(task, args=(i,)))
    p.close()
    p.join()  # wait for all child processes to finish
    for res in result:
        print(res.get())  # get() retrieves each task's return value
    print(f'all done at: {ctime()}')
```
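When every task runs the same function over a list of inputs, Pool.map is a simpler alternative to collecting AsyncResult objects by hand: it blocks until all tasks finish and returns the results in input order. A minimal sketch, assuming a CPU-bound stand-in function named square (not from the original article):

```python
from multiprocessing import Pool, cpu_count

def square(n):
    # stand-in for CPU-bound work
    return n * n

if __name__ == '__main__':
    # the default pool size is os.cpu_count(); shown explicitly here
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(square, range(5))
    print(results)  # results come back in input order
```

Using the pool as a context manager calls terminate() on exit, so no explicit close()/join() pair is needed for this blocking call.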