When it comes to running multiple tasks concurrently in Python, the concurrent.futures module is a powerful yet simple tool. In this article, we will explore how to use ThreadPoolExecutor to run tasks in parallel, with practical examples.
In Python, threads are well suited to tasks dominated by I/O, such as network calls or file reads and writes. With ThreadPoolExecutor, you can run such tasks in parallel, cap the number of simultaneous threads, and collect the results, all in a few lines of code.
Let's walk through a simple example to understand the concept.
from concurrent.futures import ThreadPoolExecutor
import time

# Function simulating a task
def task(n):
    print(f"Task {n} started")
    time.sleep(2)  # Simulates a long-running task
    print(f"Task {n} finished")
    return f"Result of task {n}"

# Using ThreadPoolExecutor
def execute_tasks():
    tasks = [1, 2, 3, 4, 5]  # List of tasks

    # Create a thread pool with 3 simultaneous threads
    with ThreadPoolExecutor(max_workers=3) as executor:
        # Execute tasks in parallel
        results = executor.map(task, tasks)

    return list(results)

if __name__ == "__main__":
    results = execute_tasks()
    print("All results:", results)
When you run this code, you will see output similar to the following (the exact order may vary because the tasks run in parallel):
Task 1 started
Task 2 started
Task 3 started
Task 1 finished
Task 4 started
Task 2 finished
Task 5 started
Task 3 finished
Task 4 finished
Task 5 finished
All results: ['Result of task 1', 'Result of task 2', 'Result of task 3', 'Result of task 4', 'Result of task 5']
Tasks 1, 2, and 3 start at the same time because max_workers=3. The remaining tasks (4 and 5) wait until a thread becomes available.
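executor.map returns results in submission order, even if the tasks finish out of order. If you would rather process each result as soon as its task completes, executor.submit combined with as_completed is an alternative. A minimal sketch (the sleep durations are illustrative, chosen so that shorter tasks finish first):

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def task(n):
    time.sleep(0.1 * n)  # task n takes roughly n * 100 ms
    return f"Result of task {n}"

with ThreadPoolExecutor(max_workers=3) as executor:
    # Submit in the order 3, 1, 2; all three start immediately
    futures = [executor.submit(task, n) for n in (3, 1, 2)]
    # as_completed yields each future as soon as it finishes,
    # so results arrive in completion order, not submission order
    results = [future.result() for future in as_completed(futures)]

print(results)
```

Here task 1 is submitted second but finishes first, so it appears first in results.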
Limit the number of threads: choose a max_workers value suited to your workload. More threads is not automatically faster; each thread costs memory, and for I/O-bound work the bottleneck is usually the remote service rather than the pool.
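The payoff of pooling is easy to measure. The sketch below (the io_task helper and the 0.2-second delay are illustrative stand-ins for real I/O) times the same five tasks with one worker and then with five:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def io_task(n):
    time.sleep(0.2)  # stand-in for a network call or disk read
    return n

def timed_run(max_workers, items):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        list(executor.map(io_task, items))  # drain the iterator to wait for all tasks
    return time.perf_counter() - start

serial_time = timed_run(1, range(5))  # one task at a time: roughly 1.0 s
pooled_time = timed_run(5, range(5))  # all five sleep at once: roughly 0.2 s
print(f"serial: {serial_time:.2f}s, pooled: {pooled_time:.2f}s")
```

Since Python 3.8, leaving max_workers unset defaults to min(32, os.cpu_count() + 4), a reasonable starting point for I/O-bound pools.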
Handle exceptions: an exception raised inside a task does not crash the pool. It is stored in the task's future and re-raised when you call result() (or when you iterate the iterator returned by map), so wrap those calls in try/except.
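A short sketch of that pattern, using a hypothetical risky_task that fails for one input:

```python
from concurrent.futures import ThreadPoolExecutor

def risky_task(n):
    # Hypothetical task that fails for one specific input
    if n == 2:
        raise ValueError(f"Task {n} failed")
    return f"Result of task {n}"

with ThreadPoolExecutor(max_workers=3) as executor:
    futures = [executor.submit(risky_task, n) for n in [1, 2, 3]]
    outcomes = []
    for future in futures:
        try:
            outcomes.append(future.result())  # re-raises the task's exception here
        except ValueError as e:
            outcomes.append(f"Error: {e}")

print(outcomes)
```

One failing task is recorded as an error string while the others still return their results.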
Use ProcessPoolExecutor for CPU-bound tasks: because of the GIL, threads in CPython do not speed up CPU-heavy computation. Processes run on separate cores and do.
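ProcessPoolExecutor exposes the same interface, so switching is mostly a one-line change. A minimal sketch (the cpu_bound helper, a sum of squares, is an illustrative stand-in for real computation):

```python
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    # Pure CPU work: sum of squares below n
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Same map-based API as ThreadPoolExecutor, but backed by processes
    with ProcessPoolExecutor(max_workers=2) as executor:
        results = list(executor.map(cpu_bound, [10_000, 20_000, 30_000]))
    print(results)
```

Two caveats: the function and its arguments must be picklable, and the pool should be created under an `if __name__ == "__main__":` guard so that worker processes can import the module safely.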
Here is a real-world example: fetching several URLs in parallel.
import requests
from concurrent.futures import ThreadPoolExecutor

# Function to fetch a URL
def fetch_url(url):
    try:
        response = requests.get(url, timeout=10)  # timeout so a slow server cannot hang the task
        return f"URL: {url}, Status: {response.status_code}"
    except Exception as e:
        return f"URL: {url}, Error: {e}"

# List of URLs to fetch
urls = [
    "https://example.com",
    "https://httpbin.org/get",
    "https://jsonplaceholder.typicode.com/posts",
    "https://invalid-url.com"
]

def fetch_all_urls(urls):
    with ThreadPoolExecutor(max_workers=4) as executor:
        results = executor.map(fetch_url, urls)
        return list(results)

if __name__ == "__main__":
    results = fetch_all_urls(urls)
    for result in results:
        print(result)
ThreadPoolExecutor simplifies thread management in Python and is an ideal choice for speeding up I/O-bound tasks. With just a few lines of code, you can parallelize operations and save valuable time.