You can mount a custom transport adapter on a `requests.Session` to automatically retry failed requests with an exponential backoff factor on all HTTP/HTTPS requests. See the example below:
```python
import requests
from requests import adapters
from urllib3.util import Retry

# Create a transport adapter with a custom retry strategy.
retries = Retry(
    total=3,
    backoff_factor=3,
    status_forcelist=[500, 502, 503, 504],
)
adapter = adapters.HTTPAdapter(max_retries=retries)

# Ensure the adapter is used for both HTTP and HTTPS requests.
session = requests.Session()
session.mount('https://', adapter)
session.mount('http://', adapter)

# Test the retry mechanism against an endpoint that always returns 500.
response = session.get("http://httpbin.org/status/500")
```
Once all retries are exhausted, this raises the error below:
```
RetryError: HTTPConnectionPool(host='httpbin.org', port=80): Max retries exceeded with url: /status/500 (Caused by ResponseError('too many 500 error responses'))
```
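For reference, urllib3's documented backoff formula is roughly `backoff_factor * 2 ** (number of previous retries)`; whether the very first retry sleeps at all varies between urllib3 versions. A quick sketch of the delays the configuration above would produce (the helper function is mine, not part of any library):

```python
# Sketch of urllib3's documented backoff formula:
#   sleep = backoff_factor * (2 ** (retry_number - 1))
# (Depending on the urllib3 version, the first retry may fire immediately.)
def backoff_delays(backoff_factor, retries):
    """Hypothetical helper: the sleep before each successive retry."""
    return [backoff_factor * (2 ** (n - 1)) for n in range(1, retries + 1)]

print(backoff_delays(3, 3))  # [3, 6, 12]
```

So with `backoff_factor=3` and `total=3`, the session waits roughly 3, 6, and 12 seconds between attempts before giving up.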
The unfortunate thing is that there doesn't seem to be a built-in way to tell how many times the mechanism above has retried while it is running; you only find out once all attempts have been exhausted.
Source: https://stackoverflow.com/a/47475019/4477547
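One common workaround (a sketch, not an official `requests` feature) is to subclass urllib3's `Retry` and override its `increment()` method, which urllib3 calls once per retry. Each attempt is also recorded in the retry object's `history` tuple:

```python
import logging

from urllib3.util import Retry

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("retry")


class LoggingRetry(Retry):
    """Sketch: report each retry attempt as it happens.

    urllib3 calls increment() once per retry and returns a new Retry
    object whose `history` tuple records every attempt made so far.
    """

    def increment(self, *args, **kwargs):
        new_retry = super().increment(*args, **kwargs)
        log.info("retry attempt %d", len(new_retry.history))
        return new_retry
```

It mounts exactly like the built-in class: `HTTPAdapter(max_retries=LoggingRetry(total=3, backoff_factor=3, status_forcelist=[500, 502, 503, 504]))`.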