I am using the following code to increase the pool maxsize with requests:
import requests
session = requests.Session()
session.mount("https://", requests.adapters.HTTPAdapter(pool_maxsize=50))
session.mount("http://", requests.adapters.HTTPAdapter(pool_maxsize=50))
Is there a drawback to setting pool_maxsize=1000?
I sometimes need 50 - 1000 connections, but most of the time I only need 1 connection.
Alternatively, is there a way to allow dynamic pool sizing?
Which solution is best:
- Set pool_maxsize = 1000
- Create two sessions, one with pool_maxsize=1 and the other with pool_maxsize=1000.
- Dynamically alter pool_maxsize as and when I need a different number of connections, if that's possible (see the sketch below).
Speed is paramount!
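For option 3, the only way I can see to do it is to remount a fresh adapter over the same prefix, since mounting on an already-registered prefix replaces the old adapter. resize_pool below is my own hypothetical helper, and I assume the old adapter's idle connections are simply dropped rather than migrated:

import requests

def resize_pool(session, maxsize):
    # Hypothetical helper: mounting a new adapter on a prefix that is
    # already registered replaces the previous adapter, so this acts
    # as a pool resize. Warm connections held by the old adapter are
    # lost, which may matter if speed is paramount.
    adapter = requests.adapters.HTTPAdapter(pool_maxsize=maxsize)
    session.mount("https://", adapter)
    session.mount("http://", adapter)

session = requests.Session()
resize_pool(session, 1)     # normal single-request mode
# ... when a burst of work arrives ...
resize_pool(session, 1000)  # before firing many concurrent requests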
Edit: Most of the time I'm doing normal requests:
session.get(....)
But sometimes I use asyncio, when I have a large number of requests to carry out:
import asyncio

async def perform_async_calls(self, session, urls):
    loop = asyncio.get_event_loop()

    # Schedule each blocking session.get on the default thread pool.
    futures = []
    for url in urls:
        futures.append(loop.run_in_executor(None, session.get, url))

    # Await the responses in order and decode the JSON bodies.
    results = []
    for future in futures:
        result = await future
        results.append(result.json())
    return results
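For completeness, this is roughly how I drive it; the urls list is a placeholder, and passing None for self is just to make the method-style coroutine callable standalone in this sketch:

import asyncio
import requests

session = requests.Session()
session.mount("https://", requests.adapters.HTTPAdapter(pool_maxsize=50))

urls = ["https://httpbin.org/get"] * 20  # placeholder URLs

loop = asyncio.get_event_loop()
# None stands in for self, since perform_async_calls is written as a method.
results = loop.run_until_complete(perform_async_calls(None, session, urls))

# As I understand it, run_in_executor(None, ...) uses the default
# ThreadPoolExecutor, whose worker count caps real concurrency, so a
# pool_maxsize beyond that thread count shouldn't buy anything here.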