Issue Type
- [x] Bug Report
- [ ] Feature Request
- [ ] Task / Chore
- [ ] Documentation Update
- [ ] Other (Please describe below)
Description
We're encountering a potential file descriptor leak in pyacres version 4.9.0 when making HTTP connections, specifically within the aiobotocore integration. It appears that a large number of file descriptors are opened but not properly closed, which eventually leads to resource exhaustion under sustained workloads.
Steps to Reproduce
A minimal Python script to reproduce the issue is attached below. Running the script demonstrates a steady increase in open file descriptors over time, indicating they are not being properly released.
```python
import asyncio
import os
import psutil
import resource
import subprocess

from botocore.exceptions import HTTPClientError
import aiobotocore.session
from aiobotocore.config import AioConfig
from contextlib import AsyncExitStack

__session = None


async def get_dynamodb_secret(table: str, key: dict, region_name: str = 'us-east-1'):
    global __session
    if __session is None:
        __session = aiobotocore.session.get_session()
    async with AsyncExitStack() as exit_stack:
        config = AioConfig(connect_timeout=5, read_timeout=5)
        dynamodb_client = await exit_stack.enter_async_context(
            __session.create_client('dynamodb', config=config, region_name=region_name)
        )
        await dynamodb_client.get_item(
            TableName=table,
            Key=key,
            ConsistentRead=False,
        )


async def monitor_open_files(process: psutil.Process, interval: float = 0.2):
    print("--- Starting File Descriptor Monitor ---")
    while True:
        try:
            cmd = f"lsof -p {process.pid} | wc -l"
            result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
            lsof_count = int(result.stdout.strip())
            print(f"Monitoring... lsof count: {lsof_count}")
        except psutil.NoSuchProcess:
            print("--- Monitor stopping: Process not found. ---")
            break
        await asyncio.sleep(interval)


async def retrieve_dynamodb_secrets():
    table = "<your_dynamodb_table>"
    key = {"random_key": {"S": "random_value"}}
    await get_dynamodb_secret(table, key)


async def main():
    soft_limit, hard_limit = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(f"File Descriptor Limits - Soft: {soft_limit}, Hard: {hard_limit}\n")
    current_process = psutil.Process(os.getpid())
    monitor_task = asyncio.create_task(monitor_open_files(current_process))
    try:
        for i in range(100):
            print(f"dynamodb tests start: {i}")
            tasks = []
            for j in range(5):
                tasks.append(retrieve_dynamodb_secrets())
            await asyncio.gather(*tasks)
            print(f"dynamodb tests end: {i}")
    except HTTPClientError:
        print("\n⚠️ Caught 'Too many open files' error!")
    finally:
        monitor_task.cancel()
        print("\nExiting.")


if __name__ == '__main__':
    asyncio.run(main())
```

Local result
```
File Descriptor Limits - Soft: 256, Hard: 9223372036854775807

dynamodb tests start: 0
--- Starting File Descriptor Monitor ---
Monitoring... lsof count: 43
Monitoring... lsof count: 54
dynamodb tests end: 0
dynamodb tests start: 1
Monitoring... lsof count: 59
dynamodb tests end: 1
dynamodb tests start: 2
Monitoring... lsof count: 64
...
Monitoring... lsof count: 282
dynamodb tests end: 50
dynamodb tests start: 51
Monitoring... lsof count: 286
dynamodb tests end: 51
dynamodb tests start: 52
dynamodb tests end: 52
dynamodb tests start: 53
dynamodb tests end: 53
dynamodb tests start: 54

⚠️ Caught 'Too many open files' error!

Exiting.
```
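As a side note (not part of the original reproduction), spawning `lsof` for every sample is fairly heavyweight. On Linux the same count can be taken from `/proc/self/fd` with the standard library alone; on macOS, `psutil.Process().num_fds()` gives an equivalent number.

```python
import os


def open_fd_count() -> int:
    """Count this process's open file descriptors by listing /proc (Linux only).

    The listing itself briefly opens one fd for the directory, so the count
    includes that fd; it is consistent between successive calls.
    """
    return len(os.listdir("/proc/self/fd"))
```

This avoids a subprocess per sample, which matters when polling every 0.2 s as the monitor above does.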
Expected Behavior
All file descriptors should be released after use.
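For reference, `AsyncExitStack` itself does call `__aexit__` on every context entered through it, so the pattern used in the reproduction script should release the client when the `async with` block exits. A minimal sketch with a stand-in context manager (not the real aiobotocore client) shows this:

```python
import asyncio
from contextlib import AsyncExitStack


class FakeClient:
    """Stand-in for an aiobotocore client; records whether it was closed."""

    def __init__(self):
        self.closed = False

    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        self.closed = True


async def main():
    client = FakeClient()
    async with AsyncExitStack() as stack:
        await stack.enter_async_context(client)
    return client.closed


print(asyncio.run(main()))  # True: the stack closed the client on exit
```

This suggests the descriptors are being held somewhere below the client's `__aexit__`, not by the calling code.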
Actual Behavior
File descriptors remain open, leading to an eventual failure due to exceeding the system's limit.
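As a possible mitigation in the meantime (assuming the leak is tied to creating a fresh client per request), caching one client for the life of the process and closing it once at shutdown keeps the descriptor count flat. The pattern is sketched below with a hypothetical `DummyClient`, not the real aiobotocore API:

```python
import asyncio
from contextlib import AsyncExitStack


class DummyClient:
    """Hypothetical stand-in for an aiobotocore DynamoDB client."""

    instances = 0

    def __init__(self):
        DummyClient.instances += 1

    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        pass

    async def get_item(self, **kwargs):
        return {}


_stack = AsyncExitStack()  # held for the life of the process
_client = None


async def get_client() -> DummyClient:
    # Create the client once and reuse it; release it via _stack.aclose().
    global _client
    if _client is None:
        _client = await _stack.enter_async_context(DummyClient())
    return _client


async def main():
    for _ in range(10):
        client = await get_client()
        await client.get_item(TableName="t", Key={})
    await _stack.aclose()  # close the shared client exactly once at shutdown
    return DummyClient.instances


print(asyncio.run(main()))  # 1: a single client served all ten calls
```

This only works around the symptom; the per-call pattern in the reproduction script is valid usage and should not leak.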
Environment
- Python version: 3.11.11
- pyacres version: 4.9.0
- aiobotocore version: 2.21.1
- OS: macOS Sequoia Version 15.5, Apple M1 Pro