21 changes: 14 additions & 7 deletions streamrip/client/qobuz.py
```diff
@@ -409,14 +409,21 @@ async def _test_secret(self, secret: str) -> Optional[str]:
         return None

     async def _get_valid_secret(self, secrets: list[str]) -> str:
-        results = await asyncio.gather(
-            *[self._test_secret(secret) for secret in secrets],
-        )
-        working_secrets = [r for r in results if r is not None]
-        if len(working_secrets) == 0:
+        # ⚡ Bolt: Use asyncio.as_completed to short-circuit and return immediately
+        # on the first valid secret found. This avoids waiting for all secrets to
+        # be tested concurrently, reducing network I/O and latency.
+        tasks = [asyncio.create_task(self._test_secret(secret)) for secret in secrets]
+        try:
+            for future in asyncio.as_completed(tasks):
+                result = await future
+                if result is not None:
+                    return result
             raise InvalidAppSecretError(secrets)

-        return working_secrets[0]
+        finally:
+            # Cancel remaining tasks to prevent background task leakage
+            for task in tasks:
+                if not task.done():
+                    task.cancel()
```
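As an aside, the "first success wins, cancel the rest" shape of this hunk can be sketched standalone. This is a hedged illustration, not the streamrip code: `probe` stands in for `_test_secret`, `first_valid` for `_get_valid_secret`, and `ValueError` for `InvalidAppSecretError`.

```python
import asyncio
from typing import Optional


async def probe(delay: float, value: Optional[str]) -> Optional[str]:
    # Stand-in for _test_secret: a value on success, None on failure.
    await asyncio.sleep(delay)
    return value


async def first_valid(candidates: list[tuple[float, Optional[str]]]) -> str:
    tasks = [asyncio.create_task(probe(d, v)) for d, v in candidates]
    try:
        # as_completed yields futures in completion order, so the first
        # non-None result short-circuits without waiting for slower probes.
        for future in asyncio.as_completed(tasks):
            result = await future
            if result is not None:
                return result
        raise ValueError("no valid candidate")
    finally:
        # Cancel stragglers, then drain them so no task outlives this call.
        pending = [t for t in tasks if not t.done()]
        for t in pending:
            t.cancel()
        await asyncio.gather(*pending, return_exceptions=True)


result = asyncio.run(
    first_valid([(0.2, None), (0.05, "secret-b"), (0.5, "secret-c")])
)
print(result)  # secret-b
```

Note the `finally` here already awaits the cancelled tasks, which is exactly the gap the review comments below point out in the PR's version.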
Comment on lines +422 to +426
⚠️ Potential issue | 🟑 Minor

🧩 Analysis chain

🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail

rg -n -C3 'asyncio.as_completed|task.cancel|_test_secret' streamrip/client/qobuz.py

python - <<'PY'
import asyncio

async def worker():
    try:
        await asyncio.sleep(10)
    finally:
        await asyncio.sleep(0.05)

async def main():
    tasks = [asyncio.create_task(worker()) for _ in range(2)]
    await asyncio.sleep(0)
    for task in tasks:
        task.cancel()
    print("after cancel:", [task.done() for task in tasks])
    await asyncio.sleep(0)
    print("next loop turn:", [task.done() for task in tasks])
    await asyncio.gather(*tasks, return_exceptions=True)
    print("after gather:", [task.done() for task in tasks])

asyncio.run(main())
PY
```

Repository: davidjuarezdev/streamrip_RipDL

Length of output: 1344


**Drain cancelled tasks before returning.**

`task.cancel()` only requests cancellation. Without awaiting the pending tasks, the cancelled `_test_secret()` coroutines continue unwinding against the shared aiohttp session after `_get_valid_secret()` returns, causing background task leakage.

Collect the pending tasks, cancel them, and await their completion:

Suggested fix

```diff
         finally:
-            # Cancel remaining tasks to prevent background task leakage
-            for task in tasks:
-                if not task.done():
-                    task.cancel()
+            pending = [task for task in tasks if not task.done()]
+            for task in pending:
+                task.cancel()
+            if pending:
+                await asyncio.gather(*pending, return_exceptions=True)
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@streamrip/client/qobuz.py` around lines 422 - 426, The finally block in
_get_valid_secret() currently calls task.cancel() on items in tasks but doesn't
await them, so cancelled _test_secret() coroutines may keep running against the
shared aiohttp session; update the cleanup to collect the pending tasks, call
cancel() on each, then await their completion (e.g., via
asyncio.gather(pending_tasks, return_exceptions=True)) to drain/unwind the
coroutines before returning, referencing the tasks list, _test_secret(), and
_get_valid_secret() to locate where to add the await.

Comment on lines +424 to +426
@cubic-dev-ai cubic-dev-ai Bot Apr 3, 2026


P2: Cancelled tasks must be awaited to avoid `Task was destroyed but it is pending!` warnings and to ensure aiohttp connections are properly closed. Add `await asyncio.gather(*tasks, return_exceptions=True)` after cancelling.

Prompt for AI agents
Check if this issue is valid β€” if so, understand the root cause and fix it. At streamrip/client/qobuz.py, line 424:

<comment>Cancelled tasks must be awaited to avoid 'Task was destroyed but it is pending!' warnings and to ensure aiohttp connections are properly closed. Add `await asyncio.gather(*tasks, return_exceptions=True)` after cancelling.</comment>

<file context>
@@ -409,14 +409,21 @@ async def _test_secret(self, secret: str) -> Optional[str]:
-        return working_secrets[0]
+        finally:
+            # Cancel remaining tasks to prevent background task leakage
+            for task in tasks:
+                if not task.done():
+                    task.cancel()
</file context>
Suggested change

```diff
             for task in tasks:
                 if not task.done():
                     task.cancel()
+            await asyncio.gather(*tasks, return_exceptions=True)
```
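The claim behind both comments is easy to check in isolation: cancellation is asynchronous, so tasks are still pending immediately after `cancel()`, and only awaiting them lets their unwinding (and any cleanup in `finally`, such as closing a connection) finish. A minimal standalone sketch, unrelated to the streamrip code:

```python
import asyncio


async def lingering() -> None:
    try:
        await asyncio.sleep(10)
    finally:
        # Simulates cleanup during unwinding (e.g. releasing a connection).
        await asyncio.sleep(0)


async def main() -> tuple[bool, bool]:
    tasks = [asyncio.create_task(lingering()) for _ in range(3)]
    await asyncio.sleep(0)  # let the tasks start running
    for t in tasks:
        t.cancel()
    # cancel() only *requests* cancellation; nothing is done yet.
    before = all(t.done() for t in tasks)
    # Awaiting drains the unwinding coroutines; return_exceptions=True
    # swallows the resulting CancelledError instances.
    await asyncio.gather(*tasks, return_exceptions=True)
    after = all(t.done() for t in tasks)
    return before, after


before, after = asyncio.run(main())
print(before, after)  # False True
```

Returning without that `gather` is what triggers the `Task was destroyed but it is pending!` warning at loop shutdown.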


```python
    async def _request_file_url(
        self,
```