
Lock on file never unlocked when request ends prematurely #697

Closed
2 tasks done
Vija02 opened this issue Jan 16, 2025 · 6 comments · Fixed by #699

@Vija02 (Contributor) commented Jan 16, 2025

Initial checklist

  • I understand this is a bug report and questions should be posted in the Community Forum
  • I searched issues and couldn’t find anything (or linked relevant results below)

Steps to reproduce

When a request ends prematurely (e.g. when the user refreshes the page), it can lead to the file never being unlocked and the promise hanging indefinitely. This only happens with specific timing.

Steps:

  • Extend the getUpload function and artificially delay it by a few seconds.
  • Start an upload to the server.
  • In the middle of the artificial delay, interrupt the request (e.g. by refreshing the page).

What's happening is that in

.pipeline(req.pipe(proxy), new StreamLimiter(maxFileSize), async (stream) => {

the req is handed over to the pipeline, which eventually resolves the Promise. However, if the request is already closed by this point, the pipeline never resolves or rejects. I think adding a quick req.closed check here would solve the issue?

Separately, I also encountered an issue where, if the resource is locked (as in this issue) and the user tries to upload, the server goes into an infinite loop and eventually runs out of heap memory. Fixing this issue should avoid that entirely, but I thought it might be worth looking into as well (from a cursory look, it seems to loop on trying to acquire the lock).

I'm not sure what the best fix is here (maybe just reject if it's closed?). If you give me a direction, I could create a PR.

Expected behavior

It should handle the scenario where the request ends before it reaches that point.

Actual behavior

The whole workflow stayed pending, leaving the lock on the file held forever.

@Vija02 added the bug label on Jan 16, 2025
@fenos (Collaborator) commented Jan 17, 2025

Hi @Vija02, thanks a lot for reporting this.
I would have thought that if the request stream was closed, the pipeline would return right away.

I'll need to investigate further if that's not the case.
If it isn't, then yes, checking whether the request is already closed and rejecting the promise seems a sensible approach.

@fenos (Collaborator) commented Jan 17, 2025

Alternatively, we could add an event listener to the request object as soon as the request is received:

const context = this.createContext(req)

Detect if the request is aborted and cancel the context with context.abort(). This should allow the pipeline to reject correctly.
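The listener idea could look roughly like this; a sketch only, where `createContext` is a stand-in that wires an AbortController to the request rather than the library's actual implementation:

```javascript
// Sketch: tie an AbortController to the incoming request so that a
// client disconnect cancels any work still tied to this request
// (the pipeline, lock acquisition, etc.).
function createContext(req) {
  const controller = new AbortController()
  req.on('close', () => {
    // Client went away: abort everything listening on this signal.
    // A real implementation would also check whether the request
    // completed normally, since 'close' fires in that case too.
    controller.abort()
  })
  return {
    signal: controller.signal,
    abort: () => controller.abort(),
  }
}
```

Anything downstream that accepts an `AbortSignal` (or polls `signal.aborted`) then observes the cancellation without extra plumbing.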

@Murderlon (Member):
Was just going to say: maybe we also need to listen for 'close'. Cancelling the context would be better in that case than only applying it locally in writeToStore.

@fenos (Collaborator) commented Jan 17, 2025

Yep! It makes a lot of sense to cancel the context on close regardless 👍

@fenos (Collaborator) commented Jan 17, 2025

We should probably also pass the context into newLock (or at least the AbortSignal), so that it stops trying to acquire the lock when the context is cancelled.
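An abort-aware retry loop along those lines might look like the sketch below; `tryAcquire` and the retry delay are hypothetical, and the actual locker API may differ:

```javascript
// Sketch of an abort-aware lock retry loop. `tryAcquire` is a
// hypothetical callback that returns true once the lock is obtained.
async function acquireLock(tryAcquire, signal, retryDelayMs = 100) {
  // Re-check the signal on every attempt; without this, a request
  // that died while the resource is locked would retry forever
  // (the runaway loop described above).
  while (!signal.aborted) {
    if (await tryAcquire()) {
      return true
    }
    await new Promise((resolve) => setTimeout(resolve, retryDelayMs))
  }
  throw new Error('Lock acquisition aborted: request was cancelled')
}
```

Paired with a request context that aborts on 'close', this bounds the retry loop by the lifetime of the request instead of spinning indefinitely.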

@Vija02 (Contributor, Author) commented Jan 18, 2025

Thanks so much for the quick fix! It took me a while to pin down the issue, but it was all worth it.
