
Conversation

@samharnack

I periodically get a MatchError when a caller that doesn't exist in Terminator.calls is dropped. I'm not sure how to test for this, or how it gets into this state in the first place. Does anyone have any insight?

@chrismccord chrismccord self-assigned this Oct 8, 2024
@chrismccord
Member

Can you share a specific match error, or the line that fails? We are matching on state.calls in a couple places, so I just want to be sure before I review. Thanks!

@samharnack
Author

It was matching against an empty map %{}.

I don't have the exact error anymore; I'll see if I can reproduce it and provide the stack trace.

This could also be added as another case here:
https://github.com/phoenixframework/flame/blob/main/lib/flame/terminator.ex#L202

I don't know if it's related, but I'm also seeing this error occasionally:

** (ErlangError) Erlang error: :timeout

lib/flame/runner.ex:143 FLAME.Runner.call/4
lib/flame/pool.ex:250 anonymous fn/6 in FLAME.Pool.do_call/4
lib/flame/pool.ex:326 FLAME.Pool.caller_checkout!/5
lib/task/supervised.ex:101 Task.Supervised.invoke_mfa/2
lib/task/supervised.ex:36 Task.Supervised.reply/4

@samharnack
Author

samharnack commented Nov 19, 2024

Finally got the exception 🎉

** (MatchError) no match of right hand side value: %{
#Reference<70218.4250035323.1243086849.196657> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4631.0>, timer: #Reference<70218.4250035323.1243086849.196658>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196670> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4632.0>, timer: #Reference<70218.4250035323.1243086849.196671>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196712> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4636.0>, timer: #Reference<70218.4250035323.1243086849.196713>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196880> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4646.0>, timer: #Reference<70218.4250035323.1243086849.196881>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196901> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4650.0>, timer: #Reference<70218.4250035323.1243086849.196902>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196908> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4651.0>, timer: #Reference<70218.4250035323.1243086849.196909>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196919> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4652.0>, timer: #Reference<70218.4250035323.1243086849.196920>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086849.196941> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4661.0>, timer: #Reference<70218.4250035323.1243086849.196942>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086853.194886> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4656.0>, timer: #Reference<70218.4250035323.1243086853.194887>, placed_child_ref: nil, placed_caller_ref: nil, link?: false},
#Reference<70218.4250035323.1243086853.194899> => %FLAME.Terminator.Caller{from_pid: #PID<70218.4657.0>, timer: #Reference<70218.4250035323.1243086853.194900>, placed_child_ref: nil, placed_caller_ref: nil, link?: false}
}


lib/flame/terminator.ex:333 FLAME.Terminator.drop_caller/2
lib/flame/terminator.ex:203 FLAME.Terminator.handle_info/2
gen_server.erl:2173 :gen_server.try_handle_info/3
gen_server.erl:2261 :gen_server.handle_msg/6
proc_lib.erl:329 :proc_lib.init_p_do_apply/3
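For context, here is a minimal sketch (hypothetical names, not the actual flame source) of how this kind of MatchError arises: if drop_caller/2 pattern-matches the ref out of state.calls, the match raises exactly as in the trace above when that ref has already been removed from the map.

```elixir
defmodule Terminator.Sketch do
  # Hypothetical reduction of a drop_caller/2-style function:
  # the pin-match on `ref` raises MatchError when the caller
  # has already been dropped from state.calls.
  def drop_caller(%{calls: calls} = state, ref) do
    %{^ref => _caller} = calls
    %{state | calls: Map.delete(calls, ref)}
  end
end
```

Calling it a second time with the same ref reproduces the "no match of right hand side value" error, with the remaining callers map as the right-hand side.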

@josevalim
Member

It may be better to make sure we don't call drop_caller if it has been dropped already, rather than making every call now potentially "soft"?
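A minimal sketch of that idea (hypothetical names, not the actual flame code): guard at the call site, e.g. when handling a monitor :DOWN message, so drop_caller itself can keep its strict match.

```elixir
defmodule Terminator.GuardSketch do
  # Hypothetical sketch of the guard-at-the-call-site approach:
  # check state.calls before dropping, since a :DOWN message can
  # arrive after the caller was already removed.
  def handle_down(state, ref) do
    if Map.has_key?(state.calls, ref) do
      drop_caller(state, ref)
    else
      state
    end
  end

  # Strict match is preserved here; reaching this with an unknown
  # ref would still raise, which keeps other call paths honest.
  defp drop_caller(%{calls: calls} = state, ref) do
    %{^ref => _caller} = calls
    %{state | calls: Map.delete(calls, ref)}
  end
end
```

With the guard, a duplicate drop for the same ref is a no-op rather than a crash, without making every other drop_caller call site "soft".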
