Cancel/Interrupt active evaluation #3442
Comments
There isn't a hook to do this at the moment, but we have plans to support this. The simplest API I can think of is a way to set a flag for the […]
cool.
yeah, this is exactly what I'm looking for, like […]
Alright, I'll put this on our backlog then.
It's difficult to use Boa for running untrusted third-party code without a way to interrupt eval. I appreciate the simplicity and effectiveness of rquickjs's set_interrupt_handler method for handling this issue. It's a great solution for many use cases. Here's an example showing how a timeout could be implemented using set_interrupt_handler:

```rust
let timestamp_now = std::time::SystemTime::now();
rt.set_interrupt_handler(Some(Box::new(move || {
    if let Ok(elapsed) = timestamp_now.elapsed() {
        if elapsed.as_millis() > 1000 {
            return true;
        }
    }
    false
})));
```
It could be good if any solution to interrupting evaluation didn't depend on threads, in case you might want to embed the Boa runtime in a wasm/browser environment without threads. Some interpreter runtimes (e.g. Wasmi: https://docs.rs/wasmi/latest/wasmi/struct.Config.html#method.consume_fuel, or Piccolo: https://github.com/kyren/piccolo?tab=readme-ov-file#executor-fuel-and-vm-memory-tracking) define some notion of "fuel" that can effectively count and limit the number of bytecode instructions evaluated, and maybe an approach like that could work for Boa too.

I'd like to be able to use Boa for evaluating code that is generated by AI agents, where I can't guarantee that they don't write nonsense code that might end up in some infinite loop. I'd also love to find an embeddable runtime that can be used in a browser.
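To make the fuel idea concrete, here's a rough, engine-agnostic sketch of what fuel metering tends to look like in a bytecode dispatch loop. None of this is Boa code; the names (`Vm`, `consume_fuel`, `VmError::OutOfFuel`) are purely illustrative:

```rust
// Illustrative sketch only, not Boa's API: the dispatch loop charges each
// opcode against a fuel budget and aborts with a host-visible error once the
// budget is exhausted, so `while(1) {}` cannot spin forever. No threads or
// timers are needed, which keeps the approach viable on wasm targets.
struct Vm {
    fuel: u64,
}

#[derive(Debug)]
enum VmError {
    OutOfFuel,
}

impl Vm {
    /// Charge `cost` fuel units for the opcode about to execute.
    fn consume_fuel(&mut self, cost: u64) -> Result<(), VmError> {
        if self.fuel < cost {
            return Err(VmError::OutOfFuel);
        }
        self.fuel -= cost;
        Ok(())
    }

    /// `step` executes one instruction and returns `false` when the program ends.
    fn run(&mut self, mut step: impl FnMut() -> bool) -> Result<(), VmError> {
        loop {
            // Every instruction costs at least one unit; a real engine could
            // weight expensive opcodes (calls, allocations) more heavily.
            self.consume_fuel(1)?;
            if !step() {
                return Ok(());
            }
        }
    }
}
```

With something along those lines, the embedder would just pick a fuel budget up front and treat `OutOfFuel` like any other evaluation error.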
Interesting; it vaguely looks like the Boa VM already has some support for evaluating opcodes with a limited budget:

```rust
/// Runs the current frame to completion, yielding to the caller each time `budget`
/// "clock cycles" have passed.
#[allow(clippy::future_not_send)]
pub(crate) async fn run_async_with_budget(&mut self, budget: u32) -> CompletionRecord {
```
Looking at this again, it seems like Boa already has some support, and examples, for running the engine async with a budget system that forces the engine to yield to the async runtime every X cycles. There are some examples here:
Not sure if that could support your use case @mattsse / @felipefdl or if there are some notable limitations with this currently @jedel1043? It looks like it could be perfect for my use case.
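For anyone unfamiliar with the pattern, here's a rough, generic sketch (not Boa's actual code) of what "yield to the async runtime every X cycles" means in practice: the interpreter loop counts instructions and suspends itself when the budget runs out, so the host can cancel simply by not polling the future again, or by racing it against a timer:

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

/// Returns `Pending` exactly once while immediately re-waking the task,
/// so `.await`ing it hands control back to the executor for one turn.
struct YieldNow(bool);

impl Future for YieldNow {
    type Output = ();
    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<()> {
        if self.0 {
            Poll::Ready(())
        } else {
            self.0 = true;
            cx.waker().wake_by_ref();
            Poll::Pending
        }
    }
}

/// Generic sketch (not Boa code): run `step` until it returns `false`,
/// yielding to the executor every `budget` instructions. Dropping the
/// returned future cancels the evaluation at the next yield point.
async fn run_with_budget(mut step: impl FnMut() -> bool, budget: u32) {
    let mut remaining = budget.max(1);
    while step() {
        remaining -= 1;
        if remaining == 0 {
            YieldNow(false).await;
            remaining = budget.max(1);
        }
    }
}
```

The key point is that cancellation only requires the loop to reach a yield point, which is exactly what synchronous native functions currently prevent.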
@rib The biggest limitation right now is that the natively implemented functions don't have a way to "suspend", since they're implemented as sync code. This means things like […] I see two possible solutions for this: […]
Ah, right, that makes sense - thanks for clarifying. I was also half hoping it was going to be possible to implement async native functions (i.e. sync from the PoV of JavaScript), so yeah, I guess that's going to be tricky.
Without being familiar with the constraints atm, I vaguely wonder what would stop Boa from having boxed futures à la […]. Imagining that maybe in addition to […]. So unlike […]. I guess it's not really that simple, e.g. considering something then needs to reset the mutable state of the future once the function completes without an error.
Maybe that would be enough? We had to use […]
Usage of […]. Maybe there could be a corresponding […]
That's basically doing manual state machines, and if you have tried to convert a complex for loop into a state machine... yeah, it's not pretty.
hmm, not sure. The complex state machine would be in a regular Future that's boxed within a […]. The thing about the program counter is just to make sure that the VM would keep spinning on the same operation until its […]
Hmm yeah, that could work. The other missing piece is how to communicate to the VM that you want to call a function within another function. We cannot call it directly since that would cause a double mutable borrow, so there must be some mechanism to make the VM suspend the currently running native function, then start executing the callee, then resume the caller with the callee's return value when the callee finishes executing.
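Purely as a thought experiment (none of these types exist in Boa), one shape that suspension could take is for the native function to never re-enter the VM at all: it returns a request describing the call it wants, the VM performs it with its single mutable borrow, and then resumes the native function with the result:

```rust
// Hypothetical types for illustration only; `JsValueStub` stands in for a
// real engine value type. The point is that the native function communicates
// with the VM by returning requests rather than by calling back into it,
// which sidesteps the double mutable borrow.
type JsValueStub = String;

enum NativeStep {
    /// The native function has finished and produced its final return value.
    Done(JsValueStub),
    /// The native function asks the VM to call `callee` with `args`, then
    /// resume it (via `SuspendableNative::resume`) with the callee's result.
    Call {
        callee: JsValueStub,
        args: Vec<JsValueStub>,
    },
}

trait SuspendableNative {
    /// Invoked once at the start (`last_result == None`) and again after each
    /// nested call completes, carrying that call's return value.
    fn resume(&mut self, last_result: Option<JsValueStub>) -> NativeStep;
}
```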
I also don't know atm how to practically deal with creating a newly reset Future once it reaches completion, since Rust doesn't expose the state machine in a way that can simply be reset.
As a draft/prototype of some of my thoughts above, I experimented with this branch that adds basic support for async NativeFunctions that can yield to the application's async runtime but appear synchronous to JavaScript: #4237

I might be wrong, but I don't think it would be too tricky to improve the Waker details so the implementation wouldn't lead to a busy yield loop when waiting for async host functions. It's a lot harder for me to see a good way of extending it to support all the various Operations that themselves make […]
feature
I would like to be able to cancel an evaluation (or `JsValue::call`) after a certain timeout. Perhaps this is already possible, but I wasn't able to find it.
goja, for example, has interrupts: https://github.com/dop251/goja/blob/b396bb4c349df65109dea3df00fb60f6a044950d/runtime.go#L1474C9-L1482
My motivation for this is to have the option to terminate a (malicious) evaluation, for example `while(1) {}`.
Example code
Give a code example that should work after the implementation of this feature.
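Nothing like this exists in Boa today; the snippet below only sketches the kind of API this issue is asking for, with the hypothetical `set_interrupt_handler` left commented out and the rest using the current `boa_engine` eval API:

```rust
use boa_engine::{Context, Source};

fn main() {
    let mut context = Context::default();

    // Hypothetical API, modelled on rquickjs/goja: the VM would call the
    // handler periodically and abort evaluation once it returns `true`.
    //
    //     let start = std::time::Instant::now();
    //     context.set_interrupt_handler(move || start.elapsed().as_millis() > 1000);
    //
    // With such a handler installed, evaluating `while(1) {}` would return an
    // interruption error after roughly one second instead of hanging forever.
    let result = context.eval(Source::from_bytes("1 + 1"));
    println!("{result:?}");
}
```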