Dear ADAM developers,
I just stumbled across your package and it looks amazing! However, there's a question I couldn't find an answer to in your examples or the README: if I am not mistaken, a JAX/PyTorch implementation of the algorithms included in ADAM should allow computing gradients not only with respect to the joint configuration (as far as I understand, that is what your examples do), but also, via autograd, with respect to any model parameter (e.g. joint offsets, i.e. link lengths, or inertial parameters).
Is that something ADAM supports, and if so, would you be so kind as to point me to where to start with something like that?
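For concreteness, here is the kind of thing I have in mind, as a minimal standalone JAX sketch (this does not use ADAM's API; the two-link planar arm and its forward kinematics are made up purely for illustration). Instead of differentiating with respect to the joint angles, `jax.grad` is taken with respect to the link lengths:

```python
import jax
import jax.numpy as jnp


def ee_x(link_lengths, q):
    """End-effector x-coordinate of a toy two-link planar arm.

    link_lengths: array [l1, l2] - the kinematic parameters.
    q: array [q1, q2] - the joint configuration.
    """
    l1, l2 = link_lengths
    return l1 * jnp.cos(q[0]) + l2 * jnp.cos(q[0] + q[1])


# Differentiate w.r.t. the model parameters (argnums=0),
# not the joint configuration (argnums=1).
grad_wrt_lengths = jax.grad(ee_x, argnums=0)

lengths = jnp.array([1.0, 0.5])
q = jnp.array([0.0, 0.0])
g = grad_wrt_lengths(lengths, q)
# With q = (0, 0): d(ee_x)/d(l1) = cos(q1) = 1, d(ee_x)/d(l2) = cos(q1+q2) = 1
print(g)
```

My hope is that the same pattern would work with ADAM's kinematics/dynamics functions, provided the model parameters are exposed as traced inputs rather than baked-in constants.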