Syntax upgrade for Inheritance and Composite wrappers #10

Open
vahidzee opened this issue Dec 31, 2022 · 0 comments
Labels
enhancement New feature or request

Comments


vahidzee commented Dec 31, 2022

In addition to suggestions for improving the proposed syntax for dynamizing Inheritance, I suggest we generalize everything into a unified methodology for parsing the arguments passed to the __init__ of a class decorated with dycode.dynamize.

As a showcase, imagine the code below:

@dy.dynamize(
    dynamize_init=True,
    extends=[torch.nn.Linear],
    merge_args=dict(
        in_features=["conv__in_channels", "torch.nn.Linear__in_features"],
        device=None,
        dtype=None,
    )  # or perhaps extends_merge_args=["dtype", "device"] to only share those
)
class LinearBlock(torch.nn.Linear):
    activation: th.Optional[torch.nn.Module] = dy.composite(default="torch.nn.ReLU", required=False, null_value=None)
    batch_norm: th.Optional[torch.nn.Module] = dy.composite(default="torch.nn.BatchNorm1d", required=False)
    conv = dy.composite(torch.nn.Conv1d)

    def __init__(self):
        # a linear block, with optional activation and batch norm, which passes the results to a conv1d as well
        super().__init__(...)  # instantiate the linear layer
        ...

Although the example might not correspond to any conventional DL architecture, similar scenarios regularly arise when developing deep models.

A neat feature that dycode could potentially handle would be to infer which arguments are not present in the original LinearBlock's __init__, how to pass arguments to the composites, and how to share parameters between them and the inherited classes.

For instance, here, apart from typical activation implementations, all of batch_norm, conv, and even the base Linear module accept a device or a dtype argument at initialization. Mentioning "dtype" and "device" in extends_merge_args should tell dycode to add a single device and dtype argument to LinearBlock, so that we can set them at initialization via LinearBlock(dtype=sth, device=sth).
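The inference step above could plausibly be built on signature introspection. A minimal sketch, assuming stand-in classes with the same constructor shapes as torch.nn.Linear, BatchNorm1d, and Conv1d (the real torch modules would work identically, since their __init__ methods are plain Python functions); shared_init_params is a hypothetical helper, not part of dycode:

```python
import inspect

# Stand-ins mirroring the constructor signatures of the torch modules
class Linear:
    def __init__(self, in_features, out_features, bias=True, device=None, dtype=None): ...

class BatchNorm1d:
    def __init__(self, num_features, eps=1e-5, device=None, dtype=None): ...

class Conv1d:
    def __init__(self, in_channels, out_channels, kernel_size, device=None, dtype=None): ...

def shared_init_params(*classes):
    """Return the parameter names accepted by every class's __init__.

    Hypothetical helper illustrating how dycode could detect arguments
    (like `device` and `dtype`) that are safe to merge into a single
    argument on the dynamized class.
    """
    common = None
    for cls in classes:
        params = {
            name
            for name, p in inspect.signature(cls.__init__).parameters.items()
            if name != "self" and p.kind is not inspect.Parameter.VAR_KEYWORD
        }
        common = params if common is None else common & params
    return common

# Only `device` and `dtype` appear in all three constructors
shared = shared_init_params(Linear, BatchNorm1d, Conv1d)
```

With that set in hand, the decorator could automatically expose one `device` and one `dtype` argument on LinearBlock and forward them to every component that accepts them.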

Similarly, if a parameter could be shared among composite classes and/or base classes but has different names in their constructors, we could declare a shared argument name and spell out what it translates to in merge_args.
Of course, many things here can be inferred automatically, and it might also make sense for dycode.dynamize to parse the __init__ this way by default.

Another helpful feature would be the ability to manipulate arguments passed to composite/base classes from the dynamized constructor using the same __ syntax. For instance, instead of providing args for each composite class separately, why not just parse the arguments passed to the constructor and infer where each provided parameter should go:

LinearBlock(in_features=2, activation__in_place=True, device='cpu', kernel_size=3, conv__padding=0)

In this example, it's obvious where each parameter should go, and it's pretty readable too.
Why not automate the same idea?
Stack the signatures of all composite and inherited base classes, merge their arguments into a single translation table (using merge_args for those that can be shared, or merging same-named arguments by default when merge_args=True), and then use that table to dynamically dispatch the arguments passed to the dynamized class.
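The dispatch step described above could be sketched roughly as follows. This is not dycode's actual implementation; route_kwargs is a hypothetical helper, and the component/argument names are taken from the LinearBlock example (using `inplace`, the actual name of ReLU's constructor argument):

```python
def route_kwargs(kwargs, components, shared=("device", "dtype")):
    """Route `component__arg` kwargs to their target components.

    `components` maps a component name (e.g. "conv") to the set of
    argument names its constructor accepts. Explicitly prefixed keys
    go to the named component, `shared` names are broadcast to every
    component that accepts them, and anything else goes to its unique
    owner (ambiguity raises an error).
    """
    routed = {name: {} for name in components}
    for key, value in kwargs.items():
        if "__" in key:  # explicit target, e.g. conv__padding
            target, _, arg = key.partition("__")
            routed[target][arg] = value
        elif key in shared:  # merged args are broadcast
            for name, accepted in components.items():
                if key in accepted:
                    routed[name][key] = value
        else:  # infer the unique owner from the translation table
            owners = [n for n, accepted in components.items() if key in accepted]
            if len(owners) != 1:
                raise TypeError(f"ambiguous or unknown argument: {key!r}")
            routed[owners[0]][key] = value
    return routed

# Translation table stacked from the (hypothetical) component signatures
components = {
    "linear": {"in_features", "out_features", "device", "dtype"},
    "activation": {"inplace"},
    "conv": {"in_channels", "out_channels", "kernel_size", "padding", "device", "dtype"},
}

routed = route_kwargs(
    {"in_features": 2, "activation__inplace": True, "device": "cpu",
     "kernel_size": 3, "conv__padding": 0},
    components,
)
# routed["conv"] ends up with kernel_size, padding, and the shared device
```

Here `in_features` and `kernel_size` are routed by uniqueness, `device` is broadcast to both modules that accept it, and the `__`-prefixed keys go exactly where the caller pointed them.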

@vahidzee vahidzee added the enhancement New feature or request label Dec 31, 2022