Code to create orthogonal optimisers from torch.optim.Optimizer


Orthogonalised Optimisers

Code for orthogonalising gradients to speed up neural network optimisation.
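To illustrate the idea behind gradient orthogonalisation: each layer's gradient matrix G is replaced by its nearest semi-orthogonal matrix, obtained by taking the SVD G = U S Vᵀ and dropping the singular values, leaving U Vᵀ. The sketch below is a minimal NumPy illustration of that operation, not the package's actual implementation; the function name is ours.

```python
import numpy as np

def orthogonalise_grad(grad: np.ndarray) -> np.ndarray:
    """Return the nearest semi-orthogonal matrix to `grad`.

    Computes the reduced SVD G = U S V^T and discards the singular
    values, returning U V^T (the orthogonal Procrustes solution).
    """
    u, _, vt = np.linalg.svd(grad, full_matrices=False)
    return u @ vt

rng = np.random.default_rng(0)
g = rng.normal(size=(4, 3))   # a stand-in for one layer's gradient
og = orthogonalise_grad(g)

# The result has orthonormal columns: og^T og = I
print(np.allclose(og.T @ og, np.eye(3)))
```

The optimiser then consumes the orthogonalised matrix in place of the raw gradient, so every singular direction of the update carries equal weight.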

Install package

git clone https://github.com/MarkTuddenham/Orthogonal-Optimisers.git
cd Orthogonal-Optimisers
pip install .

or

pip install git+https://github.com/MarkTuddenham/Orthogonal-Optimisers.git#egg=orth_optim

Usage

Then, at the top of your main Python script:

from orth_optim import hook
hook()

Now the torch optimisers have an orthogonal option, e.g.:

torch.optim.SGD(model.parameters(),
                lr=1e-3,
                momentum=0.9,
                orth=True)

Custom Optimisers

If you have a custom optimiser, you can apply the orthogonalise decorator:

from orth_optim import orthogonalise

@orthogonalise
class LARS(torch.optim.Optimizer):
	...
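As a toy illustration of what such a class decorator can do, the sketch below wraps an optimiser's step() so that every 2-D gradient is orthogonalised (via SVD) before the update runs. This is a hypothetical stand-in, not the orth_optim implementation: the real decorator targets torch.optim.Optimizer, whereas ToySGD and its params-as-dicts structure here are invented for the example.

```python
import numpy as np

def orthogonalise_decorator(cls):
    """Wrap cls.step() to orthogonalise 2-D gradients first (toy sketch)."""
    original_step = cls.step

    def step(self, *args, **kwargs):
        for p in self.params:
            if p["grad"].ndim == 2:
                # Replace the gradient with its nearest semi-orthogonal matrix.
                u, _, vt = np.linalg.svd(p["grad"], full_matrices=False)
                p["grad"] = u @ vt
        return original_step(self, *args, **kwargs)

    cls.step = step
    return cls

@orthogonalise_decorator
class ToySGD:
    """Minimal gradient-descent optimiser over dicts of numpy arrays."""

    def __init__(self, params, lr=0.1):
        self.params = params
        self.lr = lr

    def step(self):
        for p in self.params:
            p["value"] -= self.lr * p["grad"]

# Usage: the gradient 2*I is orthogonalised to I before the update is applied.
param = {"value": np.zeros((3, 3)), "grad": np.eye(3) * 2.0}
opt = ToySGD([param], lr=0.1)
opt.step()
```

The decorated class keeps its original interface; only the gradients seen by step() change, which is why the same pattern composes with any torch.optim.Optimizer subclass.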
