This repository contains the 3rd-place solution for the HPO Module Contest.
This contest presents the following challenges:
- Limited trials (100)
- A large number of parameters (10-20)
- A vast search space
My solution is based on the TuRBO algorithm, which balances exploration and exploitation by shrinking or expanding the search region according to how often the best parameters are updated. I also added a restart feature: it monitors the number of updates and forces a restart when the evaluation value stops improving. These enhancements were added to aiaccel for a more effective hyperparameter optimization process. The main features are:
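The trust-region control described above can be sketched as follows. This is a hypothetical illustration of the TuRBO-style shrink/expand rule with a forced restart, not the contest code; all class names, counters, and thresholds (`success_tol`, `failure_tol`, `stall_tol`) are illustrative assumptions.

```python
# Hypothetical sketch of TuRBO-style trust-region control with a forced
# restart when the evaluation value stops changing. Names and thresholds
# are illustrative, not taken from the contest implementation.

class TrustRegionState:
    def __init__(self, length=0.8, length_min=0.5**7, length_max=1.6,
                 success_tol=3, failure_tol=5, stall_tol=10):
        self.length = length          # side length of the trust region
        self.length_min = length_min  # restart once the region shrinks below this
        self.length_max = length_max
        self.success_tol = success_tol
        self.failure_tol = failure_tol
        self.stall_tol = stall_tol    # forced restart after this many stalled trials
        self.successes = 0
        self.failures = 0
        self.stall = 0
        self.best = float("inf")

    def update(self, value):
        """Update counters after one trial; return True if a restart is needed."""
        if value < self.best:
            self.successes += 1
            self.failures = 0
            self.stall = 0
            self.best = value
        else:
            self.failures += 1
            self.successes = 0
            # count consecutive trials that return exactly the same best value
            self.stall = self.stall + 1 if value == self.best else 0
        if self.successes >= self.success_tol:    # expand on repeated success
            self.length = min(2.0 * self.length, self.length_max)
            self.successes = 0
        elif self.failures >= self.failure_tol:   # shrink on repeated failure
            self.length = max(0.5 * self.length, 0.0)
            self.failures = 0
        # restart when the region collapses or the value stops improving
        return self.length < self.length_min or self.stall >= self.stall_tol
```

In real TuRBO the candidate points are drawn from a Gaussian-process posterior restricted to this region; here only the region bookkeeping and the added restart trigger are shown.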
- Restart functionality built on the TuRBO algorithm
- Flexibility to change kernel functions and probability models in BoTorch
- Implemented as an Optuna sampler, enabling ask-and-tell usage
```
HPOModuleContest_3rdPlaceSolution
└── src
    └── workspace
        ├── model   # optimizer parameters
        ├── src     # optimizer main unit
        └── tests   # benchmark function
            └── schwefel_5dim
```
```bash
$ pip install git+https://github.com/aistairc/aiaccel.git
$ pip install -r requirement.txt
$ cd src
$ bash local_run.sh
```
The code is based on BoTorch and Optuna; please follow their licenses as well. Thanks for their awesome work.