Release v1.1.1
Highlights
- Support passing a list of objectives as the objective argument.
- Raise a better error message when the return value of run_trial() or HyperModel.fit() is of the wrong type.
- Various bug fixes for the BayesianOptimization tuner.
- Trial IDs are changed from hex strings to integers counting from 0.
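To illustrate the first highlight, a list of objectives can be reduced to a single scalar for ranking trials. The sketch below is illustrative only, not KerasTuner's actual internals: the Objective class and overall_score function are hypothetical stand-ins showing one common reduction (sum the metrics, negating "max" metrics so that lower totals are always better).

```python
class Objective:
    """A named metric plus the direction it should move in."""

    def __init__(self, name, direction):
        assert direction in ("min", "max")
        self.name = name
        self.direction = direction


def overall_score(objectives, metrics):
    """Combine several objectives into one value to minimize."""
    total = 0.0
    for obj in objectives:
        value = metrics[obj.name]
        # Flip the sign of "max" metrics so smaller totals are better.
        total += value if obj.direction == "min" else -value
    return total


objectives = [Objective("val_loss", "min"), Objective("val_accuracy", "max")]
print(overall_score(objectives, {"val_loss": 0.3, "val_accuracy": 0.9}))
```

Under this reduction, a trial with lower val_loss or higher val_accuracy gets a smaller (better) combined score.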
What's Changed
- Make hyperparameters names visible in Display output by @C-Pro in #634
- Replace import kerastuner with import keras_tuner by @ageron in #640
- Support multi-objective by @haifeng-jin in #641
- reorganize the tests to follow keras best practices by @haifeng-jin in #643
- keep Objective in oracle for backward compatibility by @haifeng-jin in #644
- better error check for returned eval results by @haifeng-jin in #646
- Mitigate the issue of hanging workers after chief already quits when running keras-tuner in distributed tuning mode. by @mtian29 in #645
- Ensure hallucination checks if the Gaussian regressor has been fit be… by @brydon in #650
- Resolves #609: Support for sklearn functions without sample_weight by @brydon in #651
- Resolves #652 and #605: Make human readable trial_id and sync trial numbers between worker Displays by @brydon in #653
- Update tuner.py by @haifeng-jin in #657
- fix(bayesian): scalar optimization result (#655) by @haifeng-jin in #662
- Generalize hallucination checks to avoid racing conditions by @alisterl in #664
- remove scipy from required dependency by @haifeng-jin in #665
- Import scipy.optimize by @haifeng-jin in #667
New Contributors
- @C-Pro made their first contribution in #634
- @ageron made their first contribution in #640
- @mtian29 made their first contribution in #645
- @brydon made their first contribution in #650
- @alisterl made their first contribution in #664
Full Changelog: 1.1.1rc0...1.1.1