Description
Currently, there is no example or demonstration of a hyperparameter optimization function that uses warm_start (GridSearchCV and RandomizedSearchCV ignore warm_start) and tunes multiple classifier types in the same function. Such a function would be especially useful for benchmarking experiments that compare algorithm performance across multiple datasets, and it would reduce computation time.
Enhancement
- I will write a function that, given a classifier and a dictionary of two hyperparameters, finds the optimal pair of hyperparameter values using grid search. In this specific example, max_features and n_estimators will be the parameters being tuned (see the sketch after this list).
- The function will use warm_start when searching through increasing numbers of estimators in the ensemble, so previously fitted estimators are reused rather than refit.
- A function will be included that visualizes the performance of each parameter-value pair as a heatmap, showing in more detail how the optimal parameters were determined (a visualization sketch follows the grid-search sketch below).
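Below is a minimal sketch of the proposed warm_start grid search, assuming a RandomForestClassifier-style estimator, a simple held-out validation split rather than full cross-validation, and illustrative names (warm_start_grid_search, param_grid) that are not part of any existing scikit-learn API.

```python
import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def warm_start_grid_search(estimator, param_grid, X_train, y_train, X_val, y_val):
    """Grid search over max_features and n_estimators that reuses fitted trees.

    Assumes ``param_grid`` has exactly two keys, 'max_features' and
    'n_estimators', and that the estimator supports warm_start.
    """
    n_estimators_grid = sorted(param_grid["n_estimators"])
    scores = np.zeros((len(param_grid["max_features"]), len(n_estimators_grid)))
    for i, max_features in enumerate(param_grid["max_features"]):
        # One warm-started model per max_features value: raising n_estimators
        # and refitting only adds new trees instead of rebuilding the ensemble.
        model = clone(estimator).set_params(warm_start=True,
                                            max_features=max_features)
        for j, n_estimators in enumerate(n_estimators_grid):
            model.set_params(n_estimators=n_estimators)
            model.fit(X_train, y_train)
            scores[i, j] = model.score(X_val, y_val)
    best_i, best_j = np.unravel_index(scores.argmax(), scores.shape)
    best_params = {"max_features": param_grid["max_features"][best_i],
                   "n_estimators": n_estimators_grid[best_j]}
    return best_params, scores


if __name__ == "__main__":
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
    grid = {"max_features": [2, 4, 8], "n_estimators": [25, 50, 100, 200]}
    best_params, scores = warm_start_grid_search(
        RandomForestClassifier(random_state=0), grid,
        X_train, y_train, X_val, y_val)
    print(best_params)
```

Extending this to tune multiple classifier types would amount to looping the same routine over a list of (estimator, param_grid) pairs and keeping the best overall score.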
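And a sketch of the heatmap visualization, assuming the scores array and grid dictionary produced by the function above; matplotlib is used here, though seaborn.heatmap would serve equally well.

```python
import matplotlib.pyplot as plt
import numpy as np


def plot_param_heatmap(scores, param_grid):
    """Plot validation scores for each (max_features, n_estimators) pair."""
    fig, ax = plt.subplots()
    im = ax.imshow(scores, cmap="viridis")
    ax.set_xticks(np.arange(scores.shape[1]))
    ax.set_xticklabels(sorted(param_grid["n_estimators"]))
    ax.set_yticks(np.arange(scores.shape[0]))
    ax.set_yticklabels(param_grid["max_features"])
    ax.set_xlabel("n_estimators")
    ax.set_ylabel("max_features")
    # Annotate each cell so the optimal pair is easy to spot.
    for i in range(scores.shape[0]):
        for j in range(scores.shape[1]):
            ax.text(j, i, f"{scores[i, j]:.3f}",
                    ha="center", va="center", color="white")
    fig.colorbar(im, ax=ax, label="validation accuracy")
    plt.show()
```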