Conversation


@cshih14 cshih14 commented Mar 30, 2020

Solves issue #34

This tutorial tunes the hyperparameters max_features and n_estimators for ExtraTreesClassifier and RandomForestClassifier so that the two models can be compared fairly, each at its own best settings.
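A minimal sketch of the comparison the tutorial sets up, assuming scikit-learn's GridSearchCV and a toy dataset; the data and parameter ranges below are illustrative placeholders, not values from the notebook:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data stands in for whatever dataset the notebook actually uses.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Example grid over the two tuned hyperparameters (assumed values).
param_grid = {
    "max_features": [2, 4, 8, 16],
    "n_estimators": [10, 50, 100],
}

# Tune each classifier separately, then compare their best CV scores.
for clf in (ExtraTreesClassifier(random_state=0),
            RandomForestClassifier(random_state=0)):
    search = GridSearchCV(clf, param_grid, cv=5)
    search.fit(X, y)
    print(type(clf).__name__, search.best_params_,
          round(search.best_score_, 3))
```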

Functions included:

  • Given classifiers and a dictionary of values for two hyperparameters, finds the optimal pair of hyperparameter values for each classifier using grid search. In this example, max_features and n_estimators are the parameters being tuned.

  • Implements warm_start so that the search over increasing numbers of estimators reuses the trees already fit instead of retraining the whole ensemble each time.

  • Visualizes the performance of each parameter pair in a heatmap to show in more detail how the optimal parameters were determined (see the sketch after this list).
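A rough sketch of how the warm_start loop and the heatmap might fit together, shown here for RandomForestClassifier only; the dataset, parameter ranges, and plotting details are assumptions rather than the notebook's actual code:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy data and held-out test split (placeholders for the notebook's dataset).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

max_features_range = [2, 4, 8, 16]       # assumed example values
n_estimators_range = [10, 50, 100, 200]  # assumed example values
scores = np.zeros((len(max_features_range), len(n_estimators_range)))

for i, max_features in enumerate(max_features_range):
    # warm_start=True makes each later call to fit() grow the existing
    # ensemble instead of rebuilding all trees from scratch.
    clf = RandomForestClassifier(max_features=max_features,
                                 warm_start=True, random_state=0)
    for j, n_estimators in enumerate(n_estimators_range):
        clf.set_params(n_estimators=n_estimators)
        clf.fit(X_train, y_train)
        scores[i, j] = clf.score(X_test, y_test)

# Heatmap of accuracy over the (max_features, n_estimators) grid.
fig, ax = plt.subplots()
im = ax.imshow(scores, origin="lower")
ax.set_xticks(range(len(n_estimators_range)))
ax.set_xticklabels(n_estimators_range)
ax.set_yticks(range(len(max_features_range)))
ax.set_yticklabels(max_features_range)
ax.set_xlabel("n_estimators")
ax.set_ylabel("max_features")
fig.colorbar(im, ax=ax, label="test accuracy")
plt.show()
```

The same loop can be repeated for ExtraTreesClassifier; because warm_start reuses previously grown trees, sweeping n_estimators costs little more than a single fit at the largest value.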

@cshih14 cshih14 requested a review from j1c March 30, 2020 18:48
j1c (Member) commented Apr 6, 2020

The notebook looks great! I think the last thing to change is to convert the .ipynb file into a .py file. In sklearn, the .py files are used to build the examples page automatically, and the build won't take .ipynb files.
