
Note

Go to the end to download the full example code or to run this example in your browser via Binder

HPO - Grid Search

Experiment initialization and data preparation

from piml import Experiment
from piml.models import XGB2Regressor

exp = Experiment()
exp.data_loader(data="BikeSharing", silent=True)
exp.data_summary(feature_exclude=["yr", "mnth", "temp"], silent=True)
exp.data_prepare(target="cnt", task_type="regression", silent=True)

Train Model

exp.model_train(model=XGB2Regressor(), name="XGB2")

Define hyperparameter search space for grid search

parameters = {'n_estimators': [100, 300, 500],
              'eta': [0.1, 0.3, 0.5],
              'reg_lambda': [0.0, 0.5, 1.0],
              'reg_alpha': [0.0, 0.5, 1.0]
             }
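The grid enumerates every combination of the listed values: four hyperparameters with three candidates each give 3^4 = 81 configurations to fit and score, which matches the 81 rows in the result table. A quick sanity check of the grid size (plain itertools, independent of PiML):

```python
from itertools import product

parameters = {'n_estimators': [100, 300, 500],
              'eta': [0.1, 0.3, 0.5],
              'reg_lambda': [0.0, 0.5, 1.0],
              'reg_alpha': [0.0, 0.5, 1.0]}

# Every combination of the four value lists is one candidate configuration.
grid = [dict(zip(parameters, combo)) for combo in product(*parameters.values())]
print(len(grid))  # 3 * 3 * 3 * 3 = 81 candidates
```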

Tune hyperparameters of registered models

result = exp.model_tune("XGB2", method="grid", parameters=parameters, metric=['MSE', 'MAE'], test_ratio=0.2)
result.data
                                                                         Rank (by MSE)       MSE       MAE      time
params
{'eta': 0.5, 'n_estimators': 500, 'reg_alpha': 0.5, 'reg_lambda': 0.0}              1  0.006082  0.055082  3.674156
{'eta': 0.5, 'n_estimators': 500, 'reg_alpha': 0.5, 'reg_lambda': 1.0}              2  0.006267  0.055823  3.981528
{'eta': 0.5, 'n_estimators': 500, 'reg_alpha': 0.5, 'reg_lambda': 0.5}              3  0.006328  0.056353  3.933432
{'eta': 0.5, 'n_estimators': 500, 'reg_alpha': 0.0, 'reg_lambda': 1.0}              4  0.006331  0.056674  3.523289
{'eta': 0.5, 'n_estimators': 500, 'reg_alpha': 0.0, 'reg_lambda': 0.0}              5  0.006416  0.056506  4.211272
...                                                                               ...       ...       ...       ...
{'eta': 0.1, 'n_estimators': 100, 'reg_alpha': 0.5, 'reg_lambda': 0.0}             77  0.011644  0.074410  0.522839
{'eta': 0.1, 'n_estimators': 100, 'reg_alpha': 0.5, 'reg_lambda': 1.0}             78  0.011669  0.074375  0.524328
{'eta': 0.1, 'n_estimators': 100, 'reg_alpha': 1.0, 'reg_lambda': 0.0}             79  0.011741  0.074464  0.662398
{'eta': 0.1, 'n_estimators': 100, 'reg_alpha': 1.0, 'reg_lambda': 1.0}             80  0.011752  0.074676  0.523580
{'eta': 0.1, 'n_estimators': 100, 'reg_alpha': 1.0, 'reg_lambda': 0.5}             81  0.011808  0.074723  0.584545

81 rows × 4 columns



Show hyperparameter result plot

fig = result.plot(param='n_estimators', figsize=(6, 4.5))
[Figure: HPO result plot of tuning metrics versus n_estimators]

Refit the model using the top-ranked hyperparameter configuration

params = result.get_params_ranks(rank=1)
exp.model_train(XGB2Regressor(**params), name="XGB2-HPO-GridSearch")
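Here get_params_ranks(rank=1) returns the hyperparameter dict of the top-ranked row in result.data, which is then unpacked into the estimator for refitting. Conceptually (a minimal sketch with a toy stand-in DataFrame, not the PiML internals), selecting the rank-1 configuration amounts to:

```python
import pandas as pd

# Toy stand-in for result.data: candidate params with their MSE scores.
data = pd.DataFrame({
    'params': [{'eta': 0.5, 'n_estimators': 500}, {'eta': 0.1, 'n_estimators': 100}],
    'MSE': [0.0061, 0.0118],
})

# Rank 1 = smallest MSE; its params dict is what gets passed to the refit.
best_params = data.sort_values('MSE').iloc[0]['params']
print(best_params)  # {'eta': 0.5, 'n_estimators': 500}
```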

Compare the default model and the HPO refitted model. First, diagnose the default model:

exp.model_diagnose("XGB2", show="accuracy_table")
          MSE     MAE       R2

Train  0.0090  0.0669   0.7382
Test   0.0095  0.0688   0.7287
Gap    0.0005  0.0019  -0.0095

Next, diagnose the HPO refitted model:

exp.model_diagnose("XGB2-HPO-GridSearch", show="accuracy_table")
          MSE     MAE       R2

Train  0.0057  0.0535   0.8346
Test   0.0063  0.0559   0.8193
Gap    0.0006  0.0024  -0.0153
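Side by side, tuning lowers test MSE from 0.0095 to 0.0063 and lifts test R2 from 0.7287 to 0.8193, with only a slightly larger train-test gap. The relative test-MSE reduction, computed from the two tables above:

```python
default_mse, tuned_mse = 0.0095, 0.0063

# Relative reduction in test MSE from tuning.
improvement = (default_mse - tuned_mse) / default_mse
print(f"{improvement:.1%}")  # prints 33.7%
```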

Total running time of the script: (3 minutes 52.724 seconds)

Estimated memory usage: 31 MB


Download Python source code: plot_1_hpo_grid.py

Download Jupyter notebook: plot_1_hpo_grid.ipynb

Gallery generated by Sphinx-Gallery

© Copyright 2022-, PiML-Toolbox authors.