Tuning an RF's Hyperparameters

Machine Learning with Tree-Based Models in Python

Elie Kawerk

Data Scientist

Random Forests Hyperparameters

  • CART hyperparameters

  • number of estimators

  • bootstrap

  • ...


Tuning is expensive

Hyperparameter tuning is:

  • computationally expensive

  • sometimes yields only a very slight improvement

Weigh the impact of tuning against its cost for the project as a whole.
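When a full grid search is too expensive, scikit-learn's RandomizedSearchCV samples a fixed number of hyperparameter combinations instead of trying them all. A minimal sketch on synthetic data (the dataset and the grid below are illustrative assumptions, not part of the course example):

```python
# Sketch: RandomizedSearchCV tries only n_iter random combinations,
# capping the cost of tuning (dataset and grid are assumptions)
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

X, y = make_regression(n_samples=200, n_features=10, random_state=1)

params = {'n_estimators': [100, 200, 300],
          'max_depth': [4, 6, 8]}

# 4 sampled combinations instead of all 9
search = RandomizedSearchCV(RandomForestRegressor(random_state=1),
                            param_distributions=params,
                            n_iter=4, cv=3, random_state=1)
search.fit(X, y)
print(search.best_params_)
```

With n_iter=4 and cv=3 this fits 12 models rather than the 27 a full grid would require.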


Inspecting RF Hyperparameters in sklearn

# Import RandomForestRegressor 
from sklearn.ensemble import RandomForestRegressor

# Set seed for reproducibility
SEED = 1

# Instantiate a random forests regressor 'rf' 
rf = RandomForestRegressor(random_state=SEED)

# Inspect rf's hyperparameters
rf.get_params()
{'bootstrap': True,
 'criterion': 'mse',
 'max_depth': None,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 1,
 'min_samples_split': 2,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 10,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': 1,
 'verbose': 0,
 'warm_start': False}
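Any of the hyperparameters listed by get_params() can be overridden either in the constructor or later with set_params(). A minimal sketch (the values chosen here are illustrative assumptions):

```python
from sklearn.ensemble import RandomForestRegressor

# Instantiate with the default hyperparameters
rf = RandomForestRegressor(random_state=1)

# Override hyperparameters after instantiation with set_params
# (the values below are arbitrary examples)
rf.set_params(n_estimators=300, max_depth=6)

print(rf.get_params()['n_estimators'])  # 300
print(rf.get_params()['max_depth'])     # 6
```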
# Basic imports
from sklearn.metrics import mean_squared_error as MSE
from sklearn.model_selection import GridSearchCV

# Define a grid of hyperparameters 'params_rf'
params_rf = {'n_estimators': [300, 400, 500],
             'max_depth': [4, 6, 8],
             'min_samples_leaf': [0.1, 0.2],
             'max_features': ['log2', 'sqrt']}

# Instantiate 'grid_rf'
grid_rf = GridSearchCV(estimator=rf,
                       param_grid=params_rf,
                       cv=3,
                       scoring='neg_mean_squared_error',
                       verbose=1,
                       n_jobs=-1)
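The cost of a grid search is the product of the number of values per hyperparameter, times the number of cross-validation folds. A quick sanity check of that arithmetic for this grid:

```python
# GridSearchCV fits one model per hyperparameter combination per fold
params_rf = {'n_estimators': [300, 400, 500],
             'max_depth': [4, 6, 8],
             'min_samples_leaf': [0.1, 0.2],
             'max_features': ['log2', 'sqrt']}

n_candidates = 1
for values in params_rf.values():
    n_candidates *= len(values)

print(n_candidates)      # 36 candidate combinations (3 * 3 * 2 * 2)
print(n_candidates * 3)  # 108 fits with cv=3
```

This matches the "36 candidates, totalling 108 fits" message printed when fitting grid_rf.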

Searching for the best hyperparameters

# Fit 'grid_rf' to the training set
grid_rf.fit(X_train, y_train)
Fitting 3 folds for each of 36 candidates, totalling 108 fits
[Parallel(n_jobs=-1)]: Done  42 tasks      | elapsed:   10.0s
[Parallel(n_jobs=-1)]: Done 108 out of 108 | elapsed:   24.3s finished
RandomForestRegressor(bootstrap=True, criterion='mse', max_depth=4,
           max_features='log2', max_leaf_nodes=None,
           min_impurity_decrease=0.0, min_impurity_split=None,
           min_samples_leaf=0.1, min_samples_split=2,
           min_weight_fraction_leaf=0.0, n_estimators=400, n_jobs=1,
           oob_score=False, random_state=1, verbose=0, warm_start=False)

Extracting the best hyperparameters

# Extract the best hyperparameters from 'grid_rf'
best_hyperparams = grid_rf.best_params_

print('Best hyperparameters:\n', best_hyperparams)
Best hyperparameters:
        {'max_depth': 4,
         'max_features': 'log2', 
         'min_samples_leaf': 0.1,
         'n_estimators': 400}

Evaluating the best model performance

# Extract the best model from 'grid_rf'
best_model = grid_rf.best_estimator_
# Predict the test set labels
y_pred = best_model.predict(X_test)
# Evaluate the test set RMSE
rmse_test = MSE(y_test, y_pred)**(1/2)
# Print the test set RMSE
print('Test set RMSE of rf: {:.2f}'.format(rmse_test))
Test set RMSE of rf: 3.89
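The whole workflow can be sketched end to end on synthetic data. The dataset and the smaller grid below are assumptions for the sake of a self-contained, fast example; the steps mirror the slides above. Note that because scoring='neg_mean_squared_error', best_score_ is a negated MSE and its sign must be flipped before taking the square root:

```python
# End-to-end sketch: tune, extract the best model, evaluate test RMSE
# (synthetic dataset and reduced grid are assumptions)
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error as MSE
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_regression(n_samples=300, n_features=8, noise=10.0,
                       random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

params_rf = {'n_estimators': [100, 200], 'max_depth': [4, 8]}
grid_rf = GridSearchCV(estimator=RandomForestRegressor(random_state=1),
                       param_grid=params_rf,
                       cv=3,
                       scoring='neg_mean_squared_error',
                       n_jobs=-1)
grid_rf.fit(X_train, y_train)

# best_score_ is the negated CV MSE; flip the sign before the root
rmse_cv = (-grid_rf.best_score_) ** 0.5
rmse_test = MSE(y_test, grid_rf.best_estimator_.predict(X_test)) ** 0.5
print('CV RMSE:   {:.2f}'.format(rmse_cv))
print('Test RMSE: {:.2f}'.format(rmse_test))
```

Comparing the cross-validated RMSE with the test RMSE is a quick check for overfitting of the selected model.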

Let's practice!

