Machine Learning with Tree-Based Models in Python
Elie Kawerk
Data Scientist
Random forests hyperparameters:
CART hyperparameters,
number of estimators,
bootstrap,
...
Hyperparameter tuning is:
computationally expensive,
and sometimes leads to only a slight improvement.
Weigh the impact of tuning on the whole project.
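One reason tuning is expensive: exhaustive grid search cost grows multiplicatively, since the number of model fits is the product of the grid sizes times the number of CV folds. A quick sketch using the same grid defined later in this lesson:

```python
# The hyperparameter grid from this lesson
params_rf = {
    'n_estimators': [300, 400, 500],
    'max_depth': [4, 6, 8],
    'min_samples_leaf': [0.1, 0.2],
    'max_features': ['log2', 'sqrt'],
}

# Number of candidate combinations = product of the grid sizes
n_candidates = 1
for values in params_rf.values():
    n_candidates *= len(values)

# Each candidate is fit once per CV fold
cv_folds = 3
n_fits = n_candidates * cv_folds

print(n_candidates, n_fits)  # 36 candidates, 108 fits
```

This matches the log shown below: 3 folds for each of 36 candidates, totalling 108 fits.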
# Import RandomForestRegressor
from sklearn.ensemble import RandomForestRegressor
# Set seed for reproducibility
SEED = 1
# Instantiate a random forests regressor 'rf'
rf = RandomForestRegressor(random_state=SEED)
# Inspect rf's hyperparameters
rf.get_params()
{'bootstrap': True,
'criterion': 'mse',
'max_depth': None,
'max_features': 'auto',
'max_leaf_nodes': None,
'min_impurity_decrease': 0.0,
'min_impurity_split': None,
'min_samples_leaf': 1,
'min_samples_split': 2,
'min_weight_fraction_leaf': 0.0,
'n_estimators': 10,
'n_jobs': -1,
'oob_score': False,
'random_state': 1,
'verbose': 0,
'warm_start': False}
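Any key returned by get_params() can also be changed after instantiation with set_params(), rather than passing it to the constructor. A minimal sketch (the values 400 and 4 are illustrative only):

```python
from sklearn.ensemble import RandomForestRegressor

rf = RandomForestRegressor(random_state=1)

# Update hyperparameters in place; keys match get_params() exactly
rf.set_params(n_estimators=400, max_depth=4)

print(rf.get_params()['n_estimators'])  # 400
print(rf.get_params()['max_depth'])     # 4
```

This is the same mechanism GridSearchCV uses internally to try each candidate combination.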
# Basic imports
from sklearn.metrics import mean_squared_error as MSE
from sklearn.model_selection import GridSearchCV

# Define a grid of hyperparameters 'params_rf'
params_rf = {
    'n_estimators': [300, 400, 500],
    'max_depth': [4, 6, 8],
    'min_samples_leaf': [0.1, 0.2],
    'max_features': ['log2', 'sqrt']
}

# Instantiate 'grid_rf'
grid_rf = GridSearchCV(estimator=rf,
                       param_grid=params_rf,
                       cv=3,
                       scoring='neg_mean_squared_error',
                       verbose=1,
                       n_jobs=-1)
# Fit 'grid_rf' to the training set
grid_rf.fit(X_train, y_train)
Fitting 3 folds for each of 36 candidates, totalling 108 fits
[Parallel(n_jobs=-1)]: Done 42 tasks | elapsed: 10.0s
[Parallel(n_jobs=-1)]: Done 108 out of 108 | elapsed: 24.3s finished
RandomForestRegressor(bootstrap=True, criterion='mse', max_depth=4,
max_features='log2', max_leaf_nodes=None,
min_impurity_decrease=0.0, min_impurity_split=None,
min_samples_leaf=0.1, min_samples_split=2,
min_weight_fraction_leaf=0.0, n_estimators=400, n_jobs=1,
oob_score=False, random_state=1, verbose=0, warm_start=False)
# Extract the best hyperparameters from 'grid_rf'
best_hyperparams = grid_rf.best_params_
print('Best hyperparameters:\n', best_hyperparams)
Best hyperparameters:
{'max_depth': 4,
'max_features': 'log2',
'min_samples_leaf': 0.1,
'n_estimators': 400}
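Because scoring='neg_mean_squared_error' makes GridSearchCV maximize a negative MSE, the best cross-validation score in best_score_ must be negated before taking the square root to recover an RMSE on the scale of the target. A self-contained sketch on synthetic data (make_regression and the tiny grid here are illustrative, not the lesson's dataset):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Small synthetic regression problem, for illustration only
X, y = make_regression(n_samples=100, n_features=5, random_state=1)

grid = GridSearchCV(estimator=RandomForestRegressor(random_state=1),
                    param_grid={'n_estimators': [10, 20]},
                    cv=3,
                    scoring='neg_mean_squared_error')
grid.fit(X, y)

# best_score_ is a *negative* MSE; negate it before taking the root
cv_rmse = (-grid.best_score_) ** 0.5
print(cv_rmse)
```

This CV estimate is a useful sanity check against the test-set RMSE computed below.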
# Extract the best model from 'grid_rf'
best_model = grid_rf.best_estimator_
# Predict the test set labels
y_pred = best_model.predict(X_test)
# Evaluate the test set RMSE
rmse_test = MSE(y_test, y_pred)**(1/2)
# Print the test set RMSE
print('Test set RMSE of rf: {:.2f}'.format(rmse_test))
Test set RMSE of rf: 3.89