1.2 Grid of hyperparameters

Each machine learning model has its own set of hyperparameters. Some models have only a few, while others have many parameters to tune. Again, checking the documentation to understand these hyperparameters is important. Once you know them and the range of values they can take, you can create a grid for a grid search. A grid is simply the full set of combinations of the candidate values of the hyperparameters. To create a grid, we use the expand.grid() function from base R. Here is a simple example.

Consider two parameters, alpha (\(\small \alpha\)) and beta (\(\small \beta\)), such that \(\small \alpha \in [0, 1]\) and \(\small \beta \in [1, 10]\). As we don’t know which combination of these two parameters will give us the best model, we decide to try out multiple combinations and choose the best one. We decide to step alpha in increments of 0.4 and beta in increments of 3, and try every resulting combination.

The code below will give us all the possible combinations satisfying the rules we specified.

expand.grid(alpha = seq(0, 1, by = 0.4),
            beta = seq(1, 10, by = 3))
##    alpha beta
## 1    0.0    1
## 2    0.4    1
## 3    0.8    1
## 4    0.0    4
## 5    0.4    4
## 6    0.8    4
## 7    0.0    7
## 8    0.4    7
## 9    0.8    7
## 10   0.0   10
## 11   0.4   10
## 12   0.8   10

Thus, we get 12 possible combinations (3 values of alpha times 4 values of beta). We can pass this data frame to the tuneGrid argument of caret’s train() function, and it will fit a model for each of the 12 combinations and determine the best-performing one in terms of the metric that we specify.
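
As a sketch of how this looks in practice, the snippet below passes a grid to train() via its tuneGrid argument. Because each caret model defines its own tuning parameters, the example uses the glmnet model (whose parameters are alpha and lambda rather than alpha and beta); the data set (mtcars), the grid values, and the resampling settings are placeholders chosen only for illustration.

library(caret)

# illustrative grid for glmnet's two tuning parameters, alpha and lambda
tune_grid <- expand.grid(alpha  = seq(0, 1, by = 0.4),
                         lambda = seq(0.01, 0.10, by = 0.03))

# 5-fold cross-validation; train() fits one model per row of the grid
ctrl <- trainControl(method = "cv", number = 5)

fit <- train(mpg ~ .,
             data      = mtcars,
             method    = "glmnet",
             trControl = ctrl,
             tuneGrid  = tune_grid,
             metric    = "RMSE")

fit$bestTune   # the winning combination under the chosen metric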