
I have the following model defined, to which I would like to apply hyperparameter tuning. I want to use GridSearchCV and vary the number of layers, etc.

import torch.nn as nn

class Regressor(nn.Module):
    def __init__(self, n_layers=3, n_features=10, activation=nn.ReLU):
        super().__init__()
        self.layers = []
        self.activation_functions = []

        for i in range(n_layers):
            self.layers.append(nn.Linear(n_features, n_features))
            self.activation_functions.append(activation())
            self.add_module(f"layer{i}", self.layers[-1])
            self.add_module(f"act{i}", self.activation_functions[-1])

        self.output = nn.Linear(n_features, 1)
    
    def forward(self, x):
        for layer, act in zip(self.layers, self.activation_functions):
            x = act(layer(x))

        x = self.output(x)
        return x
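
For reference, a quick sanity check of the module as defined above (a minimal sketch; the batch of 4 samples and the 10 input features are assumed purely for illustration):

import torch

net = Regressor(n_layers=2, n_features=10, activation=nn.Tanh)
x = torch.randn(4, 10)   # dummy batch: 4 samples, 10 features each
print(net(x).shape)      # hidden layers keep 10 features, the output layer maps to 1 -> torch.Size([4, 1])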

I have defined the Skorch NeuralNetRegressor as follows:

model = NeuralNetRegressor(
    module=Regressor,
    max_epochs=100,
    batch_size=10,
    module__n_layers=2,
    criterion=nn.MSELoss,
)

print(model.initialize())

My parameter grid is:

param_grid = {
    'model__optimizer': [optim.Adam, optim.Adamax, optim.NAdam],
    'model__max_epochs': list(range(30, 40)),  # Want to ramp between 10 and 100
    'module__activation': [nn.Identity, nn.ReLU, nn.ELU, nn.ReLU6, nn.GELU, nn.Softplus,
                           nn.Softsign, nn.Tanh, nn.Sigmoid, nn.Hardsigmoid],
    'model__batch_size': [10, 12, 15, 20],
    'model__n_layers': list(range(11, 30)),
    'model__lr': [0.0001, 0.0008, 0.009, 0.001, 0.002, 0.003, 0.004, 0.01],
}

When using the Pipeline:

pipeline = Pipeline(steps=[('scaler', StandardScaler()),
                           ('model', NeuralNetRegressor(module=Regressor, device='cuda'))])

grid = GridSearchCV(
    estimator = pipeline,
    param_grid=param_grid,
    n_jobs=-1,
    cv=3,
    error_score='raise',
    return_train_score=True,
    verbose=3
)

I get the following error:

Invalid parameter 'n_layers' for estimator <class 'skorch.regressor.NeuralNetRegressor'>[uninitialized](
  module=<class '__main__.Regressor'>,
). Valid parameters are: ['module', 'criterion', 'optimizer', 'lr', 'max_epochs', 'batch_size', 'iterator_train', 'iterator_valid', 'dataset', 'train_split', 'callbacks', 'predict_nonlinearity', 'warm_start', 'verbose', 'device', 'compile', '_params_to_validate'].

Is there any way to use custom parameter names?

flying_loaf_3
  • Your `model` does not have an `n_layers` parameter, so it's expected that `model__n_layers` produces an error; according to your code, it does have a `module__n_layers` parameter, so shouldn't it be `model__module__n_layers`? – desertnaut May 24 '23 at 20:59
  • On the other hand, what is `FatRegressor`, and where exactly is your `class Regressor` used here? – desertnaut May 24 '23 at 21:00
  • In my code, the class is named FatRegressor. I thought I re-named all instances of it in this post to Regressor. – flying_loaf_3 May 24 '23 at 23:21
  • Please edit your post and rename it now; what about my other remarks? – desertnaut May 25 '23 at 00:32
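
Building on desertnaut's comments above: with the net nested in the Pipeline under the step name model, every grid key has to spell out the full parameter path, i.e. the step name first, then the skorch parameter, and for arguments of Regressor.__init__ an additional module__ segment. A sketch of how the grid from the question might look under that assumption (same search values, only the keys changed):

param_grid = {
    # skorch-level parameters: <pipeline step>__<net parameter>
    'model__optimizer': [optim.Adam, optim.Adamax, optim.NAdam],
    'model__max_epochs': list(range(30, 40)),
    'model__batch_size': [10, 12, 15, 20],
    'model__lr': [0.0001, 0.0008, 0.009, 0.001, 0.002, 0.003, 0.004, 0.01],
    # module-level parameters: <pipeline step>__module__<__init__ argument>
    'model__module__n_layers': list(range(11, 30)),
    'model__module__activation': [nn.Identity, nn.ReLU, nn.ELU, nn.ReLU6, nn.GELU, nn.Softplus,
                                  nn.Softsign, nn.Tanh, nn.Sigmoid, nn.Hardsigmoid],
}

The error message above points in the same direction: n_layers is not among the net's own valid parameters, but module is, so anything meant for the module's constructor has to be reached through the module__ prefix.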

0 Answers