
I want to use the MLPClassifier from scikit-learn:

from sklearn.neural_network import MLPClassifier

mlp = MLPClassifier(hidden_layer_sizes=(50,), max_iter=10, alpha=1e-4,
                    solver='sgd', verbose=10, tol=1e-4, random_state=1,
                    learning_rate_init=.1)

I didn't find any parameter for the loss function; I want it to be mean squared error. Is it possible to set it for this model?

    By *definition*, since `MLPClassifier` is, well, a classifier, you cannot use MSE, since that would make the problem a regression one; for regression problems, you should use [`MLPRegressor`](https://scikit-learn.org/stable/modules/generated/sklearn.neural_network.MLPRegressor.html). – desertnaut Nov 19 '18 at 11:11
  • @desertnaut There might be issues with this particular software implementation, but MSE is an acceptable loss function for a classification problem. In that context, it often goes by the name of Brier score. Yann LeCun has examples on his website of using MSE as the loss function in MNIST neural networks. – Dave Jun 09 '21 at 19:04

1 Answer


According to [the docs](https://scikit-learn.org/stable/modules/generated/sklearn.neural_network.MLPClassifier.html):

> This model optimizes the log-loss function using LBFGS or stochastic gradient descent.

Log-loss is basically the same as cross-entropy.

There is no way to pass another loss function to `MLPClassifier`, so you cannot use MSE. But `MLPRegressor` uses MSE, if you really want that.
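
For a regression target, a minimal sketch of the `MLPRegressor` counterpart, reusing the constructor arguments from the question (the training objective is then squared error):

from sklearn.neural_network import MLPRegressor

# MLPRegressor trains on squared error; same hyperparameters as the classifier in the question.
mlp_reg = MLPRegressor(hidden_layer_sizes=(50,), max_iter=10, alpha=1e-4,
                       solver='sgd', verbose=10, tol=1e-4, random_state=1,
                       learning_rate_init=.1)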

However, the general advice is to stick with cross-entropy loss for classification; it is said to have some advantages over MSE. So you may just want to use `MLPClassifier` as is for your classification problem.
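
If you still want an MSE-style number, one option (a sketch on toy data from `make_classification`, just for illustration) is to train `MLPClassifier` with its built-in log-loss and then compute MSE on the predicted probabilities purely as an evaluation metric; for binary problems that is the Brier score mentioned in the comments:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import log_loss, mean_squared_error

# Hypothetical toy data, only for illustration.
X, y = make_classification(n_samples=1000, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

mlp = MLPClassifier(hidden_layer_sizes=(50,), max_iter=100, alpha=1e-4,
                    solver='sgd', tol=1e-4, random_state=1,
                    learning_rate_init=.1)
mlp.fit(X_train, y_train)                 # optimizes log-loss internally

proba = mlp.predict_proba(X_test)[:, 1]   # probability of the positive class
print("log-loss (training objective):", log_loss(y_test, proba))
print("MSE on probabilities (Brier score):", mean_squared_error(y_test, proba))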
