
Sklearn's gradient boosting classifier accepts deviance and exponential loss, as detailed here and here. But can we pass a custom loss instead of one of its predefined losses ('deviance' or 'exponential')?
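
For reference, this is the kind of built-in usage I mean (a minimal sketch; the exact loss names depend on the scikit-learn version, newer releases use 'log_loss' instead of 'deviance'):

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Only the predefined loss names are accepted here; anything else raises a
# ValueError inside _check_params when fit is called.
clf = GradientBoostingClassifier(loss='deviance')  # or loss='exponential'
clf.fit(X, y)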


Similar questions on Stack Overflow and why they do not answer my question:

I already found this question, which looks most relevant. But it does not cover my question at all, since it does not discuss how one could pass the custom loss to GradientBoostingClassifier.

This question also looks relevant, but the answer does not explain how, once you have defined a loss class (as described there), you pass it as an argument to GradientBoostingClassifier.

Meysam Sadeghi

2 Answers


To do so, you should create a subclass of BaseGradientBoosting, and then a second subclass that inherits from both that first subclass and GradientBoostingClassifier (for the classification case). Inside the first subclass you pass the name of your custom loss to super().__init__, and inside the second subclass you register that name in _SUPPORTED_LOSS.

In addition, to get past the ValueError raised by _check_params in sklearn's gradient boosting, you have to override that method or catch the exception it raises.

For example:

from abc import ABCMeta, abstractmethod

from sklearn.ensemble import GradientBoostingClassifier
from sklearn.ensemble._gb import BaseGradientBoosting  # private module; path depends on the sklearn version
from sklearn.utils.validation import _deprecate_positional_args  # private helper; path depends on the sklearn version


class my_base_gradient_boost(BaseGradientBoosting, metaclass=ABCMeta):
    @abstractmethod
    def __init__(self, **kwargs):
        # Force the loss name to the custom one; every other keyword argument
        # of BaseGradientBoosting is forwarded unchanged.
        super().__init__(loss='my_custom_loss', **kwargs)

    def _check_params(self):
        try:
            super()._check_params()
        except ValueError as e:
            if str(e) == "Loss 'my_custom_loss' not supported. ":
                # Wire up the custom loss object instead of a built-in one.
                self.loss_ = self.my_custom_loss
            else:
                raise


class my_classifier(my_base_gradient_boost, GradientBoostingClassifier):

    _SUPPORTED_LOSS = ('my_custom_loss',)

    @_deprecate_positional_args
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

Please keep in mind that **kwargs stands for all the (keyword) arguments of BaseGradientBoosting.
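
For completeness, here is a rough sketch of what the my_custom_loss object referred to above could look like. This is only an illustration: BinomialDeviance and its method signatures are private scikit-learn API (around versions 0.24/1.0, in sklearn.ensemble._gb_losses) and may differ in your version.

from sklearn.ensemble._gb_losses import BinomialDeviance  # private module; path depends on the sklearn version


class MyCustomLoss(BinomialDeviance):
    """Hypothetical custom loss that reuses the BinomialDeviance machinery."""

    def __call__(self, y, raw_predictions, sample_weight=None):
        # Return the value of the loss for the current raw predictions.
        return super().__call__(y, raw_predictions, sample_weight)

    def negative_gradient(self, y, raw_predictions, **kwargs):
        # Return the pseudo-residuals that each new tree is fitted on.
        return super().negative_gradient(y, raw_predictions, **kwargs)

# The classifier subclass above looks up self.my_custom_loss in _check_params,
# so it needs such an attribute, e.g. my_custom_loss = MyCustomLoss(n_classes=2).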

Well, that's a lot of work, but I could not find a better way to do it. I hope this helps.

P.S. You are right, the linked questions don't answer your question.

Maryam Bahrami

Here's a simple example of how to do a custom loss with a gradient boosting regressor. I made a custom loss function that weights zeros differently, as I was dealing with zero-inflated data, but you could swap in your own loss function.

This logic could be applied to many of the sklearn methods.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor


class CustomGradientBoostingRegressor(GradientBoostingRegressor):
    def __init__(self, penalty_factor=5, **kwargs):
        # Keyword-only forwarding: a bare *args in an estimator's __init__
        # breaks get_params/clone in scikit-learn.
        super().__init__(**kwargs)
        self.penalty_factor = penalty_factor

    def fit(self, X, y, sample_weight=None):
        # Fit as usual, then compute a penalised absolute error that weights
        # errors on zero targets more heavily. As written, this score is only
        # stored for inspection; it does not change the boosting objective.
        super().fit(X, y, sample_weight=sample_weight)
        loss = np.abs(y - self.predict(X))
        loss[y == 0] *= self.penalty_factor
        self.penalized_loss_ = loss
        return self
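
A minimal usage sketch (the data below is made up just to illustrate a zero-inflated target):

import numpy as np

# Hypothetical zero-inflated target: many exact zeros, some positive values.
rng = np.random.RandomState(0)
X = rng.uniform(size=(500, 3))
y = np.where(rng.uniform(size=500) < 0.6, 0.0,
             rng.exponential(scale=2.0, size=500))

model = CustomGradientBoostingRegressor(penalty_factor=10, n_estimators=200)
model.fit(X, y)
print(model.penalized_loss_.mean())
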
mikey