To do so, you should create a subclass of BaseGradientBoosting, plus a second class that subclasses both the first one and GradientBoostingClassifier (in the classification case). Inside the first class, pass the name of the custom loss function to super().__init__(), and inside the second subclass, add that name to the _SUPPORTED_LOSS tuple.
Besides, to get past the ValueError that _check_params raises in scikit-learn's gradient boosting, you have to override this method and handle that exception for your custom loss. Note that the checks following the loss check in _check_params also set attributes such as max_features_ that fit() relies on, so the override below re-runs them with a stand-in loss before swapping in the custom loss object.
For example:
from abc import ABCMeta, abstractmethod

from sklearn.ensemble import GradientBoostingClassifier
from sklearn.ensemble._gb import BaseGradientBoosting  # private module, sklearn ~0.24
from sklearn.utils.validation import _deprecate_positional_args


class my_base_gradient_boost(BaseGradientBoosting, metaclass=ABCMeta):
    @abstractmethod
    def __init__(self, **kwargs):
        # force the custom loss name; all other arguments pass through
        super().__init__(loss='my_custom_loss', **kwargs)

    def _check_params(self):
        try:
            super()._check_params()
        except ValueError as e:
            # sklearn's message ends with a trailing space
            if str(e) == "Loss 'my_custom_loss' not supported. ":
                # re-run the remaining checks with a stand-in loss (they
                # also set self.max_features_, which fit() needs), then
                # swap in the custom loss object
                self.loss = 'deviance'
                super()._check_params()
                self.loss = 'my_custom_loss'
                self.loss_ = self.my_custom_loss
            else:
                raise


class my_classifier(my_base_gradient_boost, GradientBoostingClassifier):
    # extend rather than replace the supported losses; keep it a tuple
    _SUPPORTED_LOSS = GradientBoostingClassifier._SUPPORTED_LOSS + ('my_custom_loss',)

    @_deprecate_positional_args
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
Please keep in mind that **kwargs stands for the remaining keyword arguments of BaseGradientBoosting (n_estimators, learning_rate, and so on).
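For illustration, here is a minimal sketch of the loss object itself and how everything might be wired together. MyCustomLoss is a hypothetical name; to keep the example self-contained it simply reuses BinomialDeviance from sklearn's private _gb_losses module (available around version 0.24), which already satisfies the interface _check_params expects:

from sklearn.datasets import make_classification
from sklearn.ensemble._gb_losses import BinomialDeviance

class MyCustomLoss(BinomialDeviance):
    """Hypothetical loss; override negative_gradient etc. as needed."""
    def negative_gradient(self, y, raw_predictions, **kwargs):
        # this sketch simply reuses the parent's gradient
        return super().negative_gradient(y, raw_predictions, **kwargs)

X, y = make_classification(random_state=0)  # binary toy problem
clf = my_classifier(n_estimators=50)
# attach the loss instance that the _check_params override looks up
clf.my_custom_loss = MyCustomLoss(n_classes=2)
clf.fit(X, y)

Any object implementing the loss interface that BaseGradientBoosting.fit uses (K, init_estimator, __call__, negative_gradient, update_terminal_regions, ...) can be attached this way.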
Well, that's a lot of work, but I could not find a better way to do it. I hope this helps.
P.S. By the way, you are right: the mentioned links don't answer your question.