
I wish to train a model on some data using scikit-learn's GradientBoostingRegressor.

My questions are:

1) Is the algorithm able to capture non-linear relationships? For example, in the case of y=x^2, y increases as x moves toward either negative or positive infinity. What if the graph looks like y=sin(x)?

2) Is the algorithm able to detect interactions/relationships among the features? Specifically, should I add features that are the sums/differences of the raw features to the training set?
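Both behaviours can be checked empirically. The sketch below is illustrative only (the data, hyperparameters, and test points are my own assumptions, not from the question): it fits y=x^2 to probe non-linearity and extrapolation, and a pure interaction target y=x1*x2 to probe whether the trees pick up interactions without hand-engineered sum/difference columns.

```python
# Minimal sketch (illustrative data and settings): probing whether
# GradientBoostingRegressor captures non-linearity and feature interactions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(0)

# --- Question 1: non-linear target y = x^2 ---
X = rng.uniform(-3, 3, size=(500, 1))
y = X[:, 0] ** 2
model = GradientBoostingRegressor(n_estimators=200, random_state=0).fit(X, y)

inside = model.predict([[2.0]])[0]    # inside the training range: close to 4
outside = model.predict([[10.0]])[0]  # outside the range: trees cannot
                                      # extrapolate, so this stays near the
                                      # largest value seen in training (~9)

# --- Question 2: pure interaction target y = x1 * x2 ---
X2 = rng.uniform(-1, 1, size=(1000, 2))
y2 = X2[:, 0] * X2[:, 1]
model2 = GradientBoostingRegressor(n_estimators=200, random_state=0).fit(X2, y2)
r2 = model2.score(X2, y2)  # trees approximate the interaction via successive splits
```

In this sketch the in-range prediction for y=x^2 lands near the true value while the out-of-range one flattens out, and the interaction target is fitted well without sum/difference features; whether explicit engineered features still help shallow trees converge faster is an assumption worth validating on your own data.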

Chong Lip Phang
  • 1
    For the *extrapolation* part (i.e. predicting outside the training range), my answer here may be useful: [Is deep learning bad at fitting simple non linear functions outside training scope?](https://stackoverflow.com/questions/53795142/is-deep-learning-bad-at-fitting-simple-non-linear-functions-outside-training-sco/53796253#53796253) (it is about neural nets, but the rationale is applicable to ML models in general). – desertnaut Feb 11 '19 at 11:39

0 Answers