
I am using h2o4gpu with the h2o4gpu.solvers.xgboost.RandomForestClassifier model, and the parameters I have set are:

XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
              colsample_bytree=1.0, gamma=0, learning_rate=0.1, max_delta_step=0,
              max_depth=8, min_child_weight=1, missing=nan, n_estimators=100,
              n_gpus=1, n_jobs=-1, nthread=None, num_parallel_tree=1, num_round=1,
              objective='binary:logistic', predictor='gpu_predictor',
              random_state=123, reg_alpha=0, reg_lambda=1, scale_pos_weight=1,
              seed=None, silent=False, subsample=1.0, tree_method='gpu_hist')

When I train this model and then predict, everything runs fine on the GPU.

However, when I save the model with pickle and load it back in another notebook, predictions made with predict_proba run entirely on the CPU.
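Roughly, the workflow looks like this (X_train, y_train, X_test, and the file name are placeholders for my actual data and paths):

import pickle

# First notebook: fit the classifier configured above, then predict.
model.fit(X_train, y_train)
probs = model.predict_proba(X_test)  # runs on the GPU as expected

# Save the fitted model.
with open('model.pkl', 'wb') as f:   # placeholder file name
    pickle.dump(model, f)

# Second notebook: load the model back and predict again.
with open('model.pkl', 'rb') as f:
    model = pickle.load(f)
probs = model.predict_proba(X_test)  # this runs on the CPU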

Why is my prediction not running on GPU?

1 Answer


The predictions are meant to run on the CPU, so you don't need a GPU to actually use the model.
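If you do want GPU inference after loading, here is a minimal sketch of one thing to try, assuming the loaded object forwards xgboost's sklearn-wrapper API (get_booster() and the booster's set_param()); whether the h2o4gpu wrapper exposes these is an assumption:

import pickle

with open('model.pkl', 'rb') as f:   # placeholder path
    model = pickle.load(f)

# Point the underlying booster back at the GPU predictor.
# get_booster()/set_param() are xgboost APIs; the h2o4gpu wrapper
# forwarding them is an assumption, not something verified here.
model.get_booster().set_param('predictor', 'gpu_predictor')

probs = model.predict_proba(X_test)  # X_test is a placeholder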

– TomKraljevic
  • I have a very large dataset of about 10 TB, and the data is not fixed and is always increasing (it is streaming data). So how can a CPU run predictions on such a large dataset? – Anshul Gupta Jun 13 '18 at 05:25