XGBoost – get probabilities after multi:softmax function

Question:

I have a question regarding xgboost and multiclass classification. I am not using the sklearn wrapper, as I always struggle with some of its parameters. I was wondering if it is possible to get the probability vector in addition to the softmax output. The following is my code:

import numpy as np
import xgboost as xgb

param = {}
param['objective'] = 'multi:softmax'
param['booster'] = 'gbtree'
param['eta'] = 0.1
param['max_depth'] = 30
param['silent'] = 1
param['nthread'] = 4
param['num_class'] = len(np.unique(label)) + 1
# the number of boosting rounds is an argument to xgb.train,
# not a key in the parameter dict
model = xgb.train(param, dtrain, num_boost_round=40)
# predict
pred = model.predict(dtest)

I would like to be able to call a function like predict_proba, but I do not know if that is possible. Many answers (e.g. https://datascience.stackexchange.com/questions/14527/xgboost-predict-probabilities) suggest moving to the sklearn wrapper; however, I would like to stay with the plain train method.

Asked By: Guido Muscioni


Answers:

If you use param['objective'] = 'multi:softprob' instead of param['objective'] = 'multi:softmax', the classifier returns a vector of predicted probabilities for each class rather than a single predicted label.

See the documentation here:
https://xgboost.readthedocs.io/en/latest/parameter.html#learning-task-parameters

Answered By: MartinVotruba