What are different options for objective functions available in xgboost.XGBClassifier?

Question:

Apart from binary:logistic (which is the default objective function), is there any other built-in objective function that can be used in xgboost.XGBClassifier()?

Asked By: Venkatachalam


Answers:

It’s true that binary:logistic is the default objective for XGBClassifier, but there’s no reason you couldn’t use any of the other objectives offered by the XGBoost package.
For example, you can see in the sklearn.py source code that multi:softprob is used explicitly in the multiclass case.
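For instance, here is a minimal sketch (not from the original answer, using a small random dataset purely for illustration) of fitting XGBClassifier on multiclass data; with more than two classes the wrapper switches to multi:softprob internally, and predict_proba returns one probability per class:

import numpy as np
from xgboost import XGBClassifier

X = np.random.rand(100, 5)
y = np.random.randint(0, 3, size=100)    # three classes: 0, 1, 2

clf = XGBClassifier()                    # objective defaults to binary:logistic
clf.fit(X, y)                            # with 3 classes the wrapper uses multi:softprob
print(clf.predict_proba(X[:2]).shape)    # (2, 3): one probability per class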

Moreover, if it’s really necessary, you can provide a custom objective function (details here).
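As a hedged illustration of the custom-objective route: the sklearn wrapper accepts a callable that returns the gradient and hessian of the loss with respect to the raw margin. The function below is a hand-rolled binary logistic loss, shown only as a sketch of the mechanics:

import numpy as np
from xgboost import XGBClassifier

def logistic_obj(y_true, y_pred):
    # gradient and hessian of the binary logistic loss w.r.t. the raw margin
    p = 1.0 / (1.0 + np.exp(-y_pred))
    grad = p - y_true
    hess = p * (1.0 - p)
    return grad, hess

X = np.random.rand(100, 5)
y = np.random.randint(0, 2, size=100)     # binary labels

clf = XGBClassifier(objective=logistic_obj, n_estimators=20)
clf.fit(X, y)                             # trains with the custom objective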

Answered By: kawa

The default objective in the core XGBoost library is reg:linear (reg:squarederror in newer releases), while XGBClassifier itself defaults to binary:logistic. Other built-in objectives you can pass include:
binary:logistic – returns the predicted probability of the positive class for binary classification
multi:softmax – returns the predicted (hard) class label for multiclass classification
multi:softprob – returns per-class probabilities for multiclass classification

Note: when using multi:softmax (or multi:softprob) as the objective, you also need to pass num_class, the number of classes in the data.
For example, for labels (0, 1, 2) there are 3 classes, so num_class = 3; a sketch using the native API is shown below.
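A minimal sketch of that requirement, using the native xgboost.train API (the sklearn wrapper infers num_class from the labels automatically); the random data is purely illustrative:

import numpy as np
import xgboost as xgb

X = np.random.rand(120, 4)
y = np.random.randint(0, 3, size=120)     # labels 0, 1, 2 -> 3 classes

dtrain = xgb.DMatrix(X, label=y)
params = {
    "objective": "multi:softmax",         # returns hard class labels
    "num_class": 3,                       # required alongside multi:softmax / multi:softprob
}
booster = xgb.train(params, dtrain, num_boost_round=10)
preds = booster.predict(dtrain)           # array of class indices, e.g. [2., 0., 1., ...]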
