verticapy.machine_learning.memmodel.ensemble.XGBClassifier.predict_proba
XGBClassifier.predict_proba(X: list | ndarray) → ndarray
Computes the model's class probabilities for the input matrix.
Parameters#
- X: list | numpy.array
The data on which to make the prediction.
Returns#
- numpy.array
Probabilities.
Examples#
Import the required modules and create three BinaryTreeClassifier models.
```python
from verticapy.machine_learning.memmodel.tree import BinaryTreeClassifier

model1 = BinaryTreeClassifier(
    children_left = [1, 3, None, None, None],
    children_right = [2, 4, None, None, None],
    feature = [0, 1, None, None, None],
    threshold = ["female", 30, None, None, None],
    value = [
        None,
        None,
        [0.8, 0.1, 0.1],
        [0.1, 0.8, 0.1],
        [0.2, 0.2, 0.6],
    ],
    classes = ["a", "b", "c"],
)

model2 = BinaryTreeClassifier(
    children_left = [1, 3, None, None, None],
    children_right = [2, 4, None, None, None],
    feature = [0, 1, None, None, None],
    threshold = ["female", 30, None, None, None],
    value = [
        None,
        None,
        [0.7, 0.2, 0.1],
        [0.3, 0.5, 0.2],
        [0.2, 0.2, 0.6],
    ],
    classes = ["a", "b", "c"],
)

model3 = BinaryTreeClassifier(
    children_left = [1, 3, None, None, None],
    children_right = [2, 4, None, None, None],
    feature = [0, 1, None, None, None],
    threshold = ["female", 30, None, None, None],
    value = [
        None,
        None,
        [0.4, 0.4, 0.2],
        [0.2, 0.2, 0.6],
        [0.2, 0.5, 0.3],
    ],
    classes = ["a", "b", "c"],
)
```
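As a minimal sketch of how one of these trees routes a row (the split semantics below are inferred from the example's output, not taken from the library source): a string threshold appears to act as an equality test that sends matches to the left child, while a numeric threshold sends rows with `x < threshold` to the left.

```python
# Standalone sketch of tree traversal. Assumption: string thresholds are
# equality tests (x == t goes left), numeric thresholds test x < t.
tree1 = {
    "children_left": [1, 3, None, None, None],
    "children_right": [2, 4, None, None, None],
    "feature": [0, 1, None, None, None],
    "threshold": ["female", 30, None, None, None],
    "value": [None, None, [0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.2, 0.2, 0.6]],
}

def route(tree, row):
    node = 0
    while tree["value"][node] is None:  # descend until we reach a leaf
        t = tree["threshold"][node]
        x = row[tree["feature"][node]]
        go_left = (x == t) if isinstance(t, str) else (x < t)
        node = tree["children_left" if go_left else "children_right"][node]
    return tree["value"][node]

leaf_male = route(tree1, ["male", 100])    # root -> right -> leaf 2
leaf_young = route(tree1, ["female", 20])  # root -> left -> node 1 -> left -> leaf 3
```

Here `leaf_male` is `[0.8, 0.1, 0.1]` and `leaf_young` is `[0.1, 0.8, 0.1]`, matching the leaf values defined for `model1` above.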
Now create an XGBClassifier model built from these trees.
```python
from verticapy.machine_learning.memmodel.ensemble import XGBClassifier

model_xgbc = XGBClassifier(
    trees = [model1, model2, model3],
    classes = ["a", "b", "c"],
    logodds = [0.1, 0.12, 0.15],
    learning_rate = 0.1,
)
```
Create a dataset.
```python
data = [["male", 100], ["female", 20], ["female", 50]]
```
Compute the predictions.
```python
model_xgbc.predict_proba(data)
# array([[0.34318847, 0.32840576, 0.32840576],
#        [0.32393829, 0.34024456, 0.33581715],
#        [0.32394919, 0.33138502, 0.34466579]])
```
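To see where these numbers come from, here is a hedged reconstruction of the computation for the first row, `["male", 100]` (the formula is inferred by matching the output above, not quoted from the library source): each class score is `logodds + learning_rate * sum of leaf values across trees`, and the probabilities appear to be the per-class sigmoids of those scores, normalized to sum to 1.

```python
import numpy as np

# Assumed formula, reverse-engineered from the example output:
#   score_i = logodds_i + learning_rate * sum_over_trees(leaf_value_i)
#   prob_i  = sigmoid(score_i) / sum_j sigmoid(score_j)
logodds = np.array([0.1, 0.12, 0.15])
learning_rate = 0.1

# Leaf values reached by ["male", 100] in model1, model2, model3 above.
leaves = np.array([
    [0.8, 0.1, 0.1],
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
])

scores = logodds + learning_rate * leaves.sum(axis=0)  # [0.29, 0.19, 0.19]
probs = 1.0 / (1.0 + np.exp(-scores))                  # per-class sigmoid
probs = probs / probs.sum()                            # normalize to sum to 1
```

The resulting `probs` reproduces the first row of the output above, `[0.34318847, 0.32840576, 0.32840576]`.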
Note: Refer to XGBClassifier for more information about the different methods and usages.