
verticapy.machine_learning.memmodel.ensemble.XGBRegressor.predict_sql

XGBRegressor.predict_sql(X: list | ndarray) → str

Returns the SQL code needed to deploy the model.

Parameters

X: ArrayLike

The names or values of the input predictors.

Returns

str

SQL code.

Import the required modules and create three BinaryTreeRegressor models.

from verticapy.machine_learning.memmodel.tree import BinaryTreeRegressor

# Each list is indexed by node ID: children_left / children_right give the
# child node IDs (None for leaves), feature is the index of the split column,
# threshold is the split value (a string implies an equality test, a number
# implies a "<" test), and value holds the prediction stored in each leaf.
model1 = BinaryTreeRegressor(
    children_left = [1, 3, None, None, None],
    children_right = [2, 4, None, None, None],
    feature = [0, 1, None, None, None],
    threshold = ["female", 30, None, None, None],
    value = [None, None, 3, 11, 23],
)


model2 = BinaryTreeRegressor(
    children_left = [1, 3, None, None, None],
    children_right = [2, 4, None, None, None],
    feature = [0, 1, None, None, None],
    threshold = ["female", 30, None, None, None],
    value = [None, None, -3, 12, 56],
)


model3 = BinaryTreeRegressor(
    children_left = [1, 3, None, None, None],
    children_right = [2, 4, None, None, None],
    feature = [0, 1, None, None, None],
    threshold = ["female", 30, None, None, None],
    value = [None, None, 1, 3, 6],
)
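Before assembling the ensemble, it can help to inspect the SQL produced by a single tree. Each in-memory tree exposes the same predict_sql method, so the sketch below (using the same column names introduced later in this example) should return the nested CASE expression for the first tree:

# Sketch: SQL expression for the first tree alone; the ensemble built below
# combines these per-tree expressions.
model1.predict_sql(["sex", "fare"])
# Expected (roughly): CASE WHEN sex = 'female'
#                       THEN (CASE WHEN fare < 30 THEN 11 ELSE 23 END)
#                       ELSE 3 END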

Now, let’s create an XGBRegressor model that combines these trees.

from verticapy.machine_learning.memmodel.ensemble import XGBRegressor

# "mean" is the average of the response column (the baseline prediction) and
# "eta" is the learning rate applied to the summed tree outputs.
model_xgbr = XGBRegressor(
    trees = [model1, model2, model3],
    mean = 2.5,
    eta = 0.9,
)

Let’s use the following column names:

cnames = ["sex", "fare"]

Get the SQL code needed to deploy the model.

model_xgbr.predict_sql(cnames)
Out[8]: "((CASE WHEN sex = 'female' THEN (CASE WHEN fare < 30 THEN 11 ELSE 23 END) ELSE 3 END) + (CASE WHEN sex = 'female' THEN (CASE WHEN fare < 30 THEN 12 ELSE 56 END) ELSE -3 END) + (CASE WHEN sex = 'female' THEN (CASE WHEN fare < 30 THEN 3 ELSE 6 END) ELSE 1 END)) * 0.9 + 2.5"
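The generated expression mirrors the XGBoost formula: the per-tree predictions are summed, scaled by eta, and shifted by mean. As a rough in-memory check (the sample rows below are illustrative and not part of the original example), the memmodel's predict method should reproduce the same arithmetic:

data = [["female", 25.0], ["male", 100.0]]

# For ["female", 25.0] the trees return 11, 12 and 3, so the prediction
# should be 0.9 * (11 + 12 + 3) + 2.5 = 25.9.
# For ["male", 100.0] the trees return 3, -3 and 1, so it should be
# 0.9 * (3 - 3 + 1) + 2.5 = 3.4.
model_xgbr.predict(data)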

Note

Refer to XGBRegressor for more information about the different methods and usages.