
verticapy.machine_learning.vertica.ensemble.XGBRegressor#

class verticapy.machine_learning.vertica.ensemble.XGBRegressor(name: str = None, overwrite_model: bool = False, max_ntree: int = 10, max_depth: int = 5, nbins: int = 32, split_proposal_method: Literal['local', 'global'] = 'global', tol: float = 0.001, learning_rate: float = 0.1, min_split_loss: float = 0.0, weight_reg: float = 0.0, sample: float = 1.0, col_sample_by_tree: float = 1.0, col_sample_by_node: float = 1.0)#

Creates an XGBRegressor object using the Vertica XGB_REGRESSOR algorithm.

Parameters#

name: str, optional

Name of the model. The model is stored in the DB.

overwrite_model: bool, optional

If set to True, training a model with the same name as an existing model overwrites the existing model.

max_ntree: int, optional

Maximum number of trees that can be created.

max_depth: int, optional

Maximum depth of each tree, an integer between 1 and 20, inclusive.

nbins: int, optional

Number of bins used to find splits in each column, where more splits leads to a longer runtime but more fine-grained, possibly better splits. Must be an integer between 2 and 1000, inclusive.

split_proposal_method: str, optional

Approximate splitting strategy, either 'global' or 'local' ('local' is not yet supported).

tol: float, optional

Approximation error of quantile summary structures used in the approximate split finding method.

learning_rate: float, optional

Weight applied to each tree’s prediction. This reduces each tree’s impact, allowing for later trees to contribute and keeping earlier trees from dominating.

min_split_loss: float, optional

Each split must improve the model’s objective function value by at least this much in order to avoid pruning. A value of 0 is the same as turning off this parameter (trees are still pruned based on positive / negative objective function values).

weight_reg: float, optional

Regularization term that is applied to the weights of the leaves in the regression tree. A higher value leads to more sparse/smooth weights, which often helps to prevent overfitting.

sample: float, optional

Fraction of rows used per iteration in training.

col_sample_by_tree: float, optional

Float in the range (0, 1] that specifies the fraction of columns (features), chosen at random, to use when building each tree.

col_sample_by_node: float, optional

Float in the range (0, 1] that specifies the fraction of columns (features), chosen at random, to use when evaluating each split.

Attributes#

Many attributes are created during the fitting phase.

trees_: list of BinaryTreeRegressor

Tree models are instances of BinaryTreeRegressor, each possessing various attributes. For more detailed information, refer to the documentation for BinaryTreeRegressor.

features_importance_: numpy.array

The importance of features, calculated using the average gain of each tree. To determine the final score, VerticaPy sums the scores of each tree, normalizes them, and applies an activation function to scale them. The features_importance() method must be called once to compute the values; subsequent calls reuse the cached results.

features_importance_trees_: dict of numpy.array

Each entry of the dictionary holds the feature importance of tree i, calculated using the average gain of that tree. The features_importance() method must be called once to compute the values; subsequent calls reuse the cached results.

mean_: float

The mean of the response column.

eta_: float

The learning rate. It determines the step size at each iteration during model training. A well-chosen learning rate is essential for achieving optimal convergence and preventing overshooting or slow convergence; adjusting it is often a matter of balancing model accuracy against computational efficiency.

n_estimators_: int

The number of model estimators.

Note

All attributes can be accessed using the get_attributes() method.

Note

Several other attributes can be accessed by using the get_vertica_attributes() method.
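For a fitted model, a minimal sketch (the attribute names are those documented above):

# List the available VerticaPy attribute names.
model.get_attributes()

# Fetch a single attribute by name, e.g. the number of trees.
model.get_attributes("n_estimators_")

# Inspect the attributes computed natively by Vertica.
model.get_vertica_attributes()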

Examples#

The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning or the Examples section on the website.

Important

Many tree-based models inherit from the XGB base class, and it’s recommended to use it directly for access to a wider range of options.

Load data for machine learning#

We import verticapy:

import verticapy as vp

Hint

By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.

For this example, we will use the winequality dataset.

import verticapy.datasets as vpd

data = vpd.load_winequality()
(Interactive table output: the winequality vDataFrame, with columns fixed_acidity, volatile_acidity, citric_acid, residual_sugar, chlorides, free_sulfur_dioxide, total_sulfur_dioxide, density, pH, sulphates, alcohol (all numeric), quality, good (integer), and color (varchar). Rows: 1-100 | Columns: 14)

Note

VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use them effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.

You can easily divide your dataset into training and testing subsets using the vDataFrame.train_test_split() method. This is a crucial step when preparing your data for machine learning, as it allows you to evaluate the performance of your models accurately.

data = vpd.load_winequality()
train, test = data.train_test_split(test_size = 0.2)

Warning

In this case, VerticaPy utilizes seeded randomization to guarantee the reproducibility of your data split. However, please be aware that this approach may lead to reduced performance. For a more efficient data split, you can use the vDataFrame.to_db() method to save your results into tables or temporary tables. This will help enhance the overall performance of the process.
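A hedged sketch of that approach, assuming vDataFrame.to_db() with a relation_type parameter (the table names are illustrative):

# Materialize each split as a table so that later queries do not
# re-evaluate the seeded random split.
train.to_db("public.winequality_train", relation_type = "table")
test.to_db("public.winequality_test", relation_type = "table")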

Model Initialization#

First we import the XGBRegressor model:

from verticapy.machine_learning.vertica import XGBRegressor

Then we can create the model:

model = XGBRegressor(
    max_ntree = 3,
    max_depth = 3,
    nbins = 6,
    split_proposal_method = 'global',
    tol = 0.001,
    learning_rate = 0.1,
    min_split_loss = 0,
    weight_reg = 0,
    sample = 0.7,
    col_sample_by_tree = 1,
    col_sample_by_node = 1,
)

Hint

In verticapy 1.0.x and higher, you do not need to specify the model name, as the name is automatically assigned. If you need to re-use the model, you can fetch the model name from the model’s attributes.

Important

The model name is crucial for the model management system and versioning. It’s highly recommended to provide a name if you plan to reuse the model later.
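For example, a minimal sketch with an explicit model name (the name "public.xgb_winequality" is illustrative):

model = XGBRegressor(
    name = "public.xgb_winequality",  # illustrative name for later reuse
    overwrite_model = True,
    max_ntree = 3,
    max_depth = 3,
)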

Model Training#

We can now fit the model:

model.fit(
    train,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "quality",
    test,
)

Important

To train a model, you can directly use the vDataFrame or the name of the relation stored in the database. The test set is optional and is only used to compute the test metrics. In verticapy, we don’t work with X matrices and y vectors; instead, we work directly with lists of predictors and the response name.
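As an illustration, training from a relation name instead of a vDataFrame might look like the following sketch (the table name "public.winequality_train" is illustrative, matching the earlier to_db() example):

model.fit(
    "public.winequality_train",  # relation name instead of a vDataFrame
    ["fixed_acidity", "volatile_acidity", "citric_acid"],
    "quality",
)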

Features Importance#

We can conveniently get the features importance:

result = model.features_importance()

Note

In models such as XGBoost, feature importance is calculated using the average gain of each tree. To determine the final score, VerticaPy sums the scores of each tree, normalizes them and applies an activation function to scale them.
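Because features_importance() also accepts a tree_id parameter (see the Methods table below), a per-tree breakdown can be sketched as:

# Importance computed for a single tree, without drawing the chart.
result_tree_0 = model.features_importance(tree_id = 0, show = False)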

Metrics#

We can get the entire report using:

model.report()
metric                     value
explained_variance         0.0424480033064472
max_error                  3.2174649327779
median_absolute_error      0.749793444096306
mean_absolute_error        0.668678201377953
mean_squared_error         0.74661601997019
root_mean_squared_error    0.864069453209746
r2                         0.0413394427370029
r2_adj                     0.0368840102477866
aic                        -365.118332939786
bic                        -329.101064356612
Rows: 1-10 | Columns: 2

Important

Most metrics are computed using a single SQL query, but some of them might require multiple SQL queries. Selecting only the necessary metrics in the report can help optimize performance. E.g. model.report(metrics = ["mse", "r2"]).

You can utilize the score() function to calculate various regression metrics, with the R-squared being the default.

model.score()
Out[4]: 0.0413394427370029
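Other metrics can be requested by name, using the same identifiers as report(); for example:

# Mean squared error instead of the default R-squared.
model.score(metric = "mse")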

Prediction#

Prediction is straightforward:

model.predict(
    test,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "prediction",
)
(Interactive table output: the test vDataFrame with its original 14 columns plus a new prediction column of type Float(22) added by the model. Rows: 1-100 | Columns: 15)

Note

Predictions can be made automatically using the test set, in which case you don’t need to specify the predictors. Alternatively, you can pass only the vDataFrame to the predict() function, but in this case, it’s essential that the column names of the vDataFrame match the predictors and response name in the model.
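Following that note, a minimal sketch where only the vDataFrame is passed (the output column name "prediction_2" is illustrative):

# The column names in test match the model’s predictors,
# so the predictor list can be omitted.
model.predict(test, name = "prediction_2")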

Plots#

Tree models can be visualized by drawing their tree plots. For more examples, check out Machine Learning - Tree Plots.

model.plot_tree()
[Image: XGBRegressor tree plot]

Note

The above example may not render properly in the documentation because of the size of the tree, but it should render nicely in a Jupyter environment.

To plot the graph using Graphviz separately, you can extract the Graphviz DOT source as follows:

model.to_graphviz()
Out[5]: 'digraph Tree {\ngraph [bgcolor="#FFFFFF00"];\n0 [label="\\"density\\"", shape="box", style="filled", fillcolor="#FFFFFF00", fontcolor="#666666", color="#666666"]\n0 -> 1 [label="<= 0.995847", color="#666666", fontcolor="#666666"]\n0 -> 2 [label="> 0.995847", color="#666666", fontcolor="#666666"]\n1 [label="\\"citric_acid\\"", shape="box", style="filled", fillcolor="#FFFFFF00", fontcolor="#666666", color="#666666"]\n1 -> 3 [label="<= 0.276667", color="#666666", fontcolor="#666666"]\n1 -> 4 [label="> 0.276667", color="#666666", fontcolor="#666666"]\n2 [label="\\"volatile_acidity\\"", shape="box", style="filled", fillcolor="#FFFFFF00", fontcolor="#666666", color="#666666"]\n2 -> 5 [label="<= 0.33", color="#666666", fontcolor="#666666"]\n2 -> 6 [label="> 0.33", color="#666666", fontcolor="#666666"]\n3 [label="-0.063588", fillcolor="#FFFFFF00", fontcolor="#666666", shape="none", color="#666666"]\n4 [label="0.292243", fillcolor="#FFFFFF00", fontcolor="#666666", shape="none", color="#666666"]\n5 [label="-0.087916", fillcolor="#FFFFFF00", fontcolor="#666666", shape="none", color="#666666"]\n6 [label="-0.415332", fillcolor="#FFFFFF00", fontcolor="#666666", shape="none", color="#666666"]\n}'

This string can then be saved to a DOT file and parsed by Graphviz.
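A short sketch of that workflow (the file name is illustrative):

# Save the DOT source; it can then be rendered with the Graphviz CLI,
# e.g. dot -Tpng model_tree.dot -o model_tree.png
with open("model_tree.dot", "w") as f:
    f.write(model.to_graphviz())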

A contour plot is another useful visualization for models with two predictors.

model.contour()

Important

Machine learning models with two predictors can usually benefit from their own contour plot. This visual representation aids in exploring predictions and gaining a deeper understanding of how these models perform in different scenarios. Please refer to Contour Plot for more examples.

Model Register#

To register the model for tracking and versioning:

model.register("model_v1")

Please refer to Model Tracking and Versioning for more details on model tracking and versioning.

Model Exporting#

To MemModel

model.to_memmodel()

Note

MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.

The to_sql() and to_python() export methods below rely on MemModel internally, and it is recommended to use MemModel directly.
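A minimal sketch of in-memory scoring via the MemModel (the feature order must match the predictors used at fit time):

# Convert once, then score in memory without querying the database.
mmodel = model.to_memmodel()
mmodel.predict([[4.2, 0.17, 0.36, 1.8, 0.029, 0.9899]])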

To SQL

You can get the SQL query equivalent of the XGB model as follows:

model.to_sql()
Out[6]: '((CASE WHEN "density" < 0.995847 THEN (CASE WHEN "citric_acid" < 0.276667 THEN -0.063588 ELSE 0.292243 END) ELSE (CASE WHEN "volatile_acidity" < 0.33 THEN -0.087916 ELSE -0.415332 END) END) + (CASE WHEN "density" < 0.995847 THEN (CASE WHEN "citric_acid" < 0.276667 THEN -0.033897 ELSE 0.24501 END) ELSE (CASE WHEN "volatile_acidity" < 0.58 THEN -0.181541 ELSE -0.509379 END) END) + (CASE WHEN "volatile_acidity" < 0.58 THEN (CASE WHEN "density" < 0.995847 THEN 0.140738 ELSE -0.145238 END) ELSE (CASE WHEN "volatile_acidity" < 0.83 THEN -0.416869 ELSE -0.776968 END) END)) * 0.1 + 5.82400461627236'

Note

This SQL query can be directly used in any database.
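For example, a sketch that inlines the generated expression in a plain SQL query (the relation name is illustrative):

# The expression is self-contained SQL, so it can be embedded anywhere.
query = f"SELECT {model.to_sql()} AS prediction FROM public.winequality_test;"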

Deploy SQL

To get the SQL query that uses the built-in Vertica prediction functions:

model.deploySQL()
Out[7]: 'PREDICT_XGB_REGRESSOR("fixed_acidity", "volatile_acidity", "citric_acid", "residual_sugar", "chlorides", "density" USING PARAMETERS model_name = \'"public"."_verticapy_tmp_xgbregressor_v_demo_b39141a2e22c11eea3a80242ac120002_"\', match_by_pos = \'true\')'

To Python

To obtain the prediction function in Python syntax, use the following code:

X = [[4.2, 0.17, 0.36, 1.8, 0.029, 0.9899]]

model.to_python()(X)
Out[9]: array([5.89180372])

Hint

The to_python() method is used to retrieve predictions, probabilities, or cluster distances. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.

__init__(name: str = None, overwrite_model: bool = False, max_ntree: int = 10, max_depth: int = 5, nbins: int = 32, split_proposal_method: Literal['local', 'global'] = 'global', tol: float = 0.001, learning_rate: float = 0.1, min_split_loss: float = 0.0, weight_reg: float = 0.0, sample: float = 1.0, col_sample_by_tree: float = 1.0, col_sample_by_node: float = 1.0) None#

Must be overridden in the child class

Methods

__init__([name, overwrite_model, max_ntree, ...])

Must be overridden in the child class

contour([nbins, chart])

Draws the model's contour plot.

deploySQL([X])

Returns the SQL code needed to deploy the model.

does_model_exists(name[, raise_error, ...])

Checks whether the model is stored in the Vertica database.

drop()

Drops the model from the Vertica database.

export_models(name, path[, kind])

Exports machine learning models.

features_importance([tree_id, show, chart])

Computes the model's features importance.

fit(input_relation, X, y[, test_relation, ...])

Trains the model.

get_attributes([attr_name])

Returns the model attributes.

get_match_index(x, col_list[, str_check])

Returns the matching index.

get_params()

Returns the parameters of the model.

get_plotting_lib([class_name, chart, ...])

Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.

get_score([tree_id])

Returns the feature importance metrics for the input tree.

get_tree([tree_id])

Returns a table with all the input tree information.

get_vertica_attributes([attr_name])

Returns the model Vertica attributes.

import_models(path[, schema, kind])

Imports machine learning models.

plot([max_nb_points, chart])

Draws the model.

plot_tree([tree_id, pic_path])

Draws the input tree.

predict(vdf[, X, name, inplace])

Predicts using the input relation.

register(registered_name[, raise_error])

Registers the model and adds it to in-DB Model versioning environment with a status of 'under_review'.

regression_report([metrics])

Computes a regression report.

report([metrics])

Computes a regression report.

score([metric])

Computes the model score.

set_params([parameters])

Sets the parameters of the model.

summarize()

Summarizes the model.

to_binary(path)

Exports the model to the Vertica Binary format.

to_graphviz([tree_id, classes_color, ...])

Returns the code for a Graphviz tree.

to_json([path])

Creates a Python XGBoost JSON file.

to_memmodel()

Converts the model to an InMemory object that can be used for different types of predictions.

to_pmml(path)

Exports the model to PMML.

to_python([return_proba, ...])

Returns the Python function needed for in-memory scoring without using built-in Vertica functions.

to_sql([X, return_proba, ...])

Returns the SQL code needed to deploy the model without using built-in Vertica functions.

to_tf(path)

Exports the model to the Frozen Graph format (TensorFlow).

Attributes