
verticapy.machine_learning.vertica.ensemble.XGBRegressor.to_json

XGBRegressor.to_json(path: str | None = None) → str | None

Creates an XGBoost JSON file that can be imported into the Python XGBoost API.

Warning

For multiclass classifiers, the probabilities returned by the VerticaPy model and the exported model might differ slightly because of normalization: Vertica uses multinomial logistic regression, while Python XGBoost uses the softmax function. This difference does not affect the model's final predictions. Categorical predictors must be encoded.
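To make the normalization difference concrete, here is a small sketch with hypothetical raw per-class scores (the values are illustrative, not taken from any real model): softmax and normalized logistic outputs yield slightly different probability vectors but the same argmax.

```python
import math

# Hypothetical raw per-class scores (illustrative values only)
raw = [0.2, 0.5, 0.3]

# Python XGBoost multiclass: softmax over the raw scores
exps = [math.exp(v) for v in raw]
softmax = [e / sum(exps) for e in exps]

# Vertica-style: per-class logistic outputs, normalized to sum to 1
logistic = [1.0 / (1.0 + math.exp(-v)) for v in raw]
normalized = [s / sum(logistic) for s in logistic]

# The probability vectors differ slightly...
diff = max(abs(a - b) for a, b in zip(softmax, normalized))

# ...but the argmax (the final class prediction) is the same
pred_softmax = softmax.index(max(softmax))
pred_normalized = normalized.index(max(normalized))
```

Both vectors sum to 1, and both pick class index 1, even though the individual probabilities disagree by a few percent.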

Parameters

path: str, optional

The path and name of the output file. If a file with the same name already exists, the function returns an error.

Returns

None | str

The content of the JSON file if path is not provided; otherwise, nothing is returned.
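The return-or-write contract can be sketched as follows (this is an illustration of the behavior described above, not VerticaPy source; write_json is a hypothetical helper):

```python
import os
import tempfile

# Sketch of the path contract: with no path the JSON string is returned;
# with a path the file is written, and an existing file raises an error
# instead of being silently overwritten.
def write_json(content, path=None):
    if path is None:
        return content  # no path: return the JSON string
    if os.path.exists(path):
        raise ValueError(f"The file '{path}' already exists.")
    with open(path, "w") as f:
        f.write(content)  # write the file; nothing is returned

demo = os.path.join(tempfile.mkdtemp(), "model.json")
returned = write_json('{"learner": {}}')   # no path: string comes back
write_json('{"learner": {}}', demo)        # first write succeeds
```

A second call with the same path would raise, matching the documented error on pre-existing files.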

Let’s use the wine quality dataset:

import verticapy.datasets as vpd

data = vpd.load_winequality()
The dataset has 14 columns (the notebook displays the first 100 rows):

fixed_acidity         Numeric(8)
volatile_acidity      Numeric(9)
citric_acid           Numeric(8)
residual_sugar        Numeric(9)
chlorides             Float(22)
free_sulfur_dioxide   Numeric(9)
total_sulfur_dioxide  Numeric(9)
density               Float(22)
pH                    Numeric(8)
sulphates             Numeric(8)
alcohol               Float(22)
quality               Integer
good                  Integer
color                 Varchar(20)

Rows: 1-100 | Columns: 14

Let’s import the model:

from verticapy.machine_learning.vertica import XGBRegressor

Then we can create the model:

model = XGBRegressor(
    max_ntree = 3,
    max_depth = 3,
    nbins = 6,
    split_proposal_method = 'global',
    tol = 0.001,
    learning_rate = 0.1,
    min_split_loss = 0,
    weight_reg = 0,
    sample = 0.7,
    col_sample_by_tree = 1,
    col_sample_by_node = 1,
)

We can now split the data into a training and a testing set, and fit the model:

train, test = data.train_test_split(test_size = 0.2)

model.fit(
    train,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "quality",
    test,
)

And export it to JSON:

model.to_json()
Out[59]: '{"learner": {"attributes": {"scikit_learn": "{\\"n_estimators\\": 3, \\"objective\\": \\"reg:squarederror\\", \\"max_depth\\": 3, \\"learning_rate\\": 0.1, \\"verbosity\\": null, \\"booster\\": null, \\"tree_method\\": null, \\"gamma\\": null, \\"min_child_weight\\": null, \\"max_delta_step\\": null, \\"subsample\\": null, \\"colsample_bytree\\": 1, \\"colsample_bylevel\\": null, \\"colsample_bynode\\": 1, \\"reg_alpha\\": null, \\"reg_lambda\\": null, \\"scale_pos_weight\\": null, \\"base_score\\": null, \\"missing\\": NaN, \\"num_parallel_tree\\": null, \\"kwargs\\": {}, \\"random_state\\": null, \\"n_jobs\\": null, \\"monotone_constraints\\": null, \\"interaction_constraints\\": null, \\"importance_type\\": \\"gain\\", \\"gpu_id\\": null, \\"validate_parameters\\": null, \\"_estimator_type\\": \\"regressor\\"}"}, "feature_names": [], "feature_types": [], "gradient_booster": {"model": {"trees": [{"base_weights": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "categories": [], "categories_nodes": [], "categories_segments": [], "categories_sizes": [], "default_left": [true, true, true, true, true, true, true], "id": 0, "left_children": [1, 3, 5, -1, -1, -1, -1], "loss_changes": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "parents": [814933951, 0, 0, 1, 1, 2, 2], "right_children": [2, 4, 6, -1, -1, -1, -1], "split_conditions": [0.991067, 0.58, 0.58, 0.06308899999999999, -9.28e-05, -0.0029000000000000002, -0.056189199999999995], "split_indices": [5, 1, 1, 0, 0, 0, 0], "split_type": [0, 0, 0, 0, 0, 0, 0], "sum_hessian": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "tree_param": {"num_deleted": "0", "num_feature": "6", "num_nodes": "7", "size_leaf_vector": "0"}}, {"base_weights": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "categories": [], "categories_nodes": [], "categories_segments": [], "categories_sizes": [], "default_left": [true, true, true, true, true, true, true], "id": 1, "left_children": [1, 3, 5, -1, -1, -1, -1], "loss_changes": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "parents": 
[319210776, 0, 0, 1, 1, 2, 2], "right_children": [2, 4, 6, -1, -1, -1, -1], "split_conditions": [0.991067, 0.83, 0.994913, 0.060307599999999996, -0.18190170000000003, 0.009292300000000002, -0.0224471], "split_indices": [5, 1, 5, 0, 0, 0, 0], "split_type": [0, 0, 0, 0, 0, 0, 0], "sum_hessian": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "tree_param": {"num_deleted": "0", "num_feature": "6", "num_nodes": "7", "size_leaf_vector": "0"}}, {"base_weights": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "categories": [], "categories_nodes": [], "categories_segments": [], "categories_sizes": [], "default_left": [true, true, true, true, true, true, true], "id": 2, "left_children": [1, 3, 5, -1, -1, -1, -1], "loss_changes": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "parents": [876944119, 0, 0, 1, 1, 2, 2], "right_children": [2, 4, 6, -1, -1, -1, -1], "split_conditions": [0.991067, 0.83, 0.276667, 0.0530449, -0.1637115, -0.0289889, 0.002421], "split_indices": [5, 1, 2, 0, 0, 0, 0], "split_type": [0, 0, 0, 0, 0, 0, 0], "sum_hessian": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], "tree_param": {"num_deleted": "0", "num_feature": "6", "num_nodes": "7", "size_leaf_vector": "0"}}], "tree_info": [0, 0, 0], "gbtree_model_param": {"num_trees": "3", "size_leaf_vector": "0"}}, "name": "gbtree"}, "learner_model_param": {"base_score": "5.81911E+00", "num_class": "0", "num_feature": "6"}, "objective": {"name": "reg:squarederror", "reg_loss_param": {"scale_pos_weight": "1"}}}, "version": [1, 6, 2]}'
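Since the returned string is plain JSON, it can be inspected with the standard json module before handing it to the Python XGBoost API. The snippet below uses a trimmed stand-in for the string returned above (model_json is a shortened copy whose key layout matches the real output):

```python
import json

# Trimmed stand-in for the string returned by model.to_json();
# the key layout mirrors the output shown above.
model_json = (
    '{"learner": {"gradient_booster": {"model": '
    '{"gbtree_model_param": {"num_trees": "3", "size_leaf_vector": "0"}}},'
    ' "learner_model_param": {"num_feature": "6", "num_class": "0"},'
    ' "objective": {"name": "reg:squarederror"}}, "version": [1, 6, 2]}'
)

parsed = json.loads(model_json)

# Pull out a few model-level attributes (note: stored as strings)
booster = parsed["learner"]["gradient_booster"]["model"]
num_trees = int(booster["gbtree_model_param"]["num_trees"])
num_features = int(parsed["learner"]["learner_model_param"]["num_feature"])
objective = parsed["learner"]["objective"]["name"]
```

The same string (or the file written via path) can then be loaded with the Python XGBoost API, e.g. through a booster's load_model, to score data outside Vertica.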

Note

Refer to XGBRegressor or XGBClassifier for more information about the different methods and usages.