
verticapy.machine_learning.vertica.linear_model.ElasticNet

class verticapy.machine_learning.vertica.linear_model.ElasticNet(name: str = None, overwrite_model: bool = False, tol: float = 1e-06, C: Annotated[int | float | Decimal, 'Python Numbers'] = 1.0, max_iter: int = 100, solver: Literal['newton', 'bfgs', 'cgd'] = 'cgd', l1_ratio: float = 0.5, fit_intercept: bool = True)

Creates an ElasticNet object using the Vertica Linear Regression algorithm. The Elastic Net is a regularized regression method that linearly combines the L1 and L2 penalties of the Lasso and Ridge methods.
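As a rough sketch, and not necessarily Vertica's exact internal formulation, the elastic net objective commonly takes the form below, where the regularization strength λ corresponds to the C parameter and the mixture α to l1_ratio (these map to the lambda and alpha values shown in the model call string further down this page):

\min_{\beta,\, \beta_0} \; \frac{1}{2n} \lVert y - \beta_0 - X\beta \rVert_2^2 + \lambda \left( \alpha \lVert \beta \rVert_1 + \frac{1 - \alpha}{2} \lVert \beta \rVert_2^2 \right)

Setting l1_ratio to 1 recovers a pure L1 (lasso) penalty, and 0 a pure L2 (ridge) penalty.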

Parameters

name: str, optional

Name of the model. The model is stored in the database.

overwrite_model: bool, optional

If set to True, training a model with the same name as an existing model overwrites the existing model.

tol: float, optional

Convergence tolerance: determines whether the algorithm has reached the specified accuracy result.

C: PythonNumber, optional

The regularization parameter value. The value must be non-negative.

max_iter: int, optional

The maximum number of iterations the algorithm performs before stopping, even if it has not reached the specified accuracy result.

solver: str, optional

The optimizer method used to train the model.

  • newton:

    Newton Method.

  • bfgs:

    Broyden Fletcher Goldfarb Shanno.

  • cgd:

    Coordinate Gradient Descent.

l1_ratio: float, optional

Elastic net mixture parameter that defines the ratio of L1 to L2 regularization: a value of 1 corresponds to a pure L1 (lasso) penalty, and 0 to a pure L2 (ridge) penalty.

fit_intercept: bool, optional

Specifies whether the model includes an intercept. If set to False, no intercept is used in training the model. Note that setting fit_intercept to False does not work well with the BFGS optimizer.

Attributes

Many attributes are created during the fitting phase.

coef_: numpy.array

The regression coefficients. The order of coefficients is the same as the order of columns used during the fitting phase.

intercept_: float

The expected value of the dependent variable when all independent variables are zero, serving as the baseline or constant term in the model.

features_importance_: numpy.array

The importance of features is computed from the model coefficients, which are normalized based on their range; an activation function then calculates the final score. You must call the features_importance() method once to compute the values; subsequent calls reuse the stored results.

Note

All attributes can be accessed using the get_attributes() method.

Note

Several other attributes can be accessed by using the get_vertica_attributes() method.
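For example, a minimal sketch of reading the fitted attributes (the attribute names are those listed above; return types follow the descriptions given for each attribute):

model.get_attributes()              # list the available attribute names
model.get_attributes("coef_")       # regression coefficients as a numpy array
model.get_attributes("intercept_")  # the fitted intercept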

Examples

The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning section or the Examples section on the website.

Load data for machine learning

We import verticapy:

import verticapy as vp

Hint

By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.

For this example, we will use the winequality dataset.

import verticapy.datasets as vpd

data = vpd.load_winequality()
[Output: the first 100 rows of the winequality vDataFrame]
Columns: fixed_acidity (Numeric(8)), volatile_acidity (Numeric(9)), citric_acid (Numeric(8)), residual_sugar (Numeric(9)), chlorides (Float(22)), free_sulfur_dioxide (Numeric(9)), total_sulfur_dioxide (Numeric(9)), density (Float(22)), pH (Numeric(8)), sulphates (Numeric(8)), alcohol (Float(22)), quality (Integer), good (Integer), color (Varchar(20))
Rows: 1-100 | Columns: 14

Note

VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use them effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.

You can easily divide your dataset into training and testing subsets using the vDataFrame.train_test_split() method. This is a crucial step when preparing your data for machine learning, as it allows you to evaluate the performance of your models accurately.

data = vpd.load_winequality()
train, test = data.train_test_split(test_size = 0.2)

Warning

In this case, VerticaPy utilizes seeded randomization to guarantee the reproducibility of your data split. However, please be aware that this approach may lead to reduced performance. For a more efficient data split, you can use the vDataFrame.to_db() method to save your results into tables or temporary tables. This will help enhance the overall performance of the process.
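For example, a hedged sketch of materializing the two splits with vDataFrame.to_db() (the table names below are purely illustrative):

train.to_db("public.winequality_train", relation_type = "table", inplace = True)
test.to_db("public.winequality_test", relation_type = "table", inplace = True)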

Model Initialization

First we import the ElasticNet model:

from verticapy.machine_learning.vertica import ElasticNet

Then we can create the model:

model = ElasticNet(
    tol = 1e-6,
    C = 1,
    max_iter = 100,
    solver = 'cgd',
    l1_ratio = 0.5,
    fit_intercept = True,
)

Hint

In verticapy 1.0.x and higher, you do not need to specify the model name, as the name is automatically assigned. If you need to re-use the model, you can fetch the model name from the model’s attributes.
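For example, a minimal sketch of fetching the auto-generated name (assuming the model_name attribute exposed by recent VerticaPy versions):

model.model_name    # auto-generated name under which the model is stored in the database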

Important

The model name is crucial for the model management system and versioning. It’s highly recommended to provide a name if you plan to reuse the model later.

Model Training

We can now fit the model:

model.fit(
    train,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "quality",
    test,
)



=======
details
=======
   predictor    |coefficient|std_err |t_value |p_value 
----------------+-----------+--------+--------+--------
   Intercept    |  5.81725  | 7.45695| 0.78011| 0.43536
 fixed_acidity  |  0.00000  | 0.01358| 0.00000| 1.00000
volatile_acidity|  0.00000  | 0.09544| 0.00000| 1.00000
  citric_acid   |  0.00000  | 0.10370| 0.00000| 1.00000
 residual_sugar |  0.00000  | 0.00421| 0.00000| 1.00000
   chlorides    |  0.00000  | 0.43409| 0.00000| 1.00000
    density     |  0.00000  | 7.58525| 0.00000| 1.00000


==============
regularization
==============
type| lambda 
----+--------
enet| 1.00000


===========
call_string
===========
linear_reg('"public"."_verticapy_tmp_linearregression_v_demo_03ddc39855a411ef880f0242ac120002_"', '"public"."_verticapy_tmp_view_v_demo_03ee622055a411ef880f0242ac120002_"', '"quality"', '"fixed_acidity", "volatile_acidity", "citric_acid", "residual_sugar", "chlorides", "density"'
USING PARAMETERS optimizer='cgd', epsilon=1e-06, max_iterations=100, regularization='enet', lambda=1, alpha=0.5, fit_intercept=true)

===============
Additional Info
===============
       Name       |Value
------------------+-----
 iteration_count  |  1  
rejected_row_count|  0  
accepted_row_count|5193 

Important

To train a model, you can directly use the vDataFrame or the name of the relation stored in the database. The test set is optional and is only used to compute the test metrics. In verticapy, we don’t work using X matrices and y vectors. Instead, we work directly with lists of predictors and the response name.
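As an illustration, the same training call could reference relation names instead of vDataFrames, assuming the splits were saved to tables as sketched earlier (the table names are hypothetical):

model.fit(
    "public.winequality_train",
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "quality",
    "public.winequality_test",
)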

Metrics

We can get the entire report using:

model.report()
metric                  |         value
------------------------+----------------------
explained_variance      | -6.66133814775094e-15
max_error               |  2.817254
median_absolute_error   |  0.817254
mean_absolute_error     |  0.684241202453988
mean_squared_error      |  0.770030569841153
root_mean_squared_error |  0.87751385734993
r2                      | -4.07092901077988e-05
r2_adj                  | -0.00466695775251402
aic                     | -326.605846197372
bic                     | -290.555541037003
Rows: 1-10 | Columns: 2

Important

Most metrics are computed using a single SQL query, but some of them might require multiple SQL queries. Selecting only the necessary metrics in the report can help optimize performance. E.g. model.report(metrics = ["mse", "r2"]).

For LinearModel, we can easily get the ANOVA table using:

model.report(metrics = "anova")
           |  Df  |         SS         |          MS           |          F           |      p_value
-----------+------+--------------------+-----------------------+----------------------+--------------------
Regression |    6 | 0.0408753428027077 | 0.0068125571337846165 | 0.008799633318156468 | 0.9999969802595494
Residual   | 1297 | 1004.11986307286   | 0.7741864788534001    |                      |
Total      | 1303 | 1004.07898773006   |                       |                      |
Rows: 1-3 | Columns: 6

You can also use the LinearModel.score function to compute the R-squared value:

model.score()
Out[2]: -4.07092901077988e-05

Prediction

Prediction is straightforward:

model.predict(
    test,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "prediction",
)
[Output: the first 100 rows of the test vDataFrame, now including a prediction column (Float(22)) appended to the original 14 winequality columns]
Every displayed prediction equals 5.817254, i.e. the fitted intercept, since all regression coefficients are zero.
Rows: 1-100 | Columns: 15

Note

Predictions can be made automatically using the test set, in which case you don’t need to specify the predictors. Alternatively, you can pass only the vDataFrame to the predict() function, but in this case, it’s essential that the column names of the vDataFrame match the predictors and response name in the model.
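For example, since the test set's column names match the model's predictors, the predictor list can be omitted (a minimal sketch):

model.predict(test, name = "prediction")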

Plots

If the model allows, you can also generate relevant plots. For example, regression plots can be found in the Machine Learning - Regression Plots.

model.plot()

Important

The plotting feature is typically suitable for models with fewer than three predictors.

Parameter Modification

In order to see the parameters:

model.get_params()
Out[3]: 
{'tol': 1e-06,
 'C': 1,
 'max_iter': 100,
 'solver': 'cgd',
 'l1_ratio': 0.5,
 'fit_intercept': True}

And to manually change some of the parameters:

model.set_params({'tol': 0.001})

Model Register

In order to register the model for tracking and versioning:

model.register("model_v1")

Please refer to Model Tracking and Versioning for more details on model tracking and versioning.

Model Exporting

To Memmodel

model.to_memmodel()

Note

MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.

The following methods for exporting the model use MemModel, and it is recommended to use MemModel directly.
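For example, a hedged sketch of in-memory scoring with the exported MemModel, reusing the same input as the to_python() example below:

mmodel = model.to_memmodel()
mmodel.predict([[4.2, 0.17, 0.36, 1.8, 0.029, 0.9899]])   # expected to match the in-database prediction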

To SQL

You can get the SQL code by:

model.to_sql()
Out[5]: '5.817254 + 0.0 * "fixed_acidity" + 0.0 * "volatile_acidity" + 0.0 * "citric_acid" + 0.0 * "residual_sugar" + 0.0 * "chlorides" + 0.0 * "density"'
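Because this is plain SQL, the expression can be reused inside queries or added as a new column; for example, a hedged sketch using vDataFrame.eval (assuming the test vDataFrame carries the predictor columns):

test.eval("prediction_sql", model.to_sql())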

To Python

To obtain the prediction function in Python syntax, use the following code:

X = [[4.2, 0.17, 0.36, 1.8, 0.029, 0.9899]]

model.to_python()(X)
Out[7]: array([5.817254])

Hint

The to_python() method is used to retrieve predictions, probabilities, or cluster distances. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.

__init__(name: str = None, overwrite_model: bool = False, tol: float = 1e-06, C: Annotated[int | float | Decimal, 'Python Numbers'] = 1.0, max_iter: int = 100, solver: Literal['newton', 'bfgs', 'cgd'] = 'cgd', l1_ratio: float = 0.5, fit_intercept: bool = True) -> None

Methods

__init__([name, overwrite_model, tol, C, ...])

contour([nbins, chart])

Draws the model's contour plot.

deploySQL([X])

Returns the SQL code needed to deploy the model.

does_model_exists(name[, raise_error, ...])

Checks whether the model is stored in the Vertica database.

drop()

Drops the model from the Vertica database.

export_models(name, path[, kind])

Exports machine learning models.

features_importance([show, chart])

Computes the model's features importance.

fit(input_relation, X, y[, test_relation, ...])

Trains the model.

get_attributes([attr_name])

Returns the model attributes.

get_match_index(x, col_list[, str_check])

Returns the matching index.

get_params()

Returns the parameters of the model.

get_plotting_lib([class_name, chart, ...])

Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.

get_vertica_attributes([attr_name])

Returns the model Vertica attributes.

import_models(path[, schema, kind])

Imports machine learning models.

plot([max_nb_points, chart])

Draws the model.

predict(vdf[, X, name, inplace])

Predicts using the input relation.

register(registered_name[, raise_error])

Registers the model and adds it to in-DB Model versioning environment with a status of 'under_review'.

regression_report([metrics])

Computes a regression report.

report([metrics])

Computes a regression report.

score([metric])

Computes the model score.

set_params([parameters])

Sets the parameters of the model.

summarize()

Summarizes the model.

to_binary(path)

Exports the model to the Vertica Binary format.

to_memmodel()

Converts the model to an InMemory object that can be used for different types of predictions.

to_pmml(path)

Exports the model to PMML.

to_python([return_proba, ...])

Returns the Python function needed for in-memory scoring without using built-in Vertica functions.

to_sql([X, return_proba, ...])

Returns the SQL code needed to deploy the model without using built-in Vertica functions.

to_tf(path)

Exports the model to the Frozen Graph format (TensorFlow).

Attributes