
verticapy.machine_learning.vertica.decomposition.SVD#

class verticapy.machine_learning.vertica.decomposition.SVD(name: str = None, overwrite_model: bool = False, n_components: int = 0, method: Literal['lapack'] = 'lapack')#

Creates an SVD (Singular Value Decomposition) object using the Vertica SVD algorithm.

Parameters#

name: str, optional

Name of the model. The model is stored in the database.

overwrite_model: bool, optional

If set to True, training a model with the same name as an existing model overwrites the existing model.

n_components: int, optional

The number of components to keep in the model. If this value is not provided, all components are kept. The maximum number of components is the number of non-zero singular values returned by the internal call to SVD. This number is less than or equal to min(number of columns, number of rows).

method: str, optional

The method used to calculate SVD.

  • lapack:

    Lapack definition.

Attributes#

Many attributes are created during the fitting phase.

vectors_: numpy.array

Matrix of the right singular vectors.

values_: numpy.array

Array of the singular values for each input feature.

Note

All attributes can be accessed using the get_attributes() method.

Note

Several other attributes can be accessed by using the get_vertica_attributes() method.

Examples#

The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning or the Examples section on the website.

Load data for machine learning#

We import verticapy:

import verticapy as vp

Hint

By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.

For this example, we will use the winequality dataset.

import verticapy.datasets as vpd

data = vpd.load_winequality()
[Output: the first 100 rows of the winequality vDataFrame. Rows: 1-100 | Columns: 14 — fixed_acidity (Numeric(8)), volatile_acidity (Numeric(9)), citric_acid (Numeric(8)), residual_sugar (Numeric(9)), chlorides (Float(22)), free_sulfur_dioxide (Numeric(9)), total_sulfur_dioxide (Numeric(9)), density (Float(22)), pH (Numeric(8)), sulphates (Numeric(8)), alcohol (Float(22)), quality (Integer), good (Integer), color (Varchar(20))]

We can drop the “color” column, as it is of varchar type.

data.drop("color")

Note

VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use it effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.

Model Initialization#

First we import the SVD model:

from verticapy.machine_learning.vertica import SVD

Then we can create the model:

model = SVD(
    n_components = 3,
)

You can select the number of components with the n_components parameter. If it is not provided, all components are kept.

Hint

In verticapy 1.0.x and higher, you do not need to specify the model name, as the name is automatically assigned. If you need to re-use the model, you can fetch the model name from the model’s attributes.

Important

The model name is crucial for the model management system and versioning. It’s highly recommended to provide a name if you plan to reuse the model later.

Model Training#

We can now fit the model:

model.fit(data)

Important

To train a model, you can directly use the vDataFrame or the name of the relation stored in the database.

Scores#

The decomposition score on the dataset for each transformed column can be calculated by:

model.score()
Out[4]: 
None                                   Score  
fixed_acidity               3.19669356683238  
volatile_acidity           0.264348755504044  
citric_acid                0.228932614877665  
residual_sugar              5.38039944191312  
chlorides                 0.0427139489873566  
free_sulfur_dioxide         2.10040482930188  
total_sulfur_dioxide        4.73074234245691  
density                    0.309070644222815  
pH                          1.14052266015885  
sulphates                  0.304887828479003  
alcohol                     3.19983025734727  
quality                     2.33115583120493  
good                       0.352765059376177  
Rows: 1-13 | Columns: 2

For more details on the function, check out score().

You can also fetch the explained variance by:

model.explained_variance_
Out[5]: array([0.9883586 , 0.00807026, 0.00248987])
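Each explained variance ratio is a squared singular value divided by the sum of all squared singular values. As a rough pure-Python illustration (using the three singular-value denominators visible in the to_sql() output later on this page; the ratios only approximate explained_variance_ above, because the remaining components' singular values are not shown):

```python
# Illustration only: explained variance ratios from singular values.
# These three values are the denominators that appear in the model's
# to_sql() output; the other components are omitted, so the ratios
# only approximate explained_variance_.
singular_values = [10781.471128853, 974.236477296841, 541.139278464763]

total = sum(s ** 2 for s in singular_values)
ratios = [s ** 2 / total for s in singular_values]
# The first ratio dominates (roughly 0.99), mirroring the fact that
# the first component captures nearly all of the variance.
```

By construction the ratios sum to 1 over the components included, which is why dropping components inflates each remaining ratio slightly.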

Principal Components#

To get the transformed dataset in the form of principal components:

model.transform(data)
Out[6]: 
None                  col1                     col2                     col3  
1       0.0108141080605074     -0.00898690374891701      0.00727314637892158  
2       0.0113609654056891       -0.001188361075854      0.00674904946769466  
3       0.0166627178211831       0.0504421228529201       -0.013344560552529  
4       0.0156076978382921       0.0224989559819784      -0.0151162620266151  
5        0.012213209822063     -0.00158312495446874      0.00561623220768627  
6       0.0107729444391737      0.00256515037026606      0.00663577398135461  
7       0.0100155674790493       0.0270441151710081      0.00376563452202662  
8       0.0163554466085546       0.0466574148608037      -0.0189083140020647  
9       0.0162665231231895      -0.0352134384874341     -0.00229993058464557  
10     0.00612561656911707     -0.00800801568019847       0.0162712298233318  
11     0.00897314841200707        0.011950961619011      0.00652009642959555  
12      0.0167155507504623       0.0247094490831147       -0.012430840269128  
13      0.0103795306807778      0.00596388304507683      0.00901423507482371  
14      0.0100017231953311     -0.00992966839614197        0.010317295285701  
15     0.00105751138767118      0.00394349706810463       0.0253800950241802  
16      0.0126503572208966      -0.0113636033145426      0.00657373809766758  
17     0.00981065936878354       0.0148445654146031      0.00584151877390348  
18      0.0105589692445995     -0.00637543518725687      0.00714797062666492  
19      0.0105691701583817      -0.0113629436261072      0.00952468977202705  
20      0.0128160451773933        0.013109827216752     -0.00253564961142822  
21      0.0106152336426165     -0.00521753382020842      0.00742653982907914  
22      0.0170823856441917      -0.0089456684893137     -0.00511840498754081  
23      0.0154778450255575     -0.00844424924439236     -0.00527451937022658  
24      0.0154583833096818     -0.00938853426778954     -0.00436702722677493  
25     0.00112031244917541      0.00264822944127992       0.0247287583690205  
26        0.01148166275695       0.0036172201927973     0.000741660518064614  
27      0.0144487952276349     0.000242787333090651      0.00295673168007065  
28      0.0167162120544499       0.0247153778951736      -0.0122822601942573  
29      0.0167162120544499       0.0247153778951736      -0.0122822601942573  
30      0.0136246739310925     -0.00520528702297353     -0.00207523067964397  
31      0.0136246739310925     -0.00520528702297353     -0.00207523067964397  
32     0.00427461786124396      0.00619558925697606        0.021193124762103  
33      0.0147347506278432       0.0212604638292909     -0.00583120404868525  
34      0.0109545127582932     -0.00551409595328058      0.00576446729975465  
35     0.00934990184656775     -0.00490655658311333       0.0096981848845159  
36      0.0114821012145568      0.00362128901795188     0.000838108278379084  
37     0.00954488009164201      -0.0087742944602826      0.00667892232556901  
38      0.0097823446972645      0.00540987186800628      0.00686511042374751  
39      0.0153847215828542      0.00788842583923433     -0.00194983681799175  
40      0.0148611056419884       0.0061308952609764     -0.00358340879790029  
41      0.0173895891344854       0.0115009120151289      -0.0069213084272989  
42      0.0173895891344854       0.0115009120151289      -0.0069213084272989  
43      0.0121643687586682      0.00976266966246829    -0.000797056270560854  
44      0.0152536586305299       0.0137425595116523     -0.00203872797701861  
45      0.0168497956817113      0.00888850199443394     -0.00497628224283744  
46      0.0159577446628235    -0.000308884162375855     -0.00662477199184711  
47     0.00901925111535918       -0.014608096467652       0.0100922049391821  
48     0.00901925111535918       -0.014608096467652       0.0100922049391821  
49     0.00901925111535918       -0.014608096467652       0.0100922049391821  
50      0.0121376232603184      0.00791387265659954     0.000825965463865728  
51      0.0148159331848086      -0.0176893913881306      0.00119415364309956  
52      0.0110690260632225     -0.00566710155619228      0.00568347667217994  
53      0.0110690260632225     -0.00566710155619228      0.00568347667217994  
54     0.00610760123535686       0.0112349312993762       0.0161564895686208  
55     0.00797094362520398      0.00891093373305621       0.0124018649063269  
56     0.00506201349163946      0.00707814959395135       0.0203449582810378  
57      0.0125014629881725      0.00511287008052087     7.31990933828669e-05  
58      0.0103808460535981      0.00597608952054056      0.00930357835576712  
59      0.0156267641483374     -0.00737015988358075     0.000296320931996781  
60      0.0105716460915439       0.0390215154904031      0.00145608459238926  
61     0.00461567673665519       0.0049791934244293        0.018138757313459  
62     0.00865600250002652       0.0190843469128548      0.00382400225369011  
63     0.00944116324386852      0.00739348227175051      0.00626789774520582  
64     0.00845119313041551      -0.0106681490480454        0.012723953808117  
65     0.00840739769300279     -0.00766543030535406      0.00809400514385817  
66      0.0141461253309525      0.00400108861898725     -0.00378600461398395  
67      0.0144459310487507      0.00930219157574227     -0.00528452907136737  
68     0.00926654633906355      -0.0121787695713679      0.00956171013538002  
69     0.00802816459064073       0.0201851385051224      0.00907586192594589  
70       0.011286840064499       0.0167666532148082      0.00238562301666936  
71     0.00957808963795109     0.000653930534385808      0.00470240138816252  
72      0.0103352354224899    -0.000322779278897781      0.00594078563297295  
73      0.0103352354224899    -0.000322779278897781      0.00594078563297295  
74      0.0144581674641318      0.00105048013198394    -0.000880308960696798  
75     0.00704632805117089     -0.00125311656786212       0.0133444561329968  
76      0.0143405642039511      -0.0114881981132763     -0.00142287511458727  
77     0.00850613281699714     -0.00441482870613591       0.0127431247799744  
78     0.00850613281699714     -0.00441482870613591       0.0127431247799744  
79     0.00850613281699714     -0.00441482870613591       0.0127431247799744  
80      0.0126897436110269       0.0106316956818193      0.00031262943042162  
81      0.0117011565329233      0.00511316508798839      0.00305555709434668  
82      0.0117011565329233      0.00511316508798839      0.00305555709434668  
83      0.0117011565329233      0.00511316508798839      0.00305555709434668  
84      0.0069155247298427     -0.00290726213742451       0.0148426427801898  
85     0.00842520674025287     -0.00406786111660374         0.01442894666402  
86     0.00982671259866448    -0.000974362410638909      0.00930281068428658  
87     0.00448585606275903      0.00750788014098886       0.0184363410346446  
88     0.00953612346062723     -0.00953706740255909       0.0121541620122911  
89     0.00786669266643752      0.00980318166512076      0.00588189636259572  
90     0.00814908792851708     -0.00753491705347241       0.0145490178143425  
91     0.00571174655543743     -0.00138118892270593       0.0176180582249371  
92     0.00571174655543743     -0.00138118892270593       0.0176180582249371  
93     0.00759694685748644      0.00644547936960487      0.00873790743769185  
94     0.00644068949568042     -0.00603581304579231       0.0118720332715975  
95       0.012191422156916       0.0149750163321693     -0.00135461681419584  
96     0.00880846552072471     -0.00757448402887002      0.00955177673886906  
97      0.0059799495172463      0.00199945607844215       0.0159252209807065  
98      0.0177817555288171     -0.00791052940334145     -0.00973689484084486  
99      0.0128497955493046     -0.00265075743350479    -3.69216962120902e-05  
100      0.012062677406264      -0.0118012043899065       0.0061936554761859  
Rows: 1-100 | Columns: 3

Please refer to transform() for more details on transforming a vDataFrame.

Similarly, you can perform the inverse transform to recover the original features using:

model.inverse_transform(data_transformed)

The variable data_transformed includes the PCA components.
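Conceptually, the transform projects each row onto the right singular vectors and divides by the singular values (the same arithmetic visible in the to_sql() output below), and the inverse transform reverses that projection. A minimal pure-Python sketch with a toy 2×2 orthonormal V (illustration only, not VerticaPy's implementation):

```python
# Toy sketch of the SVD transform / inverse transform round trip.
# V holds the right singular vectors as columns; s holds singular values.
V = [[0.6, -0.8],
     [0.8,  0.6]]   # orthonormal columns
s = [2.0, 0.5]

def svd_transform(x):
    # score_k = (sum_j x[j] * V[j][k]) / s[k]
    # (note the division by the singular value, as in to_sql())
    return [sum(x[j] * V[j][k] for j in range(len(x))) / s[k]
            for k in range(len(s))]

def svd_inverse_transform(scores):
    # x_j ≈ sum_k scores[k] * s[k] * V[j][k]
    return [sum(scores[k] * s[k] * V[j][k] for k in range(len(s)))
            for j in range(len(V))]

x = [4.0, 1.0]
roundtrip = svd_inverse_transform(svd_transform(x))
# With all components kept and orthonormal V, roundtrip recovers x.
```

When n_components is smaller than the number of features, the inverse transform only yields an approximation of the original rows.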

Plots - SVD#

You can plot the first two dimensions conveniently using:

model.plot()

Plots - Scree#

You can also plot the Scree plot:

model.plot_scree()

Parameter Modification#

In order to see the parameters:

model.get_params()
Out[7]: {'n_components': 3, 'method': 'lapack'}

And to manually change some of the parameters:

model.set_params({'n_components': 3})

Model Register#

In order to register the model for tracking and versioning:

model.register("model_v1")

Please refer to Model Tracking and Versioning for more details on model tracking and versioning.

Model Exporting#

To MemModel

model.to_memmodel()

Note

MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.

The preceding methods for exporting the model use MemModel, and it is recommended to use MemModel directly.
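As a sketch of the pickling workflow the note describes (using a plain stand-in object, since building a real MemModel requires a fitted model; the assumption here is only that MemModel supports the standard pickle protocol, as stated above):

```python
import pickle

# Hypothetical stand-in for a MemModel-like object; the same
# dumps/loads round trip applies to model.to_memmodel() output.
mem_model = {"name": "svd_sketch",
             "singular_values": [10781.47, 974.24, 541.14]}

blob = pickle.dumps(mem_model)      # serialize to bytes
restored = pickle.loads(blob)       # deserialize back
# restored is an equal copy of mem_model
```

The serialized bytes can be written to disk and reloaded in another process, exactly as you would persist a scikit-learn model.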

SQL

To get the SQL query, use the following:

model.to_sql()
Out[9]: 
['"fixed_acidity" * 0.0472721802865271 / 10781.471128853 + "volatile_acidity" * 0.00206840925765328 / 10781.471128853 + "citric_acid" * 0.00224024758641517 / 10781.471128853 + "residual_sugar" * 0.044395714039541 / 10781.471128853 + "chlorides" * 0.000346326837712132 / 10781.471128853 + "free_sulfur_dioxide" * 0.249206286239206 / 10781.471128853 + "total_sulfur_dioxide" * 0.962684441178257 / 10781.471128853 + "density" * 0.00670829104944216 / 10781.471128853 + "pH" * 0.0215818136774093 / 10781.471128853 + "sulphates" * 0.00345087841775058 / 10781.471128853 + "alcohol" * 0.0697381862438041 / 10781.471128853 + "quality" * 0.0391425145196623 / 10781.471128853 + "good" * 0.00126598337328779 / 10781.471128853',
 '"fixed_acidity" * 0.0396399788533397 / 974.236477296841 + "volatile_acidity" * 0.00188557954207304 / 974.236477296841 + "citric_acid" * 0.00119732121820979 / 974.236477296841 + "residual_sugar" * 0.0197505553836915 / 974.236477296841 + "chlorides" * 0.000483104744026943 / 974.236477296841 + "free_sulfur_dioxide" * 0.961682217712392 / 974.236477296841 + "total_sulfur_dioxide" * -0.258651563180513 / 974.236477296841 + "density" * 0.00544627404807701 / 974.236477296841 + "pH" * 0.0186923356923426 / 974.236477296841 + "sulphates" * 0.00373864932226108 / 974.236477296841 + "alcohol" * 0.0645579238733936 / 974.236477296841 + "quality" * 0.0415184345466081 / 974.236477296841 + "good" * 0.00393668701479983 / 974.236477296841',
 '"fixed_acidity" * 0.521916714261151 / 541.139278464763 + "volatile_acidity" * 0.02994886567606 / 541.139278464763 + "citric_acid" * 0.0173905391926748 / 541.139278464763 + "residual_sugar" * 0.0130884723973582 / 541.139278464763 + "chlorides" * 0.00460654396890658 / 541.139278464763 + "free_sulfur_dioxide" * -0.110072060878328 / 541.139278464763 + "total_sulfur_dioxide" * -0.0705721810950697 / 541.139278464763 + "density" * 0.0623122781813198 / 541.139278464763 + "pH" * 0.205991789180222 / 541.139278464763 + "sulphates" * 0.0388553814537634 / 541.139278464763 + "alcohol" * 0.719424850273745 / 541.139278464763 + "quality" * 0.378635713837342 / 541.139278464763 + "good" * 0.0187739554796354 / 541.139278464763']

To Python

To obtain the prediction function in Python syntax, use the following code:

X = [[3.8, 0.3, 0.02, 11, 0.03, 20, 113, 0.99, 3, 0.4, 12, 6, 0]]

model.to_python()(X)
Out[11]: array([[ 0.0107203 , -0.00876453,  0.0065801 ]])

Hint

The to_python() method is used to retrieve the Principal Component values. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.

__init__(name: str = None, overwrite_model: bool = False, n_components: int = 0, method: Literal['lapack'] = 'lapack') → None#

Must be overridden in the child class

Methods

__init__([name, overwrite_model, ...])

Must be overridden in the child class

contour([nbins, chart])

Draws the model's contour plot.

deployInverseSQL([key_columns, ...])

Returns the SQL code needed to deploy the inverse model.

deploySQL([X, n_components, cutoff, ...])

Returns the SQL code needed to deploy the model.

does_model_exists(name[, raise_error, ...])

Checks whether the model is stored in the Vertica database.

drop()

Drops the model from the Vertica database.

export_models(name, path[, kind])

Exports machine learning models.

fit(input_relation[, X, return_report])

Trains the model.

get_attributes([attr_name])

Returns the model attributes.

get_match_index(x, col_list[, str_check])

Returns the matching index.

get_params()

Returns the parameters of the model.

get_plotting_lib([class_name, chart, ...])

Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.

get_vertica_attributes([attr_name])

Returns the model Vertica attributes.

import_models(path[, schema, kind])

Imports machine learning models.

inverse_transform(vdf[, X])

Applies the Inverse Model on a vDataFrame.

plot([dimensions, chart])

Draws a decomposition scatter plot.

plot_circle([dimensions, chart])

Draws a decomposition circle.

plot_scree([chart])

Draws a decomposition scree plot.

register(registered_name[, raise_error])

Registers the model and adds it to in-DB Model versioning environment with a status of 'under_review'.

score([X, input_relation, metric, p])

Returns the decomposition score on a dataset for each transformed column.

set_params([parameters])

Sets the parameters of the model.

summarize()

Summarizes the model.

to_binary(path)

Exports the model to the Vertica Binary format.

to_memmodel()

Converts the model to an InMemory object that can be used for different types of predictions.

to_pmml(path)

Exports the model to PMML.

to_python([return_proba, ...])

Returns the Python function needed for in-memory scoring without using built-in Vertica functions.

to_sql([X, return_proba, ...])

Returns the SQL code needed to deploy the model without using built-in Vertica functions.

to_tf(path)

Exports the model to the Frozen Graph format (TensorFlow).

transform([vdf, X, n_components, cutoff])

Applies the model on a vDataFrame.

Attributes