verticapy.machine_learning.vertica.decomposition.PCA#

class verticapy.machine_learning.vertica.decomposition.PCA(name: str = None, overwrite_model: bool = False, n_components: int = 0, scale: bool = False, method: Literal['lapack'] = 'lapack')#

Creates a PCA (Principal Component Analysis) object using the Vertica PCA algorithm.

Parameters#

name: str, optional

Name of the model. The model is stored in the database.

overwrite_model: bool, optional

If set to True, training a model with the same name as an existing model overwrites the existing model.

n_components: int, optional

The number of components to keep in the model. If this value is not provided, all components are kept. The maximum number of components is the number of non-zero singular values returned by the internal call to SVD. This number is less than or equal to the minimum of the number of columns and the number of rows.

scale: bool, optional

A Boolean value that specifies whether to standardize the columns during the preparation step.

method: str, optional

The method used to calculate PCA.

  • lapack:

    Lapack definition.

Attributes#

Many attributes are created during the fitting phase.

principal_components_: numpy.array

Matrix of the principal components.

mean_: numpy.array

List of the averages of each input feature.

cos2_: numpy.array

Quality of representation of each observation in the principal component space. A high cos2 value indicates that the observation is well-represented in the reduced-dimensional space defined by the principal components, while a low value suggests poor representation.

explained_variance_: numpy.array

Represents the proportion of the total variance in the original dataset that is captured by a specific principal component or a combination of principal components.

Note

All attributes can be accessed using the get_attributes() method.

Note

Several other attributes can be accessed by using the get_vertica_attributes() method.
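For example, once the model has been fitted, attributes can be listed and read as follows (a minimal sketch, assuming a fitted model named model):

# List the available attribute names, then access them directly:
model.get_attributes()
components = model.principal_components_
averages = model.mean_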

Examples#

The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning or the Examples section on the website.

Load data for machine learning#

We import verticapy:

import verticapy as vp

Hint

By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.

For this example, we will use the winequality dataset.

import verticapy.datasets as vpd

data = vpd.load_winequality()
The interactive preview of the winequality vDataFrame shows the following 14 columns: fixed_acidity (Numeric), volatile_acidity (Numeric), citric_acid (Numeric), residual_sugar (Numeric), chlorides (Float), free_sulfur_dioxide (Numeric), total_sulfur_dioxide (Numeric), density (Float), pH (Numeric), sulphates (Numeric), alcohol (Float), quality (Integer), good (Integer), and color (Varchar).
Rows: 1-100 | Columns: 14

Note

VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use it effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.

We can drop the “color” column, as it is of varchar type and PCA requires numerical inputs.

data.drop("color")

Model Initialization#

First we import the PCA model:

from verticapy.machine_learning.vertica import PCA

Then we can create the model:

model = PCA(
    n_components = 3,
)

You can select the number of components with the n_components parameter. If it is not provided, all components are kept.
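For instance, a minimal sketch of a model that keeps every component (the variable name is illustrative):

model_full = PCA()  # n_components not provided: all components are kept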

Hint

In verticapy 1.0.x and higher, you do not need to specify the model name, as the name is automatically assigned. If you need to re-use the model, you can fetch the model name from the model’s attributes.

Important

The model name is crucial for the model management system and versioning. It’s highly recommended to provide a name if you plan to reuse the model later.
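As a hedged sketch, a named model intended for reuse could be created as follows (the model name below is illustrative):

model_named = PCA(
    name = "pca_winequality",  # illustrative model name
    overwrite_model = True,    # replace any existing model with the same name
    n_components = 3,
)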

Model Training#

We can now fit the model:

model.fit(data)

Important

To train a model, you can directly use the vDataFrame or the name of the relation stored in the database.
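Both options are sketched below (the relation name is illustrative and assumes the dataset was loaded into that schema):

# Train directly from the vDataFrame:
model.fit(data)

# Or train from the name of the relation stored in the database:
model.fit("public.winequality")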

Scores#

The decomposition score on the dataset for each transformed column can be calculated by:

model.score()
Out[4]: 
None                                   Score  
fixed_acidity               3.26018188659356  
volatile_acidity           0.255704870531137  
citric_acid                0.232055113862187  
residual_sugar             0.784335096491506  
chlorides                 0.0412739282041459  
free_sulfur_dioxide        0.743079726641283  
total_sulfur_dioxide        1.03787465340886  
density                   0.0561515090418321  
pH                         0.801959298244123  
sulphates                  0.303876275255451  
alcohol                     4.02272456571807  
quality                     2.54477536599627  
good                       0.348243267430713  
Rows: 1-13 | Columns: 2

For more details on the function, check out score().

You can also fetch the explained variance by:

model.explained_variance_
Out[5]: array([0.95351037, 0.04062083, 0.00482552])
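As a hedged follow-up, the cumulative share of variance captured by the retained components can be derived from this array with numpy:

import numpy as np

# Cumulative proportion of variance explained by the first k components.
np.cumsum(model.explained_variance_)
# With the values above: array([0.95351037, 0.9941312 , 0.99895672])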

Principal Components#

To get the transformed dataset in the form of principal components:

model.transform(data)
Out[6]: 
None                col1                  col2                  col3  
1      -3.87539586294116     -9.70263141971311      5.73365140692485  
2        1.8007158864829     -1.98185721028904     -1.57290632355351  
3       58.2585435866957      50.2581718382341       -7.157700338295  
4       47.8458923519163      22.9883205947403     -2.56964241029969  
5       11.0115318397046     -2.12012882814391      -1.8375746016803  
6      -4.60521361896981       1.5341694499065     -2.75824074459435  
7      -13.2771414595065      25.2504890643806    -0.476520460754179  
8       55.3788310502415      46.7802985538193     -7.45917494667214  
9       55.8773719495786     -33.4658268951551     -5.76129791729379  
10     -54.6555678140085     -10.2240557382057     -1.16251198171631  
11     -24.1648594959502      10.2439927861505     -3.75116700698459  
12      59.5176275497709       25.258097509706     -7.09430396571515  
13     -9.05011526637347      4.63228831511623     -3.60981634879058  
14     -12.7149929321467     -10.9367841190281     -2.67613969311331  
15      -109.83712113806     -0.19489867094966    -0.211036337506947  
16      15.9284646994512     -11.5572095245047     -2.82109432623325  
17     -15.2310907537324      13.2721178525629     -4.02619559516123  
18     -6.66895840225276     -7.21300376171408     -2.27046351728112  
19      -6.5550265533149     -12.1686938056515      5.01741673181463  
20      17.5008031764373      12.6595659366785     -5.22319739599442  
21     -6.12385714274914     -6.09641638348886      5.34842459664887  
22      64.0218965257457      -7.6609298570904     -7.00387756953786  
23      46.8056869539979     -7.53188964107484    -0.739011856339026  
24      46.5746543609322     -8.49535015989575    -0.720852160376789  
25      -109.09098458572     -1.40699916293401    -0.100865448889536  
26      3.27549637771107      2.98554661691686      6.13681673815753  
27      35.0776851714978     0.281798297982036     -6.05124271309635  
28      59.5163276885018      25.2569270184036     -7.08677590412429  
29      59.5163276885018      25.2569270184036     -7.08677590412429  
30      26.6743114830131     -4.95190539016235     -5.45347556437632  
31      26.6743114830131     -4.95190539016235     -5.45347556437632  
32      -75.180559206192      2.90998249909939    -0.653853292464544  
33      38.0171694522185      21.1542722104449     -5.62104279390702  
34     -2.37547571780678     -6.22192340926697     -3.81553048439253  
35     -19.8139354364376     -6.17937852955219     -2.78776089879949  
36       3.2747555833307      2.98500940837631      6.13920308997688  
37     -17.4503747760159     -9.74850498290234     0.446947775612021  
38     -15.3204945867663      4.06143755060113     -2.83284045680938  
39      45.1607450949272      8.14262728404729     -6.62246702457769  
40      39.6841016220826      6.39508394279764     -4.60997393999157  
41      66.8204822700176      12.3302615531612     -4.15952189478502  
42      66.8204822700176      12.3302615531612     -4.15952189478502  
43      10.5149045897773      9.18318394872144     -4.85132644225315  
44      43.5859575590291      13.7940321018126     -2.06416136230967  
45      61.0027620426438      9.58292669483431     -4.70032804310602  
46      51.7904435247003     0.533685199469238     -1.19389604379981  
47     -23.1002361108527     -15.6854930106014      -2.7681976678508  
48     -23.1002361108527     -15.6854930106014      -2.7681976678508  
49     -23.1002361108527     -15.6854930106014      -2.7681976678508  
50      10.1884723538412      7.30376481892969     -1.48712788146904  
51      39.6146400834973     -16.9614288466675      4.85177815550988  
52     -1.14706454035176     -6.34637338860446      2.39197736130292  
53     -1.14706454035176     -6.34637338860446      2.39197736130292  
54     -55.3970836353244      8.44951985367211     -2.05075012096447  
55     -35.1489550264151      6.78834277464059    -0.126587947134508  
56     -66.7170483841066      3.98339148251936     -1.12767424697563  
57      14.1943699338795      4.69155400256368      12.4206564896857  
58     -9.05233764951458      4.63067668949458     -3.60265729333253  
59      48.0643491900927     -6.72080812877328     0.746048150327689  
60     -7.53034129056041      37.1090070862536     -4.78644698313109  
61     -71.3209829620148      1.95095263269743     -1.55326105628057  
62     -27.6243365464012      17.2212353028306     -3.24496967423472  
63      -19.002636112788      5.93811212781568     -3.28776432567215  
64     -29.4490299121844     -12.1146438784988     -2.59415770058483  
65      -29.750093053573     -8.99107737676142      -3.2848216598803  
66      32.0875872308678      4.17530649922571     -1.14257902778671  
67      35.2360399214631      9.46054341305244     -5.69064790913363  
68     -20.4892045223688     -13.2472443987493     -3.47184703179337  
69     -34.6741775069697      17.9008639524076     -3.22563541364455  
70     0.730202192986314      15.6320679596781     -4.78499660843747  
71     -17.2594071938456    -0.500180186642626      1.73125273716163  
72     -9.18358839473754     -1.33806872969595      3.15843640640076  
73     -9.18358839473754     -1.33806872969595      3.15843640640076  
74      35.3631549238248      1.24771708830033     -4.80588411457336  
75     -44.8177038153327     -3.32408322957084     -1.97193555518784  
76      34.4954873429935     -10.9184000917509     -5.77391014958169  
77     -29.0409734384488     -6.03520319370948      -2.7598636252927  
78     -29.0409734384488     -6.03520319370948      -2.7598636252927  
79     -29.0409734384488     -6.03520319370948      -2.7598636252927  
80      16.0557078794116      10.0876558355333     0.301566854171818  
81      5.46454014126889      4.38084910265817     0.914005555368653  
82      5.46454014126889      4.38084910265817     0.914005555368653  
83      5.46454014126889      4.38084910265817     0.914005555368653  
84     -46.2544093147873     -5.02925948152294     -1.95463187806248  
85     -30.0115077058734     -5.79669845533482     -2.61808240679804  
86     -14.7925554427983     -2.23755020006416     -3.49815587620536  
87     -72.8022460813843      4.36130645643567     -1.45815612704746  
88     -17.8181544771345     -10.7466772356141     -2.67193978657889  
89     -35.9331209476592      7.94028679930402     -1.34254106589017  
90     -32.8783174079216     -9.22839098854763     -2.47901890123736  
91     -59.3563097995857     -3.94947065130502     -1.50484516743191  
92     -59.3563097995857     -3.94947065130502     -1.50484516743191  
93     -38.8823966562381      4.48833970280952     -3.12535345595975  
94     -51.0910613999878     -8.03023822650388     -2.46212354452028  
95      10.6850783476395      14.2732474183979     -5.15136456874776  
96     -25.5351793082778     -8.88356110618463     -1.05583066878426  
97     -56.4856688259442      -0.5290378076123     -1.58351166931636  
98      71.7372841256507     -6.28542003701019     -2.06235268516737  
99      18.1733465755406     -2.74862458726538     0.954111381558329  
100     9.66114424275959     -12.0957510762698     -1.94944007569165  
Rows: 1-100 | Columns: 3

Please refer to transform() for more details on transforming a vDataFrame.

Similarly, you can perform the inverse transform to get back the original features using:

model.inverse_transform(data_transformed)

The variable data_transformed includes the PCA components.
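A minimal sketch of the full round trip, creating data_transformed from the fitted model first:

# Project the data onto the principal components, then map it back
# to the original feature space.
data_transformed = model.transform(data)
data_reconstructed = model.inverse_transform(data_transformed)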

Plots - PCA#

You can plot the first two components conveniently using:

model.plot()

Plots - Scree#

You can also plot the Scree plot:

model.plot_scree()

Parameter Modification#

In order to see the parameters:

model.get_params()
Out[7]: {'n_components': 3, 'scale': False, 'method': 'lapack'}

And to manually change some of the parameters:

model.set_params({'n_components': 3})

Model Register#

In order to register the model for tracking and versioning:

model.register("model_v1")

Please refer to Model Tracking and Versioning for more details on model tracking and versioning.

Model Exporting#

To Memmodel

model.to_memmodel()

Note

MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.

The preceding methods for exporting the model use MemModel, and it is recommended to use MemModel directly.
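As a hedged sketch (assuming the in-memory PCA object exposes a transform() method, as MemModel decomposition objects do), the exported model can transform data without a database connection:

mm = model.to_memmodel()

# In-memory transformation of a single observation
# (features in the same order as during training):
X = [[3.8, 0.3, 0.02, 11, 0.03, 20, 113, 0.99, 3, 0.4, 12, 6, 0]]
mm.transform(X)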

SQL

To get the SQL query, use the code below:

model.to_sql()
Out[9]: 
['("fixed_acidity" - 7.21530706479914) * -0.00740794380368571 + ("volatile_acidity" - 0.339665999692166) * -0.00118432129646007 + ("citric_acid" - 0.318633215330152) * 0.000486867004519229 + ("residual_sugar" - 5.44323533938741) * 0.0410197294279797 + ("chlorides" - 0.0560338617823611) * -0.000168197431343702 + ("free_sulfur_dioxide" - 30.5253193781745) * 0.230481489722234 + ("total_sulfur_dioxide" - 115.744574418963) * 0.972166693841427 + ("density" - 0.994696633830999) * 1.7724964231297e-06 + ("pH" - 3.21850084654456) * -0.000655520779893176 + ("sulphates" - 0.531268277666615) * -0.000704339207760154 + ("alcohol" - 10.4918008311528) * -0.0054518247482017 + ("quality" - 5.81837771279052) * -0.000532705277259338 + ("good" - 0.196552254886871) * -0.000326386219750347',
 '("fixed_acidity" - 7.21530706479914) * -0.00537208540551764 + ("volatile_acidity" - 0.339665999692166) * -0.000787179265926509 + ("citric_acid" - 0.318633215330152) * -0.000247100931315118 + ("residual_sugar" - 5.44323533938741) * 0.0186262696200211 + ("chlorides" - 0.0560338617823611) * 6.67995503853851e-05 + ("free_sulfur_dioxide" - 30.5253193781745) * 0.972615829133942 + ("total_sulfur_dioxide" - 115.744574418963) * -0.231392918488152 + ("density" - 0.994696633830999) * 1.27197739295473e-06 + ("pH" - 3.21850084654456) * 0.00064802744147177 + ("sulphates" - 0.531268277666615) * 0.000346564126401858 + ("alcohol" - 10.4918008311528) * 0.00288222663864476 + ("quality" - 5.81837771279052) * 0.00915670157688827 + ("good" - 0.196552254886871) * 0.00256735969031283',
 '("fixed_acidity" - 7.21530706479914) * 0.0238635181935159 + ("volatile_acidity" - 0.339665999692166) * 0.000908752011459457 + ("citric_acid" - 0.318633215330152) * 0.0019208440336196 + ("residual_sugar" - 5.44323533938741) * 0.995191758737064 + ("chlorides" - 0.0560338617823611) * 0.000177543760783315 + ("free_sulfur_dioxide" - 30.5253193781745) * -0.0271071934686967 + ("total_sulfur_dioxide" - 115.744574418963) * -0.0358591349601494 + ("density" - 0.994696633830999) * 0.000460901781338666 + ("pH" - 3.21850084654456) * -0.00691133676962082 + ("sulphates" - 0.531268277666615) * -0.00193655721535319 + ("alcohol" - 10.4918008311528) * -0.0826607385648245 + ("quality" - 5.81837771279052) * -0.00888756383923461 + ("good" - 0.196552254886871) * -0.00592921565549895']

To Python

To obtain the prediction function in Python syntax, use the following code:

X = [[3.8, 0.3, 0.02, 11, 0.03, 20, 113, 0.99, 3, 0.4, 12, 6, 0]]

model.to_python()(X)
Out[11]: array([[-4.84895114, -9.47474643,  5.70830533]])

Hint

The to_python() method is used to retrieve the Principal Component values. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.

__init__(name: str = None, overwrite_model: bool = False, n_components: int = 0, scale: bool = False, method: Literal['lapack'] = 'lapack') None#

Must be overridden in the child class

Methods

__init__([name, overwrite_model, ...])

Must be overridden in the child class

contour([nbins, chart])

Draws the model's contour plot.

deployInverseSQL([key_columns, ...])

Returns the SQL code needed to deploy the inverse model.

deploySQL([X, n_components, cutoff, ...])

Returns the SQL code needed to deploy the model.

does_model_exists(name[, raise_error, ...])

Checks whether the model is stored in the Vertica database.

drop()

Drops the model from the Vertica database.

export_models(name, path[, kind])

Exports machine learning models.

fit(input_relation[, X, return_report])

Trains the model.

get_attributes([attr_name])

Returns the model attributes.

get_match_index(x, col_list[, str_check])

Returns the matching index.

get_params()

Returns the parameters of the model.

get_plotting_lib([class_name, chart, ...])

Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.

get_vertica_attributes([attr_name])

Returns the model Vertica attributes.

import_models(path[, schema, kind])

Imports machine learning models.

inverse_transform(vdf[, X])

Applies the Inverse Model on a vDataFrame.

plot([dimensions, chart])

Draws a decomposition scatter plot.

plot_circle([dimensions, chart])

Draws a decomposition circle.

plot_scree([chart])

Draws a decomposition scree plot.

register(registered_name[, raise_error])

Registers the model and adds it to in-DB Model versioning environment with a status of 'under_review'.

score([X, input_relation, metric, p])

Returns the decomposition score on a dataset for each transformed column.

set_params([parameters])

Sets the parameters of the model.

summarize()

Summarizes the model.

to_binary(path)

Exports the model to the Vertica Binary format.

to_memmodel()

Converts the model to an InMemory object that can be used for different types of predictions.

to_pmml(path)

Exports the model to PMML.

to_python([return_proba, ...])

Returns the Python function needed for in-memory scoring without using built-in Vertica functions.

to_sql([X, return_proba, ...])

Returns the SQL code needed to deploy the model without using built-in Vertica functions.

to_tf(path)

Exports the model to the Frozen Graph format (TensorFlow).

transform([vdf, X, n_components, cutoff])

Applies the model on a vDataFrame.

Attributes