
verticapy.machine_learning.vertica.decomposition.MCA#

class verticapy.machine_learning.vertica.decomposition.MCA(name: str = None, overwrite_model: bool = False)#

Creates an MCA (multiple correspondence analysis) object using the Vertica PCA algorithm. MCA is a PCA applied to a complete disjunctive table: the input relation is transformed into a TCDT (transformed complete disjunctive table) before the PCA is applied.

Important

This algorithm is not Vertica-native and relies solely on SQL to compute its attributes. While the model therefore cannot take advantage of model management features such as versioning and tracking, the SQL code it generates can still be used to create a pipeline.

Parameters#

name: str, optional

Name of the model. The model is stored in the database.

overwrite_model: bool, optional

If set to True, training a model with the same name as an existing model overwrites the existing model.
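
For example, a model can be created under an explicit name and safely overwritten on a later retrain (the model name below is purely illustrative):

from verticapy.machine_learning.vertica import MCA

# 'mca_titanic' is a hypothetical model name
model = MCA(name="mca_titanic", overwrite_model=True)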

Attributes#

Many attributes are created during the fitting phase.

principal_components_: numpy.array

Matrix of the principal components.

mean_: numpy.array

List of the averages of each input feature.

cos2_: numpy.array

Quality of representation of each observation in the principal component space. A high cos2 value indicates that the observation is well-represented in the reduced-dimensional space defined by the principal components, while a low value suggests poor representation.

explained_variance_: numpy.array

Represents the proportion of the total variance in the original dataset that is captured by a specific principal component or a combination of principal components.

Note

All attributes can be accessed using the get_attributes() method.
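
For example, to retrieve a single attribute by name:

# Call with no argument to list all available attributes
model.get_attributes("principal_components_")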

Note

Several other attributes can be accessed by using the get_vertica_attributes() method.
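
For example:

# Inspect the attributes computed by the underlying Vertica PCA model
model.get_vertica_attributes()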

Examples#

The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning or the Examples section on the website.

Load data for machine learning#

We import verticapy:

import verticapy as vp

Hint

By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.

For this example, we will use the Titanic dataset.

import verticapy.datasets as vpd

data = vpd.load_titanic()

The titanic vDataFrame is displayed. It contains 14 columns: pclass (Integer), survived (Integer), name (Varchar(164)), sex (Varchar(20)), age (Numeric(8)), sibsp (Integer), parch (Integer), ticket (Varchar(36)), fare (Numeric(12)), cabin (Varchar(30)), embarked (Varchar(20)), boat (Varchar(100)), body (Integer), and home.dest (Varchar(100)).

Rows: 1-100 | Columns: 14

Note

VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use it effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.

Model Initialization#

First we import the MCA model:

from verticapy.machine_learning.vertica import MCA

Then we can create the model:

model = MCA()

You can select the number of components with the n_components parameter of methods such as transform() and deploySQL(); if it is not provided, all components are kept.

Important

As this model is not native, it relies solely on SQL statements to compute various attributes, which are stored within the object. No data is saved in the database.

Model Training#

Before fitting the model, we need to compute the TCDT (transformed complete disjunctive table):

tcdt = data[["survived", "pclass", "sex"]].cdt()
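
You can inspect the one-hot encoded columns of the TCDT before fitting, for example:

# Lists encoded columns such as "survived_0", "survived_1", "pclass_1", ...
tcdt.get_columns()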

We can now fit the model:

model.fit(tcdt)

Important

To train a model, you can directly use the vDataFrame or the name of the relation stored in the database.
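
For example, a minimal sketch of fitting from a stored relation (the relation name below is hypothetical):

# 'public.titanic_tcdt' is a placeholder for a relation stored in the database
model.fit("public.titanic_tcdt")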

Scores#

The decomposition score on the dataset for each transformed column can be calculated by:

model.score()
Out[6]: 
None            Score  
survived_0        0.0  
survived_1        0.0  
pclass_1          0.0  
pclass_2          0.0  
pclass_3          0.0  
sex_female        0.0  
sex_male          0.0  
Rows: 1-7 | Columns: 2

For more details on the function, check out score().

You can also fetch the explained variance by:

model.explained_variance_
Out[7]: 
array([3.71602747e-01, 3.42234698e-01, 2.01530443e-01, 8.46321118e-02,
       5.05034016e-16, 1.63467134e-16, 4.43412970e-16])
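
Since explained_variance_ is a numpy.array, you can, for example, compute the cumulative proportion of variance captured by the leading components:

import numpy as np

# Cumulative variance captured by the first k components
np.cumsum(model.explained_variance_)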

Principal Components#

To get the transformed dataset in the form of principal components:

model.transform(tcdt)
Out[8]: 
None                  col1                     col2                    col3                     col4                    col5
1      0.00229204335572662     0.000321740516013252    0.000797337308334691     -0.00224766909385777    1.56423268435346e-16
2      0.00146911605526505    -0.000826178800615709     0.00235861847839741    -0.000590677084986731    1.59025353649311e-16
3      0.00229204335572662     0.000321740516013252    0.000797337308334691     -0.00224766909385777    1.56423268435346e-16
4      0.00146911605526505    -0.000826178800615709     0.00235861847839741    -0.000590677084986731    1.59025353649311e-16
5      0.00146911605526505    -0.000826178800615709     0.00235861847839741    -0.000590677084986731    1.59025353649311e-16
...
100    0.00146911605526505    -0.000826178800615709     0.00235861847839741    -0.000590677084986731    1.59025353649311e-16
Rows: 1-100 | Columns: 5

In this display the rows collapse to two distinct component vectors: rows 1, 3, 35, and 57 (the female passengers in this sample) share one projection, and all remaining rows share the other.

Please refer to transform() for more details on transforming a vDataFrame.
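
To keep only the leading components, you can pass the n_components argument to transform(), for example:

# Keep only the first two principal components
model.transform(tcdt, n_components=2)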

Similarly, you can perform the inverse transform to recover the original features using:

model.inverse_transform(data_transformed)

The variable data_transformed includes the MCA components.
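
Putting the two steps together, a minimal round trip looks like this:

# Project onto the principal components, then map back to the original space
data_transformed = model.transform(tcdt)
model.inverse_transform(data_transformed)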

Plots - MCA#

You can plot the first two dimensions conveniently using:

model.plot()

Plots - Scree#

You can also draw the scree plot:

model.plot_scree()

Plots - Decomposition Circle#

You can also draw the decomposition circle:

model.plot_circle()

Model Register#

As this model is not native, it does not support model management and versioning. However, it is possible to use the SQL code it generates for deployment.

Model Exporting#

To MemModel

model.to_memmodel()

Note

MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.

The preceding methods for exporting the model use MemModel, and it is recommended to use MemModel directly.
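
A minimal sketch of in-memory scoring, assuming the exported decomposition object exposes a transform() method:

# Export the model to memory and transform one encoded row
mmodel = model.to_memmodel()
mmodel.transform([[0, 1, 0, 1, 1, 0, 1]])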

SQL

To get the SQL query, use the following:

model.to_sql()
Out[9]: 
['("survived_0" - -0.999189627228525) * -0.181634302254853 + ("survived_1" - -0.999189627228526) * 0.316447317706109 + ("pclass_1" - -0.999189627228525) * 0.648146903282904 + ("pclass_2" - -0.999189627228525) * -0.588909547134059 + ("pclass_3" - -0.999189627228525) * -0.0749536366765789 + ("sex_female" - -0.999189627228525) * 0.272960421482605 + ("sex_male" - -0.999189627228525) * -0.140839529511909',
 '("survived_0" - -0.999189627228525) * -0.232487358829214 + ("survived_1" - -0.999189627228526) * 0.405044642937848 + ("pclass_1" - -0.999189627228525) * 0.123041881257647 + ("pclass_2" - -0.999189627228525) * 0.689909726160874 + ("pclass_3" - -0.999189627228525) * -0.327414307734689 + ("sex_female" - -0.999189627228525) * 0.380758470789969 + ("sex_male" - -0.999189627228525) * -0.196460144633645',
 '("survived_0" - -0.999189627228525) * 0.134292054542379 + ("survived_1" - -0.999189627228526) * -0.233966601691524 + ("pclass_1" - -0.999189627228525) * 0.612207252170766 + ("pclass_2" - -0.999189627228525) * 0.251905271494925 + ("pclass_3" - -0.999189627228525) * -0.386503963792653 + ("sex_female" - -0.999189627228525) * -0.517868305005975 + ("sex_male" - -0.999189627228525) * 0.267204776538708',
 '("survived_0" - -0.999189627228525) * -0.377802794939545 + ("survived_1" - -0.999189627228526) * 0.658216424961079 + ("pclass_1" - -0.999189627228525) * -0.169709763121011 + ("pclass_2" - -0.999189627228525) * -0.0518795498186791 + ("pclass_3" - -0.999189627228525) * 0.100130089738773 + ("sex_female" - -0.999189627228525) * -0.549615059411762 + ("sex_male" - -0.999189627228525) * 0.28358516578985',
 '("survived_0" - -0.999189627228525) * -0.224208471317908 + ("survived_1" - -0.999189627228526) * -0.128691086853437 + ("pclass_1" - -0.999189627228525) * 0.373337373303813 + ("pclass_2" - -0.999189627228525) * 0.309917883607947 + ("pclass_3" - -0.999189627228525) * 0.793341918270386 + ("sex_female" - -0.999189627228525) * -0.119875894561693 + ("sex_male" - -0.999189627228525) * -0.232330900412423',
 '("survived_0" - -0.999189627228525) * -0.502650716476987 + ("survived_1" - -0.999189627228526) * -0.288511253080016 + ("pclass_1" - -0.999189627228525) * 0.0269792637627993 + ("pclass_2" - -0.999189627228525) * 0.0223962478030921 + ("pclass_3" - -0.999189627228525) * 0.0573309354959348 + ("sex_female" - -0.999189627228525) * 0.372396385080935 + ("sex_male" - -0.999189627228525) * 0.721739660609242',
 '("survived_0" - -0.999189627228525) * 0.670270183632036 + ("survived_1" - -0.999189627228526) * 0.384721406421604 + ("pclass_1" - -0.999189627228525) * 0.145115433134172 + ("pclass_2" - -0.999189627228525) * 0.120464414044063 + ("pclass_3" - -0.999189627228525) * 0.308370295410028 + ("sex_female" - -0.999189627228525) * 0.239169401566029 + ("sex_male" - -0.999189627228525) * 0.463533078273215']

To Python

To obtain the prediction function in Python syntax, use the following code:

X = [[0, 1, 0, 1, 1, 0, 1]]

model.to_python()(X)
Out[11]: 
array([[-0.23724135,  1.41279017,  0.02580683,  0.88306341,  1.51310344,
         0.92230412,  3.60684391]])

Hint

The to_python() method is used to retrieve the principal component values. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.

__init__(name: str = None, overwrite_model: bool = False) None#

Must be overridden in the child class

Methods

__init__([name, overwrite_model])

Must be overridden in the child class

contour([nbins, chart])

Draws the model's contour plot.

deployInverseSQL([key_columns, ...])

Returns the SQL code needed to deploy the inverse model.

deploySQL([X, n_components, cutoff, ...])

Returns the SQL code needed to deploy the model.

does_model_exists(name[, raise_error, ...])

Checks whether the model is stored in the Vertica database.

drop()

Drops the model from the Vertica database.

export_models(name, path[, kind])

Exports machine learning models.

fit(input_relation[, X, return_report])

Trains the model.

get_attributes([attr_name])

Returns the model attributes.

get_match_index(x, col_list[, str_check])

Returns the matching index.

get_params()

Returns the parameters of the model.

get_plotting_lib([class_name, chart, ...])

Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.

get_vertica_attributes([attr_name])

Returns the model Vertica attributes.

import_models(path[, schema, kind])

Imports machine learning models.

inverse_transform(vdf[, X])

Applies the Inverse Model on a vDataFrame.

plot([dimensions, chart])

Draws a decomposition scatter plot.

plot_circle([dimensions, chart])

Draws a decomposition circle.

plot_contrib([dimension, chart])

Draws a decomposition contribution plot of the input dimension.

plot_cos2([dimensions, chart])

Draws an MCA (multiple correspondence analysis) cos2 plot of the two input dimensions.

plot_scree([chart])

Draws a decomposition scree plot.

plot_var([dimensions, method, chart])

Draws the MCA (multiple correspondence analysis) graph.

register(registered_name[, raise_error])

Registers the model and adds it to the in-DB model versioning environment with a status of 'under_review'.

score([X, input_relation, metric, p])

Returns the decomposition score on a dataset for each transformed column.

set_params([parameters])

Sets the parameters of the model.

summarize()

Summarizes the model.

to_binary(path)

Exports the model to the Vertica Binary format.

to_memmodel()

Converts the model to an InMemory object that can be used for different types of predictions.

to_pmml(path)

Exports the model to PMML.

to_python([return_proba, ...])

Returns the Python function needed for in-memory scoring without using built-in Vertica functions.

to_sql([X, return_proba, ...])

Returns the SQL code needed to deploy the model without using built-in Vertica functions.

to_tf(path)

Exports the model to the Frozen Graph format (TensorFlow).

transform([vdf, X, n_components, cutoff])

Applies the model on a vDataFrame.

Attributes