verticapy.machine_learning.vertica.decomposition.MCA#
- class verticapy.machine_learning.vertica.decomposition.MCA(name: str = None, overwrite_model: bool = False)#
Creates an MCA (multiple correspondence analysis) object using the Vertica PCA algorithm. MCA is a PCA applied to a complete disjunctive table. The input relation is transformed to a TCDT (transformed complete disjunctive table) before the PCA is applied.
Important
This algorithm is not Vertica Native and relies solely on SQL for attribute computation. While this model does not take advantage of the benefits provided by a model management system, including versioning and tracking, the SQL code it generates can still be used to create a pipeline.
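For intuition, a complete disjunctive table is the indicator (one-hot) encoding of the categorical columns; the TCDT then applies an additional transformation on top of it. The following is a minimal sketch using pandas (not part of VerticaPy) showing what the disjunctive encoding looks like:
import pandas as pd

# Two categorical columns from a toy dataset.
df = pd.DataFrame({"pclass": ["1", "3", "2"], "sex": ["female", "male", "male"]})

# The complete disjunctive table: one 0/1 indicator column per category.
cdt = pd.get_dummies(df)
print(cdt)  # columns: pclass_1, pclass_2, pclass_3, sex_female, sex_male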
Parameters#
- name: str, optional
Name of the model. The model is stored in the database.
- overwrite_model: bool, optional
If set to True, training a model with the same name as an existing model overwrites the existing model.
Attributes#
Many attributes are created during the fitting phase.
- principal_components_: numpy.array
Matrix of the principal components.
- mean_: numpy.array
List of the averages of each input feature.
- cos2_: numpy.array
Quality of representation of each observation in the principal component space. A high cos2 value indicates that the observation is well-represented in the reduced-dimensional space defined by the principal components, while a low value suggests poor representation.
- explained_variance_: numpy.array
Represents the proportion of the total variance in the original dataset that is captured by a specific principal component or a combination of principal components.
Note
All attributes can be accessed using the get_attributes() method.
Note
Several other attributes can be accessed by using the get_vertica_attributes() method.
Examples#
The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning or the Examples section on the website.
Load data for machine learning#
We import verticapy:
import verticapy as vp
Hint
By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended, without interfering with functions from other libraries.
For this example, we will use the Titanic dataset.
import verticapy.datasets as vpd
data = vpd.load_titanic()
(Output: the first 100 rows of the titanic dataset. Columns: pclass (Integer), survived (Integer), name (Varchar(164)), sex (Varchar(20)), age (Numeric(8)), sibsp (Integer), parch (Integer), ticket (Varchar(36)), fare (Numeric(12)), cabin (Varchar(30)), embarked (Varchar(20)), boat (Varchar(100)), body (Integer), home.dest (Varchar(100)). Rows: 1-100 | Columns: 14)
Note
VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use them effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.
Model Initialization#
First we import the MCA model:
from verticapy.machine_learning.vertica import MCA
Then we can create the model:
model = MCA()
You can select the number of components to keep with the n_components parameter of methods such as transform() and deploySQL(); if it is not provided, all components are considered.
Important
As this model is not native, it solely relies on SQL statements to compute various attributes, storing them within the object. No data is saved in the database.
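For example, you can name the model and allow it to overwrite an existing model of the same name (the model name here is illustrative):
model = MCA(name="titanic_mca", overwrite_model=True)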
Model Training#
Before fitting the model, we need to compute the TCDT (transformed complete disjunctive table):
tcdt = data[["survived", "pclass", "sex"]].cdt()
We can now fit the model:
model.fit(tcdt)
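As noted below, you can also pass the relation name directly instead of a vDataFrame; a sketch, assuming we first save the TCDT to the database under a hypothetical name:
tcdt.to_db("titanic_tcdt")  # hypothetical relation name
model.fit("titanic_tcdt")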
Important
To train a model, you can directly use the vDataFrame or the name of the relation stored in the database.
Scores#
The decomposition score on the dataset for each transformed column can be calculated by:
model.score()

Out[6]:
             Score
survived_0     0.0
survived_1     0.0
pclass_1       0.0
pclass_2       0.0
pclass_3       0.0
sex_female     0.0
sex_male       0.0
Rows: 1-7 | Columns: 2
For more details on the function, check out score().
You can also fetch the explained variance by:
model.explained_variance_

Out[7]: array([3.71602747e-01, 3.42234698e-01, 2.01530443e-01, 8.46321118e-02, 5.05034016e-16, 1.63467134e-16, 4.43412970e-16])
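A common follow-up is to count how many components are needed to reach a given share of the total variance; a short sketch using numpy:
import numpy as np

cumulative = np.cumsum(model.explained_variance_)
# index of the first component at which cumulative variance reaches 95%
n_needed = int(np.searchsorted(cumulative, 0.95) + 1)
print(n_needed)  # 4 for the values above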
Principal Components#
To get the transformed dataset in the form of principal components:
model.transform(tcdt)

(Output: a 100-row × 5-column table of principal-component coordinates, col1 through col5. For example, row 1 is [0.00229204335572662, 0.000321740516013252, 0.000797337308334691, -0.00224766909385777, 1.56423268435346e-16]. Rows: 1-100 | Columns: 5)
Please refer to transform() for more details on transforming a vDataFrame.
Similarly, you can perform the inverse transform to get the original features using:
model.inverse_transform(data_transformed)
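A minimal round trip, for instance (variable names are illustrative):
data_transformed = model.transform(tcdt)  # project the TCDT onto the components
data_original = model.inverse_transform(data_transformed)  # map back to the input space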
The variable data_transformed includes the MCA components.
Plots - MCA#
You can plot the first two dimensions conveniently using:
model.plot()
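By default the first two dimensions are drawn; to inspect another pair, you can pass the dimensions parameter (a sketch, assuming 1-indexed component numbers):
model.plot(dimensions=(2, 3))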
Plots - Scree#
You can also plot the Scree plot:
model.plot_scree()
Plots - Decomposition Circle#
You can also plot the Decomposition Circles:
model.plot_circle()
Model Register#
As this model is not native, it does not support model management and versioning. However, it is possible to use the SQL code it generates for deployment.
Model Exporting#
To MemModel
model.to_memmodel()
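For instance, the in-memory object can score rows without touching the database (a sketch, assuming the returned decomposition MemModel exposes a transform() method):
mm = model.to_memmodel()
X = [[0, 1, 0, 1, 1, 0, 1]]  # one row in TCDT encoding
mm.transform(X)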
Note
MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.
The preceding methods for exporting the model use MemModel, and it is recommended to use MemModel directly.
SQL
To get the SQL query, use the following:
model.to_sql()

Out[9]:
['("survived_0" - -0.999189627228525) * -0.181634302254853 + ("survived_1" - -0.999189627228526) * 0.316447317706109 + ("pclass_1" - -0.999189627228525) * 0.648146903282904 + ("pclass_2" - -0.999189627228525) * -0.588909547134059 + ("pclass_3" - -0.999189627228525) * -0.0749536366765789 + ("sex_female" - -0.999189627228525) * 0.272960421482605 + ("sex_male" - -0.999189627228525) * -0.140839529511909',
 '("survived_0" - -0.999189627228525) * -0.232487358829214 + ("survived_1" - -0.999189627228526) * 0.405044642937848 + ("pclass_1" - -0.999189627228525) * 0.123041881257647 + ("pclass_2" - -0.999189627228525) * 0.689909726160874 + ("pclass_3" - -0.999189627228525) * -0.327414307734689 + ("sex_female" - -0.999189627228525) * 0.380758470789969 + ("sex_male" - -0.999189627228525) * -0.196460144633645',
 '("survived_0" - -0.999189627228525) * 0.134292054542379 + ("survived_1" - -0.999189627228526) * -0.233966601691524 + ("pclass_1" - -0.999189627228525) * 0.612207252170766 + ("pclass_2" - -0.999189627228525) * 0.251905271494925 + ("pclass_3" - -0.999189627228525) * -0.386503963792653 + ("sex_female" - -0.999189627228525) * -0.517868305005975 + ("sex_male" - -0.999189627228525) * 0.267204776538708',
 '("survived_0" - -0.999189627228525) * -0.377802794939545 + ("survived_1" - -0.999189627228526) * 0.658216424961079 + ("pclass_1" - -0.999189627228525) * -0.169709763121011 + ("pclass_2" - -0.999189627228525) * -0.0518795498186791 + ("pclass_3" - -0.999189627228525) * 0.100130089738773 + ("sex_female" - -0.999189627228525) * -0.549615059411762 + ("sex_male" - -0.999189627228525) * 0.28358516578985',
 '("survived_0" - -0.999189627228525) * -0.224208471317908 + ("survived_1" - -0.999189627228526) * -0.128691086853437 + ("pclass_1" - -0.999189627228525) * 0.373337373303813 + ("pclass_2" - -0.999189627228525) * 0.309917883607947 + ("pclass_3" - -0.999189627228525) * 0.793341918270386 + ("sex_female" - -0.999189627228525) * -0.119875894561693 + ("sex_male" - -0.999189627228525) * -0.232330900412423',
 '("survived_0" - -0.999189627228525) * -0.502650716476987 + ("survived_1" - -0.999189627228526) * -0.288511253080016 + ("pclass_1" - -0.999189627228525) * 0.0269792637627993 + ("pclass_2" - -0.999189627228525) * 0.0223962478030921 + ("pclass_3" - -0.999189627228525) * 0.0573309354959348 + ("sex_female" - -0.999189627228525) * 0.372396385080935 + ("sex_male" - -0.999189627228525) * 0.721739660609242',
 '("survived_0" - -0.999189627228525) * 0.670270183632036 + ("survived_1" - -0.999189627228526) * 0.384721406421604 + ("pclass_1" - -0.999189627228525) * 0.145115433134172 + ("pclass_2" - -0.999189627228525) * 0.120464414044063 + ("pclass_3" - -0.999189627228525) * 0.308370295410028 + ("sex_female" - -0.999189627228525) * 0.239169401566029 + ("sex_male" - -0.999189627228525) * 0.463533078273215']
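Since the model is SQL-only, these expressions can be embedded directly in a query to build a deployment pipeline; a sketch, where titanic_tcdt is a hypothetical relation holding the TCDT columns:
exprs = model.to_sql()
select_list = ", ".join(f"{e} AS mca{i + 1}" for i, e in enumerate(exprs))
query = f"SELECT {select_list} FROM titanic_tcdt"  # hypothetical relation name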
To Python
To obtain the prediction function in Python syntax, use the following code:
X = [[0, 1, 0, 1, 1, 0, 1]]
model.to_python()(X)

Out[11]: array([[-0.23724135, 1.41279017, 0.02580683, 0.88306341, 1.51310344, 0.92230412, 3.60684391]])
Hint
The to_python() method is used to retrieve the Principal Component values. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.
- __init__(name: str = None, overwrite_model: bool = False) → None#
Must be overridden in the child class
Methods
__init__([name, overwrite_model]): Must be overridden in the child class.
contour([nbins, chart]): Draws the model's contour plot.
deployInverseSQL([key_columns, ...]): Returns the SQL code needed to deploy the inverse model.
deploySQL([X, n_components, cutoff, ...]): Returns the SQL code needed to deploy the model.
does_model_exists(name[, raise_error, ...]): Checks whether the model is stored in the Vertica database.
drop(): Drops the model from the Vertica database.
export_models(name, path[, kind]): Exports machine learning models.
fit(input_relation[, X, return_report]): Trains the model.
get_attributes([attr_name]): Returns the model attributes.
get_match_index(x, col_list[, str_check]): Returns the matching index.
get_params(): Returns the parameters of the model.
get_plotting_lib([class_name, chart, ...]): Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.
get_vertica_attributes([attr_name]): Returns the model Vertica attributes.
import_models(path[, schema, kind]): Imports machine learning models.
inverse_transform(vdf[, X]): Applies the inverse model on a vDataFrame.
plot([dimensions, chart]): Draws a decomposition scatter plot.
plot_circle([dimensions, chart]): Draws a decomposition circle.
plot_contrib([dimension, chart]): Draws a decomposition contribution plot of the input dimension.
plot_cos2([dimensions, chart]): Draws an MCA (multiple correspondence analysis) cos2 plot of the two input dimensions.
plot_scree([chart]): Draws a decomposition scree plot.
plot_var([dimensions, method, chart]): Draws the MCA (multiple correspondence analysis) graph.
register(registered_name[, raise_error]): Registers the model and adds it to the in-DB model versioning environment with a status of 'under_review'.
score([X, input_relation, metric, p]): Returns the decomposition score on a dataset for each transformed column.
set_params([parameters]): Sets the parameters of the model.
summarize(): Summarizes the model.
to_binary(path): Exports the model to the Vertica binary format.
to_memmodel(): Converts the model to an InMemory object that can be used for different types of predictions.
to_pmml(path): Exports the model to PMML.
to_python([return_proba, ...]): Returns the Python function needed for in-memory scoring without using built-in Vertica functions.
to_sql([X, return_proba, ...]): Returns the SQL code needed to deploy the model without using built-in Vertica functions.
to_tf(path): Exports the model to the Frozen Graph format (TensorFlow).
transform([vdf, X, n_components, cutoff]): Applies the model on a vDataFrame.
Attributes