
verticapy.machine_learning.vertica.ensemble.RandomForestRegressor¶
- class verticapy.machine_learning.vertica.ensemble.RandomForestRegressor(name: str = None, overwrite_model: bool = False, n_estimators: int = 10, max_features: Literal['auto', 'max'] | int = 'auto', max_leaf_nodes: Annotated[int | float | Decimal, 'Python Numbers'] = 1000000000.0, sample: float = 0.632, max_depth: int = 5, min_samples_leaf: int = 1, min_info_gain: Annotated[int | float | Decimal, 'Python Numbers'] = 0.0, nbins: int = 32)¶
Creates a RandomForestRegressor object using the Vertica RF_REGRESSOR function. It is an ensemble learning method for regression that operates by constructing a multitude of decision trees at training time and outputting the mean prediction of the individual trees.
Parameters¶
- name: str, optional
Name of the model. The model is stored in the database.
- overwrite_model: bool, optional
If set to True, training a model with the same name as an existing model overwrites the existing model.
- n_estimators: int, optional
The number of trees in the forest, an integer between 1 and 1000, inclusive.
- max_features: int | str, optional
The number of randomly chosen features from which to pick the best feature to split a given tree node. It can be an integer or one of the two following methods:
- auto: square root of the total number of predictors.
- max: number of predictors.
- max_leaf_nodes: PythonNumber, optional
The maximum number of leaf nodes for a tree in the forest, an integer between 1 and 1e9, inclusive.
- sample: float, optional
The portion of the input data set that is randomly selected for training each tree, a float between 0.0 and 1.0, inclusive.
- max_depth: int, optional
Maximum depth of each tree, an integer between 1 and 100, inclusive.
- min_samples_leaf: int, optional
The minimum number of samples each branch must have after splitting a node, an integer between 1 and 1e6, inclusive. A split that results in fewer remaining samples than this value is discarded.
- min_info_gain: PythonNumber, optional
The minimum threshold for including a split, a float between 0.0 and 1.0, inclusive. A split with information gain less than this threshold is discarded.
- nbins: int, optional
Number of bins used to find splits in each column; more bins lead to a longer runtime but produce more fine-grained, potentially better splits. Must be an integer between 2 and 1000, inclusive.
Attributes¶
Many attributes are created during the fitting phase.
- trees_: list of BinaryTreeRegressor
Tree models are instances of BinaryTreeRegressor, each possessing various attributes. For more detailed information, refer to the documentation for BinaryTreeRegressor.
- features_importance_: numpy.array
The importance of features, calculated using the MDI (Mean Decreased Impurity). To determine the final score, VerticaPy sums the scores of each tree, normalizes them, and applies an activation function to scale them. You must call the features_importance() method to compute it initially; the computed values are then reused for subsequent calls.
- features_importance_trees_: dict of numpy.array
Each element of the dictionary represents the feature importance of tree i. The importance of features is calculated using the MDI (Mean Decreased Impurity). You must call the features_importance() method to compute it initially; the computed values are then reused for subsequent calls.
- n_estimators_: int
The number of model estimators.
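For example, once a model has been fitted (as in the Examples section below), these attributes can be read directly; a minimal sketch:
# assumes `model` is a fitted RandomForestRegressor (see the Examples below)
model.features_importance(show = False)  # computes and caches the importance scores
print(model.features_importance_)        # numpy.array of normalized scores
print(model.n_estimators_)               # number of trees, e.g. 10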
Note
All attributes can be accessed using the get_attributes() method.
Note
Several other attributes can be accessed by using the get_vertica_attributes() method.
Examples¶
The following examples provide a basic understanding of usage. For more detailed examples, please refer to the Machine Learning or the Examples section on the website.
Important
Many tree-based models inherit from the RandomForest base class, and it's recommended to use it directly for access to a wider range of options.
Load data for machine learning¶
We import verticapy:
import verticapy as vp
Hint
By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like "average" and "median", which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.
For this example, we will use the winequality dataset.
import verticapy.datasets as vpd

data = vpd.load_winequality()
[Output: preview of the winequality vDataFrame. Columns: fixed_acidity, volatile_acidity, citric_acid, residual_sugar, chlorides, free_sulfur_dioxide, total_sulfur_dioxide, density, pH, sulphates, alcohol, quality, good, color. Rows: 1-100 | Columns: 14]
Note
VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets section, which provides detailed information on each dataset and how to use them effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.
You can easily divide your dataset into training and testing subsets using the vDataFrame.train_test_split() method. This is a crucial step when preparing your data for machine learning, as it allows you to evaluate the performance of your models accurately.
data = vpd.load_winequality()

train, test = data.train_test_split(test_size = 0.2)
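As a quick sanity check, you can compare the sizes of the two subsets (a sketch; vDataFrame.shape() returns a (rows, columns) tuple):
# expect roughly an 80/20 row split
print(train.shape())
print(test.shape())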
Warning
In this case, VerticaPy utilizes seeded randomization to guarantee the reproducibility of your data split. However, please be aware that this approach may lead to reduced performance. For a more efficient data split, you can use the vDataFrame.to_db() method to save your results into tables or temporary tables. This will help enhance the overall performance of the process.
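A minimal sketch of that approach (the table names here are hypothetical):
# persist each split as a regular table to avoid re-running the seeded split
train.to_db('"public"."winequality_train"', relation_type = "table")
test.to_db('"public"."winequality_test"', relation_type = "table")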
Model Initialization¶
First we import the RandomForestRegressor model:
from verticapy.machine_learning.vertica import RandomForestRegressor
Then we can create the model:
model = RandomForestRegressor(
    max_features = "auto",
    max_leaf_nodes = 32,
    sample = 0.5,
    max_depth = 3,
    min_samples_leaf = 5,
    min_info_gain = 0.0,
    nbins = 32,
)
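You can verify the stored hyperparameters at any time with get_params():
# returns the model's parameters as a dictionary
model.get_params()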
Hint
In verticapy 1.0.x and higher, you do not need to specify the model name, as the name is automatically assigned. If you need to re-use the model, you can fetch the model name from the model's attributes.
Important
The model name is crucial for the model management system and versioning. It’s highly recommended to provide a name if you plan to reuse the model later.
Model Training¶
We can now fit the model:
model.fit(
    train,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "quality",
    test,
)

===========
call_string
===========
SELECT rf_regressor('"public"."_verticapy_tmp_randomforestregressor_v_demo_d1d33a5455a311ef880f0242ac120002_"', '"public"."_verticapy_tmp_view_v_demo_d1e048de55a311ef880f0242ac120002_"', 'quality', '"fixed_acidity", "volatile_acidity", "citric_acid", "residual_sugar", "chlorides", "density"' USING PARAMETERS exclude_columns='', ntree=10, mtry=3, sampling_size=0.5, max_depth=3, max_breadth=32, min_leaf_size=5, min_info_gain=0, nbins=32);

=======
details
=======
predictor        | type
-----------------+-----------------
fixed_acidity    | float or numeric
volatile_acidity | float or numeric
citric_acid      | float or numeric
residual_sugar   | float or numeric
chlorides        | float or numeric
density          | float or numeric

===============
Additional Info
===============
Name               | Value
-------------------+------
tree_count         | 10
rejected_row_count | 0
accepted_row_count | 5192
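Once the model is fitted, each tree in the forest can be inspected individually; a short sketch based on the tree helpers listed in the method summary below:
# tabular description of the first tree (splits, thresholds, leaf values)
model.get_tree(tree_id = 0)

# feature importance metrics restricted to that tree
model.get_score(tree_id = 0)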
Important
To train a model, you can directly use the vDataFrame or the name of the relation stored in the database. The test set is optional and is only used to compute the test metrics. In verticapy, we don't work using X matrices and y vectors. Instead, we work directly with lists of predictors and the response name.
Features Importance¶
We can conveniently get the features importance:
result = model.features_importance()
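The importance can also be computed for a single tree by passing its index:
# feature importance of the first tree only
result_tree0 = model.features_importance(tree_id = 0)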
Note
In models such as RandomForest, feature importance is calculated using the MDI (Mean Decreased Impurity). To determine the final score, VerticaPy sums the scores of each tree, normalizes them and applies an activation function to scale them.
Metrics¶
We can get the entire report using:
model.report()
                         value
explained_variance       0.172685468333178
max_error                2.81184858072756
median_absolute_error    0.568571830498591
mean_absolute_error      0.624029016229468
mean_squared_error       0.610582506103535
root_mean_squared_error  0.78139778992747
r2                       0.171809995860813
r2_adj                   0.167981690756934
aic                      -629.649201401192
bic                      -593.593405267752
Rows: 1-10 | Columns: 2
Important
Most metrics are computed using a single SQL query, but some of them might require multiple SQL queries. Selecting only the necessary metrics in the report can help optimize performance, e.g. model.report(metrics = ["mse", "r2"]).
You can utilize the score() function to calculate various regression metrics, with R-squared being the default.
model.score()
Out[4]: 0.171809995860813
Prediction¶
Prediction is straightforward:
model.predict(
    test,
    [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    "prediction",
)
[Output: the test vDataFrame with the new "prediction" column appended to the original 14 columns. Rows: 1-100 | Columns: 15]
Note
Predictions can be made automatically using the test set, in which case you don't need to specify the predictors. Alternatively, you can pass only the vDataFrame to the predict() function, but in this case, it's essential that the column names of the vDataFrame match the predictors and response name in the model.
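For example (a sketch that relies on the matching column names in test):
# the predictors are inferred from the model when X is omitted
model.predict(test, name = "prediction")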
Plots¶
Tree models can be visualized by drawing their tree plots. For more examples, check out Machine Learning - Tree Plots.
model.plot_tree()
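By default the first tree is drawn; a specific tree can be selected with its tree_id:
# draw the second tree of the forest
model.plot_tree(tree_id = 1)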
Note
The above example may not render properly in the doc because of the huge size of the tree, but it should render nicely in a Jupyter environment.
To plot the graph with Graphviz separately, you can extract the Graphviz DOT source code as follows:
model.to_graphviz() Out[5]: 'digraph Tree {\ngraph [bgcolor="#FFFFFFDD"];\n0 [label="\\"chlorides\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n0 -> 1 [label="<= 0.046625", color="#000000", fontcolor="#000000"]\n0 -> 2 [label="> 0.046625", color="#000000", fontcolor="#000000"]\n1 [label="\\"fixed_acidity\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n1 -> 3 [label="<= 7.55625", color="#000000", fontcolor="#000000"]\n1 -> 4 [label="> 7.55625", color="#000000", fontcolor="#000000"]\n2 [label="\\"chlorides\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n2 -> 5 [label="<= 0.065437", color="#000000", fontcolor="#000000"]\n2 -> 6 [label="> 0.065437", color="#000000", fontcolor="#000000"]\n3 [label="\\"volatile_acidity\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n3 -> 7 [label="<= 0.595625", color="#000000", fontcolor="#000000"]\n3 -> 8 [label="> 0.595625", color="#000000", fontcolor="#000000"]\n4 [label="\\"density\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n4 -> 9 [label="<= 0.991991", color="#000000", fontcolor="#000000"]\n4 -> 10 [label="> 0.991991", color="#000000", fontcolor="#000000"]\n5 [label="\\"fixed_acidity\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n5 -> 11 [label="<= 6.459375", color="#000000", fontcolor="#000000"]\n5 -> 12 [label="> 6.459375", color="#000000", fontcolor="#000000"]\n6 [label="\\"fixed_acidity\\"", shape="box", style="filled", fillcolor="#FFFFFFDD", fontcolor="#000000", color="#000000"]\n6 -> 13 [label="<= 9.384375", color="#000000", fontcolor="#000000"]\n6 -> 14 [label="> 9.384375", color="#000000", fontcolor="#000000"]\n7 [label="6.086", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n8 [label="5.222222", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n9 [label="6.058824", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n10 [label="5.586592", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n11 [label="5.567568", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n12 [label="5.728543", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n13 [label="5.384615", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n14 [label="5.832117", fillcolor="#FFFFFFDD", fontcolor="#000000", shape="none", color="#000000"]\n}'
This string can then be copied into a DOT file, which can be parsed by Graphviz.
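For example, using the graphviz Python package (an assumption; any DOT-compatible tool works):
import graphviz

# render the exported DOT source to a file ("rf_tree.pdf" by default)
graphviz.Source(model.to_graphviz()).render("rf_tree")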
A contour plot is another useful plot that can be produced for models with two predictors.
model.contour()
Important
Machine learning models with two predictors can usually benefit from their own contour plot. This visual representation aids in exploring predictions and gaining a deeper understanding of how these models perform in different scenarios. Please refer to Contour Plot for more examples.
Model Register¶
In order to register the model for tracking and versioning:
model.register("model_v1")
Please refer to Model Tracking and Versioning for more details on model tracking and versioning.
Model Exporting¶
To MemModel
model.to_memmodel()
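The returned object can then score data entirely in memory; a minimal sketch (the feature order must match the training predictors):
# in-memory prediction, no database round-trip
mmodel = model.to_memmodel()
mmodel.predict([[4.2, 0.17, 0.36, 1.8, 0.029, 0.9899]])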
Note
MemModel objects serve as in-memory representations of machine learning models. They can be used for both in-database and in-memory prediction tasks. These objects can be pickled in the same way that you would pickle a scikit-learn model.
The following methods for exporting the model use MemModel, and it is recommended to use MemModel directly.
To SQL
You can get the SQL code by:
model.to_sql() Out[6]: '((CASE WHEN "chlorides" < 0.046625 THEN (CASE WHEN "fixed_acidity" < 7.55625 THEN (CASE WHEN "volatile_acidity" < 0.595625 THEN 6.086 ELSE 5.222222 END) ELSE (CASE WHEN "density" < 0.991991 THEN 6.058824 ELSE 5.586592 END) END) ELSE (CASE WHEN "chlorides" < 0.065437 THEN (CASE WHEN "fixed_acidity" < 6.459375 THEN 5.567568 ELSE 5.728543 END) ELSE (CASE WHEN "fixed_acidity" < 9.384375 THEN 5.384615 ELSE 5.832117 END) END) END) + (CASE WHEN "volatile_acidity" < 0.54875 THEN (CASE WHEN "chlorides" < 0.046625 THEN (CASE WHEN "density" < 0.993611 THEN 6.257802 ELSE 5.703057 END) ELSE (CASE WHEN "citric_acid" < 0.259375 THEN 5.471545 ELSE 5.750323 END) END) ELSE (CASE WHEN "fixed_acidity" < 10.846875 THEN (CASE WHEN "citric_acid" < 0.466875 THEN 5.27 ELSE 4.833333 END) ELSE 6.0 END) END) + (CASE WHEN "density" < 0.991991 THEN (CASE WHEN "residual_sugar" < 2.6375 THEN (CASE WHEN "volatile_acidity" < 0.501875 THEN 6.236413 ELSE 5.25 END) ELSE (CASE WHEN "density" < 0.990371 THEN 6.87037 ELSE 6.492308 END) END) ELSE (CASE WHEN "chlorides" < 0.046625 THEN (CASE WHEN "citric_acid" < 0.259375 THEN 5.529412 ELSE 5.941957 END) ELSE (CASE WHEN "volatile_acidity" < 0.220625 THEN 5.934783 ELSE 5.506824 END) END) END) + (CASE WHEN "density" < 0.991991 THEN (CASE WHEN "residual_sugar" < 2.6375 THEN (CASE WHEN "fixed_acidity" < 6.825 THEN 6.324444 ELSE 6.0 END) ELSE (CASE WHEN "residual_sugar" < 6.7125 THEN 6.6 ELSE 7.055556 END) END) ELSE (CASE WHEN "volatile_acidity" < 0.220625 THEN (CASE WHEN "residual_sugar" < 8.75 THEN 5.942446 ELSE 6.1625 END) ELSE (CASE WHEN "citric_acid" < 0.259375 THEN 5.408784 ELSE 5.675676 END) END) END) + (CASE WHEN "density" < 0.991991 THEN (CASE WHEN "density" < 0.990371 THEN (CASE WHEN "volatile_acidity" < 0.54875 THEN 6.526316 ELSE 5.666667 END) ELSE (CASE WHEN "residual_sugar" < 2.6375 THEN 6.013699 ELSE 6.581818 END) END) ELSE (CASE WHEN "citric_acid" < 0.259375 THEN (CASE WHEN "fixed_acidity" < 4.996875 THEN 4.2 ELSE 5.448739 END) ELSE (CASE WHEN "volatile_acidity" < 0.220625 THEN 6.044619 ELSE 5.6537 END) END) END) + (CASE WHEN "chlorides" < 0.046625 THEN (CASE WHEN "fixed_acidity" < 7.55625 THEN (CASE WHEN "density" < 0.991991 THEN 6.418708 ELSE 5.890815 END) ELSE (CASE WHEN "fixed_acidity" < 9.384375 THEN 5.724138 ELSE 4.941176 END) END) ELSE (CASE WHEN "citric_acid" < 0.259375 THEN (CASE WHEN "volatile_acidity" < 0.970625 THEN 5.404255 ELSE 4.090909 END) ELSE (CASE WHEN "density" < 0.995232 THEN 5.898601 ELSE 5.647986 END) END) END) + (CASE WHEN "density" < 0.991991 THEN (CASE WHEN "residual_sugar" < 2.6375 THEN (CASE WHEN "density" < 0.990371 THEN 6.373239 ELSE 6.091286 END) ELSE 6.516854 END) ELSE (CASE WHEN "volatile_acidity" < 0.220625 THEN (CASE WHEN "citric_acid" < 0.259375 THEN 5.58 ELSE 6.046512 END) ELSE (CASE WHEN "volatile_acidity" < 0.455 THEN 5.656566 ELSE 5.37155 END) END) END) + (CASE WHEN "chlorides" < 0.046625 THEN (CASE WHEN "residual_sugar" < 6.7125 THEN (CASE WHEN "fixed_acidity" < 7.55625 THEN 6.21791 ELSE 5.761905 END) ELSE (CASE WHEN "density" < 0.993611 THEN 6.537037 ELSE 5.664688 END) END) ELSE (CASE WHEN "volatile_acidity" < 0.220625 THEN (CASE WHEN "fixed_acidity" < 8.2875 THEN 6.040724 ELSE 5.666667 END) ELSE (CASE WHEN "citric_acid" < 0.259375 THEN 5.390519 ELSE 5.614085 END) END) END) + (CASE WHEN "chlorides" < 0.046625 THEN (CASE WHEN "density" < 0.991991 THEN (CASE WHEN "fixed_acidity" < 8.2875 THEN 6.426501 ELSE 5.363636 END) ELSE (CASE WHEN "citric_acid" < 0.259375 THEN 5.4375 ELSE 5.896721 END) END) ELSE (CASE 
WHEN "density" < 0.995232 THEN (CASE WHEN "volatile_acidity" < 0.36125 THEN 5.859425 ELSE 5.478261 END) ELSE (CASE WHEN "volatile_acidity" < 0.220625 THEN 6.12037 ELSE 5.470149 END) END) END) + (CASE WHEN "volatile_acidity" < 0.54875 THEN (CASE WHEN "citric_acid" < 0.259375 THEN (CASE WHEN "density" < 0.991991 THEN 6.294872 ELSE 5.4573 END) ELSE (CASE WHEN "chlorides" < 0.046625 THEN 6.133195 ELSE 5.800244 END) END) ELSE (CASE WHEN "density" < 0.991991 THEN (CASE WHEN "residual_sugar" < 2.6375 THEN 5.375 ELSE 6.666667 END) ELSE (CASE WHEN "citric_acid" < 0.155625 THEN 5.389535 ELSE 5.120567 END) END) END)) / 10'
To Python
To obtain the prediction function in Python syntax, use the following code:
X = [[4.2, 0.17, 0.36, 1.8, 0.029, 0.9899]]

model.to_python()(X)
Out[8]: array([6.3000528])
Hint
The to_python() method is used to retrieve predictions, probabilities, or cluster distances. For specific details on how to use this method for different model types, refer to the relevant documentation for each model.
- __init__(name: str = None, overwrite_model: bool = False, n_estimators: int = 10, max_features: Literal['auto', 'max'] | int = 'auto', max_leaf_nodes: Annotated[int | float | Decimal, 'Python Numbers'] = 1000000000.0, sample: float = 0.632, max_depth: int = 5, min_samples_leaf: int = 1, min_info_gain: Annotated[int | float | Decimal, 'Python Numbers'] = 0.0, nbins: int = 32) → None¶
Must be overridden in the child class
Methods
__init__([name, overwrite_model, ...]): Must be overridden in the child class.
contour([nbins, chart]): Draws the model's contour plot.
deploySQL([X]): Returns the SQL code needed to deploy the model.
does_model_exists(name[, raise_error, ...]): Checks whether the model is stored in the Vertica database.
drop(): Drops the model from the Vertica database.
export_models(name, path[, kind]): Exports machine learning models.
features_importance([tree_id, show, chart]): Computes the model's features importance.
fit(input_relation, X, y[, test_relation, ...]): Trains the model.
get_attributes([attr_name]): Returns the model attributes.
get_match_index(x, col_list[, str_check]): Returns the matching index.
get_params(): Returns the parameters of the model.
get_plotting_lib([class_name, chart, ...]): Returns the first available library (Plotly, Matplotlib, or Highcharts) to draw a specific graphic.
get_score([tree_id]): Returns the feature importance metrics for the input tree.
get_tree([tree_id]): Returns a table with all the input tree information.
get_vertica_attributes([attr_name]): Returns the model Vertica attributes.
import_models(path[, schema, kind]): Imports machine learning models.
plot([max_nb_points, chart]): Draws the model.
plot_tree([tree_id, pic_path]): Draws the input tree.
predict(vdf[, X, name, inplace]): Predicts using the input relation.
register(registered_name[, raise_error]): Registers the model and adds it to the in-DB model versioning environment with a status of 'under_review'.
regression_report([metrics]): Computes a regression report.
report([metrics]): Computes a regression report.
score([metric]): Computes the model score.
set_params([parameters]): Sets the parameters of the model.
summarize(): Summarizes the model.
to_binary(path): Exports the model to the Vertica binary format.
to_graphviz([tree_id, classes_color, ...]): Returns the code for a Graphviz tree.
to_memmodel(): Converts the model to an InMemory object that can be used for different types of predictions.
to_pmml(path): Exports the model to PMML.
to_python([return_proba, ...]): Returns the Python function needed for in-memory scoring without using built-in Vertica functions.
to_sql([X, return_proba, ...]): Returns the SQL code needed to deploy the model without using built-in Vertica functions.
to_tf(path): Exports the model to the Frozen Graph format (TensorFlow).
Attributes