
verticapy.machine_learning.model_selection.hp_tuning.enet_search_cv
- verticapy.machine_learning.model_selection.hp_tuning.enet_search_cv(input_relation: Annotated[str | vDataFrame, ''], X: Annotated[str | list[str], 'STRING representing one column or a list of columns'], y: str, metric: str = 'auto', cv: int = 3, estimator_type: Literal['logit', 'enet', 'auto'] = 'auto', cutoff: float = -1.0, print_info: bool = True, **kwargs) → TableSample
Computes the k-fold grid search using multiple ENet models.
- input_relation: SQLRelation
Relation used to train the model.
- X: SQLColumns
List of the predictor columns.
- y: str
Response column.
- metric: str, optional
Metric used for the model evaluation.
- auto:
logloss for classification & RMSE for regression.
For Classification
- accuracy:
Accuracy.
\[Accuracy = \frac{TP + TN}{TP + TN + FP + FN}\]
- auc:
Area Under the Curve (ROC).
\[AUC = \int_{0}^{1} TPR(FPR) \, dFPR\]
- ba:
Balanced Accuracy.
\[BA = \frac{TPR + TNR}{2}\]
- bm:
Informedness
\[BM = TPR + TNR - 1\]
- csi:
Critical Success Index
\[index = \frac{TP}{TP + FN + FP}\]
- f1:
F1 Score.
\[F_1 = 2 \times \frac{Precision \times Recall}{Precision + Recall}\]
- fdr:
False Discovery Rate
\[FDR = 1 - PPV\]
- fm:
Fowlkes-Mallows index
\[FM = \sqrt{PPV * TPR}\]
- fnr:
False Negative Rate
\[FNR = \frac{FN}{FN + TP}\]
- for:
False Omission Rate
\[FOR = 1 - NPV\]
- fpr:
False Positive Rate
\[FPR = \frac{FP}{FP + TN}\]
- logloss:
Log Loss
\[Loss = -\frac{1}{N} \sum_{i=1}^{N} \left( y_i \log(p_i) + (1 - y_i) \log(1 - p_i) \right)\]
- lr+:
Positive Likelihood Ratio.
\[LR+ = \frac{TPR}{FPR}\]
- lr-:
Negative Likelihood Ratio.
\[LR- = \frac{FNR}{TNR}\]
- dor:
Diagnostic Odds Ratio.
\[DOR = \frac{TP \times TN}{FP \times FN}\]
- mcc:
Matthews Correlation Coefficient.
\[MCC = \frac{TP \times TN - FP \times FN}{\sqrt{(TP + FP)(TP + FN)(TN + FP)(TN + FN)}}\]
- mk:
Markedness
\[MK = PPV + NPV - 1\]
- npv:
Negative Predictive Value
\[NPV = \frac{TN}{TN + FN}\]
- prc_auc:
Area Under the Curve (PRC)
\[AUC = \int_{0}^{1} Precision(Recall) \, dRecall\]
- precision:
Precision.
\[Precision = \frac{TP}{TP + FP}\]
- pt:
Prevalence Threshold.
\[PT = \frac{\sqrt{FPR}}{\sqrt{TPR} + \sqrt{FPR}}\]
- recall:
Recall.
\[Recall = \frac{TP}{TP + FN}\]
- specificity:
Specificity.
\[Specificity = \frac{TN}{TN + FP}\]
For Regression
- max:
Max Error.
\[ME = \max_{i=1}^{n} \left| y_i - \hat{y}_i \right|\]
- mae:
Mean Absolute Error.
\[MAE = \frac{1}{n} \sum_{i=1}^{n} \left| y_i - \hat{y}_i \right|\]
- median:
Median Absolute Error.
\[MedAE = \text{median}_{i=1}^{n} \left| y_i - \hat{y}_i \right|\]
- mse:
Mean Squared Error.
\[MSE = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2\]
- msle:
Mean Squared Log Error.
\[MSLE = \frac{1}{n} \sum_{i=1}^{n} (\log(1 + y_i) - \log(1 + \hat{y}_i))^2\]
- r2:
R squared coefficient.
\[R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}\]
- r2a:
R2 adjusted
\[\text{Adjusted } R^2 = 1 - \frac{(1 - R^2)(n - 1)}{n - k - 1}\]
- var:
Explained Variance.
\[VAR = 1 - \frac{Var(y - \hat{y})}{Var(y)}\]
- rmse:
Root-mean-squared error
\[RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2}\]
- cv: int, optional
Number of folds.
- estimator_type: str, optional
Estimator Type.
- auto:
Automatically detects whether a LogisticRegression or an ElasticNet model should be used (see the sketch after the parameter descriptions).
- logit:
Uses LogisticRegression models.
- enet:
Uses ElasticNet models.
- cutoff: float, optional
The model cutoff (logit only).
- print_info: bool, optional
If set to True, prints the model information at each step.
Returns
- TableSample
Result of the ENet search.
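For illustration, here is a minimal sketch of a call that sets metric, estimator_type, and cutoff explicitly. The predictor and response columns are taken from the winequality sample dataset used in the example below; adjust them for your own relation.
import verticapy.datasets as vpd
from verticapy.machine_learning.model_selection import enet_search_cv

# Sample dataset with a binary response column 'good'.
data = vpd.load_winequality()

result = enet_search_cv(
    input_relation = data,
    X = ["volatile_acidity", "residual_sugar", "alcohol"],
    y = "good",                # binary response column of the sample dataset
    metric = "auc",            # overrides the default 'auto' metric (logloss for classification)
    cv = 5,
    estimator_type = "logit",  # force LogisticRegression models
    cutoff = 0.5,              # classification cutoff, used by logit models only
)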
We import verticapy:
import verticapy as vp
Hint
By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.
For this example, we will use the Wine Quality dataset.
import verticapy.datasets as vpd

data = vpd.load_winequality()
[Output: display of the winequality vDataFrame (rows 1-100 shown); 14 columns: fixed_acidity, volatile_acidity, citric_acid, residual_sugar, chlorides, free_sulfur_dioxide, total_sulfur_dioxide, density, pH, sulphates, alcohol, quality, good, color.]
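Before running the search, it can help to confirm that the response column good is binary, which is what makes this a classification task (LogisticRegression models, logloss as the default metric). A quick check using the standard topk() method of a vDataColumn:
# Count the categories of the response column; 'good' takes only the values 0 and 1.
data["good"].topk()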
Note
VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets in the Datasets documentation, which provides detailed information on each dataset and how to use them effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.
The enet_search_cv() function builds its LogisticRegression (classification) or ElasticNet (regression) candidate models internally, so there is no need to initialize an estimator beforehand.
We can now conveniently use the enet_search_cv() function to perform the k-fold grid search using multiple ENet models.
from verticapy.machine_learning.model_selection import enet_search_cv

result = enet_search_cv(
    input_relation = data,
    X = [
        "fixed_acidity",
        "volatile_acidity",
        "citric_acid",
        "residual_sugar",
        "chlorides",
        "density",
    ],
    y = "good",
    cv = 3,
)
[Output: a TableSample with 110 rows, one per ENet candidate, ordered by ascending avg_score (logloss); columns include avg_score, avg_train_score, avg_time, score_std, and score_train_std.]
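The returned TableSample can be exported for further inspection. A minimal sketch, assuming the TableSample to_pandas() export and sorting on avg_score (lower logloss is better):
# Export the cross-validation results and look at the best ENet candidates.
df = result.to_pandas()
print(df.sort_values("avg_score").head())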
Note
In VerticaPy, Elastic Net Cross-Validation (EnetCV) utilizes multiple ElasticNet models for regression tasks and LogisticRegression models for classification tasks. It systematically tests various combinations of hyperparameters, such as the regularization terms (L1 and L2), to identify the set that optimizes the model’s performance. This process helps in automatically fine-tuning the model to achieve better accuracy and generalization on diverse datasets.
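For comparison, here is a rough sketch of the manual equivalent: a plain grid search over the elastic-net hyperparameters of a LogisticRegression estimator using grid_search_cv(). The parameter grid below is purely illustrative and is not the grid that enet_search_cv uses internally.
import verticapy.datasets as vpd
from verticapy.machine_learning.model_selection import grid_search_cv
from verticapy.machine_learning.vertica import LogisticRegression

# Same sample dataset as in the example above.
data = vpd.load_winequality()

# Grid over the regularization strength (C) and the L1/L2 mix (l1_ratio).
manual_result = grid_search_cv(
    LogisticRegression(penalty = "enet"),
    {"C": [0.1, 1.0, 10.0], "l1_ratio": [0.2, 0.5, 0.8]},
    input_relation = data,
    X = ["fixed_acidity", "volatile_acidity", "citric_acid"],
    y = "good",
    cv = 3,
)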
See also
grid_search_cv() : Computes the k-fold grid search of an estimator.
randomized_search_cv() : Computes the k-fold randomized search of an estimator.