enet_search_cv

In [ ]:
enet_search_cv(input_relation: (str, vDataFrame),
               X: list,
               y: str,
               metric: str = "auto",
               cv: int = 3,
               estimator_type: str = "auto",
               cutoff: float = -1,
               cursor=None,
               print_info: bool = True)

Computes a k-fold cross-validation grid search over multiple ENet models.
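The k-fold grid search that enet_search_cv automates can be sketched in plain Python. This is an illustrative sketch, not VerticaPy's implementation: the hypothetical `fit_score` callable stands in for training a model in-database and scoring it on the held-out fold, and lower scores are assumed to be better (as with logloss or rmse).

```python
from statistics import mean

def kfold_indices(n, cv):
    """Split range(n) into cv contiguous folds (sizes differ by at most 1)."""
    fold_size, folds, start = n // cv, [], 0
    for i in range(cv):
        stop = start + fold_size + (1 if i < n % cv else 0)
        folds.append(list(range(start, stop)))
        start = stop
    return folds

def grid_search_cv(data, param_grid, fit_score, cv=3):
    """Score every parameter combination with k-fold CV; return the
    (mean_score, params) pair with the lowest mean score."""
    results = []
    for params in param_grid:
        scores = []
        for test_idx in kfold_indices(len(data), cv):
            test_set = set(test_idx)
            train = [d for i, d in enumerate(data) if i not in test_set]
            test = [data[i] for i in test_idx]
            scores.append(fit_score(train, test, params))
        results.append((mean(scores), params))
    return min(results, key=lambda r: r[0])
```

Each candidate parameter set is evaluated cv times, once per held-out fold, and the reported score is the average over folds — which is exactly the avg_score column in the result table below.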

Parameters

Name Type Optional Description
input_relation
str / vDataFrame
Input Relation.
X
list
List of the predictor columns.
y
str
Response Column.
metric
str / list
Metric used for model evaluation.
  • auto : logloss for classification & rmse for regression.

For Classification:
  • accuracy : Accuracy
  • auc : Area Under the Curve (ROC)
  • bm : Informedness = tpr + tnr - 1
  • csi : Critical Success Index = tp / (tp + fn + fp)
  • f1 : F1 Score
  • logloss : Log Loss
  • mcc : Matthews Correlation Coefficient
  • mk : Markedness = ppv + npv - 1
  • npv : Negative Predictive Value = tn / (tn + fn)
  • prc_auc : Area Under the Curve (PRC)
  • precision : Precision = tp / (tp + fp)
  • recall : Recall = tp / (tp + fn)
  • specificity : Specificity = tn / (tn + fp)

For Regression:
  • max : Max Error
  • mae : Mean Absolute Error
  • median : Median Absolute Error
  • mse : Mean Squared Error
  • msle : Mean Squared Log Error
  • r2 : R-squared coefficient
  • r2a : Adjusted R-squared
  • rmse : Root Mean Squared Error
  • var : Explained Variance
cv
int
Number of folds.
estimator_type
str
Estimator Type.
  • auto : automatically detects whether the model is a Logit model or an ENet.
  • enet : ElasticNet.
  • logit : Logistic Regression.
cutoff
float
The model cutoff (classification only).
cursor
DBcursor
Vertica database cursor.
print_info
bool
If set to True, prints the model information at each step.

Returns

tablesample : An object containing the result. For more information, see utilities.tablesample.
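The classification metrics listed above all follow directly from the confusion-matrix counts. The following self-contained helper (illustrative only, not part of VerticaPy) spells out the listed formulas:

```python
from math import sqrt

def classification_metrics(tp, tn, fp, fn):
    """Compute the metrics listed above from confusion-matrix counts."""
    tpr = tp / (tp + fn)          # recall / true positive rate
    tnr = tn / (tn + fp)          # specificity / true negative rate
    ppv = tp / (tp + fp)          # precision / positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "recall": tpr,
        "specificity": tnr,
        "precision": ppv,
        "npv": npv,
        "bm": tpr + tnr - 1,                  # informedness
        "mk": ppv + npv - 1,                  # markedness
        "csi": tp / (tp + fn + fp),           # critical success index
        "f1": 2 * ppv * tpr / (ppv + tpr),
        "mcc": (tp * tn - fp * fn)
               / sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)),
    }
```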

Example

In [63]:
from verticapy.learn.model_selection import enet_search_cv
enet_search_cv(input_relation = "public.titanic", 
               X = ["age", "fare", "pclass",], 
               y = "survived", 
               cv = 3)
Starting Bayesian Search

Step 1 - Computing Random Models using Grid Search

Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.0}; Test_score: 0.262598430013598; Train_score: 0.274133719943495; Time: 0.3262985547383626;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.1}; Test_score: 0.266291093427745; Train_score: 0.27198829802802066; Time: 0.30567367871602374;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.2}; Test_score: 0.283669488575714; Train_score: 0.264631430687299; Time: 0.28858145078023273;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.30000000000000004}; Test_score: 0.27765932436013835; Train_score: 0.26790878269621965; Time: 0.28299593925476074;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.4}; Test_score: 0.26681449042856936; Train_score: 0.27091851916276766; Time: 0.3265560468037923;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.5}; Test_score: 0.27761610729421865; Train_score: 0.267879726385803; Time: 0.2933203379313151;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.6000000000000001}; Test_score: 0.2813431410163907; Train_score: 0.26608977303223935; Time: 0.322865088780721;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.7000000000000001}; Test_score: 0.27371067192792264; Train_score: 0.2700078071223963; Time: 0.30130958557128906;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.8}; Test_score: 0.271327358139843; Train_score: 0.269882288039403; Time: 0.2919313907623291;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.9}; Test_score: 0.26711046643049097; Train_score: 0.272047697007229; Time: 0.2928773562113444;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.0}; Test_score: 0.273844245656181; Train_score: 0.2700430792031203; Time: 0.2918105125427246;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.1}; Test_score: 0.272151260627639; Train_score: 0.2692774444256; Time: 0.38333924611409503;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.2}; Test_score: 0.2744871040194313; Train_score: 0.26851389273173565; Time: 0.3269970417022705;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.30000000000000004}; Test_score: 0.2688611824467447; Train_score: 0.27192487823291334; Time: 0.28541771570841473;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.4}; Test_score: 0.27435674329418536; Train_score: 0.268383530105114; Time: 0.28881271680196124;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.5}; Test_score: 0.265830035357967; Train_score: 0.2735527377207397; Time: 0.2916240692138672;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.6000000000000001}; Test_score: 0.279605985783173; Train_score: 0.268251216607695; Time: 0.30589548746744794;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.7000000000000001}; Test_score: 0.2739471770777597; Train_score: 0.2670199331998773; Time: 0.30471499760945636;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.8}; Test_score: 0.2703962570325967; Train_score: 0.269869664148955; Time: 0.33469128608703613;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.9}; Test_score: 0.2758269760076687; Train_score: 0.2697030352682843; Time: 0.31498289108276367;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.0}; Test_score: 0.27190338015048665; Train_score: 0.269230251184661; Time: 0.3099230130513509;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.1}; Test_score: 0.26978812994657964; Train_score: 0.27011089307188835; Time: 0.3464393615722656;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.2}; Test_score: 0.26971207011780335; Train_score: 0.27168937759203765; Time: 0.33384863535563153;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.30000000000000004}; Test_score: 0.27048662525399503; Train_score: 0.2718552628195923; Time: 0.34309156735738117;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.4}; Test_score: 0.27561221151756465; Train_score: 0.2677155005204437; Time: 0.30875062942504883;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.5}; Test_score: 0.277363557999443; Train_score: 0.2668737724562886; Time: 0.28888877232869464;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.6000000000000001}; Test_score: 0.26937256597372833; Train_score: 0.270576664932445; Time: 0.2807494004567464;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.7000000000000001}; Test_score: 0.27484874663777004; Train_score: 0.26731255041390767; Time: 0.3062567710876465;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.8}; Test_score: 0.2816827539981557; Train_score: 0.2656013436349037; Time: 0.3246612548828125;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.9}; Test_score: 0.2734556558445937; Train_score: 0.2699779022837977; Time: 0.3173202673594157;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.0}; Test_score: 0.27666989938940867; Train_score: 0.2674973553989017; Time: 0.32061028480529785;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.1}; Test_score: 0.27155212604688733; Train_score: 0.270093144284725; Time: 0.3126896222432454;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.2}; Test_score: 0.28743409542406867; Train_score: 0.26558637448954536; Time: 0.3156779607137044;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.30000000000000004}; Test_score: 0.2692825221249547; Train_score: 0.27163345183925663; Time: 0.29481307665507;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.4}; Test_score: 0.2650493381412157; Train_score: 0.27301827491543634; Time: 0.300190528233846;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.5}; Test_score: 0.27342520189033465; Train_score: 0.2688040990514273; Time: 0.3456309636433919;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.6000000000000001}; Test_score: 0.2711271874696027; Train_score: 0.270117610779609; Time: 0.3209356466929118;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.7000000000000001}; Test_score: 0.273643629327861; Train_score: 0.268843609497238; Time: 0.31479668617248535;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.8}; Test_score: 0.2721716992041833; Train_score: 0.27088880403254534; Time: 0.3185320695241292;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.9}; Test_score: 0.26927092275557035; Train_score: 0.27282144132281666; Time: 0.2949639956156413;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.0}; Test_score: 0.27217083342385534; Train_score: 0.27445455308973; Time: 0.30247847239176434;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.1}; Test_score: 0.27310733478301136; Train_score: 0.2760817661532617; Time: 0.33632763226826984;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.2}; Test_score: 0.278450353908547; Train_score: 0.27430849234964433; Time: 0.33259208997090656;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.30000000000000004}; Test_score: 0.28516833439827; Train_score: 0.2716733424876173; Time: 0.34057267506917316;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.4}; Test_score: 0.282919720495777; Train_score: 0.27618159314262036; Time: 0.3068840503692627;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.5}; Test_score: 0.27864301340771364; Train_score: 0.28125115367144765; Time: 0.32869958877563477;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.6000000000000001}; Test_score: 0.27839218861863735; Train_score: 0.2833007057917167; Time: 0.31580615043640137;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.7000000000000001}; Test_score: 0.284684916058892; Train_score: 0.2786167876189847; Time: 0.31981778144836426;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.8}; Test_score: 0.2793585936308717; Train_score: 0.28332633498493165; Time: 0.3324185212453206;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.9}; Test_score: 0.286173390763846; Train_score: 0.28017274478347765; Time: 0.30952270825703937;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.0}; Test_score: 0.280064586600843; Train_score: 0.28021343532362897; Time: 0.3500709533691406;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.1}; Test_score: 0.2765281808274337; Train_score: 0.28514168077681135; Time: 0.337546428044637;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.2}; Test_score: 0.286300385490234; Train_score: 0.280090942931885; Time: 0.3949916362762451;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.28068088366139365; Train_score: 0.28369223510851266; Time: 0.36893900235493976;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.4}; Test_score: 0.28739997988630667; Train_score: 0.28076816781912434; Time: 0.4097263813018799;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.5}; Test_score: 0.2846031716623277; Train_score: 0.28266511174146197; Time: 0.37319596608479816;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.289253085845893; Train_score: 0.28209080775449435; Time: 0.48996766408284503;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.2840886885264547; Train_score: 0.2835976878936253; Time: 0.30651140213012695;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.8}; Test_score: 0.28241917685981766; Train_score: 0.2864133195008343; Time: 0.3036954402923584;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.9}; Test_score: 0.282482662266238; Train_score: 0.28577528837848665; Time: 0.31153329213460285;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.0}; Test_score: 0.282167561374635; Train_score: 0.2814195229554263; Time: 0.2658512592315674;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.1}; Test_score: 0.28684789565701235; Train_score: 0.281715637509305; Time: 0.23003037770589194;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.2}; Test_score: 0.2868454714040623; Train_score: 0.2831129229959413; Time: 0.26296035448710126;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.28687766808518966; Train_score: 0.28477458801411665; Time: 0.3370985984802246;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.4}; Test_score: 0.28514866627190166; Train_score: 0.28784880620151965; Time: 0.30331945419311523;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.5}; Test_score: 0.28932703887942335; Train_score: 0.28561912196862965; Time: 0.30033214886983234;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.2891735964934667; Train_score: 0.2892378306821; Time: 0.28902967770894367;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.29046114055905564; Train_score: 0.29139875314848535; Time: 0.28153061866760254;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.8}; Test_score: 0.291542927628617; Train_score: 0.2904379614371977; Time: 0.30759231249491376;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.9}; Test_score: 0.293234675793685; Train_score: 0.295116603003443; Time: 0.28280027707417804;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.0}; Test_score: 0.281558807983945; Train_score: 0.285380851526246; Time: 0.20305927594502768;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.1}; Test_score: 0.28565783220454866; Train_score: 0.2882284568581477; Time: 0.20418914159138998;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.2}; Test_score: 0.28727001779710065; Train_score: 0.28908782293762597; Time: 0.2120063304901123;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.29087542931298566; Train_score: 0.28884189096797236; Time: 0.25333285331726074;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.4}; Test_score: 0.292350971154677; Train_score: 0.29254298986055965; Time: 0.25289201736450195;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.5}; Test_score: 0.2945289262612603; Train_score: 0.29384571343192534; Time: 0.23057778676350912;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.29914840213658933; Train_score: 0.29946751621913065; Time: 0.22211543718973795;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.299515315282455; Train_score: 0.2991626296279187; Time: 0.18698430061340332;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.8}; Test_score: 0.301003090760206; Train_score: 0.3009834172613983; Time: 0.2291417916615804;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.9}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.22884726524353027;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.0}; Test_score: 0.28645930861842234; Train_score: 0.28418485752131134; Time: 0.22722291946411133;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.1}; Test_score: 0.29535330796531867; Train_score: 0.2964046817939983; Time: 0.21848750114440918;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.2}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19717979431152344;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.22337841987609863;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.4}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21022280057271323;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.5}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21184388796488443;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.18712735176086426;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19541478157043457;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.8}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19593048095703125;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.9}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.20261740684509277;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.0}; Test_score: 0.2841366456192043; Train_score: 0.286398474342786; Time: 0.20740699768066406;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.1}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19916494687398276;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.2}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21297764778137207;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.2050775686899821;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.4}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.2380808194478353;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.5}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19531591733296713;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.2661433219909668;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21566875775655112;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.8}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21957778930664062;
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.9}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.3620673020680745;

Step 2 - Fitting the RF model with the hyperparameters data

Step 3 - Computing Most Probable Good Models using Grid Search

Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.112}; Test_score: 0.279951046580572; Train_score: 0.2843840354620593; Time: 0.3151123523712158;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.111}; Test_score: 0.28465228542275833; Train_score: 0.28250577726909804; Time: 0.2990090847015381;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.11}; Test_score: 0.28136780941726164; Train_score: 0.282927204694194; Time: 0.3273129463195801;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.109}; Test_score: 0.292524351360777; Train_score: 0.2785090605138967; Time: 0.3239874839782715;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.108}; Test_score: 0.2867461223583447; Train_score: 0.2817534943888157; Time: 0.3574562867482503;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.107}; Test_score: 0.28333598848936997; Train_score: 0.28254275493078834; Time: 0.31406164169311523;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.106}; Test_score: 0.288372257678885; Train_score: 0.279615559473095; Time: 0.30739561716715497;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.105}; Test_score: 0.288898051360634; Train_score: 0.27865162134178534; Time: 0.3076666196187337;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.104}; Test_score: 0.28407667113436; Train_score: 0.281813730131792; Time: 0.32158835728963214;
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.103}; Test_score: 0.28317432379688035; Train_score: 0.28219670606502734; Time: 0.3291612466176351;

Bayesian Search Selected Model
Parameters: {'solver': 'cgd', 'penalty': 'enet', 'max_iter': 100, 'l1_ratio': 0.0, 'C': 1e-05, 'tol': 1e-06}; Test_score: 0.262598430013598; Train_score: 0.274133719943495; Time: 0.3262985547383626;
Out[63]:
row | avg_score | avg_train_score | avg_time | score_std | score_train_std
1 | 0.262598430013598 | 0.274133719943495 | 0.3262985547383626 | 0.004349777028265856 | 0.0009329483973997906
2 | 0.2650493381412157 | 0.27301827491543634 | 0.300190528233846 | 0.012710799194373928 | 0.006821677523737474
3 | 0.265830035357967 | 0.2735527377207397 | 0.2916240692138672 | 0.008771532310074891 | 0.005104793135187558
4 | 0.266291093427745 | 0.27198829802802066 | 0.30567367871602374 | 0.0025961266361210866 | 0.001294377521097551
5 | 0.26681449042856936 | 0.27091851916276766 | 0.3265560468037923 | 0.002615930172101627 | 0.001451844666415892
6 | 0.26711046643049097 | 0.272047697007229 | 0.2928773562113444 | 0.00802428415807891 | 0.003618651818559264
7 | 0.2688611824467447 | 0.27192487823291334 | 0.28541771570841473 | 0.003942387463399168 | 0.0025701159849661117
8 | 0.26927092275557035 | 0.27282144132281666 | 0.2949639956156413 | 0.004435716915124935 | 0.0012253735498850634
9 | 0.2692825221249547 | 0.27163345183925663 | 0.29481307665507 | 0.005964047416055083 | 0.0029022716058280363
10 | 0.26937256597372833 | 0.270576664932445 | 0.2807494004567464 | 0.006475790556243237 | 0.00293052548844658
11 | 0.26971207011780335 | 0.27168937759203765 | 0.33384863535563153 | 0.012189090637426256 | 0.005173377903525278
12 | 0.26978812994657964 | 0.27011089307188835 | 0.3464393615722656 | 0.015045369739235677 | 0.007804434369256671
13 | 0.2703962570325967 | 0.269869664148955 | 0.33469128608703613 | 0.002355018366549438 | 0.002197611487998931
14 | 0.27048662525399503 | 0.2718552628195923 | 0.34309156735738117 | 0.00973194335302319 | 0.004411628377393284
15 | 0.2711271874696027 | 0.270117610779609 | 0.3209356466929118 | 0.003665997576138045 | 0.0005533505912554856
16 | 0.271327358139843 | 0.269882288039403 | 0.2919313907623291 | 0.003903673800886551 | 0.0023082466370493164
17 | 0.27155212604688733 | 0.270093144284725 | 0.3126896222432454 | 0.005353146450689406 | 0.0034556369925374288
18 | 0.27190338015048665 | 0.269230251184661 | 0.3099230130513509 | 0.00392902741592913 | 0.001785616167235049
19 | 0.272151260627639 | 0.2692774444256 | 0.38333924611409503 | 0.0011555213634245924 | 0.0010143975426826545
20 | 0.27217083342385534 | 0.27445455308973 | 0.30247847239176434 | 0.003661702382906054 | 0.0014408160506393564
21 | 0.2721716992041833 | 0.27088880403254534 | 0.3185320695241292 | 0.007877690092349567 | 0.0029830084197949715
22 | 0.27310733478301136 | 0.2760817661532617 | 0.33632763226826984 | 0.0022963712633950048 | 0.0014786427386107486
23 | 0.27342520189033465 | 0.2688040990514273 | 0.3456309636433919 | 0.0016709719000489226 | 0.0007785625621636187
24 | 0.2734556558445937 | 0.2699779022837977 | 0.3173202673594157 | 0.006437652821632398 | 0.002568392251263787
25 | 0.273643629327861 | 0.268843609497238 | 0.31479668617248535 | 0.008865246259382125 | 0.005577331240488202
26 | 0.27371067192792264 | 0.2700078071223963 | 0.30130958557128906 | 0.004138536830368685 | 0.0015822078714623896
27 | 0.273844245656181 | 0.2700430792031203 | 0.2918105125427246 | 0.006618473725790487 | 0.0031696105030355817
28 | 0.2739471770777597 | 0.2670199331998773 | 0.30471499760945636 | 0.005107276166812729 | 0.0037749507526140154
29 | 0.27435674329418536 | 0.268383530105114 | 0.28881271680196124 | 0.011343327011574651 | 0.004669419126102212
30 | 0.2744871040194313 | 0.26851389273173565 | 0.3269970417022705 | 0.010181571433804777 | 0.005857462307846778
31 | 0.27484874663777004 | 0.26731255041390767 | 0.3062567710876465 | 0.0017022427716250818 | 0.0016018596042515292
32 | 0.27561221151756465 | 0.2677155005204437 | 0.30875062942504883 | 0.005618234994621494 | 0.0027389549111835436
33 | 0.2758269760076687 | 0.2697030352682843 | 0.31498289108276367 | 0.003582191282119523 | 8.342264758661904e-05
34 | 0.2765281808274337 | 0.28514168077681135 | 0.337546428044637 | 0.004232971663755035 | 0.0015348959270411154
35 | 0.27666989938940867 | 0.2674973553989017 | 0.32061028480529785 | 0.009746243613747449 | 0.005190433759763678
36 | 0.277363557999443 | 0.2668737724562886 | 0.28888877232869464 | 0.005368379635389694 | 0.0014484563310489737
37 | 0.27761610729421865 | 0.267879726385803 | 0.2933203379313151 | 0.003409049284151565 | 0.002317637743805133
38 | 0.27765932436013835 | 0.26790878269621965 | 0.28299593925476074 | 0.0007686907995961589 | 0.0014132043002215468
39 | 0.27839218861863735 | 0.2833007057917167 | 0.31580615043640137 | 0.0030960193351221806 | 0.0023865996986673897
40 | 0.278450353908547 | 0.27430849234964433 | 0.33259208997090656 | 0.006525651984104786 | 0.001513306090634289
41 | 0.27864301340771364 | 0.28125115367144765 | 0.32869958877563477 | 0.0028798120348988467 | 0.0030915363886010456
42 | 0.2793585936308717 | 0.28332633498493165 | 0.3324185212453206 | 0.00370045032052364 | 0.0024265343573997055
43 | 0.279605985783173 | 0.268251216607695 | 0.30589548746744794 | 0.006564749959143225 | 0.00369740221475518
44 | 0.279951046580572 | 0.2843840354620593 | 0.3151123523712158 | 0.005068566460196225 | 0.0034933509278099783
45 | 0.280064586600843 | 0.28021343532362897 | 0.3500709533691406 | 0.004340040738064074 | 0.00237488384513383
46 | 0.28068088366139365 | 0.28369223510851266 | 0.36893900235493976 | 0.00291117562536515 | 0.0015756135981438732
47 | 0.2813431410163907 | 0.26608977303223935 | 0.322865088780721 | 0.018739597352263045 | 0.006106576307069964
48 | 0.28136780941726164 | 0.282927204694194 | 0.3273129463195801 | 0.004671697121493927 | 0.0036828703375445513
49 | 0.281558807983945 | 0.285380851526246 | 0.20305927594502768 | 0.001658728149463313 | 0.0002480139836948314
50 | 0.2816827539981557 | 0.2656013436349037 | 0.3246612548828125 | 0.015850818928056454 | 0.005939926949107826
51 | 0.282167561374635 | 0.2814195229554263 | 0.2658512592315674 | 0.0025528302874938034 | 0.001749533778199504
52 | 0.28241917685981766 | 0.2864133195008343 | 0.3036954402923584 | 0.0031574790431246535 | 0.0026420732054964775
53 | 0.282482662266238 | 0.28577528837848665 | 0.31153329213460285 | 0.004054241417202708 | 0.0031085743857976475
54 | 0.282919720495777 | 0.27618159314262036 | 0.3068840503692627 | 0.00796337070096159 | 0.0044530406047341665
55 | 0.28317432379688035 | 0.28219670606502734 | 0.3291612466176351 | 0.0033915909375232586 | 0.0031560088713123015
56 | 0.28333598848936997 | 0.28254275493078834 | 0.31406164169311523 | 0.0019708407485919672 | 0.0031360954593191532
57 | 0.283669488575714 | 0.264631430687299 | 0.28858145078023273 | 0.01427116844646146 | 0.004706238307507307
58 | 0.28407667113436 | 0.281813730131792 | 0.32158835728963214 | 0.00522888457805044 | 0.002737562388302744
59 | 0.2840886885264547 | 0.2835976878936253 | 0.30651140213012695 | 0.0013970631027550343 | 0.00233961591571665
60 | 0.2841366456192043 | 0.286398474342786 | 0.20740699768066406 | 0.003128956094215526 | 0.002421003100361331
61 | 0.2846031716623277 | 0.28266511174146197 | 0.37319596608479816 | 0.0026228201148882704 | 0.002467557949697091
62 | 0.28465228542275833 | 0.28250577726909804 | 0.2990090847015381 | 0.004092751332396212 | 0.0028206917557600975
63 | 0.284684916058892 | 0.2786167876189847 | 0.31981778144836426 | 0.008845501137114738 | 0.0034786934788982706
64 | 0.28514866627190166 | 0.28784880620151965 | 0.30331945419311523 | 0.0009749247526232906 | 0.001760905771480469
65 | 0.28516833439827 | 0.2716733424876173 | 0.34057267506917316 | 0.006852037833531048 | 0.003736287604119645
66 | 0.28565783220454866 | 0.2882284568581477 | 0.20418914159138998 | 0.001459062774545813 | 0.0011943294057966123
67 | 0.286173390763846 | 0.28017274478347765 | 0.30952270825703937 | 0.006391295580518481 | 0.0023876100803798196
68 | 0.286300385490234 | 0.280090942931885 | 0.3949916362762451 | 0.0069940492571017795 | 0.004503654497867974
69 | 0.28645930861842234 | 0.28418485752131134 | 0.22722291946411133 | 0.002618742186349219 | 0.0004182644925671466
70 | 0.2867461223583447 | 0.2817534943888157 | 0.3574562867482503 | 0.010515452040312366 | 0.0037754110329093577
71 | 0.2868454714040623 | 0.2831129229959413 | 0.26296035448710126 | 0.0033353596139516946 | 0.0039134155315358925
72 | 0.28684789565701235 | 0.281715637509305 | 0.23003037770589194 | 0.005755294382834107 | 0.004443260096925231
73 | 0.28687766808518966 | 0.28477458801411665 | 0.3370985984802246 | 0.0034100265144293277 | 0.002776003258892111
74 | 0.28727001779710065 | 0.28908782293762597 | 0.2120063304901123 | 0.0005281687161790027 | 0.00039881122071910857
75 | 0.28739997988630667 | 0.28076816781912434 | 0.4097263813018799 | 0.00974407366542605 | 0.004217400104930424
76 | 0.28743409542406867 | 0.26558637448954536 | 0.3156779607137044 | 0.01506157259177145 | 0.004124859339200786
77 | 0.288372257678885 | 0.279615559473095 | 0.30739561716715497 | 0.002724375889402183 | 0.0007793662990998128
78 | 0.288898051360634 | 0.27865162134178534 | 0.3076666196187337 | 0.006566741034096734 | 0.003945359678055887
79 | 0.2891735964934667 | 0.2892378306821 | 0.28902967770894367 | 0.0006225921579365956 | 0.0020841605728100995
80 | 0.289253085845893 | 0.28209080775449435 | 0.48996766408284503 | 0.00855106008065081 | 0.004766251865627149
81 | 0.28932703887942335 | 0.28561912196862965 | 0.30033214886983234 | 0.001207669448783195 | 0.0013061068060506327
82 | 0.29046114055905564 | 0.29139875314848535 | 0.28153061866760254 | 0.00024004196694568777 | 0.0034700340932891306
83 | 0.29087542931298566 | 0.28884189096797236 | 0.25333285331726074 | 0.0008832790689641943 | 3.6960071762331854e-05
84 | 0.291542927628617 | 0.2904379614371977 | 0.30759231249491376 | 0.0006916095352640495 | 0.0019126863717182426
85 | 0.292350971154677 | 0.29254298986055965 | 0.25289201736450195 | 0.00016393734812876942 | 0.0009362926823337595
86 | 0.292524351360777 | 0.2785090605138967 | 0.3239874839782715 | 0.008690608414202603 | 0.003601059602967378
87 | 0.293234675793685 | 0.295116603003443 | 0.28280027707417804 | 0.0006618855445633651 | 0.0008214793999581614
88 | 0.2945289262612603 | 0.29384571343192534 | 0.23057778676350912 | 0.00035592270256009175 | 0.0015725273884638175
89 | 0.29535330796531867 | 0.2964046817939983 | 0.21848750114440918 | 0.0003337409075589044 | 0.0005905536118116889
90 | 0.29914840213658933 | 0.29946751621913065 | 0.22211543718973795 | 0.0009426983305871986 | 0.0009253941889090591
91 | 0.299515315282455 | 0.2991626296279187 | 0.18698430061340332 | 0.0003958277275523793 | 0.0006095270933012482
92 | 0.301003090760206 | 0.3009834172613983 | 0.2291417916615804 | 4.660066031104051e-05 | 8.06761598085528e-05
93 | 0.301029995663981 | 0.301029995663981 | 0.22884726524353027 | 0.0 | 0.0
94 | 0.301029995663981 | 0.301029995663981 | 0.19717979431152344 | 0.0 | 0.0
95 | 0.301029995663981 | 0.301029995663981 | 0.22337841987609863 | 0.0 | 0.0
96 | 0.301029995663981 | 0.301029995663981 | 0.21022280057271323 | 0.0 | 0.0
97 | 0.301029995663981 | 0.301029995663981 | 0.21184388796488443 | 0.0 | 0.0
98 | 0.301029995663981 | 0.301029995663981 | 0.18712735176086426 | 0.0 | 0.0
99 | 0.301029995663981 | 0.301029995663981 | 0.19541478157043457 | 0.0 | 0.0
100 | 0.301029995663981 | 0.301029995663981 | 0.19593048095703125 | 0.0 | 0.0
101 | 0.301029995663981 | 0.301029995663981 | 0.20261740684509277 | 0.0 | 0.0
102 | 0.301029995663981 | 0.301029995663981 | 0.19916494687398276 | 0.0 | 0.0
103 | 0.301029995663981 | 0.301029995663981 | 0.21297764778137207 | 0.0 | 0.0
104 | 0.301029995663981 | 0.301029995663981 | 0.2050775686899821 | 0.0 | 0.0
105 | 0.301029995663981 | 0.301029995663981 | 0.2380808194478353 | 0.0 | 0.0
106 | 0.301029995663981 | 0.301029995663981 | 0.19531591733296713 | 0.0 | 0.0
107 | 0.301029995663981 | 0.301029995663981 | 0.2661433219909668 | 0.0 | 0.0
108 | 0.301029995663981 | 0.301029995663981 | 0.21566875775655112 | 0.0 | 0.0
109 | 0.301029995663981 | 0.301029995663981 | 0.21957778930664062 | 0.0 | 0.0
110 | 0.301029995663981 | 0.301029995663981 | 0.3620673020680745 | 0.0 | 0.0
Rows: 1-110 | Columns: 6
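The three steps shown in the log above (a random grid search, fitting a surrogate model on the hyperparameter/score pairs, then probing the most promising region) can be loosely sketched as follows. This is a simplified illustration, not VerticaPy's implementation: VerticaPy fits a random forest surrogate in step 2, whereas this sketch simply perturbs the incumbent best point with a user-supplied `neighborhood` function (a hypothetical helper, not part of the API).

```python
import random

def bayesian_style_search(score, grid, neighborhood, n_refine=10, seed=0):
    """Step 1: score every point on an initial grid.
    Steps 2-3 (simplified): probe random neighbors of the incumbent
    best point and keep any improvement found."""
    rng = random.Random(seed)
    history = [(score(p), p) for p in grid]              # step 1: grid search
    best_score, best = min(history, key=lambda h: h[0])
    for _ in range(n_refine):                            # step 3: refine
        cand = neighborhood(best, rng)
        cand_score = score(cand)
        if cand_score < best_score:                      # lower logloss is better
            best_score, best = cand_score, cand
    return best, best_score
```

The final "Bayesian Search Selected Model" line in the log reports exactly this kind of incumbent: the parameter set with the lowest average test score across all models evaluated in steps 1 and 3.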