enet_search_cv¶
In [ ]:
enet_search_cv(input_relation: (str, vDataFrame),
               X: list,
               y: str,
               metric: str = "auto",
               cv: int = 3,
               estimator_type: str = "auto",
               cutoff: float = -1,
               print_info: bool = True)
Computes a k-fold grid search over multiple ENet models, varying the C and l1_ratio hyperparameters, and then refines the candidates with a Bayesian search step.
Parameters¶
Name | Type | Optional | Description |
---|---|---|---|
input_relation | str / vDataFrame | ❌ | Input relation. |
X | list | ❌ | List of the predictor columns. |
y | str | ❌ | Response column. |
metric | str / list | ✓ | Metric used for the model evaluation; the available metrics depend on whether the estimator does classification or regression. |
cv | int | ✓ | Number of folds. |
estimator_type | str | ✓ | Estimator type. |
cutoff | float | ✓ | The model cutoff (classification only). |
print_info | bool | ✓ | If set to True, prints the model information at each step. |
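For reference, every optional argument can be passed explicitly. The call below is a minimal sketch using the same illustrative public.titanic relation as the example further down; the inline comments paraphrase the parameter table above and are assumptions rather than official documentation.
from verticapy.learn.model_selection import enet_search_cv

enet_search_cv(
    input_relation="public.titanic",  # table name or vDataFrame
    X=["age", "fare", "pclass"],      # predictor columns
    y="survived",                     # response column
    metric="auto",                    # evaluation metric; depends on the task type
    cv=3,                             # number of folds
    estimator_type="auto",            # estimator type
    cutoff=-1,                        # classification cutoff (classification only)
    print_info=False,                 # set to True to print each model's information
)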
Returns¶
tablesample : An object containing the result. For more information, see utilities.tablesample.
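Because the function returns a tablesample, the result can be captured and converted for further analysis. A minimal sketch, assuming a working database connection and the to_pandas() helper described in utilities.tablesample:
from verticapy.learn.model_selection import enet_search_cv

result = enet_search_cv("public.titanic",
                        X=["age", "fare", "pclass"],
                        y="survived",
                        cv=3)
# to_pandas() (assumed from utilities.tablesample) turns the summary into a DataFrame,
# which can then be sorted, filtered, or plotted like any other pandas object.
df = result.to_pandas()
print(df.head())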
Example¶
In [63]:
from verticapy.learn.model_selection import enet_search_cv
enet_search_cv(input_relation = "public.titanic",
X = ["age", "fare", "pclass",],
y = "survived",
cv = 3)
Starting Bayesian Search
Step 1 - Computing Random Models using Grid Search
Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.0}; Test_score: 0.262598430013598; Train_score: 0.274133719943495; Time: 0.3262985547383626; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.1}; Test_score: 0.266291093427745; Train_score: 0.27198829802802066; Time: 0.30567367871602374; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.2}; Test_score: 0.283669488575714; Train_score: 0.264631430687299; Time: 0.28858145078023273; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.30000000000000004}; Test_score: 0.27765932436013835; Train_score: 0.26790878269621965; Time: 0.28299593925476074; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.4}; Test_score: 0.26681449042856936; Train_score: 0.27091851916276766; Time: 0.3265560468037923; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.5}; Test_score: 0.27761610729421865; Train_score: 0.267879726385803; Time: 0.2933203379313151; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.6000000000000001}; Test_score: 0.2813431410163907; Train_score: 0.26608977303223935; Time: 0.322865088780721; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.7000000000000001}; Test_score: 0.27371067192792264; Train_score: 0.2700078071223963; Time: 0.30130958557128906; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.8}; Test_score: 0.271327358139843; Train_score: 0.269882288039403; Time: 0.2919313907623291; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1e-05, 'l1_ratio': 0.9}; Test_score: 0.26711046643049097; Train_score: 0.272047697007229; Time: 0.2928773562113444; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.0}; Test_score: 0.273844245656181; Train_score: 0.2700430792031203; Time: 0.2918105125427246; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.1}; Test_score: 0.272151260627639; Train_score: 0.2692774444256; Time: 0.38333924611409503; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.2}; Test_score: 0.2744871040194313; Train_score: 0.26851389273173565; Time: 0.3269970417022705; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.30000000000000004}; Test_score: 0.2688611824467447; Train_score: 0.27192487823291334; Time: 0.28541771570841473; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.4}; Test_score: 0.27435674329418536; Train_score: 0.268383530105114; Time: 0.28881271680196124; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.5}; Test_score: 0.265830035357967; Train_score: 0.2735527377207397; Time: 0.2916240692138672; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.6000000000000001}; Test_score: 0.279605985783173; Train_score: 0.268251216607695; Time: 0.30589548746744794; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 
0.7000000000000001}; Test_score: 0.2739471770777597; Train_score: 0.2670199331998773; Time: 0.30471499760945636; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.8}; Test_score: 0.2703962570325967; Train_score: 0.269869664148955; Time: 0.33469128608703613; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.0001, 'l1_ratio': 0.9}; Test_score: 0.2758269760076687; Train_score: 0.2697030352682843; Time: 0.31498289108276367; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.0}; Test_score: 0.27190338015048665; Train_score: 0.269230251184661; Time: 0.3099230130513509; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.1}; Test_score: 0.26978812994657964; Train_score: 0.27011089307188835; Time: 0.3464393615722656; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.2}; Test_score: 0.26971207011780335; Train_score: 0.27168937759203765; Time: 0.33384863535563153; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.30000000000000004}; Test_score: 0.27048662525399503; Train_score: 0.2718552628195923; Time: 0.34309156735738117; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.4}; Test_score: 0.27561221151756465; Train_score: 0.2677155005204437; Time: 0.30875062942504883; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.5}; Test_score: 0.277363557999443; Train_score: 0.2668737724562886; Time: 0.28888877232869464; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.6000000000000001}; Test_score: 0.26937256597372833; Train_score: 0.270576664932445; Time: 0.2807494004567464; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.7000000000000001}; Test_score: 0.27484874663777004; Train_score: 0.26731255041390767; Time: 0.3062567710876465; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.8}; Test_score: 0.2816827539981557; Train_score: 0.2656013436349037; Time: 0.3246612548828125; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.001, 'l1_ratio': 0.9}; Test_score: 0.2734556558445937; Train_score: 0.2699779022837977; Time: 0.3173202673594157; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.0}; Test_score: 0.27666989938940867; Train_score: 0.2674973553989017; Time: 0.32061028480529785; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.1}; Test_score: 0.27155212604688733; Train_score: 0.270093144284725; Time: 0.3126896222432454; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.2}; Test_score: 0.28743409542406867; Train_score: 0.26558637448954536; Time: 0.3156779607137044; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.30000000000000004}; Test_score: 0.2692825221249547; Train_score: 0.27163345183925663; Time: 0.29481307665507; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.4}; Test_score: 0.2650493381412157; Train_score: 0.27301827491543634; Time: 0.300190528233846; Model: 
LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.5}; Test_score: 0.27342520189033465; Train_score: 0.2688040990514273; Time: 0.3456309636433919; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.6000000000000001}; Test_score: 0.2711271874696027; Train_score: 0.270117610779609; Time: 0.3209356466929118; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.7000000000000001}; Test_score: 0.273643629327861; Train_score: 0.268843609497238; Time: 0.31479668617248535; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.8}; Test_score: 0.2721716992041833; Train_score: 0.27088880403254534; Time: 0.3185320695241292; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.01, 'l1_ratio': 0.9}; Test_score: 0.26927092275557035; Train_score: 0.27282144132281666; Time: 0.2949639956156413; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.0}; Test_score: 0.27217083342385534; Train_score: 0.27445455308973; Time: 0.30247847239176434; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.1}; Test_score: 0.27310733478301136; Train_score: 0.2760817661532617; Time: 0.33632763226826984; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.2}; Test_score: 0.278450353908547; Train_score: 0.27430849234964433; Time: 0.33259208997090656; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.30000000000000004}; Test_score: 0.28516833439827; Train_score: 0.2716733424876173; Time: 0.34057267506917316; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.4}; Test_score: 0.282919720495777; Train_score: 0.27618159314262036; Time: 0.3068840503692627; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.5}; Test_score: 0.27864301340771364; Train_score: 0.28125115367144765; Time: 0.32869958877563477; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.6000000000000001}; Test_score: 0.27839218861863735; Train_score: 0.2833007057917167; Time: 0.31580615043640137; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.7000000000000001}; Test_score: 0.284684916058892; Train_score: 0.2786167876189847; Time: 0.31981778144836426; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.8}; Test_score: 0.2793585936308717; Train_score: 0.28332633498493165; Time: 0.3324185212453206; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 0.1, 'l1_ratio': 0.9}; Test_score: 0.286173390763846; Train_score: 0.28017274478347765; Time: 0.30952270825703937; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.0}; Test_score: 0.280064586600843; Train_score: 0.28021343532362897; Time: 0.3500709533691406; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.1}; Test_score: 0.2765281808274337; Train_score: 0.28514168077681135; Time: 0.337546428044637; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.2}; Test_score: 0.286300385490234; Train_score: 
0.280090942931885; Time: 0.3949916362762451; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.28068088366139365; Train_score: 0.28369223510851266; Time: 0.36893900235493976; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.4}; Test_score: 0.28739997988630667; Train_score: 0.28076816781912434; Time: 0.4097263813018799; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.5}; Test_score: 0.2846031716623277; Train_score: 0.28266511174146197; Time: 0.37319596608479816; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.289253085845893; Train_score: 0.28209080775449435; Time: 0.48996766408284503; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.2840886885264547; Train_score: 0.2835976878936253; Time: 0.30651140213012695; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.8}; Test_score: 0.28241917685981766; Train_score: 0.2864133195008343; Time: 0.3036954402923584; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 1.0, 'l1_ratio': 0.9}; Test_score: 0.282482662266238; Train_score: 0.28577528837848665; Time: 0.31153329213460285; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.0}; Test_score: 0.282167561374635; Train_score: 0.2814195229554263; Time: 0.2658512592315674; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.1}; Test_score: 0.28684789565701235; Train_score: 0.281715637509305; Time: 0.23003037770589194; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.2}; Test_score: 0.2868454714040623; Train_score: 0.2831129229959413; Time: 0.26296035448710126; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.28687766808518966; Train_score: 0.28477458801411665; Time: 0.3370985984802246; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.4}; Test_score: 0.28514866627190166; Train_score: 0.28784880620151965; Time: 0.30331945419311523; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.5}; Test_score: 0.28932703887942335; Train_score: 0.28561912196862965; Time: 0.30033214886983234; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.2891735964934667; Train_score: 0.2892378306821; Time: 0.28902967770894367; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.29046114055905564; Train_score: 0.29139875314848535; Time: 0.28153061866760254; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.8}; Test_score: 0.291542927628617; Train_score: 0.2904379614371977; Time: 0.30759231249491376; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 5.0, 'l1_ratio': 0.9}; Test_score: 0.293234675793685; Train_score: 0.295116603003443; Time: 0.28280027707417804; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 
10.0, 'l1_ratio': 0.0}; Test_score: 0.281558807983945; Train_score: 0.285380851526246; Time: 0.20305927594502768; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.1}; Test_score: 0.28565783220454866; Train_score: 0.2882284568581477; Time: 0.20418914159138998; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.2}; Test_score: 0.28727001779710065; Train_score: 0.28908782293762597; Time: 0.2120063304901123; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.29087542931298566; Train_score: 0.28884189096797236; Time: 0.25333285331726074; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.4}; Test_score: 0.292350971154677; Train_score: 0.29254298986055965; Time: 0.25289201736450195; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.5}; Test_score: 0.2945289262612603; Train_score: 0.29384571343192534; Time: 0.23057778676350912; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.29914840213658933; Train_score: 0.29946751621913065; Time: 0.22211543718973795; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.299515315282455; Train_score: 0.2991626296279187; Time: 0.18698430061340332; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.8}; Test_score: 0.301003090760206; Train_score: 0.3009834172613983; Time: 0.2291417916615804; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 10.0, 'l1_ratio': 0.9}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.22884726524353027; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.0}; Test_score: 0.28645930861842234; Train_score: 0.28418485752131134; Time: 0.22722291946411133; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.1}; Test_score: 0.29535330796531867; Train_score: 0.2964046817939983; Time: 0.21848750114440918; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.2}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19717979431152344; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.22337841987609863; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.4}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21022280057271323; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.5}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21184388796488443; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.18712735176086426; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19541478157043457; Model: 
LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.8}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19593048095703125; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 50.0, 'l1_ratio': 0.9}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.20261740684509277; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.0}; Test_score: 0.2841366456192043; Train_score: 0.286398474342786; Time: 0.20740699768066406; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.1}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19916494687398276; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.2}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21297764778137207; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.30000000000000004}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.2050775686899821; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.4}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.2380808194478353; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.5}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.19531591733296713; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.6000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.2661433219909668; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.7000000000000001}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21566875775655112; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.8}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.21957778930664062; Model: LogisticRegression; Parameters: {'solver': 'cgd', 'penalty': 'enet', 'C': 100.0, 'l1_ratio': 0.9}; Test_score: 0.301029995663981; Train_score: 0.301029995663981; Time: 0.3620673020680745; Step 2 - Fitting the RF model with the hyperparameters data
Step 3 - Computing Most Probable Good Models using Grid Search
Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.112}; Test_score: 0.279951046580572; Train_score: 0.2843840354620593; Time: 0.3151123523712158; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.111}; Test_score: 0.28465228542275833; Train_score: 0.28250577726909804; Time: 0.2990090847015381; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.11}; Test_score: 0.28136780941726164; Train_score: 0.282927204694194; Time: 0.3273129463195801; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.109}; Test_score: 0.292524351360777; Train_score: 0.2785090605138967; Time: 0.3239874839782715; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.108}; Test_score: 0.2867461223583447; Train_score: 0.2817534943888157; Time: 0.3574562867482503; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.107}; Test_score: 0.28333598848936997; Train_score: 0.28254275493078834; Time: 0.31406164169311523; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.106}; Test_score: 0.288372257678885; Train_score: 0.279615559473095; Time: 0.30739561716715497; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.105}; Test_score: 0.288898051360634; Train_score: 0.27865162134178534; Time: 0.3076666196187337; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.104}; Test_score: 0.28407667113436; Train_score: 0.281813730131792; Time: 0.32158835728963214; Model: LogisticRegression; Parameters: {'C': 3.12, 'l1_ratio': 0.103}; Test_score: 0.28317432379688035; Train_score: 0.28219670606502734; Time: 0.3291612466176351; Bayesian Search Selected Model Parameters: {'solver': 'cgd', 'penalty': 'enet', 'max_iter': 100, 'l1_ratio': 0.0, 'C': 1e-05, 'tol': 1e-06}; Test_score: 0.262598430013598; Train_score: 0.274133719943495; Time: 0.3262985547383626;
Out[63]:
| avg_score | avg_train_score | avg_time | score_std | score_train_std |
---|---|---|---|---|---|
1 | 0.262598430013598 | 0.274133719943495 | 0.3262985547383626 | 0.004349777028265856 | 0.0009329483973997906 | |
2 | 0.2650493381412157 | 0.27301827491543634 | 0.300190528233846 | 0.012710799194373928 | 0.006821677523737474 | |
3 | 0.265830035357967 | 0.2735527377207397 | 0.2916240692138672 | 0.008771532310074891 | 0.005104793135187558 | |
4 | 0.266291093427745 | 0.27198829802802066 | 0.30567367871602374 | 0.0025961266361210866 | 0.001294377521097551 | |
5 | 0.26681449042856936 | 0.27091851916276766 | 0.3265560468037923 | 0.002615930172101627 | 0.001451844666415892 | |
6 | 0.26711046643049097 | 0.272047697007229 | 0.2928773562113444 | 0.00802428415807891 | 0.003618651818559264 | |
7 | 0.2688611824467447 | 0.27192487823291334 | 0.28541771570841473 | 0.003942387463399168 | 0.0025701159849661117 | |
8 | 0.26927092275557035 | 0.27282144132281666 | 0.2949639956156413 | 0.004435716915124935 | 0.0012253735498850634 | |
9 | 0.2692825221249547 | 0.27163345183925663 | 0.29481307665507 | 0.005964047416055083 | 0.0029022716058280363 | |
10 | 0.26937256597372833 | 0.270576664932445 | 0.2807494004567464 | 0.006475790556243237 | 0.00293052548844658 | |
11 | 0.26971207011780335 | 0.27168937759203765 | 0.33384863535563153 | 0.012189090637426256 | 0.005173377903525278 | |
12 | 0.26978812994657964 | 0.27011089307188835 | 0.3464393615722656 | 0.015045369739235677 | 0.007804434369256671 | |
13 | 0.2703962570325967 | 0.269869664148955 | 0.33469128608703613 | 0.002355018366549438 | 0.002197611487998931 | |
14 | 0.27048662525399503 | 0.2718552628195923 | 0.34309156735738117 | 0.00973194335302319 | 0.004411628377393284 | |
15 | 0.2711271874696027 | 0.270117610779609 | 0.3209356466929118 | 0.003665997576138045 | 0.0005533505912554856 | |
16 | 0.271327358139843 | 0.269882288039403 | 0.2919313907623291 | 0.003903673800886551 | 0.0023082466370493164 | |
17 | 0.27155212604688733 | 0.270093144284725 | 0.3126896222432454 | 0.005353146450689406 | 0.0034556369925374288 | |
18 | 0.27190338015048665 | 0.269230251184661 | 0.3099230130513509 | 0.00392902741592913 | 0.001785616167235049 | |
19 | 0.272151260627639 | 0.2692774444256 | 0.38333924611409503 | 0.0011555213634245924 | 0.0010143975426826545 | |
20 | 0.27217083342385534 | 0.27445455308973 | 0.30247847239176434 | 0.003661702382906054 | 0.0014408160506393564 | |
21 | 0.2721716992041833 | 0.27088880403254534 | 0.3185320695241292 | 0.007877690092349567 | 0.0029830084197949715 | |
22 | 0.27310733478301136 | 0.2760817661532617 | 0.33632763226826984 | 0.0022963712633950048 | 0.0014786427386107486 | |
23 | 0.27342520189033465 | 0.2688040990514273 | 0.3456309636433919 | 0.0016709719000489226 | 0.0007785625621636187 | |
24 | 0.2734556558445937 | 0.2699779022837977 | 0.3173202673594157 | 0.006437652821632398 | 0.002568392251263787 | |
25 | 0.273643629327861 | 0.268843609497238 | 0.31479668617248535 | 0.008865246259382125 | 0.005577331240488202 | |
26 | 0.27371067192792264 | 0.2700078071223963 | 0.30130958557128906 | 0.004138536830368685 | 0.0015822078714623896 | |
27 | 0.273844245656181 | 0.2700430792031203 | 0.2918105125427246 | 0.006618473725790487 | 0.0031696105030355817 | |
28 | 0.2739471770777597 | 0.2670199331998773 | 0.30471499760945636 | 0.005107276166812729 | 0.0037749507526140154 | |
29 | 0.27435674329418536 | 0.268383530105114 | 0.28881271680196124 | 0.011343327011574651 | 0.004669419126102212 | |
30 | 0.2744871040194313 | 0.26851389273173565 | 0.3269970417022705 | 0.010181571433804777 | 0.005857462307846778 | |
31 | 0.27484874663777004 | 0.26731255041390767 | 0.3062567710876465 | 0.0017022427716250818 | 0.0016018596042515292 | |
32 | 0.27561221151756465 | 0.2677155005204437 | 0.30875062942504883 | 0.005618234994621494 | 0.0027389549111835436 | |
33 | 0.2758269760076687 | 0.2697030352682843 | 0.31498289108276367 | 0.003582191282119523 | 8.342264758661904e-05 | |
34 | 0.2765281808274337 | 0.28514168077681135 | 0.337546428044637 | 0.004232971663755035 | 0.0015348959270411154 | |
35 | 0.27666989938940867 | 0.2674973553989017 | 0.32061028480529785 | 0.009746243613747449 | 0.005190433759763678 | |
36 | 0.277363557999443 | 0.2668737724562886 | 0.28888877232869464 | 0.005368379635389694 | 0.0014484563310489737 | |
37 | 0.27761610729421865 | 0.267879726385803 | 0.2933203379313151 | 0.003409049284151565 | 0.002317637743805133 | |
38 | 0.27765932436013835 | 0.26790878269621965 | 0.28299593925476074 | 0.0007686907995961589 | 0.0014132043002215468 | |
39 | 0.27839218861863735 | 0.2833007057917167 | 0.31580615043640137 | 0.0030960193351221806 | 0.0023865996986673897 | |
40 | 0.278450353908547 | 0.27430849234964433 | 0.33259208997090656 | 0.006525651984104786 | 0.001513306090634289 | |
41 | 0.27864301340771364 | 0.28125115367144765 | 0.32869958877563477 | 0.0028798120348988467 | 0.0030915363886010456 | |
42 | 0.2793585936308717 | 0.28332633498493165 | 0.3324185212453206 | 0.00370045032052364 | 0.0024265343573997055 | |
43 | 0.279605985783173 | 0.268251216607695 | 0.30589548746744794 | 0.006564749959143225 | 0.00369740221475518 | |
44 | 0.279951046580572 | 0.2843840354620593 | 0.3151123523712158 | 0.005068566460196225 | 0.0034933509278099783 | |
45 | 0.280064586600843 | 0.28021343532362897 | 0.3500709533691406 | 0.004340040738064074 | 0.00237488384513383 | |
46 | 0.28068088366139365 | 0.28369223510851266 | 0.36893900235493976 | 0.00291117562536515 | 0.0015756135981438732 | |
47 | 0.2813431410163907 | 0.26608977303223935 | 0.322865088780721 | 0.018739597352263045 | 0.006106576307069964 | |
48 | 0.28136780941726164 | 0.282927204694194 | 0.3273129463195801 | 0.004671697121493927 | 0.0036828703375445513 | |
49 | 0.281558807983945 | 0.285380851526246 | 0.20305927594502768 | 0.001658728149463313 | 0.0002480139836948314 | |
50 | 0.2816827539981557 | 0.2656013436349037 | 0.3246612548828125 | 0.015850818928056454 | 0.005939926949107826 | |
51 | 0.282167561374635 | 0.2814195229554263 | 0.2658512592315674 | 0.0025528302874938034 | 0.001749533778199504 | |
52 | 0.28241917685981766 | 0.2864133195008343 | 0.3036954402923584 | 0.0031574790431246535 | 0.0026420732054964775 | |
53 | 0.282482662266238 | 0.28577528837848665 | 0.31153329213460285 | 0.004054241417202708 | 0.0031085743857976475 | |
54 | 0.282919720495777 | 0.27618159314262036 | 0.3068840503692627 | 0.00796337070096159 | 0.0044530406047341665 | |
55 | 0.28317432379688035 | 0.28219670606502734 | 0.3291612466176351 | 0.0033915909375232586 | 0.0031560088713123015 | |
56 | 0.28333598848936997 | 0.28254275493078834 | 0.31406164169311523 | 0.0019708407485919672 | 0.0031360954593191532 | |
57 | 0.283669488575714 | 0.264631430687299 | 0.28858145078023273 | 0.01427116844646146 | 0.004706238307507307 | |
58 | 0.28407667113436 | 0.281813730131792 | 0.32158835728963214 | 0.00522888457805044 | 0.002737562388302744 | |
59 | 0.2840886885264547 | 0.2835976878936253 | 0.30651140213012695 | 0.0013970631027550343 | 0.00233961591571665 | |
60 | 0.2841366456192043 | 0.286398474342786 | 0.20740699768066406 | 0.003128956094215526 | 0.002421003100361331 | |
61 | 0.2846031716623277 | 0.28266511174146197 | 0.37319596608479816 | 0.0026228201148882704 | 0.002467557949697091 | |
62 | 0.28465228542275833 | 0.28250577726909804 | 0.2990090847015381 | 0.004092751332396212 | 0.0028206917557600975 | |
63 | 0.284684916058892 | 0.2786167876189847 | 0.31981778144836426 | 0.008845501137114738 | 0.0034786934788982706 | |
64 | 0.28514866627190166 | 0.28784880620151965 | 0.30331945419311523 | 0.0009749247526232906 | 0.001760905771480469 | |
65 | 0.28516833439827 | 0.2716733424876173 | 0.34057267506917316 | 0.006852037833531048 | 0.003736287604119645 | |
66 | 0.28565783220454866 | 0.2882284568581477 | 0.20418914159138998 | 0.001459062774545813 | 0.0011943294057966123 | |
67 | 0.286173390763846 | 0.28017274478347765 | 0.30952270825703937 | 0.006391295580518481 | 0.0023876100803798196 | |
68 | 0.286300385490234 | 0.280090942931885 | 0.3949916362762451 | 0.0069940492571017795 | 0.004503654497867974 | |
69 | 0.28645930861842234 | 0.28418485752131134 | 0.22722291946411133 | 0.002618742186349219 | 0.0004182644925671466 | |
70 | 0.2867461223583447 | 0.2817534943888157 | 0.3574562867482503 | 0.010515452040312366 | 0.0037754110329093577 | |
71 | 0.2868454714040623 | 0.2831129229959413 | 0.26296035448710126 | 0.0033353596139516946 | 0.0039134155315358925 | |
72 | 0.28684789565701235 | 0.281715637509305 | 0.23003037770589194 | 0.005755294382834107 | 0.004443260096925231 | |
73 | 0.28687766808518966 | 0.28477458801411665 | 0.3370985984802246 | 0.0034100265144293277 | 0.002776003258892111 | |
74 | 0.28727001779710065 | 0.28908782293762597 | 0.2120063304901123 | 0.0005281687161790027 | 0.00039881122071910857 | |
75 | 0.28739997988630667 | 0.28076816781912434 | 0.4097263813018799 | 0.00974407366542605 | 0.004217400104930424 | |
76 | 0.28743409542406867 | 0.26558637448954536 | 0.3156779607137044 | 0.01506157259177145 | 0.004124859339200786 | |
77 | 0.288372257678885 | 0.279615559473095 | 0.30739561716715497 | 0.002724375889402183 | 0.0007793662990998128 | |
78 | 0.288898051360634 | 0.27865162134178534 | 0.3076666196187337 | 0.006566741034096734 | 0.003945359678055887 | |
79 | 0.2891735964934667 | 0.2892378306821 | 0.28902967770894367 | 0.0006225921579365956 | 0.0020841605728100995 | |
80 | 0.289253085845893 | 0.28209080775449435 | 0.48996766408284503 | 0.00855106008065081 | 0.004766251865627149 | |
81 | 0.28932703887942335 | 0.28561912196862965 | 0.30033214886983234 | 0.001207669448783195 | 0.0013061068060506327 | |
82 | 0.29046114055905564 | 0.29139875314848535 | 0.28153061866760254 | 0.00024004196694568777 | 0.0034700340932891306 | |
83 | 0.29087542931298566 | 0.28884189096797236 | 0.25333285331726074 | 0.0008832790689641943 | 3.6960071762331854e-05 | |
84 | 0.291542927628617 | 0.2904379614371977 | 0.30759231249491376 | 0.0006916095352640495 | 0.0019126863717182426 | |
85 | 0.292350971154677 | 0.29254298986055965 | 0.25289201736450195 | 0.00016393734812876942 | 0.0009362926823337595 | |
86 | 0.292524351360777 | 0.2785090605138967 | 0.3239874839782715 | 0.008690608414202603 | 0.003601059602967378 | |
87 | 0.293234675793685 | 0.295116603003443 | 0.28280027707417804 | 0.0006618855445633651 | 0.0008214793999581614 | |
88 | 0.2945289262612603 | 0.29384571343192534 | 0.23057778676350912 | 0.00035592270256009175 | 0.0015725273884638175 | |
89 | 0.29535330796531867 | 0.2964046817939983 | 0.21848750114440918 | 0.0003337409075589044 | 0.0005905536118116889 | |
90 | 0.29914840213658933 | 0.29946751621913065 | 0.22211543718973795 | 0.0009426983305871986 | 0.0009253941889090591 | |
91 | 0.299515315282455 | 0.2991626296279187 | 0.18698430061340332 | 0.0003958277275523793 | 0.0006095270933012482 | |
92 | 0.301003090760206 | 0.3009834172613983 | 0.2291417916615804 | 4.660066031104051e-05 | 8.06761598085528e-05 | |
93 | 0.301029995663981 | 0.301029995663981 | 0.22884726524353027 | 0.0 | 0.0 | |
94 | 0.301029995663981 | 0.301029995663981 | 0.19717979431152344 | 0.0 | 0.0 | |
95 | 0.301029995663981 | 0.301029995663981 | 0.22337841987609863 | 0.0 | 0.0 | |
96 | 0.301029995663981 | 0.301029995663981 | 0.21022280057271323 | 0.0 | 0.0 | |
97 | 0.301029995663981 | 0.301029995663981 | 0.21184388796488443 | 0.0 | 0.0 | |
98 | 0.301029995663981 | 0.301029995663981 | 0.18712735176086426 | 0.0 | 0.0 | |
99 | 0.301029995663981 | 0.301029995663981 | 0.19541478157043457 | 0.0 | 0.0 | |
100 | 0.301029995663981 | 0.301029995663981 | 0.19593048095703125 | 0.0 | 0.0 | |
101 | 0.301029995663981 | 0.301029995663981 | 0.20261740684509277 | 0.0 | 0.0 | |
102 | 0.301029995663981 | 0.301029995663981 | 0.19916494687398276 | 0.0 | 0.0 | |
103 | 0.301029995663981 | 0.301029995663981 | 0.21297764778137207 | 0.0 | 0.0 | |
104 | 0.301029995663981 | 0.301029995663981 | 0.2050775686899821 | 0.0 | 0.0 | |
105 | 0.301029995663981 | 0.301029995663981 | 0.2380808194478353 | 0.0 | 0.0 | |
106 | 0.301029995663981 | 0.301029995663981 | 0.19531591733296713 | 0.0 | 0.0 | |
107 | 0.301029995663981 | 0.301029995663981 | 0.2661433219909668 | 0.0 | 0.0 | |
108 | 0.301029995663981 | 0.301029995663981 | 0.21566875775655112 | 0.0 | 0.0 | |
109 | 0.301029995663981 | 0.301029995663981 | 0.21957778930664062 | 0.0 | 0.0 | |
110 | 0.301029995663981 | 0.301029995663981 | 0.3620673020680745 | 0.0 | 0.0 |
Rows: 1-110 | Columns: 6
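The first row of the table above corresponds to the configuration that the Bayesian search reports as selected (test score of roughly 0.2626). Below is a hedged sketch of refitting that configuration as a standalone model, assuming VerticaPy's LogisticRegression estimator in verticapy.learn.linear_model accepts these keyword arguments:
from verticapy.learn.linear_model import LogisticRegression

# Hyperparameters copied from the "Bayesian Search Selected Model Parameters" line
# in the search log above; the model name is hypothetical.
model = LogisticRegression("titanic_enet_logit",
                           solver="cgd",
                           penalty="enet",
                           C=1e-05,
                           l1_ratio=0.0,
                           max_iter=100,
                           tol=1e-06)
model.fit("public.titanic", ["age", "fare", "pclass"], "survived")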