## Metrics for Regression
| Function | Definition |
|---|---|
| aic_bic | Computes Akaike's Information Criterion (AIC) and the Bayesian Information Criterion (BIC). |
| anova_table | Computes the ANOVA Table. |
| explained_variance | Computes the Explained Variance. |
| max_error | Computes the Max Error. |
| mean_absolute_error | Computes the Mean Absolute Error. |
| mean_squared_error | Computes the Mean Squared Error. |
| mean_squared_log_error | Computes the Mean Squared Log Error. |
| median_absolute_error | Computes the Median Absolute Error. |
| r2_score | Computes the R2 Score. |
| regression_report / report | Computes a regression report combining multiple metrics (R², MSE, max error...). |
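As a rough guide to what several of the regression metrics above compute, here is a minimal sketch in plain Python. The function name `regression_metrics` and its signature are illustrative only, not the library's actual API; the formulas themselves (MSE, MAE, max error, R²) are the standard definitions.

```python
def regression_metrics(y_true, y_pred):
    """Illustrative reference formulas for a few metrics from the table above.

    Not the library's API; a hand-rolled sketch of the standard definitions.
    """
    n = len(y_true)
    residuals = [t - p for t, p in zip(y_true, y_pred)]

    mse = sum(r * r for r in residuals) / n           # mean_squared_error
    mae = sum(abs(r) for r in residuals) / n          # mean_absolute_error
    max_err = max(abs(r) for r in residuals)          # max_error

    # r2_score: 1 - (residual sum of squares / total sum of squares)
    mean_y = sum(y_true) / n
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    r2 = 1.0 - sum(r * r for r in residuals) / ss_tot

    return {"mse": mse, "mae": mae, "max_error": max_err, "r2": r2}


# Example: small toy vectors of true vs. predicted values.
scores = regression_metrics([3.0, 5.0, 7.0], [2.5, 5.0, 8.0])
print(scores)
```

A report-style function like `regression_report` typically just gathers several of these scalar metrics into one table, so each individual function remains the single source of truth for its formula.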
## Metrics for Classification
| Function | Definition |
|---|---|
| accuracy_score | Computes the Accuracy Score. |
| auc | Computes the Area Under the ROC Curve (ROC AUC). |
| classification_report | Computes a classification report using multiple metrics (AUC, accuracy, PRC AUC, F1...). |
| confusion_matrix | Computes the Confusion Matrix. |
| critical_success_index | Computes the Critical Success Index. |
| f1_score | Computes the F1 Score. |
| informedness | Computes the Informedness. |
| log_loss | Computes the Log Loss. |
| markedness | Computes the Markedness. |
| matthews_corrcoef | Computes the Matthews Correlation Coefficient. |
| multilabel_confusion_matrix | Computes the Multi-Label Confusion Matrix. |
| negative_predictive_score | Computes the Negative Predictive Score. |
| prc_auc | Computes the Area Under the Precision-Recall Curve (PRC AUC). |
| precision_score | Computes the Precision Score. |
| recall_score | Computes the Recall Score. |
| specificity_score | Computes the Specificity Score. |
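Most of the classification metrics above are derived from the four cells of the binary confusion matrix (TP, TN, FP, FN). The sketch below shows the standard formulas in plain Python; `classification_metrics` and its signature are illustrative, not the library's API, and it assumes binary labels encoded as 0/1.

```python
def classification_metrics(y_true, y_pred):
    """Illustrative formulas for binary metrics from the table above,
    derived from the confusion matrix. Assumes labels are 0/1; not the
    library's actual API.
    """
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)

    accuracy = (tp + tn) / len(pairs)                       # accuracy_score
    precision = tp / (tp + fp) if tp + fp else 0.0          # precision_score
    recall = tp / (tp + fn) if tp + fn else 0.0             # recall_score (TPR)
    specificity = tn / (tn + fp) if tn + fp else 0.0        # specificity_score (TNR)
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)                   # f1_score
    csi = tp / (tp + fp + fn) if tp + fp + fn else 0.0      # critical_success_index
    informedness = recall + specificity - 1                 # informedness (Youden's J)

    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "specificity": specificity, "f1": f1, "csi": csi,
            "informedness": informedness}


# Example: 6 samples, 4 correct predictions (one false positive, one false negative).
scores = classification_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0])
print(scores)
```

Ranking-based metrics such as `auc`, `prc_auc`, and `log_loss` differ in that they take predicted probabilities rather than hard 0/1 labels, so they cannot be read directly off a single confusion matrix.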
