# Metrics

## Regression
- Returns the AIC score.
- Computes the ANOVA table.
- Returns the BIC score.
- Computes the Explained Variance.
- Computes the Max Error.
- Computes the Mean Absolute Error.
- Computes the Mean Squared Error.
- Computes the Mean Squared Log Error.
- Computes the Median Absolute Error.
- Computes the input quantile of the Error.
- Computes the R2 score.
- Computes a regression report using multiple metrics to evaluate the model.
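As a concrete illustration, several of the regression metrics above can be computed directly from the residuals. The sketch below is a minimal pure-Python implementation, not tied to any particular library's API; the function name `regression_metrics` and the returned keys are illustrative only.

```python
def regression_metrics(y_true, y_pred):
    """Illustrative from-scratch versions of a few regression metrics:
    MSE, MAE, Max Error, Median Absolute Error, and R2."""
    n = len(y_true)
    errors = [yt - yp for yt, yp in zip(y_true, y_pred)]
    abs_errors = sorted(abs(e) for e in errors)

    mse = sum(e * e for e in errors) / n
    mae = sum(abs_errors) / n
    max_err = abs_errors[-1]
    # Median of the absolute errors (average the middle pair when n is even).
    median_ae = (abs_errors[n // 2] if n % 2 else
                 (abs_errors[n // 2 - 1] + abs_errors[n // 2]) / 2)

    # R2 = 1 - SS_res / SS_tot
    mean_true = sum(y_true) / n
    ss_tot = sum((yt - mean_true) ** 2 for yt in y_true)
    ss_res = sum(e * e for e in errors)
    r2 = 1 - ss_res / ss_tot

    return {"mse": mse, "mae": mae, "max_error": max_err,
            "median_ae": median_ae, "r2": r2}


print(regression_metrics([3.0, -0.5, 2.0, 7.0], [2.5, 0.0, 2.0, 8.0]))
```

A regression report, as described above, simply aggregates several of these metrics into one summary for the fitted model.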
## Classification
- Computes the Accuracy score.
- Computes the Balanced Accuracy.
- Computes the ROC AUC (Area Under Curve).
- Computes a classification report using multiple metrics (AUC, accuracy, PRC AUC, F1...).
- Computes the confusion matrix.
- Computes the Critical Success Index.
- Computes the Diagnostic Odds Ratio.
- Computes the False Discovery Rate.
- Computes the False Omission Rate.
- Computes the False Negative Rate.
- Computes the False Positive Rate.
- Computes the F1 score.
- Computes the Fowlkes–Mallows index.
- Computes the Informedness.
- Computes the Log Loss.
- Computes the Markedness.
- Computes the Matthews Correlation Coefficient.
- Computes the Negative Likelihood Ratio.
- Computes the Negative Predictive Score.
- Computes the Positive Likelihood Ratio.
- Computes the Precision Score.
- Computes the Prevalence Threshold.
- Computes the area under the curve (AUC) of a Precision-Recall (PRC) curve.
- Computes the Recall score.
- Computes the ROC AUC (Area Under Curve).
- Computes the Specificity score.
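Most of the classification metrics above derive from the four cells of the binary confusion matrix (TP, TN, FP, FN). The sketch below is a minimal pure-Python illustration of that derivation, not tied to any particular library's API; the function name `classification_metrics` and the returned keys are illustrative only.

```python
def classification_metrics(y_true, y_pred):
    """Derive several of the listed metrics from the binary confusion matrix."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)

    precision = tp / (tp + fp)          # Positive Predictive Value
    recall = tp / (tp + fn)             # Sensitivity / True Positive Rate
    specificity = tn / (tn + fp)        # True Negative Rate
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    balanced_accuracy = (recall + specificity) / 2
    f1 = 2 * precision * recall / (precision + recall)
    informedness = recall + specificity - 1

    # Matthews Correlation Coefficient.
    mcc = (tp * tn - fp * fn) / (
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5

    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy,
            "balanced_accuracy": balanced_accuracy, "f1": f1,
            "informedness": informedness, "mcc": mcc}


print(classification_metrics([1, 1, 1, 1, 0, 0, 0, 0],
                             [1, 1, 1, 0, 1, 0, 0, 0]))
```

Rate-style metrics such as the False Positive Rate (`1 - specificity`) or the False Negative Rate (`1 - recall`) follow from the same four counts; rank-based metrics such as ROC AUC, PRC AUC, and Log Loss additionally require predicted probabilities rather than hard labels.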