Tutorial by Examples

The AUROC is one of the most commonly used metrics to evaluate a classifier's performance. This section explains how to compute it. AUC (Area Under the Curve) is often used to mean AUROC, which is bad practice: AUC on its own is ambiguous (it could refer to any curve), while AUROC is not. Overview – Abb...
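
As a minimal sketch of how the AUROC can be computed in practice, assuming scikit-learn is available; the labels and confidence scores below are made-up illustrative values, not data from the text:

    # Sketch: AUROC for a binary classifier's confidence scores.
    # Requires scikit-learn; y_true / y_score are illustrative only.
    from sklearn.metrics import roc_auc_score

    y_true  = [0, 0, 1, 1, 0, 1, 1, 0]                    # known ground-truth classes
    y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.9, 0.5]   # confidence that each instance is positive

    auroc = roc_auc_score(y_true, y_score)
    print(f"AUROC = {auroc:.3f}")
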
A confusion matrix can be used to evaluate a classifier against a set of test data for which the true values are known. It is a simple tool that gives a good visual overview of the performance of the algorithm being used. A confusion matrix is represented as a table. In this example we w...
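
As a hedged sketch of how such a table can be built by hand, using plain Python and made-up predictions (the counts and layout are illustrative, not the example referred to above):

    # Sketch: a 2x2 confusion matrix from true and predicted labels.
    # y_true / y_pred are illustrative values only.
    y_true = [1, 0, 1, 1, 0, 0, 1, 0]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

    print("            predicted +  predicted -")
    print(f"actual +    {tp:^11}  {fn:^11}")
    print(f"actual -    {fp:^11}  {tn:^11}")
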
A Receiver Operating Characteristic (ROC) curve plots the TP-rate vs. the FP-rate as a threshold on the confidence of an instance being positive is varied. Algorithm for creating an ROC curve: sort the test-set predictions according to the confidence that each instance is positive, then step through ...
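
A minimal sketch of the stepping procedure described above, assuming distinct confidence scores; the function name roc_points and the (score, label) pairs are hypothetical illustrations, not taken from the text:

    # Sketch: sort by confidence, then sweep the threshold past one
    # instance at a time, recording an (FP-rate, TP-rate) point each step.
    def roc_points(scores, labels):
        """Return a list of (fpr, tpr) points for a binary classifier."""
        pos_total = sum(labels)
        neg_total = len(labels) - pos_total

        # Sort test-set predictions by confidence that the instance is positive.
        ranked = sorted(zip(scores, labels), key=lambda sl: sl[0], reverse=True)

        points = [(0.0, 0.0)]
        tp = fp = 0
        for score, label in ranked:   # each step lowers the threshold past one instance
            if label == 1:
                tp += 1
            else:
                fp += 1
            points.append((fp / neg_total, tp / pos_total))
        return points

    scores = [0.9, 0.8, 0.7, 0.55, 0.5, 0.4, 0.3, 0.1]   # illustrative confidences
    labels = [1,   1,   0,   1,    0,   1,   0,   0]     # illustrative true classes
    for fpr, tpr in roc_points(scores, labels):
        print(f"FPR={fpr:.2f}  TPR={tpr:.2f}")
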