
Tool to calculate statistical measures (sensitivity, specificity, precision, negative predictive value, etc.) from the counts of true positives, true negatives, false positives, and false negatives, the four values that make up a confusion matrix.


# Confusion Matrix

## Confusion Matrix

True Positive (TP): item declared TRUE that is in reality TRUE

False Positive (FP), or Type I error: item declared TRUE but in reality FALSE

True Negative (TN): item declared FALSE that is in reality FALSE

False Negative (FN), or Type II error: item declared FALSE but in reality TRUE

### What is a confusion matrix? (Definition)

A confusion matrix, also called an error matrix, is an evaluation tool usually presented as a 2×2 table containing 4 essential values used to statistically measure/evaluate a result, generally that of a classification model or artificial intelligence algorithm.

The 4 values are:

— the number of true positives (TP): the number of observations correctly predicted as positive.

— the number of false positives (FP): the number of actual negative observations incorrectly predicted as positive.

— the number of true negatives (TN): the number of observations correctly predicted as negative.

— the number of false negatives (FN): the number of actual positive observations incorrectly predicted as negative.

Example: TP:99, FP:1, TN:95, FN:5
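As a minimal sketch (not dCode's own implementation), the four counts can be tallied from paired predicted/actual boolean labels; the function name and toy data below are illustrative:

```python
def confusion_counts(predicted, actual):
    """Tally TP, FP, TN, FN from paired boolean labels (illustrative helper)."""
    tp = sum(p and a for p, a in zip(predicted, actual))          # declared TRUE, really TRUE
    fp = sum(p and not a for p, a in zip(predicted, actual))      # declared TRUE, really FALSE
    tn = sum(not p and not a for p, a in zip(predicted, actual))  # declared FALSE, really FALSE
    fn = sum(not p and a for p, a in zip(predicted, actual))      # declared FALSE, really TRUE
    return tp, fp, tn, fn

# Toy data: 3 true positives, 1 false positive, 2 true negatives, 1 false negative
predicted = [True, True, True, True, False, False, False]
actual    = [True, True, True, False, False, False, True]
print(confusion_counts(predicted, actual))  # (3, 1, 2, 1)
```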

### How to evaluate a confusion matrix?

The 4 values of the confusion matrix make it possible to calculate many other values of statistical interest:

— the true positive rate (TPR), also called sensitivity or recall: TPR = TP / (TP + FN)

— the true negative rate (TNR), also called specificity: TNR = TN / (FP + TN)

— the positive predictive value, also called precision: PPV = TP / (TP + FP)

— the negative predictive value: NPV = TN / (TN + FN)

— the false positive rate: FPR = FP / (FP + TN)

— the false negative rate: FNR = FN / (FN + TP)

— the false discovery rate: FDR = FP / (FP + TP)

— the false omission rate: FOR = FN / (FN + TN)

Additional indicators can also be useful, such as accuracy, ACC = (TP + TN) / (TP + TN + FP + FN), or the F1 score, F1 = 2×TP / (2×TP + FP + FN).
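The formulas above can be sketched in a single illustrative helper (`confusion_metrics` is a hypothetical name, not part of the dCode tool), applied here to the example counts TP:99, FP:1, TN:95, FN:5:

```python
def confusion_metrics(tp, fp, tn, fn):
    """Derive the usual rates from the four confusion-matrix counts."""
    return {
        "TPR (sensitivity/recall)": tp / (tp + fn),
        "TNR (specificity)":        tn / (fp + tn),
        "PPV (precision)":          tp / (tp + fp),
        "NPV":                      tn / (tn + fn),
        "FPR":                      fp / (fp + tn),
        "FNR":                      fn / (fn + tp),
        "FDR":                      fp / (fp + tp),
        "FOR":                      fn / (fn + tn),
        "ACC (accuracy)":           (tp + tn) / (tp + tn + fp + fn),
        "F1 score":                 2 * tp / (2 * tp + fp + fn),
    }

for name, value in confusion_metrics(tp=99, fp=1, tn=95, fn=5).items():
    print(f"{name}: {value:.4f}")
```

Note that the denominators differ metric by metric: a high TPR with a low PPV, for instance, signals a model that finds most positives but raises many false alarms.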

### How does the Confusion Matrix help improve models?

By providing a detailed view of classification errors, the confusion matrix helps to understand the strengths and weaknesses of a model, thus improving the interpretability of its results.

## Source code

dCode retains ownership of the "Confusion Matrix" source code. Unless an explicit open source licence is indicated (Creative Commons / free), the "Confusion Matrix" algorithm, the applet or snippet (converter, solver, encryption / decryption, encoding / decoding, ciphering / deciphering, breaker, translator), or the "Confusion Matrix" functions (calculate, convert, solve, decrypt / encrypt, decipher / cipher, decode / encode, translate) written in any programming language (Python, Java, PHP, C#, Javascript, Matlab, etc.), and all data download, script, or API access for "Confusion Matrix", are not public; the same applies to offline use on PC, mobile, tablet, iPhone or Android apps!
Reminder: dCode is free to use.

## Cite dCode

Copying and pasting the page "Confusion Matrix" or any of its results is allowed (even for commercial purposes) as long as you credit dCode!
Exporting results as a .csv or .txt file is free by clicking on the export icon.
Cite as source (bibliography):
Confusion Matrix on dCode.fr [online website], retrieved on 2024-06-14, https://www.dcode.fr/confusion-matrix

## Need Help?

Please, check our dCode Discord community for help requests!
NB: for encrypted messages, test our automatic cipher identifier!