Confusion Matrix & Cyber Crime

A confusion matrix is a table that is often used to describe the performance of a classification model (or “classifier”) on a set of test data for which the true values are known. The confusion matrix itself is relatively simple to understand, but the related terminology can be confusing.
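For a binary classifier, the matrix has four cells. By convention, the rows hold the actual classes and the columns hold the predicted classes:

                    Predicted: Positive      Predicted: Negative
Actual: Positive    True Positive (TP)       False Negative (FN)
Actual: Negative    False Positive (FP)      True Negative (TN)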

Let’s understand the four outcomes: TP, TN, FP, and FN.

◼ True Positive (TP): You predicted positive, and the prediction is correct.

◼ True Negative (TN): You predicted negative, and the prediction is correct.

◼ False Positive (FP, Type 1 Error): You predicted positive, but the prediction is wrong.

◼ False Negative (FN, Type 2 Error): You predicted negative, but the prediction is wrong.

Just remember: Positive and Negative describe the predicted class, while True and False describe whether that prediction matches the actual value.
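Here is a minimal sketch of how these four counts can be pulled out of a confusion matrix, assuming scikit-learn is available and using made-up labels where 1 is the positive class:

```python
from sklearn.metrics import confusion_matrix

# Made-up ground truth and predictions; 1 = positive, 0 = negative
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

# For binary labels, ravel() flattens the 2x2 matrix in the order TN, FP, FN, TP
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp}, TN={tn}, FP={fp}, FN={fn}")  # TP=4, TN=4, FP=1, FN=1
```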

What can we learn from this matrix?

Precision: Of all the cases the model predicted as positive, how many are actually positive? Precision = TP / (TP + FP).

Recall: Of all the cases that are actually positive, how many did the model catch? Recall = TP / (TP + FN).

F1-Score: The harmonic mean of precision and recall, useful when you need a single number that balances the two. F1 = 2 × Precision × Recall / (Precision + Recall).

Accuracy: The proportion of all predictions that are correct. Accuracy = (TP + TN) / (TP + TN + FP + FN).
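All four metrics follow directly from the counts above. As a sketch, a small helper function (the function name is mine, introduced for illustration) might look like this:

```python
def classification_metrics(tp, tn, fp, fn):
    """Compute precision, recall, F1, and accuracy from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return precision, recall, f1, accuracy

# Counts from the earlier example: TP=4, TN=4, FP=1, FN=1
p, r, f1, acc = classification_metrics(tp=4, tn=4, fp=1, fn=1)
print(f"Precision={p:.2f} Recall={r:.2f} F1={f1:.2f} Accuracy={acc:.2f}")
# Precision=0.80 Recall=0.80 F1=0.80 Accuracy=0.80
```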

Using the Confusion Matrix to Monitor Cyber Attacks:

In the KDD Cup 99 intrusion-detection competition, participant entries were evaluated by the Cost Per Test (CPT), computed from the confusion matrix and a given cost matrix. In this setting:
• True Positive (TP): The number of attack records correctly detected as attacks.
• True Negative (TN): The number of normal records correctly identified as normal.
• False Positive (FP): The number of normal records wrongly flagged as attacks (false alarms).
• False Negative (FN): The number of attack records wrongly classified as normal (missed attacks).
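The actual KDD Cup 99 task used five classes (normal plus four attack categories) and a published 5x5 cost matrix; the sketch below shrinks this to two classes with assumed cost values, purely to show how Cost Per Test combines the two matrices:

```python
import numpy as np

# conf[i][j]: number of test records of actual class i predicted as class j.
# Rows/columns: 0 = normal, 1 = attack. Counts are illustrative, not KDD data.
conf = np.array([[900,  50],   # actual normal: 900 TN, 50 false alarms (FP)
                 [ 30, 120]])  # actual attack: 30 missed (FN), 120 detected (TP)

# cost[i][j]: penalty for predicting class j when the actual class is i.
# These values are assumptions; the real competition published its own matrix.
cost = np.array([[0, 1],       # false alarm costs 1
                 [2, 0]])      # missed attack costs 2; correct predictions cost 0

# Cost Per Test = total penalty over all test records / number of records
cpt = (conf * cost).sum() / conf.sum()
print(f"Cost Per Test = {cpt:.3f}")  # (50*1 + 30*2) / 1100 = 0.100
```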

In short, a confusion matrix is a tabular summary of a classifier’s correct and incorrect predictions. It is the foundation for measuring a classification model’s performance, since metrics such as accuracy, precision, recall, and F1-score are all computed from its four counts.

Thanks for reading. I hope you enjoyed the article.