
Confusion Matrix

“Your model says ‘SPAM’. But was it right? Your model says ‘NOT SPAM’. Was it correct? You need a scorecard. Not just one number. A full report of where your model wins and where it fails. That report is the Confusion Matrix.”

What Is a Confusion Matrix? #

For a yes/no problem, it is a 2×2 table that shows exactly what your model got right and wrong.

The Table:

                      Predicted: YES      Predicted: NO
  Actual: YES         TP (Hit)            FN (Miss)
  Actual: NO          FP (False Alarm)    TN (Correct Reject)

Four numbers. That is it.
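
To see the four cells as plain counts, here is a minimal sketch in Python. The labels are made up for illustration; 1 = YES, 0 = NO:

# Illustrative labels: 1 = YES, 0 = NO
y_true = [1, 1, 0, 0, 1, 0]   # what actually happened
y_pred = [1, 0, 0, 1, 1, 0]   # what the model said

pairs = list(zip(y_true, y_pred))
tp = pairs.count((1, 1))  # Hit
fn = pairs.count((1, 0))  # Miss
fp = pairs.count((0, 1))  # False Alarm
tn = pairs.count((0, 0))  # Correct Reject
print(tp, fn, fp, tn)     # 2 1 1 2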

The Four Numbers (Learn These) #

  Name                   Plain English        Another Name
  TP (True Positive)     Said YES, was YES    Hit
  TN (True Negative)     Said NO, was NO      Correct Reject
  FP (False Positive)    Said YES, was NO     Type I Error, False Alarm
  FN (False Negative)    Said NO, was YES     Type II Error, Miss

Memory Trick:

  • “True” = Model was correct
  • “False” = Model was wrong
  • “Positive” = Model said YES
  • “Negative” = Model said NO

So “False Positive” = Model was wrong when it said YES.

Real Example: Cancer Test #

1000 patients tested. 100 actually have cancer. 900 are healthy.

Model Results:

                      Predicted Cancer    Predicted Healthy
  Actually Cancer     80 (TP)             20 (FN)
  Actually Healthy    30 (FP)             870 (TN)

Read the matrix:

  • TP = 80 → Correctly caught 80 cancer patients
  • FN = 20 → Missed 20 cancer patients (dangerous!)
  • FP = 30 → Told 30 healthy people they have cancer (stressful)
  • TN = 870 → Correctly told 870 healthy people they are fine
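
A minimal sketch that rebuilds this exact matrix with scikit-learn (1 = cancer, 0 = healthy; the counts mirror the table above). Note that scikit-learn sorts the labels, so row and column 0 are "healthy" and row and column 1 are "cancer":

from sklearn.metrics import confusion_matrix

# 100 cancer patients: 80 caught (TP), 20 missed (FN)
# 900 healthy patients: 30 false alarms (FP), 870 correct (TN)
y_true = [1] * 100 + [0] * 900
y_pred = [1] * 80 + [0] * 20 + [1] * 30 + [0] * 870
print(confusion_matrix(y_true, y_pred))
# [[870  30]
#  [ 20  80]]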

Why Bother? #

One number like Accuracy hides the truth.

Accuracy says 95% (870+80 = 950 correct out of 1000).

But you missed 20 sick people. That could kill someone.

Confusion matrix shows you the problem.
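
A quick arithmetic check using the four cells from the cancer example:

tp, fn, fp, tn = 80, 20, 30, 870
accuracy = (tp + tn) / (tp + tn + fp + fn)  # 950 / 1000 = 0.95
miss_rate = fn / (tp + fn)                  # 20 / 100 = 0.20 of sick patients missed
print(accuracy, miss_rate)                  # 0.95 0.2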

Every Metric Comes From Here #

  You Want                      Look At
  Correct predictions           TP + TN
  False alarms                  FP
  Missed catches                FN
  Trust when model says YES     TP / (TP + FP) → Precision
  How many caught               TP / (TP + FN) → Recall
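
Both formulas drop straight out of the matrix. A minimal sketch with the cancer numbers:

tp, fp, fn = 80, 30, 20
precision = tp / (tp + fp)          # 80 / 110 ≈ 0.727: trust when the model says YES
recall = tp / (tp + fn)             # 80 / 100 = 0.80: share of sick patients caught
print(round(precision, 3), recall)  # 0.727 0.8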

One Line of Code #

from sklearn.metrics import confusion_matrix

# y_true = actual labels, y_pred = the model's predictions
print(confusion_matrix(y_true, y_pred))
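
If you want the four numbers by name, flatten the returned array; for binary 0/1 labels the order is TN, FP, FN, TP (this unpacking pattern appears in scikit-learn's own docs):

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tp, fn, fp, tn)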

Key Takeaways (3 Lines) #

  1. Confusion matrix shows exactly what your model got right and wrong.
  2. Two errors matter: False alarms (FP) and Missed catches (FN).
  3. Always check the matrix before trusting accuracy.