Page 138 - FULL REPORT 30012024
5.1.1 Confusion Matrix
The confusion matrix is a critical evaluation tool for classification models:
a table that summarises how a model's predictions compare against the known
true values of a dataset. The matrix is segmented into four parts: true
positives (TP), true negatives (TN), false positives (FP), and false
negatives (FN). True positives and true negatives are observations correctly
predicted as occurring (stroke) and not occurring (no stroke), respectively.
Conversely, false positives occur when the model incorrectly predicts an
event (stroke) that did not happen, while false negatives are cases where the
model fails to predict an event that actually occurred. Table 5.1 presents
the confusion matrix counts and Figure 5.1 depicts the corresponding heatmap.
Table 5.1 Result of Confusion Matrix

                                 Actual Class
                              Positive    Negative
  Predicted Class  Positive   TP = 474    FP = 190
                   Negative   FN = 65     TN = 570
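As a rough sketch (the report does not compute summary scores in this
section, so the metric choices below are illustrative assumptions using the
standard definitions), the counts in Table 5.1 can be turned into the usual
classification metrics in a few lines of Python:

```python
# Counts taken directly from Table 5.1; the derived metrics below are
# standard definitions, not figures reported by the authors.
TP, FP, FN, TN = 474, 190, 65, 570
total = TP + FP + FN + TN

accuracy = (TP + TN) / total              # share of all predictions that are correct
precision = TP / (TP + FP)                # of predicted strokes, how many were real
recall = TP / (TP + FN)                   # of actual strokes, how many were caught
f1 = 2 * precision * recall / (precision + recall)

print(f"accuracy  = {accuracy:.4f}")
print(f"precision = {precision:.4f}")
print(f"recall    = {recall:.4f}")
print(f"f1        = {f1:.4f}")
```

Note that precision and recall treat the two error types (FP and FN)
asymmetrically, which matters here: a missed stroke (FN) is typically far
more costly than a false alarm (FP).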