Balanced accuracy is a metric we can use to assess the performance of a classification model.
It is calculated as:
Balanced accuracy = (Sensitivity + Specificity) / 2
where:
- Sensitivity: The “true positive rate” – the percentage of positive cases the model is able to detect.
- Specificity: The “true negative rate” – the percentage of negative cases the model is able to detect.
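As a minimal sketch, the formula translates directly into a few lines of Python (the function name, arguments, and example counts below are our own, chosen purely for illustration):

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Balanced accuracy from the four cells of a confusion matrix."""
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return (sensitivity + specificity) / 2

# Arbitrary example counts, just to show the function in use
print(balanced_accuracy(tp=40, fn=10, tn=80, fp=20))  # 0.8
```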
This metric is particularly useful when the two classes are imbalanced – that is, when one class appears much more frequently than the other.
The following example shows how to calculate balanced accuracy in practice and demonstrates why it’s such a useful metric.
Example: Calculating Balanced Accuracy
Suppose a sports analyst uses a logistic regression model to predict whether or not 400 different college basketball players get drafted into the NBA.
The following confusion matrix summarizes the predictions made by the model:

|                     | Predicted: Drafted | Predicted: Not Drafted | Total |
|---------------------|--------------------|------------------------|-------|
| Actual: Drafted     | 15 (TP)            | 5 (FN)                 | 20    |
| Actual: Not Drafted | 5 (FP)             | 375 (TN)               | 380   |
To calculate the balanced accuracy of the model, we’ll first calculate the sensitivity and specificity:
- Sensitivity: The “true positive rate” = 15 / (15 + 5) = 0.75
- Specificity: The “true negative rate” = 375 / (375 + 5) = 0.9868
We can then calculate the balanced accuracy as:
- Balanced accuracy = (Sensitivity + Specificity) / 2
- Balanced accuracy = (0.75 + 0.9868) / 2
- Balanced accuracy = 0.8684
The balanced accuracy for the model turns out to be 0.8684.
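We can verify this calculation in Python by reconstructing the 400 actual and predicted labels from the confusion matrix and passing them to scikit-learn's balanced_accuracy_score function (this assumes scikit-learn is installed; the encoding of 1 = drafted, 0 = not drafted is our own choice):

```python
import numpy as np
from sklearn.metrics import balanced_accuracy_score

# Rebuild the 400 labels from the confusion matrix counts:
# 15 TP, 5 FN, 375 TN, 5 FP (1 = drafted, 0 = not drafted)
y_true = np.array([1] * 15 + [1] * 5 + [0] * 375 + [0] * 5)
y_pred = np.array([1] * 15 + [0] * 5 + [0] * 375 + [1] * 5)

print(balanced_accuracy_score(y_true, y_pred))  # 0.8684...
```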
Note that the closer the balanced accuracy is to 1, the better the model is able to correctly classify observations.
In this example, the balanced accuracy is quite high, which tells us that the logistic regression model does a pretty good job of predicting whether or not college players will get drafted into the NBA.
In this scenario, since the classes are so imbalanced (20 players got drafted and 380 players did not), the balanced accuracy gives us a more realistic picture of how well the model performs than an overall accuracy metric does.
For example, we would calculate the accuracy of the model as:
- Accuracy = (TP + TN) / (TP + TN + FP + FN)
- Accuracy = (15 + 375) / (15 + 375 + 5 + 5)
- Accuracy = 0.975
The accuracy of the model is 0.975, which sounds extremely high.
However, consider a naive model that simply predicts that no player gets drafted. It would have an accuracy of 380 / 400 = 0.95 – only slightly lower than the accuracy of our model.
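A short sketch (again in Python with scikit-learn, using the same reconstructed labels as above) makes the contrast concrete – the naive model nearly matches our model on accuracy, but its balanced accuracy collapses to 0.5:

```python
import numpy as np
from sklearn.metrics import accuracy_score, balanced_accuracy_score

y_true = np.array([1] * 20 + [0] * 380)                       # 20 drafted, 380 not drafted
y_model = np.array([1] * 15 + [0] * 5 + [0] * 375 + [1] * 5)  # our model's predictions
y_naive = np.zeros(400, dtype=int)                            # always predicts "not drafted"

print(accuracy_score(y_true, y_model))           # 0.975
print(accuracy_score(y_true, y_naive))           # 0.95
print(balanced_accuracy_score(y_true, y_model))  # 0.8684...
print(balanced_accuracy_score(y_true, y_naive))  # 0.5
```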
The balanced accuracy score of 0.8684 gives us a better idea of how well the model is able to predict both classes.
That is, it gives us a better idea of how well the model is able to predict players who won’t get drafted and those who will get drafted.
Additional Resources
The following tutorials explain how to create a confusion matrix in different statistical software:
How to Create a Confusion Matrix in Excel
How to Create a Confusion Matrix in R
How to Create a Confusion Matrix in Python