
Accuracy is wrong when ignore_index is set #1691

Closed

dros1986 opened this issue Apr 5, 2023 · 2 comments · Fixed by #1821 · May be fixed by #2163


dros1986 commented Apr 5, 2023

🐛 Bug

When the ignore_index argument is set, MulticlassAccuracy reports an accuracy of zero for the ignored class but still includes that zero in the macro average.

To Reproduce

import torch
from torchmetrics.classification import MulticlassAccuracy

# simulate the output of a perfect predictor (i.e. preds == target)
target = torch.tensor([0, 1, 2, 0, 1, 2])
preds  = target

metric = MulticlassAccuracy(num_classes=3, average='none', ignore_index=0)
res = metric(preds, target)
print(res)
# it prints [0., 1., 1.]

metric = MulticlassAccuracy(num_classes=3, average='macro', ignore_index=0)
res = metric(preds, target)
print(res)
# it prints 0.6667 (= (0 + 1 + 1) / 3) instead of 1

Expected behavior

The macro average should exclude the ignored class, so the second call should print 1.0.
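
Until this is fixed, a possible workaround is to compute per-class accuracies with average='none' and average only over the non-ignored classes by hand (a minimal sketch against the public torchmetrics API; the keep mask is my own, not a library feature):

import torch
from torchmetrics.classification import MulticlassAccuracy

target = torch.tensor([0, 1, 2, 0, 1, 2])
preds = target  # perfect predictor, as above

# Per-class accuracies; the ignored class comes back as 0.
metric = MulticlassAccuracy(num_classes=3, average='none', ignore_index=0)
per_class = metric(preds, target)  # tensor([0., 1., 1.])

# Average only over the classes that are not ignored.
keep = torch.arange(3) != 0  # drop ignore_index=0
print(per_class[keep].mean())  # tensor(1.)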

Environment

  • TorchMetrics version: 0.11.4 (installed from pip)
  • Python version: 3.10.6, PyTorch version: 1.13.0
  • OS: Ubuntu 22.04
dros1986 added the bug / fix and help wanted labels on Apr 5, 2023

github-actions bot commented Apr 5, 2023

Hi! Thanks for your contribution, great first issue!


edumotya commented Oct 8, 2023

Actually, the macro accuracy is still wrong when ignore_index is set. This is not covered by the example above (where preds == target): the problem is that false positives for the ignore_index class are still taken into account. The metric used to work in older versions of torchmetrics, such as 0.9.3.

To Reproduce

import numpy as np
import torch
from torchmetrics.classification import Accuracy

# class 0 -> 100% TP
cm_00 = 100
cm_01 = 0
cm_02 = 0
# class 1 -> 90% TP + 10% class 0
cm_10 = 10
cm_11 = 90
cm_12 = 0
# class 2 -> 100% TP
cm_20 = 0
cm_21 = 0
cm_22 = 100

predictions = np.array(
    (
        cm_00 * [0]
        + cm_01 * [1]
        + cm_02 * [2]
        + cm_10 * [0]
        + cm_11 * [1]
        + cm_12 * [2]
        + cm_20 * [0]
        + cm_21 * [1]
        + cm_22 * [2]
    ),
    dtype=np.int64,  # np.int was removed in NumPy 1.24
)
targets = np.array(
    (
        cm_00 * [0]
        + cm_01 * [0]
        + cm_02 * [0]
        + cm_10 * [1]
        + cm_11 * [1]
        + cm_12 * [1]
        + cm_20 * [2]
        + cm_21 * [2]
        + cm_22 * [2]
    ),
    dtype=np.int64,  # np.int was removed in NumPy 1.24
)

metric = Accuracy(
    task="multiclass",
    num_classes=3,
    average="macro",
    ignore_index=0,
)
res = metric(
    torch.from_numpy(predictions),
    torch.from_numpy(targets),
)
print(res)
# it prints 0.6333 (= (0 + 0.9 + 1.0) / 3) instead of 0.95
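
As a sanity check, the expected 0.95 can be reproduced by computing each class's accuracy (per-class recall) by hand and averaging only over the non-ignored classes (macro_acc_excluding is a hypothetical helper written for this check, not part of torchmetrics):

# Hand-rolled macro accuracy that skips the ignored class,
# to verify the expected value of 0.95.
def macro_acc_excluding(preds, targets, num_classes, ignore_index):
    accs = []
    for c in range(num_classes):
        if c == ignore_index:
            continue
        mask = targets == c
        accs.append((preds[mask] == c).float().mean())
    return torch.stack(accs).mean()

print(macro_acc_excluding(
    torch.from_numpy(predictions),
    torch.from_numpy(targets),
    num_classes=3,
    ignore_index=0,
))
# tensor(0.9500): class 1 -> 90/100, class 2 -> 100/100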
