
HitrateMetric returns wrong results #1155

Closed
6 tasks done
vinnamkim opened this issue Apr 1, 2021 · 4 comments
Labels: bug (Something isn't working), help wanted (Extra attention is needed)

Comments

@vinnamkim

🐛 Bug Report

How To Reproduce

Code sample

import torch
from catalyst.metrics import HitrateMetric

metric = HitrateMetric(topk_args=(1, 10, 20))

# 10 samples, 100 items; item 0 is the single relevant item for every sample
logits = torch.randn([10, 100])
targets = torch.zeros_like(logits)
targets[:, 0] = 1.0

print(metric.update_key_value(logits, targets))

Results

Occasionally, it returns hitrate@{x1} > hitrate@{x2} for some x1 < x2.

{'hitrate01': 0.0, 'hitrate10': 0.030000001192092896, 'hitrate20': 0.019999999552965164, 'hitrate': 0.0}
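The inversion follows from dividing by k: once the single relevant item is inside the top-k, growing k only grows the denominator. A tiny sketch of that arithmetic (plain PyTorch, no Catalyst; the variable names are illustrative):

```python
import torch

# One row, one relevant item, ranked 5th by the model.
targets_sorted = torch.zeros(1, 20)
targets_sorted[0, 4] = 1.0

# Current behaviour: divide the hit count by k.
hit_at_10 = targets_sorted[:, :10].sum(dim=1) / 10  # 0.10
hit_at_20 = targets_sorted[:, :20].sum(dim=1) / 20  # 0.05

# The same single hit scores lower at the larger k — the reported inversion.
assert hit_at_10 > hit_at_20
```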

Expected behavior

Hitrate should satisfy hitrate@{x1} ≤ hitrate@{x2} for every x1 < x2, since the top-x1 predictions are a subset of the top-x2 predictions.

Environment

Collecting environment information...
Catalyst version: 21.03.2
PyTorch version: 1.7.1+cu110
Is debug build: No
CUDA used to build PyTorch: 11.0
TensorFlow version: 2.2.0-rc2
TensorBoard version: 2.2.0

OS: Ubuntu 18.04.5 LTS
GCC version: (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
CMake version: version 3.19.4

Python version: 3.8
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration: GPU 0: GeForce RTX 2080 Ti
Nvidia driver version: 450.102.04
cuDNN version: Could not collect

Versions of relevant libraries:
[pip] catalyst==21.3.2
[pip] gpytorch==1.0.1
[pip] numpy==1.18.1
[pip] pytorch-nlp==0.5.0
[pip] tensorboard==2.2.0
[pip] tensorboard-plugin-wit==1.6.0.post2
[pip] tensorboardX==2.1
[pip] tensorflow==2.2.0rc2
[pip] tensorflow-datasets==2.1.0
[pip] tensorflow-estimator==2.2.0rc0
[pip] tensorflow-metadata==0.21.1
[pip] torch==1.7.1+cu110
[pip] torchaudio==0.7.2
[pip] torchvision==0.8.2+cu110
[conda] blas                      1.0                         mkl  
[conda] catalyst                  21.3.2                   pypi_0    pypi
[conda] cudatoolkit               10.1.243             h6bb024c_0  
[conda] gpytorch                  1.0.1                    pypi_0    pypi
[conda] mkl                       2020.2                      256  
[conda] mkl-include               2020.2                      256  
[conda] mkl-service               2.3.0            py38he904b0f_0  
[conda] mkl_fft                   1.0.15           py38ha843d7b_0  
[conda] mkl_random                1.1.0            py38h962f231_0  
[conda] numpy                     1.18.1           py38h4f9e942_0  
[conda] numpy-base                1.18.1           py38hde5b4d6_1  
[conda] pytorch-nlp               0.5.0                    pypi_0    pypi
[conda] tensorboard               2.2.0                    pypi_0    pypi
[conda] tensorboard-plugin-wit    1.6.0.post2              pypi_0    pypi
[conda] tensorboardx              2.1                      pypi_0    pypi
[conda] tensorflow                2.2.0rc2                 pypi_0    pypi
[conda] tensorflow-datasets       2.1.0                    pypi_0    pypi
[conda] tensorflow-estimator      2.2.0rc0                 pypi_0    pypi
[conda] tensorflow-metadata       0.21.1                   pypi_0    pypi
[conda] torch                     1.7.1+cu110              pypi_0    pypi
[conda] torchaudio                0.7.2                    pypi_0    pypi
[conda] torchvision               0.8.2+cu110              pypi_0    pypi

Additional context

hits_score = torch.sum(targets_sort_by_outputs[:, :k], dim=1) / k

The number of hits should instead be divided by the number of relevant targets in each row, as follows.

hits_score = torch.sum(targets_sort_by_outputs[:, :k], dim=1) / targets.sum(dim=1)
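The corrected denominator can be sketched outside of Catalyst as follows (a minimal standalone reimplementation, not the library's code; `hitrate_at_k` is a hypothetical helper):

```python
import torch

def hitrate_at_k(outputs: torch.Tensor, targets: torch.Tensor, k: int) -> torch.Tensor:
    """Mean fraction of relevant items recovered in the top-k predictions per row."""
    # Reorder each row of targets by descending model score.
    order = outputs.argsort(dim=1, descending=True)
    targets_sorted = targets.gather(1, order)
    hits = targets_sorted[:, :k].sum(dim=1)
    # Divide by the number of relevant items per row, not by k, so that
    # hitrate@k is non-decreasing in k and bounded by 1.
    return (hits / targets.sum(dim=1)).mean()

torch.manual_seed(0)
logits = torch.randn(10, 100)
targets = torch.zeros_like(logits)
targets[:, 0] = 1.0

scores = [hitrate_at_k(logits, targets, k) for k in (1, 10, 20)]
# With the corrected denominator the sequence is monotonically non-decreasing.
assert all(a <= b for a, b in zip(scores, scores[1:]))
```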


@vinnamkim vinnamkim added bug Something isn't working help wanted Extra attention is needed labels Apr 1, 2021
@Scitator
Member

Scitator commented Apr 1, 2021

I think @zkid18 could help us a bit here.
I will dive into the issue over the weekend, that's for sure.
If you could submit a PR with these light changes and extra tests, that would be great!

@zkid18
Contributor

zkid18 commented Apr 4, 2021

@vinnamkim thanks for opening the issue.
I'll double-check the implementation today.

@zkid18 zkid18 mentioned this issue Apr 4, 2021
@Scitator
Member

Scitator commented Apr 5, 2021

@vinnamkim could you please check the PR and add your comments? let's make it clear together :)

@Scitator
Member

Fixed with 21.04 ;)
