
gradient_metrics



This package implements utilities for computing gradient metrics to measure uncertainty in neural networks, based on the paper "Classification Uncertainty of Deep Neural Networks Based on Gradient Information" (Oberdiek et al., 2018).
An application of these metrics can also be found in "On the Importance of Gradients for Detecting Distributional Shifts in the Wild" (Huang et al., 2021).

Documentation and examples can be found on GitHub Pages.


Installation

pip install gradient-metrics

Usage

Example of computing the maximum, minimum, mean and standard deviation of gradient entries as in Classification Uncertainty of Deep Neural Networks Based on Gradient Information:

from gradient_metrics import GradientMetricCollector
from gradient_metrics.metrics import Max, Min, MeanStd
import torch.nn.functional as tfunc

# Initialize a network
mynet = MyNeuralNetwork()

# Initialize the GradientMetricCollector
mcollector = GradientMetricCollector(
    [
        Max(mynet),
        Min(mynet),
        MeanStd(mynet),
    ]
)

# Predict your data
out = mynet(x)

# Construct pseudo labels
y_pred = out.argmax(1).clone().detach()

# Construct the sample wise loss for backpropagation
sample_loss = tfunc.cross_entropy(out, y_pred, reduction="none")

# Compute the gradient metrics
metrics = mcollector(sample_loss)
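
Both examples on this page assume that MyNeuralNetwork and the input batch x are defined elsewhere; they are not part of this package. A minimal, purely illustrative placeholder setup might look like this:

import torch
import torch.nn as nn

# Toy classifier standing in for MyNeuralNetwork (illustrative only)
class MyNeuralNetwork(nn.Module):
    def __init__(self, in_features=16, num_classes=4):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_features, 32),
            nn.ReLU(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x):
        return self.layers(x)

# Random batch of 8 samples standing in for x
x = torch.randn(8, 16)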

Example of computing the L1-Norm from On the Importance of Gradients for Detecting Distributional Shifts in the Wild:

from gradient_metrics import GradientMetricCollector
from gradient_metrics.metrics import PNorm
import torch
import torch.nn.functional as tfunc

# Initialize a network
mynet = MyNeuralNetwork()

# Initialize the GradientMetricCollector
mcollector = GradientMetricCollector(PNorm(mynet))

# Predict your data
out = mynet(x)

# Construct the sample wise loss for backpropagation
sample_loss = torch.log(tfunc.softmax(out, dim=1)).mean(1).neg()

# Compute the gradient metrics
metrics = mcollector(sample_loss)
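
The collector returns one set of metric values per input sample, which can then be fed into any downstream uncertainty or out-of-distribution score. As a purely illustrative sketch (assuming a single PNorm metric per sample and an arbitrary, uncalibrated threshold):

# Flag samples with a small gradient norm as potentially out-of-distribution,
# following the observation of Huang et al. (2021) that in-distribution data
# tends to produce gradients of larger magnitude.
# The threshold is purely illustrative and would normally be calibrated on
# held-out in-distribution data.
threshold = 0.1
scores = metrics.flatten()  # assumes one PNorm value per sample
is_ood = scores < threshold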

Contributing

Requirements:

Contributions in the form of PRs or issues are welcome. To install the development environment, run

make setup

Before opening your pull request, make sure that all tests pass in your local copy by running make test.

Citing

@inproceedings{OberdiekRG18,
  author    = {Philipp Oberdiek and
               Matthias Rottmann and
               Hanno Gottschalk},
  editor    = {Luca Pancioni and
               Friedhelm Schwenker and
               Edmondo Trentin},
  title     = {Classification Uncertainty of Deep Neural Networks Based on Gradient
               Information},
  booktitle = {Artificial Neural Networks in Pattern Recognition - 8th {IAPR} {TC3}
               Workshop, {ANNPR} 2018, Siena, Italy, September 19-21, 2018, Proceedings},
  series    = {Lecture Notes in Computer Science},
  volume    = {11081},
  pages     = {113--125},
  publisher = {Springer},
  year      = {2018},
  url       = {https://doi.org/10.1007/978-3-319-99978-4_9},
  doi       = {10.1007/978-3-319-99978-4_9},
}