
Make saliency interpreter GPU compatible #5656

Merged

merged 4 commits into main from fix-saliency-interpreter on Jun 20, 2022

Conversation

@AkshitaB (Contributor) commented on Jun 3, 2022

Fixes #5652.

Changes proposed in this pull request:

  • Detach tensors and move them to the CPU before converting them to numpy arrays in the saliency interpreter, so that saliency interpretation works when the model runs on a GPU (see the sketch below).
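
For context, here is a minimal sketch of the kind of fix the linked issue calls for. The helper name `_aggregate_token_embeddings` comes from the issue title; the signature, aggregation axis, and surrounding code are illustrative assumptions, not the repository's actual implementation:

```python
import numpy
import torch


def _aggregate_token_embeddings(embeddings: torch.Tensor) -> numpy.ndarray:
    # Hypothetical aggregation helper illustrating the GPU-compatibility fix
    # described in #5652. Calling .numpy() directly on a CUDA tensor (or on a
    # tensor that requires grad) raises an error, so the tensor is detached
    # from the autograd graph and copied to the CPU first.
    #
    # Before (fails on GPU):  numpy.mean(embeddings.numpy(), axis=-1)
    return numpy.mean(embeddings.detach().cpu().numpy(), axis=-1)


if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    emb = torch.randn(2, 5, 16, device=device, requires_grad=True)
    print(_aggregate_token_embeddings(emb).shape)  # (2, 5)
```

Using `.detach().cpu()` before `.numpy()` keeps the code device-agnostic: for a tensor already on the CPU, `.cpu()` is effectively a no-op, so the same path works with or without a GPU.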

Before submitting

  • I've read and followed all steps in the Making a pull request
    section of the CONTRIBUTING docs.
  • I've updated or added any relevant docstrings following the syntax described in the
    Writing docstrings section of the CONTRIBUTING docs.
  • If this PR fixes a bug, I've added a test that will fail without my fix.
  • If this PR adds a new feature, I've added tests that sufficiently cover my new functionality.

After submitting

  • All GitHub Actions jobs for my pull request have passed.
  • codecov/patch reports high test coverage (at least 90%).
    You can find this under the "Actions" tab of the pull request once the other checks have finished.

@AkshitaB requested review from dirkgr and epwalsh on June 17, 2022, 23:23
@AkshitaB merged commit df9d7ca into main on Jun 20, 2022
@AkshitaB deleted the fix-saliency-interpreter branch on June 20, 2022, 18:07
Development

Successfully merging this pull request may close these issues.

.numpy() command in _aggregate_token_embeddings isn't detaching from GPU into CPU (#5652, label: bug)
2 participants