
Update "phantom grad" to use the same forward-backward transformation code as "vjp" #364

Merged
merged 13 commits into from May 13, 2024

Conversation

@IvanYashchuk (Collaborator) commented May 3, 2024

This PR swaps thunder.transforms.grad with thunder.transforms.grad_v1. grad_v1 uses the same forward-backward transformation code as the main code path with PyTorch Autograd integration; the two transforms do the same thing and differ only in their implementation.

After this PR, I'm planning to remove the now-unused grad function code, which this PR renames to __grad.

The grad transform was added in 6069e6f, and it introduced a new way of defining forward-backward rules in a single Python function and a new way of registering a transformation on the jitted function.
The new way of defining forward-backward rules is now also supported in the main code path, so we no longer need that prototype.
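For readers unfamiliar with the pattern, "defining forward-backward rules in a single Python function" typically means the forward pass returns both its result and a backward closure that reuses values from the forward computation. The sketch below is a generic, illustrative registry, not Thunder's actual API; the `GRAD_RULES` dict, `register_grad_rule` helper, and `sigmoid_grad_rule` names are hypothetical.

```python
import math

# Hypothetical registry mapping an operation name to its combined
# forward-backward rule (illustrative only; not Thunder's actual API).
GRAD_RULES = {}

def register_grad_rule(op_name):
    def decorator(rule):
        GRAD_RULES[op_name] = rule
        return rule
    return decorator

@register_grad_rule("sigmoid")
def sigmoid_grad_rule(x):
    # Forward computation.
    result = 1.0 / (1.0 + math.exp(-x))

    # Backward rule defined in the same function, closing over the
    # forward result: d(sigmoid)/dx = sigmoid(x) * (1 - sigmoid(x)).
    def backward(grad_output):
        return grad_output * result * (1.0 - result)

    return result, backward

# Usage: run the forward pass, then apply the returned backward function.
y, y_backward = GRAD_RULES["sigmoid"](0.0)
print(y)                # 0.5
print(y_backward(1.0))  # 0.25
```

The appeal of this style is that the forward and backward logic live side by side and can share intermediate values through the closure, instead of being split across two separately registered functions.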

Test plan:

  • All existing "phantom grad" tests should still pass: pytest thunder/tests/test_grad.py -k "phantom_grad".

@IvanYashchuk (Collaborator, Author) commented:

All checks have passed.

@mruberry, could you please help merge this PR if the changes make sense?

@mruberry (Collaborator) left a comment


@mruberry enabled auto-merge (squash) May 13, 2024 15:36
@mruberry merged commit 0406df2 into main May 13, 2024
37 checks passed
@mruberry deleted the remap-grad branch May 13, 2024 15:47