
Add numerical test comparing BetterDecoder and fairseq decoder #79576

Closed · wants to merge 1 commit

Conversation

@erichan1 (Member) commented Jun 14, 2022

Summary:
Add a new file, test_transformers.py, to hold transformer tests and move away from the huge monolithic test_nn.py. A TODO item is to move the existing transformer tests from test_nn.py to test_transformers.py.

Add a numerical test comparing torch.nn._transformer_decoder_layer_fwd and the fairseq decoder. Both decoders use the weights of a common nn.TransformerEncoder. The test covers both forced decoding and incremental decoding.
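The forced-vs-incremental equivalence check can be sketched with public PyTorch APIs. This is not the PR's actual test (which calls the private torch.nn._transformer_decoder_layer_fwd and compares against fairseq); it is a minimal illustration of the same idea using nn.TransformerDecoderLayer, with illustrative shapes and no KV cache. With a causal mask, no dropout, and eval mode, running the layer once over the full target (forced decoding) should match re-running it on each growing prefix and keeping the newest position (incremental decoding).

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
d_model, nhead, seq_len, bsz = 16, 4, 5, 2  # illustrative sizes

# dropout=0.0 and eval() make the two decoding modes deterministic
layer = nn.TransformerDecoderLayer(d_model, nhead, dim_feedforward=32,
                                   dropout=0.0, batch_first=True)
layer.eval()

tgt = torch.randn(bsz, seq_len, d_model)
memory = torch.randn(bsz, 3, d_model)  # stand-in for encoder output

def causal_mask(t):
    # float mask: -inf above the diagonal blocks attention to future positions
    return torch.triu(torch.full((t, t), float("-inf")), diagonal=1)

with torch.no_grad():
    # Forced decoding: one pass over the full target sequence.
    forced = layer(tgt, memory, tgt_mask=causal_mask(seq_len))

    # Incremental decoding: re-run on each growing prefix and keep only the
    # newest position's output (recomputes the prefix each step, for clarity).
    steps = [layer(tgt[:, :t], memory, tgt_mask=causal_mask(t))[:, -1:]
             for t in range(1, seq_len + 1)]
    incremental = torch.cat(steps, dim=1)

# The two decoding schedules should agree up to floating-point noise.
torch.testing.assert_close(forced, incremental, rtol=1e-4, atol=1e-5)
```

The tolerances are loose because the full-sequence and per-prefix matmuls accumulate in different orders; bitwise equality is not expected.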

Stacked on top of #79438

Test Plan:
```
buck build --show-output mode/opt -c fbcode.enable_gpu_sections=true -c fbcode.nvcc_arch=a100 mode/inplace //caffe2/test:transformers

./fbcode/buck-out/gen/caffe2/test/transformers#binary.par
```
Test runs and passes!

Differential Revision: D37157391

@facebook-github-bot (Contributor) commented Jun 14, 2022

✅ No Failures (0 Pending) as of commit b4d1fbc (more details on the Dr. CI page).

💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI.

@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D37157391

@erichan1 erichan1 requested a review from zrphercule June 14, 2022 23:11
@erichan1 (Member, Author)

Corresponding test for #79438

@zrphercule (Contributor) left a comment

LGTM! Thanks!

@erichan1 (Member, Author) commented Jun 15, 2022

> LGTM! Thanks!

@zrphercule Can you also approve #79438? It's the PR this one is stacked on top of.

@erichan1 (Member, Author)

@pytorchbot merge -g

@pytorchmergebot (Collaborator)

@pytorchbot successfully started a merge job. Check the current status here

@pytorchmergebot (Collaborator)

Merge failed: this PR has internal changes and must be landed via Phabricator.
Raised by https://github.com/pytorch/pytorch/actions/runs/2505610792

erichan1 added a commit to erichan1/pytorch that referenced this pull request Jun 17, 2022
…ch#79576)

Summary:
Pull Request resolved: pytorch#79576

Includes both forced and incremental decoding.

Add a new file, test_transformers.py, to hold transformer tests and move away from the huge monolithic test_nn.py. A TODO item is to move the existing transformer tests from test_nn.py to test_transformers.py.

Add a numerical test comparing torch.nn._transformer_decoder_layer_fwd and the fairseq decoder. Both decoders use the weights of a common nn.TransformerEncoder.

Test Plan:
```
buck build --show-output mode/opt -c fbcode.enable_gpu_sections=true -c fbcode.nvcc_arch=a100 mode/inplace  //caffe2/test:transformers

./fbcode/buck-out/gen/caffe2/test/transformers#binary.par
```
Test runs and passes!

Differential Revision: D37157391

fbshipit-source-id: 3b3f1c7fdac8269278982e0dcc2d32ff6b63547d
A second commit with the same message was also added (Reviewed By: mikekgfb; Differential Revision: D37157391; fbshipit-source-id: 18ff84ba22e8b92208d4af97f266df0752c80d54).
@erichan1 (Member, Author)

@pytorchbot merge

@pytorchmergebot (Collaborator)

@pytorchbot successfully started a merge job. Check the current status here

@pytorchmergebot (Collaborator)

Merge failed: this PR has internal changes and must be landed via Phabricator.
Raised by https://github.com/pytorch/pytorch/actions/runs/2518175005

@erichan1 (Member, Author)

Closing because the code in this PR was already committed in #79796.

@erichan1 erichan1 closed this Jun 20, 2022
This was referenced Jul 21, 2022