
Deberta tf #12972

Merged
merged 11 commits into from Aug 12, 2021

Conversation

kamalkraj
Contributor

@kamalkraj kamalkraj commented Aug 1, 2021

What does this PR do?

TFDeBERTa implementation
@patrickvonplaten, @LysandreJik

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Collaborator

@sgugger sgugger left a comment


Thanks for your PR! My main concern is the added dependency, but I think we don't really need it. I've left pointers on how to avoid using it.

Outdated review threads (all resolved) on:
  • setup.py
  • src/transformers/__init__.py
  • src/transformers/models/deberta/modeling_tf_deberta.py (8 threads)
@sgugger
Collaborator

sgugger commented Aug 6, 2021

As a result of #13023, you will need to rebase your PR on master and resolve the merge conflicts (basically, you will just need to re-add the models to the auto-mappings as strings). Let us know if you need any help with that.

Member

@LysandreJik LysandreJik left a comment


This looks good! I'd love for @Rocketknight1 to go over the TF code, and pinging @BigBird01 as he's the author of the model and contributed the PyTorch version.

Outdated review threads (all resolved) on:
  • src/transformers/models/deberta/modeling_tf_deberta.py (6 threads)
@BigBird01
Contributor

Glad to see a tf version! Thank you!

moved weights to build and fixed name scope

added missing ,

bug fixes to enable graph mode execution

updated setup.py

fixing typo

fix imports

embedding mask fix

added layer names to avoid automatic incremental names

+XSoftmax

cleanup

added names to layer

disable keras_serializable
Disentangled attention output shape hidden_size==None
using symbolic inputs

test for Deberta tf

make style

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Update src/transformers/models/deberta/modeling_tf_deberta.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

removed tensorflow-probability

removed blank line
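The "+XSoftmax" and "removed tensorflow-probability" commits above concern DeBERTa's masked softmax, which the PR ultimately implements with plain TF ops instead of an extra dependency. As a rough illustration of the semantics only (not the actual TF implementation), here is a minimal NumPy sketch of a masked softmax in which padded positions receive exactly zero probability:

```python
import numpy as np

def masked_softmax(scores: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Softmax over the last axis that ignores masked positions.

    `scores` are raw attention logits; `mask` is 1.0 for valid
    positions and 0.0 for padding. Masked slots end up with exactly
    zero probability rather than a tiny residual one.
    """
    # Push masked logits to the most negative finite value so that
    # exp() sends them to zero.
    neg_inf = np.finfo(scores.dtype).min
    masked = np.where(mask.astype(bool), scores, neg_inf)
    # Subtract the row max for numerical stability.
    shifted = masked - masked.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=-1, keepdims=True)
    # Re-apply the mask so masked slots are exactly 0.
    return probs * mask

scores = np.array([[2.0, 1.0, 0.5]])
mask = np.array([[1.0, 1.0, 0.0]])
print(masked_softmax(scores, mask))
```

The third position is masked out, so the remaining probability mass is split between the first two positions and still sums to one.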
Quoted diff context:

    self.dropout = StableDropout(config.hidden_dropout_prob, name="dropout")

    def build(self, input_shape: tf.TensorShape):
        with tf.name_scope("word_embeddings"):
Contributor


not a huge fan of the tf.name_scope(...) here as it makes the weight naming less flexible - could we instead create three new tf.keras.layers.Layer classes (one for each embedding) to have a 1-to-1 translation from PyTorch?
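For context, the alternative being suggested here would look roughly like the following. This is a hypothetical sketch (class and weight names are assumptions, not the PR's actual code): one dedicated layer per embedding table, so each weight gets a stable, explicit name without tf.name_scope.

```python
import tensorflow as tf

class TFDebertaWordEmbeddings(tf.keras.layers.Layer):
    """Hypothetical standalone word-embedding layer, mirroring a
    PyTorch nn.Embedding 1-to-1 instead of using tf.name_scope."""

    def __init__(self, vocab_size: int, hidden_size: int, **kwargs):
        super().__init__(**kwargs)
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size

    def build(self, input_shape):
        # Creating the weight inside a dedicated layer gives it a
        # clean, predictable name under this layer's own scope.
        self.weight = self.add_weight(
            name="weight",
            shape=(self.vocab_size, self.hidden_size),
            initializer="glorot_uniform",
        )
        super().build(input_shape)

    def call(self, input_ids):
        # Look up one embedding row per token id.
        return tf.gather(self.weight, input_ids)

layer = TFDebertaWordEmbeddings(vocab_size=10, hidden_size=4, name="word_embeddings")
out = layer(tf.constant([[1, 2, 3]]))
print(out.shape)  # (1, 3, 4)
```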


Contributor


Sorry - I just noticed that this has become the standard in modeling_tf_bert.py as well:

    def build(self, input_shape: tf.TensorShape):

=> all good then! Please ignore my previous comment :-)
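The pattern being accepted here - creating all embedding weights inside build() under explicit name scopes - can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code; the class, scope, and weight names are assumptions:

```python
import tensorflow as tf

class TFDebertaEmbeddings(tf.keras.layers.Layer):
    """Sketch of the build() + tf.name_scope pattern: every
    embedding table is created in build(), each under a name
    scope matching the corresponding PyTorch submodule."""

    def __init__(self, vocab_size, hidden_size, max_position, **kwargs):
        super().__init__(**kwargs)
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size
        self.max_position = max_position

    def build(self, input_shape):
        # Each name_scope groups the variable where the matching
        # PyTorch submodule would place it in the state dict.
        with tf.name_scope("word_embeddings"):
            self.word_embeddings = self.add_weight(
                name="weight",
                shape=(self.vocab_size, self.hidden_size),
                initializer="zeros",
            )
        with tf.name_scope("position_embeddings"):
            self.position_embeddings = self.add_weight(
                name="embeddings",
                shape=(self.max_position, self.hidden_size),
                initializer="zeros",
            )
        super().build(input_shape)

    def call(self, input_ids):
        seq_len = tf.shape(input_ids)[-1]
        # Token embeddings plus (broadcast) position embeddings.
        inputs_embeds = tf.gather(self.word_embeddings, input_ids)
        pos = tf.gather(self.position_embeddings, tf.range(seq_len))
        return inputs_embeds + pos

emb = TFDebertaEmbeddings(vocab_size=10, hidden_size=4, max_position=8)
print(emb(tf.constant([[1, 2, 3]])).shape)  # (1, 3, 4)
```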

Contributor

@patrickvonplaten patrickvonplaten left a comment


Thanks for the great PR! It seems to be in very good shape already. I mostly left nits, but would be happy if we could:

  • Add a TFDeberta prefix to most layer classes. This helps when looking up modules later (I know that in PyTorch we also didn't append a Deberta prefix to all modules, but we should have done so, IMO).
  • Avoid using with tf.name_scope and instead replicate the PyTorch weight structure 1-to-1.

Member

@LysandreJik LysandreJik left a comment


Thank you, @kamalkraj! Would you be interested in contributing the TensorFlow version of the DeBERTa-v2 model too? :)

@LysandreJik LysandreJik merged commit d329b63 into huggingface:master Aug 12, 2021
@kamalkraj
Contributor Author

@LysandreJik
Yes, I am interested in contributing the DeBERTa-v2 model as well.

@kamalkraj kamalkraj mentioned this pull request Aug 13, 2021
@kamalkraj kamalkraj deleted the deberta-tf branch September 11, 2021 10:31

6 participants