myrainbowandsky/classicpapers

Classic NLP papers:

Survey: Efficient Transformers: A Survey, 2020: https://arxiv.org/abs/2009.06732

Evaluating efficient self-attention: Long Range Arena: A Benchmark for Efficient Transformers, 2020: https://arxiv.org/abs/2011.04006

Convolution as a special case of self-attention (the paper shows that multi-head self-attention with relative positional encodings can express any convolutional layer): On the Relationship between Self-Attention and Convolutional Layers, 2019: https://arxiv.org/abs/1911.03584

Self-attention as an RNN: Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention, 2020: https://arxiv.org/abs/2006.16236
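
As a rough illustration of that equivalence, here is a minimal NumPy sketch (not from the paper's code) of causal linear attention computed step by step: the running sums S and z act as recurrent state, so each time step is an RNN update. The feature map elu(x) + 1 follows the paper; all names, shapes, and the eps constant are illustrative.

```python
import numpy as np

def elu_feature_map(x):
    # phi(x) = elu(x) + 1, the feature map suggested in the paper
    return np.where(x > 0, x + 1.0, np.exp(np.minimum(x, 0.0)))

def causal_linear_attention(Q, K, V, eps=1e-6):
    """Causal linear attention computed as a recurrence.

    Q, K: (T, d); V: (T, d_v). The running sums S and z are the
    recurrent state, which is what makes this transformer an RNN.
    """
    Q, K = elu_feature_map(Q), elu_feature_map(K)
    d, d_v = Q.shape[1], V.shape[1]
    S = np.zeros((d, d_v))          # running sum of phi(k_t) v_t^T
    z = np.zeros(d)                 # running sum of phi(k_t)
    out = np.zeros((V.shape[0], d_v))
    for t in range(Q.shape[0]):
        S += np.outer(K[t], V[t])   # RNN-style state update
        z += K[t]
        out[t] = (Q[t] @ S) / (Q[t] @ z + eps)
    return out
```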

The original attention mechanism for NMT: Neural Machine Translation by Jointly Learning to Align and Translate, 2015: https://arxiv.org/abs/1409.0473
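
For reference, the additive alignment model this paper introduces, as usually written, with decoder state s_{i-1} and encoder annotations h_j:

```latex
e_{ij} = v_a^\top \tanh(W_a s_{i-1} + U_a h_j), \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}, \qquad
c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j
```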

The Transformer: Attention Is All You Need, 2017: https://arxiv.org/abs/1706.03762
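
A minimal NumPy sketch of the scaled dot-product attention at the core of the Transformer, softmax(Q K^T / sqrt(d_k)) V; the function name and shapes here are illustrative, not from the paper's code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_q, d_k); K: (n_k, d_k); V: (n_k, d_v).
    """
    scores = Q @ K.T / np.sqrt(Q.shape[-1])         # (n_q, n_k)
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # (n_q, d_v)
```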

Translation with an attention-based encoder-decoder architecture: Massive Exploration of Neural Machine Translation Architectures, 2017: https://arxiv.org/pdf/1703.03906.pdf

Translation with seq2seq: Sequence to Sequence Learning with Neural Networks, 2014: https://arxiv.org/abs/1409.3215

Attention explainability: Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, 2021: https://arxiv.org/pdf/2103.15679.pdf
