
TensorFlow 2.14 degradation in WER #1511

Open
albertz opened this issue Mar 28, 2024 · 2 comments

Comments

@albertz
Member

albertz commented Mar 28, 2024

I just wanted to track this here: there seems to be a WER degradation in a setup by @Marvin84 that occurs with TensorFlow 2.14 but not with earlier versions (although, among earlier versions, this specific setup was only tested with TF 2.3). From feedback within the group, TF 2.8 has generally been reported to be a good/safe version.

From @Marvin84:

The comparison is done under exactly the same training conditions for a Conformer diphone factored hybrid HMM, Viterbi-trained for 15 epochs on LibriSpeech 960h, using an alignment from a from-scratch monophone posterior-HMM. Both decodings use a 4-gram LM. Both use a recent RETURNN version. The decoding has been tuned fairly for both results.

On dev-other, the models trained with TF 2.3 / Ubuntu 16 and TF 2.14 / Ubuntu 22 reach 6.6% and 7.7% WER, respectively.
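Since the two runs differ in both the TF version and the OS, it is probably worth attaching a full environment snapshot to each result. Below is a minimal sketch (not part of the original report) that logs the version info relevant here; it only uses standard TensorFlow APIs (`tf.__version__`, `tf.sysconfig.get_build_info()`, `tf.config.list_physical_devices`).

```python
# Environment snapshot to attach when comparing training runs across TF versions.
import platform
import sys

import tensorflow as tf

build = tf.sysconfig.get_build_info()  # build metadata of the installed TF wheel

print("Python:", sys.version.split()[0])
print("OS:", platform.platform())
print("TensorFlow:", tf.__version__)
print("CUDA (build):", build.get("cuda_version"))
print("cuDNN (build):", build.get("cudnn_version"))
print("GPUs:", tf.config.list_physical_devices("GPU"))
```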

@Marvin84

This comment was marked as duplicate.

@Marvin84

This comment was marked as duplicate.
