TVM expects a tokenizer.json for HF models, but some HF repos (e.g. Helsinki-NLP/opus-mt-it-en) contain only SentencePiece .spm model files #16671
Labels
needs-triage
type: bug
How can I supply a tokenizer.json file to TVM for translation inference (e.g. with Helsinki-NLP/opus-mt-it-en)? The repo ships only SentencePiece .spm files, and a plain .json config does not contain the merge information or the learned SentencePiece weights.
Please help to resolve this issue.