HuggingFace version #2
Comments
Hi Manuel,
That would be awesome! Looking forward to hearing more about your research.
@ManuelFay Hi Manuel,
Any idea of simple inference code for using the fine-tuned model?
Hi folks (cc @jpWang), happy to share that I've got a working HuggingFace implementation. I'm able to reproduce the results (F1 of 88% on FUNSD). The model is here: https://huggingface.co/nielsr/lilt-roberta-en-base-finetuned-funsd I'll open a PR soon :)
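For reference, a minimal sketch of what inference with that checkpoint could look like (assuming the Transformers integration lands; `normalize_box` is an illustrative helper, not something from the LiLT repo — LiLT, like LayoutLM-family models, expects bounding boxes normalized to the 0-1000 range):

```python
def normalize_box(box, width, height):
    """Scale an absolute (x0, y0, x1, y1) pixel box to the 0-1000 range LiLT expects."""
    x0, y0, x1, y1 = box
    return [
        int(1000 * x0 / width),
        int(1000 * y0 / height),
        int(1000 * x1 / width),
        int(1000 * y1 / height),
    ]

# Example: one word box on a 500x1000 px page.
words = ["Invoice"]
boxes = [normalize_box([50, 100, 150, 200], width=500, height=1000)]
print(boxes)  # [[100, 100, 300, 200]]

# With Transformers support merged, inference would then look roughly like
# (hypothetical usage, not tested here):
# from transformers import AutoTokenizer, AutoModelForTokenClassification
# tokenizer = AutoTokenizer.from_pretrained("nielsr/lilt-roberta-en-base-finetuned-funsd")
# model = AutoModelForTokenClassification.from_pretrained("nielsr/lilt-roberta-en-base-finetuned-funsd")
# encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# # build a `bbox` tensor with one normalized box per token
# # (repeat each word's box for its sub-tokens), then:
# # outputs = model(**encoding, bbox=bbox)
```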
@NielsRogge (cc @jpWang) Hi, are you planning to add this model to the Transformers library? It would be really great.
Hi,
Hi @jpWang, I've opened a PR here to add LiLT to 🤗 Transformers: huggingface/transformers#19450. Would you like to create an organization for the company/research institute you're part of, such that we can transfer the weights there?
Hi @NielsRogge,
Hi, what's the URL of the HuggingFace organization? I can't seem to find SCUT-DLVCLab on hf.co.
Hi, I just created our organization: https://huggingface.co/SCUT-DLVCLab.
Thanks a lot! Now 2 checkpoints are transferred there. Could you give me your email address, such that I can set up a Slack channel? I'd like to discuss some things related to LiLT.
My email is scutjpwang@foxmail.com. Thanks for your effort :)
Hi folks, |
Hello, amazing job!
I love the paradigm of decoupling the LM and the layout model at first, before fine-tuning with joint training! I've managed to port my LayoutXLM code to your framework almost plug and play, and was wondering whether you were planning to contribute an official model implementation to the HuggingFace Transformers library.
As is, not much is required beyond a few tweaks due to version changes, and perhaps a processor object wrapping the tokenizer, so I was wondering if you had plans to do so.
Cheers and again, great work!
Best,
Manuel