
Welcome! 🙌

Portfolio Huggingface Hub Twitter LinkedIn Google Scholar

I am a PhD student at the University of Groningen (GroNLP lab) and part of the project InDeep: Interpreting Deep Learning Models for Text and Sound, working mainly on interpretability for neural machine translation. Previously, I was a research intern at AWS AI Labs NYC, a research scientist at Aindo, a student in the Data Science MSc at the University of Trieste & SISSA, and a founding member of the AI Student Society.

My research focuses on interpretability for NLP models, particularly approaches that benefit end users and leverage human behavioral signals. I am also very passionate about open-source collaboration :octocat:, and I currently lead the development of the Inseq toolkit for interpreting generative language models.

Pinned

  1. inseq-team/inseq (Public)

     Interpretability for sequence generation models 🛠 🔍

     Python · 307 stars · 34 forks

  2. pecore (Public)

     Materials for "Quantifying the Plausibility of Context Reliance in Neural Machine Translation" at ICLR'24

     Jupyter Notebook · 9 stars · 1 fork

  3. divemt (Public)

     Materials for "DivEMT: Neural Machine Translation Post-Editing Effort Across Typologically Diverse Languages" at EMNLP'22 🗺️

     HTML · 7 stars · 1 fork

  4. it5 (Public)

     Materials for "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation" 🇮🇹

     Jupyter Notebook · 30 stars · 4 forks

  5. t5-flax-gcp (Public)

     Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP

     Python · 57 stars · 6 forks

  6. covid-papers-browser (Public)

     Browse Covid-19 & SARS-CoV-2 Scientific Papers with Transformers 🦠 📖

     CSS · 183 stars · 27 forks