Lydia Nishimwe
Word Embeddings
Your Fairseq-trained model might have more embedding parameters than it should.
How a bug in reading SentencePiece vocabulary files causes some Fairseq-trained models to have up to 3k extra parameters in the embedding layer.
Lydia Nishimwe, posted on Mar 16, 2024
Last updated on Mar 22, 2025
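As a rough illustration of the failure mode described above (this is a minimal sketch, not Fairseq's actual code; the `build_dict` helper and its duplicate handling are assumptions for demonstration), a dictionary builder that prepends its own special symbols and then ingests a SentencePiece `.vocab` file wholesale ends up with duplicate rows, because SentencePiece already writes `<unk>`, `<s>`, and `</s>` at the top of its vocab file:

```python
# Hypothetical sketch of the bug: a dictionary builder that prepends its own
# special symbols, then appends every line of a SentencePiece .vocab file.

def build_dict(vocab_lines, specials=("<s>", "<pad>", "</s>", "<unk>")):
    """Prepend special symbols, then add each token from the vocab file."""
    symbols = list(specials)
    for line in vocab_lines:
        token = line.split("\t")[0]
        symbols.append(token)  # duplicates from the vocab file slip in here
    return symbols

# A SentencePiece .vocab file starts with its own control symbols.
spm_vocab = ["<unk>\t0", "<s>\t0", "</s>\t0", "\u2581hello\t-3.2", "\u2581world\t-4.1"]

d = build_dict(spm_vocab)
print(len(d))          # 9 symbols instead of the expected 6
print(d.count("<s>"))  # 2: the special symbol appears twice
```

With an embedding dimension of 1024, the three duplicated rows here would waste roughly 3 × 1024 ≈ 3k parameters, matching the figure quoted in the summary.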