AI Research Scientist working on large-scale Transformer models, with a focus on how they behave under distribution shift and how to make them more robust to real-world data.
I completed my PhD at Inria Paris and Sorbonne Université, where I studied the interaction between data, scale, and model behaviour in neural systems, from representation learning to generation.
My work sits at the intersection of model design, evaluation, and real-world reliability, with a particular interest in understanding and improving how AI systems generalise beyond clean benchmarks.
Beyond research, I enjoy writing, public speaking, and exploring languages and cultures.
Currently exploring research scientist and applied AI roles.
Nishimwe (/niːʃiːmŋé/) is a Rwandan name meaning 'Thanks be to God'.
Fun fact: there is another Lydia Nishimwe who is a singer—we are not related.
PhD in Computer Science, 2021-2025
Inria Paris, Sorbonne Université
MEng in Mathematics and Computer Science, 2017-2021
École Centrale de Nantes
BSc in Mathematics and Computer Science, 2014-2017
Université Grenoble Alpes
Native
Native
Advanced
Intermediate
Intermediate
Elementary
Topic: Robust Neural Machine Translation of User-Generated Content
Supervised by Benoît Sagot and Rachel Bawden, defended on June 18, 2025
Tech stack: Python, PyTorch (fairseq, Hugging Face Transformers), Pandas, scikit-learn, SLURM
Tech stack: Python, TensorFlow, Keras, Pandas, scikit-learn
Tech stack: Erlang, HTTP
Topic: Functional verification of an ARM7 microprocessor
Tech stack: VHDL, C, ARM Assembly, ModelSim
🏆 Stage d'excellence (Excellence Internship Program) - Université Grenoble Alpes 🏆
Tech stack: Lutin, Lustre
🏆 Stage d'excellence (Excellence Internship Program) - Université Grenoble Alpes 🏆