Victor Morand
Hi! I’m Victor, a PhD student in the MLIA team of the Institute of Intelligent Systems and Robotics (ISIR) at Sorbonne Université, under the supervision of Benjamin Piwowarski and Josiane Mothe.
🎓 Towards Language models that know what they know
My thesis is entitled *Domain Adaptation in Pretrained Language Models: Applications to Information Retrieval and Conversational Search*.
Put more simply, we aim to improve information access by leveraging the remarkable ability of pretrained language models to understand text, even though they are not yet as reliable at manipulating factual information…
📄 Publications
Victor Morand, Mathias Vast, Basile Van Cooten, Laure Soulier, Josiane Mothe, Benjamin Piwowarski, arXiv preprint, 2026
We reproduce and compare different distillation strategies for cross-encoders across a wide range of architectures, identifying robust design choices for effective re-ranking.
Mathias Vast*, Victor Morand*, Basile van Cooten, Laure Soulier, Josiane Mothe, Benjamin Piwowarski, arXiv preprint, 2026
We propose MICE (Minimal Interaction Cross-Encoders), a new architecture that achieves a 4× reduction in inference latency while maintaining high ranking effectiveness.
Victor Morand, Nadi Tomeh, Josiane Mothe, Benjamin Piwowarski, 2025
We introduce ToMMeR, a lightweight model probing mention detection capabilities from early LLM layers, achieving high zero-shot recall across diverse benchmarks.
Victor Morand, Josiane Mothe, Benjamin Piwowarski, BlackBoxNLP@EMNLP, 2025
We investigate how Large Language Models internally represent entities by introducing entity mention reconstruction as a novel framework for studying how LLMs encode and manipulate entities.
Victor Morand, Nils Müller, Ryan Weightman, Benedetto Piccoli, Alexander Keimer, Alexandre M. Bayen, Journal of Computational Physics, 2024
We study the numerical approximation of scalar conservation laws by computational optimization of the numerical flux function in a first-order finite volume method.