📘 Publications
🧠 > Conference papers & pre-prints
Optical Training of Large-Scale Transformers and Deep Neural Networks with Direct Feedback Alignment.
Ziao Wang, Kilian Müller, Matthew Filipovich, Julien Launay, Ruben Ohana, Gustave Pariente, Safa Mokaadi, Charles Brossollet, Fabien Moreau, Alessandro Cappelli, Iacopo Poli, Igor Carron, Laurent Daudet, Florent Krzakala, Sylvain Gigan.
arXiv pre-print, 2024.
The Falcon Series of Open Language Models.
Ebtesam Almazrouei, Hamza Alobeidli, Abdulaziz Alshamsi, Alessandro Cappelli, Ruxandra Cojocaru, Mérouane Debbah, Étienne Goffinet, Daniel Hesslow, Julien Launay, Quentin Malartic, Daniele Mazzotta, Badreddine Noune, Baptiste Pannier, Guilherme Penedo. Technical lead.
arXiv pre-print, 2023.
The RefinedWeb Dataset for Falcon LLM: Outperforming Curated Corpora with Web Data, and Web Data Only.
Guilherme Penedo, Quentin Malartic, Daniel Hesslow, Ruxandra Cojocaru, Alessandro Cappelli, Hamza Alobeidli, Baptiste Pannier, Ebtesam Almazrouei, Julien Launay.
NeurIPS Datasets and Benchmarks, 2023.
BLOOM: a 176B-Parameter Open-Access Multilingual Language Model.
BigScience Workshop, among major contributors.
arXiv pre-print, 2022.
What Language Model to Train if You Have One Million GPU Hours?
The BigScience Architecture & Scaling Group, co-chair.
Findings of EMNLP (long paper), 2022.
ACL Workshop: Challenges & Perspectives in Creating Large Language Models (long paper), 2022.
What Language Model Architecture and Pretraining Objective Works Best for Zero-Shot Generalization?
Thomas Wang, Adam Roberts, Daniel Hesslow, Teven Le Scao, Hyung Won Chung, Iz Beltagy, Colin Raffel*, Julien Launay*.
ICML (poster), 2022.
Adversarial Robustness by Design through Analog Computing and Synthetic Gradients.
Alessandro Cappelli*, Ruben Ohana*, Julien Launay, Laurent Meunier, Iacopo Poli, Florent Krzakala.
IEEE ICASSP, 2022.
NeurIPS Workshop: Beyond Backpropagation (short paper, poster), 2021.
Photonic Differential Privacy with Direct Feedback Alignment.
Julien Launay*, Ruben Ohana*, Hamlet J. Medina Ruiz*, Alessandro Cappelli, Iacopo Poli, Liva Ralaivola, Alain Rakotomamonjy.
NeurIPS (poster), 2021.
PAGnol: an Extra-Large French Generative Model.
Julien Launay*, E.L. Tommasone*, Baptiste Pannier, François Boniface, Amélie Chatelain, Iacopo Poli, Djamé Seddah.
LREC (poster), 2022.
Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures.
Julien Launay, Iacopo Poli, François Boniface, Florent Krzakala.
NeurIPS (poster), 2020.
Principled Training of Neural Networks with Direct Feedback Alignment.
Julien Launay, Iacopo Poli, Florent Krzakala.
arXiv pre-print, 2019.
Analysis of Factors Affecting the Performance of BIPV Panels.
Julien Launay, Eric W.M. Lee, Rachid Bennacer, Richard K.K. Yuen.
European Physical Journal Applied Physics, 2018.
ICOME (oral), 2018.
📝 > Workshops and short papers
Artificial Neural Network Training on an Optical Processor via Direct Feedback Alignment.
Kilian Müller, Julien Launay, Iacopo Poli, Matthew Filipovich, Alessandro Cappelli, Daniel Hesslow, Igor Carron, Laurent Daudet, Florent Krzakala, Sylvain Gigan.
The European Conference on Lasers and Electro-Optics, 2023.
AlGhafa Evaluation Benchmark for Arabic Language Models.
Ebtesam Almazrouei, Ruxandra Cojocaru, Michele Baldo, Quentin Malartic, Hamza Alobeidli, Daniele Mazzotta, Guilherme Penedo, Giulia Campesan, Mugariya Farooq, Maitha Alhammadi, Julien Launay, Badreddine Noune.
Proceedings of ArabicNLP, 2023.
Scaling Laws Beyond Backpropagation.
Matthew J. Filipovich, Alessandro Cappelli, Daniel Hesslow, Julien Launay.
NeurIPS Workshop: I Can't Believe It's Not Better (poster), 2022.
A Holistic Assessment of the Carbon Footprint of Noor, a Very Large Arabic Language Model.
Imad Lakim, Ebtesam Almazrouei, Ibrahim Abualhaol, Mérouane Debbah, Julien Launay.
ACL Workshop: Challenges & Perspectives in Creating Large Language Models (poster), 2022.
Is the Number of Trainable Parameters All That Actually Matters?
Amélie Chatelain, Amine Djeghri, Daniel Hesslow, Iacopo Poli*, Julien Launay*.
NeurIPS Workshop: I Can't Believe It's Not Better (PMLR + spotlight), 2021.
LightOn Optical Processing Unit: Scaling-up AI and HPC with a Non von Neumann co-processor.
Charles Brossollet, Alessandro Cappelli, Igor Carron, Charidimos Chaintoutis, Amélie Chatelain, Laurent Daudet, Sylvain Gigan, Daniel Hesslow, Florent Krzakala, Julien Launay, Safa Mokaadi, Fabien Moreau, Kilian Müller, Ruben Ohana, Gustave Pariente, Iacopo Poli, and E. L. Tommasone.
IEEE Hot Chips (poster), 2021.
ROPUST: Improving Robustness through Fine-tuning with Photonic Processors and Synthetic Gradients.
Alessandro Cappelli, Julien Launay, Laurent Meunier, Ruben Ohana, Iacopo Poli.
ICML Workshop: Adversarial Machine Learning (poster), 2021.
Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment.
Julien Launay, Iacopo Poli, Kilian Müller, Gustave Pariente, Igor Carron, Laurent Daudet, Florent Krzakala, Sylvain Gigan.
NeurIPS Workshop: Beyond Backpropagation (oral), 2020.
Light-in-the-loop: Using a Photonics Co-Processor for Scalable Training of Neural Networks.
Julien Launay, Iacopo Poli, Kilian Müller, Igor Carron, Laurent Daudet, Florent Krzakala, Sylvain Gigan.
IEEE Hot Chips (poster), 2020.
Computer Vision and Feedback Alignment Methods.
Julien Launay, Iacopo Poli, Florent Krzakala.
NAISys (poster), 2020.
⚖️ > Patents
Method and System for Distributed Training Using Synthetic Gradients.
Julien Launay, Iacopo Poli, Kilian Müller, Gustave Pariente, Igor Carron, Laurent Daudet.
US Patent Application 17/117,925, filed 2020 (pending).
Method and System for Machine Learning Using Optical Data.
Iacopo Poli, Julien Launay, Kilian Müller, Gustave Pariente, Igor Carron, Laurent Daudet, Ruben Ohana, Daniel Hesslow.
US Patent Publication US2021/0287079 A1.
🔗 > Blogposts
- Adaptive raises a $20M Seed to help companies build singular GenAI experiences.
- From Zero to PPO: Understanding the Path to Helpful AI Models.
- LightOn's Summer Series #1, #2, and #3.
🎓 > Educational resources
Les glissements de terrain, modélisation et prévision. (Landslides: modeling and prediction).
Éduscol, 2017.
📖 > Books
Aventure, survie et création: le Guide Minecraft. (Adventure, Survival, and Creation: a Minecraft Guide).
Sold 20,000 copies.
Pearson, 2015.