You can reach me at <firstname>@lighton.ai.
🔥 Hot stuff
- 🎓 – Our paper on Photonic Differential Privacy was accepted at NeurIPS 2021! I also received an Outstanding Reviewer Award.
- 🌸 – I'm chairing the architecture & scaling working group for the BigScience project!
- 🇫🇷 – We've just released PAGnol, the largest language model for French! Write with PAGnol, and check out coverage in VentureBeat and in Jack Clark's ImportAI. We have also recently released esPAGnol, for Spanish.
Get in touch if you are interested in meeting, physically or virtually for increased social distancing in these strange times 😷.
📈 > Extreme-scale learning
As supported by recent breakthroughs like GPT-3, the scaling hypothesis provides a clear motivation for growing models beyond a trillion parameters.
Training models at such scale calls for novel approaches. I am a proponent of co-design beyond silicon and beyond backpropagation, enabling exotic architectures fit for the challenges of modern deep learning.
🧠 > How can machines learn?
My research is focused on the nature of learning. In neural networks, knowledge is commonly distilled from repeated observations using end-to-end backpropagation. I am interested in alternative methods, such as direct feedback alignment and locally learned synthetic gradients.
These alternatives may be motivated by biological realism (e.g. the weight transport problem), by computational arguments (e.g. increased parallelism), or by broader matters in artificial intelligence (e.g. local learning).
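To make the contrast with backpropagation concrete, here is a minimal NumPy sketch of direct feedback alignment on a toy regression task. The layer sizes, learning rate, and the task itself are illustrative choices, not a reference implementation: the point is only that each hidden layer receives the output error through a *fixed random* feedback matrix, rather than through the transposed forward weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: fit a random linear map with a 2-hidden-layer MLP.
X = rng.standard_normal((256, 10))
Y = X @ rng.standard_normal((10, 3))

# Forward weights (tanh hidden layers, linear output).
W1 = 0.1 * rng.standard_normal((10, 64))
W2 = 0.1 * rng.standard_normal((64, 64))
W3 = 0.1 * rng.standard_normal((64, 3))

# Fixed random feedback matrices: DFA projects the output error straight
# to each hidden layer, instead of propagating it backwards through the
# transposed forward weights as backpropagation would.
B1 = rng.standard_normal((3, 64))
B2 = rng.standard_normal((3, 64))

lr, losses = 0.01, []
for step in range(2000):
    # Forward pass.
    h1 = np.tanh(X @ W1)
    h2 = np.tanh(h1 @ W2)
    out = h2 @ W3
    e = out - Y                      # output error
    losses.append(float((e ** 2).mean()))

    # DFA teaching signals: error through the fixed random matrices,
    # gated by the local tanh derivative.
    d2 = (e @ B2) * (1.0 - h2 ** 2)
    d1 = (e @ B1) * (1.0 - h1 ** 2)

    # Layer-local weight updates (averaged over the batch).
    W3 -= lr * h2.T @ e / len(X)
    W2 -= lr * h1.T @ d2 / len(X)
    W1 -= lr * X.T @ d1 / len(X)
```

Because each update depends only on local activations and the broadcast error, the hidden-layer updates can be computed in parallel: this is the computational argument for increased parallelism mentioned above, and it also sidesteps the weight transport problem, since no layer needs access to downstream weights.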
💾 > Learning beyond silicon
For machines to learn, scalable and powerful hardware is needed. As the ultimate limits of silicon are reached, new paradigms need to emerge. I am interested in optical computing, and its use in co-processors dedicated to accelerating machine learning computations.
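As a rough picture of what such an optical co-processor computes: light modulated by the input passes through a multiply-scattering medium, which acts as a large fixed complex random matrix, and a camera records intensities. The sketch below emulates that transfer function in NumPy; the function name, dimensions, and normalization are my own illustrative assumptions, not any device's actual API.

```python
import numpy as np

def optical_random_features(x, n_features=1024, seed=0):
    """Emulate an optical random-projection co-processor.

    A scattering medium implements a fixed complex random matrix R;
    the camera measures intensities, so the device effectively
    returns |R x|^2. All names and scales here are illustrative.
    """
    r = np.random.default_rng(seed)
    d = x.shape[-1]
    # Fixed complex Gaussian matrix, normalized by input dimension.
    R = (r.standard_normal((d, n_features))
         + 1j * r.standard_normal((d, n_features))) / np.sqrt(2 * d)
    return np.abs(x @ R) ** 2    # camera sees intensity, not amplitude

rng = np.random.default_rng(42)
x = rng.standard_normal(128)
features = optical_random_features(x)   # nonnegative random features
```

Because the random matrix is realized physically, the projection dimension can be made very large at low energy cost; the resulting nonnegative random features can then feed kernel methods or serve as fixed random layers in a network.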