I am an applied math PhD student in the Pehlevan Group at Harvard. My research interests lie in the convex hull of machine learning, statistical physics, and theoretical neuroscience. Before graduate school, I studied physics, engineering, and computer science at Washington University in St. Louis.
My work can be found on Google Scholar. I also occasionally post new preprints on Twitter.
Bordelon, Chaudhry, Pehlevan, “Infinite Limits of Multi-head Transformer Dynamics,” NeurIPS 2024.
Bordelon, Atanasov, Pehlevan, “A Dynamical Model of Neural Scaling Laws,” ICML 2024.
Bordelon, Noci, Li, Hanin, Pehlevan, “Depthwise Hyperparameter Transfer in Residual Networks: Dynamics and Scaling Limit,” ICLR 2024.
Kumar, Bordelon, Gershman, Pehlevan, “Grokking as the Transition from Lazy to Rich Training Dynamics,” ICLR 2024.
Bordelon, Pehlevan, “Dynamics of Finite Width Kernel and Prediction Fluctuations in Mean Field Neural Networks,” NeurIPS 2023 (spotlight).
Bordelon, Masset, Kuo, Pehlevan, “Loss Dynamics of Temporal Difference Reinforcement Learning,” NeurIPS 2023.
Vyas, Atanasov, Bordelon, Morwani, Sainathan, Pehlevan, “Feature-Learning Networks Are Consistent Across Widths at Realistic Scales,” NeurIPS 2023 (equal contribution of first three authors).
Bordelon, Pehlevan, “The Influence of Learning Rule on Representation Dynamics in Wide Neural Networks,” ICLR 2023 (notable, top 25%).
Atanasov, Bordelon, Sainathan, Pehlevan, “The Onset of Variance-Limited Behavior for Networks in the Lazy and Rich Regimes,” ICLR 2023 (equal contribution).
Bordelon, Pehlevan, “Population Codes Enable Learning from Few Examples By Shaping Inductive Bias,” eLife 2022.
Bordelon, Pehlevan, “Self-consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks,” NeurIPS 2022.
Atanasov, Bordelon, Pehlevan, “Neural Networks as Kernel Learners: The Silent Alignment Effect,” ICLR 2022 (equal contribution).
Farrell, Bordelon, Trivedi, Pehlevan, “Capacity of Group-invariant Linear Readouts from Equivariant Representations: How Many Objects Can Be Linearly Classified Under All Possible Views?” ICLR 2022 (equal contribution).
Bordelon, Pehlevan, “Learning Curves for SGD on Structured Features,” ICLR 2022.
Canatar, Bordelon, Pehlevan, “Out-of-Distribution Generalization in Kernel Regression,” NeurIPS 2021.
Canatar, Bordelon, Pehlevan, “Spectral Bias and Task-Model Alignment Explain Generalization in Kernel Regression and Infinitely Wide Neural Networks,” Nature Communications 2021.
Bordelon, Canatar, Pehlevan, “Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks,” ICML 2020.
Atkinson, Mahzoon, Keim, Bordelon, Pruitt, Charity, Dickhoff, “Dispersive optical model analysis of Pb-208 generating a neutron-skin prediction beyond the mean field,” Phys. Rev. C 101, 044303, 2020.
Bagley, Bordelon, Moseley, Wessel, “Pre-Synaptic Pool Modification (PSPM): A supervised learning procedure for recurrent spiking neural networks,” PLOS ONE 2020.
Bordelon, Atanasov, Pehlevan, “How Feature Learning Can Improve Neural Scaling Laws,” 2024.
Kumar, Ankner, Spector, Bordelon, Muennighoff, Paul, Pehlevan, Ré, Raghunathan, “Scaling Laws for Precision,” 2024.
Kumar, Bordelon, Pehlevan, Murthy, Gershman, “Do Mice Grok? Glimpses of Hidden Progress During Overtraining in Sensory Cortex,” 2024.
Shan, Bordelon, “Rapid Feature Evolution Accelerates Learning in Neural Networks,” 2021 (equal contribution).