About me
I am a senior research engineer at Meta AI Research, working with Ilya Mironov and Mike Rabbat. My research interests include large-scale machine learning, federated and on-device learning, and differential privacy. I have served on the technical program committee for FL-ICML and as a reviewer for ICML, NeurIPS, AISTATS, and MLSys.
Prior to Meta, I received my Master's degree from UC Davis in 2019, where I worked with Prem Devanbu and Vincent Hellendoorn as a member of the Empirical Software Engineering lab (DECAL).
Publications
Preprints
Where to Begin? Exploring the Impact of Pre-Training and Initialization in Federated Learning
John Nguyen, Kshitiz Malik, Maziar Sanjabi, Michael Rabbat.
2022
Towards Fair Federated Recommendation Learning: Characterizing the Inter-Dependence of System and Data Heterogeneity
Kiwan Maeng, Haiyu Lu, Luca Melis, John Nguyen, Mike Rabbat, Carole-Jean Wu.
ACM Conference on Recommender Systems (RecSys), 2022.
Papaya: Practical, Private, and Scalable Federated Learning
Dzmitry Huba, John Nguyen, Kshitiz Malik, Ruiyu Zhu, Mike Rabbat, Ashkan Yousefpour, Carole-Jean Wu, Hongyuan Zhan, Pavel Ustinov, Harish Srinivas, Kaikai Wang, Anthony Shoumikhin, Jesik Min, Mani Malek.
Conference on Machine Learning and Systems (MLSys), 2022.
Federated Learning with Buffered Asynchronous Aggregation
John Nguyen, Kshitiz Malik, Hongyuan Zhan, Ashkan Yousefpour, Mike Rabbat, Mani Malek, Dzmitry Huba.
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
2021
Opacus: User-Friendly Differential Privacy Library in PyTorch
Ashkan Yousefpour*, Igor Shilov*, Alexandre Sablayrolles*, Davide Testuggine, Karthik Prasad, Mani Malek, John Nguyen, Sayan Ghosh, Akash Bharadwaj, Jessica Zhao, Graham Cormode, Ilya Mironov. *Equal contribution.
Privacy in Machine Learning (PriML) workshop, NeurIPS 2021.