Iman Mirzadeh

Machine Learning Research Engineer

Apple

About Me

I’m currently an ML Research Engineer at Apple. Prior to joining Apple, I received my PhD from Washington State University, where I worked at the Embedded Machine Intelligence Lab (EMIL) under the supervision of Dr. Hassan Ghasemzadeh.

I am interested in the real-world challenges of working with machine learning models, such as model efficiency and learning ability. Specifically, I am focusing on Model Optimization (e.g., model compression) and Continual (Lifelong) Learning, and my goal is to build efficient learning algorithms that help machines learn as effectively as humans.

Interests
  • Machine Learning
  • Deep Learning
  • Continual Learning
  • Computer Vision
  • Model Optimization
Education
  • Ph.D. in Computer Science (Artificial Intelligence), 2018-2022

    Washington State University

  • M.Sc. in Computer Science (Artificial Intelligence), 2018-2020

    Washington State University

  • B.Sc. in Computer Engineering (Information Technology), 2013-2018

    University of Tehran

Experience

Apple
Machine Learning Research Engineer
May 2023 – Present
DeepMind
Research Scientist Intern
Aug 2021 – Dec 2021 (Remote)
Washington State University
Graduate Research Assistant
Aug 2018 – Aug 2022
Sokhan AI
Machine Learning Engineer
Aug 2017 – Aug 2018

Selected Publications

Wide Neural Networks Forget Less Catastrophically
International Conference on Machine Learning (ICML), 2022
CL-Gym: Full-Featured PyTorch Library for Continual Learning
IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2021
Linear Mode Connectivity in Multitask and Continual Learning
International Conference on Learning Representations (ICLR), 2021
Understanding the Role of Training Regimes in Continual Learning
Advances in Neural Information Processing Systems (NeurIPS), 2020
An abstract version was presented at the ICML 2020 Workshop on Continual Learning
Dropout as an Implicit Gating Mechanism For Continual Learning
IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020
Runner-up award at CVPR'20 Workshop on Continual Learning in Computer Vision
Optimal Policy for Deployment of Machine Learning Models on Energy-Bounded Systems
International Joint Conference on Artificial Intelligence (IJCAI), 2020
Improved Knowledge Distillation via Teacher Assistant
AAAI Conference on Artificial Intelligence (AAAI), 2020