About Me

I am a final-year Computer Science Ph.D. student at the University of Illinois Urbana-Champaign (UIUC), advised by Prof. Han Zhao. I obtained dual bachelor's degrees in Data Science from the University of Michigan (UM) and in Electrical and Computer Engineering from Shanghai Jiao Tong University (SJTU). I have also interned at Microsoft Turing, Microsoft GenAI, and Amazon Search Science & AI.

I am interested in making foundation models and agents learn from multiple sources and tasks in an efficient, robust, and scalable manner.

Previously, I worked on multi-objective optimization, domain adaptation, and multimodal learning.

I am currently on the job market for full-time industry research scientist positions, starting summer 2026. Please feel free to reach out if my background aligns with your team’s needs!

News

  • [Sept 2025] MergeBench is accepted at NeurIPS 2025! We establish the first standardized benchmark for merging domain-specialized LLMs!
  • [Aug 2025] My Microsoft GenAI internship work on efficient MoE editing is accepted at EMNLP 2025! Check out how we compress auxiliary experts to save inference costs while maintaining performance!
  • [May 2025] I return to Microsoft Turing as an applied scientist intern working on reasoning for computer use agents!
  • [May 2025] My Microsoft Turing internship work Multilingual Scaling Laws is accepted at ACL 2025! With this law, you can compute optimal sampling ratios of languages to design your multilingual pretraining mixture for any model size!
  • [Jan 2025] Our work on mechanistic interpretability is accepted at NAACL 2025!
  • [Dec 2024] Localize-and-Stitch is accepted by TMLR! Check out how better localization improves model merging!
  • [Sept 2024] Semi-Supervised Reward Modeling (SSRM) is accepted at EMNLP 2024!
  • [Aug 2024] Start internship at Microsoft GenAI working on improving MoE efficiency!
  • [May 2024] Start internship at Microsoft Turing working on multilingual scaling laws!
  • [May 2024] Our work on robust multi-task learning is accepted at ICML 2024!
  • [May 2024] Our work on gradual domain adaptation is accepted at JMLR!
  • [May 2023] Start internship at Amazon working on large-scale multi-task learning!