CV

Education

Research Interest

My research interests center on LLM reasoning and the fundamental capabilities of LLMs. First, I study the inference dynamics of Diffusion Language Models (DLMs), developing decoding methods that leverage planning tokens and other internal signals to improve efficiency and controllability. Second, I explore LLMs for real-world decision-making—for example, in optimization—building systems that orchestrate large-scale workflows.

Honors & Awards

Publications & Preprints

Selected works. * indicates equal contribution.

  1. T. H. Hoang, J. Fuhrman, M. Klarqvist, M. Li, et al.
    “Enabling end-to-end secure federated learning in biomedical research on heterogeneous computing environments with APPFLx.” Computational and Structural Biotechnology Journal, Vol. 28, 2025.

  2. M. Li, M. Klamkin, P. Van Hentenryck, R. Bent, W. Li.
    “Constraint-Informed Active Learning for End-to-End ACOPF Optimization Proxies.”
    Under review at Power Systems Computation Conference (PSCC) 2026.

  3. M. Li, M. Klamkin, P. Van Hentenryck.
    “Conformal Prediction with Upper and Lower Bound Models.”
    In submission to ICML 2026.

  4. M. Li, S. Na, M. Kolar.
    “A Theoretically Sound Sequential Quadratic Programming Algorithm on Riemannian Manifolds.”
    Preprint, 2022; available upon request.

Works in Progress

  1. M. Li*, H. Jiang*, et al.
    “Unlocking and Verifying Structured Parallel Decoding in Diffusion Language Models via Planning Tokens.”
    Manuscript expected January 15, 2026; to be submitted to ICML 2026.

  2. M. Li, H. Jiang, D. Meng, Z. Chen, R. Bent, W. Li, P. Van Hentenryck, et al.
    “Agentic LLM Orchestration for Real-Time Hybrid Optimization.”
    Investigating the robust integration of neural proxies with commercial solvers, using data-driven logic to minimize computational cost.

Research Experience

Teaching Experience

Technical Skills