About Me

I am a fourth-year PhD student in Computer Science at Stanford University, advised by Stefano Ermon. Before coming to Stanford, I completed my undergraduate degree at Tsinghua University.

Contact: sjzhao at stanford dot edu

Research Interests

(Ranked by the amount of time I spend thinking about each topic, as of July 2020)

  • Individual risk, interpretations of probability and forecasts, calibration
  • Variational definitions of information and machine learning quantities
  • Deep generative models, high dimensional distribution modeling, representation learning
  • Active learning and online learning
  • Information theory

Publications

  • Individual Calibration with Randomized Forecasting
    Shengjia Zhao, Tengyu Ma, Stefano Ermon [arXiv] (ICML’2020)

  • Cross Domain Imitation Learning
    Kun Ho Kim, Yihong Gu, Jiaming Song, Shengjia Zhao, Stefano Ermon (ICML’2020)

  • A Framework for Sample Efficient Interval Estimation with Control Variates
    Shengjia Zhao, Christopher Yeh, Stefano Ermon (AISTATS’2020)

  • Permutation Invariant Graph Generation via Score-Based Generative Modeling
    Chenhao Niu, Yang Song, Jiaming Song, Shengjia Zhao, Aditya Grover, Stefano Ermon (AISTATS’2020)

  • A Theory of Usable Information under Computational Constraints
    Yilun Xu, Shengjia Zhao, Jiaming Song, Russell Stewart, Stefano Ermon (ICLR’2020 Oral)

  • Adaptive Antithetic Sampling for Variance Reduction
    Hongyu Ren*, Shengjia Zhao*, Stefano Ermon (ICML’2019)

  • Learning Neural PDE Solvers with Convergence Guarantees
    Jun-Ting Hsieh*, Shengjia Zhao*, Lucia Mirabella, Stefano Ermon (ICLR’2019)

  • Regular LDPC Construction for Sparse Hashing
    Jonathan Kuck, Tri Dao, Shengjia Zhao, Burak Burtan, Stefano Ermon (UAI’2019)

  • Learning Controllable Fair Representations
    Jiaming Song, Pratyusha Kalluri, Aditya Grover, Shengjia Zhao, Stefano Ermon (AISTATS’2019)

  • InfoVAE: Balancing Learning and Inference in Variational Autoencoders
    Shengjia Zhao, Jiaming Song, Stefano Ermon [arXiv] (AAAI’2019)

  • Bias and Generalization in Deep Generative Models: An Empirical Study
    Shengjia Zhao*, Hongyu Ren*, Arianna Yuan, Jiaming Song, Noah Goodman, Stefano Ermon (NeurIPS’2018 Spotlight)

  • Amortized Inference Regularization
    Rui Shu, Hung H. Bui, Shengjia Zhao, Stefano Ermon (NeurIPS’2018)

  • A Lagrangian Perspective on Latent Variable Generative Models
    Shengjia Zhao, Jiaming Song, Stefano Ermon (UAI’2018 Oral) [arXiv]

  • A-NICE-MC: Adversarial Training for MCMC
    Jiaming Song, Shengjia Zhao, Stefano Ermon (NeurIPS’2017) [arXiv]

  • Learning Hierarchical Features from Generative Models
    Shengjia Zhao, Jiaming Song, Stefano Ermon (ICML’2017) [arXiv] [code]

  • Adaptive Concentration Inequalities for Sequential Decision Problems
    Shengjia Zhao, Enze Zhou, Ashish Sabharwal, Stefano Ermon (NeurIPS’2016) [pdf]

  • Closing the Gap Between Short and Long XORs for Model Counting
    Shengjia Zhao, Sorathan Chaturapruek, Ashish Sabharwal, Stefano Ermon (AAAI’2016) [arXiv]


Awards and Fellowships

  • JP Morgan PhD Fellowship (2019)
  • Qualcomm Innovation Fellowship (QInF) (2018)
  • Qualcomm Scholarship (2016)
  • Google Excellence Scholarship (2015)

Professional Service

  • Reviewer: NeurIPS (2017, 2019, 2020), ICLR (2019, 2020), ICML (2019, 2020)
  • Organizer: Information Theory and Machine Learning (ITML) Workshop (NeurIPS’2019)
  • Teaching: CS228 (Head TA)