Thursday, March 16, 2023
In the absence of explicit or tractable likelihoods, Bayesians often resort to approximate Bayesian computation (ABC) for inference. Our work bridges ABC with deep neural implicit samplers based on generative adversarial networks (GANs) and adversarial variational Bayes. Both ABC and GANs compare aspects of observed and fake data to simulate from posteriors and likelihoods, respectively. We develop a Bayesian GAN (B-GAN) sampler that directly targets the posterior by solving an adversarial optimization problem. B-GAN is driven by a deterministic mapping learned on the ABC reference table by conditional GANs. Once the mapping has been trained, i.i.d. posterior samples are obtained by filtering noise through it at negligible additional cost. We propose two post-processing local refinements using (1) data-driven proposals with importance reweighting and (2) variational Bayes. We support our findings with frequentist-Bayesian results showing that the typical total variation distance between the true and approximate posteriors converges to zero for certain neural network generators and discriminators. Our experiments on simulated data show highly competitive performance relative to some of the most recent likelihood-free posterior simulators.
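For readers who want to see the mechanics, the following is a minimal, hypothetical sketch (in PyTorch) of the conditional-GAN idea described above: a generator learns a deterministic map from noise and data to parameters on an ABC reference table, and approximate posterior draws are then obtained by pushing fresh noise through the trained map at the observed data. The helper names (prior_sample, simulator), network sizes, and training schedule are illustrative assumptions, not the authors' implementation, which also includes the post-processing refinements mentioned in the abstract.

# Hypothetical sketch of the B-GAN idea: a conditional GAN trained on an ABC
# reference table {(theta_i, x_i)}; names and sizes are illustrative only.
import torch
import torch.nn as nn

def make_reference_table(prior_sample, simulator, n):
    # ABC reference table: draw theta ~ prior, then x ~ p(x | theta).
    thetas = torch.stack([prior_sample() for _ in range(n)])
    xs = torch.stack([simulator(t) for t in thetas])
    return thetas, xs

class Generator(nn.Module):
    # Deterministic map g(z, x) -> theta; pushing noise z through g at the
    # observed data yields approximate posterior draws.
    def __init__(self, noise_dim, data_dim, theta_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + data_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, theta_dim))
    def forward(self, z, x):
        return self.net(torch.cat([z, x], dim=-1))

class Discriminator(nn.Module):
    # Classifier separating "real" pairs (theta, x) from "fake" pairs
    # (g(z, x), x); its logits drive the adversarial objective.
    def __init__(self, data_dim, theta_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(theta_dim + data_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1))
    def forward(self, theta, x):
        return self.net(torch.cat([theta, x], dim=-1))

def train_bgan(thetas, xs, noise_dim=8, epochs=200, lr=1e-3):
    g = Generator(noise_dim, xs.shape[-1], thetas.shape[-1])
    d = Discriminator(xs.shape[-1], thetas.shape[-1])
    opt_g = torch.optim.Adam(g.parameters(), lr=lr)
    opt_d = torch.optim.Adam(d.parameters(), lr=lr)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        z = torch.randn(len(xs), noise_dim)
        fake = g(z, xs)
        # Discriminator step: real pairs labeled 1, generated pairs labeled 0.
        loss_d = bce(d(thetas, xs), torch.ones(len(xs), 1)) + \
                 bce(d(fake.detach(), xs), torch.zeros(len(xs), 1))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # Generator step: try to fool the updated discriminator.
        loss_g = bce(d(fake, xs), torch.ones(len(xs), 1))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return g

def posterior_samples(g, x_obs, n, noise_dim=8):
    # Once trained, i.i.d. approximate posterior draws cost only a forward
    # pass; x_obs is a 1-D tensor of observed data (or summary statistics).
    with torch.no_grad():
        z = torch.randn(n, noise_dim)
        return g(z, x_obs.expand(n, -1))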
Ms. Yuexi WANG
Fifth-year Ph.D. student, Econometrics and Statistics, University of Chicago Booth School of Business
Yuexi Wang is a fifth-year Ph.D. student in the Econometrics and Statistics group at the University of Chicago Booth School of Business. Her research interests lie at the intersection of machine learning, Bayesian computation, and optimization. In particular, her recent work examines approximate Bayesian inference with adversarial learning techniques. Before joining Booth, she received a B.S. in mathematics from Zhejiang University (China) and an M.S. in statistics from the University of Chicago. She received the 2022 Arnold Zellner Doctoral Prize for applications of Bayesian methodology in finance.