Highlight

A stochastic approximate gradient ascent method for Bayesian experimental design with implicit models

Figure (quantum control): Performance comparison between our method and a Bayesian optimization baseline for tuning a quantum pulse. The contour plot (left column) shows the modeled photon counts for optically detected spin manipulation as a function of pulse duration and detuning from the spin's natural resonance frequency. The right column shows the evolution of the posterior distribution with the number of designed measurements; the red points mark the true model parameters.

Achievement

Bayesian experimental design (BED) addresses the question of how to choose designs that maximize information gain. For implicit models, where the likelihood is intractable but sampling is possible, conventional BED methods struggle to efficiently estimate the posterior distribution and to maximize the mutual information (MI) between data and parameters. Recent work proposed maximizing a lower bound on the MI by gradient ascent to address these issues. However, that approach requires a sampling path to compute the pathwise gradient of the MI lower bound with respect to the design variables, and such a pathwise gradient is usually inaccessible for implicit models. In this paper, we propose a novel approach that combines recent advances in stochastic approximate gradient ascent with a smoothed variational MI estimator for efficient and robust BED. Because it does not require pathwise gradients, our approach carries out the design process through a unified procedure driven by an approximate gradient for implicit models. Several experiments show that our approach outperforms baseline methods and significantly improves the scalability of BED in high-dimensional settings.
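
The overall procedure can be sketched as follows (a minimal, illustrative Python example; the simulator, the particular prior-contrastive MI lower-bound estimator, and all numerical settings are placeholders chosen for readability, not the implementation from the paper):

    import numpy as np

    rng = np.random.default_rng(0)

    def simulator(theta, d):
        # Hypothetical implicit model: draws y ~ p(y | theta, d) without a tractable likelihood.
        return np.sin(d @ theta) + 0.1 * rng.standard_normal()

    def mi_lower_bound(d, n_outer=32, n_inner=32):
        # Nested Monte Carlo estimate of a prior-contrastive MI lower bound.
        # It stands in for the variational estimator in the paper; the Gaussian
        # synthetic likelihood around the simulator mean is illustrative only.
        dim = d.shape[0]
        thetas = rng.standard_normal((n_outer, dim))     # samples from the prior
        contrast = rng.standard_normal((n_inner, dim))   # contrastive prior samples
        total = 0.0
        for theta in thetas:
            y = simulator(theta, d)
            log_num = -0.5 * ((y - np.sin(d @ theta)) / 0.1) ** 2
            log_den = np.logaddexp.reduce(
                -0.5 * ((y - np.sin(contrast @ d)) / 0.1) ** 2
            ) - np.log(n_inner)
            total += log_num - log_den
        return total / n_outer

    def smoothed_gradient(d, sigma=0.1, n_dirs=16):
        # Gaussian-smoothing (zeroth-order) gradient estimate of the MI lower bound:
        # only bound evaluations at randomly perturbed designs are needed, so no
        # pathwise gradient through the simulator is ever required.
        base = mi_lower_bound(d)   # control variate to reduce variance
        grad = np.zeros_like(d)
        for _ in range(n_dirs):
            u = rng.standard_normal(d.shape)
            grad += (mi_lower_bound(d + sigma * u) - base) * u / sigma
        return grad / n_dirs

    # Stochastic approximate gradient ascent over the design variables.
    d = rng.standard_normal(2)   # 2-D design, illustrative only
    for step in range(20):
        d = d + 0.05 * smoothed_gradient(d)
        print(f"step {step:2d}  design {d}")

In practice, variance-reduction devices such as antithetic perturbations and mini-batching over the random directions make each ascent step cheaper and more stable; the sketch above omits them for brevity.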

Publications:

  • Jiaxin Zhang, Sirui Bi, and Guannan Zhang, A hybrid gradient method to designing Bayesian experiments for implicit models, NeurIPS Workshop on Machine Learning and the Physical Sciences, 2020.
  • Jiaxin Zhang, Sirui Bi, and Guannan Zhang, A stochastic approximate gradient ascent method for Bayesian experimental design with implicit models, submitted.

Significance and Impact

  • We propose a general, unified framework that leverages stochastic approximate gradients without requiring or assuming pathwise gradients for implicit models.
  • We introduce a smoothed MI lower bound for robust MI estimation and optimization, which yields much smaller variance in the designs and posterior distributions than existing approaches (see the sketch after this list).
  • We demonstrate the superior performance of the approach through several experiments and show that it allows the optimization to be performed by a stochastic gradient ascent algorithm, and thus scales well to considerably high-dimensional design problems.
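
As one concrete instance of such a smoothed objective (a standard Gaussian-smoothing construction, written here as an assumption about the general form rather than the paper's exact estimator), the MI lower bound L(d) is convolved with a Gaussian kernel over the design space:

    \tilde{\mathcal{L}}_\sigma(d) = \mathbb{E}_{u \sim \mathcal{N}(0, I)}\left[\mathcal{L}(d + \sigma u)\right],
    \qquad
    \nabla_d \tilde{\mathcal{L}}_\sigma(d) = \frac{1}{\sigma}\,\mathbb{E}_{u \sim \mathcal{N}(0, I)}\left[\mathcal{L}(d + \sigma u)\, u\right],

so a Monte Carlo average over a small batch of perturbations u yields the stochastic approximate gradient used in the ascent step, while the smoothing radius sigma trades off bias against the variance of that estimate.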

Last Updated: November 4, 2020 - 9:42 pm