
One of the biggest drawbacks of Bayesian learning compared to deep learning is runtime: applying Bayes' theorem requires knowledge of how the data is distributed, and obtaining this usually involves either expensive integrals or some sampling mechanism (with its corresponding drawbacks).
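To make the cost concrete, here is a minimal sketch (my own illustration, not from any referenced paper) of the integral hiding in Bayes' theorem: even for a one-parameter coin-bias model, the evidence in the denominator is an integral over the parameter, here approximated on a grid. In higher dimensions this is exactly what becomes intractable and forces sampling.

```python
import numpy as np

# Posterior over a coin's bias p after observing heads/tails.
# The denominator of Bayes' theorem (the "evidence") is the integral
# of likelihood * prior over p, approximated here numerically.
heads, tails = 7, 3
grid = np.linspace(0.0, 1.0, 1001)
prior = np.ones_like(grid)                    # uniform prior over p
likelihood = grid**heads * (1 - grid)**tails  # binomial likelihood (up to a constant)
unnormalized = likelihood * prior
evidence = np.trapz(unnormalized, grid)       # the expensive integral
posterior = unnormalized / evidence

posterior_mean = np.trapz(grid * posterior, grid)
```

With a uniform prior this matches the closed-form Beta posterior (mean (heads + 1) / (heads + tails + 2) = 2/3); the point is that the grid integration scales exponentially with the number of parameters, which is why sampling methods such as MCMC are the usual fallback.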

Since, at the end of the day, it is all about propagating distributions, and this is (as far as I understand) the nature of quantum computing, is there a way to perform these computations efficiently? If so, what limitations apply?


1 Answer


Gaussian processes are a key component of the model-building procedure at the core of Bayesian optimization, so speeding up the training of Gaussian processes directly benefits Bayesian optimization. The recent paper by Zhao et al., "Quantum algorithms for training Gaussian Processes", does exactly this.
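To see where a speedup would bite, here is a minimal classical Gaussian-process regression sketch (my own illustration, with an assumed RBF kernel and synthetic 1-D data, not code from the paper). The dominant cost is solving the n-by-n kernel system, which is O(n^3) classically; quantum linear-algebra techniques target precisely this kind of step.

```python
import numpy as np

def rbf(a, b, lengthscale=0.5):
    # Squared-exponential (RBF) kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / lengthscale**2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(20)

noise = 1e-2
K = rbf(X, X) + noise * np.eye(len(X))  # n x n kernel matrix
alpha = np.linalg.solve(K, y)           # O(n^3) bottleneck in GP training

X_test = np.array([0.25])
mean = rbf(X_test, X) @ alpha           # GP posterior-mean prediction
```

Training here is just the linear solve (plus, for full marginal-likelihood optimization, log-determinant evaluations, which share the same cubic scaling), so any algorithm that accelerates these kernel-matrix operations accelerates the Gaussian-process model at the heart of Bayesian optimization.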