
Suppose that I have a distance $d_{\text{GMM}}$ on the space of Gaussian mixture models, denoted $\text{GMM}(\mathbb{R}^n)$. Given a discrete distribution $\mu_{\text{data}}$, I would like to train a generator $G_\theta$ (a small neural network in torch, sketched after the list below) that takes random noise as input and whose output distribution $({G_\theta})_\# \mathcal{N}(\mathbf{0},\mathbf{I})$ is close to $\mu_\text{data}$ in the sense of the distance $d_{\text{GMM}}$. I would need to:

  1. Project $({G_\theta})_\# \mathcal{N}(\mathbf{0},\mathbf{I})$ and $\mu_{\text{data}}$ onto $\text{GMM}(\mathbb{R}^n)$,
  2. Solve the problem: $$\min_\theta d_{\text{GMM}}\left(\text{Proj}_{\text{GMM}}\left[({G_\theta})_\# \mathcal{N}(\mathbf{0},\mathbf{I})\right], \text{Proj}_{\text{GMM}}(\mu_{\text{data}})\right).$$
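For concreteness, here is a minimal sketch of the kind of generator I mean (the architecture and all dimensions are placeholders):

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Small MLP pushing standard Gaussian noise to R^n."""
    def __init__(self, noise_dim: int, out_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

# A batch of samples from (G_theta)_# N(0, I): push noise through G.
G = Generator(noise_dim=8, out_dim=2)
z = torch.randn(512, 8)
x_fake = G(z)  # (512, 2), differentiable w.r.t. the parameters theta
```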

The unconstrained version of this problem, i.e. without the GMM-space constraint and with $d$ taken to be the Wasserstein-2 distance, can be found here (implemented in torch with the POT library).
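In that unconstrained case, a training step looks roughly like this (a sketch using POT's `ot.dist` and `ot.emd2`, which backpropagate through the cost matrix; `w2_loss` and the batch sizes are my own placeholders):

```python
import torch
import ot  # POT

def w2_loss(G, x_data, noise_dim=8, batch=512):
    """Squared Wasserstein-2 between generator samples and data
    (unconstrained case, no GMM projection)."""
    z = torch.randn(batch, noise_dim)
    x_fake = G(z)
    # Uniform weights on both empirical distributions.
    a = torch.full((batch,), 1.0 / batch)
    b = torch.full((x_data.shape[0],), 1.0 / x_data.shape[0])
    # Squared Euclidean cost; ot.emd2 then returns W2^2 and is
    # differentiable w.r.t. the cost matrix, hence w.r.t. theta.
    M = ot.dist(x_fake, x_data)  # 'sqeuclidean' by default
    return ot.emd2(a, b, M)

# loss = w2_loss(G, x_data); loss.backward()  # usual torch training step
```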

Problem

The main issue is that I cannot find a way of projecting a distribution onto the GMM space that is compatible with backpropagation in PyTorch. The only GMM estimation methods I know are the EM algorithm and MLE (an EM sketch is below).
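For reference, this is the kind of EM fit I know how to write (a sketch with diagonal covariances; `fit_gmm_em` and all hyperparameters are placeholders). Every update is closed-form, so torch can in principle differentiate through the unrolled loop, but I don't know whether using that as $\text{Proj}_{\text{GMM}}$ is sound:

```python
import math
import torch

def fit_gmm_em(x, k=3, iters=20, eps=1e-6):
    """Plain EM for a diagonal-covariance GMM, written in torch.
    Every step is closed-form, so autograd can unroll the loop,
    but I don't know if that is a valid Proj_GMM for training."""
    m, n = x.shape
    # Init: means drawn from the data, unit variances, uniform weights.
    mu = x[torch.randperm(m)[:k]].clone()
    var = torch.ones(k, n)
    pi = torch.full((k,), 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities from component log-densities.
        diff = x.unsqueeze(1) - mu.unsqueeze(0)              # (m, k, n)
        log_p = -0.5 * ((diff ** 2 / var).sum(-1)
                        + torch.log(var).sum(-1)
                        + n * math.log(2 * math.pi))         # (m, k)
        r = torch.softmax(torch.log(pi) + log_p, dim=1)      # (m, k)
        # M-step: weighted updates of means, variances, weights.
        nk = r.sum(0) + eps                                  # (k,)
        mu = (r.unsqueeze(-1) * x.unsqueeze(1)).sum(0) / nk.unsqueeze(-1)
        diff = x.unsqueeze(1) - mu.unsqueeze(0)
        var = (r.unsqueeze(-1) * diff ** 2).sum(0) / nk.unsqueeze(-1) + eps
        pi = nk / m
    return pi, mu, var
```

In the pipeline above, $\text{Proj}_{\text{GMM}}$ would be this fit applied to a batch of generator samples, and $d_{\text{GMM}}$ would then be evaluated on the returned parameters $(\pi, \mu, \sigma^2)$.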

Does anyone know how to do such a projection?

Thank you very much!
