GMVAE (PGM)
📐 Gaussian Mixture VAE - Product of Experts
VAE with Product of Experts Gaussian mixture prior for clustering in Euclidean space
Publications
Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders
Gaussian Mixture Clustering
GMVAE-PGM uses a mixture of Gaussians as the prior, learning discrete cluster assignments while maintaining continuous representations. It combines a Product of Experts posterior with full VAE reconstruction.
Main Idea
Learn interpretable clusters by combining a Gaussian mixture prior with the VAE framework in Euclidean space
Key Components
Mixture Encoder
Encodes the input into a mixture of K Gaussians
Mixture Prior
Prior is a mixture of K Gaussians, one per cluster
Categorical Variable
Discrete cluster-assignment variable
Joint Inference
Infers both the continuous embedding and the cluster assignment
Cluster-Specific Decoder
Reconstructs expression conditioned on the assigned cluster
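The components above can be sketched end to end. This is a hedged toy illustration in NumPy, not the actual implementation: the weight matrices stand in for trained networks, and the dimensions (50 genes, 4 clusters, 8 latent dimensions) are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K, Z = 50, 4, 8   # input genes, clusters, latent dimension (illustrative)

# Hypothetical toy weights standing in for trained encoder/decoder networks.
W_enc = rng.normal(0, 0.1, (D, K * 2 * Z))   # per-cluster mu and log-variance
W_cls = rng.normal(0, 0.1, (D, K))           # cluster-assignment logits
W_dec = rng.normal(0, 0.1, (Z + K, D))       # decoder conditioned on cluster

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def encode(x):
    """Mixture encoder: K Gaussian components plus a categorical q(c|x)."""
    params = (x @ W_enc).reshape(K, 2, Z)
    mu, log_var = params[:, 0], params[:, 1]
    pi = softmax(x @ W_cls)                  # categorical cluster posterior
    return mu, log_var, pi

def decode(z, c):
    """Cluster-specific decoder: reconstruction conditioned on cluster c."""
    one_hot = np.eye(K)[c]
    return np.concatenate([z, one_hot]) @ W_dec

x = rng.normal(size=D)                       # toy expression profile
mu, log_var, pi = encode(x)
c = int(pi.argmax())                         # hard cluster assignment
z = mu[c] + np.exp(0.5 * log_var[c]) * rng.normal(size=Z)  # reparameterize
x_hat = decode(z, c)
print(x_hat.shape, c)
```

In practice the assignment would be sampled (or marginalized over) rather than taken with `argmax`, but the hard assignment keeps the data flow visible.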
Mathematical Formulation
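This heading carries no formulas in the source. As a hedged sketch, the standard GMVAE generative model matching the components listed above, and the evidence lower bound it is trained on, can be written:

```latex
% Generative model: uniform cluster prior, cluster-specific Gaussian prior on z,
% decoder conditioned on both the embedding z and the cluster c
p(c) = \mathrm{Cat}(1/K), \qquad
p(z \mid c) = \mathcal{N}\!\left(z;\, \mu_c,\, \sigma_c^2 I\right), \qquad
p_\theta(x \mid z, c)

% Evidence lower bound (ELBO) maximized during training
\mathcal{L} =
  \mathbb{E}_{q(z, c \mid x)}\!\left[\log p_\theta(x \mid z, c)\right]
  - \mathbb{E}_{q_\phi(c \mid x)}\!\left[\mathrm{KL}\!\left(q_\phi(z \mid x, c) \,\|\, p(z \mid c)\right)\right]
  - \mathrm{KL}\!\left(q_\phi(c \mid x) \,\|\, p(c)\right)
```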
Loss Functions
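The source leaves this section empty. A hedged numeric sketch of the usual GMVAE training objective (negative ELBO: reconstruction error, a categorical KL against a uniform cluster prior, and an expected Gaussian KL against the cluster-specific priors) is below; the Gaussian reconstruction term and the uniform prior are assumptions, not confirmed by the source.

```python
import numpy as np

def gaussian_kl(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, e^logvar_q) || N(mu_p, e^logvar_p) ), summed over dimensions."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(logvar_p - logvar_q
                        + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def gmvae_loss(x, x_hat, pi, mu_q, logvar_q, mu_p, logvar_p):
    """Negative ELBO: reconstruction + categorical KL + expected Gaussian KL."""
    K = len(pi)
    recon = np.sum((x - x_hat) ** 2)                              # Gaussian recon, up to a constant
    kl_cat = np.sum(pi * (np.log(pi + 1e-12) - np.log(1.0 / K)))  # KL(q(c|x) || Uniform(K))
    kl_z = sum(pi[k] * gaussian_kl(mu_q[k], logvar_q[k], mu_p[k], logvar_p[k])
               for k in range(K))                                 # expectation over q(c|x)
    return recon + kl_cat + kl_z

# Sanity check: perfect reconstruction with posterior == prior gives loss ~ 0.
K, Z = 3, 4
x = np.ones(5)
pi = np.full(K, 1.0 / K)
mu = np.zeros((K, Z)); logvar = np.zeros((K, Z))
loss = gmvae_loss(x, x, pi, mu, logvar, mu, logvar)
print(loss)
```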
Data Flow
Expression → Mixture Encoder → (z, c) → Cluster-Specific Decoder → Reconstructed Expression
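The Product of Experts named in the title is not spelled out in the source. For Gaussian experts the standard PoE combination has a closed form (precisions add; means are precision-weighted), sketched here as an assumption about how the posterior experts are fused:

```python
import numpy as np

def poe_gaussian(mus, logvars):
    """Product of Gaussian experts: precisions add, means are precision-weighted."""
    precisions = np.exp(-np.asarray(logvars))        # 1 / variance per expert
    prec = precisions.sum(axis=0)                    # combined precision
    mu = (np.asarray(mus) * precisions).sum(axis=0) / prec
    return mu, -np.log(prec)                         # combined mean, log-variance

# Two unit-variance experts at 0 and 1: the product sits at their average
# with half the variance, i.e. a sharper combined posterior.
mu, logvar = poe_gaussian([np.zeros(3), np.ones(3)],
                          [np.zeros(3), np.zeros(3)])
print(mu, np.exp(logvar))
```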
Architecture Details
Architecture Type
Gaussian Mixture VAE (VAE Architecture)
Input/Output Types
single-cell → reconstruction