
GMVAE (PGM)

📐 Gaussian Geometric

Gaussian Mixture VAE - Product of Experts

RNA

A VAE with a Product of Experts Gaussian mixture prior for clustering in Euclidean space

Publications

Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders

Dilokthanakul et al., 2016
Complexity: complex
Interpretability: high
Architecture: GMVAE-PGM
Latent Dim: 10
Used in LAIOR Framework

Gaussian Mixture Clustering

GMVAE-PGM uses a mixture of Gaussians as its prior, learning discrete clusters while maintaining continuous representations; a Product of Experts combines the posteriors, and the full VAE reconstruction objective is retained

Main Idea

Learn interpretable clusters by combining a Gaussian mixture prior with the VAE framework in Euclidean space

Key Components

Mixture Encoder

Encodes to mixture of K Gaussians

Mixture Prior

Prior is mixture of K Gaussians for K clusters

Categorical Variable

Discrete cluster assignment variable

Joint Inference

Infer both continuous embedding and cluster

Cluster-Specific Decoder

Reconstructs expression conditioned on cluster
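
How these components could fit together in code: below is a minimal PyTorch sketch (the card lists PyTorch as the framework). The module names MixtureEncoder and ClusterDecoder and the latent dimension of 10 come from this card; the hidden sizes, the default of K = 8 clusters, and the use of a Gumbel-Softmax relaxation for the CategoricalSampler are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureEncoder(nn.Module):
    """Encodes expression x to cluster logits and per-cluster Gaussian params."""
    def __init__(self, in_dim, latent_dim=10, n_clusters=8, hidden=256):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.cluster_logits = nn.Linear(hidden, n_clusters)        # q(c | x)
        self.mu = nn.Linear(hidden, n_clusters * latent_dim)       # mu_k(x)
        self.logvar = nn.Linear(hidden, n_clusters * latent_dim)   # log sigma_k^2(x)
        self.n_clusters, self.latent_dim = n_clusters, latent_dim

    def forward(self, x):
        h = self.backbone(x)
        logits = self.cluster_logits(h)
        mu = self.mu(h).view(-1, self.n_clusters, self.latent_dim)
        logvar = self.logvar(h).view(-1, self.n_clusters, self.latent_dim)
        return logits, mu, logvar

class ClusterDecoder(nn.Module):
    """Reconstructs expression conditioned on (z, c)."""
    def __init__(self, out_dim, latent_dim=10, n_clusters=8, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + n_clusters, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))

    def forward(self, z, c_onehot):
        return self.net(torch.cat([z, c_onehot], dim=-1))

def forward_pass(enc, dec, x, tau=0.5):
    logits, mu, logvar = enc(x)
    # Differentiable categorical sample (Gumbel-Softmax is an assumption here;
    # the card only names a "CategoricalSampler").
    c = F.gumbel_softmax(logits, tau=tau, hard=True)               # (B, K) one-hot
    # Select the sampled cluster's Gaussian, then reparameterize.
    mu_c = (c.unsqueeze(-1) * mu).sum(1)                           # (B, D)
    logvar_c = (c.unsqueeze(-1) * logvar).sum(1)
    z = mu_c + torch.randn_like(mu_c) * (0.5 * logvar_c).exp()
    return dec(z, c), z, c, logits, mu_c, logvar_c
```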

Mathematical Formulation

p(c) = Categorical(π); p(z | c = k) = N(μ_k, Σ_k), so p(z) = Σ_k π_k N(μ_k, Σ_k); X̂ = Decoder(z, c)
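
The mixture prior p(z) = Σ_k π_k N(μ_k, Σ_k) can be written directly with torch.distributions; a minimal sketch, assuming K = 8 diagonal-covariance components and the card's latent dimension of 10 (the parameter values are placeholders):

```python
import torch
from torch.distributions import Categorical, Independent, Normal, MixtureSameFamily

K, D = 8, 10                                  # K clusters; latent dim 10 from the card
pi_logits = torch.zeros(K)                    # mixture weights pi (uniform at init)
mu_k = torch.randn(K, D)                      # per-cluster means
log_sigma_k = torch.zeros(K, D)               # per-cluster diagonal scales

# p(z) = sum_k pi_k N(z; mu_k, Sigma_k)
prior = MixtureSameFamily(
    Categorical(logits=pi_logits),
    Independent(Normal(mu_k, log_sigma_k.exp()), 1))

z = torch.randn(4, D)
print(prior.log_prob(z))                      # log p(z) under the mixture prior, shape (4,)
```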

Loss Functions

ELBO
Reconstruction loss + KL divergence to the mixture prior
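
Because the KL divergence between a Gaussian posterior and a mixture prior has no closed form, it is usually estimated by Monte Carlo from the sampled z. A minimal sketch of the resulting negative ELBO, assuming a Gaussian reconstruction likelihood (MSE) and a categorical KL term on the cluster variable; `prior` is a MixtureSameFamily like the one constructed above:

```python
import torch
import torch.nn.functional as F
from torch.distributions import Categorical, Independent, Normal
from torch.distributions.kl import kl_divergence

def elbo_loss(x, x_hat, z, mu_c, logvar_c, c_logits, prior, pi_logits):
    """Negative ELBO: reconstruction + KL terms. The KL to the mixture prior
    is a single-sample Monte Carlo estimate, since it has no closed form."""
    recon = F.mse_loss(x_hat, x, reduction="none").sum(-1)       # Gaussian likelihood assumed
    q_z = Independent(Normal(mu_c, (0.5 * logvar_c).exp()), 1)   # q(z | x, c)
    kl_z = q_z.log_prob(z) - prior.log_prob(z)                   # KL(q(z|x,c) || p(z)), 1 sample
    kl_c = kl_divergence(Categorical(logits=c_logits),           # KL(q(c|x) || p(c))
                         Categorical(logits=pi_logits.expand_as(c_logits)))
    return (recon + kl_z + kl_c).mean()
```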

Data Flow

Expression → Mixture Encoder → (z, c) → Cluster-Specific Decoder → Reconstructed Expression
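
Putting the pieces together, the data flow above could look like this hypothetical end-to-end pass, reusing the MixtureEncoder/ClusterDecoder sketch, the mixture prior, and the loss sketch from earlier (batch and gene counts are placeholders):

```python
import torch

torch.manual_seed(0)
x = torch.randn(32, 2000)                       # 32 cells x 2000 genes (placeholder sizes)
enc = MixtureEncoder(in_dim=2000)               # from the components sketch above
dec = ClusterDecoder(out_dim=2000)
x_hat, z, c, logits, mu_c, logvar_c = forward_pass(enc, dec, x)
loss = elbo_loss(x, x_hat, z, mu_c, logvar_c, logits, prior, pi_logits)
loss.backward()                                 # gradients flow to encoder and decoder
print(loss.item(), c.argmax(-1)[:5])            # scalar loss; first few cluster assignments
```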

Architecture Details

Architecture Type

Gaussian Mixture VAE (VAE Architecture)

Input/Output Types

single-cell → reconstruction

Key Layers

MixtureEncoder, CategoricalSampler, ClusterDecoder

Frameworks

PyTorch

Tags

vae, mixture, clustering, gaussian, generative, rna