GMVAE (HW)

📐 Gaussian Geometric

Hyperbolic-Wrapped Gaussian Mixture VAE

RNA

GMVAE using Hyperbolic-Wrapped distributions for hierarchical clustering on Lorentz hyperboloid

Publications

Hyperbolic-Wrapped Variational Autoencoders for Hierarchical Representations

Gu et al., 2021
Complexity
complex
Interpretability
high
Architecture
HW-GMVAE
Latent Dim
10
Used in LAIOR Framework

Wrapped Hyperbolic Clustering

Uses hyperbolic-wrapped normal distributions on the hyperboloid model to encode hierarchical relationships naturally while retaining full VAE reconstruction

Main Idea

Capture complex hierarchical structures using wrapped distributions on Lorentz hyperboloid geometry

Key Components

Lorentz Encoder

Hyperbolic embedding on hyperboloid model

Wrapped Normal Distribution

Gaussian analog on hyperboloid for mixture components

Hierarchical Clustering

Natural tree-like structure preservation

Geometric Inference

Performs variational inference with hyperbolic operations (exponential map, log map, parallel transport)

HW Decoder

Reconstructs expression from hyperboloid embeddings
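The wrapped normal component above can be made concrete. The standard construction (as in Nagano et al.'s hyperbolic-wrapped normal, which this model family builds on) samples a Gaussian vector in the tangent space at the hyperboloid origin, parallel-transports it to the component mean, and wraps it onto the manifold with the exponential map. The sketch below is illustrative, not the authors' implementation; function names are ours.

```python
import numpy as np

def minkowski_dot(u, v):
    # Lorentzian (Minkowski) inner product: -u0*v0 + sum_i ui*vi
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def exp_map(mu, u):
    # Exponential map at hyperboloid point mu applied to tangent vector u
    norm_u = np.sqrt(max(minkowski_dot(u, u), 1e-12))
    return np.cosh(norm_u) * mu + np.sinh(norm_u) * (u / norm_u)

def parallel_transport(mu0, mu, v):
    # Transport tangent vector v from mu0 to mu along the connecting geodesic
    alpha = -minkowski_dot(mu0, mu)
    return v + minkowski_dot(mu - alpha * mu0, v) / (alpha + 1.0) * (mu0 + mu)

def sample_wrapped_normal(mu, sigma, rng):
    # Wrapped-normal sample: Gaussian in the tangent space at the origin,
    # transported to mu, then wrapped onto the hyperboloid via exp_map.
    n = mu.shape[0] - 1
    v = np.concatenate([[0.0], rng.normal(0.0, sigma, size=n)])  # tangent at origin
    origin = np.zeros_like(mu)
    origin[0] = 1.0
    u = parallel_transport(origin, mu, v)
    return exp_map(mu, u)
```

A sampled point z stays on the Lorentz hyperboloid, i.e. it satisfies the constraint ⟨z, z⟩_L = −1 with z₀ ≥ 1.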

Mathematical Formulation

p(z) = Σ_k π_k WN(z; μ_k, σ_k), where each WN is a wrapped normal on the Lorentz hyperboloid; X̂ = Decoder(z)
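Under this mixture prior, cluster assignment can be scored with the hyperboloid's geodesic distance d(x, y) = arccosh(−⟨x, y⟩_L). The sketch below uses an isotropic surrogate (log-prior minus half the squared geodesic distance) in place of the exact wrapped-normal responsibilities; it is our illustration, not the paper's inference rule.

```python
import numpy as np

def minkowski_dot(u, v):
    # Lorentzian inner product on ambient coordinates
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def lift(w):
    # Lift a Euclidean vector w onto the hyperboloid: x = (sqrt(1+||w||^2), w)
    w = np.asarray(w, dtype=float)
    return np.concatenate([[np.sqrt(1.0 + np.sum(w ** 2))], w])

def lorentz_distance(x, y):
    # Geodesic distance on the Lorentz hyperboloid: arccosh(-<x, y>_L)
    return np.arccosh(np.clip(-minkowski_dot(x, y), 1.0, None))

def assign_component(z, mus, log_pis):
    # Score each mixture component by log-prior minus half the squared
    # geodesic distance (an isotropic wrapped-normal surrogate).
    scores = [lp - 0.5 * lorentz_distance(z, mu) ** 2
              for lp, mu in zip(log_pis, mus)]
    return int(np.argmax(scores))
```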

Loss Functions

HW-ELBO
Reconstruction + KL with wrapped distributions
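Because the KL between wrapped distributions has no closed form, the HW-ELBO is typically estimated by Monte Carlo: evaluate log q(z|x) − log p(z) at the sampled latent, using the change-of-variables density of the wrapped normal. The sketch below handles the simplified case of distributions centred at the hyperboloid origin, with a Gaussian reconstruction term; it is a hedged illustration, not the paper's loss code.

```python
import numpy as np

def wrapped_normal_logpdf_origin(z, sigma):
    # Log-density of a hyperbolic-wrapped normal centred at the origin
    # mu0 = (1, 0, ..., 0): pull z back with the log map, evaluate the
    # Euclidean Gaussian in the tangent space, correct by the exp-map
    # log-Jacobian (n-1)*log(sinh(r)/r).
    n = z.shape[0] - 1
    r = np.arccosh(np.clip(z[0], 1.0, None))            # geodesic radius
    gauss = -0.5 * (r / sigma) ** 2 - n * np.log(sigma * np.sqrt(2 * np.pi))
    log_jac = (n - 1) * np.log(np.sinh(r) / r) if r > 1e-8 else 0.0
    return gauss - log_jac

def mc_elbo(x, x_hat, z, sigma_q, sigma_p):
    # Single-sample Monte Carlo ELBO: Gaussian reconstruction term minus
    # a one-sample KL estimate log q(z|x) - log p(z).
    recon = -0.5 * np.sum((x - x_hat) ** 2)
    kl_mc = (wrapped_normal_logpdf_origin(z, sigma_q)
             - wrapped_normal_logpdf_origin(z, sigma_p))
    return recon - kl_mc
```

In the full model, q(z|x) is centred at the encoder's predicted hyperboloid point rather than the origin, and the prior term sums over mixture components with log-sum-exp.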

Data Flow

Expression → HW Encoder → Hyperboloid Mixture → HW Decoder → Reconstructed Expression + Hierarchical Clusters
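The flow above can be sketched end to end with a toy linear encoder/decoder standing in for the HW networks. The only geometry-specific step is lifting the encoder's unconstrained output onto the Lorentz hyperboloid; the weight shapes and names here are ours, for illustration only.

```python
import numpy as np

def lift_to_hyperboloid(w):
    # Map an unconstrained encoder output w in R^n onto the hyperboloid:
    # x = (sqrt(1 + ||w||^2), w) satisfies <x, x>_L = -1.
    x0 = np.sqrt(1.0 + np.sum(w ** 2))
    return np.concatenate([[x0], w])

def forward(expr, enc_W, dec_W):
    # Toy linear stand-ins for the HW encoder and decoder.
    w = enc_W @ expr                 # Euclidean latent parameters
    z = lift_to_hyperboloid(w)       # hyperboloid embedding
    x_hat = dec_W @ z                # reconstructed expression
    return z, x_hat
```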

Architecture Details

Architecture Type

GMVAE with Hyperbolic-Wrapped Distributions (VAE Architecture)

Input/Output Types

single-cell → reconstruction

Key Layers

HW Encoder → Wrapped Mixture → HW Decoder

Frameworks

PyTorch

Tags

vae · hyperbolic · wrapped · hierarchical · generative · rna