InfoVAE

🔍 Disentanglement

Information Maximizing VAE

RNA

VAE with mutual information maximization for disentangled and informative representations

Publications

InfoVAE: Balancing Learning and Inference in Variational Autoencoders

Zhao et al., 2019
Complexity
complex
Interpretability
high
Architecture
InfoVAE
Latent Dim
10

Information-Theoretic Disentanglement

InfoVAE balances the reconstruction, KL divergence, and mutual information terms to learn latent factors that are both disentangled and informative, while retaining full VAE reconstruction

Main Idea

Maximize the mutual information between the data and the latent code while encouraging disentanglement and faithful reconstruction

Key Components

Encoder

Maps data to informative latent factors

Mutual Information Term

I(x;z) encourages informativeness

Maximum Mean Discrepancy

Matches the aggregate posterior q(z) to the prior p(z)

Decoder

Reconstructs from informative factors
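The MMD component above can be sketched as a kernel two-sample statistic between encoded latents and samples from the prior. This is a minimal PyTorch sketch, not the authors' implementation; the function names and the RBF bandwidth choice (scaled by the latent dimension) are assumptions for illustration.

```python
import torch

def rbf_kernel(a: torch.Tensor, b: torch.Tensor, scale: float = 1.0) -> torch.Tensor:
    # Pairwise RBF kernel k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 * scale * d));
    # scaling by the latent dimension d is a common heuristic bandwidth choice.
    d = a.shape[1]
    sq_dist = torch.cdist(a, b) ** 2
    return torch.exp(-sq_dist / (2.0 * scale * d))

def mmd(z_posterior: torch.Tensor, z_prior: torch.Tensor) -> torch.Tensor:
    # Biased (V-statistic) estimate of MMD^2 between samples from the
    # aggregate posterior q(z) and samples from the prior p(z).
    k_qq = rbf_kernel(z_posterior, z_posterior).mean()
    k_pp = rbf_kernel(z_prior, z_prior).mean()
    k_qp = rbf_kernel(z_posterior, z_prior).mean()
    return k_qq + k_pp - 2.0 * k_qp
```

In training, `z_posterior` would be the batch of encoded latents and `z_prior` a fresh draw from N(0, I) of the same shape.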

Mathematical Formulation

L = -E_q[log p(x|z)] + (1-α)KL(q(z|x)||p(z)) + (α+λ-1)D(q(z)||p(z)), where the aggregate-posterior divergence D is realized as MMD in practice; x̂ = Decoder(z)

Loss Functions

InfoVAE Loss
Reconstruction + Weighted KL + MMD
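The three loss terms can be assembled as below. This is a hedged sketch, not the reference implementation: the function name `infovae_loss`, the Gaussian decoder (MSE reconstruction), the inline RBF-MMD, and the default weights alpha=0, lam=10 are illustrative assumptions following the weighting (1-α) on KL and (α+λ-1) on the aggregate-posterior divergence.

```python
import torch
import torch.nn.functional as F

def infovae_loss(x, x_hat, mu, logvar, z, alpha=0.0, lam=10.0):
    n = x.shape[0]
    # Reconstruction: -E_q[log p(x|z)] up to constants, for a Gaussian decoder.
    recon = F.mse_loss(x_hat, x, reduction="sum") / n
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal-Gaussian encoder.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / n
    # MMD(q(z), p(z)) via an RBF kernel against fresh prior samples.
    z_prior = torch.randn_like(z)
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2.0 * z.shape[1]))
    mmd = k(z, z).mean() + k(z_prior, z_prior).mean() - 2.0 * k(z, z_prior).mean()
    # Weights follow the formulation: (1 - alpha) on KL, (alpha + lam - 1) on MMD.
    return recon + (1.0 - alpha) * kl + (alpha + lam - 1.0) * mmd
```

With alpha=1 the per-sample KL term drops out entirely and only the MMD regularizer remains, recovering an MMD-VAE-style objective.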

Data Flow

Data → Encoder → Informative Factors → Decoder → Reconstruction
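The flow above can be traced in a minimal PyTorch module. This is a schematic sketch under assumptions, not the published architecture: the class name, layer sizes, and single hidden layer are placeholders; only the latent dimension of 10 comes from the card.

```python
import torch
import torch.nn as nn

class InfoVAE(nn.Module):
    def __init__(self, n_features=2000, n_hidden=128, n_latent=10):
        super().__init__()
        # Encoder: data -> hidden representation -> Gaussian posterior parameters.
        self.encoder = nn.Sequential(nn.Linear(n_features, n_hidden), nn.ReLU())
        self.mu = nn.Linear(n_hidden, n_latent)
        self.logvar = nn.Linear(n_hidden, n_latent)
        # Decoder: informative latent factors -> reconstruction.
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, n_hidden), nn.ReLU(), nn.Linear(n_hidden, n_features)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar, z
```

The returned `(x_hat, mu, logvar, z)` tuple provides exactly the inputs the loss needs: reconstruction, posterior parameters for the KL term, and latent samples for the MMD term.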

Architecture Details

Architecture Type

Information-Theoretic VAE (VAE Architecture)

Input/Output Types

single-cell → reconstruction

Key Layers

Encoder, MMD Layer, Decoder

Frameworks

PyTorch

Tags

vae, disentanglement, information-theory, generative, rna