TCVAE

🔍 Disentanglement

Total Correlation VAE

RNA

VAE that explicitly decomposes and minimizes total correlation for disentanglement

Publications

Isolating Sources of Disentanglement in Variational Autoencoders

Chen et al., 2018
Complexity
complex
Interpretability
high
Architecture
TCVAE
Latent Dim
10

Total Correlation Decomposition

TCVAE decomposes the VAE's KL term into index-code mutual information, total correlation, and dimension-wise KL, then penalizes only the total correlation term while keeping the full reconstruction objective

Main Idea

Achieve disentanglement by explicitly minimizing the statistical dependence (total correlation) between latent dimensions while preserving reconstruction quality

Key Components

Encoder

Maps inputs to decorrelated latent factors

KL Decomposition

Decomposes KL into three interpretable terms

Total Correlation

TC(z) = KL(q(z)||∏_j q(z_j))
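This definition has a closed form when q(z) is Gaussian, which makes a handy sanity check for TC estimators: for a zero-mean Gaussian, total correlation equals minus half the log-determinant of the correlation matrix. A minimal sketch (not part of the original model; `gaussian_tc` is an illustrative helper):

```python
import math
import numpy as np

def gaussian_tc(cov: np.ndarray) -> float:
    """TC(z) = KL(q(z) || prod_j q(z_j)) for a zero-mean Gaussian q(z):
    sum of marginal entropies minus joint entropy = -0.5 * log det(R),
    where R is the correlation matrix of cov."""
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)
    return -0.5 * math.log(np.linalg.det(corr))

rho = 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
print(gaussian_tc(cov))   # -0.5 * ln(1 - 0.25) ≈ 0.1438
print(gaussian_tc(np.eye(2)))  # independent dimensions give TC = 0
```

Dependence between dimensions (ρ ≠ 0) yields positive TC; a diagonal covariance yields exactly zero.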

Minibatch Stratified Sampling

Estimates the intractable aggregate posterior q(z), and hence TC, from minibatch samples without an auxiliary discriminator
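A sketch of the simpler minibatch-weighted variant of this estimator from the same paper, assuming diagonal-Gaussian posteriors (`estimate_tc` and `logsumexp` are illustrative helpers, not the reference implementation):

```python
import numpy as np

def logsumexp(a, axis):
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.exp(a - m).sum(axis=axis))

def estimate_tc(z, mu, logvar, dataset_size):
    """Minibatch estimate of TC(z) = E_q(z)[log q(z) - sum_j log q(z_j)].

    z, mu, logvar: (M, D) arrays for one minibatch of M posterior samples.
    Uses the minibatch-weighted approximation:
    log q(z_i) ~= logsumexp_j log q(z_i | x_j) - log(M * dataset_size).
    """
    M, _ = z.shape
    # per-dimension log densities of every sample under every posterior: (M, M, D)
    log_q_ij = -0.5 * (np.log(2 * np.pi) + logvar[None]
                       + (z[:, None, :] - mu[None]) ** 2 / np.exp(logvar)[None])
    # joint aggregate posterior: sum over dims, then logsumexp over the batch index j
    log_qz = logsumexp(log_q_ij.sum(-1), axis=1) - np.log(M * dataset_size)
    # product of marginals: logsumexp per dimension, then sum over dims
    log_prod_qzj = (logsumexp(log_q_ij, axis=1) - np.log(M * dataset_size)).sum(-1)
    return float((log_qz - log_prod_qzj).mean())
```

Feeding it a batch whose latent dimensions are strongly correlated across samples should yield a noticeably larger estimate than an independent batch of the same size.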

Decoder

Reconstructs from decorrelated factors

Mathematical Formulation

E_{p(x)}[KL(q(z|x)||p(z))] = I(x;z) + TC(z) + Σ_j KL(q(z_j)||p(z_j)); X̂ = Decoder(z)

Loss Functions

β-TCVAE Loss
Reconstruction + α·I(x;z) + β·TC(z) + γ·Σ_j KL(q(z_j)||p(z_j)), with α = γ = 1 in the paper so only the TC term is up-weighted
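Given the three estimated log-density terms, assembling the loss is a weighted sum; a minimal sketch (`btcvae_loss` is an illustrative helper, with the paper's α, β, γ weights as keyword arguments):

```python
def btcvae_loss(recon_nll, log_qzx, log_pz, log_qz, log_prod_qzj,
                alpha=1.0, beta=6.0, gamma=1.0):
    """Assemble the beta-TCVAE objective from precomputed log-density terms
    (all evaluated at the same posterior sample z).

    recon_nll    : negative reconstruction log-likelihood
    log_qzx      : log q(z|x)
    log_pz       : log p(z)
    log_qz       : estimated log q(z) (aggregate posterior)
    log_prod_qzj : estimated sum_j log q(z_j)
    """
    mutual_info = log_qzx - log_qz        # I(x; z) term
    total_corr = log_qz - log_prod_qzj    # TC(z) term
    dim_kl = log_prod_qzj - log_pz        # dimension-wise KL term
    return recon_nll + alpha * mutual_info + beta * total_corr + gamma * dim_kl

# with alpha = beta = gamma = 1 the three terms telescope back to the
# standard ELBO KL estimate: log q(z|x) - log p(z)
print(btcvae_loss(100.0, -5.0, -9.0, -6.5, -7.0, 1.0, 1.0, 1.0))  # 104.0
```

Setting all weights to 1 recovers the ordinary VAE objective, which is the decomposition's defining property: β > 1 then up-weights only the TC term.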

Data Flow

Data → Encoder → Decorrelated Latents → Decoder → Reconstruction
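The flow above can be sketched in a few lines of numpy; the linear layers and sizes here are illustrative stand-ins, not the card's architecture (which uses latent dim 10):

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, latent_dim, batch = 8, 2, 16  # hypothetical sizes

# linear Gaussian encoder: x -> (mu, logvar)
W = rng.normal(scale=0.1, size=(n_genes, 2 * latent_dim))
# linear decoder: z -> reconstruction
V = rng.normal(scale=0.1, size=(latent_dim, n_genes))

x = rng.normal(size=(batch, n_genes))  # stand-in for expression data
h = x @ W
mu, logvar = h[:, :latent_dim], h[:, latent_dim:]
z = mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)  # reparameterization trick
x_hat = z @ V  # reconstruction
print(x_hat.shape)  # (16, 8)
```

In training, the TC penalty is computed from `z`, `mu`, and `logvar` at the middle of this pipeline, while the reconstruction loss compares `x_hat` to `x`.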

Architecture Details

Architecture Type

Total Correlation VAE (VAE Architecture)

Input/Output Types

single-cell → reconstruction

Key Layers

Encoder, Total Correlation Estimator, Decoder

Frameworks

PyTorch

Tags

vae, disentanglement, total-correlation, generative, rna