GAHIB
A graph-attention variational autoencoder with information bottleneck and Lorentz hyperbolic geometry for single-cell latent representation learning.
Available on this site
Method notes, datasets, metrics, and code are all available here.
Pick a route through the project
Method
Three losses shape the latent space: a reconstruction term, an information-bottleneck term, and a Lorentz hyperbolic distance term that anchors a hierarchy-aware manifold.
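As a minimal sketch of the hyperbolic term, the geodesic distance in the standard Lorentz (hyperboloid) model can be written as below. This is a plain numpy illustration of the textbook formulas; the function names and the lifting convention are assumptions for illustration, not the repository's exact PyTorch API.

```python
import numpy as np

def lorentz_inner(x, y):
    """Minkowski (Lorentz) inner product: <x, y>_L = -x0*y0 + sum_i xi*yi."""
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def lift_to_hyperboloid(z):
    """Lift a Euclidean latent z onto the hyperboloid <x, x>_L = -1
    by solving for the time-like coordinate x0 = sqrt(1 + ||z||^2)."""
    x0 = np.sqrt(1.0 + np.sum(z * z, axis=-1, keepdims=True))
    return np.concatenate([x0, z], axis=-1)

def lorentz_distance(x, y):
    """Geodesic distance on the hyperboloid: d(x, y) = arccosh(-<x, y>_L)."""
    inner = np.clip(-lorentz_inner(x, y), 1.0, None)  # clip guards against arccosh(<1) from rounding
    return np.arccosh(inner)
```

In a VAE objective of this shape, a distance term like this would be added to the reconstruction loss and a weighted information-bottleneck (KL) term; the exact weighting and composition are defined in the repository.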
Read the architecture
Data
The 53-dataset evaluation cohort spans cancer and development, with identifiers grouped by the same taxonomy used in the manuscript.
See the datasets
Metrics
Twenty metrics cover clustering quality, dimensionality-reduction preservation, and intrinsic latent-space diagnostics.
Inspect the metrics
Benchmarks
Deep-learning, classical, geometric VAE, disentanglement, encoder, graph-convolution, and robustness tracks share one protocol.
Review the design
Code
MIT-licensed PyTorch reference implementation with a familiar scanpy entry point.
Cite
A pre-publication citation stub is provided for now; the full @article BibTeX entry will be added on acceptance.
GAHIB and related single-cell resources
GAHIB connects to the PeterPonyu homepage and SCPortal alongside related single-cell method and benchmark resources. These links place the project in a broader dataset, benchmark, and code context.
How to use these links
GAHIB is the project-specific site; the other links point to broader lab, dataset, and benchmark resources.
Homepage
ZF Lab projects, papers, and related links.
SCPortal
Single-cell datasets, benchmarks, models, and companion sites.
GAHIB
Method, data, metrics, and code for the GAHIB project.
GAHIB repository
MIT-licensed PyTorch source and experiment entry points.
Related resources
Single-cell methods and benchmark context
These links give context for the method and its benchmarks across related single-cell projects.