Hyper3Labs/hyper-models

hyper-models

A model zoo for non-Euclidean embedding models
Hyperbolic · Spherical · Product Manifolds

Hugging Face · License: MIT


Why?

  • Standardized access to non-Euclidean embedding models
  • One catalog surface: model names map to internal loaders such as ONNX or optional torch-backed runtimes
  • Simple API: load() and encode_images()

Installation

uv pip install hyper-models

This base install is the simple path: it stays torch-free and is enough for ONNX-backed catalog entries such as HyCoCLIP and MERU.

For torch-backed checkpoints (for example UNCHA):

uv pip install "hyper-models[ml]"

Usage

import hyper_models
from PIL import Image

# List available models
hyper_models.list_models()
# ['hycoclip-vit-s', 'hycoclip-vit-b', 'meru-vit-s', 'meru-vit-b', 'uncha-vit-s', 'uncha-vit-b']

# Inspect supported internal loader kinds
hyper_models.list_loaders()
# ['onnx', 'uncha-image-torch']

# Load model (auto-downloads from Hugging Face Hub)
model = hyper_models.load("hycoclip-vit-s")
model.geometry  # 'hyperboloid'
model.dim       # 513

# Encode PIL images
images = [Image.open("image.jpg")]
embeddings = model.encode_images(images)  # (1, 513) ndarray

# Get model info
info = hyper_models.get_model_info("hycoclip-vit-s")
info.hub_id     # 'mnm-matin/hyperbolic-clip'
info.loader     # 'onnx'
info.license    # 'CC-BY-NC'

# Low-level: preprocess images yourself
batch = hyper_models.preprocess_images(images)  # (B, 3, 224, 224)
embeddings = model.encode(batch)
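The geometry field above reports 'hyperboloid', and the embedding dimension is 513 (a 512-dimensional space coordinate plus one time coordinate). As a minimal sketch of what that means, the snippet below checks the hyperboloid constraint with the Lorentzian inner product and computes geodesic distances. This assumes the standard unit-curvature convention; individual models may use a scaled curvature, and the helper names here are illustrative, not part of the hyper-models API.

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian inner product <x, y>_L = -x0*y0 + <x[1:], y[1:]>."""
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def lift_to_hyperboloid(v):
    """Lift a Euclidean vector v in R^d to the unit hyperboloid in R^(d+1)
    by setting x0 = sqrt(1 + ||v||^2), so that <x, x>_L = -1."""
    x0 = np.sqrt(1.0 + np.sum(v * v, axis=-1, keepdims=True))
    return np.concatenate([x0, v], axis=-1)

def hyperbolic_distance(x, y):
    """Geodesic distance on the unit hyperboloid: arccosh(-<x, y>_L)."""
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

v = np.random.default_rng(0).normal(size=(2, 512))
x = lift_to_hyperboloid(v)                   # shape (2, 513), like the model output
print(np.allclose(lorentz_inner(x, x), -1))  # → True: points satisfy the constraint
```

The same distance function can be applied to pairs of rows from encode_images() output when the model's geometry is the unit hyperboloid.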

Architecture

hyper-models is intended to be a timm-like catalog for non-Euclidean models.

  • The public abstraction is the catalog entry name, for example hycoclip-vit-s.
  • Each entry declares metadata such as geometry, dimensionality, artifact path, and an internal loader kind.
  • Internal loaders may differ by model family:
    • onnx for exported, torch-free runtimes
    • uncha-image-torch for raw checkpoints that need a PyTorch image runtime

This keeps callers on one stable API:

model = hyper_models.load("hycoclip-vit-s")
model = hyper_models.load("uncha-vit-b")

Callers never need to know which internal loader is used; the only exception is installing the optional hyper-models[ml] extra when selecting a torch-backed entry.
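One way a library can keep that boundary clean is to check for the optional dependency up front and fail with an actionable message. The guard below is a hypothetical sketch of that pattern, not code from hyper-models:

```python
import importlib.util

def ensure_runtime(loader_kind):
    """Hypothetical guard: raise early when a torch-backed entry is
    selected but the optional [ml] extra is not installed."""
    if loader_kind.endswith("torch") and importlib.util.find_spec("torch") is None:
        raise ImportError(
            'this entry needs PyTorch; run: uv pip install "hyper-models[ml]"'
        )

ensure_runtime("onnx")  # ONNX-backed entries never require torch
```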

HyperView integration

HyperView auto-detects hyper-models names and routes them to the hyper-models provider.

import hyperview as hv

dataset = hv.Dataset.from_huggingface(
  name="demo",
  hf_dataset="uoft-cs/cifar10",
  split="train",
  image_key="img",
)

# Uses provider='hyper-models' automatically.
space_key = dataset.compute_embeddings(model="uncha-vit-b")
layout_key = dataset.compute_visualization(space_key=space_key, layout="poincare")

HyperView's simple path remains torch-free. If you use the default ONNX-backed hyper-models entries or the default embed-anything provider, HyperView does not need PyTorch. PyTorch is only needed when you explicitly select a torch-backed catalog entry such as uncha-vit-s or uncha-vit-b.

Models

Hyperbolic

| Model          | Available | Paper     | Code                               |
|----------------|-----------|-----------|------------------------------------|
| hycoclip-vit-s | HF        | ICLR 2025 | PalAvik/hycoclip                   |
| hycoclip-vit-b | HF        | ICLR 2025 | PalAvik/hycoclip                   |
| meru-vit-s     | HF        | ICML 2023 | facebookresearch/meru              |
| meru-vit-b     | HF        | ICML 2023 | facebookresearch/meru              |
| uncha-vit-s    | HF        | CVPR 2026 | jeeit17/UNCHA                      |
| uncha-vit-b    | HF        | CVPR 2026 | jeeit17/UNCHA                      |
| hyp-vit        |           | CVPR 2022 | htdt/hyp_metric                    |
| hie            |           | CVPR 2020 | leymir/hyperbolic-image-embeddings |
| hcnn           |           | ICLR 2024 | kschwethelm/HyperbolicCV           |

Hyperspherical

| Model                     | Available | Paper     | Code                               |
|---------------------------|-----------|-----------|------------------------------------|
| megadescriptor (via timm) | HF        | WACV 2024 | WildlifeDatasets/wildlife-datasets |
| sphereface                |           | CVPR 2017 | wy1iu/sphereface                   |
| arcface                   |           | CVPR 2019 | deepinsight/insightface            |

Product Manifolds

| Model       | Available | Paper     | Code                     |
|-------------|-----------|-----------|--------------------------|
| hyperbolics |           | ICLR 2019 | HazyResearch/hyperbolics |

Export Tooling

This repo also contains tooling to export PyTorch models to ONNX:

cd export/hycoclip
uv run python export_onnx.py --checkpoint model.pth --onnx model.onnx

See export/hycoclip/README.md for details.
