jaxls
jaxls is a solver for sparse, constrained, and/or non-Euclidean least squares problems in JAX.
To install (Python >=3.10 minimum, >=3.12 recommended):
# Core package only
pip install "git+https://github.com/brentyi/jaxls.git"
# Or, with dev dependencies
pip install "git+https://github.com/brentyi/jaxls.git#egg=jaxls[dev,docs]"
Goals
jaxls is a research artifact and intentionally minimal. It’s designed to be:
Lightweight and hackable, but fast. Performance comes from analyzing and exploiting problem structure: jaxls automatically vectorizes repeated cost and variable operations, and translates sparse cost/variable relationships into sparse matrix operations.
Python-native. jaxls combines a functional, PyTree-first implementation with recent Python typing constructs. Its API is type-safe, compatible with standard JAX function transforms, and more concise than traditional optimization tools.
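To give a concrete sense of the API, here is a minimal sketch of a toy problem: a single 2D point variable pulled toward two targets. It follows the variable/cost/problem pattern from the project README; treat exact names such as Cost.create_factory and LeastSquaresProblem as reflecting the current API rather than a stable contract.

import jax
import jax.numpy as jnp
import jaxls

# A Euclidean variable type: a 2D point, initialized to zeros.
class PointVar(jaxls.Var[jax.Array], default_factory=lambda: jnp.zeros(2)): ...

# A cost factory: each call creates one cost term whose residual is the
# offset between the current point estimate and a target.
@jaxls.Cost.create_factory
def target_cost(
    var_values: jaxls.VarValues,
    var: PointVar,
    target: jax.Array,
) -> jax.Array:
    return var_values[var] - target

point = PointVar(0)
costs = [
    target_cost(point, jnp.array([1.0, 0.0])),
    target_cost(point, jnp.array([0.0, 1.0])),
]

# analyze() inspects problem structure (sparsity, repeated cost types);
# solve() runs the nonlinear optimizer.
solution = jaxls.LeastSquaresProblem(costs, [point]).analyze().solve()
print(solution[point])  # Expected: approximately [0.5, 0.5].

Minimizing the sum of squared distances to the two targets puts the solution at their midpoint, which makes the result easy to check by hand. Note that the two costs share one factory, so they can be vectorized as a batch.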
Features
We currently support:
Automatic sparse Jacobians, defined analytically or via autodiff. See Custom Jacobians and Sparse matrices.
Optimization on manifolds, including SO(2), SO(3), SE(2), and SE(3). See Non-Euclidean variables, and the sketch after this list.
Nonlinear solvers: Levenberg-Marquardt and Gauss-Newton.
Linear subproblem solvers. See Tips and gotchas for selection guidance; the sketch after this list shows how one is selected.
Sparse iterative solves via conjugate gradient (recommended for most problems).
Block and point Jacobi preconditioning; inexact Newton via Eisenstat-Walker.
Dense Cholesky (fast for small problems).
Sparse Cholesky on CPU (CHOLMOD).
Augmented Lagrangian solver for constrained problems. See Constraints.
Automatic adaptive penalties for equality and inequality constraints:
h(x) = 0, g(x) <= 0.
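The sketch below ties several of the features above together: SE(2) pose variables optimized on-manifold, tangent-space residuals, and an explicit linear solver choice. It mirrors the pose graph example in the project README and assumes the companion jaxlie library for Lie group types; the linear_solver argument and its option names are taken from the solver list above, but treat the exact signature as an assumption.

import jax
import jaxlie  # Companion library providing SO(2)/SO(3)/SE(2)/SE(3) types.
import jaxls

# Two SE(2) pose variables, identified by integer indices.
pose_vars = (jaxls.SE2Var(0), jaxls.SE2Var(1))

@jaxls.Cost.create_factory
def prior_cost(
    var_values: jaxls.VarValues, var: jaxls.SE2Var, target: jaxlie.SE2
) -> jax.Array:
    # Residuals live in the tangent space: log of the relative transform.
    return (var_values[var] @ target.inverse()).log()

@jaxls.Cost.create_factory
def between_cost(
    var_values: jaxls.VarValues,
    var0: jaxls.SE2Var,
    var1: jaxls.SE2Var,
    delta: jaxlie.SE2,
) -> jax.Array:
    return ((var_values[var0].inverse() @ var_values[var1]) @ delta.inverse()).log()

costs = [
    prior_cost(pose_vars[0], jaxlie.SE2.from_xy_theta(0.0, 0.0, 0.0)),
    prior_cost(pose_vars[1], jaxlie.SE2.from_xy_theta(2.0, 0.0, 0.0)),
    between_cost(pose_vars[0], pose_vars[1], jaxlie.SE2.from_xy_theta(1.0, 0.0, 0.0)),
]

problem = jaxls.LeastSquaresProblem(costs, pose_vars).analyze()

# Sparse iterative solve with preconditioned conjugate gradient
# (the recommended default for most problems).
solution = problem.solve(linear_solver="conjugate_gradient")

# For small problems, a dense Cholesky factorization can be faster.
solution = problem.solve(linear_solver="dense_cholesky")

The priors place the two poses 2.0 apart while the between measurement asks for a relative offset of 1.0, so the costs are deliberately inconsistent and the solver has a nontrivial residual to balance.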
Acknowledgements
Thanks to Nick Heppert, Chung Min Kim, and Yuqing Du for technical feedback.
jaxls is inspired by libraries like GTSAM, Ceres Solver, minisam, SwiftFusion, and g2o.
Algorithmic references:
Eisenstat & Walker (1996): adaptive forcing-term tolerances for inexact Newton steps solved with conjugate gradient.
Birgin & Martínez (2014): ALGENCAN-style augmented Lagrangian method.