🦌 ReHLine¶
ReHLine is designed to be a computationally efficient and practically useful software package for large-scale empirical risk minimization (ERM).
Homepage: https://rehline.github.io/
GitHub repo: https://github.com/softmin/ReHLine-python
Documentation: https://rehline-python.readthedocs.io
Paper: NeurIPS 2023
The proposed ReHLine solver exhibits appealing properties:

| Property | Description |
| --- | --- |
| **Flexible losses** | It applies to ANY convex piecewise linear-quadratic loss function, including the hinge loss, the squared hinge loss, the check loss, the Huber loss, etc. |
| **Flexible constraints** | It supports linear equality and inequality constraints on the parameter vector. |
| **Super-efficient** | The optimization algorithm has a provable LINEAR convergence rate, and the per-iteration computational complexity is LINEAR in the sample size. |
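As the paper's title suggests, the loss flexibility comes from composing ReLU and ReHU building blocks. The following is a sketch of these units; the exact composite form is stated in the paper and should be checked there:

```latex
\mathrm{ReLU}(z) = \max\{z, 0\}, \qquad
\mathrm{ReHU}_{\tau}(z) =
\begin{cases}
0, & z \le 0,\\
z^{2}/2, & 0 < z \le \tau,\\
\tau\,(z - \tau/2), & z > \tau,
\end{cases}
```

Any convex piecewise linear-quadratic loss can be written as a finite sum of such ReLU and ReHU terms applied to affine transforms of its argument, which is what makes the hinge, squared hinge, check, and Huber losses all expressible in one framework.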
📰 News¶
[2026-01] Monotonic Constraints: We added support for monotonic constraints (both increasing and decreasing) in our solvers. See Constraint for details.
[2025-11] Scikit-Learn Compatibility: We introduced full scikit-learn compatibility! ReHLine now provides plq_Ridge_Classifier and plq_Ridge_Regressor estimators that integrate seamlessly with the entire scikit-learn ecosystem. This means you can drop ReHLine estimators directly into your existing scikit-learn Pipeline, perform robust hyperparameter tuning using GridSearchCV, and use standard evaluation metrics. See ReHLine with Scikit-Learn for details.
🔨 Installation¶
Install `rehline` using pip:

```shell
pip install rehline
```
Reference¶
If you use this code, please star 🌟 the repository and cite the following paper:
@article{daiqiu2023rehline,
  title={ReHLine: Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence},
  author={Dai, Ben and Qiu, Yixuan},
  journal={Advances in Neural Information Processing Systems},
  year={2023},
}