Show HN: LimiX-2M – A lightweight unified foundation model for tabular learning
We’re excited to release LimiX-2M, a lightweight version of our state-of-the-art tabular foundation model, LimiX-16M. Unlike existing tabular foundation models that train separate models for classification and regression, LimiX is, to our knowledge, the first unified tabular foundation model: a single model handles classification, regression, and a wide range of other tabular tasks.
What’s new in LimiX-2M: built on LimiX-16M, the new version introduces several major architectural improvements. These let LimiX-2M retain strong modeling performance at a dramatically smaller footprint, making it well suited to research and personal projects.
Key features:
- Efficient: 3.6× faster inference and one quarter the model size of TabPFN-2.5, delivering high performance at much lower compute cost.
- Strong benchmark results: LimiX-2M outperforms TabPFN-V2 and AutoGluon across multiple public benchmarks, performs comparably to TabPFN-2.5 and RealTabPFN-2.5, and trails the larger LimiX-16M by only a small margin.
- Farewell to annoying model management: with LimiX, a single checkpoint supports multiple downstream tasks, so there’s no need to download or maintain separate models ;)
- Plug-and-play: runs on CPU or GPU. A step-by-step guide is included to help you get started in your own projects right away.
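To illustrate what "one checkpoint, multiple tasks" means in practice, here is a minimal sketch of a unified fit/predict interface that dispatches on the target's dtype. The class name, the dispatch rule, and the trivial nearest-neighbour stand-in predictor are all our own illustrative assumptions, not LimiX's actual API; see the GitHub repo for the real usage guide.

```python
import numpy as np

class UnifiedTabularModel:
    """Illustrative stand-in, NOT LimiX's real API: one object serves
    both classification and regression, inferred from the target dtype."""

    def fit(self, X, y):
        self.X_ = np.asarray(X, dtype=float)
        self.y_ = np.asarray(y)
        kind = self.y_.dtype.kind
        # Treat string/object/bool targets (and low-cardinality integer
        # targets) as classes; everything else as a regression target.
        self.is_classification_ = kind in "USOb" or (
            kind == "i" and np.unique(self.y_).size <= 10
        )
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # 1-nearest-neighbour lookup as a trivial placeholder predictor;
        # the point is the single entry point, not the prediction quality.
        dists = ((X[:, None, :] - self.X_[None, :, :]) ** 2).sum(axis=-1)
        return self.y_[dists.argmin(axis=1)]

# The same class covers both task types:
clf = UnifiedTabularModel().fit([[0.0, 0.0], [1.0, 1.0]], ["a", "b"])
reg = UnifiedTabularModel().fit([[0.0], [1.0]], [0.0, 10.0])
```

The design point this sketches is that task type is inferred from the data rather than baked into separate checkpoints, which is what removes the per-task model management the bullet above describes.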
Of course, there’s still plenty of room for improvement in LimiX. But as the first unified tabular foundation model, it’s definitely worth a try. We look forward to your feedback!
Technical report: https://arxiv.org/abs/2509.03505
Project: https://www.limix.ai/
GitHub: https://github.com/limix-ldm/LimiX/
HuggingFace: https://huggingface.co/stable-ai/LimiX-2M