TabICL offers two checkpoints:
- v1: the version described in the paper, with open-source pre-training code
- v1.1: an improved preview of upcoming work with the same architecture, benchmarked in TabArena. Its pre-training code is not yet public. It outperforms TabPFNv2 on datasets with roughly 7,000 or more samples.
Stay tuned for v2 😀
I’ll present "TabICL: A Tabular Foundation Model for In‑Context Learning on Large Data" at ICML 2025.
🗓 Tuesday, July 15, 2025 | 4:30–7:00 PM PDT
📍 East Exhibition Hall A‑B, Booth #E‑320
If you're curious about TabICL, come by the poster — I'd love to chat!
🚨 Visit Jingang at our ICML poster later today if you're interested in:
- scaling tabular foundation models to larger datasets
- open-source tabular foundation models
- meta-learning + in-context learning