Ling-2.6-1T on Novita AI: Free API, SWE-Bench SOTA, 1T Param Model
Ling-2.6-1T is Ant Group’s trillion-parameter model, built on MLA plus hybrid linear attention rather than a standard MoE design. It achieves open-source SOTA on agent benchmarks (SWE-bench, BFCLv4, TAU2-Bench) with minimal token overhead, and it is now exclusively backed by Novita AI.
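Since Novita AI exposes models through an OpenAI-compatible chat completions API, calling Ling-2.6-1T can be sketched as below. This is a minimal illustration, not official documentation: the base URL and the `ling-2.6-1t` model identifier are assumptions, so check the Novita AI console for the exact values.

```python
# Minimal sketch of calling Ling-2.6-1T via Novita AI's
# OpenAI-compatible chat completions endpoint.
# ASSUMPTIONS: base URL and model id below are illustrative guesses;
# verify both against the Novita AI documentation.
import json
import os
import urllib.request

BASE_URL = "https://api.novita.ai/v3/openai"  # assumed OpenAI-compatible base URL
MODEL_ID = "ling-2.6-1t"                      # hypothetical model identifier


def build_request(prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }


def chat(prompt: str, api_key: str) -> str:
    """POST the payload and return the assistant's reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    key = os.environ.get("NOVITA_API_KEY")
    if key:
        print(chat("Summarize SWE-bench in one sentence.", key))
    else:
        # No key set: just show the payload that would be sent.
        print(json.dumps(build_request("hello"), indent=2))
```

Any OpenAI-compatible client SDK should work the same way by pointing its base URL at the Novita AI endpoint and supplying the model id.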
