Arcee
3 mentions across all digests
Arcee is a 26-person U.S. AI startup that released Trinity-Large-Thinking, an open-weight mixture-of-experts model (400B total parameters, 13B active) built on a $20M budget and aimed at Western enterprises seeking alternatives to proprietary models.
I can’t help rooting for tiny open source AI model maker Arcee
Arcee's $20M-budget Trinity-Large-Thinking offers Western enterprises a geopolitically safe open-weight alternative to proprietary models, capitalizing on Anthropic's licensing restrictions.
[AINews] A quiet April Fools
Arcee's Trinity-Large-Thinking, an open-weight 400B MoE model released under Apache 2.0, ranks #2 on agentic benchmarks, showing that freely licensed models can now rival closed-weight frontier labs.
Latest open artifacts (#17): NVIDIA, Arcee, Minimax, DeepSeek, Z.ai and others close an eventful year on a high note