

An open-source trillion-parameter Mixture-of-Experts (MoE) model for coding assistance, intelligent agents, and automated workflows.

Kimi K2 is an open-source, trillion-parameter Mixture-of-Experts (MoE) model designed to power advanced coding assistance, autonomous agents, and workflow automation. The MoE architecture scales model capacity by activating only a small set of specialized expert subnetworks per request, delivering high capability while keeping inference cost under control. K2 is positioned as a developer-focused model: its code, weights, and integration points are openly available, so teams can fine-tune, deploy, and embed it in developer tools, agent orchestrators, and automation pipelines. Its core value is pairing large-scale model capacity with an open-source development model, accelerating the construction of code-centric AI applications and intelligent agent systems.
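The sparse-activation idea behind MoE can be sketched in a few lines: a gating network scores all experts for a given token, but only the top-k experts actually run, so compute grows with k rather than with the total expert count. This is a generic, minimal illustration of top-k MoE routing in NumPy, not Kimi K2's actual implementation; all names and shapes here are hypothetical.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route one token through the top-k experts of a toy MoE layer.

    x       : (d,) token embedding
    gate_w  : (d, n_experts) gating weights
    experts : list of (d, d) expert weight matrices
    top_k   : number of experts activated per token
    """
    logits = x @ gate_w                   # gating score for every expert
    top = np.argsort(logits)[-top_k:]     # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the selected experts only
    # Only the chosen experts execute, so per-token compute scales with
    # top_k, not with the total number of experts in the layer.
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.normal(size=d)
gate_w = rng.normal(size=(d, n_experts))
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, experts)
print(y.shape)
```

A production MoE layer adds batching, load-balancing losses, and expert parallelism across devices, but the routing step above is the mechanism that lets a trillion-parameter model activate only a fraction of its weights per request.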



