How to Use Novita AI API in ForgeCode: Setup Guide


ForgeCode now supports Novita AI as a built-in provider starting in v2.2.2. That means you can access models like Kimi-K2.5, GLM-5, and MiniMax-M2.5 directly from your terminal — no extra configuration layers, no separate apps. This guide walks you through the setup in under five minutes.

What Is ForgeCode?

ForgeCode is a CLI-based coding harness that integrates directly into Zsh. Instead of switching between your editor, a browser-based chat, and a terminal, you type : followed by a prompt and ForgeCode handles the rest — code edits, file creation, debugging, all inside the shell you already use.

It currently ranks #1 on TermBench 2.0 with 81.8% accuracy, ahead of other terminal-based coding agents. ForgeCode uses a multi-agent architecture with three specialized agents:

  • Forge — the default agent for direct implementation: fixing bugs, writing code, creating features
  • Muse — a planning agent that analyzes your codebase and proposes solutions without executing them
  • Sage — a read-only research agent that both Forge and Muse call automatically when they need deeper codebase understanding

The key differentiator: ForgeCode supports hundreds of LLM providers natively. You can switch between models from different vendors mid-session without restarting, and your conversation context carries over.

Why Use Novita AI with ForgeCode?

ForgeCode already supports major providers like OpenAI and Anthropic. Here is why Novita AI is worth adding:

Access to open-source frontier models. Novita AI hosts models you will not find on most other providers — Kimi-K2.5, GLM-5, and MiniMax-M2.5 among them. These are competitive coding models that you can test against proprietary options on the same real tasks, inside the same ForgeCode session.

Lower cost on coding workloads. Novita AI’s pricing runs significantly below proprietary model APIs. For example, MiniMax-M2.5 costs just $0.30 per million input tokens — a fraction of what GPT-5.4 or Claude Sonnet charges. For developers who use coding agents daily, this adds up.

OpenAI and Anthropic-compatible API. Novita AI provides both OpenAI-compatible and Anthropic-compatible API endpoints. ForgeCode expects the OpenAI format from providers, and Novita AI fits that directly — no adapter layers, no custom configuration, just a base URL and an API key.
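To see what "OpenAI-compatible" means in practice, here is a minimal sketch of a chat-completion request in the OpenAI wire format. The base URL and model identifier below are illustrative assumptions — substitute the values from your Novita AI dashboard:

```python
import json

# Assumed OpenAI-compatible endpoint; confirm the exact base URL in your
# Novita AI dashboard before use.
BASE_URL = "https://api.novita.ai/v3/openai"

def build_chat_request(api_key: str, model: str, prompt: str) -> dict:
    """Return the URL, headers, and JSON body for one chat-completion call
    in the standard OpenAI format that ForgeCode speaks to providers."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,  # model id is a placeholder, not a confirmed slug
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("NOVITA_API_KEY", "kimi-k2.5", "Refactor this function.")
print(req["url"])  # https://api.novita.ai/v3/openai/chat/completions
```

Because the format is the standard one, any OpenAI-style client works against it unchanged — which is exactly why ForgeCode needs nothing beyond the base URL and key.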

Pay-as-you-go billing. No subscriptions required. You pay only for the tokens you actually use, which makes it easy to experiment with multiple models without commitment.

Prerequisites

Before you start, you will need two things:

1. ForgeCode Installed

Install ForgeCode and configure the Zsh plugin:

# Install ForgeCode
curl -fsSL https://forgecode.dev/install.sh | sh

# Verify installation
forge --help

# Set up the Zsh plugin
forge zsh setup

After running forge zsh setup, restart your terminal for the plugin to take effect. If the : prompt trigger is not working, this is the most common cause.

2. Novita AI API Key

  1. Sign up or log in at novita.ai
  2. Navigate to the API Keys section of your dashboard
  3. Click Add New Key
  4. Copy the key and keep it ready

That is everything you need. The rest happens inside ForgeCode.

How to Connect Novita AI to ForgeCode

Step 1: Run the Login Flow

Open your terminal and type:

:login

ForgeCode presents a list of available providers. Select Novita from the list.

Step 2: Enter Your API Key

Paste your Novita AI API key when prompted. The key is masked in the terminal for security. Confirm the provider switch when asked.

Step 3: Choose a Model

After login, open the model picker:

:model

Browse or search for Novita models. Select one and press Enter. ForgeCode remembers your choice across sessions — you can change it anytime with :model.

That is the entire setup. Your workflow stays exactly the same; only the provider behind it changes.

Which Novita AI Models Work with ForgeCode?

Here is the current lineup available through Novita AI, with pricing and recommended use cases:

Model          Input (/M tokens)   Output (/M tokens)   Cache Read (/M tokens)   Context   Best For
Kimi-K2.5      $0.60               $3.00                $0.10                    262K      General coding, reasoning, tool use
GLM-5          $1.00               $3.20                $0.20                    202K      Agentic coding, structured reasoning
MiniMax-M2.5   $0.30               $1.20                $0.03                    204K      Long-context coding sessions
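To put these per-million-token rates in concrete terms, a short script can estimate what a single coding session costs at the listed prices (a sketch — the token counts are made-up examples, and real sessions vary):

```python
# Per-million-token prices from the table above (input, output, cache read).
PRICES = {
    "Kimi-K2.5":    {"input": 0.60, "output": 3.00, "cache_read": 0.10},
    "GLM-5":        {"input": 1.00, "output": 3.20, "cache_read": 0.20},
    "MiniMax-M2.5": {"input": 0.30, "output": 1.20, "cache_read": 0.03},
}

def session_cost(model: str, input_tokens: int, output_tokens: int,
                 cached_tokens: int = 0) -> float:
    """Estimate the dollar cost of one session at the listed rates."""
    p = PRICES[model]
    return (input_tokens * p["input"]
            + output_tokens * p["output"]
            + cached_tokens * p["cache_read"]) / 1_000_000

# Example: a 200K-input / 20K-output session on MiniMax-M2.5.
print(round(session_cost("MiniMax-M2.5", 200_000, 20_000), 4))  # 0.084
```

Swap the model name to compare: the same hypothetical session on GLM-5 runs several times the price, which is the kind of difference that adds up with daily agent use.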

The real advantage of having Novita AI in ForgeCode is comparison. Run the same refactoring task across Kimi-K2.5, GLM-5, and MiniMax-M2.5, and you will quickly develop a feel for which model fits which type of work. ForgeCode makes switching instant — type :model, pick a different one, and your context carries over.

Conclusion

ForgeCode plus Novita AI gives you terminal-native access to a range of open-source coding models at competitive pricing. The setup takes under five minutes: get an API key, run :login, pick a model, and start coding.

Novita AI models are available on ForgeCode — no setup overhead, pay only for what you use. Try it today with the steps above, or explore model options on the Novita AI Playground.

Novita AI is an AI and agent cloud platform helping developers and startups build, deploy, and scale models and agentic applications with high performance, reliability, and cost efficiency.

Frequently Asked Questions

What is ForgeCode?

ForgeCode is a CLI-based coding harness that integrates with Zsh, letting you send AI prompts directly from your terminal. It ranks #1 on TermBench 2.0 and supports multiple AI providers including Novita AI.

Do I need to change my workflow after connecting Novita AI?

No. The setup is just :login, select Novita, paste your API key, and pick a model. After that, you use ForgeCode exactly the same way — the provider changes, your workflow does not.

Which Novita AI model should I try first in ForgeCode?

Start with Kimi-K2.5 for general coding tasks. It handles reasoning and tool use well. For cost-sensitive workloads, MiniMax-M2.5 offers strong performance at $0.30 per million input tokens.

