Codex CLI is an open-source AI coding assistant that runs locally in your terminal while connecting to remote AI models through APIs. Built in Rust for performance and security, it can read, modify, and execute code in your chosen directory while maintaining strict sandbox protections.
This guide shows you how to configure Codex CLI with advanced AI models through Novita AI, including specialized coding models like Qwen Coder, reasoning-focused models like DeepSeek, and agentic models like Kimi K2.
What is Codex CLI?
Codex CLI is a terminal-based coding agent that combines local execution with cloud AI capabilities. Unlike code generation tools that only produce code snippets, Codex CLI can understand your entire project, execute the code it creates, debug issues, and iterate until solutions work correctly.
Key Features
Local-First Architecture: Runs entirely on your machine while making API calls to AI models. Your code stays local while leveraging powerful cloud AI capabilities.
Project Understanding: Reads your entire codebase, understands existing patterns, dependencies, and coding style to generate code that fits seamlessly into your project.
Autonomous Operation: Can work independently for extended periods, breaking down complex tasks and executing them systematically with minimal supervision.
Flexible Approval Modes:
- Suggest Mode: Prompts for approval at every step (default)
- Auto Edit Mode: Automatically edits files but asks before running commands
- Full Auto Mode: Complete autonomy without prompting
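Recent Codex CLI releases also let you pin the approval behavior in the configuration file rather than choosing it interactively. The exact key names have changed between versions, so treat the following as a hedged sketch and confirm against `codex --help` for your install:

```toml
# Hypothetical snippet for ~/.codex/config.toml — key names may differ by version.
# "untrusted" roughly corresponds to Suggest Mode: prompt before anything risky.
approval_policy = "untrusted"

# Alternatively, only ask when a command fails inside the sandbox:
# approval_policy = "on-failure"
```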
Security Features: Runs in network-disabled, directory-sandboxed environments to protect your system while maintaining full functionality.
Codex CLI vs Claude Code: Key Differences
Before diving into Codex CLI setup, it’s important to understand how it differs from Claude Code, another popular AI coding assistant:
Codex CLI
- Model Support: Works with OpenAI models and any OpenAI-compatible API, making it easy to plug in multiple providers
- Architecture: Local agent that runs in your terminal with API calls to remote models
- Open Source: Fully open-source with community contributions
Claude Code
- Model Support: Focuses on Anthropic’s Claude models and Anthropic-compatible APIs
- Architecture: Integrated experience across web, desktop, and IDE platforms
- Development: Proprietary core with API access
Codex CLI tends to stand out for its precision and its ability to handle complex tasks in existing codebases, while Claude Code is often preferred for its user experience and for initial project setups. Some users get the best results by combining both tools.
For Claude-specific workflows, check out our guide on using Claude Code with Novita AI.
Why Use Third-Party APIs with Codex CLI?
While Codex CLI supports OpenAI’s models natively, third-party APIs like Novita AI offer key advantages:
Specialized Models: Access cutting-edge models like DeepSeek V3.1 for reasoning, Qwen Coder for programming, and Kimi K2 for agentic workflows.
Cost & Performance: Competitive pricing with models optimized for specific tasks, from lightweight responses to complex problem-solving.
Custom Models: With Novita AI, you can even use your own custom models quickly in Codex CLI for specialized requirements.
Reliability: Alternative providers reduce rate limits and ensure consistent access during outages or regional restrictions.
How to Access Novita AI Models in Codex CLI
Prerequisites
- Create an account: Visit Novita AI’s website and sign up for an account.
- Generate your API Key: After logging in, navigate to the Key Management page to generate your API key.
- Select a Model Name: You’ll need to copy the model name you want to use from Novita AI’s Model Library. Some available models include:
  - `deepseek/deepseek-v3.1`
  - `qwen/qwen3-coder-480b-a35b-instruct`
  - `moonshotai/kimi-k2-0905`
  - `openai/gpt-oss-120b`
  - `zai-org/glm-4.5`
  - `google/gemma-3-12b-it`
- Save it Securely: you’ll need it for configuration.
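One common way to keep the key out of config files and shell history dumps is an environment variable. The variable name `NOVITA_API_KEY` below is just a convention chosen for this guide, not something Codex CLI requires:

```shell
# Store the key in an environment variable instead of pasting it into files.
# The value shown is a placeholder — substitute your real key.
export NOVITA_API_KEY="sk-your-key-here"

# To persist it across sessions (bash example; use ~/.zshrc on zsh):
# echo 'export NOVITA_API_KEY="sk-your-key-here"' >> ~/.bashrc
```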
Installation
Install via npm (Recommended)
```shell
npm install -g @openai/codex
```
Install via Homebrew (macOS)
```shell
brew install codex
```
Verify Installation
```shell
codex --version
```
Configuring Novita AI Models
Setup Configuration File
Codex CLI uses a TOML configuration file located at:
- macOS/Linux: `~/.codex/config.toml`
- Windows: `%USERPROFILE%\.codex\config.toml`
Basic Configuration Template
```toml
model = "MODEL_NAME"
model_provider = "novitaai"

[model_providers.novitaai]
name = "Novita AI"
base_url = "https://api.novita.ai/openai"
http_headers = {"Authorization" = "Bearer YOUR_NOVITA_API_KEY"}
wire_api = "chat"
```
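The template above can be written in one step from the shell. This sketch substitutes the key from an environment variable (assumed to be named `NOVITA_API_KEY`) so it never lands in your shell history; the model name is one example from the list above:

```shell
# Write a Codex CLI config pointing at Novita AI.
# Uses $CODEX_HOME if set (handy for testing), otherwise ~/.codex.
CONFIG_DIR="${CODEX_HOME:-$HOME/.codex}"
mkdir -p "$CONFIG_DIR"

cat > "$CONFIG_DIR/config.toml" <<EOF
model = "qwen/qwen3-coder-480b-a35b-instruct"
model_provider = "novitaai"

[model_providers.novitaai]
name = "Novita AI"
base_url = "https://api.novita.ai/openai"
http_headers = {"Authorization" = "Bearer ${NOVITA_API_KEY}"}
wire_api = "chat"
EOF
```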
Available Models and When to Use Them
| Model | Best For | Strengths |
|---|---|---|
| `deepseek/deepseek-v3.1` | Complex algorithms and architecture | Superior reasoning and problem-solving |
| `qwen/qwen3-coder-480b-a35b-instruct` | Code generation and refactoring | Specialized for programming tasks |
| `moonshotai/kimi-k2-0905` | Agentic workflows and automation | Fast execution, long context handling |
| `openai/gpt-oss-120b` | General development tasks | Reliable baseline performance |
| `zai-org/glm-4.5` | Tool integration and debugging | High success rate for tool calling |
| `google/gemma-3-12b-it` | Lightweight development tasks | Efficient and fast responses |
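Switching between the models in the table only requires changing the `model` line in `config.toml`. One hedged way to script that (the file path and model names follow this guide's examples):

```shell
# Swap the active model in config.toml in place.
CONFIG="${CODEX_HOME:-$HOME/.codex}/config.toml"

# Create a minimal config for demonstration if none exists yet.
mkdir -p "$(dirname "$CONFIG")"
[ -f "$CONFIG" ] || echo 'model = "qwen/qwen3-coder-480b-a35b-instruct"' > "$CONFIG"

# Point Codex CLI at DeepSeek for a reasoning-heavy session
# (sed -i.bak works on both GNU and BSD sed).
sed -i.bak 's|^model = .*|model = "deepseek/deepseek-v3.1"|' "$CONFIG"
```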
Getting Started
Launch Codex CLI
```shell
codex
```
Basic Usage Examples
Code Generation:
> Create a Python class for handling REST API responses with error handling
Project Analysis:
> Review this codebase and suggest improvements for performance
Bug Fixing:
> Fix the authentication error in the login function
Testing:
> Generate comprehensive unit tests for the user service module
Working with Existing Projects
Navigate to your project directory before launching Codex CLI:
```shell
cd /path/to/your/project
codex
```
Codex CLI will automatically understand your project structure, read existing files, and maintain context about your codebase throughout the session.
Conclusion
Codex CLI with Novita AI models provides a powerful, flexible development environment that combines local control with cloud AI capabilities. By choosing the right model for each task and configuring your environment properly, you can significantly accelerate your development workflow while maintaining code quality and security.
Start with the basic configuration using Qwen Coder for general development tasks, then experiment with specialized models like DeepSeek for complex reasoning or Kimi K2 for autonomous workflows as your needs evolve.
About Novita AI
Novita AI is an AI cloud platform that offers developers an easy way to deploy AI models using our simple API, while also providing an affordable and reliable GPU cloud for building and scaling.
Recommended Reading
- How to Use Kimi-K2 in Claude Code on Windows, Mac, and Linux
- How to Use OpenAI Compatible API in Qwen Code (60s Setup!)
- Trae + Novita AI: Step-by-Step Guide to Access AI Models in Your IDE