OpenClaw - Self-Hosted AI Smart Assistant Platform
OpenClaw Tutorial — Install OpenClaw, integrate it with New API, and quickly set up a self-hosted AI assistant. An open-source project supporting multi-channel integration with Telegram, Discord, WhatsApp, and more.
Project Introduction
OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. Designed for developers and advanced users, it allows you to have an autonomous AI assistant without giving up control of your data.
- Official Homepage: https://openclaw.ai
- Project Documentation: https://docs.openclaw.ai
- GitHub: https://github.com/openclaw/openclaw
OpenClaw is completely open source. You can browse the source code, submit issues, or contribute at OpenClaw's GitHub repository. This tutorial covers the complete steps for installation, configuration, and integrating OpenClaw with New API.
🌟 Core Features
Multi-Channel Integration
- Multi-channel integration: Supports various messaging channels like Telegram, Discord, WhatsApp, iMessage, and can be extended to more platforms via plugins.
- Single Gateway: Unified management of all channels through a single Gateway process.
- Voice Support: Supports macOS/iOS/Android voice interaction.
- Canvas Interface: Capable of rendering interactive Canvas interfaces.
Self-Hosting and Data Security
- Fully Self-Hosted: Runs on your own machine or server.
- Open Source & Transparent: MIT open-source license, fully transparent code.
- Data Localization: Context and skills are stored on your local computer, not in the cloud.
Smart Agent Capabilities
- Continuous Operation: Supports persistent background operation with long-term memory.
- Scheduled Tasks: Supports cron-based scheduled tasks.
- Session Isolation: Isolates sessions by agent/workspace/sender.
- Multi-Agent Routing: Supports collaborative work among multiple agents.
- Tool Calling: Native support for tool calling and code execution.
📦 Pre-integration Preparation
Preparation Information
- Node.js 22 or higher
- An available New API address (usually ending with `/v1`)
- An available New API key
Before integrating with New API, it's recommended to first get the Gateway and Control UI running according to OpenClaw's currently recommended official process. This makes it easier to distinguish whether OpenClaw itself hasn't started or if the model provider configuration is incorrect when troubleshooting later.
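Because a base URL missing the `/v1` suffix is the single most common integration error, it can help to normalize the address before putting it in any configuration. A minimal sketch (the `normalize_base_url` helper is hypothetical, not part of OpenClaw or New API):

```python
from urllib.parse import urlparse

def normalize_base_url(url: str) -> str:
    """Strip trailing slashes and append /v1 if it is missing (hypothetical helper)."""
    url = url.rstrip("/")
    if not urlparse(url).scheme:
        raise ValueError(f"not an absolute URL: {url!r}")
    if not url.endswith("/v1"):
        url += "/v1"
    return url

print(normalize_base_url("https://newapi.example.com"))      # https://newapi.example.com/v1
print(normalize_base_url("https://newapi.example.com/v1/"))  # https://newapi.example.com/v1
```

Whatever value this produces for your deployment is what belongs in the provider's `baseUrl` later on.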
1. Install OpenClaw (macOS/Linux)
```bash
curl -fsSL https://openclaw.ai/install.sh | bash
```

For other installation methods, refer to the OpenClaw official documentation: Getting Started.
2. Run the Onboarding Wizard
```bash
openclaw onboard --install-daemon
```

This wizard completes basic authentication, Gateway setup, and optional channel initialization. The goal here is to get OpenClaw running first, then switch the default model to New API later.
3. Check Gateway and Control UI
```bash
openclaw gateway status
openclaw dashboard
```

If your browser can open the Control UI, OpenClaw's basic operation is working. At this stage, there is no need to configure messaging channels like Telegram, Discord, or Feishu yet.
4. Locate the Configuration File
OpenClaw's configuration file is usually located at `~/.openclaw/openclaw.json`. You can continue to modify it based on what the onboarding wizard generates.
Path-Related Environment Variables
If you run OpenClaw under a dedicated service account, or wish to customize the configuration/state directory, you can use:
- `OPENCLAW_HOME`
- `OPENCLAW_STATE_DIR`
- `OPENCLAW_CONFIG_PATH`
For detailed explanations, see the official environment variables documentation: Environment Variables.
🚀 Using New API as a Model Provider
OpenClaw supports integrating custom or OpenAI-compatible model gateways via `models.providers`. For New API, the most common approach is to add it as a custom provider in the configuration, then point the default model to `newapi/MODEL_ID`.
Integration Approach
- Declare a `newapi` provider under `models.providers`.
- Point `baseUrl` to your New API address, ensuring it includes `/v1`.
- Set `api` to `openai-completions`.
- List the model IDs you want OpenClaw to use in `models`.
- Switch the default model to `newapi/...` in `agents.defaults.model.primary`.
Recommended Practice: Store API Keys in Environment Variables
First, provide your New API key in the current shell, service environment, or a .env file readable by OpenClaw:
```bash
export NEWAPI_API_KEY="sk-your-newapi-key"
```

Then, add or modify the following snippet in `openclaw.json`:
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "newapi": {
        "baseUrl": "https://<your-newapi-domain>/v1",
        "apiKey": "${NEWAPI_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "gemini-2.5-flash", "name": "Gemini 2.5 Flash" },
          { "id": "kimi-k2.5", "name": "Kimi K2.5" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "newapi/gemini-2.5-flash",
        "fallbacks": ["newapi/kimi-k2.5"]
      },
      "models": {
        "newapi/gemini-2.5-flash": { "alias": "flash" },
        "newapi/kimi-k2.5": { "alias": "kimi" }
      }
    }
  }
}
```

This is not a complete configuration to be copied verbatim, but rather the part most critical to the New API integration. As long as the provider, model IDs, and default model references match one another, OpenClaw will be able to call the model resources you expose via New API.
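The `${NEWAPI_API_KEY}` placeholder is resolved from the environment rather than stored in the file. A minimal sketch of that kind of substitution, for intuition only (the `expand_placeholders` helper is hypothetical, not OpenClaw's actual implementation):

```python
import os
import re

def expand_placeholders(value: str) -> str:
    """Replace each ${VAR} with os.environ['VAR']; raise if the variable is unset."""
    def repl(m: re.Match) -> str:
        name = m.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return re.sub(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}", repl, value)

os.environ["NEWAPI_API_KEY"] = "sk-your-newapi-key"
print(expand_placeholders("${NEWAPI_API_KEY}"))  # sk-your-newapi-key
```

The practical consequence is the same either way: the process that reads the configuration must be able to see `NEWAPI_API_KEY`, which matters when the Gateway runs as a background service.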
Key Configuration Explanation
| Configuration Item | Description |
|---|---|
| `models.mode` | Recommended: `merge`, which appends `newapi` while retaining OpenClaw's built-in providers. |
| `models.providers.newapi.baseUrl` | Your New API address; usually needs to include `/v1`. |
| `models.providers.newapi.apiKey` | Your New API key; recommended to inject via `${NEWAPI_API_KEY}`. |
| `models.providers.newapi.api` | For OpenAI-compatible gateways like New API, use `openai-completions`. |
| `models.providers.newapi.models` | The model IDs listed here must match the model names actually exposed by your New API. |
| `agents.defaults.model.primary` | Default primary model; the format must be `provider/model-id`. |
| `agents.defaults.model.fallbacks` | Fallback model list; OpenClaw switches to these automatically if the primary model fails. |
| `agents.defaults.models` | Optional; creates aliases for models, convenient for referencing in the UI or in conversations. |
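The cross-references in this table are easy to get wrong by hand, so it can be worth checking a parsed configuration mechanically. A small sketch that catches the two most common mismatches, assuming the JSON layout shown above (this is not an official OpenClaw tool):

```python
def check_config(cfg: dict) -> list[str]:
    """Return a list of problems found in a parsed openclaw.json-style dict."""
    problems = []
    provider = cfg["models"]["providers"]["newapi"]
    # Mismatch 1: baseUrl missing the /v1 suffix.
    if not provider["baseUrl"].rstrip("/").endswith("/v1"):
        problems.append("baseUrl does not end with /v1")
    # Mismatch 2: primary/fallbacks referencing an id not declared under the provider.
    known = {f"newapi/{m['id']}" for m in provider["models"]}
    model_cfg = cfg["agents"]["defaults"]["model"]
    for ref in [model_cfg["primary"], *model_cfg.get("fallbacks", [])]:
        if ref not in known:
            problems.append(f"model reference {ref!r} has no matching provider id")
    return problems

cfg = {
    "models": {"providers": {"newapi": {
        "baseUrl": "https://newapi.example.com/v1",
        "models": [{"id": "gemini-2.5-flash"}, {"id": "kimi-k2.5"}],
    }}},
    "agents": {"defaults": {"model": {
        "primary": "newapi/gemini-2.5-flash",
        "fallbacks": ["newapi/kimi-k2.5"],
    }}},
}
print(check_config(cfg))  # []
```

An empty result only means the references are internally consistent; whether the listed IDs actually exist on your New API deployment still has to be verified against the gateway itself.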
Verify Successful Integration
After completing the configuration, return to or reopen the Control UI:
```bash
openclaw dashboard
```

If you can initiate conversations normally in OpenClaw and the default model has become `newapi/...`, the integration is successful. You can also run:

```bash
openclaw models list
```

to confirm that models with the `newapi/` prefix appear in the selectable list.
Common Issues
- `baseUrl` without `/v1`: this is one of the most common integration errors.
- Incorrect model ID: `primary` and `fallbacks` must correspond to the `id` values in `models.providers.newapi.models`.
- Key only effective in the current terminal: if the Gateway runs as a background service, ensure the service process can also read `NEWAPI_API_KEY`.
- Foreground troubleshooting: run the Gateway in the foreground with `openclaw gateway --port 18789` to observe logs and errors.