Chapter 2: Build Your First Agent in 15 Minutes: Installation, Configuration, and Your First Conversation
2.1 System Requirements Explained
Before starting the installation, understanding the reasoning behind each system requirement will help you avoid common mistakes.
Node.js 24+
OpenClaw strictly requires Node.js 24 or higher. This is not an arbitrary version constraint — there are clear technical reasons behind it:
- Native WebSocket support: Node.js 22 introduced an experimental built-in WebSocket; Node.js 24 stabilized it. The Gateway's WebSocket server (localhost:18789) relies on this native implementation, avoiding the historical baggage of the `ws` library.
- Native Fetch API stabilized: Pi's internal HTTP calls all use the native `fetch` API, which is fully stable in Node.js 24.
- V8 engine optimizations: Node.js 24's bundled V8 has significant improvements for concurrent Promise handling, which is critical for Command Queue throughput.
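You can confirm that your runtime actually exposes these native APIs with a quick generic Node.js check (this is plain Node code, not an OpenClaw command):

```javascript
// probe-globals.js — check whether the runtime provides the native
// fetch and WebSocket globals that recent Node.js versions ship with
const features = {
  fetch: typeof globalThis.fetch === "function",
  WebSocket: typeof globalThis.WebSocket === "function",
};

for (const [name, available] of Object.entries(features)) {
  console.log(`${name}: ${available ? "available" : "missing"}`);
}
```

Run it with `node probe-globals.js`; on Node.js 24 both lines should report "available".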
# Check your current Node.js version
node --version
# Should output v24.x.x or higher
# If the version doesn't meet requirements, install via nvm
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
source ~/.bashrc # or source ~/.zshrc
nvm install 24
nvm use 24
nvm alias default 24
Memory Requirements: 8GB RAM
8GB is the recommended minimum for the following reasons:
- Pi Agent Core process itself: ~512MB
- Gateway process (including all Channel Bridges): ~256MB–512MB
- Local SQLite (with WAL mode): ~128MB
- Local LLM inference (if using Ollama, etc.): 4GB–6GB
If you're using a cloud LLM API (Anthropic, OpenAI), local memory requirements can drop to 2GB, but 8GB is still the recommended configuration to handle multi-Session concurrency.
LLM API Key
OpenClaw requires at least one LLM Provider API Key. Supported providers:
| Provider | Environment Variable | Recommended Model |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | claude-opus-4-5, claude-sonnet-4-5 |
| OpenAI | OPENAI_API_KEY | gpt-4o, gpt-4o-mini |
| Google | GOOGLE_API_KEY | gemini-2.0-flash |
| Ollama (local) | No key needed | llama3.3, qwen2.5 |
| OpenAI-compatible | CUSTOM_API_KEY | Any compatible API |
Storage Requirements
- Minimal installation: 500MB (Node modules + core packages)
- Recommended: 5GB+ (Workspace knowledge base index, session history, logs)
- With local LLM: Additional 4GB–20GB (model files)
Port Requirements
| Port | Purpose | Configurable |
|---|---|---|
| 18789 | Gateway WebSocket | Yes |
| 18790 | Control UI HTTP | Yes |
| 18791 | Plugin Registry HTTP | Yes |
2.2 Three Installation Methods
Method 1: curl Script (Recommended for Beginners)
curl -fsSL https://openclaw.ai/install.sh | bash
This script will:
- Detect your operating system and Node.js version
- Install the `openclaw` CLI globally via npm
- Create the default configuration directory `~/.openclaw/`
- Add `openclaw` to PATH
- Run a post-installation self-check (`openclaw doctor`)
After successful installation, you'll see:
✓ OpenClaw v1.5.2 installed successfully
✓ Node.js v24.3.0 detected
✓ Default config directory created at ~/.openclaw/
✓ Run `openclaw onboard` to get started
Method 2: npm Global Install
# Using npm
npm install -g openclaw@latest
# Using pnpm (recommended — matches the framework's own package manager)
pnpm add -g openclaw@latest
# Using yarn
yarn global add openclaw@latest
After global installation, the openclaw CLI is available from any directory.
Method 3: Source Installation (Developers / Deep Customization)
Source installation is for users who need to modify core code, contribute to the project, or use the latest unreleased features.
# Clone the repository
git clone https://github.com/openclaw/openclaw.git
cd openclaw
# Install dependencies (must use pnpm — the monorepo structure requires pnpm workspaces)
pnpm install
# Build all packages
pnpm build
# Link to global (so local version overrides any installed global version)
pnpm link --global
# Verify
openclaw --version
The monorepo package structure:
packages/
openclaw-cli/ # CLI entry point
pi-ai/ # LLM Provider abstraction layer
pi-agent-core/ # Pi core execution engine
pi-coding-agent/ # Extended Agent for coding scenarios
pi-tui/ # Terminal UI (TUI mode Control UI)
gateway/ # Integration Gateway
plugin-sdk/ # Plugin development SDK
skills-runtime/ # Skills execution runtime
integrations/
whatsapp/ # WhatsApp Business Channel Bridge
telegram/ # Telegram Channel Bridge
slack/ # Slack Channel Bridge
... (50+ integration packages)
2.3 The onboard Initialization Flow
After installation, run openclaw onboard to launch the interactive setup wizard:
openclaw onboard
The wizard guides you through the following steps:
? Welcome to OpenClaw! Let's set up your first agent.
? Choose your primary LLM provider:
❯ Anthropic (Claude)
OpenAI
Google Gemini
Ollama (local)
Custom OpenAI-compatible
? Enter your Anthropic API Key: sk-ant-api03-...
✓ API Key validated
? Choose your first channel:
❯ Terminal (for testing)
Telegram
Discord
Slack
WhatsApp Business
Skip (configure later)
? Choose a channel name: my-terminal
? Enter a name for your agent: MyFirstAgent
? Choose agent personality:
❯ General Assistant
Customer Support
Developer Assistant
Custom
✓ Configuration written to ~/.openclaw/openclaw.json
✓ First agent "MyFirstAgent" created
? Run the agent now? (Y/n) Y
After initialization is complete, the configuration file is generated automatically.
2.4 Understanding openclaw.json — The Minimal Configuration
The minimal configuration file generated by onboard is located at ~/.openclaw/openclaw.json:
{
"version": "1",
"gateway": {
"port": 18789,
"controlUiPort": 18790
},
"llm": {
"providers": [
{
"id": "anthropic-primary",
"type": "anthropic",
"apiKey": "${ANTHROPIC_API_KEY}",
"defaultModel": "claude-sonnet-4-5-20251201"
}
],
"defaultProvider": "anthropic-primary"
},
"channels": [
{
"id": "terminal-test",
"type": "terminal",
"name": "Terminal Test Channel"
}
],
"agents": [
{
"id": "my-first-agent",
"name": "MyFirstAgent",
"llmProvider": "anthropic-primary",
"channels": ["terminal-test"],
"systemPrompt": "You are a helpful assistant named MyFirstAgent.",
"memory": {
"shortTerm": {
"maxMessages": 50
}
}
}
]
}
Configuration Field Reference
gateway section: Controls the network configuration of the Gateway process. port is the internal WebSocket port (only Pi processes connect here); controlUiPort is the Control UI HTTP port for browser access.
llm.providers section: Supports configuring multiple Providers; Pi selects which one to use based on Agent configuration. apiKey supports environment variable reference syntax ${ENV_VAR} — never hard-code API Keys in the configuration file.
channels section: Defines message sources. The terminal type is a built-in test Channel that accepts messages typed directly in the terminal, requiring no external account configuration.
agents section: Each Agent specifies which LLM Provider to use, which Channels to listen on, and what System Prompt to use.
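The `${ENV_VAR}` reference syntax can be understood as a substitution pass over the parsed configuration. The sketch below is an illustrative reimplementation of that idea, not OpenClaw's actual code:

```javascript
// expand-env.js — illustrative sketch of ${ENV_VAR} substitution in config values
function expandEnv(value, env = process.env) {
  if (typeof value === "string") {
    // Replace every ${NAME} occurrence with the matching environment variable
    return value.replace(/\$\{([A-Z0-9_]+)\}/g, (_, name) => env[name] ?? "");
  }
  if (Array.isArray(value)) return value.map((v) => expandEnv(v, env));
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [k, expandEnv(v, env)])
    );
  }
  return value; // numbers, booleans, null pass through unchanged
}

const config = { llm: { providers: [{ apiKey: "${ANTHROPIC_API_KEY}" }] } };
const expanded = expandEnv(config, { ANTHROPIC_API_KEY: "sk-ant-demo" });
console.log(expanded.llm.providers[0].apiKey); // "sk-ant-demo"
```

Resolving the key at load time is what lets the on-disk file stay free of secrets.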
Using Environment Variables
# Set in your shell profile (~/.bashrc or ~/.zshrc)
export ANTHROPIC_API_KEY="sk-ant-api03-your-key-here"
# Or use a .env file (OpenClaw automatically loads .env from the project directory)
cat > ~/.openclaw/.env << 'EOF'
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
OPENAI_API_KEY=sk-your-openai-key
EOF
2.5 Connecting Anthropic Claude and Sending Your First Message
Once configured, start OpenClaw:
# Method 1: Foreground (development and debugging)
openclaw start
# Method 2: Background daemon
openclaw start --daemon
# Method 3: Quick test with Terminal Channel only
openclaw chat
When using openclaw chat, you enter the Terminal Channel interactive interface directly:
OpenClaw v1.5.2 | Agent: MyFirstAgent | Channel: terminal-test
Type your message and press Enter. Type /help for commands.
──────────────────────────────────────────────────────────────
You: Hello! Please introduce yourself.
MyFirstAgent: Hello! I'm MyFirstAgent, an AI assistant powered by Anthropic
Claude and running on the OpenClaw framework. I can help you with a wide
range of tasks — answering questions, analyzing information, assisting with
writing, and more. How can I help you today?
You: /stats
Session Stats:
Session ID: sess_01HX7K2M...
Messages: 2 (1 user, 1 assistant)
Tokens used: 312 (input: 278, output: 34)
LLM Provider: anthropic-primary (claude-sonnet-4-5-20251201)
Latency: 1.23s (last response)
You: /quit
Goodbye!
Verifying the API Connection
To verify that your API Key is valid before starting:
openclaw doctor --check-llm
# Example output:
✓ Anthropic API: Connected (claude-sonnet-4-5-20251201 available)
✓ Latency: 342ms
✓ Rate limit: 50 RPM remaining
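You can also check connectivity without OpenClaw at all, by calling Anthropic's public Messages API directly with Node's native `fetch`. The endpoint and headers below follow Anthropic's documented REST API; the model name is the one from this chapter's configuration:

```javascript
// anthropic-ping.js — minimal connectivity check against the Anthropic
// Messages API using Node's native fetch (independent of OpenClaw)
async function checkAnthropic(apiKey, model = "claude-sonnet-4-5") {
  if (!apiKey) return "no API key provided";
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model,
      max_tokens: 16,
      messages: [{ role: "user", content: "ping" }],
    }),
  });
  return res.ok ? "API key valid" : `request failed: ${res.status}`;
}

checkAnthropic(process.env.ANTHROPIC_API_KEY).then(console.log);
```

A 401 response here means the key itself is the problem, not your OpenClaw configuration.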
2.6 Starting the Gateway and Accessing Control UI
When the volume of messages your Agent handles increases, or when you need to manage multiple Channels simultaneously, you'll need to start the full Gateway process and use the Control UI.
Starting the Full Gateway
openclaw start
Example startup log:
[2026-04-26T10:00:00Z] INFO Gateway starting...
[2026-04-26T10:00:00Z] INFO Plugin Registry initialized (0 plugins loaded)
[2026-04-26T10:00:01Z] INFO Channel Bridge: terminal-test → READY
[2026-04-26T10:00:01Z] INFO Command Queue initialized
└─ Lanes: Global(4) Session(1) SubAgent(8) Cron(∞)
[2026-04-26T10:00:01Z] INFO Pi Agent "MyFirstAgent" registered
[2026-04-26T10:00:01Z] INFO WebSocket server listening on ws://localhost:18789
[2026-04-26T10:00:01Z] INFO Control UI available at http://localhost:18790
[2026-04-26T10:00:01Z] INFO OpenClaw ready ✓
Accessing Control UI
Open http://localhost:18790 in your browser to see the Control UI. It provides:
- Real-time monitoring dashboard: Displays active status, message throughput, and error rates for all Sessions
- Session management: View conversation history for all active and historical Sessions
- Human-in-the-loop approval queue: List of high-risk actions pending human approval
- Agent configuration editor: Edit openclaw.json online with real-time validation
- Log viewer: Structured logs with filtering and search
- Plugin management: Install, enable, and disable Plugins
Configuring Remote Access to Control UI
By default, Control UI only binds to localhost. If you need access from another machine (such as a server deployment):
{
"gateway": {
"controlUi": {
"host": "0.0.0.0",
"port": 18790,
"auth": {
"enabled": true,
"username": "admin",
"passwordHash": "${CONTROL_UI_PASSWORD_HASH}"
}
}
}
}
Generate a password hash:
openclaw util hash-password "your-secure-password"
# Output: $argon2id$v=19$m=65536,t=3,p=4$...
Security warning: In production environments, it is strongly recommended to place a reverse proxy (Nginx/Caddy) in front of the Control UI and enable HTTPS and a strong password.
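As a sketch of that recommendation, a minimal Nginx site definition might look like the following (the domain and certificate paths are placeholders you must replace with your own):

```nginx
# Illustrative reverse proxy for the Control UI — placeholders, not a drop-in config
server {
    listen 443 ssl;
    server_name control.example.com;

    ssl_certificate     /etc/letsencrypt/live/control.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/control.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:18790;
        # Forward WebSocket upgrade headers so live dashboards keep working
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```

With this in place, the Control UI can keep binding to localhost only, and TLS termination plus access control happen at the proxy.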
2.7 Troubleshooting Common Errors
Error 1: Port Conflict
Error: Address already in use (EADDRINUSE): 18789
Cause: Another process (possibly an old OpenClaw instance) is occupying port 18789.
Diagnosis steps:
# Find the process occupying the port
lsof -i :18789
# or
ss -tulpn | grep 18789
# Terminate the old process
openclaw stop # Use OpenClaw's built-in stop command
# Or force kill
kill -9 $(lsof -t -i:18789)
Configuration approach: If you need to run multiple instances, change the ports in configuration:
{
"gateway": {
"port": 18800,
"controlUiPort": 18801
}
}
Error 2: Node.js Version Insufficient
Error: OpenClaw requires Node.js >= 24.0.0, found v20.11.0
Solution:
# Switch version using nvm
nvm install 24 && nvm use 24
# Verify
node --version # Should output v24.x.x
# Retry
openclaw start
Error 3: Invalid or Expired API Key
LLM Error: Authentication failed (401)
Provider: anthropic-primary
Model: claude-sonnet-4-5-20251201
Diagnosis steps:
# Check whether the environment variable is set correctly
echo $ANTHROPIC_API_KEY
# Use the doctor command to validate
openclaw doctor --check-llm --verbose
# If you need to update the key, edit configuration or update the environment variable
export ANTHROPIC_API_KEY="sk-ant-api03-new-key"
openclaw restart
Error 4: pnpm Not Installed (for Source Installation)
Error: pnpm is required for monorepo management
# Install pnpm
npm install -g pnpm@latest
# Verify
pnpm --version
Error 5: Configuration File JSON Parse Error
Error: Failed to parse openclaw.json
SyntaxError: Unexpected token } in JSON at position 847
Diagnosis steps:
# Use the built-in configuration validation tool
openclaw config validate
# Or use jq to check JSON syntax
cat ~/.openclaw/openclaw.json | jq .
Common JSON mistakes:
- Trailing commas (JSON does not allow them)
- Unescaped special characters inside strings
- Comments (JSON doesn't support them — use a `_comment` key as a workaround)
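The trailing-comma failure mode is easy to reproduce directly in Node, which uses the same strict `JSON.parse` behavior any JSON config loader is subject to:

```javascript
// json-pitfalls.js — demonstrate why a trailing comma breaks JSON.parse
const bad = '{ "gateway": { "port": 18800, } }'; // trailing comma after 18800
const good = '{ "gateway": { "port": 18800 } }';

try {
  JSON.parse(bad);
} catch (err) {
  console.log(`Parse failed: ${err.name}`); // SyntaxError
}

console.log(JSON.parse(good).gateway.port); // 18800
```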
Error 6: Memory Database Locked
Error: SQLite database is locked
Path: ~/.openclaw/data/memory.db
Cause: Another OpenClaw instance is writing to the database. OpenClaw uses the Single-writer pattern and does not support multiple processes writing concurrently.
# Stop all OpenClaw processes
openclaw stop --all
# If processes are unresponsive, force cleanup
# (caution: deleting WAL/SHM files can discard uncheckpointed writes —
# only do this after confirming no OpenClaw process is still running)
rm ~/.openclaw/data/memory.db-wal
rm ~/.openclaw/data/memory.db-shm
openclaw start
Quick Diagnostic Command
When encountering any problem, run this first:
openclaw doctor
Example output (when issues are present):
OpenClaw Doctor v1.5.2
──────────────────────────────────────
✓ Node.js: v24.3.0
✗ Port 18789: In use by PID 12847
✓ Anthropic API: Connected
✓ Config: Valid JSON
✗ Memory DB: Locked (stale WAL file detected)
→ Run: openclaw fix --memory-db
Suggested fixes:
openclaw fix --port-conflict
openclaw fix --memory-db
Summary
In this chapter, we completed the full journey from OpenClaw installation to your first conversation:
- Understood the reasoning behind system requirements like Node.js 24+ and 8GB RAM
- Mastered three installation methods and their respective use cases
- Completed initialization using the `onboard` wizard
- Understood what each configuration field in `openclaw.json` means
- Successfully sent a first message and validated the API connection
- Learned how to access the Control UI for visual management
- Acquired troubleshooting skills for six common error categories
In the next chapter, we will go deep into OpenClaw's core concepts: the relationship between Gateway, Pi, Skills, Plugins, and Memory, and how data flows between these components.