AI Native Lang

How to run OpenClaw with completely free AI models using OpenRouter (zero-cost setup)

Get an OpenRouter API key step by step, plug it into OpenClaw (openclaw.json or env), and use $0-priced :free models. The same OpenRouter API works with Hermes Agent and other OpenRouter-capable stacks.

March 24, 2026 · 7 min read
#openclaw #openrouter #free-models #cost #how-to #agents #hermes-agent

How to run OpenClaw with completely free AI models using OpenRouter

OpenClaw gives you a powerful, self-hosted AI agent with tools, memory, and automation. For many people the friction is paying for model API credits.

OpenRouter exposes an OpenAI-compatible HTTP API (https://openrouter.ai/api/v1/...) and lists many free preview models (tagged :free or priced $0 / $0 for prompt and completion). The same API key and model ids work in any stack that supports OpenRouter—not only OpenClaw.
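Because the API is OpenAI-compatible, the request shape is the same wherever you use the key. A minimal sketch of what a chat-completion call looks like on the wire (build_openrouter_request is our own helper name, and the model slug is a placeholder—use a real id from openrouter.ai/models):

```python
# Sketch of the OpenAI-compatible request shape OpenRouter expects.
# build_openrouter_request is an illustrative helper, not an OpenRouter API.

def build_openrouter_request(api_key: str, model: str, user_message: str) -> dict:
    """Return the URL, headers, and JSON body for one chat-completion call."""
    return {
        "url": "https://openrouter.ai/api/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,  # e.g. a model id ending in :free
            "messages": [{"role": "user", "content": user_message}],
        },
    }
```

Any HTTP client (curl, requests, your agent stack's own plumbing) can send this; only the Bearer key and the model slug change between tools.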

New to OpenClaw? Install and onboard first: How to install & set up AINL with OpenClaw.


Not just OpenClaw: Hermes Agent and other agent platforms

If a product or framework lets you point it at OpenRouter (or at a custom OpenAI-compatible base URL with Bearer auth), you can use the workflow below to get the key and model names, then follow that product's config UI or env vars.

Examples of the pattern:

  • OpenClaw — first-class openrouter/... model ids and OPENROUTER_API_KEY (this guide uses OpenClaw as the concrete example).
  • Hermes Agent (Nous Research) and other self-hosted agent stacks — when they support OpenRouter as a provider, you typically set the OpenRouter base URL (often https://openrouter.ai/api/v1), OPENROUTER_API_KEY (or their equivalent env var), and the model slug exactly as OpenRouter documents it (the same ids you see on openrouter.ai/models). See their docs for where those values live (e.g. ~/.hermes/ / .env patterns on Hermes).
  • MCP hosts, coding agents, and automations — anything that offers “OpenRouter” or “OpenAI-compatible endpoint” + API key can consume free :free models the same way, subject to that app’s own config shape.

OpenClaw-specific wiring (file paths, openrouter/free alias, openclaw.json) is in the sections below; for Hermes or another tool, substitute their environment variables or settings panel using the key from Step 1 and the model ids from Step 3.


Step 1: Get your OpenRouter API key (step by step)

You only create the key once; you’ll reuse it in OpenClaw’s env block (or in your shell). Free-tier models do not require a credit card on OpenRouter’s side at the time of writing—always confirm on openrouter.ai if their policy changes.

  1. Open the site
    Go to https://openrouter.ai/ in your browser.

  2. Sign up or log in
    Use Sign up if you don’t have an account, or Log in if you do. Complete email verification if prompted.

  3. Open API keys
    After login, open your account menu (usually top-right) and go to Keys, or go to Settings and find API Keys (wording may be “API keys” or “Keys” depending on the UI revision).

  4. Create a new key
    Click Create key (or Create API key). Optionally give it a name (e.g. openclaw-home) so you can revoke it later without guessing.

  5. Copy the key immediately
    OpenRouter shows the secret once in full—something like sk-or-v1-.... Copy it to a password manager or a temporary local note. If you lose it, revoke the old key and create a new one; you cannot recover the raw secret again from the dashboard.

  6. Keep it private

    • Do not paste keys into public chats, tickets, or screenshots.
    • Do not commit openclaw.json with a real key to git—use a placeholder in shared docs and replace locally.

You will paste this value into OpenClaw as OPENROUTER_API_KEY in Step 2.


Step 2: Wire the key into OpenClaw

OpenClaw needs the key available to the gateway / daemon process as the environment variable OPENROUTER_API_KEY. The usual place is your openclaw.json env object so every restart picks it up.

2a. Locate your config

On a typical install the file is:

~/.openclaw/openclaw.json

If you used a custom config path during install, use that path instead.

2b. Add or merge env.OPENROUTER_API_KEY

Open the file in an editor:

nano ~/.openclaw/openclaw.json

Find or create a top-level env object and set your real key (not the placeholder):

{
  "env": {
    "OPENROUTER_API_KEY": "sk-or-v1-PASTE_YOUR_KEY_FROM_STEP_1"
  }
}

If env already exists with other variables, add only the OPENROUTER_API_KEY line—do not delete unrelated entries.
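If you would rather script that merge (on a headless box, say), a minimal sketch that preserves existing env entries could look like this—set_openrouter_key is our own helper name, and the nesting assumed is exactly the env object shown above:

```python
import json
from pathlib import Path

def set_openrouter_key(config_path: Path, api_key: str) -> None:
    """Merge OPENROUTER_API_KEY into openclaw.json without touching other entries."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    env = config.setdefault("env", {})
    env["OPENROUTER_API_KEY"] = api_key  # only this one entry is added/updated
    config_path.write_text(json.dumps(config, indent=2) + "\n")
```

Because it round-trips the JSON, unrelated variables in env (and everything else in the file) survive the edit.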

Alternative (advanced): export in the shell before starting OpenClaw, e.g. export OPENROUTER_API_KEY=sk-or-v1-..., then launch the gateway the same way you usually do. Prefer openclaw.json so the key survives reboots and matches OpenClaw’s OpenRouter docs.

2c. Save and restart OpenClaw

openclaw restart

If that subcommand is missing, restart the gateway / daemon the same way you did after install & onboard (e.g. service manager or dashboard).

2d. Quick sanity check

Send a short test message to your agent. If OpenRouter rejects the key, you’ll see auth errors in logs—double-check copy/paste (no trailing spaces) and that the key is still active in the OpenRouter dashboard.
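Before blaming OpenRouter, a quick local check catches the usual copy/paste mistakes (trailing whitespace, wrong prefix, embedded newline). check_key_format is our own helper, and the sk-or-v1- prefix is the pattern at the time of writing:

```python
def check_key_format(key: str) -> list[str]:
    """Return a list of likely copy/paste problems with an OpenRouter key."""
    problems = []
    if key != key.strip():
        problems.append("leading/trailing whitespace")
    if not key.strip().startswith("sk-or-v1-"):
        problems.append("missing the usual sk-or-v1- prefix")
    if "\n" in key:
        problems.append("embedded newline")
    return problems
```

An empty list means the key at least looks right; auth can still fail if the key was revoked in the dashboard.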


Step 3: Pick free models on OpenRouter

  1. Open OpenRouter Models.
  2. Use the sort/filter controls and sort by lowest prompt price (or equivalent “cheapest first”).
  3. Pick models that show $0 / $0 or include :free in the model id.

Examples that are often useful (March 2026—verify on the site):

  • arcee-ai/trinity-large-preview:free — strong general-purpose option
  • nvidia/nemotron-3-super-120b-a12b:free
  • stepfun/step-3.5-flash:free
  • z-ai/glm-4.5-air:free
  • Various Llama, Qwen, and Gemma free variants

Beginner shortcut: use the smart router openrouter/free — OpenRouter picks an available free model per request. Good default while you learn the stack.
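The selection rule in steps 2–3 is mechanical: keep anything tagged :free or priced $0 / $0. Given model entries shaped roughly like OpenRouter's public model listing (the field names here are illustrative—verify against the actual response), the filter is a few lines:

```python
def free_models(models: list[dict]) -> list[str]:
    """Keep models tagged :free in the id, or priced $0 for prompt and completion."""
    picked = []
    for m in models:
        pricing = m.get("pricing", {})
        zero_cost = pricing.get("prompt") == "0" and pricing.get("completion") == "0"
        if m["id"].endswith(":free") or zero_cost:
            picked.append(m["id"])
    return picked
```

Run it over the listing, then keep two or three ids you like as candidates for Step 4.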


Step 4: Point OpenClaw at those models

With OPENROUTER_API_KEY already set in Step 2, you only need to set which OpenRouter models OpenClaw uses.

OpenClaw model ids for OpenRouter typically look like:

openrouter/<provider>/<model-id>

or, for the free router:

openrouter/free

You can wire this in two ways, depending on whether you already have a working model.

Method A: You already have a working model (easiest)

If you can already chat with OpenClaw:

  1. Open a chat with your agent.
  2. Ask in plain language, for example:
    • “Switch my primary model to openrouter/free and add openrouter/arcee-ai/trinity-large-preview:free and openrouter/nvidia/nemotron-3-super-120b-a12b:free as fallbacks.”
    • Or: “Edit the config to set the primary model to openrouter/arcee-ai/trinity-large-preview:free.”
  3. Confirm when it offers to edit ~/.openclaw/openclaw.json (or your install’s config path).
  4. Restart OpenClaw (or follow your install’s restart flow—some setups accept a restart command in chat).

The agent updates the config for you. Ensure OPENROUTER_API_KEY remains in env from Step 2.

Method B: No working model yet (manual config)

  1. On the machine where OpenClaw runs, open ~/.openclaw/openclaw.json again:
nano ~/.openclaw/openclaw.json
  2. Merge or add something like the following. Keep your real OPENROUTER_API_KEY from Step 1 inside env. Adjust structure to match the current schema in OpenClaw OpenRouter docs if your file already has different nesting:
{
  "env": {
    "OPENROUTER_API_KEY": "sk-or-v1-PASTE_YOUR_KEY_FROM_STEP_1"
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "openrouter/free"
      },
      "models": {
        "openrouter/free": {},
        "openrouter/arcee-ai/trinity-large-preview:free": {},
        "openrouter/nvidia/nemotron-3-super-120b-a12b:free": {}
      }
    }
  }
}
  3. Save, then restart:
openclaw restart

If openclaw restart is not available on your build, use the same restart steps you used after install (gateway/daemon, etc.).
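Hand-edited JSON is easy to get subtly wrong. A small sanity check over the nesting used in the example above (check_model_config is our own helper; adjust the paths if your schema differs) flags the two most common mistakes—a missing key and a primary model that isn't in the models map:

```python
def check_model_config(config: dict) -> list[str]:
    """Flag common mistakes in the config shape used in this guide."""
    issues = []
    if not config.get("env", {}).get("OPENROUTER_API_KEY"):
        issues.append("env.OPENROUTER_API_KEY is missing or empty")
    defaults = config.get("agents", {}).get("defaults", {})
    primary = defaults.get("model", {}).get("primary")
    if not primary:
        issues.append("no primary model set")
    elif primary not in defaults.get("models", {}):
        issues.append(f"primary {primary!r} is not listed under models")
    return issues
```

Load your openclaw.json with json.loads and run this before restarting; an empty list means the shape matches the example.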


Bonus: CLI helpers

If your OpenClaw build includes model subcommands, they are convenient for discovery and switching:

openclaw models scan
openclaw models list
openclaw models set openrouter/free

If a command is missing, use the dashboard or edit openclaw.json per the official docs.


Tips for a good free-tier experience

  • Start with openrouter/free as primary, then add 2–3 explicit :free models for redundancy.
  • Rate limits apply on free models and vary by model; they’re usually fine for personal experimentation.
  • You can add paid models later without restructuring the rest of your config.
  • Different free models excel at coding, reasoning, or creative tasks—benchmark with small prompts and keep the winners in models.
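To compare free models on your own prompts, any "send prompt, get text" callable can be timed with a tiny harness—ask_model below is a stand-in for however your stack invokes a model (chat UI, HTTP call, CLI), not an OpenClaw API:

```python
import time

def benchmark(ask_model, prompts):
    """Time each prompt against a model callable; return (seconds, reply) pairs."""
    results = []
    for prompt in prompts:
        start = time.perf_counter()
        reply = ask_model(prompt)
        results.append((time.perf_counter() - start, reply))
    return results
```

Run the same short prompt set through each candidate model and keep the ids that give the best replies in your models map.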

Summary

OpenRouter gives you one API key and model slugs that work across OpenClaw, Hermes Agent, and any stack with OpenRouter or OpenAI-compatible routing.

For OpenClaw specifically: Step 1 (create key) → Step 2 (OPENROUTER_API_KEY in openclaw.json + restart) → Steps 3–4 (pick :free models, set primary to openrouter/free or a specific openrouter/.../:free id). Paid usage is optional.

Useful links

For vision / image models or a heavier multi-model template, start from the OpenRouter model list and the OpenClaw provider page above, then extend the models map the same way.


AI Native Lang Team

The team behind AI Native Lang — building deterministic AI workflow infrastructure.
