
Documentation Index

Fetch the complete documentation index at: https://docs-apexspriteai.reliatrack.org/llms.txt

Use this file to discover all available pages before exploring further.

Claude Code reads a JSON configuration file at startup to decide where to send API requests and how to behave. By default it connects to Anthropic’s cloud API, which requires an active Anthropic subscription. ApexSpriteAI overrides this default by pointing Claude Code at your local LM Studio server instead. This page explains the configuration file, the environment variable alternative, and where MCP settings are stored.

The ~/.claude/config.json file

Claude Code stores its configuration in ~/.claude/config.json on your Mac. If the file does not exist yet, create it — Claude Code reads it automatically on each invocation. The most important setting is ANTHROPIC_BASE_URL inside the env block. When it is present, Claude Code bypasses Anthropic’s OAuth flow entirely and routes all /v1/messages requests to the URL you specify. This is how ApexSpriteAI directs traffic to LM Studio.

Example configuration for remote GPU (Tailscale)

{
  "env": {
    "ANTHROPIC_BASE_URL": "http://100.82.56.40:1234"
  }
}
Replace 100.82.56.40 with your GPU server’s Tailscale IP address.

Example configuration for local-only mode

{
  "env": {
    "ANTHROPIC_BASE_URL": "http://localhost:1234"
  }
}
Do not include a trailing slash or the /v1/messages path in ANTHROPIC_BASE_URL. Claude Code appends the path automatically.
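This mistake is easy to catch before it causes confusing errors. The helper below is an illustrative sketch, not part of Claude Code:

```shell
# Illustrative helper (not part of Claude Code): flag common
# ANTHROPIC_BASE_URL mistakes before they reach Claude Code.
check_base_url() {
  case "$1" in
    */)     echo "invalid: trailing slash" ;;
    */v1/*) echo "invalid: path included" ;;
    *)      echo "ok" ;;
  esac
}

check_base_url "http://100.82.56.40:1234"     # ok
check_base_url "http://100.82.56.40:1234/"    # invalid: trailing slash
```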

Full config.json with MCP settings

If you have MCP servers configured, your file will contain an additional mcpServers key alongside the env block:
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://100.82.56.40:1234"
  },
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/yourname/projects"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."
      }
    }
  }
}
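A stray comma or missing brace in this file can prevent your settings from being picked up, so it is worth validating the JSON after editing by hand. A minimal check, assuming python3 is on your PATH (it ships with macOS):

```shell
# Validate a JSON file with Python's built-in json.tool module.
validate_config() {
  if python3 -m json.tool "$1" > /dev/null 2>&1; then
    echo "valid JSON"
  else
    echo "invalid JSON"
  fi
}

validate_config ~/.claude/config.json
```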
~/.claude/config.json may contain API tokens for MCP servers. Do not commit this file to version control. Add ~/.claude/config.json to your global .gitignore.
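One way to set this up, assuming ~/.gitignore_global as your global ignore file (any path works):

```shell
# Point git at a global ignore file, then exclude the .claude directory.
git config --global core.excludesFile ~/.gitignore_global
echo ".claude/" >> ~/.gitignore_global
```

Note that the .claude/ pattern also matches project-local .claude directories; narrow it if that is not what you want.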

Setting ANTHROPIC_BASE_URL as a shell environment variable

As an alternative to config.json, you can export ANTHROPIC_BASE_URL in your shell profile. This is useful when you want to switch between local and cloud mode without editing a file, or when running Claude Code in CI pipelines. Add the following line to your ~/.zshrc or ~/.bash_profile, substituting your own server address:
export ANTHROPIC_BASE_URL="http://100.82.56.40:1234"
Reload your shell after saving:
source ~/.zshrc
When both config.json and a shell environment variable are present, the shell environment variable takes precedence.
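To see which value each source would supply, you can inspect both at once. The helper below is an illustrative sketch, assuming python3 is available:

```shell
# Print the base URL from each source; the shell variable wins when both are set.
read_config_url() {
  python3 - "$1" <<'PY'
import json, pathlib, sys

path = pathlib.Path(sys.argv[1]).expanduser()
cfg = json.loads(path.read_text()) if path.exists() else {}
print(cfg.get("env", {}).get("ANTHROPIC_BASE_URL", "<unset>"))
PY
}

echo "shell env:   ${ANTHROPIC_BASE_URL:-<unset>}"
echo "config.json: $(read_config_url ~/.claude/config.json)"
```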

MCP configuration

MCP (Model Context Protocol) servers are registered with the claude mcp add command. Claude Code writes the resulting configuration into ~/.claude/config.json under the mcpServers key automatically — you do not need to edit the file by hand when adding a new MCP server.
claude mcp add filesystem npx -y @modelcontextprotocol/server-filesystem /path/to/projects
After running this command, verify the entry was written:
cat ~/.claude/config.json
The mcpServers block should contain a new entry for filesystem.
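For a quicker check than reading the whole file, you can list just the registered server names; a small sketch, again assuming python3:

```shell
# Print the names of all MCP servers registered in a Claude Code config file.
list_mcp_servers() {
  python3 - "$1" <<'PY'
import json, pathlib, sys

path = pathlib.Path(sys.argv[1]).expanduser()
cfg = json.loads(path.read_text()) if path.exists() else {}
for name in sorted(cfg.get("mcpServers", {})):
    print(name)
PY
}

list_mcp_servers ~/.claude/config.json
```

Run it again after each claude mcp add to confirm the new entry appears.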
Because MCP tools execute locally on your Mac — not on the remote GPU server — they work regardless of whether you are in local-only or remote GPU mode. The model on the remote server returns a JSON payload instructing Claude Code which tool to call, and Claude Code executes the tool locally.
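For illustration, a tool call in an Anthropic-style Messages API response looks roughly like the following (the id and input values here are hypothetical):

```json
{
  "type": "tool_use",
  "id": "toolu_abc123",
  "name": "read_file",
  "input": { "path": "/Users/yourname/projects/README.md" }
}
```

Claude Code runs the named tool on your Mac and sends the result back to the model in a follow-up request.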

Validating your configuration

Work through the following checks to confirm that Claude Code is reaching LM Studio and not Anthropic’s cloud:
Step 1: Check the environment variable is set

echo $ANTHROPIC_BASE_URL
Expected output (remote GPU example):
http://100.82.56.40:1234
If this is empty, your shell profile change has not been reloaded. Run source ~/.zshrc and check again, or verify that config.json contains the correct value.
Step 2: Confirm LM Studio is reachable

curl -s http://<your-tailscale-ip>:1234/v1/models | python3 -m json.tool
You should receive a JSON object listing the currently loaded model. A connection refused error means LM Studio is not running or the bind address is not set to 0.0.0.0.
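To turn that curl output into a clearer diagnosis, here is an illustrative sketch that assumes the OpenAI-style /v1/models response shape, where data is an array of loaded models:

```shell
# Illustrative diagnosis helper: distinguish "server unreachable"
# from "reachable but no model loaded" from "ok".
classify_models_response() {
  case "$1" in
    *'"data": []'*|*'"data":[]'*) echo "no model loaded" ;;
    *) echo "ok" ;;
  esac
}

check_lm_studio() {
  body=$(curl -s --max-time 5 "$1/v1/models") || { echo "unreachable"; return 1; }
  classify_models_response "$body"
}
```

Call it as check_lm_studio http://100.82.56.40:1234, substituting your own server address.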
Step 3: Run a quick Claude Code session

claude "What model are you running on?"
The response should come from your local model rather than from Anthropic’s cloud. If you see an authentication error, double-check that ANTHROPIC_BASE_URL is set correctly and does not include a trailing slash.