Documentation

Set up Brogrammer

Install the plugin, connect your AI agent, and give it real repo context. Everything runs locally — nothing about your code leaves your machine.

Overview

Brogrammer is an IDE plugin that ships a local MCP server. Your AI agent — Claude Code, Cursor, Continue, or any MCP-compatible client — connects to it and gets access to tools for semantic search, symbol lookup, editor state, git context, file reads, and more. There is no cloud indexing layer: the plugin walks your workspace, builds an in-process index, and answers tool calls from RAM / local disk.
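
Conceptually, the in-process index is a map from symbol names to the files that define them. A minimal sketch of that idea (purely illustrative — the real plugin uses language-aware parsing and applies size and file-count limits, not this regex walk):

```python
import os
import re
from collections import defaultdict

# Matches common definition keywords across a few languages (illustrative only).
DEFN = re.compile(r"^\s*(?:def|class|fn|func|function)\s+([A-Za-z_]\w*)", re.M)

def build_symbol_index(root):
    """Walk a workspace and map each symbol name to the files defining it."""
    index = defaultdict(set)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue  # unreadable file: skip it, as a real indexer would
            for symbol in DEFN.findall(text):
                index[symbol].add(path)
    return index
```

An index like this lives entirely in process memory, which is why no database or cloud service is involved.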

The setup differs slightly between JetBrains and VS Code — JetBrains runs the agent directly inside the IDE, while VS Code exposes a TCP server that external agents (Claude Code, Cursor) talk to via a stdio bridge. Pick the IDE you use below.

Prerequisites

Core requirements

Component | Minimum version | Notes
Operating system | Linux, macOS 13+, or Windows 10/11 | All three are first-class.
JetBrains IDE | 2023.3+ | IntelliJ, PyCharm, WebStorm, GoLand, Rider, etc.
VS Code | 1.95+ | Natively installed — do not use a WSL-only VS Code with a Windows Claude Code client.
Node.js | 18+ | Windows only — used by the MCP stdio bridge that connects external agents (Claude Code, Cursor) to the VS Code extension. Not needed on Linux/macOS or for JetBrains users.
Git | Any modern version | Needed for the git-context tools (blame, diff, branch).

Optional components

Component | Version | Why you might want it
Ollama | Latest | Enables semantic search. Runs a local embedding model; no data leaves your machine.
Embedding model | qwen3-embedding:0.6b | ~1 GB. Pulled via ollama pull. Swap in any compatible embedding model in settings.
llama.cpp / llama-server | Dec 2025+ | Only needed if you want to run a local LLM instead of Anthropic cloud. Requires /v1/messages support.
socat | Any | Linux/macOS only — bridges stdio↔TCP for Claude Code. Not needed on Windows (we ship a Node bridge).

Do I need Postgres / pgvector?

No. By default the plugin keeps its symbol index and embeddings in memory and on local disk — no database required. pgvector is an optional persistent backend you can enable in Settings → Tools → Brogrammer → Vector store if you want embeddings to survive IDE restarts on very large repos. When enabled it talks to a Postgres instance you control on localhost; nothing about your code leaves your machine either way.

Install — JetBrains IDEs

The JetBrains plugin ships the MCP server and exposes all tools directly to any agent running inside the IDE (or via an external MCP client that connects to the plugin's endpoint).

Linux / macOS

  1. In your IDE: Settings → Plugins → Marketplace, search for Brogrammer, and click Install.
  2. Restart the IDE when prompted.
  3. Open the Brogrammer tool window (right sidebar) and click Sign in — a browser tab opens to confirm the device code.

Windows

Identical to Linux/macOS, with two notes:

  • If you install Ollama for semantic search, use the Windows installer — it registers a background service that starts at login.
  • If your IDE is running inside WSL, install the plugin in that WSL instance specifically. The plugin binds to 127.0.0.1 and will not be reachable from Windows-side tooling unless you enable mirrored networking.

Install — VS Code

The VS Code extension launches a local TCP MCP server (default port 3333). External agents like Claude Code and Cursor speak MCP over stdio, so a tiny bridge pipes stdio ↔ TCP. On Linux and macOS the bridge is socat; on Windows it's a small Node script the extension ships with its VSIX.
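
What the bridge does can be sketched in a few lines — pump bytes from stdin to the socket and from the socket back to stdout until either side closes. This illustrates the mechanism only; it is not the shipped bridge:

```python
import socket
import threading

def bridge(local_in, local_out, sock):
    """Pump bytes between a stdio-like pair and a connected TCP socket.

    local_in / local_out are binary file objects standing in for the agent's
    stdin/stdout; sock is connected to the MCP server (e.g. 127.0.0.1:3333).
    """
    def stdin_to_socket():
        while chunk := local_in.read1(4096):
            sock.sendall(chunk)
        sock.shutdown(socket.SHUT_WR)  # tell the server we are done writing

    t = threading.Thread(target=stdin_to_socket, daemon=True)
    t.start()
    while chunk := sock.recv(4096):    # copy server replies back to "stdout"
        local_out.write(chunk)
    local_out.flush()
    t.join()
```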

Linux / macOS

  1. Install socat: sudo apt install socat (Debian/Ubuntu) or brew install socat (macOS).
  2. Install the extension from the VS Code Marketplace: Extensions (Ctrl+Shift+X) → search Brogrammer → Install. The extension auto-updates.
  3. Open a workspace. The extension auto-starts the MCP server on 127.0.0.1:3333. The Brogrammer activity-bar icon shows the Status view.
  4. Verify the server is listening:
    ss -ltn | grep 3333             # Linux
    lsof -iTCP:3333 -sTCP:LISTEN    # macOS
  5. Register it with Claude Code (at user scope, so it's visible from every project):
    claude mcp add --scope user brogrammer \
      -- socat STDIO TCP:localhost:3333
    claude mcp list    # brogrammer: ✓ Connected

Windows

Run Claude Code natively on Windows

Do not call the Windows claude.cmd from inside WSL — its shim calls exec node and will fail with exec: node: not found. Install Claude Code in native PowerShell with npm install -g @anthropic-ai/claude-code.

  1. Install the extension from the VS Code Marketplace: Extensions → search Brogrammer → Install.
  2. Open a workspace. Confirm the server is listening:
    Get-NetTCPConnection -LocalPort 3333 -State Listen
    Test-NetConnection -ComputerName 127.0.0.1 -Port 3333
  3. Locate the bundled Node bridge (shipped inside the extension's install folder). In PowerShell:
    $ext    = Get-ChildItem "$env:USERPROFILE\.vscode\extensions\brogrammer.brogrammer-*" |
              Sort-Object Name | Select-Object -Last 1
    $bridge = Join-Path $ext.FullName "scripts\mcp-stdio-bridge.js"
    $bridge    # copy this absolute path for the next step
  4. Register with Claude Code, substituting the path from the previous step:
    claude mcp add --scope user brogrammer `
      -- node "<paste $bridge here>" 127.0.0.1 3333
    claude mcp list

Smoke-test

From a terminal:

(echo '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'; sleep 0.3) \
  | socat - TCP:localhost:3333   # Linux/macOS

'{"jsonrpc":"2.0","id":1,"method":"tools/list"}' |
  node "$bridge" 127.0.0.1 3333   # Windows PowerShell ($bridge from step 3 above)

A JSON response listing all Brogrammer tools confirms the server is live.
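
The same check works without socat or the bridge — any language that can open a TCP socket will do. A Python sketch, assuming the server frames each JSON-RPC message as a single newline-terminated line (verify the framing against your plugin version):

```python
import json
import socket

def list_tools(host="127.0.0.1", port=3333, timeout=3.0):
    """Send a JSON-RPC tools/list request over raw TCP and return the reply."""
    request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall((json.dumps(request) + "\n").encode())
        buf = b""
        while not buf.endswith(b"\n"):   # read one full line back
            chunk = s.recv(4096)
            if not chunk:
                break
            buf += chunk
    return json.loads(buf)
```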

Optional — Ollama for semantic search

Brogrammer's default codebase-retrieval tool uses a local TF-IDF index and works without any external dependencies. For semantic search (asking questions like “where do we debounce input events?”), install Ollama and pull an embedding model.
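
The difference from TF-IDF is the representation: semantic search embeds the query and every indexed chunk as vectors, then ranks chunks by similarity — typically cosine. A toy illustration of the ranking step (the 3-dim vectors here are made up; real embeddings from the model have hundreds of dimensions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical chunk embeddings keyed by a human-readable label.
chunks = {
    "debounce handler": [0.9, 0.1, 0.0],
    "database pool":    [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # embedding of "where do we debounce input events?"
best = max(chunks, key=lambda name: cosine(query, chunks[name]))
```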

1. Install Ollama

  • Linux: curl -fsSL https://ollama.com/install.sh | sh
  • macOS: download the app from ollama.com or brew install ollama.
  • Windows: run the Windows installer. It registers a background service.

2. Pull the embedding model

ollama pull qwen3-embedding:0.6b

That's roughly a 1 GB download. Any Ollama-compatible embedding model works — override the name in the plugin's settings.

3. Verify

# Linux/macOS
curl http://localhost:11434/api/tags

# Windows PowerShell
Invoke-RestMethod http://localhost:11434/api/tags

A JSON response containing qwen3-embedding means you're ready. Restart your IDE so the plugin picks up the Ollama connection.

Everything stays local

The embedding model runs entirely on your machine. Neither your source code nor your queries leave your device when semantic search is used.

Optional — running a local LLM

Claude Code defaults to Anthropic's cloud. If you'd rather run the model itself locally, you can point Claude Code at a llama-server instance serving the Anthropic Messages API. The Brogrammer side is unaffected — it only cares about the MCP client.

1. Run llama-server

./llama-server \
  -m ~/models/Qwen3.5-9B-Claude-Distilled.Q4_K_M.gguf \
  --alias claude-distilled \
  --host 0.0.0.0 --port 8080 \
  -ngl 99 -c 131072 \
  --jinja

--jinja is required for MCP tool-calling to format correctly. Any llama.cpp build from December 2025+ includes the /v1/messages route needed for Anthropic-shaped requests.
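
To probe the server before involving Claude Code, you can build the minimal Anthropic-Messages-shaped body by hand. A sketch — real requests from Claude Code carry additional fields (system prompt, tool definitions, streaming flags):

```python
import json

def messages_request(model, prompt, max_tokens=1024):
    """Build a minimal Anthropic-Messages-shaped body for POST /v1/messages."""
    return {
        "model": model,  # must match llama-server's --alias
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

body = json.dumps(messages_request("claude-distilled", "Say hi"))
```

POST that body to http://localhost:8080/v1/messages with any HTTP client to confirm the route answers before pointing Claude Code at it.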

2. Redirect Claude Code

# Linux/macOS
export ANTHROPIC_BASE_URL="http://localhost:8080"
export ANTHROPIC_AUTH_TOKEN="dummy"
export ANTHROPIC_API_KEY="dummy"

# Windows PowerShell (persistent)
[Environment]::SetEnvironmentVariable("ANTHROPIC_BASE_URL",   "http://localhost:8080", "User")
[Environment]::SetEnvironmentVariable("ANTHROPIC_AUTH_TOKEN", "dummy", "User")
[Environment]::SetEnvironmentVariable("ANTHROPIC_API_KEY",    "dummy", "User")

Then run claude --model claude-distilled. The --model name must match your --alias.

Expectation setting

Smaller models (≤ 9B) occasionally emit malformed MCP tool-call JSON. For reliable work on a real codebase, a 14B–32B coder-class model (Qwen2.5-Coder, DeepSeek-Coder-V2, GLM-4.7-Flash) is noticeably more consistent.

Connect an MCP client

Brogrammer speaks the Model Context Protocol, so any MCP-compatible agent can use its tools. The most common clients:

Claude Code (CLI)

Use the claude mcp add command shown in the install steps above. Quick verification:

claude mcp list
# brogrammer: ... ✓ Connected

claude
# then ask:
list the brogrammer MCP tools
use brogrammer's codebase-retrieval to find the auth flow

Cursor

Open Settings → MCP and add a server. For VS Code with Linux/macOS:

{
  "mcpServers": {
    "brogrammer": {
      "command": "socat",
      "args": ["STDIO", "TCP:localhost:3333"]
    }
  }
}

On Windows, swap socat for the Node bridge that ships with the extension. Resolve the absolute path first (see VS Code Windows install), then paste it here:

{
  "mcpServers": {
    "brogrammer": {
      "command": "node",
      "args": [
        "C:\\Users\\<you>\\.vscode\\extensions\\brogrammer.brogrammer-<version>\\scripts\\mcp-stdio-bridge.js",
        "127.0.0.1",
        "3333"
      ]
    }
  }
}

Continue, Cline, other MCP clients

Any client that accepts an stdio server command works the same way — point it at socat STDIO TCP:localhost:3333 (or the bundled Node bridge on Windows).

Sign in & device approval

The plugin is free to install; paid tiers require a Brogrammer account (see pricing). Sign-in uses single-use magic links — no passwords.

  1. In the plugin's Status view, click Sign in.
  2. Your browser opens to brogrammer.net/device showing a short device code. Enter your email to receive a sign-in link.
  3. Click the link in the email, confirm the code matches the one shown in your IDE, and approve.
  4. The plugin is now linked to your subscription. You can manage and revoke devices anytime from the billing page.

Memories & rules

Brogrammer persists two kinds of context across sessions:

Memories — project facts

Small durable facts (“we use pgvector 0.7 with 768-dim embeddings”, “never push to main without a PR”). Scoped per-project, returned by the get-memories MCP tool.

Three ways to add one:

  • Ask Claude “save a memory that we use pgvector”.
  • Command Palette Brogrammer: Add Memory….
  • Auto-hook — install the UserPromptSubmit hook (see scripts/brogrammer-remember-hook.py); any prompt starting with “remember …” is captured automatically.
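
However it is added, a memory is just a small durable fact scoped to the project. A sketch of what such a store could look like on disk (the file name and JSON layout here are hypothetical, not the plugin's actual format):

```python
import json
import os

class MemoryStore:
    """Per-project fact store backed by one JSON file (hypothetical layout)."""

    def __init__(self, project_root):
        self.path = os.path.join(project_root, ".brogrammer", "memories.json")

    def _load(self):
        try:
            with open(self.path) as f:
                return json.load(f)
        except FileNotFoundError:
            return []

    def add(self, fact):
        memories = self._load()
        if fact not in memories:  # keep facts deduplicated
            memories.append(fact)
        os.makedirs(os.path.dirname(self.path), exist_ok=True)
        with open(self.path, "w") as f:
            json.dump(memories, f, indent=2)

    def get(self):
        """What a get-memories tool call would hand back to the agent."""
        return self._load()
```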

Rules — style & convention guidance

Rules are prescriptive (how to write code) and can be global or project-scoped:

  • Global: ~/.brogrammer/global-rules/*.md
  • Project-local: .brogrammer/rules/*.md

Frontmatter controls when each rule activates:

---
type: always          # in context on every get-rules call
---
Use TypeScript strict mode everywhere.

---
type: auto            # active when the open file matches a glob
paths: ["**/test_*.py", "**/tests/**/*.py"]
---
Use pytest fixtures. Never setUp / tearDown.

---
type: manual          # toggled on from the Rules sidebar
---
When I ask for "strict review", flag cross-module coupling.
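
Activation is mechanical: always rules are always in context, auto rules match the open file against their globs, and manual rules wait for the sidebar toggle. A sketch of that logic, assuming the simple key: value frontmatter shown above (real frontmatter is YAML, and fnmatch only approximates ** glob semantics):

```python
import ast
import fnmatch

def parse_rule(text):
    """Split a rule file into its frontmatter fields and markdown body."""
    _, frontmatter, body = text.split("---", 2)
    fields = {}
    for line in frontmatter.strip().splitlines():
        key, _, value = line.partition(":")
        fields[key.strip()] = value.split("#", 1)[0].strip()  # drop comments
    return fields, body.strip()

def rule_active(fields, open_file):
    kind = fields.get("type", "manual")
    if kind == "always":
        return True
    if kind == "auto":
        patterns = ast.literal_eval(fields.get("paths", "[]"))
        return any(fnmatch.fnmatch(open_file, p) for p in patterns)
    return False  # manual rules are toggled from the Rules sidebar, not matched
```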

Troubleshooting

brogrammer: ✗ Failed to connect in claude mcp list

  • Is your IDE running with a workspace open? The extension only starts the server after a workspace loads.
  • Is port 3333 actually listening? ss -ltn | grep 3333 (Linux), lsof -iTCP:3333 -sTCP:LISTEN (macOS), or Get-NetTCPConnection -LocalPort 3333 -State Listen (Windows).
  • Check Output → Brogrammer in VS Code (or the Log viewer in your JetBrains IDE) for activation errors.

semantic-search returns “Ollama unavailable”

  • Ollama daemon not running — systemctl start ollama (Linux), brew services start ollama (macOS), or launch the Ollama app (Windows).
  • Embedding model missing — ollama pull qwen3-embedding:0.6b.
  • Wrong endpoint or model name in brogrammer.ollama.endpoint / brogrammer.ollama.embeddingModel settings.

Symbol index is empty

  • Files > 512 KB are skipped, and the index caps at 20,000 files — very large monorepos may exceed the cap.
  • Look for Initial index complete: N files, M symbol names in the plugin output log.

find-usages returns empty for a symbol you know exists

The VS Code language server for that file type isn't installed or isn't responding. Install the matching language extension (TypeScript, Python, Go, Rust Analyzer, Java, etc.).

Port 3333 already in use

Change brogrammer.mcp.port in settings, then re-run claude mcp add with the new port in the socat / bridge command.

Model replies but MCP tool calls never fire (local LLM)

Usually missing --jinja. Restart llama-server with the flag. If tool calls fire but return malformed JSON, try a larger or more tool-tuned model.

Support

Questions, feedback, or bug reports: support@brogrammer.net.

For billing and account management, head to the billing page. You can see all devices linked to your account there and revoke any of them instantly.