# Get Started
Install the extension, satisfy prerequisites, and run Corvus in under five minutes.
## Prerequisites
- VS Code 1.85 or newer
- Node.js and pnpm if you install from source
- Ollama (recommended) for local chat — ollama.ai
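A quick way to confirm the prerequisites from the list above (`node`, `pnpm`, and `ollama` are only needed for a source install or local chat, respectively):

```shell
code --version    # first line should be 1.85.0 or newer
node --version    # only needed when installing from source
pnpm --version    # only needed when installing from source
ollama --version  # only needed for local chat via Ollama
```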
## Install the extension
Choose one:
- VS Code Marketplace: search for “RavenKit” in the Extensions view and install (when published).
- VSIX: download the latest `.vsix` from GitHub Releases, then in VS Code: Extensions → ⋯ → Install from VSIX...
- From source: clone the repo, then from `apps/ravenkit` run `pnpm install`, `pnpm compile`, then `pnpm run vsix:package`. Install the generated `ravenkit.vsix` as above.
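The from-source route as a sketch, using the commands named above (the repo URL is omitted here; the `.vsix` output location is an assumption and may vary):

```shell
# after cloning the repo, build from the extension package
cd apps/ravenkit
pnpm install           # install dependencies
pnpm compile           # build the extension
pnpm run vsix:package  # produce ravenkit.vsix (install it via "Install from VSIX...")
```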
## Set up Ollama (recommended)
Corvus works best with a local model:
- Install Ollama from ollama.ai.
- Pull a model: `ollama pull qwen2.5-coder:7b` or `ollama pull llama3.2`.
- Start the server: `ollama serve` (often already running).
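The steps above as a sketch (pick either model; skip `ollama serve` if the desktop app already runs the server):

```shell
ollama pull qwen2.5-coder:7b  # code-focused model
# or: ollama pull llama3.2
ollama serve                  # start the local server if it isn't running yet
```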
You can also use Anthropic (Claude) or OpenAI by setting the provider and API key in Settings.
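A hypothetical `settings.json` fragment for a cloud provider. The setting keys below are assumptions for illustration, not confirmed names; check the extension's contributed settings for the real ones:

```json
{
  // hypothetical keys; verify against the extension's Settings UI
  "ravenkit.provider": "anthropic",
  "ravenkit.apiKey": "<your-api-key>"
}
```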
## First run
- Open the Command Palette (`Cmd+Shift+P` / `Ctrl+Shift+P`).
- Run `RavenKit: Open Chat Agent` (or press `Cmd+Shift+R` / `Ctrl+Shift+R`).
- Type a message; Corvus will use your workspace and tools to respond.
Optional: if you're in a monorepo, run `RavenKit: Select Project` so the agent and System Lens use the right scope.