Install Windsurf first
Install Windsurf, open a project, and use CorvusLLM only if your build exposes a provider override or an OpenAI-compatible base URL field.
| What to install | Official source | Quick check |
|---|---|---|
| Windsurf IDE | Windsurf setup docs | Open Windsurf after install. |
| Optional Git | Git install guide | git --version |
| CorvusLLM key | Buy or top up balance | Paste only if the custom provider field exists. |
Local checks
git --version
If your build has a provider override
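The local checks above can be scripted. A minimal sketch using only the Python standard library; it checks for Git, which the table above marks as optional, so a missing binary is reported rather than treated as fatal:

```python
import shutil
import subprocess
from typing import Optional

def tool_available(name: str) -> bool:
    """Return True if an executable with this name is on PATH."""
    return shutil.which(name) is not None

def tool_version(name: str) -> Optional[str]:
    """Return the tool's --version output, or None if it is missing."""
    if not tool_available(name):
        return None
    out = subprocess.run([name, "--version"], capture_output=True, text=True)
    return out.stdout.strip()

if __name__ == "__main__":
    # Git is optional per the table above, so "not installed" is acceptable.
    print("git:", tool_version("git") or "not installed")
```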
Open Windsurf settings and look for provider, model, or OpenAI-compatible override fields. Use this path only if your build lets you set the provider endpoint directly.
| Windsurf field | Where to look | Value to paste |
|---|---|---|
| Provider type | Settings > model/provider configuration | OpenAI-compatible or custom OpenAI |
| Base URL | Provider endpoint / base URL input | https://base.corvusllm.com/v1 |
| API key | The same custom provider credential input | Your delivered CorvusLLM key |
| Model slug | Model name, model ID, or custom model input | gpt-5.5 |
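Before pasting the fields above into Windsurf, you can smoke-test the endpoint outside the IDE. This is a minimal sketch using only the Python standard library; the /models listing route is an assumption based on the endpoint being OpenAI-compatible, and the base URL is the one from the table:

```python
import urllib.request

BASE_URL = "https://base.corvusllm.com/v1"  # from the table above

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for the OpenAI-compatible model listing route."""
    url = base_url.rstrip("/") + "/models"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

if __name__ == "__main__":
    req = build_models_request(BASE_URL, "YOUR_CORVUSLLM_KEY")
    # Send with urllib.request.urlopen(req) once you have a real key.
    print("GET", req.full_url)
```

If the request returns a JSON model list rather than an auth error, the base URL and key are good to paste into Windsurf.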
Values to paste
Custom provider values
Base URL: https://base.corvusllm.com/v1
API Key: YOUR_CORVUSLLM_KEY
Model: gpt-5.5
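The same values can be exercised from a script before trusting them inside Windsurf. A minimal sketch using only the Python standard library; the /chat/completions route is an assumption based on OpenAI compatibility, while the base URL and model slug are the values listed above:

```python
import json
import urllib.request

BASE_URL = "https://base.corvusllm.com/v1"  # value listed above
MODEL = "gpt-5.5"                           # model slug listed above

def build_chat_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build a POST to the OpenAI-compatible chat completions route."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        BASE_URL.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request("YOUR_CORVUSLLM_KEY", "Say hello.")
    # Send with urllib.request.urlopen(req) once you have a real key.
    print("POST", req.full_url)
```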
If your build has no provider override
Use a first-class setup such as OpenAI SDKs, Open WebUI, ChatBox, n8n, or Claude Code instead of forcing an unsupported path.