OpenAI SDKs

Use CorvusLLM anywhere with the OpenAI /v1 path.

This is the main integration path for curl, fetch, backend apps, scripts, the OpenAI Python SDK, and the OpenAI Node SDK.

Install SDK prerequisites

Use curl for the smallest test. For application code, install Python or Node.js first, then install the official OpenAI SDK and point it at CorvusLLM with the base URL shown below.

| What to install | Official source | Quick check |
| --- | --- | --- |
| Python | Python downloads | python --version |
| Node.js and npm | Node.js download | node --version and npm --version |
| OpenAI SDK package | OpenAI Python SDK or OpenAI Node SDK | Install only the language you use. |
Install commands
python --version
python -m pip install --upgrade openai

node --version
npm --version
npm install openai

Environment values

Set the same two variables before running local SDK examples, CI scripts, or backend apps. The SDK reads the key from OPENAI_API_KEY and the CorvusLLM endpoint from OPENAI_BASE_URL.

PowerShell
$env:OPENAI_BASE_URL="https://base.corvusllm.com/v1"
$env:OPENAI_API_KEY="YOUR_CORVUSLLM_KEY"
macOS / Linux
export OPENAI_BASE_URL="https://base.corvusllm.com/v1"
export OPENAI_API_KEY="YOUR_CORVUSLLM_KEY"
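Recent versions of both official SDKs read these two variables automatically, so a client constructed with no arguments will already point at CorvusLLM. A small preflight sketch that fails fast when a variable is missing; check_env is our own hypothetical helper, not part of any SDK:

```python
import os

def check_env():
    """Return the names of required variables that are not set.

    Hypothetical helper for scripts and CI jobs, so a missing variable
    is reported up front instead of surfacing as an auth error later.
    """
    required = ("OPENAI_BASE_URL", "OPENAI_API_KEY")
    return [name for name in required if not os.environ.get(name)]
```

Call check_env() at the top of a script and abort if it returns a non-empty list.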

curl

curl
curl https://base.corvusllm.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_CORVUSLLM_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-5.5","messages":[{"role":"user","content":"Say sdk-ok"}]}'
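The reply's text lives under choices[0].message.content. A trimmed sketch of that shape and how to pull the text out without any SDK; real responses also carry fields like id, usage, and finish_reason:

```python
import json

# Trimmed-down chat completion body; real replies include more fields
# (id, created, usage, finish_reason) that this example ignores.
raw = '{"choices": [{"message": {"role": "assistant", "content": "sdk-ok"}}]}'

data = json.loads(raw)
text = data["choices"][0]["message"]["content"]
print(text)
```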

OpenAI Python SDK

Python
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "https://base.corvusllm.com/v1"),
    api_key=os.environ["OPENAI_API_KEY"],
)

response = client.chat.completions.create(
    model="gpt-5.5",
    messages=[{"role": "user", "content": "Say sdk-ok"}],
)

print(response.choices[0].message.content)
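The SDK raises an exception when a request fails, and transient errors such as timeouts and rate limits are common against shared endpoints. A minimal retry sketch with exponential backoff; with_retries is our own helper, not part of the OpenAI SDK:

```python
import time

def with_retries(call, attempts=3, base_delay=1.0):
    """Run call() and retry on any exception with exponential backoff.

    Hypothetical helper: waits base_delay, then 2x, then 4x between
    attempts, and re-raises the last exception when attempts run out.
    """
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

Note the Python SDK also retries some failures on its own (the OpenAI client accepts a max_retries argument), so a wrapper like this is only needed when you want broader or custom failure handling.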

OpenAI Node SDK

Node.js
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.OPENAI_BASE_URL || "https://base.corvusllm.com/v1",
  apiKey: process.env.OPENAI_API_KEY,
});

const response = await client.chat.completions.create({
  model: "gpt-5.5",
  messages: [{ role: "user", content: "Say sdk-ok" }],
});

console.log(response.choices[0].message.content);

Responses API

CorvusLLM exposes /v1/responses. Keep the first implementation simple: call it in non-streaming mode before adding streaming.
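The request body mirrors the chat example above, except that a single input field replaces messages. A sketch of a non-streaming request payload, assuming the standard OpenAI Responses shape (which CorvusLLM's OpenAI-compatible path implies):

```python
import json

# Body for POST {OPENAI_BASE_URL}/responses. "input" replaces the
# "messages" array used by chat completions, and stream=False keeps
# the first implementation non-streaming.
payload = {
    "model": "gpt-5.5",
    "input": "Say responses-ok",
    "stream": False,
}
body = json.dumps(payload)
print(body)
```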

Live provider matrix

The matrix is rendered from the current public catalog. Desktop shows one model per column; mobile uses compact model rows so slugs, providers, and endpoints stay readable.
