Independent AI API proxy

LLM API cost calculator

Estimate CorvusLLM prepaid usage for supported Claude, GPT, and GLM model rows before topping up or moving a workflow.

Independent service. Not affiliated with OpenAI, Anthropic, Google, or Z.AI.

Prepaid balance: Estimate how far a top-up may go.
Card, wallet, or crypto checkout: Available methods appear before order creation.
No financially backed SLA: Best-effort infrastructure; verify status first.
Calculator

Estimate monthly API spend

Use rough monthly token totals, then compare the official reference cost with the current CorvusLLM prepaid rate.

Official reference: $0.00
CorvusLLM estimate: $0.00
Estimated savings: $0.00
Difference: 0%
Model row: Claude Opus 4.7 (Anthropic)
Pricing tracker
Input: Enter token totals to calculate the estimate.
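The savings and difference figures above follow directly from the two monthly totals. A minimal sketch of that arithmetic, with an illustrative helper name and example numbers that are not CorvusLLM pricing:

```python
def compare(official: float, corvus: float) -> tuple[float, float]:
    """Return (estimated savings, difference %) for two monthly cost estimates."""
    savings = official - corvus
    # Difference is expressed relative to the official reference cost.
    pct = (savings / official * 100) if official else 0.0
    return savings, pct

print(compare(100.0, 60.0))  # → (40.0, 40.0)
```

If the official reference is zero (no tokens entered yet), the difference is reported as 0% rather than dividing by zero.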

Estimates depend on actual tokenization, selected model, cache behavior, and current pricing data. For Anthropic cache writes, generic cache-write tokens use the 5-minute cache-write reference unless duration buckets are available.
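The per-category arithmetic behind such an estimate can be sketched as follows. The rates below are placeholders for illustration only, not CorvusLLM or provider pricing; the cache-write fallback mirrors the 5-minute bucket rule described above:

```python
# Placeholder per-million-token rates (USD); real figures come from the pricing tracker.
RATES = {
    "input": 15.00,
    "output": 75.00,
    "cache_read": 1.50,
    "cache_write_5m": 18.75,  # generic cache writes fall back to the 5-minute bucket
}

def estimate_cost(tokens: dict, rates: dict = RATES) -> float:
    """Sum each category's token count times its per-million rate."""
    return sum(tokens.get(k, 0) / 1_000_000 * rate for k, rate in rates.items())

monthly = {"input": 2_000_000, "output": 500_000, "cache_read": 1_000_000}
print(round(estimate_cost(monthly), 2))  # prints 69.0
```

Running the same token totals against two rate tables (official reference vs. prepaid) yields the two figures the calculator compares.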

What counts as input?

User text, system messages, tool schemas, selected files, retrieved context, and prior chat messages can all increase input tokens.

What counts as output?

Output tokens are what the model generates, including code, explanations, structured JSON, or streamed assistant text.

What about cache?

Cache reads and writes are usually visible in provider or proxy usage logs. Enter them separately when your workflow reuses large contexts.

Start with a small test balance.

Run one known workflow, compare the calculator against real usage logs, then scale only after the estimate matches your setup.
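That sanity check can be automated with a simple relative-tolerance comparison. The function name and 10% threshold here are illustrative assumptions, not part of the calculator:

```python
def within_tolerance(estimated: float, actual: float, rel_tol: float = 0.10) -> bool:
    """True if the calculator estimate is within rel_tol of the logged spend."""
    if actual == 0:
        return estimated == 0
    return abs(estimated - actual) / actual <= rel_tol

assert within_tolerance(9.40, 10.00)       # 6% off: estimate matches, safe to scale
assert not within_tolerance(14.00, 10.00)  # 40% off: re-check token totals and model row
```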