Install requirements
Install your language runtime, package manager, and the OpenAI SDK package before running a first request.
Connect OpenAI SDKs to supported Claude models through CorvusLLM, which provides a documented endpoint shape, a prepaid balance, current public model slugs, and clear service boundaries.
Independent service. Not affiliated with OpenAI, Anthropic, Google, or Z.AI.
CorvusLLM fits OpenAI SDK users who need an independent prepaid access layer for supported Claude models, with a clear base URL, current public model slugs, pricing proof, and setup documentation before they send real prompts.
Install the OpenAI SDK for your language, set the base URL to the CorvusLLM OpenAI-compatible endpoint, pass your CorvusLLM key, and use a public catalog slug.
Claude models are usually a fit for coding agents, reasoning-heavy writing, analysis, refactors, and assistant workflows that benefit from Claude-family behavior.
Claude-native clients can use Anthropic-style request shapes, while OpenAI-compatible clients should use the OpenAI-style CorvusLLM endpoint and public model slugs.
```
base_url = "https://base.corvusllm.com/v1"
api_key = "YOUR_CORVUSLLM_KEY"
model = "claude-haiku-4-5"
```
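As a minimal sketch of what these three values do in practice, the snippet below builds the OpenAI-compatible chat-completions request that an SDK would send to this endpoint. It uses only the Python standard library; the API key is a placeholder read from an assumed `CORVUSLLM_API_KEY` environment variable, and the payload fields follow the standard OpenAI chat format.

```python
import json
import os
import urllib.request

BASE_URL = "https://base.corvusllm.com/v1"
API_KEY = os.environ.get("CORVUSLLM_API_KEY", "YOUR_CORVUSLLM_KEY")
MODEL = "claude-haiku-4-5"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build the OpenAI-compatible POST /chat/completions request."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending is kept separate so the request shape can be inspected
# before any tokens are billed:
# with urllib.request.urlopen(build_chat_request("Say OK")) as resp:
#     print(json.load(resp))
```

With the official OpenAI SDK, the same three values map to `OpenAI(base_url=..., api_key=...)` and the `model` argument of the chat completion call.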
| Field | Use this | Why it matters |
|---|---|---|
| Where to edit | Your application environment variables or the SDK client constructor in backend code. | Use the tool-owned settings area instead of hardcoding keys in prompts or visible node text. |
| Base URL | https://base.corvusllm.com/v1 | Use the OpenAI-compatible route for SDKs, Open WebUI, Cursor-style custom providers, ChatBox, n8n, Windsurf, and similar clients. |
| API key | Your CorvusLLM key | Keep it in the tool secret store, credentials area, or server environment variables. |
| Model slug | claude-haiku-4-5 or claude-opus-4-5 | Use public Claude catalog rows and verify availability before larger runs. |
| First test | One small non-sensitive prompt | Confirm endpoint, slug, latency, response format, and billed usage before production or repository-wide context. |
This table separates a real custom-endpoint setup from cases where the tool version, secret handling, or model field will make the connection fail.
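The "first test" row above can be scripted: after one small prompt, inspect the usage block that OpenAI-compatible responses return before sending anything larger. The helper below assumes the standard OpenAI response shape (`choices`, and a `usage` object with `prompt_tokens` and `completion_tokens`); it is a sketch, not a guarantee of CorvusLLM's exact field set.

```python
def summarize_first_test(response: dict) -> dict:
    """Extract the fields worth checking after a small first request:
    the reply text plus billed token counts from the usage block."""
    choice = response["choices"][0]
    usage = response.get("usage", {})
    return {
        "reply": choice["message"]["content"],
        "finish_reason": choice.get("finish_reason"),
        "prompt_tokens": usage.get("prompt_tokens", 0),
        "completion_tokens": usage.get("completion_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }

# Mocked OpenAI-style response body, for illustration only:
mock = {
    "choices": [{"message": {"role": "assistant", "content": "OK"},
                 "finish_reason": "stop"}],
    "usage": {"prompt_tokens": 12, "completion_tokens": 2,
              "total_tokens": 14},
}
print(summarize_first_test(mock))
```

If the reply text, finish reason, and token counts all look sane on a non-sensitive prompt, the endpoint, slug, and billing path are wired correctly.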
| Check | Good state | Avoid this |
|---|---|---|
| Required control | The OpenAI SDK exposes a custom base URL, custom provider, or compatible API host field. | If the SDK hides endpoint controls, use the docs fallback instead of guessing hidden settings. |
| Endpoint shape | OpenAI-compatible custom endpoint support is available and can be set to https://base.corvusllm.com/v1. | Do not mix OpenAI-compatible setup values with another request format. |
| Secret handling | The CorvusLLM key is stored in SDK settings, a credentials area, or server environment variables. | Do not paste API keys into public repositories, prompts, screenshots, or client-side production code. |
| Model slug | The selected value is a public Claude slug from the model catalog. | Do not use private upstream route names, provider-account names, or guessed model aliases. |
| First request | A small non-sensitive prompt succeeds before larger project context, automation, or team rollout. | Do not start with private repositories, regulated data, or high-volume automation before a small test. |
These examples are only starting points. The model catalog and `data/models.json` are the source of truth for availability, pricing, cache fields, and public slugs.
These checks turn common setup errors into crawlable answers and send private account cases to the right support path only after public checks are exhausted.
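One of these checks, confirming that a model slug is a current public catalog row, is easy to automate before larger runs. The sketch below assumes the catalog is available as a list of objects with a `slug` field; the actual schema should be confirmed against the published `data/models.json`.

```python
def validate_slug(slug: str, catalog: list) -> bool:
    """Return True only if the slug matches a public catalog row.
    Guessed aliases and private route names will fail this check."""
    return any(row.get("slug") == slug for row in catalog)

# Sample rows for illustration only; the real source of truth is
# the published model catalog / data/models.json.
catalog = [
    {"slug": "claude-haiku-4-5"},
    {"slug": "claude-opus-4-5"},
]

assert validate_slug("claude-haiku-4-5", catalog)
assert not validate_slug("claude-haiku-latest", catalog)  # guessed alias
```

Running this check in CI or at application startup keeps "model not found" errors out of production.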
| Symptom | First check | Public source |
|---|---|---|
| Custom base URL field is missing | Confirm that your OpenAI SDK version exposes custom provider controls. | OpenAI SDKs docs |
| Model not found | Use a public Claude slug from the current catalog and avoid hidden aliases. | Claude catalog |
| Authentication or balance error | Confirm the delivered CorvusLLM key, account balance, and that the key was not pasted with extra spaces. | Troubleshooting |
| Unexpected cost | Estimate input, output, cache read, and cache write before sending large context or loops. | Cost calculator |
| Timeout or provider unavailable | Retry a tiny non-streaming prompt, then check customer-facing status before changing credentials. | Service Status |
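The "unexpected cost" row can be made concrete with a small estimator over the four token categories the table names. The per-million-token rates below are placeholders, not CorvusLLM pricing; substitute the numbers from the cost calculator and model catalog before relying on the result.

```python
def estimate_cost_usd(tokens: dict, rates_per_mtok: dict) -> float:
    """Estimate spend across input, output, cache read, and cache
    write tokens. Rates are USD per million tokens per category."""
    return sum(
        tokens.get(kind, 0) / 1_000_000 * rates_per_mtok[kind]
        for kind in ("input", "output", "cache_read", "cache_write")
    )

# Placeholder rates -- NOT real pricing; read them from the catalog.
rates = {"input": 1.00, "output": 5.00,
         "cache_read": 0.10, "cache_write": 1.25}
usage = {"input": 200_000, "output": 20_000,
         "cache_read": 500_000, "cache_write": 50_000}
print(f"${estimate_cost_usd(usage, rates):.4f}")
```

Running this over a planned loop's token budget before the loop starts is cheaper than discovering the spend on the balance page afterwards.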
This is a commercial setup page, not a private account console. Use the supporting pages for exact pricing, operational status, and trust details.
Do not send sensitive or regulated data through shared API proxies. CorvusLLM forwards prompts to upstream model providers for processing and keeps request metadata for billing, abuse prevention, and support diagnostics.
These answers keep setup pages useful for search, AI answers, and first-time buyers without replacing the exact docs.
OpenAI SDK clients should use https://base.corvusllm.com/v1 for this Claude setup path. Check the dedicated docs page if the tool changes its custom-provider settings.
No. This page is for using a CorvusLLM prepaid key as an independent access layer. CorvusLLM is not affiliated with Anthropic, and exact supported rows should be verified in the public catalog.
Do not send sensitive or regulated data through shared API proxies without a separate risk review. Start with non-sensitive tests and move larger workloads only after checking trust, status, and billing behavior.
Provider+tool pages are linked together so developers and crawlers can move from a broad query to the exact setup route.
Confirm base URL, key, public slug, latency, output quality, and billed usage before larger prompts, repository context, or automated workflows.
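The confirmations above that need no network call can be collected into one preflight function; latency, output quality, and billed usage still require a real small request. This is a sketch under stated assumptions: the slug set is illustrative and should be refreshed from the public catalog.

```python
def preflight(base_url: str, api_key: str, model: str,
              public_slugs: set) -> list:
    """Return the problems found before the first real request.
    An empty list means the static checks passed."""
    problems = []
    if base_url.rstrip("/") != "https://base.corvusllm.com/v1":
        problems.append("base URL is not the OpenAI-compatible "
                        "CorvusLLM endpoint")
    if not api_key or api_key != api_key.strip():
        problems.append("API key is empty or has surrounding "
                        "whitespace (a common paste error)")
    if model not in public_slugs:
        problems.append(f"model slug {model!r} is not in the "
                        "public catalog")
    return problems

slugs = {"claude-haiku-4-5", "claude-opus-4-5"}  # refresh from catalog
print(preflight("https://base.corvusllm.com/v1",
                "sk-example", "claude-haiku-4-5", slugs))
```

The whitespace check mirrors the troubleshooting row above: a key pasted with extra spaces fails authentication even though it looks correct on screen.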
Compare the setup path, model catalog, pricing proof, and trust pages before you choose an endpoint.