Everything you need to integrate LLM Consensus
Try every endpoint live in Swagger UI. Authenticate, send requests, and inspect responses in your browser.
Open Swagger UI →

Beautiful, three-panel ReDoc reference with schemas, examples, and request/response details.

Open ReDoc →

Download the raw OpenAPI 3.1 JSON schema. Import it into Postman, Insomnia, or your code generator.

Download JSON →

Pass your key in the header. Manage keys from your account dashboard.
X-API-Key: orch_your_key
Get your API key →
HTTP 402 micropayments. No account needed — pay with stablecoins per request.
X-Payment: base64(...)
Learn about x402 →
Unauthenticated requests return a 402 with payment instructions and pricing info.
HTTP 402 Payment Required
Auto-discovery for agents
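A minimal sketch of how an agent might consume that 402 response and discover payment options. The JSON field names used here (`price_usd`, `accepts`, `scheme`) are illustrative assumptions, not the documented schema:

```python
import json

# Hypothetical shape of the 402 payment-instructions body; the exact
# field names are assumptions, not taken from the actual API.
SAMPLE_402_BODY = json.dumps({
    "error": "payment_required",
    "price_usd": "0.005",
    "accepts": [{"scheme": "x402", "network": "base", "asset": "USDC"}],
})

def extract_payment_options(status_code: int, body: str):
    """Return the accepted payment schemes from a 402 response, else None."""
    if status_code != 402:
        return None  # the request was authorized some other way
    payload = json.loads(body)
    return payload.get("accepts", [])

options = extract_payment_options(402, SAMPLE_402_BODY)
print(options[0]["scheme"])
```

An agent that understands one of the advertised schemes can then retry the request with the appropriate `X-Payment` header.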
```python
import requests

response = requests.post(
    "https://llmconsensus.io/v1/orchestrate",
    headers={"X-API-Key": "orch_your_key"},
    json={
        "prompt": "Explain quantum computing in one paragraph",
        "mode": "balanced",
    },
)
data = response.json()

print(data["answer"])                     # Consensus answer
print(data["metadata"]["quality_score"])  # Quality score
print(data["metadata"]["token_usage"])    # Tokens consumed
```
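Network calls like the one above can fail transiently. A simple retry wrapper with exponential backoff (a sketch, not part of any official client) looks like:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage: wrap the orchestrate call from the quickstart.
# data = with_retries(lambda: requests.post(...).json())
```

In production you would likely retry only on specific status codes (e.g. 429 or 5xx) rather than on every exception.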
| Method | Path | Auth | Description |
|---|---|---|---|
| POST | /v1/orchestrate | Yes | Multi-model orchestration with consensus |
| POST | /v1/batch | Yes | Submit a batch of prompts |
| GET | /v1/batch/{batch_id} | Yes | Check batch processing status |
| POST | /v1/auth/register | No | Register a new account |
| POST | /v1/auth/login | No | Log in to your account |
| POST | /v1/auth/api-keys | Yes | Generate a new API key |
| GET | /v1/auth/me | Yes | Get current user info |
| GET | /v1/billing/packs | No | List available credit packs |
| POST | /v1/billing/checkout | Yes | Create a Stripe checkout session |
| POST | /v1/billing/crypto-checkout | Yes | Create a USDC payment invoice |
| GET | /v1/billing/history | Yes | Credit transaction history |
| GET | /v1/billing/usage | Yes | API usage history with credits spent |
| GET | /v1/billing/discount-tier | Yes | Get your volume discount tier |
| PUT | /v1/webhooks/config | Yes | Configure webhook URL |
| GET | /v1/webhooks/config | Yes | Get webhook configuration |
| POST | /v1/webhooks/test | Yes | Send a test webhook |
| DELETE | /v1/webhooks/config | Yes | Remove webhook configuration |
| GET | /v1/health | No | System health check |
| GET | /v1/usage | Yes | Usage statistics |
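The batch endpoints above follow a submit-then-poll pattern. A minimal sketch, assuming the status response carries a `status` field with values like `"completed"` and `"failed"` (these names are assumptions, not the documented schema):

```python
import time

TERMINAL_STATUSES = {"completed", "failed"}  # assumed status values

def is_terminal(status_payload: dict) -> bool:
    """True once GET /v1/batch/{batch_id} reports a finished batch."""
    return status_payload.get("status") in TERMINAL_STATUSES

def wait_for_batch(get_status, poll_seconds=5):
    """Poll get_status() (a GET /v1/batch/{batch_id} call) until done."""
    while True:
        payload = get_status()
        if is_terminal(payload):
            return payload
        time.sleep(poll_seconds)

# Hypothetical usage with the requests library:
# batch_id = requests.post("https://llmconsensus.io/v1/batch",
#                          headers={"X-API-Key": "orch_your_key"},
#                          json={"prompts": ["a", "b"]}).json()["batch_id"]
# result = wait_for_batch(lambda: requests.get(
#     f"https://llmconsensus.io/v1/batch/{batch_id}",
#     headers={"X-API-Key": "orch_your_key"}).json())
```

For long-running batches, the webhook endpoints in the table are a better fit than polling: configure a URL once and receive a callback when the batch completes.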
Official Python client with async support, automatic retries, and type hints.
pip install llmconsensus
Available Now
Machine-readable context file for LLM agents. Describes capabilities, endpoints, and usage.
View llms.txt →

OpenAI-compatible plugin manifest for ChatGPT and other AI agent platforms.
View ai-plugin.json →