Getting Started
Welcome to the OutLLM API documentation. OutLLM provides a powerful platform for managing and executing AI prompts at scale. This guide will help you integrate our API into your applications.
Rate Limits
Our API has the following rate limits:
- 100 requests per minute per API token
- 10,000 requests per day per API token
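To stay within these limits, a client can back off and retry when it receives a 429 response. The sketch below illustrates one way to do this; the retry policy (exponential backoff, capped at 60 seconds) is our own suggestion, not part of the OutLLM API, and the limit constants simply restate the numbers above.

```python
import time

# Documented limits, per API token (from this guide).
REQUESTS_PER_MINUTE = 100
REQUESTS_PER_DAY = 10_000

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff delay in seconds: base * 2^attempt, capped."""
    return min(cap, base * (2 ** attempt))

def call_with_retry(send, max_attempts: int = 5):
    """Retry send() while it signals HTTP 429 (RATE_LIMIT_EXCEEDED).

    `send` is any zero-argument callable returning (status, body).
    """
    for attempt in range(max_attempts):
        status, body = send()
        if status != 429:
            return status, body
        time.sleep(backoff_delay(attempt))
    return status, body
```

Any HTTP client can be wrapped this way; only the `(status, body)` convention above is assumed.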
Errors
When an error occurs, the response includes an error object with a message describing what went wrong:
{
  "error": {
    "code": "NOT_BILLABLE",
    "message": "Insufficient credit balance"
  }
}
Common Error Codes
403 NOT_BILLABLE
- Your account doesn't have enough credit to make the request
401 INVALID_TOKEN
- The provided API token is invalid or expired
429 RATE_LIMIT_EXCEEDED
- You've exceeded the rate limit for your API token
400 INVALID_INPUT
- The request body contains invalid data
Error Response Fields
code
- A machine-readable error code
message
- A human-readable error message
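Client code can extract these two fields and decide whether a failed request is worth retrying. A minimal sketch, assuming the error response shape shown above (the helper names and the choice of which codes are retryable are ours, not part of the API):

```python
import json

def parse_error(response_body: str) -> tuple[str, str]:
    """Return (code, message) from an OutLLM error response body."""
    err = json.loads(response_body)["error"]
    return err["code"], err["message"]

# Our own judgment call: only rate-limit errors are transient.
RETRYABLE = {"RATE_LIMIT_EXCEEDED"}

def is_retryable(code: str) -> bool:
    return code in RETRYABLE
```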
Bring Your Own Tokens
OutLLM allows you to use your own API tokens for supported LLM providers. We never store your tokens - they are only used for the specific request they are provided with.
Compound API Key
To use your own tokens, you need to create a compound API key that combines your OutLLM API key with your provider's API key:
outllm_api_key:provider_api_key
For example, if your OutLLM API key is outllm-123 and your OpenAI API key is sk-456, your compound key would be:
outllm-123:sk-456
This compound key should be used in the Authorization header:
Authorization: Bearer outllm_api_key:provider_api_key
Important Notes
- Tokens are only used for the specific request they are provided with.
- We never store or cache full tokens.
- If no provider token is provided, we'll use your credit balance.
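The compound-key format above is simple enough to build inline, but a small helper keeps the colon-joining and header construction in one place. A sketch (function names are our own):

```python
def compound_key(outllm_key: str, provider_key: str) -> str:
    """Join an OutLLM API key and a provider key into a compound key."""
    return f"{outllm_key}:{provider_key}"

def auth_header(outllm_key: str, provider_key: str) -> dict[str, str]:
    """Build the Authorization header described in this guide."""
    return {"Authorization": f"Bearer {compound_key(outllm_key, provider_key)}"}
```

For example, `auth_header("outllm-123", "sk-456")` yields the header `Authorization: Bearer outllm-123:sk-456`.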
Proxy API
OutLLM provides a proxy API that allows you to use our platform as a drop-in replacement for your existing LLM provider's API. This means you can use the same code and libraries you're already using, just by changing the base URL and using our compound API key.
Base URL
Replace your provider's base URL with our proxy URL:
- OpenAI: https://openai.outllm.com
- Anthropic: https://anthropic.outllm.com
- Google: https://google.outllm.com
Example Usage
Here's how to use our proxy API with different programming languages:
JavaScript
import OpenAI from 'openai';
const openai = new OpenAI({
baseURL: 'https://openai.outllm.com/v1',
apiKey: 'outllm_api_key:provider_api_key'
});
const response = await openai.chat.completions.create({
model: 'gpt-4o-mini',
messages: [{ role: 'user', content: 'Hello!' }]
});
Python
from openai import OpenAI
client = OpenAI(
base_url="https://openai.outllm.com/v1",
api_key="outllm_api_key:provider_api_key"
)
response = client.chat.completions.create(
model="gpt-4o-mini",
messages=[{"role": "user", "content": "Hello!"}]
)
PHP
// Using the openai-php/client factory, which supports overriding the base URI.
$client = OpenAI::factory()
    ->withApiKey('outllm_api_key:provider_api_key')
    ->withBaseUri('https://openai.outllm.com/v1')
    ->make();
$response = $client->chat()->create([
'model' => 'gpt-4o-mini',
'messages' => [['role' => 'user', 'content' => 'Hello!']]
]);
cURL
curl https://openai.outllm.com/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer outllm_api_key:provider_api_key" \
-d '{
"model": "gpt-4o-mini",
"messages": [{"role": "user", "content": "Hello!"}]
}'