
1. Get an API key

Go to your Developers dashboard, click Create API Key, give it a name, and copy the key (it starts with nj_sk_ and is shown only once).

No credit card is required. Your first 50 requests/month are free.
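Once you have the key, keep it in an environment variable rather than pasting it into code. A minimal sketch (the NINJACHAT_API_KEY name is our own convention, not something the API mandates):

```python
import os

# Keep the key out of source control. Export it first, e.g.:
#   export NINJACHAT_API_KEY=nj_sk_...
api_key = os.environ.get("NINJACHAT_API_KEY", "nj_sk_YOUR_API_KEY")

# Headers reused by every request in this guide.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```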

2. Make a request

curl -X POST https://ninjachat.ai/api/v1/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer nj_sk_YOUR_API_KEY" \
  -d '{
    "model": "auto",
    "messages": [{"role": "user", "content": "What is the capital of France?"}]
  }'
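The same call in Python, for comparison. This is a sketch using the requests library; the network call is left commented out so you can drop in a real key first:

```python
# pip install requests, then uncomment the lines at the bottom.

url = "https://ninjachat.ai/api/v1/chat"
headers = {
    "Authorization": "Bearer nj_sk_YOUR_API_KEY",
    "Content-Type": "application/json",
}
payload = {
    "model": "auto",
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
}

# import requests
# r = requests.post(url, headers=headers, json=payload)
# print(r.json()["choices"][0]["message"]["content"])
```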

3. See the response

{
  "id": "chatcmpl-1749584400000",
  "object": "chat.completion",
  "model": "gemini-3-flash",
  "choices": [{
    "message": { "role": "assistant", "content": "The capital of France is Paris." },
    "finish_reason": "stop"
  }],
  "usage": { "prompt_tokens": 14, "completion_tokens": 8, "total_tokens": 22 },
  "cost": { "this_request": "$0.003" },
  "balance": "$49.997",
  "metadata": { "latency_ms": 312 }
}
model: "auto" routed a quick factual question to gemini-3-flash (fastest, cheapest for short queries). Add "include_routing": true to see the routing decision.
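In Python, the fields above are pulled out with plain dict indexing. A sketch that reuses the sample response verbatim:

```python
# The sample response from above, as a Python dict (r.json() returns this shape).
response = {
    "id": "chatcmpl-1749584400000",
    "object": "chat.completion",
    "model": "gemini-3-flash",
    "choices": [{
        "message": {"role": "assistant", "content": "The capital of France is Paris."},
        "finish_reason": "stop",
    }],
    "usage": {"prompt_tokens": 14, "completion_tokens": 8, "total_tokens": 22},
    "cost": {"this_request": "$0.003"},
    "balance": "$49.997",
    "metadata": {"latency_ms": 312},
}

answer = response["choices"][0]["message"]["content"]
tokens = response["usage"]["total_tokens"]
cost = response["cost"]["this_request"]

print(f"{answer} ({tokens} tokens, {cost})")
# → The capital of France is Paris. (22 tokens, $0.003)
```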

Switch models

Same endpoint, just change model:
{"model": "gpt-5"}          // General purpose — $0.006/req
{"model": "claude-sonnet-4.6"}  // Best for code — $0.015/req
{"model": "deepseek-v3"}    // Fast & cheap — $0.003/req
{"model": "auto"}           // NinjaChat picks — billed at resolved rate
{"model": "ensemble"}       // 3 models vote — $0.040/req
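To estimate spend before running a batch, you can hard-code the per-request prices from the list above. A rough sketch: auto is omitted because it bills at the resolved model's rate, and current prices should come from the Pricing page, not this snippet:

```python
# Per-request prices (USD) copied from the model list above.
PRICE_PER_REQUEST = {
    "gpt-5": 0.006,
    "claude-sonnet-4.6": 0.015,
    "deepseek-v3": 0.003,
    "ensemble": 0.040,
}

def estimated_cost(model: str, n_requests: int) -> float:
    """Rough spend estimate for a batch of requests against one model."""
    return PRICE_PER_REQUEST[model] * n_requests

print(f"${estimated_cost('deepseek-v3', 1000):.2f}")  # → $3.00
```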

Try the power features

These are unique to NinjaChat — no other API offers them.
import requests

# auto routes to the best model based on what you ask
r = requests.post("https://ninjachat.ai/api/v1/chat",
    headers={"Authorization": "Bearer nj_sk_YOUR_API_KEY"},
    json={
        "model": "auto",
        "messages": [{"role": "user", "content": "Solve: ∫x²dx"}],
        "include_routing": True,
    }
)
data = r.json()
print(data["choices"][0]["message"]["content"])
print("Routed to:", data["routing"]["resolved"])  # o3-mini (math specialist)

Try every endpoint type

import requests

r = requests.post("https://ninjachat.ai/api/v1/images",
    headers={"Authorization": "Bearer nj_sk_YOUR_API_KEY"},
    json={"model": "flux-2-klein", "prompt": "A cute robot drinking coffee, cartoon style"}
)
print(r.json()["images"][0]["url"])

Next steps

I want to…                              Go here
Auto-select the best model per task     Smart Routing →
Compare models side-by-side             Model Compare →
Run 20 prompts at once                  Batch →
Build a multi-turn chatbot              Sessions →
Know cost before running                Estimate →
See all available models                Models →
Understand pricing                      Pricing →
Handle errors properly                  Error Handling →