You can use LLMs through Auto in two ways:
  • Use Auto’s keys: We pay the LLM provider, and then bill you for the usage.
  • Use your own API keys: We use your provider API keys, and you’re billed directly for usage.
We currently recommend bringing your own keys for production traffic, as you may need higher provider API rate limits than Auto can provide. Your keys are only ever used for your own traffic, never for the experiments and evals that Auto runs. To bring your own keys, include one or more keys in your API request, for example:
Fetch API (.ts)
const response = await fetch("https://api.auto.venki.dev/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.AUTO_API_KEY}`,
    "Content-Type": "application/json",
    // This key will be used for this request
    "X-OpenAI-Api-Key": process.env.OPENAI_API_KEY, 
    "X-Auto-Prompt-Id": "summarize-webpage"
  },
  body: JSON.stringify({
    model: "openai/gpt-4o",
    messages: ...,
    temperature: 0
  })
});

Multiple API keys

You can include multiple API keys; only the relevant keys will be used for a given request.
Fetch API (.ts)
const response = await fetch("https://api.auto.venki.dev/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.AUTO_API_KEY}`,
    "Content-Type": "application/json",
    // This key will be used for this request.
    "X-OpenAI-Api-Key": process.env.OPENAI_API_KEY, 
    // This key will not be used.
    "X-Anthropic-Api-Key": process.env.ANTHROPIC_API_KEY, 
    "X-Auto-Prompt-Id": "summarize-webpage"
  },
  body: JSON.stringify({
    model: "openai/gpt-4o",
    messages: ...,
    temperature: 0
  })
});

Supported API keys

  • OpenAI: X-OpenAI-Api-Key
  • Anthropic: X-Anthropic-Api-Key
  • Google Gemini: X-Gemini-Api-Key
  • Google Vertex AI: See below.
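
Since the model ID carries the provider prefix (e.g. "openai/gpt-4o"), you can derive the matching header programmatically. A minimal sketch; the header names come from the list above, but the prefix-to-header mapping and the `byokHeader` helper are illustrative assumptions, not part of Auto's API:

```typescript
// Map a provider prefix to its BYOK header name (names from the list above).
const BYOK_HEADERS: Record<string, string> = {
  openai: "X-OpenAI-Api-Key",
  anthropic: "X-Anthropic-Api-Key",
  gemini: "X-Gemini-Api-Key",
};

// Return the header object for a model string like "openai/gpt-4o",
// or an empty object when the provider has no simple-key header.
function byokHeader(model: string, key: string): Record<string, string> {
  const provider = model.split("/")[0];
  const header = BYOK_HEADERS[provider];
  return header ? { [header]: key } : {};
}
```

You could then spread the result into the request headers, e.g. `...byokHeader("openai/gpt-4o", process.env.OPENAI_API_KEY!)`.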

BYOK for Google Vertex AI

Google Vertex AI requires service account credentials rather than a simple API key. To pass them in, URI-encode the JSON service account key and send it as X-Vertex-Credentials. Pass the project and location as X-Vertex-Project and X-Vertex-Location respectively.
Fetch API (.ts)
const response = await fetch("https://api.auto.venki.dev/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.AUTO_API_KEY}`,
    "Content-Type": "application/json",
    // "credentials" is your service account JSON object with top-level fields such as "client_id" and "private_key"
    "X-Vertex-Credentials": encodeURIComponent(JSON.stringify(credentials)),
    "X-Vertex-Project": process.env.VERTEX_AI_PROJECT,
    "X-Vertex-Location": process.env.VERTEX_AI_LOCATION,
    "X-Auto-Prompt-Id": "summarize-webpage"
  },
  body: JSON.stringify({
    model: "vertex_ai/gemini-2.0-flash",
    messages: ...,
    temperature: 0
  })
});