The basic setup described in Quickstart is the simplest way to get started and is recommended for most users. It supports:
  • Model routing, fallbacks, and retries
  • Automatic model leaderboard
  • Finetuning & distillation
However, it does not support prompt optimization.
To perform prompt optimization, Auto must understand:
  • Which parts of a request may be optimized (e.g., instructions to the LLM)
  • Which parts of a request may not be optimized (e.g., inputs, previous chat messages)
To convey this, you provide metadata that tells Auto how your request is structured.

Marking optimizable segments

Use markOptimizable and markInput to denote the parts of your request that may and may not be optimized, respectively. The Auto server removes these tags before submitting the request to the LLM. Both helpers are simple: they merely surround an input with an XML marker.
const markOptimizable = (input: string) => `<_auto_optimizable>${input}</_auto_optimizable>`;
const markInput = (input: string) => `<_auto_input>${input}</_auto_input>`;
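For instance, wrapping an instruction that contains a fixed input produces a nested tagged string (the document text here is a made-up placeholder):

```typescript
const markOptimizable = (input: string) =>
  `<_auto_optimizable>${input}</_auto_optimizable>`;
const markInput = (input: string) =>
  `<_auto_input>${input}</_auto_input>`;

// Markers can be nested: an optimizable instruction may contain
// fixed inputs that must not be changed.
const prompt = markOptimizable(`Summarize: ${markInput("some user document")}`);
// prompt ===
// "<_auto_optimizable>Summarize: <_auto_input>some user document</_auto_input></_auto_optimizable>"
```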
Here’s a full example:
OpenAI SDK (.ts)
const markOptimizable = (input: string) =>
  `<_auto_optimizable>${input}</_auto_optimizable>`;
const markInput = (input: string) =>
  `<_auto_input>${input}</_auto_input>`;

const messages = [
  {
    role: "system",
    // We mark this optimizable, because it's an instruction to the LLM
    content: markOptimizable("You are a helpful assistant that answers questions."),
  },
  {
    role: "user",
    // We do not mark this optimizable, because it's a past chat message
    content: "How do I make a peanut butter and jelly sandwich?",
  },
  {
    role: "assistant",
    // We do not mark this optimizable, because it's a past chat message
    content: "To make a peanut butter and jelly sandwich, you need to follow these steps: 1. Spread peanut butter on one slice of bread. 2. Spread jelly on the other slice of bread. 3. Put the two slices of bread together.",
  },
  {
    role: "user",
    // This message has optimizable parts that should be changed,
    // but it also has inputs that should not be changed
    content: markOptimizable(`Use the following documents to answer your question: 
    ${markInput(doc1)}
    ${markInput(doc2)}
    
    Please follow the following instructions: 
    - Be concise and to the point
    - Use the provided documents to answer the question
    - If the documents do not contain the answer, say so
    - If the answer is not clear, ask follow-up questions

    User query: ${markInput(userQuery)}`
    )
  },
]
If you find that you need to remove the tags from a request, e.g. if you decide to send it to an LLM without using Auto, you can use removePromptMarkers to do so.
OpenAI SDK (.ts)
const markerRegex =
  /<_auto_optimizable>|<\/_auto_optimizable>|<_auto_input>|<\/_auto_input>/g;
const removePromptMarkers = (input: string) => input.replace(markerRegex, "");
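As a quick sanity check, stripping the markers from a tagged prompt should recover the original text exactly (a minimal sketch reusing the helpers above, with placeholder content):

```typescript
const markOptimizable = (input: string) =>
  `<_auto_optimizable>${input}</_auto_optimizable>`;
const markInput = (input: string) =>
  `<_auto_input>${input}</_auto_input>`;

const markerRegex =
  /<_auto_optimizable>|<\/_auto_optimizable>|<_auto_input>|<\/_auto_input>/g;
const removePromptMarkers = (input: string) => input.replace(markerRegex, "");

// Tag a prompt, then strip the markers: the result is the plain text
// that would be sent to an LLM outside of Auto.
const tagged = markOptimizable(`Answer using: ${markInput("doc text")}`);
const stripped = removePromptMarkers(tagged);
console.log(stripped); // "Answer using: doc text"
```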