AI API (OpenAI-Compatible)

OpenAI-compatible chat completions and model listing endpoints for the AACFlow AI API

AACFlow exposes an OpenAI-compatible AI API at /api/v1/ai. Any client or SDK that targets the OpenAI API can be pointed at AACFlow by changing the base URL and swapping the API key.

All AI API requests deduct from your AACFlow credit balance. A small deposit is reserved at the start of each request and reconciled against the actual model cost once the response is complete.

Authentication

Pass your AACFlow API key in either the x-api-key header (native) or the Authorization: Bearer <key> header (OpenAI SDK compatible).

Authorization: Bearer aacf_your_api_key_here

Base URL

https://www.aacflow.io/api/v1/ai

For local development:

http://localhost:4000/api/v1/ai

Chat Completions

POST /api/v1/ai/chat/completions

Generate a model response for a conversation. Request and response schemas are compatible with the OpenAI Chat Completions API.

Request body

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| model | string | Yes | Model identifier, e.g. `gpt-4o-mini`, `claude-sonnet-4-6` |
| messages | Message[] | Yes | Conversation history. At least one message required |
| temperature | number | No | Sampling temperature 0–2 |
| max_tokens | integer | No | Maximum tokens to generate |
| stream | boolean | No | Stream response as SSE (default false) |
| top_p | number | No | Nucleus sampling |
| stop | string \| string[] | No | Stop sequences |
| tools | Tool[] | No | Function-calling tools |
| tool_choice | string \| object | No | "auto", "none", "required", or named function |
| response_format | object | No | `{ "type": "json_object" }` or JSON schema |
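The optional fields combine into a single JSON body. The sketch below builds one in Python; the `get_weather` tool is a hypothetical example for illustration, not part of AACFlow:

```python
# Sketch of a chat completions request body exercising the optional fields
# from the table above. "get_weather" is a hypothetical tool name.
request_body = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "temperature": 0.2,
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool, defined by your app
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",
}
```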

Response headers

| Header | Description |
| --- | --- |
| X-Request-Id | Unique request identifier for support |
| X-Credits-Used | Credits deducted for this request (USD, 6 decimal places) |
| X-Credits-Remaining | Credit balance after this request |
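These headers can be read from any HTTP client (with the OpenAI Python SDK, `client.chat.completions.with_raw_response.create(...)` exposes them). A minimal sketch of a helper that extracts the billing fields; `parse_billing_headers` is a hypothetical function name, and the header values shown are made up:

```python
def parse_billing_headers(headers: dict) -> dict:
    """Hypothetical helper: pull AACFlow billing info out of response headers."""
    return {
        "request_id": headers.get("X-Request-Id"),
        "credits_used": float(headers["X-Credits-Used"]),
        "credits_remaining": float(headers["X-Credits-Remaining"]),
    }

# Example with made-up header values:
billing = parse_billing_headers({
    "X-Request-Id": "req_123",
    "X-Credits-Used": "0.000210",
    "X-Credits-Remaining": "4.992850",
})
```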

Example — non-streaming

cURL:

```bash
curl -X POST https://www.aacflow.io/api/v1/ai/chat/completions \
  -H "Authorization: Bearer aacf_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      { "role": "user", "content": "Summarise the top 3 risks in this contract." }
    ],
    "temperature": 0.2
  }'
```

Python (OpenAI SDK):

```python
from openai import OpenAI

client = OpenAI(
    api_key="aacf_your_key",
    base_url="https://www.aacflow.io/api/v1/ai",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Node.js (OpenAI SDK):

```typescript
import OpenAI from 'openai'

const client = new OpenAI({
  apiKey: 'aacf_your_key',
  baseURL: 'https://www.aacflow.io/api/v1/ai',
})

const response = await client.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }],
})
console.log(response.choices[0].message.content)
```

Example response

```json
{
  "id": "chatcmpl-a1b2c3d4e5f6",
  "object": "chat.completion",
  "created": 1720000000,
  "model": "gpt-4o-mini",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
```

Streaming

Set "stream": true to receive a text/event-stream response. Each chunk is a data: {...} SSE line with a partial ChatCompletionChunk object, terminated by data: [DONE].

```bash
curl -X POST https://www.aacflow.io/api/v1/ai/chat/completions \
  -H "Authorization: Bearer aacf_your_key" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Count to 5"}],"stream":true}'
```
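The SDKs handle SSE parsing for you when you pass `stream=True`, but the wire format above can also be consumed by hand. A minimal sketch assuming the format described (one `data: {...}` line per chunk, terminated by `data: [DONE]`); the sample lines are illustrative, not captured output:

```python
import json

def iter_chunks(sse_lines):
    """Parse raw SSE lines into chat.completion.chunk-style dicts."""
    for raw in sse_lines:
        line = raw.strip()
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        yield json.loads(payload)

# Illustrative sample of a streamed response:
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    '',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
text = "".join(c["choices"][0]["delta"]["content"] for c in iter_chunks(sample))
```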

List Models

GET /api/v1/ai/models

Returns the list of model identifiers available through the AI API.

```bash
curl https://www.aacflow.io/api/v1/ai/models \
  -H "Authorization: Bearer aacf_your_key"
```
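Since the endpoint is OpenAI-compatible, the response is assumed to follow the OpenAI-style list envelope (an `object: "list"` wrapper with model entries under `data`). A sketch of pulling out the identifiers from such a payload; the model entries shown are illustrative:

```python
# Assumed OpenAI-style list envelope; sample entries are illustrative.
sample_response = {
    "object": "list",
    "data": [
        {"id": "gpt-4o-mini", "object": "model"},
        {"id": "claude-sonnet-4-6", "object": "model"},
    ],
}
model_ids = [m["id"] for m in sample_response["data"]]
```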

Error responses

All errors follow the OpenAI error envelope format:

```json
{
  "error": {
    "message": "Human-readable description",
    "type": "invalid_request_error",
    "code": "model_not_found"
  }
}
```
| HTTP status | type | code | Cause |
| --- | --- | --- | --- |
| 401 | invalid_request_error | invalid_api_key | Missing or invalid API key |
| 400 | invalid_request_error | | Malformed request body |
| 402 | insufficient_quota | insufficient_credits | Credit balance too low |
| 404 | invalid_request_error | model_not_found | Model not supported |
| 500 | server_error | | Internal error |
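Because every error uses the same envelope, a single handler can turn any failure into a useful log line. A minimal sketch; `describe_error` is a hypothetical helper name and the example payload mirrors the 402 row above:

```python
def describe_error(status: int, body: dict) -> str:
    """Hypothetical helper: flatten the error envelope into one log line."""
    err = body["error"]
    parts = [str(status), err["type"]]
    if err.get("code"):  # code is absent for some statuses (e.g. 400, 500)
        parts.append(err["code"])
    return " ".join(parts) + ": " + err["message"]

msg = describe_error(402, {
    "error": {
        "message": "Credit balance too low",
        "type": "insufficient_quota",
        "code": "insufficient_credits",
    },
})
```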

Supported models

AACFlow routes requests to the underlying provider based on the model identifier. Any model listed on the Models page can be used here. The identifier is the same slug used elsewhere in the platform.

Not all models support all features (e.g. tool calling, structured output). Capabilities are documented on the individual model pages.


Using with LangChain / LlamaIndex

Point the OpenAI-compatible integration at the AACFlow base URL:

LangChain:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="claude-sonnet-4-6",
    openai_api_key="aacf_your_key",
    openai_api_base="https://www.aacflow.io/api/v1/ai",
)
```

LlamaIndex:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    model="gpt-4o-mini",
    api_key="aacf_your_key",
    api_base="https://www.aacflow.io/api/v1/ai",
)
```
