Chat completion (openai)
You can use the completion endpoint with the official OpenAI SDK:
import OpenAI from "openai";

const openaiClient = new OpenAI({
  apiKey: "<kirha_api_key>",
  baseURL: "https://api.kirha.ai/chat/v1/openai/crypto",
});

const response = await openaiClient.chat.completions.create({
  model: "gemini:gemini-2.5-flash",
  messages: [
    { role: "system", content: "You are a helpful crypto assistant." },
    { role: "user", content: "give me the balance of vitalik.eth" },
  ],
  stream: true,
});

for await (const chunk of response) {
  if (chunk.choices[0].delta.content) {
    console.log(chunk.choices[0].delta.content);
  }
}
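The same client also works without streaming. A minimal sketch, reusing the client and prompt above and assuming that stream: false returns a single JSON completion (as described by the response schema below):

// Non-streaming request: the whole completion arrives as one object.
const completion = await openaiClient.chat.completions.create({
  model: "gemini:gemini-2.5-flash",
  messages: [
    { role: "system", content: "You are a helpful crypto assistant." },
    { role: "user", content: "give me the balance of vitalik.eth" },
  ],
  stream: false,
});

// The assistant's reply is in the first choice's message.
console.log(completion.choices[0].message.content);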
Creates a completion for chat messages.
POST /chat/v1/openai/{verticalIds}/chat/completions HTTP/1.1
Host: api.kirha.ai
Content-Type: application/json
Accept: */*
Content-Length: 115
{
  "model": "openai:gpt-4.1",
  "messages": [
    {
      "role": "system",
      "content": "text"
    }
  ],
  "stream": false,
  "temperature": 1,
  "top_p": 1
}
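The endpoint can also be called without the SDK. A minimal sketch using fetch, assuming the API key is sent as an Authorization: Bearer header (which is how the OpenAI SDK transmits apiKey) and that crypto is used for the {verticalIds} path segment, as in the SDK example above:

// Raw HTTP call to the OpenAI-compatible completion endpoint.
// Assumption: Bearer auth and the "crypto" vertical, mirroring the SDK example.
const res = await fetch(
  "https://api.kirha.ai/chat/v1/openai/crypto/chat/completions",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer <kirha_api_key>",
    },
    body: JSON.stringify({
      model: "openai:gpt-4.1",
      messages: [{ role: "system", content: "text" }],
      stream: false,
      temperature: 1,
      top_p: 1,
    }),
  },
);

const data = await res.json();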
An OpenAI-compatible chat completion response (JSON or SSE stream):
{
  "id": "text",
  "object": "text",
  "created": 1,
  "choices": [
    {
      "index": 1,
      "message": {
        "role": "system",
        "content": "text"
      },
      "finish_reason": "text"
    }
  ]
}
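A minimal sketch of typing and reading that response in TypeScript; the field names come from the schema above, but the interfaces are assumptions for illustration, not the full schema:

// Shapes derived from the example response above (not exhaustive).
interface ChatChoice {
  index: number;
  message: { role: string; content: string };
  finish_reason: string;
}

interface ChatCompletion {
  id: string;
  object: string;
  created: number;
  choices: ChatChoice[];
}

// `data` is the parsed JSON body from the fetch sketch above.
const completion = data as ChatCompletion;
console.log(completion.choices[0].message.content, completion.choices[0].finish_reason);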