
Streaming


Streaming enables Interfaze to deliver responses in real time, creating a faster, more interactive experience.

Text streaming

OpenAI SDK


import OpenAI from "openai";

// Interfaze exposes an OpenAI-compatible API, so the OpenAI SDK can be pointed
// at it. The base URL below is a placeholder; use your Interfaze endpoint.
const interfaze = new OpenAI({
  apiKey: process.env.INTERFAZE_API_KEY,
  baseURL: "https://api.interfaze.ai/v1", // placeholder endpoint
});

const stream = await interfaze.chat.completions.create({
  model: "interfaze-beta",
  messages: [
    { role: "user", content: "Write a short story about a robot learning to paint" }
  ],
  stream: true
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    process.stdout.write(content);
  }
}
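Because each chunk carries only a delta, you often want to accumulate the pieces into the full response text as well as print them. A minimal sketch of that pattern, using a mock async iterable in place of the live stream (the chunk shape matches the OpenAI-style deltas above; the mock and its contents are illustrative only):

```typescript
type Chunk = { choices: { delta: { content?: string } }[] };

// Mock stream yielding OpenAI-SDK-shaped delta chunks (stand-in for the live API).
async function* mockStream(): AsyncGenerator<Chunk> {
  for (const piece of ["Once ", "upon ", "a time."]) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

// Accumulate streamed deltas into the full response text.
async function collect(stream: AsyncIterable<Chunk>): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content;
    if (content) full += content;
  }
  return full;
}

collect(mockStream()).then((text) => console.log(text));
```

The same `collect` pattern works against the real stream, since it only relies on the `choices[0].delta.content` shape of each chunk.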

Object streaming

Stream structured JSON objects as they are generated, receiving partial data incrementally rather than waiting for the full response.

OpenAI SDK


import { z } from "zod";
import { zodResponseFormat } from "openai/helpers/zod";

// `interfaze` is an OpenAI-compatible client configured for the Interfaze API.

const storySchema = z.object({
  title: z.string().describe("The title of the story"),
  genre: z.string().describe("The genre of the story"),
  summary: z.string().describe("A brief summary of the story"),
});

const stream = interfaze.beta.chat.completions.stream({
  model: "interfaze-beta",
  messages: [
    { role: "user", content: "Write a short story about a robot learning to paint" }
  ],
  response_format: zodResponseFormat(storySchema, "story_schema"),
});

for await (const chunk of stream) {
  const partial = chunk.choices[0]?.delta?.content;
  if (partial) {
    process.stdout.write(partial);
  }
}

const finalMessage = await stream.finalMessage();
console.log(JSON.parse(finalMessage.content));
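The raw deltas are fragments of a single JSON document, so `JSON.parse` only succeeds once the object is complete; until then, a consumer simply keeps accumulating. A minimal sketch of that accumulation, with hardcoded fragments standing in for streamed deltas (the fragment values are illustrative only):

```typescript
// JSON fragments as they might arrive in streamed deltas (stand-in values).
const fragments = ['{"title": "The Painter', '", "genre": "Sci-fi"', "}"];

let buffer = "";
let parsed: unknown = null;

for (const fragment of fragments) {
  buffer += fragment;
  try {
    parsed = JSON.parse(buffer); // succeeds only once the JSON is complete
  } catch {
    // partial JSON: keep accumulating
  }
}

console.log(parsed); // the complete object, available after the final fragment
```

Libraries that support partial-JSON parsing can surface usable fields earlier; the try/parse loop above is the simplest dependency-free fallback.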