Interfaze
Handling Files

Pass files as a base64-encoded string, a binary file object, or a URL directly in the prompt, using URL context understanding.

File size limits

  • URL in prompt: 80 MB
  • Base64: 20 MB
  • Binary file object: 20 MB
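As a quick sanity check before sending a request, the limits above can be encoded in a small helper. This is a sketch, not part of the API: the `fitsLimit` name is ours, and we assume MB means 2^20 bytes.

```javascript
// Size limits per input method, in bytes (from the list above).
const LIMITS = {
  url: 80 * 1024 * 1024,
  base64: 20 * 1024 * 1024,
  binary: 20 * 1024 * 1024,
};

// Returns true when a payload of `sizeBytes` fits the given method's limit.
function fitsLimit(method, sizeBytes) {
  return sizeBytes <= LIMITS[method];
}
```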

URL

Pass a publicly accessible file URL directly in the prompt text. The model fetches and reads the file at inference time.

This is great for handling large files that don't fit in the context window.

OpenAI SDK

// `interfaze` is assumed to be an OpenAI-compatible client configured during setup
const response = await interfaze.chat.completions.create({
    model: "interfaze-beta",
    messages: [
        {
            role: "user",
            content: "Summarize this document for me: https://arxiv.org/pdf/2602.04101",
        },
    ],
});

console.log(response.choices[0].message.content);

Base64

This follows the same file-handling pattern as the OpenAI SDK.

Read a local file, encode it as base64, and pass it in the message content using the file content type.

OpenAI SDK

import fs from "fs";

const fileBuffer = fs.readFileSync("document.pdf");
const base64Data = fileBuffer.toString("base64");

const response = await interfaze.chat.completions.create({
    model: "interfaze-beta",
    messages: [
        {
            role: "user",
            content: [
                {
                    type: "file",
                    file: {
                        filename: "document.pdf",
                        file_data: `data:application/pdf;base64,${base64Data}`,
                    },
                },
                {
                    type: "text",
                    text: "Summarize this document.",
                },
            ],
        },
    ],
});

console.log(response.choices[0].message.content);
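Note that base64 encoding inflates payloads by about a third (every 3 input bytes become 4 output characters, with padding rounding up to a full group), which matters when a file is near the 20 MB limit. A rough sketch of the encoded size, ignoring the `data:` prefix:

```javascript
// Standard base64 with padding: 4 characters per 3-byte group, rounded up.
function base64Size(rawBytes) {
  return 4 * Math.ceil(rawBytes / 3);
}
```

For example, a 15 MiB file encodes to exactly 20 MiB of base64 text, so files well under the limit on disk can still exceed it after encoding.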

Binary File Object

Read a file as a binary Buffer or Blob (TypeScript) / bytes (Python) and pass it directly in the message. The SDK handles serialization automatically.

OpenAI SDK

import fs from "fs";

const fileBuffer = fs.readFileSync("document.pdf");
const blob = new Blob([fileBuffer], { type: "application/pdf" });

const response = await interfaze.chat.completions.create({
    model: "interfaze-beta",
    messages: [
        {
            role: "user",
            content: [
                {
                    type: "file",
                    file: {
                        filename: "document.pdf",
                        file_data: blob,
                    },
                },
                {
                    type: "text",
                    text: "Summarize this document.",
                },
            ],
        },
    ],
});

console.log(response.choices[0].message.content);
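Putting the three methods together: a hypothetical helper that builds the right message content part for whichever form of file you have. The function name and the URL heuristic are ours, not part of the SDK; the field names mirror the examples above, and the base64 branch assumes a PDF.

```javascript
// Build a user-message content part for a file, per the patterns above.
// `data` may be a public URL string, a base64 string, or a Blob/Buffer.
function fileContentPart(filename, data) {
  if (typeof data === "string" && data.startsWith("http")) {
    // URL method: the URL simply goes into the prompt text.
    return { type: "text", text: data };
  }
  // Base64 string (wrapped in a data URL; PDF assumed here) or binary object.
  const fileData =
    typeof data === "string"
      ? `data:application/pdf;base64,${data}`
      : data;
  return { type: "file", file: { filename, file_data: fileData } };
}
```

The returned object can be placed directly in the `content` array of a user message alongside a `text` part carrying the instruction.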