Interfaze supports the Chat Completion API standard, so you can use it out of the box with LangChain by pointing the LangChain OpenAI client at Interfaze's base URL with your API key.
npm / yarn

```shell
npm install @langchain/openai @langchain/core
# or
yarn add @langchain/openai @langchain/core
```

Note: Interfaze supports the Chat Completion API standard and not the Response API standard.
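Because Interfaze speaks the Chat Completion standard, the same endpoint LangChain targets can also be called directly over HTTP. The sketch below shows the request the SDK builds under the hood; the `/v1/chat/completions` path follows the Chat Completion convention, `buildChatRequest` is an illustrative helper (not part of any SDK), and `<your-api-key>` must be replaced with your key.

```typescript
// Illustrative sketch: the raw Chat Completion request that the LangChain
// client sends on your behalf. buildChatRequest is a hypothetical helper.
function buildChatRequest(model: string, userText: string) {
  return {
    url: "https://api.interfaze.ai/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer <your-api-key>",
      },
      // Standard Chat Completion body: a model name plus a messages array.
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userText }],
      }),
    },
  };
}

const req = buildChatRequest("interfaze-beta", "Hello!");
console.log(req.url);
// To actually send it: const res = await fetch(req.url, req.init);
```

Any OpenAI-compatible client library can be configured the same way, which is why only the base URL and key need to change.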
LangChain SDK

```typescript
import { ChatOpenAI } from "@langchain/openai";

const interfaze = new ChatOpenAI({
  configuration: {
    baseURL: "https://api.interfaze.ai/v1",
  },
  apiKey: "<your-api-key>",
  model: "interfaze-beta",
});
```

LangChain SDK
```typescript
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const response = await interfaze.invoke([
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("Write a short story about a robot learning to paint"),
]);

console.log(response.content);
```

Learn more about structured output.
LangChain SDK

```typescript
import { z } from "zod";

const weatherSchema = z.object({
  city: z.string().describe("The name of the city"),
  temperature_celsius: z.number().describe("Current temperature in Celsius"),
  condition: z.string().describe("Weather condition, e.g. sunny, rainy, cloudy"),
});

const structuredModel = interfaze.withStructuredOutput(weatherSchema);
const result = await structuredModel.invoke("What is the current weather in Tokyo?");

console.log(result);
```

LangChain SDK
```typescript
import { HumanMessage } from "@langchain/core/messages";

const response = await interfaze.invoke([
  new HumanMessage({
    content: [
      { type: "text", text: "What is in this image?" },
      {
        type: "image_url",
        image_url: {
          url: "https://r2public.jigsawstack.com/interfaze/examples/construction.png",
        },
      },
    ],
  }),
]);

console.log(response.content);
```

Learn more about streaming.
LangChain SDK

```typescript
import { HumanMessage } from "@langchain/core/messages";

const stream = await interfaze.stream([
  new HumanMessage("Write a short story about a robot learning to paint"),
]);

for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
```

Learn more about function calling.
LangChain SDK

```typescript
import { HumanMessage } from "@langchain/core/messages";

// Step 1: Define tools
const tools = [
  {
    type: "function" as const,
    function: {
      name: "get_horoscope",
      description: "Get today's horoscope for an astrological sign.",
      parameters: {
        type: "object",
        properties: {
          sign: {
            type: "string",
            description: "An astrological sign like Taurus or Aquarius",
          },
        },
        required: ["sign"],
      },
    },
  },
];

// Step 2: Get tool call from model
const response = await interfaze.invoke(
  [new HumanMessage("Get my horoscope for Taurus")],
  { tools, tool_choice: "auto" },
);

// Step 3: Check if tool was called and execute function
if (response.additional_kwargs.tool_calls?.length) {
  const toolCall = response.additional_kwargs.tool_calls[0];
  const args = JSON.parse(toolCall.function.arguments);
  const result = `Today's horoscope for ${args.sign}: You will have a great day!`;

  // Step 4: Provide result back to model
  const finalResponse = await interfaze.invoke(
    [
      new HumanMessage("Get my horoscope for Taurus"),
      response,
      { role: "tool", tool_call_id: toolCall.id, content: result },
    ],
    { tools, tool_choice: "auto" },
  );

  console.log(finalResponse.content);
}
```
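Step 3 above hard-codes a single tool. Once you register several tools, a small dispatch table keeps the execution step generic; this is a sketch of that pattern, not part of the Interfaze or LangChain API, and the handler body is the same placeholder horoscope string used above.

```typescript
// Sketch: map tool names to handlers so step 3 scales past one tool.
type ToolHandler = (args: Record<string, unknown>) => string;

const handlers: Record<string, ToolHandler> = {
  // Placeholder implementation mirroring the example above.
  get_horoscope: (args) =>
    `Today's horoscope for ${args.sign}: You will have a great day!`,
};

// Look up the handler for the tool the model called and run it on the
// JSON-encoded arguments from toolCall.function.arguments.
function runToolCall(name: string, rawArgs: string): string {
  const handler = handlers[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(JSON.parse(rawArgs));
}

console.log(runToolCall("get_horoscope", '{"sign":"Taurus"}'));
```

In the loop above you would call `runToolCall(toolCall.function.name, toolCall.function.arguments)` instead of building the result inline, then feed the returned string back as the `tool` message content.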