Interfaze

Introducing postgres-llm: Run AI Natively Inside Postgres


Ever wondered how seamless your data pipeline would be if you could invoke a deterministic large language model (LLM) directly from your Postgres database—without having to shuttle data back and forth through an application layer?

Meet postgres-llm: an open-source dynamic Postgres integration that runs LLM requests directly in your database, making it a natural fit for tasks like translation, classification, summarization, and more.

Features

  • Use any OpenAI chat completion API compatible LLM
  • Reference context from any column within the same row using {column_name} placeholders
  • Automatically update one or more target columns with the LLM result
  • Uses OpenAI-compatible JSON schema structured output for reliable responses
  • Supports common use cases: Translation, Sentiment Analysis, Image Analysis, Web Search, and more

Why Do LLMs Belong Inside the Database?

Machine learning and NLP models are typically orchestrated from a backend server, which means your data makes a round trip out of Postgres and back. That's fine for batch jobs, or when data security isn't critical. But what if you want richer processing that's:

  • Automated at the data tier: No extra middleware required
  • Always in sync: No lag between human input and AI output
  • Secure: Data never leaves your trusted database boundary

postgres-llm allows you to do just that, with idiomatic Postgres triggers.

How Does It Work?

postgres-llm is implemented as a dynamic trigger function written in PL/pgSQL. It uses the http and hstore extensions for outbound API requests and flexible responses.

It's built to work with any LLM provider that implements the OpenAI Chat Completions API. Out of the box it's configured for Interfaze: just plug in your API key and you're ready to go.

The main user workflow:

  1. Create the call_llm function in your database.
  2. Create a trigger on your table for the column you want to process.
  3. Write a prompt with {column_name} placeholders referencing row data.

Whenever the column is inserted or updated, the LLM is invoked and the result is written to your target column(s), with no app code required.

Quick Start: Sentiment Analysis End-to-End

Suppose you have a customer review system and want to analyze the sentiment of each review as it’s written.

1. Install Requirements

  • Ensure your Postgres instance includes the http and hstore extensions.
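If the extensions aren't enabled yet, they can be created with the standard commands (this requires sufficient privileges; on managed platforms you may need to enable them from a dashboard instead):

```sql
-- Enable the extensions that call_llm depends on
CREATE EXTENSION IF NOT EXISTS http;
CREATE EXTENSION IF NOT EXISTS hstore;
```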

2. Create the User Reviews Table

CREATE TABLE public.user_reviews (
  id bigint generated by default as identity not null,
  created_at timestamp with time zone not null default now(),
  user_id uuid not null default gen_random_uuid(),
  review_text text not null,
  sentiment text null,
  constraint user_reviews_pkey primary key (id)
);

3. Set Up the LLM Call Function

  • Download or copy call_llm.sql from the repo.
  • Replace the API_KEY with your Interfaze API key from the dashboard.
  • Execute the script in your SQL environment to create the function.

4. Create a Trigger for Sentiment Analysis

call_llm takes the following parameters:

  • prompt (required): The prompt to use for the LLM. Supports {column_name} placeholders that get replaced with values from the row.
  • target_column(s) (required): One or more columns to update with the LLM result. Pass additional column names as extra arguments.

For a single target column:

call_llm('<prompt>','<target_column>');

For multiple target columns:

call_llm('<prompt>','<target_column_1>','<target_column_2>','<target_column_3>');

The function uses OpenAI JSON schema structured output (response_format with type: json_schema) to ensure the LLM returns a valid JSON object with exactly the specified target column names as keys.
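To make that concrete, here is a rough sketch of the kind of request body the function assembles for a single target column. The model name and schema field values below are assumptions for illustration; check call_llm.sql for the real payload:

```sql
-- Sketch (assumed values): the response_format constrains the model to
-- return a JSON object whose only key is the target column name.
SELECT jsonb_build_object(
  'model', 'interfaze-beta',  -- assumed model name
  'messages', jsonb_build_array(
    jsonb_build_object('role', 'user', 'content',
      'Analyze the sentiment of this text ...')
  ),
  'response_format', jsonb_build_object(
    'type', 'json_schema',
    'json_schema', jsonb_build_object(
      'name', 'llm_result',
      'schema', jsonb_build_object(
        'type', 'object',
        'properties', jsonb_build_object(
          'sentiment', jsonb_build_object('type', 'string')
        ),
        'required', jsonb_build_array('sentiment'),
        'additionalProperties', false
      )
    )
  )
);
```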

For example, to create a trigger for sentiment analysis:

DROP TRIGGER IF EXISTS analyze_sentiment ON user_reviews;
CREATE TRIGGER analyze_sentiment
BEFORE INSERT OR UPDATE OF review_text ON user_reviews
FOR EACH ROW
WHEN (NEW.review_text IS NOT NULL)
EXECUTE FUNCTION call_llm(
  'Analyze the sentiment of this text and respond with only "positive", "negative", or "neutral". Return the value in lowercase. Text: {review_text}',
  'sentiment'
);

Once your trigger is set up, it will automatically process rows on both INSERT and UPDATE events.
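Note that the trigger only fires on new writes, so rows that existed before the trigger was created are left untouched. One way to backfill them is a no-op update that re-sets the watched column, since listing a column in SET is enough to fire an UPDATE OF trigger:

```sql
-- Backfill existing rows: re-setting review_text fires the
-- UPDATE OF trigger even though the value is unchanged
UPDATE user_reviews
SET review_text = review_text
WHERE sentiment IS NULL;
```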

Insert Example

INSERT INTO user_reviews (review_text)
VALUES ('I love this hackathon, I can build anything I want!!!!')
RETURNING *;

Effect:
The trigger runs as the row is created. The sentiment column is populated via the LLM (e.g., "positive").

Update Example

UPDATE user_reviews
SET review_text = 'The item arrived broken and late.'
WHERE id = 1
RETURNING *;

Effect:
The LLM re-analyzes the updated review text and updates the sentiment column accordingly (e.g., now "negative").

Table Evolution Example

id  review_text                              sentiment
1   This product exceeded my expectations!   positive
1   The item arrived broken and late.        negative

More Example Use Cases

Translation on Insert/Update

If you add a Spanish translation trigger:

CREATE TRIGGER translate_es
BEFORE INSERT OR UPDATE OF user_review_og ON user_reviews
FOR EACH ROW
WHEN (NEW.user_review_og IS NOT NULL)
EXECUTE FUNCTION call_llm(
  'Translate the following text to Spanish (es). Only return the Spanish text with no additional text. Text: {user_review_og}',
  'user_review_es'
);

Web Search & OCR

Summarize a person's background from their name:

CREATE TRIGGER background_search
BEFORE INSERT OR UPDATE OF full_name ON user_reviews
FOR EACH ROW
WHEN (NEW.full_name IS NOT NULL)
EXECUTE FUNCTION call_llm(
  'Give a summary background on {full_name}.',
  'user_background'
);

OCR from image:

CREATE TRIGGER vision_ocr
BEFORE INSERT OR UPDATE OF attached_image_url ON user_reviews
FOR EACH ROW
WHEN (NEW.attached_image_url IS NOT NULL)
EXECUTE FUNCTION call_llm(
  'Extract all text from this image: {attached_image_url}',
  'image_description'
);

Multi-column: Sentiment Analysis + Translation

A single trigger can populate multiple columns at once from one LLM call:

CREATE TRIGGER analyze_and_translate
BEFORE INSERT OR UPDATE OF user_review_og ON user_reviews
FOR EACH ROW
WHEN (NEW.user_review_og IS NOT NULL)
EXECUTE FUNCTION call_llm(
  'Analyze the sentiment and translate the following review to Spanish. Text: {user_review_og}',
  'emotion',
  'user_review_es'
);

Multi-column Input + Multi-column Output

You can reference multiple columns in the prompt using {column_name} placeholders and write the results to multiple target columns at once:

CREATE TABLE public.user_visits (
  id uuid not null default gen_random_uuid(),
  created_at timestamp with time zone not null default now(),
  full_name text not null,
  company text null,
  summary text null,
  linkedin text null,
  constraint user_visits_pkey primary key (id)
);
DROP TRIGGER IF EXISTS user_visits_search_summary ON user_visits;
CREATE TRIGGER user_visits_search_summary
BEFORE INSERT OR UPDATE OF full_name ON user_visits
FOR EACH ROW
WHEN (NEW.full_name IS NOT NULL)
EXECUTE FUNCTION call_llm(
  'Give a summary background on who this person is and their linkedin url. Details: {full_name}, {company}',
  'summary',
  'linkedin'
);

Here {full_name} and {company} are both pulled from the row as context, and the LLM result is split across both the summary and linkedin columns.
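With that trigger in place, a single insert supplying just the input columns would populate both outputs (the values below are purely illustrative):

```sql
INSERT INTO user_visits (full_name, company)
VALUES ('Ada Lovelace', 'Analytical Engines Ltd')  -- hypothetical example data
RETURNING full_name, summary, linkedin;
```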

Under the Hood: Technical Details

  • No extra middleware: Everything happens at the database layer.
  • API-based: Outbound HTTP is used for LLM calls, so network egress is required.
  • Safe to test: The trigger only runs for changes; use standard SQL to manage/disable triggers.
  • Any LLM Provider: Pre-configured for Interfaze, but works with any OpenAI chat completion API compatible provider.
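For example, standard ALTER TABLE commands pause and resume processing without dropping anything:

```sql
-- Temporarily stop LLM calls on this table
ALTER TABLE user_reviews DISABLE TRIGGER analyze_sentiment;
-- Resume automatic processing
ALTER TABLE user_reviews ENABLE TRIGGER analyze_sentiment;
```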

Why Does Interfaze Work Best for Database Tasks?

  • Interfaze's architecture is designed to be highly deterministic. Learn more from our paper here.
  • Low cost, making it well suited to database tasks as rows scale into the millions.
  • Optimized for developer tasks and objectives like OCR, translation, and more.

Conclusion

postgres-llm brings the power of LLMs right to where your data lives, removing friction, reducing latency, and opening up a world of real-time AI automation possibilities.

No pipelines, no ETL, just pure Postgres and AI.

Check out the code and readme for all the details here:
👉 https://github.com/JigsawStack/postgres-llm