Introducing postgres-llm: Run AI Natively Inside Postgres

Ever wondered how seamless your data pipeline would be if you could invoke a deterministic large language model (LLM) directly from your Postgres database—without having to shuttle data back and forth through an application layer?

Meet postgres-llm: an open-source Postgres trigger function that brings AI to your database rows and columns, enabling tasks like translation, sentiment analysis, and OCR right in your SQL world.

Why Do LLMs Belong Inside the Database?

Machine learning and NLP models are typically orchestrated from a backend server, which means your data makes a round trip outside of Postgres and then returns. This is fine for batch jobs or when data security isn’t critical. But what if you want richer processing that’s:

  • Automated at the data tier: No extra middleware required
  • Always in sync: No lag between human input and AI output
  • Secure: Data never leaves your trusted database boundary

postgres-llm allows you to do just that, with idiomatic Postgres triggers.

How Does It Work?

postgres-llm is implemented as a dynamic trigger function written in PL/pgSQL. It uses the http and hstore extensions for outbound API requests and flexible responses.

It’s built to work with any LLM provider that matches the OpenAI chat completion API, including, by default, Interfaze. You can configure it with your chosen API key and endpoint.

The main user workflow:

  1. Define a Postgres function called call_llm.
  2. Create a trigger on your table for the column you want to process.
  3. Write a prompt instructing the LLM what to do.

When the column is inserted or updated, the LLM is invoked and the result is written to your target column; no app code required.

Quick Start: Sentiment Analysis End-to-End

Suppose you have a customer review system and want to analyze the sentiment of each review as it’s written.

1. Install Requirements

  • Ensure your Postgres instance includes the http and hstore extensions.
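Both extensions can be enabled with standard DDL (the http extension, also known as pgsql-http, must already be installed on the server):

```sql
-- Enable the extensions postgres-llm depends on:
-- http for outbound API requests, hstore for flexible responses
CREATE EXTENSION IF NOT EXISTS http;
CREATE EXTENSION IF NOT EXISTS hstore;
```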

2. Create the User Reviews Table

...
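As an illustration, a minimal table matching the columns used later in this post might look like the following (the table name reviews is an assumption):

```sql
-- The trigger will read review_text and write the
-- LLM's classification into sentiment
CREATE TABLE reviews (
    id SERIAL PRIMARY KEY,
    review_text TEXT NOT NULL,
    sentiment TEXT
);
```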

3. Set Up the LLM Call Function

  • Download or copy call_llm.sql from the repo.
  • Get your API key from the Interfaze dashboard and replace the placeholders (API_KEY).
  • Execute the script in your SQL environment to create the function.

4. Create a Trigger for Sentiment Analysis

call_llm takes three parameters, in order:

  • prompt: The prompt to send to the LLM.
  • output_column: The column to write the LLM response to.
  • input_column: The column whose value is read as context for the prompt.

For example, to create a trigger for sentiment analysis, you would do the following:

...
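A sketch of such a trigger, assuming call_llm receives its three arguments in the order listed above (the trigger name, prompt wording, and reviews table are illustrative):

```sql
CREATE TRIGGER analyze_sentiment
BEFORE INSERT OR UPDATE OF review_text ON reviews
FOR EACH ROW
EXECUTE FUNCTION call_llm(
    -- 1st: prompt sent to the LLM
    'Classify the sentiment of this review as positive, negative, or neutral. Answer with one word.',
    -- 2nd: column to write the LLM response to
    'sentiment',
    -- 3rd: column to read as input context
    'review_text'
);
```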

Once your trigger is set up, it will automatically process rows on both INSERT and UPDATE events.

Insert Example

...
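For instance, inserting a new review (text taken from the table below; the reviews table name is illustrative):

```sql
-- The sentiment trigger fires as this row is created
INSERT INTO reviews (review_text)
VALUES ('This product exceeded my expectations!');
```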

Effect:
The trigger runs as the row is created. The sentiment column is populated via the LLM (e.g., "positive").

Update Example

...
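For instance, revising the same row (again using the illustrative reviews table):

```sql
-- The trigger fires again on UPDATE of review_text
UPDATE reviews
SET review_text = 'The item arrived broken and late.'
WHERE id = 1;
```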

Effect:
The LLM re-analyzes the updated review text and updates the sentiment column accordingly (e.g., now "negative").

Table Evolution Example

id | review_text                            | sentiment
1  | This product exceeded my expectations! | positive
1  | The item arrived broken and late.      | negative

More Example Use Cases

Translation on Insert/Update

If you add a Spanish translation trigger:

...
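Such a trigger might be sketched as follows, assuming a review_text_es column exists on the table to hold the translation (column and trigger names are illustrative):

```sql
CREATE TRIGGER translate_review
BEFORE INSERT OR UPDATE OF review_text ON reviews
FOR EACH ROW
EXECUTE FUNCTION call_llm(
    'Translate the following text into Spanish.',
    'review_text_es',   -- output column for the translation
    'review_text'       -- input column to translate
);
```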

Web Search & OCR

Summarize a name:

...
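As an illustrative sketch, assuming a hypothetical people table with name and summary columns:

```sql
CREATE TRIGGER summarize_person
BEFORE INSERT OR UPDATE OF name ON people
FOR EACH ROW
EXECUTE FUNCTION call_llm(
    'Search the web for this person and summarize who they are in one sentence.',
    'summary',   -- output column for the summary
    'name'       -- input column with the name to look up
);
```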

OCR from image:

...
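As an illustrative sketch, assuming a hypothetical documents table with image_url and extracted_text columns:

```sql
CREATE TRIGGER ocr_document
BEFORE INSERT OR UPDATE OF image_url ON documents
FOR EACH ROW
EXECUTE FUNCTION call_llm(
    'Extract all text from the image at this URL.',
    'extracted_text',   -- output column for the OCR result
    'image_url'         -- input column with the image location
);
```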

Under the Hood: Technical Details

  • No extra middleware: Everything happens at the database layer.
  • API-based: Outbound HTTP is used for LLM calls, so network egress is required.
  • Safe to test: The trigger only runs for changes; use standard SQL to manage/disable triggers.
  • Any LLM provider: Defaults to Interfaze, but you can use OpenAI, Azure, Anthropic, or any provider with a compatible API.

Why Interfaze Works Best for Database Tasks

  • Interfaze’s architecture is designed to be highly deterministic. Learn more from our paper here.
  • Low cost, making it perfect for database tasks as rows scale into the millions.
  • Optimized for developer tasks and objectives like OCR, translation, and more.

Conclusion

postgres-llm brings the power of LLMs right to where your data lives, removing friction, reducing latency, and opening up a world of real-time AI automation possibilities.

No pipelines, no ETL, just pure Postgres and AI.

Check out the code and readme for all the details here:
👉 https://github.com/JigsawStack/postgres-llm