
Four months ago, we launched Interfaze. It's a new model architecture that outperforms both closed and open-source models on deterministic tasks like OCR, data extraction, web scraping, and classification. These are tasks where developers expect high accuracy, structured output, and reliable automated backend processing.
We initially tackled this by training SLMs for specialized use cases. This approach gave birth to JigsawStack: one model per hyper-focused task, bundled with the necessary infrastructure.
JigsawStack Feedback
Speaking with customers, we identified three main challenges with JigsawStack:
We spent over 8 months researching and building Interfaze to solve these exact problems. You can read our paper here: https://arxiv.org/abs/2602.04101
The solution: Interfaze
Interfaze outperforms JigsawStack and generalized LLMs on deterministic developer tasks across all modalities: text, vision, audio, and multilingual tasks. Check out our landing page for benchmarks. Beyond better results, Interfaze offers:
JigsawStack & Interfaze Updates
JigsawStack isn't going anywhere. Much of the Interfaze infrastructure is built on top of it. For teams that need smaller models running on-premise on a single H100, JigsawStack will continue to serve that use case with ongoing upgrades and fixes.
However, since most self-serve users are now on Interfaze, we are shifting our branding focus to Interfaze across all socials and emails. Our official company entity will remain JigsawStack.
This transition will happen progressively over the next few days with zero downtime and no impact on your systems or APIs.
Upcoming Changes:
Our support team, SOC2 compliance, agreements, and licenses remain unchanged across both Interfaze and JigsawStack.
If you have any questions or want to migrate to Interfaze, feel free to reach out to support@jigsawstack.com :)