Eclaire v0.6.0: Deploy Private AI in Minutes
TL;DR: Eclaire v0.6.0 makes self-hosting dramatically simpler: single-container deployment, optional Redis, SQLite support, and a new setup script. Plus expanded AI model support including vision models and llama.cpp multi-model routing.
The Big Picture
This release removes barriers to self-hosting. We’ve eliminated mandatory dependencies, simplified configuration, and made deployments more flexible. At the same time, we’ve expanded support for AI models and providers so you can use the setup that works best for you.
What’s New
One Container, Zero Hassle
Previously, running Eclaire required separate containers for the backend and workers, plus a Redis server. Now:
- Single container deployment: Backend and workers run together by default
- Redis is optional: Use your database for job processing instead
- SQLite support: Lightweight alternative to Postgres for simpler setups
This means you can run Eclaire as a single Docker container with just a database: no message broker required.
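To make this concrete, a minimal `docker-compose.yml` for the single-container setup might look like the sketch below. The service name, image tag, port, and volume paths are illustrative assumptions, not taken from the project's published compose file:

```yaml
# Hypothetical single-container compose file: one Eclaire service with
# SQLite on a local volume -- no Redis, no separate worker container.
services:
  eclaire:
    image: eclaire/eclaire:0.6.0       # illustrative image name/tag
    ports:
      - "3000:3000"
    env_file: .env                     # the single .env produced by setup.sh
    volumes:
      - ./data/sqlite:/app/data/sqlite # database lives on the host
```

With Redis and the worker container gone, the only state to back up is the mounted data directory.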
Easier Setup and Configuration
- New `setup.sh` script guides you through initial configuration
- Single `.env` file instead of multiple configuration files
- Upgrade system: Runs an upgrade check and determines the best way to proceed
No more manual migration commands after pulling a new version.
More AI Models and Providers
We’ve significantly expanded AI model support:
- Vision models: Better support for multimodal models like Qwen3-VL
- llama.cpp router: Works with llama-server’s new multi-model routing
- More backends: Improved support for llama.cpp, MLX-LM, MLX-VLM, and LM Studio
- Better tool calling: More robust support for AI-native tool calling
New Admin CLI
Manage AI providers and models from the command line:
```shell
docker compose run --rm eclaire admin-cli
```
The new admin CLI is built into the Docker image and replaces the previous model-cli.
Modern Frontend
Under the hood, we’ve rebuilt the frontend:
- Migrated from Next.js to Vite + TanStack Router
- Faster development builds and improved performance
- Updated to Tailwind CSS v4
Before & After
| Before (v0.5.x) | After (v0.6.0) |
|---|---|
| 2-3 containers required | 1 container |
| Redis required | Redis optional |
| Postgres only | Postgres or SQLite |
| Multiple .env files | Single .env |
| Manual migrations | Upgrade system |
Upgrading from v0.5.x
Due to significant architectural changes, there is no automated upgrade path from v0.5.x to v0.6.0.
We recommend setting up a fresh v0.6.0 instance using the new setup.sh script and transferring your data from your previous installation. Key changes to be aware of:
- Data directory paths have changed (`data/db` → `data/postgres` or `data/sqlite`)
- Some environment variable names have changed
- The `model-cli` has been replaced with `admin-cli`
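For the data-directory change, the transfer amounts to copying the old directory into its new name. The sketch below assumes a SQLite-backed install, with `eclaire-v0.5` and `eclaire-v0.6` standing in for your actual install directories; the fixture lines only exist to make the example self-contained:

```shell
# Stand-in for an existing v0.5.x data directory (hypothetical layout).
mkdir -p eclaire-v0.5/data/db
echo "demo" > eclaire-v0.5/data/db/app.db

# v0.6.0 expects data/sqlite (or data/postgres) instead of data/db,
# so copy the old contents into the new location before first start.
mkdir -p eclaire-v0.6/data/sqlite
cp -r eclaire-v0.5/data/db/. eclaire-v0.6/data/sqlite/
```

Stop the old containers before copying so the database files are not written mid-transfer.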
Please review the CHANGELOG for the full list of breaking changes.
Need help migrating? Open an issue or reach out to us: we’re happy to help.
Getting Started
New to Eclaire? Getting started takes just a few commands:
```shell
mkdir eclaire && cd eclaire
curl -fsSL https://raw.githubusercontent.com/eclaire-labs/eclaire/main/setup.sh | sh
docker compose up
```
Start your local LLM server (we recommend llama.cpp), open http://localhost:3000, and create your account.
Thank You
Thanks to everyone who has contributed feedback, reported issues, and helped make Eclaire better.
Try v0.6.0 today and let us know what you think!
Resources: