Eclaire v0.6.0: Deploy Private AI in Minutes

TL;DR: Eclaire v0.6.0 makes self-hosting dramatically simpler: single-container deployment, optional Redis, SQLite support, and a new setup script. Plus expanded AI model support including vision models and llama.cpp multi-model routing.

The Big Picture

This release removes barriers to self-hosting. We’ve eliminated mandatory dependencies, simplified configuration, and made deployments more flexible. At the same time, we’ve expanded support for AI models and providers so you can use the setup that works best for you.

What’s New

One Container, Zero Hassle

Previously, running Eclaire required separate containers for the backend and workers, plus a Redis server. Now:

- The backend and workers run together in a single container
- Redis is optional rather than required
- SQLite is supported alongside Postgres

This means you can run Eclaire as a single Docker container with just a database: no message broker required.
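
As a rough sketch, a single-container run could look like the following. The image name, tag, and mount path here are assumptions for illustration; the setup.sh script described below generates the real configuration:

# Single container: backend and workers together, SQLite on a mounted volume, no Redis
docker run -d \
  --name eclaire \
  -p 3000:3000 \
  -v "$PWD/data:/data" \
  ghcr.io/eclaire-labs/eclaire:v0.6.0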

Easier Setup and Configuration

The new setup.sh script generates your configuration, multiple .env files have been consolidated into a single .env, and the built-in upgrade system applies database migrations automatically. No more manual migration commands after pulling a new version.
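
In practice, upgrading shrinks to something like this (assuming the compose file generated by setup.sh; migrations are applied automatically when the container starts):

docker compose pull   # fetch the new image
docker compose up -d  # restart; the upgrade system runs any pending migrations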

More AI Models and Providers

We’ve significantly expanded AI model support, including vision models, llama.cpp multi-model routing, and additional providers, so you can use the setup that works best for you.
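
For example, vision models can be served locally with llama.cpp’s llama-server. The filenames below are placeholders for whatever GGUF model and multimodal projector you’ve downloaded:

llama-server \
  -m qwen2.5-vl-7b-instruct-q4_k_m.gguf \
  --mmproj mmproj-qwen2.5-vl-7b-f16.gguf \
  --port 8080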

New Admin CLI

Manage AI providers and models from the command line:

docker compose run --rm eclaire admin-cli

The new admin CLI is built into the Docker image and replaces the previous model-cli.

Modern Frontend

Under the hood, we’ve rebuilt the frontend on a modern stack.

Before & After

Before (v0.5.x)            After (v0.6.0)
2-3 containers required    1 container
Redis required             Redis optional
Postgres only              Postgres or SQLite
Multiple .env files        Single .env
Manual migrations          Upgrade system

Upgrading from v0.5.x

Breaking Release

Due to significant architectural changes, there is no automated upgrade path from v0.5.x to v0.6.0.

We recommend setting up a fresh v0.6.0 instance using the new setup.sh script and transferring your data from your previous installation. Key changes to be aware of: the backend and workers now run in one container, Redis is optional, SQLite is supported alongside Postgres, configuration has moved to a single .env file, and admin-cli replaces model-cli.

Please review the CHANGELOG for the full list of breaking changes.

Need help migrating? Open an issue or reach out to us: we’re happy to help.

Getting Started

New to Eclaire? Getting started takes just a few commands:

mkdir eclaire && cd eclaire
curl -fsSL https://raw.githubusercontent.com/eclaire-labs/eclaire/main/setup.sh | sh
docker compose up

Start your local LLM server (we recommend llama.cpp), open http://localhost:3000, and create your account.
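
If you’re using llama.cpp, a minimal launch looks something like this (the model file is a placeholder for whatever GGUF you’re running, and the port is just an example):

llama-server -m your-model.gguf --port 8080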

Thank You

Thanks to everyone who has contributed feedback, reported issues, and helped make Eclaire better.

Try v0.6.0 today and let us know what you think!

Resources:

- GitHub repository: https://github.com/eclaire-labs/eclaire