Running Sentient from Source (Self-Host)
Use this documentation to run Sentient locally, in Self-Hosted Mode.
Self-Hosting Sentient with Docker: A Complete Guide
Welcome! This guide will walk you through setting up and running your own private instance of Sentient using Docker. The process is designed to be as simple as possible, boiling down to two main steps: configuring your environment and running a single command.
This self-hosted setup is special because it does not require Auth0 for authentication. Instead, it uses a secure, static token that you generate, ensuring your instance is completely independent and private. The default configuration uses a local LLM (Ollama) for chat, so you can get started without any external AI service API keys.
Prerequisites
Before you begin, please ensure you have the following installed and running on your system:
Docker and Docker Compose:
Windows/Mac: Docker Desktop is required. For Windows, make sure it's configured to use the WSL 2 backend.
Linux: Install Docker Engine and the Docker Compose Plugin.
Git: Required for cloning the project repository.
A Code Editor: A good editor like VS Code will make editing configuration files easier.
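Before editing any files, you can sanity-check the prerequisites from a terminal. This is just a convenience sketch, not part of the official setup:

```shell
# Report any prerequisite that is not on the PATH.
for cmd in docker git; do
  command -v "$cmd" >/dev/null 2>&1 || echo "missing: $cmd"
done

# The Compose plugin ships separately from Docker Engine on Linux,
# so check it explicitly.
docker compose version >/dev/null 2>&1 || echo "missing: docker compose plugin"
```

If nothing is printed, you're ready to continue.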
Step 1: Configure Your Environment
This is the most important step. You will create three environment files by copying and editing the provided templates. These files control everything from your login token to API keys for various features.
All paths are relative to the `src` directory inside the project you cloned.
1.1. Root Environment (`src/.env`)

This file is the master configuration for Docker Compose. It defines your database credentials, your private authentication token, and the build-time configuration for the services.

1. Navigate to the `src` directory.
2. Copy the template file `src/.env.selfhost.template` and rename the copy to `src/.env`.
3. Open the new `src/.env` file and edit the placeholders. See the template for guidance.
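The root file includes a placeholder for your private authentication token. One common way to generate a strong random value for it (assuming `openssl` is available; the exact variable name to paste it into comes from the template) is:

```shell
# Print a 64-character hex string suitable for use as a static
# authentication token; paste it into the token placeholder in src/.env.
openssl rand -hex 32
```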
1.2. Client Environment (`src/client/.env.selfhost`)

This file provides runtime environment variables specifically for the Next.js client container.

1. Navigate to the `src/client` directory.
2. Copy `src/client/.env.selfhost.template` and rename the copy to `src/client/.env.selfhost`.
3. Open the new file and edit the placeholders, ensuring they match the values from your root `src/.env` file.
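To make "matching values" concrete, here is an illustrative sketch of the kind of pairing involved. The variable names below are hypothetical; use the names defined in the template:

```bash
# src/client/.env.selfhost — illustrative only, variable names are hypothetical
NEXT_PUBLIC_AUTH_TOKEN=<same token value as in src/.env>
NEXT_PUBLIC_SERVER_URL=http://localhost:5000   # hypothetical backend address
```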
1.3. Server Environment (`src/server/.env.selfhost`)

This file configures the Python backend, including its database connections and optional API keys for third-party services.

1. Navigate to the `src/server` directory.
2. Copy `src/server/.env.selfhost.template` and rename the copy to `src/server/.env.selfhost`.
3. Open the new file and edit the required and optional values.
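For orientation, the LLM-related variables referenced in the appendix below live in this file. A default Ollama-backed configuration might look like the following sketch; the base URL and model name shown are assumptions, not values from the template:

```bash
# src/server/.env.selfhost — illustrative sketch
OPENAI_API_BASE_URL=http://ollama:11434/v1   # assumed address of the Ollama container
OPENAI_MODEL_NAME=llama3                     # hypothetical local model
```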
Step 2: Build and Run the Application
Once your environment files are configured, starting the application is a single command.
1. Open your terminal and navigate to the `src` directory of the project.
2. Run the following command:

```bash
docker compose -f docker-compose.selfhost.yml up --build -d
```

- `-f docker-compose.selfhost.yml`: Explicitly tells Docker Compose to use the self-hosting configuration.
- `--build`: Builds the images from the Dockerfiles the first time you run it, or whenever code has changed.
- `-d`: Runs the containers in "detached" mode, meaning they will run in the background.
The initial build may take several minutes as it downloads dependencies and the local LLM. Once it's complete, all services (Client, Server, Databases, etc.) will be running.
Step 3: Accessing Your Sentient Instance
That's it! You can now access your private Sentient instance by opening the client URL in your browser.
Managing Your Self-Hosted Instance
Here are some useful Docker commands to manage your running application. Run these from the `src` directory.
- **View Logs:** To see the live logs from any container, use:

```bash
# View backend server logs (most common)
docker compose -f docker-compose.selfhost.yml logs -f server

# View frontend client logs
docker compose -f docker-compose.selfhost.yml logs -f client
```

- **Stop the Application:** To stop all the running containers:

```bash
docker compose -f docker-compose.selfhost.yml down
```

- **Stop and Delete All Data:** To stop the application and completely wipe the database volumes (for a clean restart), use the `-v` flag. **Warning:** This is irreversible.

```bash
docker compose -f docker-compose.selfhost.yml down -v
```
Appendix: LLM Configuration
The default self-host setup uses Ollama to run a local LLM, so you don't need any external API keys for basic chat functionality. However, for memory and other features, you'll need a Gemini API Key.
You can also configure Sentient to use a different LLM provider:
**Using Gemini for Chat:**

1. Get a `GEMINI_API_KEY` and add it to your `src/.env` file.
2. In `src/server/.env.selfhost`, change `OPENAI_API_BASE_URL` to `http://litellm:4000`.
3. In `src/server/.env.selfhost`, change `OPENAI_MODEL_NAME` to `gemini-1.5-flash`.
4. Restart your application: `docker compose -f docker-compose.selfhost.yml up --build -d`.

**Using other OpenAI-compatible services (e.g., Groq, TogetherAI):**

1. In `src/.env`, change `OPENAI_API_KEY` from `ollama` to your actual API key.
2. In `src/server/.env.selfhost`, change `OPENAI_API_BASE_URL` to the service's endpoint URL and `OPENAI_MODEL_NAME` to the desired model.
3. Restart your application.
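Taken together, the Gemini switch amounts to edits along these lines (the base URL and model name are the values given in the steps above; the key placeholder is yours to fill in):

```bash
# src/.env
GEMINI_API_KEY=<your key from Google AI Studio>

# src/server/.env.selfhost
OPENAI_API_BASE_URL=http://litellm:4000
OPENAI_MODEL_NAME=gemini-1.5-flash
```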
Appendix: Optional API Keys
To unlock all of Sentient's capabilities, you'll need to provide API keys for various third-party services in the `src/server/.env.selfhost` file.
| Variable | Provider | Purpose |
| --- | --- | --- |
| `GEMINI_API_KEY` | Google AI Studio | (Highly Recommended) Powers memory embeddings and can be used for chat via LiteLLM. |
| `HF_TOKEN` | Hugging Face | (Optional) Required for WebRTC voice connections if not on localhost. |
| `COMPOSIO_API_KEY` | Composio | (Optional) A tool aggregator that simplifies connecting to Google services like Gmail, Calendar, and Drive. |
| `GOOGLE_CSE_ID` | Google Search | (Optional) The ID for your custom search engine. |
| `UNSPLASH_ACCESS_KEY` | Unsplash | (Optional) Used by agents to find images for presentations. |
| `DISCORD_CLIENT_ID` | Discord | (Optional) For connecting your Discord account. |