The `Settings` class is the central configuration backbone of RAGLight. It provides a single source of truth for:
- Provider identifiers (LLMs, embeddings, vector stores)
- Default models (optimized for local use)
- Environment configuration (loading `.env` automatically)
Philosophy: RAGLight avoids “magic values”. All defaults are exposed as constants in `Settings`, making pipelines predictable, debuggable, and easy to override.

## Global Setup

### Logging
Before building any pipeline, it is recommended to initialize the logging system. This ensures that RAGLight’s internal logs (ingestion progress, retrieval stats) are formatted correctly.

## Provider Identifiers
RAGLight uses string constants to identify providers. Using these constants prevents typo-related errors when configuring your `RAGConfig`.
### LLM & Embeddings

These constants are used in `RAGConfig.provider` and `VectorStoreConfig.provider`.
| Constant | Value | Description |
|---|---|---|
| `Settings.OLLAMA` | `"ollama"` | Recommended for local use. Requires a running Ollama instance. |
| `Settings.OPENAI` | `"openai"` | Uses the OpenAI API (requires an API key). |
| `Settings.MISTRAL` | `"mistral"` | Uses the Mistral API (requires an API key). |
| `Settings.VLLM` | `"vllm"` | Connects to a vLLM server (high throughput). |
| `Settings.LMSTUDIO` | `"lmstudio"` | Connects to the LM Studio local server. |
| `Settings.GOOGLE_GEMINI` | `"google_gemini"` | Uses Google’s Gemini API. |
| `Settings.HUGGINGFACE` | `"huggingface"` | Embeddings only. Runs locally via sentence-transformers. |
### Vector Stores
| Constant | Value | Description |
|---|---|---|
| `Settings.CHROMA` | `"chroma"` | Default. Local, persistent vector database. |
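The constants in the tables above boil down to plain class-level strings, so they compare equal to the raw values while giving you autocomplete and early failure on typos. A minimal sketch of the pattern (values taken from the tables; in real code you import `Settings` from RAGLight rather than defining it yourself):

```python
# Minimal sketch of the constant pattern used by Settings (illustrative only;
# import the real class from raglight in actual code).
class Settings:
    OLLAMA = "ollama"
    OPENAI = "openai"
    MISTRAL = "mistral"
    VLLM = "vllm"
    LMSTUDIO = "lmstudio"
    GOOGLE_GEMINI = "google_gemini"
    HUGGINGFACE = "huggingface"  # embeddings only
    CHROMA = "chroma"            # vector store

# Writing Settings.OLLAMA instead of typing "ollama" by hand means a typo
# surfaces immediately as an AttributeError, not a silent misconfiguration.
provider = Settings.OLLAMA
print(provider)  # → ollama
```

This is why the docs recommend the constants over raw strings: the value passed to your config is identical, but mistakes fail fast.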
## Default Configuration
RAGLight ships with “batteries-included” defaults, optimized for a standard local laptop (Apple Silicon / NVIDIA GPU).

### System Prompts
You can inspect or override the default system prompts used by the pipelines. These are designed to be readable and inspectable.

## Environment Variables
`Settings` automatically looks for a `.env` file in your working directory using python-dotenv. This is the secure way to manage API keys and custom endpoints without hardcoding them.
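Conceptually, this automatic loading behaves like the stdlib sketch below (a simplified re-implementation for illustration; RAGLight itself delegates to python-dotenv, which handles quoting, interpolation, and more):

```python
import os

def load_dotenv_sketch(path: str = ".env") -> dict:
    """Simplified .env parser: KEY=VALUE lines; blank lines and '#' comments ignored."""
    loaded = {}
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                loaded[key.strip()] = value.strip()
                # Like python-dotenv's default, don't clobber existing env vars.
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # like python-dotenv, silently do nothing if .env is absent
    return loaded
```

Because existing environment variables take precedence, values exported in your shell always win over the `.env` file.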
### API Keys

Supported keys for remote providers:
- `OPENAI_API_KEY`
- `MISTRAL_API_KEY`
- `GEMINI_API_KEY`
- `ANTHROPIC_API_KEY`
### Custom Endpoints

URLs for local/custom servers:
- `OLLAMA_CLIENT_URL` (default: `http://localhost:11434`)
- `LMSTUDIO_CLIENT`
- `OPENAI_CLIENT_URL` (useful for vLLM/compatible APIs)

## Example .env File
Create a `.env` file in your project root and define the variables listed above.
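A minimal example, using only the variables documented above (all values are placeholders; set only the ones your chosen providers need):

```ini
# API keys for remote providers (placeholders, replace with your own)
OPENAI_API_KEY=sk-your-key-here
MISTRAL_API_KEY=your-key-here
GEMINI_API_KEY=your-key-here

# Custom endpoints for local servers
OLLAMA_CLIENT_URL=http://localhost:11434
```

Keep this file out of version control (add `.env` to your `.gitignore`) so keys are never committed.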
## Ignore Folders
When using `FolderSource` to ingest data, RAGLight automatically ignores common build, cache, and system directories to keep your index clean and relevant.
- `.git`, `.vscode`, `.idea`
- `__pycache__`, `venv`, `env`
- `node_modules`
- `dist`, `build`
- `.DS_Store`
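The effect is equivalent to pruning those names during the folder walk, which can be sketched as follows (an illustrative re-implementation, not RAGLight's actual code; the ignore set mirrors the list above):

```python
import os

# Default ignore set, mirroring the list above (illustrative sketch only).
IGNORED = {".git", ".vscode", ".idea", "__pycache__", "venv", "env",
           "node_modules", "dist", "build", ".DS_Store"}

def iter_ingestable_files(root: str):
    """Yield file paths under root, skipping ignored directories and files."""
    for dirpath, dirnames, filenames in os.walk(root):
        # Mutating dirnames in place tells os.walk not to descend into them.
        dirnames[:] = [d for d in dirnames if d not in IGNORED]
        for name in filenames:
            if name not in IGNORED:  # .DS_Store is a file, not a directory
                yield os.path.join(dirpath, name)
```

Pruning at the directory level means entire trees like `node_modules` are never traversed, which matters for ingestion speed as much as for index cleanliness.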