generate() and generate_streaming() propagate tracing callbacks automatically. No extra configuration is needed beyond setting LangfuseConfig on your pipeline.
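As a sketch, attaching tracing might look like the following. LangfuseConfig is named in this guide, but the field names below (public_key, secret_key, host) mirror Langfuse's standard credentials and are assumptions, not a verified RAGLight signature.

```python
# Hypothetical sketch: field names mirror Langfuse's standard
# credentials and are assumptions, not verified RAGLight parameters.
config = LangfuseConfig(
    public_key="pk-lf-...",            # from your Langfuse project settings
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",  # or your self-hosted instance
)
pipeline.langfuse_config = config  # attach to an existing pipeline
```

Once set, every generate() and generate_streaming() call is traced with no further wiring.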
Local Providers
Recommended for prototyping. Run entirely offline with Ollama, LMStudio,
or vLLM. Zero cost, total privacy.
Remote APIs
Recommended for production. Connect to OpenAI, Mistral, or Gemini for
higher reasoning capabilities.
AWS Bedrock
Managed cloud inference. Use Claude, Titan, Llama, and other Bedrock
models with your existing AWS credentials.
Configuration
Providers are configured using constants from the Settings class. This ensures type safety and prevents typo-related errors.
Two ways to use LLMs
You can use a provider in two modes:
- Directly (via Builder) for testing prompts or models without retrieval.
- In a Pipeline (via RAGPipeline) for the full RAG experience.
1. Direct Usage (The Builder)
Use the Builder pattern when you want a simple chat loop to validate a model or a system prompt.
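A minimal sketch of direct usage, following the Builder pattern described above; the import paths and method names here are assumptions and may differ slightly in your RAGLight version.

```python
# Sketch only: import paths and method names are assumptions
# based on RAGLight's Builder/Settings conventions.
from raglight.rag.builder import Builder
from raglight.config.settings import Settings

llm = (
    Builder()
    .with_llm(Settings.OLLAMA, model_name="llama3",
              system_prompt="You are a concise assistant.")
    .build_llm()
)

# Simple interactive loop to validate the model and prompt
while True:
    question = input("> ")
    if question in {"exit", "quit"}:
        break
    print(llm.generate(question))
```

Because no retrieval is configured, this is pure chat: useful for checking that the provider is reachable and the system prompt behaves as intended.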
2. RAG Pipeline Usage
In a RAG pipeline, the LLM (generation) and embeddings (retrieval) are configured separately. This lets you mix and match, e.g., local embeddings with a remote LLM. Below is an example using Google Gemini for both.
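A sketch of such a setup; RAGPipeline and Settings come from this guide, but the import path and parameter names below are assumptions, so check them against your installed version.

```python
# Sketch only: import path and parameter names are assumptions.
# Requires GEMINI_API_KEY to be set in the environment (see below).
from raglight.rag.simple_rag_api import RAGPipeline  # assumed path
from raglight.config.settings import Settings

pipeline = RAGPipeline(
    knowledge_base=...,                   # your document sources
    provider=Settings.GEMINI,             # LLM for generation
    embeddings_provider=Settings.GEMINI,  # assumed parameter name
    model_name="gemini-1.5-flash",
    k=5,                                  # retrieved chunks per query
)
pipeline.build()  # ingest and index the knowledge base
print(pipeline.generate("What does this project do?"))
```

Swapping Settings.GEMINI for Settings.OLLAMA on the embeddings side would give the local-embeddings, remote-LLM mix mentioned above.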
Provider Setup Checklist
Before running the code, ensure your environment is ready.
Local Providers
Ollama
1. Install Ollama from ollama.com.
2. Run ollama serve.
3. Pull a model: ollama pull llama3.
4. Default URL: http://localhost:11434 (handled by RAGLight).
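The steps above as commands (llama3 is just an example model):

```shell
ollama serve &                        # start the local server
ollama pull llama3                    # download a model
curl http://localhost:11434/api/tags  # sanity check: lists installed models
```

If the curl call returns a JSON list containing your model, RAGLight can reach Ollama at its default URL.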
LMStudio
1. Open LMStudio.
2. Go to the Local Server tab.
3. Load a model.
4. Click Start Server.
5. Ensure Settings.DEFAULT_LMSTUDIO_CLIENT matches the URL (usually http://localhost:1234/v1).
vLLM
1. Start vLLM with an OpenAI-compatible server.
2. Set api_base in your config to your vLLM endpoint.
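One way to start vLLM's OpenAI-compatible server (the model name is only an example; newer vLLM versions also accept the shorthand `vllm serve <model>`):

```shell
# Launch an OpenAI-compatible endpoint on port 8000
python -m vllm.entrypoints.openai.api_server \
    --model mistralai/Mistral-7B-Instruct-v0.2 \
    --port 8000
```

With this running, api_base in your RAGLight config would be http://localhost:8000/v1.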
Remote Providers
API Keys
Ensure these environment variables are set in your .env file:
- OpenAI: OPENAI_API_KEY
- Mistral: MISTRAL_API_KEY
- Google: GEMINI_API_KEY
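A small stdlib-only helper (illustrative, not part of RAGLight) can verify the checklist before you build a remote pipeline, so a missing key fails fast instead of surfacing as an opaque 401 later:

```python
import os

# Map each remote provider to the environment variable named in the
# checklist above.
PROVIDER_ENV_VARS = {
    "OpenAI": "OPENAI_API_KEY",
    "Mistral": "MISTRAL_API_KEY",
    "Google": "GEMINI_API_KEY",
}

def missing_provider_keys() -> list[str]:
    """Return the checklist variables that are absent or empty."""
    return [var for var in PROVIDER_ENV_VARS.values()
            if not os.environ.get(var)]

if missing_provider_keys():
    print("Missing keys:", ", ".join(missing_provider_keys()))
```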
AWS Bedrock
Authentication uses the standard boto3 credential chain — no extra install needed:
- Environment variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION
- Credentials file: ~/.aws/credentials
- IAM role: automatic when running on EC2, ECS, or Lambda
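boto3 walks this chain automatically, so no code is needed; the illustrative stdlib snippet below (read_aws_profile is a hypothetical helper, not a RAGLight or boto3 API) only shows the INI format the credentials-file step expects:

```python
import configparser
import tempfile

def read_aws_profile(path: str, profile: str = "default") -> dict:
    """Return the key/value pairs of one profile from an AWS-style
    credentials file, or {} if the profile is missing."""
    parser = configparser.ConfigParser()
    parser.read(path)
    return dict(parser[profile]) if profile in parser else {}

# Demo with a sample file in the ~/.aws/credentials format
sample = (
    "[default]\n"
    "aws_access_key_id = AKIAEXAMPLE\n"
    "aws_secret_access_key = example-secret\n"
)
with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write(sample)
creds = read_aws_profile(f.name)
```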