# Providers
Perspt supports multiple LLM providers through the genai crate.
## Supported Providers

| Provider | Environment Variable | Default Model | Notes |
|---|---|---|---|
| OpenAI | `OPENAI_API_KEY` | | GPT-4, GPT-4o, o-series |
| Anthropic | `ANTHROPIC_API_KEY` | | Claude 4 family |
| Google Gemini | `GEMINI_API_KEY` | | Gemini 2.x family |
| Groq | `GROQ_API_KEY` | | Llama, Mixtral |
| Cohere | `COHERE_API_KEY` | | Command R family |
| xAI | `XAI_API_KEY` | | Grok models |
| DeepSeek | `DEEPSEEK_API_KEY` | | DeepSeek models |
| Ollama | (none) | | Local models via Ollama |
## Configuration Methods

1. **Environment variables** (recommended):

   ```bash
   export GEMINI_API_KEY="your-key"
   perspt
   ```

2. **CLI flags**:

   ```bash
   perspt chat --api-key "your-key" --provider-type openai --model gpt-4.1
   ```

3. **Config file** (`~/.config/perspt/config.json`):

   ```json
   {
     "default_provider": "anthropic",
     "default_model": "claude-sonnet-4-20250514",
     "api_key": "sk-ant-xxx"
   }
   ```
Priority order: CLI flags > environment variables > config file > auto-detection.
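The precedence chain can be sketched in Rust. This is a hypothetical helper written for illustration (the function name, parameters, and structure are assumptions, not Perspt's actual internals):

```rust
// Sketch of the precedence chain: CLI flag > environment variable > config file.
// If all three resolve to None, Perspt falls through to auto-detection
// (e.g. a local Ollama instance).
fn resolve_api_key(
    cli_key: Option<String>,     // value from a --api-key flag, if given
    env_var: &str,               // provider env var name, e.g. "OPENAI_API_KEY"
    config_key: Option<String>,  // "api_key" from config.json, if present
) -> Option<String> {
    cli_key
        .or_else(|| std::env::var(env_var).ok())
        .or(config_key)
}

fn main() {
    // CLI flag present: it wins even when a config-file key also exists.
    let key = resolve_api_key(
        Some("cli-key".into()),
        "EXAMPLE_API_KEY",
        Some("config-key".into()),
    );
    println!("{}", key.unwrap()); // prints "cli-key"

    // No CLI flag and the env var is unset: fall back to the config file.
    let fallback = resolve_api_key(None, "PERSPT_UNSET_VAR_XYZ", Some("config-key".into()));
    println!("{}", fallback.unwrap()); // prints "config-key"
}
```

`Option::or_else` short-circuits, so the environment is only consulted when no CLI flag was supplied, matching the priority order above.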
## Listing Available Models

```bash
perspt --list-models
```
## Provider-Specific Notes

### OpenAI

```bash
export OPENAI_API_KEY="sk-xxx"
perspt chat --model gpt-4.1
```

### Anthropic

```bash
export ANTHROPIC_API_KEY="sk-ant-xxx"
perspt chat --model claude-sonnet-4-20250514
```

### Google Gemini

```bash
export GEMINI_API_KEY="AIza..."
perspt chat --model gemini-3.1-flash-lite-preview
```

### Ollama (Local)

```bash
ollama serve
ollama pull llama3.2
perspt chat --model llama3.2
```

No API key is required. Perspt auto-detects Ollama as the fallback provider.