oAI - Open AI Chat Client
A powerful, modern Textual TUI chat client with multi-provider support (OpenRouter, Anthropic, OpenAI, Ollama) and MCP (Model Context Protocol) integration, enabling AI to access local files and query SQLite databases.
Features
Core Features
- 🖥️ Modern Textual TUI with async streaming and beautiful interface
- 🔄 Multi-Provider Support - OpenRouter, Anthropic (Claude), OpenAI (ChatGPT), Ollama (local)
- 🤖 Interactive chat with 300+ AI models across providers
- 🔍 Model selection with search, filtering, and capability icons
- 💾 Conversation save/load/export (Markdown, JSON, HTML)
- 📎 File attachments (images, PDFs, code files)
- 💰 Real-time cost tracking and credit monitoring (OpenRouter)
- 🎨 Dark theme with syntax highlighting and Markdown rendering
- 📝 Command history navigation (Up/Down arrows)
- 🌐 Universal Online Mode - Web search for ALL providers:
  - Anthropic Native - Built-in search with automatic citations ($0.01/search)
  - DuckDuckGo - Free web scraping (all providers)
  - Google Custom Search - Premium search option
- 🧠 Conversation memory toggle
- ⌨️ Keyboard shortcuts (F1=Help, F2=Models, Ctrl+S=Stats)
MCP Integration
- 🔧 File Mode: AI can read, search, and list local files
  - Automatic .gitignore filtering
  - Virtual environment exclusion
  - Large file handling (auto-truncates >50KB)
- ✍️ Write Mode: AI can modify files with permission
  - Create, edit, delete files
  - Move, copy, organize files
  - Always requires explicit opt-in
- 🗄️ Database Mode: AI can query SQLite databases
  - Read-only access (safe)
  - Schema inspection
  - Full SQL query support
Requirements
- Python 3.10-3.13
- API key for your chosen provider:
  - OpenRouter: openrouter.ai
  - Anthropic: console.anthropic.com
  - OpenAI: platform.openai.com
  - Ollama: No API key needed (local server)
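For Ollama, no key is required; if your local server is not on the default address, you can point oAI at it with /config (an illustrative sketch, assuming Ollama's usual default of http://localhost:11434):
/config ollama_base_url http://localhost:11434   # Only needed for non-default addresses
/provider ollama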
Installation
Option 1: Pre-built Binary (macOS/Linux, Recommended)
Download from Releases:
- macOS (Apple Silicon): oai_v3.0.0_mac_arm64.zip
- Linux (x86_64): oai_v3.0.0_linux_x86_64.zip
# Extract and install
unzip oai_v3.0.0_*.zip
mkdir -p ~/.local/bin
mv oai ~/.local/bin/
# macOS only: Remove quarantine and approve
xattr -cr ~/.local/bin/oai
# Then right-click oai in Finder → Open With → Terminal → Click "Open"
Add to PATH
# Add to ~/.zshrc or ~/.bashrc
export PATH="$HOME/.local/bin:$PATH"
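Then reload your shell and verify the binary is found (illustrative; adjust for your shell):
source ~/.zshrc   # or ~/.bashrc
oai version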
Option 2: Install from Source
# Clone the repository
git clone https://gitlab.pm/rune/oai.git
cd oai
# Install with pip
pip install -e .
Quick Start
# Start oAI (launches TUI)
oai
# Start with specific provider
oai --provider anthropic
oai --provider openai
oai --provider ollama
# Or with options
oai --provider openrouter --model gpt-4o --online --mcp
# Show version
oai version
On first run, you'll be prompted for your API key. Configure additional providers anytime with /config.
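For example, to add another provider's key from inside the TUI and switch to it (placeholder value; the available config keys are listed under Configuration below):
/config anthropic_api_key YOUR_ANTHROPIC_KEY
/provider anthropic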
Enable Web Search (All Providers)
# Using Anthropic native search (best quality, automatic citations)
/config search_provider anthropic_native
/online on
# Using free DuckDuckGo (works with all providers)
/config search_provider duckduckgo
/online on
# Using Google Custom Search (requires API key)
/config search_provider google
/config google_api_key YOUR_KEY
/config google_search_engine_id YOUR_ID
/online on
Basic Commands
# In the TUI interface:
/provider # Show current provider or switch
/provider anthropic # Switch to Anthropic (Claude)
/provider openai # Switch to OpenAI (ChatGPT)
/provider ollama # Switch to Ollama (local)
/model # Select AI model (or press F2)
/online on # Enable web search
/help # Show all commands (or press F1)
/mcp on # Enable file/database access
/stats # View session statistics (or press Ctrl+S)
/config # View configuration settings
/credits # Check account credits (shows API balance or console link)
Ctrl+Q # Quit
Web Search
oAI provides universal web search capabilities for all AI providers with three options:
Anthropic Native Search (Recommended for Anthropic)
Anthropic's built-in web search API with automatic citations:
/config search_provider anthropic_native
/online on
# Now ask questions requiring current information
What are the latest developments in quantum computing?
Features:
- ✅ Automatic citations - Claude cites its sources
- ✅ Smart searching - Claude decides when to search
- ✅ Progressive searches - Multiple searches for complex queries
- ✅ Best quality - Professional-grade results
Pricing: $10 per 1,000 searches ($0.01 per search) + token costs
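For example, a session that triggers 30 searches adds 30 × $0.01 = $0.30 in search fees on top of the usual token costs.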
Note: Only works with the Anthropic provider (Claude models)
DuckDuckGo Search (Default - Free)
Free web scraping that works with ALL providers:
/config search_provider duckduckgo # Default
/online on
# Works with Anthropic, OpenAI, Ollama, and OpenRouter
Features:
- ✅ Free - No API key or costs
- ✅ Universal - Works with all providers
- ✅ Privacy-friendly - Uses DuckDuckGo
Google Custom Search (Premium Option)
Google's Custom Search API for high-quality results:
/config search_provider google
/config google_api_key YOUR_GOOGLE_API_KEY
/config google_search_engine_id YOUR_SEARCH_ENGINE_ID
/online on
Get your API key: Google Custom Search API
MCP (Model Context Protocol)
MCP allows the AI to interact with your local files and databases.
File Access
/mcp on # Enable MCP
/mcp add ~/Projects # Grant access to folder
/mcp list # View allowed folders
# Now ask the AI:
"List all Python files in Projects"
"Read and explain main.py"
"Search for files containing 'TODO'"
Write Mode
/mcp write on # Enable file modifications
# AI can now:
"Create a new file called utils.py"
"Edit config.json and update the API URL"
"Delete the old backup files" # Always asks for confirmation
Database Mode
/mcp add db ~/app/data.db # Add database
/mcp db 1 # Switch to database mode
# Ask the AI:
"Show all tables"
"Find users created this month"
"What's the schema for the orders table?"
Command Reference
Chat Commands
| Command | Description |
|---|---|
| /help [cmd] | Show help |
| /model [search] | Select model |
| /info [model] | Model details |
| /memory on\|off | Toggle context |
| /online on\|off | Toggle web search |
| /retry | Resend last message |
| /clear | Clear screen |
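A typical sequence using these commands (illustrative; the search term is just an example):
/model claude              # Filter the model list by "claude"
/memory off                # Disable conversation memory
/online on                 # Allow web search for the next question
/retry                     # Resend the last message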
MCP Commands
| Command | Description |
|---|---|
| /mcp on\|off | Enable/disable MCP |
| /mcp status | Show MCP status |
| /mcp add <path> | Add folder |
| /mcp add db <path> | Add database |
| /mcp list | List folders |
| /mcp db list | List databases |
| /mcp db <n> | Switch to database |
| /mcp files | Switch to file mode |
| /mcp write on\|off | Toggle write mode |
Conversation Commands
| Command | Description |
|---|---|
| /save <name> | Save conversation |
| /load <name> | Load conversation |
| /list | List saved conversations |
| /delete <name> | Delete conversation |
| /export md\|json\|html <file> | Export |
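For example, saving and exporting the current session (names are placeholders):
/save project-notes
/list                      # Confirm it was saved
/export md project-notes.md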
Configuration
| Command | Description |
|---|---|
| /config | View settings |
| /config provider <name> | Set default provider |
| /config openrouter_api_key | Set OpenRouter API key |
| /config anthropic_api_key | Set Anthropic API key |
| /config openai_api_key | Set OpenAI API key |
| /config ollama_base_url | Set Ollama server URL |
| /config search_provider <provider> | Set search provider (anthropic_native/duckduckgo/google) |
| /config google_api_key | Set Google API key (for Google search) |
| /config online on\|off | Set default online mode |
| /config model <id> | Set default model |
| /config stream on\|off | Toggle streaming |
| /stats | Session statistics |
| /credits | Check credits (OpenRouter) |
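A minimal example of setting session defaults (the model ID is a placeholder; use /model or F2 to find real IDs):
/config provider anthropic
/config model MODEL_ID
/config stream on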
CLI Options
oai [OPTIONS]
Options:
-p, --provider TEXT Provider to use (openrouter/anthropic/openai/ollama)
-m, --model TEXT Model ID to use
-s, --system TEXT System prompt
-o, --online Enable online mode (OpenRouter only)
--mcp Enable MCP server
-v, --version Show version
--help Show help
Commands:
oai # Launch TUI (default)
oai version # Show version information
oai --help # Show help message
Configuration
Configuration is stored in ~/.config/oai/:
| File | Purpose |
|---|---|
| oai_config.db | Settings, conversations, MCP config |
| oai.log | Application logs |
| history.txt | Command history |
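You can inspect these files directly from a shell (paths as listed above):
ls -l ~/.config/oai/
tail -f ~/.config/oai/oai.log   # Follow the application log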
Project Structure
oai/
├── oai/
│ ├── __init__.py
│ ├── __main__.py # Entry point for python -m oai
│ ├── cli.py # Main CLI entry point
│ ├── constants.py # Configuration constants
│ ├── commands/ # Slash command handlers
│ ├── config/ # Settings and database
│ ├── core/ # Chat client and session
│ ├── mcp/ # MCP server and tools
│ ├── providers/ # AI provider abstraction
│ ├── tui/ # Textual TUI interface
│ │ ├── app.py # Main TUI application
│ │ ├── widgets/ # Custom widgets
│ │ ├── screens/ # Modal screens
│ │ └── styles.tcss # TUI styling
│ └── utils/ # Logging, export, etc.
├── pyproject.toml # Package configuration
├── build.sh # Binary build script
└── README.md
Troubleshooting
macOS Binary Issues
# Remove quarantine attribute
xattr -cr ~/.local/bin/oai
# Then in Finder: right-click oai → Open With → Terminal → Click "Open"
# After this, oai works from any terminal
MCP Not Working
# Check if model supports function calling
/info # Look for "tools" in supported parameters
# Check MCP status
/mcp status
# View logs
tail -f ~/.config/oai/oai.log
Import Errors
# Reinstall package
pip install -e . --force-reinstall
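If the oai command is still broken after reinstalling, you can launch the package module directly (per the project structure, __main__.py provides this entry point):
python -m oai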
Version History
v3.0.0-b3 (Current - Beta 3)
- 🔄 Multi-Provider Support - OpenRouter, Anthropic (Claude), OpenAI (ChatGPT), Ollama (local)
- 🔌 Provider Switching - Switch between providers mid-session with /provider
- 🧠 Provider Model Memory - Remembers last used model per provider
- ⚙️ Provider Configuration - Separate API keys for each provider
- 🌐 Universal Web Search - Three search options for all providers:
  - Anthropic Native - Built-in search with citations ($0.01/search)
  - DuckDuckGo - Free web scraping (default)
  - Google Custom Search - Premium option
- 🎨 Enhanced Header - Shows current provider and model
- 📊 Updated Config Screen - Shows provider-specific settings
- 🔧 Command Dropdown - Added /provider commands for discoverability
- 🎯 Auto-Scrolling - Chat window auto-scrolls during streaming responses
- 🎨 Refined UI - Thinner scrollbar, cleaner styling
- 🔧 Updated Models - Claude 4.5 (Sonnet, Haiku, Opus)
v3.0.0 (Beta)
- 🎨 Complete migration to Textual TUI - Modern async terminal interface
- 🗑️ Removed CLI interface - TUI-only for cleaner codebase (11.6% smaller)
- 🖱️ Modal screens - Help, stats, config, credits, model selector
- ⌨️ Keyboard shortcuts - F1 (help), F2 (models), Ctrl+S (stats), etc.
- 🎯 Capability indicators - Visual icons for model features (vision, tools, online)
- 🎨 Consistent dark theme - Professional styling throughout
- 📊 Enhanced model selector - Search, filter, capability columns
- 🚀 Default command - Just run oai to launch the TUI
- 🧹 Code cleanup - Removed 1,300+ lines of CLI code
v2.1.0
- 🏗️ Complete codebase refactoring to modular package structure
- 🔌 Extensible provider architecture for adding new AI providers
- 📦 Proper Python packaging with pyproject.toml
- ✨ MCP integration (file access, write mode, database queries)
- 🔧 Command registry pattern for slash commands
- 📊 Improved cost tracking and session statistics
v1.9.x
- Single-file implementation
- Core chat functionality
- File attachments
- Conversation management
License
MIT License - See LICENSE for details.
Author
Rune Olsen
- Project: https://iurl.no/oai
- Repository: https://gitlab.pm/rune/oai
Contributing
- Fork the repository
- Create a feature branch
- Submit a pull request
⭐ Star this project if you find it useful!