A powerful command-line interface and REPL for interacting with local Ollama models, featuring tool execution, screen vision, memory management, and RAG capabilities.
A modern terminal-based chat interface with rich Markdown support.
Slash Commands
| Command | Description |
| --- | --- |
| /new | Start a fresh session context |
| /history | View a list of past sessions |
| /model-set <name> | Switch the active Ollama model |
| /clear | Clear the current screen |
| /exit | Close the application |
Execute single-shot tasks directly from your shell.
CLI Args
| Flag | Description |
| --- | --- |
| -p "..." | Prompt to execute immediately |
| --rag <db> | Use a specific RAG database |
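The two flags can be combined to answer a one-off question against a named knowledge base (the database name below is illustrative):

```shell
# Query an existing RAG database named "docs" (name is illustrative)
ollama-agent --rag docs -p "What does the deployment guide say about rollbacks?"
```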
Contextual Retrieval Augmented Generation using local vector stores.
Management Commands
| Command | Description |
| --- | --- |
| /rag-create <name> | Initialize a new knowledge base |
| /rag-add <path> | Ingest file or directory |
| /rag-load <name> | Activate a specific DB |
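Conceptually, a RAG database pairs ingested text with embedding vectors and retrieves the closest chunks at query time. The sketch below is a minimal, generic illustration of that retrieval step, not this project's actual implementation; the keyword-count "embedding" is a toy stand-in for a real model such as nomic-embed-text:

```python
import math

# Toy stand-in for an embedding model: maps text to a small vector
# by counting keyword hits. A real vector store uses a learned model.
KEYWORDS = ["disk", "memory", "network"]

def embed(text: str) -> list[float]:
    words = text.lower().split()
    return [float(sum(w.startswith(k) for w in words)) for k in KEYWORDS]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# "Ingest": store (text, vector) pairs — the role played by /rag-add.
store = [(doc, embed(doc)) for doc in [
    "disk usage is high on /var",
    "network latency spiked at noon",
    "memory pressure on the build host",
]]

def retrieve(query: str, k: int = 1) -> list[str]:
    # Embed the query, then rank stored chunks by cosine similarity.
    qv = embed(query)
    ranked = sorted(store, key=lambda item: cosine(qv, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve("why is the disk full?"))
```

The retrieved chunks are then prepended to the prompt, which is how an active RAG database grounds the model's answers in your own documents.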
Long-term persistence powered by Mem0 and Qdrant.
The agent automatically stores and retrieves user preferences and context across sessions.
Actions
| Tool | Description |
| --- | --- |
| mem0_add_memory | Explicitly save facts |
| mem0_search | Recall stored information |
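In practice this is transparent: mention a preference in conversation and the agent can persist it with mem0_add_memory, then recall it in a later session via mem0_search. An illustrative (hypothetical) exchange:

```
> Remember that I prefer summaries in bullet points.
[tool] mem0_add_memory: "user prefers bullet-point summaries"
Got it — future summaries will use bullet points.
```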
Give your agent eyes. Capture and analyze your screen contents.
Supported on Linux (X11/Wayland).
Syntax
| Token | Description |
| --- | --- |
| @dp0 | Capture the primary display |
| @dp1 | Capture the secondary display |
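A capture token is combined with a question in a single prompt; the screenshot is passed to the model alongside your text. For example:

```
> @dp0 What application windows are visible on this display?
```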
Define reusable workflows in YAML to automate complex queries.
Task Commands
| Command | Description |
| --- | --- |
| /tasks | List available tasks |
| /task-run <id> | Execute a specific task |
| /task-create | Interactive task builder |
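A task file pairs an identifier with the prompt to run. The sketch below is illustrative only — the field names are guesses, not the project's documented schema (use /task-create to generate a valid file):

```yaml
# Hypothetical task definition — field names are illustrative,
# not the project's actual format.
id: log-summary
description: Summarize recent errors in system logs
prompt: |
  Read the last 200 lines of /var/log/syslog and summarize any errors.
```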
Extend capabilities with Model Context Protocol servers.
Edit ~/.ollama-agent/mcp_servers.json to add servers.
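Many MCP clients use a registry of server launch commands; the shape below is a common convention, shown here as an illustration — treat the exact field names as whatever ollama-agent's schema expects:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user/projects"]
    }
  }
}
```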
The agent can interact with your system to perform real-world actions.
Built-in Capabilities
| Tool | Description |
| --- | --- |
| execute_command | Run shell commands securely |
Requirements
Ensure you have Ollama running with a tool-capable model (such as gpt-oss:20b, pulled below) and an embedding model.
ollama pull gpt-oss:20b
ollama pull nomic-embed-text
Installation
pipx install git+https://github.com/arrase/ollama-agent.git
Interactive Mode
Start the chat interface to begin a session.
ollama-agent
One-off Commands
Execute a single prompt directly from your shell.
ollama-agent -p "Find large files in /var/log and summarize them"