# Aura

Agentic coding CLI.

Aura is a terminal-based coding assistant that connects to local and remote LLMs. Agents, tools, modes, guardrails, and providers are defined as YAML and Markdown files.
```sh
# Start interactive session
aura

# One-off prompt
aura run "Write a Go function that reverses a string"

# Embedding-based codebase search
aura query "token counting"

# List available models
aura models
```
## Task Orchestration

Tasks are YAML files that sequence prompts, slash commands, and shell commands. /assert and /until add condition gates: the LLM keeps working until the condition is satisfied.
```yaml
build-app:
  agent: high
  timeout: 60m
  commands:
    - /mode plan
    - Read SPEC.md and generate a plan.
    - /until not todo_empty "Create the plan with TodoCreate"
    - /mode edit
    - /auto on
    - Execute the plan.
    - /until bash:"go build ./..." "Build is failing. Fix the errors."
```
Tasks support cron scheduling, foreach iteration over files or shell output, pre/post shell hooks, session continuity across runs, and template variables. See Examples for more patterns.
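A scheduled foreach task might be sketched as follows. This is an illustration of the features listed above, not confirmed syntax: the schedule, foreach, and pre keys and the {{ .File }} template variable are assumed names.

```yaml
# Sketch only: schedule, foreach, pre, and {{ .File }} are assumed
# names illustrating the documented features, not confirmed syntax.
weekly-lint:
  agent: high
  schedule: "0 6 * * 1"            # cron: Mondays at 06:00
  foreach: "git ls-files '*.go'"   # iterate over shell output
  pre: "go vet ./... || true"      # shell hook before the run
  commands:
    - Review {{ .File }} and fix any vet warnings.
```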
## Providers

Aura connects to local and remote LLM providers. No single vendor is required: switch providers per agent, per task, or at runtime.
| Provider | Notes |
|---|---|
| Ollama | Local models, embedding, thinking, vision |
| LlamaCPP | Local llama.cpp server; also used for whisper and kokoro |
| OpenRouter | Cloud models, token auth |
| OpenAI | Any /v1/responses endpoint |
| Anthropic | Claude models, thinking, vision |
| Gemini | Gemini models, thinking, vision, embeddings |
| Copilot | GitHub Copilot subscription |
| Codex | ChatGPT Plus/Pro subscription |
## Configuration
Everything is file-based YAML and Markdown. Agents, modes, prompts, hooks, and tasks support inherit: with DAG-based resolution (multi-parent, cycle detection). Git-trackable and mergeable.
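Inheritance could be sketched like this, assuming definitions are keyed by name; beyond the inherit: key itself, every field shown is an illustrative assumption:

```yaml
# Sketch: only inherit: is documented here; other fields are assumed.
base:
  provider: ollama
  model: qwen3-coder
plan-only:
  mode: plan
reviewer:
  inherit: [base, plan-only]   # multi-parent, resolved as a DAG
  prompt: prompts/reviewer.md
```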
Every Bash tool command can be intercepted through a Go template (tools.bash.rewrite) before execution: wrap commands with output optimizers, activate virtualenvs, or route them through containers.
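For example, a rewrite template could activate a virtualenv and cap output. This is a sketch; the {{ .Command }} variable name is an assumption about the template context:

```yaml
tools:
  bash:
    # Go template applied to each Bash command before execution.
    # {{ .Command }} is an assumed name for the original command string.
    rewrite: |
      source .venv/bin/activate 2>/dev/null; {{ .Command }} 2>&1 | tail -n 200
```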
## Extensibility

- **Go plugins**: interpreted Go code (Yaegi) with lifecycle hooks at 8 timings, custom tools with sandbox integration, and custom slash commands. Distributed via git with vendored dependencies.
- **Skills**: LLM-invocable Markdown prompts. The LLM sees their names and descriptions and calls them when relevant.
- **Custom commands**: user-defined slash commands as Markdown files with argument substitution.
- **Hooks**: shell commands run before/after tool execution, with file glob filtering and DAG ordering.
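A hook might be declared like this; the key names (event, glob, needs) are assumptions illustrating the glob filtering and DAG ordering described above:

```yaml
# Sketch: key names are illustrative assumptions, not confirmed syntax.
gofmt:
  event: after-tool        # run after a tool finishes
  glob: "**/*.go"          # only fire when Go files are touched
  command: gofmt -w .
test:
  event: after-tool
  glob: "**/*.go"
  needs: [gofmt]           # DAG ordering: run after the gofmt hook
  command: go test ./...
```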
## Features
| Feature | Description |
|---|---|
| Agents | Per-agent model, provider, system prompt, and tool filters |
| Tools | Built-in tools + custom tools via Go plugins |
| Modes | Tool availability and bash command restrictions |
| Guardrails | Secondary LLM validation of tool calls and user messages |
| Slash Commands | Built-in + user-defined as Markdown files |
| Skills | LLM-invocable capabilities with progressive disclosure |
| Compaction | Automatic context compression via dedicated agent |
| Embeddings | Embedding-based codebase search with AST-aware chunking |
| Sessions | Save, resume, and fork conversations |
| Sandboxing | Landlock LSM filesystem restrictions |
| MCP | HTTP and STDIO transports |
| Thinking | Extended reasoning with configurable levels |
| Vision | Image/PDF analysis via vision-capable model delegation |
| Audio | Speech-to-text transcription and text-to-speech synthesis |
| Hooks | Shell commands before/after tool execution |
| LSP | Language server diagnostics appended to tool results |
| Plugins | User-defined Go plugins via Yaegi interpreter |
| Memory | Persistent key-value storage across sessions |
| Deferred Tools | On-demand tool loading to reduce initial context usage |
| Scheduled Tasks | Cron-based task scheduling with foreach iteration |
| Web UI | Browser-based chat interface with SSE streaming |
| Auto Mode | Continuous execution with condition gates and Done tool |
## Contributing
See Contributing for architecture, extension guides, testing, and package organization.
