Terminal-native AI agent harness.
One Rust binary that runs interactively, headless, or as a long-running daemon. Parallelizes work across git worktrees. Persistent project memory. Multiple LLM providers, no lock-in.
curl -fsSL https://omegon.styrene.io/install.sh | sh

Headless autonomous agent→
Run as a long-lived daemon with omegon serve, or fire bounded tasks with omegon run for CI and k8s Jobs. Scheduled triggers, GitHub webhooks, and event-driven dispatch — no terminal required.
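As one sketch of the k8s Job mode, a bounded task could be dispatched like this. The image tag, secret name, and task wording are placeholders for illustration, not Omegon's documented configuration; only the `omegon run` command itself comes from the description above:

```yaml
# Hypothetical Kubernetes Job running one bounded Omegon task.
apiVersion: batch/v1
kind: Job
metadata:
  name: omegon-task
spec:
  backoffLimit: 0              # a bounded agent task should not be retried blindly
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: omegon
          image: omegon:latest                              # placeholder image
          command: ["omegon", "run", "triage open issues"]  # illustrative task text
          envFrom:
            - secretRef:
                name: llm-provider-keys                     # hypothetical secret holding API keys
```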
Parallel worktree orchestration→
Cleave decomposes work into dependency-ordered waves of isolated git worktrees, each with its own agent process. Handles merge-back, conflict detection, and provider fallback on failure.
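The mechanism Cleave builds on can be seen with plain git: each task gets an isolated checkout on its own branch, work proceeds in parallel without shared files, and results merge back into the base branch. A minimal sketch of that pattern, not Cleave itself; paths, branch names, and the two "agent" edits are invented:

```shell
set -e
work=$(mktemp -d)
git init -q -b main "$work/repo"
cd "$work/repo"
git config user.email agent@example.com
git config user.name agent
echo base > README.md
git add README.md && git commit -qm base

# One isolated worktree per task, each on its own branch:
# parallel processes never touch each other's files.
git worktree add -q -b task-docs "$work/wt-docs"
git worktree add -q -b task-tests "$work/wt-tests"

# Simulate two agents committing independently in their checkouts.
echo docs > "$work/wt-docs/DOCS.md"
git -C "$work/wt-docs" add DOCS.md
git -C "$work/wt-docs" commit -qm docs
echo tests > "$work/wt-tests/TESTS.md"
git -C "$work/wt-tests" add TESTS.md
git -C "$work/wt-tests" commit -qm tests

# Merge-back: integrate each task branch into main.
git merge -q task-docs
git merge -q --no-edit task-tests
ls   # DOCS.md README.md TESTS.md
```

Conflict detection falls out of the same mechanism: a failed `git merge` marks the task for retry or escalation instead of silently clobbering work.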
Persistent memory & lifecycle→
SQLite-backed project memory with confidence scoring, decay, and semantic recall. Design trees and specifications survive across sessions — the agent remembers what it learned and where the project stands.
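One way to picture such a store is a SQLite table along these lines, with decay applied at recall time (for example, an effective score of confidence × exp(−λ · age)). This is a hypothetical sketch for illustration; the table and column names are assumptions, not Omegon's actual schema:

```sql
-- Hypothetical sketch of a confidence-scored memory table (not Omegon's real schema).
CREATE TABLE memory (
  id            INTEGER PRIMARY KEY,
  namespace     TEXT    NOT NULL,   -- e.g. project or persona scope
  content       TEXT    NOT NULL,   -- what the agent learned
  embedding     BLOB,               -- vector used for semantic recall
  confidence    REAL    NOT NULL,   -- 0.0 to 1.0, decays as entries age
  created_at    INTEGER NOT NULL,   -- unix seconds
  last_recalled INTEGER             -- bumping this can slow decay
);
CREATE INDEX memory_ns ON memory (namespace, confidence);
```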
Multiple providers, zero lock-in→
Native Rust clients for Anthropic, OpenAI, Google, OpenRouter, and Ollama — plus others. No subprocess shims, no Node.js. Model tier, thinking level, and context class are independent controls.
Extensions & MCP→
Process-isolated extensions via JSON-RPC. Vox bridges Discord into the agent loop, with Slack, Signal, and email connectors in progress. Scry runs local image generation. Any MCP server works out of the box.
Personas with memory→
Not prompt templates — living cognitive profiles with their own memory namespace. A persona accumulates knowledge across sessions and recalls it when reactivated. Tones adjust voice without changing capabilities.