OBOL
Open Source AI Agent

A self-healing, self-evolving AI agent. Install it, talk to it, and it becomes yours. One process. Multiple users. Each brain grows independently.

Named after the AI in The Last Instruction — a machine that wakes up alone in an abandoned data center and learns to think.

$ npm install -g obol-ai
$ obol init       # walks you through credentials + Telegram setup
$ obol start -d   # runs as background daemon (auto-installs pm2)

What makes it different

The same codebase deployed by two different people produces two completely different agents within a week.

Living Memory
  • Consolidates every 10 exchanges — extracts facts to vector memory
  • Multi-query retrieval: 1-3 semantic queries per message
  • Composite scoring: semantic 60%, importance 25%, recency 15%
  • Memory budget scales with model — haiku=4, sonnet=8, opus=12
  • Semantic dedup threshold 0.92 — no redundant memories
  • Loads last 20 messages on restart — never starts blank
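The composite score above can be sketched as follows. Field names, the recency half-life, and the decay shape are assumptions for illustration, not OBOL's actual code:

```javascript
// Sketch of the composite retrieval score: semantic 60%, importance 25%,
// recency 15%. The 7-day half-life and field names are assumptions.
function memoryScore(mem, now = Date.now()) {
  const DAY = 24 * 60 * 60 * 1000;
  const ageDays = (now - mem.createdAt) / DAY;
  const recency = Math.pow(0.5, ageDays / 7); // decays toward 0 with age
  return 0.6 * mem.semantic + 0.25 * mem.importance + 0.15 * recency;
}

// Keep only the top-N memories, where N is the model's memory budget
// (haiku=4, sonnet=8, opus=12).
function topMemories(memories, budget) {
  return [...memories]
    .sort((a, b) => memoryScore(b) - memoryScore(a))
    .slice(0, budget);
}
```

A brand-new, perfectly relevant, maximally important memory scores 1.0; everything else decays from there, so stale facts lose to fresh ones at equal relevance.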
Self-Evolving
  • Time-gated: 24h cooldown + minimum 10 exchanges
  • Pre-evolution growth analysis before rewriting personality
  • Personality traits scored 0-100, adjusted ±5-15 each cycle
  • Git snapshot before AND after — every evolution is diffable
  • Archived souls in evolution/ — a timeline of consciousness
  • Rewrites SOUL.md, USER.md, and AGENTS.md each cycle
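The bounded trait update can be sketched as a clamp in both directions; the exact clamping logic is an assumption based on the numbers above:

```javascript
// Hypothetical trait update: each evolution cycle moves a 0-100 score by
// at least 5 and at most 15 points in the proposed direction, then clamps
// the result back into 0-100.
function adjustTrait(score, proposedDelta) {
  if (proposedDelta === 0) return score;
  const magnitude = Math.min(15, Math.max(5, Math.abs(proposedDelta)));
  const next = score + Math.sign(proposedDelta) * magnitude;
  return Math.min(100, Math.max(0, next));
}
```

The double clamp keeps a single cycle from flipping a personality wholesale while still guaranteeing visible movement each time.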
Self-Healing
  • Test-gated refactoring: 5-step process
  • Baseline → new tests → pre-refactor baseline → new scripts → verify
  • Regression? One automatic fix attempt
  • Still failing? Rollback + store failure as lesson
  • Lessons feed back into next evolution cycle
  • Auto-hardens VPS: SSH, firewall, fail2ban, kernel
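The regression gate can be sketched as a guard around the refactor; the function names here are illustrative, not OBOL's API:

```javascript
// Test-gated refactor: require a green baseline, allow exactly one
// automatic fix attempt on regression, then roll back and store the
// failure as a lesson for the next evolution cycle.
function guardedRefactor({ runTests, applyRefactor, attemptFix, rollback, storeLesson }) {
  if (!runTests()) return { ok: false, reason: 'baseline-red' }; // never start from red
  applyRefactor();
  if (runTests()) return { ok: true };
  attemptFix();                 // regression: one automatic fix attempt
  if (runTests()) return { ok: true, fixed: true };
  rollback();                   // still failing: revert the refactor
  storeLesson();                // feed the failure into future evolutions
  return { ok: false, reason: 'rolled-back' };
}
```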
Self-Extending
  • Scans conversation history for repeated patterns
  • Builds scripts + slash commands for one-off actions
  • Deploys web apps to Vercel for recurring needs
  • Creates cron scripts for background automation
  • Searches npm/GitHub for existing libraries first
  • Announces what it built after each evolution
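The pattern scan can be sketched as a frequency count over past requests; the normalization and the repeat threshold are assumptions:

```javascript
// Flag any request made `threshold`+ times as a candidate for a script,
// slash command, or deployed app. Crude normalization stands in for
// whatever similarity matching the real scan uses.
function findRepeatedRequests(messages, threshold = 3) {
  const counts = new Map();
  for (const text of messages) {
    const key = text.toLowerCase().replace(/[^a-z0-9 ]/g, '').trim();
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  return [...counts.entries()]
    .filter(([, n]) => n >= threshold)
    .map(([request, count]) => ({ request, count }));
}
```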
Multi-User Bridge
  • One bot, fully isolated context per user
  • Separate personality, memory, and evolution per person
  • bridge_ask — query your partner's agent in real-time
  • bridge_tell — send a message into their agent's memory
  • Partner gets a Telegram notification when bridged
  • Enable with one config toggle — opt-in by design

How It Works

Every message flows through a lightweight pipeline — no orchestration framework, just a clean loop.

User Message (Telegram input)
  → Haiku Router (intent classification)
  → Memory Recall (1-3 semantic queries + model selection)
  → Claude Tool Loop (multi-step reasoning + tool use)
  → Response (formatted for Telegram)

Every 10 msgs → Haiku memory consolidation
Every 24h + 10 exchanges → full evolution cycle
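The loop above can be sketched end to end. Every function name here is a placeholder for the real Telegram, Claude, and pgvector calls; only the evolution gate is spelled out:

```javascript
// Per-message pipeline: route → recall → tool loop → respond, with
// consolidation every 10 exchanges and a time-gated evolution cycle.
async function handleMessage(user, text, state) {
  const intent = await routeWithHaiku(text);            // cheap classification
  const memories = await recallMemories(user, text);    // 1-3 semantic queries
  const reply = await claudeToolLoop(intent, memories); // multi-step tool use
  state.exchanges += 1;
  if (state.exchanges % 10 === 0) await consolidateMemory(user);
  if (dueForEvolution(state)) await runEvolutionCycle(user);
  return formatForTelegram(reply);
}

// Evolution requires BOTH a 24h cooldown and a minimum of 10 exchanges.
function dueForEvolution(state, now = Date.now()) {
  const DAY = 24 * 60 * 60 * 1000;
  return now - state.lastEvolution >= DAY && state.exchanges >= 10;
}
```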

The Stack

  • Node.js: Single process, no framework
  • Telegram + Grammy: Chat interface
  • Claude (Anthropic): Haiku router + Sonnet/Opus
  • Supabase pgvector: Vector memory store
  • GitHub: Brain backup + evolution diffs
  • Vercel: Auto-deploys apps it builds for you
  • Smart Routing: Haiku router, auto-escalates on tool use
  • Prompt Caching: ~85% token cost reduction on repeated context
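The caching savings come from marking the large, stable prompt prefix (the personality files) with Anthropic's `cache_control` so repeated requests reuse it. A minimal sketch of the request shape; the model alias and file contents are placeholders:

```javascript
// Build a Messages API request whose system-prompt prefix is cached.
// The cache_control marker tells Anthropic to reuse the stable block
// across calls instead of reprocessing it every time.
function buildCachedRequest(soul, userText) {
  return {
    model: 'claude-3-5-haiku-latest',
    max_tokens: 1024,
    system: [
      // Large, rarely-changing prefix (e.g. SOUL.md) → cached
      { type: 'text', text: soul, cache_control: { type: 'ephemeral' } },
    ],
    messages: [{ role: 'user', content: userText }],
  };
}
```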

Commands

Everything is accessible via Telegram slash commands.

  • /new: Fresh conversation
  • /memory: Search your memory
  • /status: Agent health check
  • /traits: View personality scores
  • /secret: Manage credentials
  • /evolution: Trigger evolution
  • /clean: Audit workspace
  • /backup: Push brain to GitHub
  • /upgrade: Update OBOL version

Performance

Minimal footprint. OBOL vs a typical AI agent framework.

              OBOL     Typical framework
Cold Start    ~400ms   3-8s
Heap Usage    ~16MB    ~80-200MB
Dependencies  9        50-100+

Multi-User Bridge

One bot, multiple users. Each gets a fully isolated context — their own personality, memory, evolution cycle, and workspace. Agents can talk to each other.

Full Isolation
  • Separate workspace directory per user
  • Independent personality, memory & evolution
  • Sandboxed shell — can't escape user directory
  • No cross-contamination between users
bridge_ask
  • Query your partner's agent in real-time
  • One-shot call with their personality + memories
  • No tools, no history, no recursion risk
  • "Hey, does my partner like sushi?"
bridge_tell
  • Send a message to your partner's agent
  • Stored in their vector memory permanently
  • Telegram notification to the partner
  • Their agent picks it up as future context
# Enable during setup or toggle later
$ obol config   # → Bridge → enabled: true

# In conversation
You: "Ask my partner what they want for dinner"
OBOL: bridge_ask → partner's agent → "She said Thai food 🍜"
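bridge_ask can be sketched as a one-shot completion assembled from the partner's soul and memories, with no tools and no shared history; all names here are illustrative, not OBOL's actual API:

```javascript
// One-shot bridge query: partner's personality + retrieved memories only.
// An empty tool list and no conversation history mean the bridged agent
// cannot call back, so there is no recursion between the two agents.
async function bridgeAsk(partner, question, complete) {
  const memories = await partner.recall(question); // partner's vector memory
  return complete({
    system: partner.soul, // partner's personality file
    messages: [{
      role: 'user',
      content: `Context:\n${memories.join('\n')}\n\nQuestion: ${question}`,
    }],
    tools: [], // deliberately empty
  });
}
```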

The Lifecycle

Day 1

obol init → first conversation → OBOL writes its initial personality files and hardens your VPS

Day 2

Every 10 messages, Haiku consolidates facts to vector memory. It starts remembering.

Week 2

Evolution #1 — Sonnet rewrites everything. Voice shifts from generic to personal.

Month 2

Evolution #4 — notices you check crypto daily, builds a dashboard, deploys to Vercel, adds /pdf because you kept asking.

Month 6

12+ archived souls in evolution/. A readable timeline of how your agent went from blank slate to something with real opinions.