Every feature in the memory system, explained. The carousel auto-advances — or click any label to jump.
BM25 keyword matching combined with vector embeddings. Multi-signal ranking scores every chunk by relevance, recency, and confidence weight.
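A minimal sketch of how such a multi-signal score could be blended. The weights, the recency half-life, and the assumption that BM25 scores are pre-normalized to [0, 1] are all illustrative, not the product's actual values:

```python
def score_chunk(bm25, cosine, age_days, confidence,
                w_lex=0.4, w_vec=0.4, w_rec=0.1, w_conf=0.1,
                half_life_days=30.0):
    """Blend keyword, vector, recency, and confidence signals.

    bm25 and cosine are assumed normalized to [0, 1]; the weights
    and half-life are hypothetical knobs for illustration.
    """
    recency = 0.5 ** (age_days / half_life_days)  # exponential recency decay
    return w_lex * bm25 + w_vec * cosine + w_rec * recency + w_conf * confidence
```

With these weights, a perfectly matching, brand-new, high-confidence chunk scores 1.0, and the same chunk two months old ranks strictly lower.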
The memory system adapts principles from biological memory: Hebbian associative learning, consolidation during sleep, spreading activation, and decay. Applied to agent fleets with enterprise governance.
Chunks retrieved together in successful sessions form stronger associations. The update rule includes a saturation brake — strong edges resist further strengthening, keeping the graph responsive.
w += η × outcome × (1 − |w|)

When a chunk is retrieved, activation spreads through Hebbian edges to pull in associated knowledge. Attenuation of 0.5 per hop keeps distant associations gentle.
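The update rule and hop attenuation described above can be sketched as follows (the learning rate, hop limit, and activation threshold are assumed defaults for illustration):

```python
def hebbian_update(w, outcome, eta=0.1):
    """w += eta * outcome * (1 - |w|): the (1 - |w|) saturation brake
    means strong edges gain less per positive outcome than weak ones."""
    return w + eta * outcome * (1.0 - abs(w))


def spread_activation(graph, seed, max_hops=3, attenuation=0.5, threshold=0.1):
    """Breadth-first spread from the retrieved chunk; each node gets
    activation = attenuation ** hop, stopping once levels fall below threshold."""
    activation = {seed: 1.0}
    frontier = [seed]
    for hop in range(1, max_hops + 1):
        level = attenuation ** hop
        if level < threshold:
            break
        nxt = []
        for node in frontier:
            for neighbor in graph.get(node, []):
                if neighbor not in activation:
                    activation[neighbor] = level
                    nxt.append(neighbor)
        frontier = nxt
    return activation
```

For a chain a → b → c, retrieving `a` activates `b` at 0.5 and `c` at 0.25, so distant associations contribute but never dominate.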
activation = 0.5 ^ hop

Chunks with 5+ Hebbian edges are foundational knowledge — they connect many topics. Hubs are candidates for concept extraction and receive priority in context assembly.
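Hub detection reduces to a degree count over the Hebbian edge list, sketched here under the assumption that edges are undirected pairs:

```python
from collections import Counter

def find_hubs(edges, min_degree=5):
    """Count Hebbian edges per chunk; any chunk with degree >= 5 is a hub."""
    degree = Counter()
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return {node for node, d in degree.items() if d >= min_degree}
```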
hub_degree >= 5

Tentative memories decay after 14 days. Decay factor 0.9, floor 0.1. This prevents stale knowledge from polluting retrieval while preserving validated patterns indefinitely.
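One plausible reading of the decay schedule, assuming the 0.9 factor applies once per 14-day period and validated memories are exempt:

```python
def decay_weight(w, age_days, validated, factor=0.9, floor=0.1, period_days=14):
    """Multiply weight by 0.9 per completed 14-day period, never below 0.1.

    Validated memories skip decay entirely (preserved indefinitely);
    the per-period schedule is an assumption about how '0.9 / 14d' is applied.
    """
    if validated:
        return w
    periods = age_days // period_days
    return max(floor, w * factor ** periods)
```

A tentative memory left untouched long enough settles at the 0.1 floor rather than vanishing, so it can still be revived by a later successful retrieval.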
decay: 0.9 / 14d, floor: 0.1

Like sleep consolidation in biological brains, sessions compress into durable patterns. The pipeline runs attribution, extraction, validation, deduplication, and upsert.
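The deduplication stage can be sketched with the 85% threshold above; Jaccard similarity over word sets is an assumed interpretation of "word overlap":

```python
def word_overlap(a, b):
    """Jaccard overlap of word sets -- one plausible reading of 'word overlap'."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)


def dedup(candidates, existing, threshold=0.85):
    """Keep a candidate memory only if it overlaps < 85% with every
    existing memory and every candidate already kept this pass."""
    kept = []
    for c in candidates:
        if all(word_overlap(c, e) < threshold for e in existing + kept):
            kept.append(c)
    return kept
```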
85% word-overlap dedup threshold

The context assembler places the highest-value items at the start and end of the agent's context window — the positions where LLMs pay the most attention. Lower-value items fill the middle.
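A minimal sketch of that placement strategy, assuming items arrive as (text, value) pairs; alternating ranked items between the front and back pushes the lowest-value items toward the middle, where attention is weakest:

```python
def assemble(items):
    """Sort (text, value) pairs by value, then alternate the top-ranked
    items between the front and the back of the window. The lowest-value
    items end up in the middle of the assembled context."""
    ranked = sorted(items, key=lambda it: it[1], reverse=True)
    front, back = [], []
    for i, item in enumerate(ranked):
        (front if i % 2 == 0 else back).append(item)
    return front + back[::-1]
```

For five items ranked 5..1, the output places values 5 and 4 at the two ends and value 1 dead center.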
rot-aware interleaving

Usage creates value. Value creates retention. Retention creates a moat. Each phase deepens it.
Every session your agents run makes the next one better. Governed, audited, injection-hardened.