local-ai-llm
A local-first memory layer for users and AI agents to capture, search, and synthesize project context.
Source-available orchestration for local AI inference, supporting broadcast dispatch and multi-step pipelines across up to 10 nodes.
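The broadcast-dispatch idea above can be sketched as follows. This is a minimal illustration, not the project's actual API: the node names, the `query_node` stub (standing in for a call to a local inference endpoint), and the `broadcast`/`pipeline` helpers are all hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical node registry; a real deployment would point these at
# local inference endpoints. The 10-node cap mirrors the description above.
NODES = [f"node-{i}" for i in range(10)]

def query_node(node: str, prompt: str) -> str:
    # Stub standing in for an HTTP call to a local inference server.
    return f"{node}: echo({prompt})"

def broadcast(prompt: str, nodes=NODES) -> dict:
    # Broadcast dispatch: send the same prompt to every node concurrently
    # and collect one reply per node.
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        replies = pool.map(lambda n: query_node(n, prompt), nodes)
    return dict(zip(nodes, replies))

def pipeline(prompt: str, steps, node: str = "node-0") -> str:
    # Multi-step pipeline: feed each step's output into the next,
    # running every step on the same node for simplicity.
    out = prompt
    for step in steps:
        out = query_node(node, f"{step}: {out}")
    return out

if __name__ == "__main__":
    replies = broadcast("summarize project context")
    print(len(replies))
    print(pipeline("raw notes", ["extract", "summarize"]))
```

A production version would replace `query_node` with real network calls and add timeouts and per-node error handling, but the concurrency shape stays the same.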