PyPI Stats

Search Packages

Find Python packages by name, description, GitHub topic, or filter by metrics
constrai (Ambar-13)

A formal safety framework for AI agents: pluggable LLM reasoning constrained by mathematically proven budget, invariant, and termination guarantees. Seven theorems enforced by construction, not by prompting. Includes Bayesian belief tracking, causal dependency graphs, sandboxed attestors, environment reconciliation, and a 155-test adversarial suite.

223 1 1
clampai (Ambar-13)

Safety guardrails for LLM agents: budget enforcement, invariants, and provable guarantees. Zero dependencies.

170 1 1
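The budget-enforcement idea both listings describe — a hard resource limit checked by construction rather than by prompting — can be sketched in plain Python. The names here (`BudgetGuard`, `charge`, `run_agent`) are hypothetical illustrations, not the actual constrai or clampai API.

```python
class BudgetExceeded(Exception):
    """Raised when an agent attempts an action beyond its step budget."""
    pass


class BudgetGuard:
    """Enforce a hard step budget on an agent loop by construction:
    every action must pass through charge(), so an overrun raises an
    exception instead of relying on the agent to police itself."""

    def __init__(self, max_steps: int):
        self.max_steps = max_steps
        self.steps = 0

    def charge(self) -> None:
        if self.steps >= self.max_steps:
            raise BudgetExceeded(f"step budget of {self.max_steps} exhausted")
        self.steps += 1


def run_agent(actions, guard: BudgetGuard):
    """Run a sequence of agent actions; termination is guaranteed
    because charge() bounds the number of iterations."""
    results = []
    for act in actions:
        guard.charge()  # budget check precedes every action
        results.append(act())
    return results


guard = BudgetGuard(max_steps=2)
ok = run_agent([lambda: "a", lambda: "b"], guard)
print(ok)  # ['a', 'b']

try:
    run_agent([lambda: "c"], guard)  # a third action exceeds the budget
except BudgetExceeded as e:
    print("blocked:", e)
```

Because the guard sits between the agent and every action, the safety property holds regardless of what the LLM decides — the "enforced by construction" claim in the listings above.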
    • Data from PyPI, GitHub, ClickHouse, and BigQuery