PyPI Stats
  • Insights
  • PyPI
  • GitHub
  • Search
  • Compare
  • Advisories
  • Ecosystem
  • About
Home

Search Packages

Find Python packages by name, description, GitHub topic, or filter by metrics
maheshmakvana
llm-injection-guard

Drop-in prompt injection defense for LLM apps and AI agents — detect, sanitize, block, and audit injection attacks in real time. Includes multi-turn session scanning, allow-lists, rate-abuse detection, multi-layer scanner, FastAPI and Flask middleware.

2K 0 0
vpdeva
blackwall-llm-shield-python

Security middleware for Python LLM apps and services. Blocks prompt injection, masks PII, inspects outputs, and gates agent tools.

366 1 0
perfecxion-ai
ai-agent-scanner

AI agent discovery and security assessment platform with vulnerability testing, risk scoring, and compliance mapping

199 2 1
    • Data from PyPI, GitHub, ClickHouse, and BigQuery