PyPI Stats

Search Packages

Find Python packages by name, description, GitHub topic, or filter by metrics
Libr-AI
openfactverification-kongzii

Loki: an open-source solution that automates factuality verification

7K 1K 61
cvs-health
uqlm

UQLM (Uncertainty Quantification for Language Models) is a Python package for UQ-based LLM hallucination detection (an illustrative sketch follows this entry)

6K 1K 121
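The UQLM entry names UQ-based hallucination detection without showing an API. Below is a minimal, library-agnostic sketch of one common UQ signal, sampling consistency; the generate callable and the consistency_score function are illustrative stand-ins, not UQLM's actual interface.

from difflib import SequenceMatcher
from itertools import combinations
from statistics import mean

def consistency_score(generate, prompt: str, k: int = 5) -> float:
    # Sample k answers and return the mean pairwise similarity in [0, 1];
    # low agreement across samples is a cheap hallucination-risk signal.
    answers = [generate(prompt) for _ in range(k)]
    pairs = combinations(answers, 2)
    return mean(SequenceMatcher(None, a, b).ratio() for a, b in pairs)

# Toy stand-in model: deterministic output -> perfect agreement -> 1.0
print(consistency_score(lambda p: "Paris is the capital of France.",
                        "What is the capital of France?"))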
wauldoai
wauldo

Official Python SDK for Wauldo: verified AI answers with zero hallucinations. Install with pip install wauldo

3K 1 2
FastBuilderAI
fastmemory

FastMemory is a topological representation of text data that uses concepts as the primary input. It improves RAG (by replacing embeddings and vectors entirely), AI memory, and LLM queries by up to 100% on the Hugging Face benchmarks (22+ SOTA)

3K 31 5
MigoXLab
dingo-python

Dingo: A Comprehensive AI Data, Model and Application Quality Evaluation Tool

2K 691 71
QWED-AI
qwed

The Deterministic Verification Protocol for AI - 11 verification engines for math, logic, code, SQL, facts, images, and more. Now with Agentic Security Guards.

1K 55 8
ylu999
jingu-trust-gate

Deterministic admission layer that blocks LLM hallucinations from becoming system state

882 1 0
aimonlabs
hdm2

HalluciNot: Hallucination Detection Through Context and Common Knowledge Verification

414 11 0
bh3r1th
ega

Runtime enforcement layer for LLM outputs. Verifies claims against source evidence before emission. Not an eval tool.

362 1 0
dhanushk-offl
hallx

Lightweight hallucination risk scoring for LLM outputs

338 2 0
rudra496
ai-trust-validator

🛡️ Validate AI-generated code for security issues, hallucinations, and logic errors. An open-source trust layer for AI-assisted development.

299 4 0
Saivineeth147
llm-testlab

Comprehensive Testing Tool for Large Language Models

218 6 0
Mmorgan-ML
phase-slip-sampler

Phase-Slip is a stochastic intervention architecture that operates on the model's key-value cache. It gently rotates the semantic vectors of the context window, in effect asking the model: "How would you finish this sentence if you looked at it from a slightly different perspective?" (a brief sketch follows this entry)

160 6 0
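The Phase-Slip description above outlines a concrete mechanism: small rotations applied to cached key vectors. The package's own API is not shown on this page, so the following is a minimal NumPy sketch of the general idea; the names phase_slip and random_small_rotation are hypothetical.

import numpy as np

def random_small_rotation(dim, angle, rng):
    # Rotate by `angle` radians in one randomly chosen 2-D plane of the
    # embedding space (a Givens rotation); vector norms are preserved.
    i, j = rng.choice(dim, size=2, replace=False)
    R = np.eye(dim)
    c, s = np.cos(angle), np.sin(angle)
    R[i, i] = R[j, j] = c
    R[i, j], R[j, i] = -s, s
    return R

def phase_slip(keys, angle=0.05, seed=0):
    # Gently rotate every cached key vector (shape: seq_len x head_dim),
    # nudging the model's "perspective" without destroying the cache.
    rng = np.random.default_rng(seed)
    R = random_small_rotation(keys.shape[-1], angle, rng)
    return keys @ R.T

# Toy usage: perturb a fake KV cache of 4 tokens with 8-dim keys.
cache_keys = np.random.default_rng(1).normal(size=(4, 8))
print(np.linalg.norm(phase_slip(cache_keys) - cache_keys))  # small, controlled shift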
IAAR-Shanghai
eval-suite

User-friendly evaluation framework (Eval Suite) with hallucination benchmarks: UHGEval, HaluEval, HalluQA, etc.

139 180 13
frmoretto
cgd-verify

Stop LLMs from restating your guesses as facts. Clarity Gate is a verification protocol for documents that will be fed to LLMs or RAG systems. It automatically inserts missing uncertainty markers to prevent confident hallucinations, with human-in-the-loop (HITL) review for claims that cannot be directly verified. (an illustrative sketch follows this entry)

135 27 3
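This entry and the related frmoretto packages below all describe the same idea: mark unverified claims in a document before an LLM ingests it. Here is a minimal hypothetical sketch of that idea; the .cgd format and the packages' real APIs are not shown on this page, so HEDGES and mark_unverified are purely illustrative.

import re

HEDGES = ("probably", "roughly", "about", "we think", "may", "might")

def mark_unverified(text: str) -> str:
    # Prepend an explicit uncertainty marker to sentences that hedge a
    # claim, so a downstream LLM cannot restate the guess as a fact.
    out = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        if any(h in sentence.lower() for h in HEDGES):
            out.append("[UNVERIFIED] " + sentence)
        else:
            out.append(sentence)
    return " ".join(out)

doc = "The service handles about 5k requests/s. Retries use exponential backoff."
print(mark_unverified(doc))
# -> "[UNVERIFIED] The service handles about 5k requests/s. Retries use exponential backoff."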
frmoretto
cgd-validator

Validator for Clarity-Gated Document (.cgd) files - verified documents for safe LLM ingestion

130 25 3
rwondo
maven-ai

Multi-Agent Verification Engine - Production-ready hallucination detection for high-stakes AI applications

129 1 0
frmoretto
sot-validator

Validator for Source of Truth (.sot) files - epistemic quality verification for AI-safe documentation

123 24 2
frmoretto
sot-verify

Stop LLMs from restating your guesses as facts. Clarity Gate is a verification protocol for documents that will be fed to LLMs or RAG systems. It automatically inserts missing uncertainty markers to prevent confident hallucinations, with human-in-the-loop (HITL) review for claims that cannot be directly verified.

107 27 3
frmoretto
cgd-creator

Stop LLMs from restating your guesses as facts. Clarity Gate is a verification protocol for documents that will be fed to LLMs or RAG systems. It automatically inserts missing uncertainty markers to prevent confident hallucinations, with human-in-the-loop (HITL) review for claims that cannot be directly verified.

90 27 3
frmoretto
cgd-generator

Stop LLMs from restating your guesses as facts. Clarity Gate is a verification protocol for documents that will be fed to LLMs or RAG systems. It automatically inserts missing uncertainty markers to prevent confident hallucinations, with human-in-the-loop (HITL) review for claims that cannot be directly verified.

89 27 3
frmoretto
clarity-gate

Stop LLMs from restating your guesses as facts. Clarity Gate is a verification protocol for documents that will be fed to LLMs or RAG systems. It automatically inserts missing uncertainty markers to prevent confident hallucinations, with human-in-the-loop (HITL) review for claims that cannot be directly verified.

83 27 3
frmoretto
memory-trail

Decision memory and session logging for AI-assisted development - track architectural decisions across sessions

82 27 3
bcdnlp
faithscore

FaithScore: Fine-grained Evaluations of Hallucinations in Large Vision-Language Models

78 33 7
Data from PyPI, GitHub, ClickHouse, and BigQuery