Open-source LLM FinOps proxy — track OpenAI, Anthropic (Claude), and Google Gemini costs by feature, team, and customer. Zero code changes. pip install burnlens.
Universal LLM token counting and cost management. Track, compare, and optimize your LLM API spending.
A blazing-fast BPE tokenizer for LLMs. Drop-in tiktoken replacement, 20-80x faster.
Token Optimization for Context Engineers. 4.8 KB WASM. Sub-millisecond. Zero dependencies.
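The tokenizers pitched above all implement byte-pair encoding (BPE). As a rough illustration of what such a tokenizer does internally, here is a minimal sketch of the greedy merge loop with a hypothetical toy merge table (real tokenizers like tiktoken use large learned vocabularies and apply merges in priority order, not first-found order):

```python
def bpe_encode(text: str, merges: dict[tuple[str, str], str]) -> list[str]:
    """Greedily apply merge rules until no adjacent pair matches.

    `merges` maps an adjacent token pair to its merged token. This toy
    version merges the first matching pair per pass; production BPE
    tokenizers merge by learned priority instead.
    """
    tokens = list(text)  # start from individual characters
    while True:
        for i in range(len(tokens) - 1):
            pair = (tokens[i], tokens[i + 1])
            if pair in merges:
                tokens[i : i + 2] = [merges[pair]]  # collapse the pair
                break
        else:
            return tokens  # no pair matched: encoding is final

# Hypothetical merge table, for illustration only.
merges = {("l", "o"): "lo", ("lo", "w"): "low"}
print(bpe_encode("lower", merges))  # ['low', 'e', 'r']
```

Counting the returned list's length is what "token counting" means in the pitches above: cost per request is tokens times the provider's per-token price.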