PyPI Stats

Search Packages

Find Python packages by name, description, or GitHub topic, or filter by metrics.
WilliamLwj/pyxab

PyXAB - A Python Library for X-Armed Bandit and Online Blackbox Optimization Algorithms

Metrics: 1K · 127 · 30
SMPyBandits/smpybandits

🔬 Research Framework for Single and Multi-Player 🎰 Multi-Armed Bandit (MAB) Algorithms, implementing state-of-the-art algorithms for single-player (UCB, KL-UCB, Thompson sampling...) and multi-player (MusicalChair, MEGA, rhoRand, MCTopM/RandTopM, etc.) settings. Available on PyPI: https://pypi.org/project/SMPyBandits/

Metrics: 351 · 422 · 61
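The single-player algorithms this package names (UCB, KL-UCB, Thompson sampling) all follow the same pull/observe/update loop. A minimal UCB1 sketch in plain Python illustrates the idea; this is a standalone toy on Bernoulli arms, not SMPyBandits' actual API:

```python
import math
import random

def ucb1(arm_means, horizon, seed=0):
    """Toy UCB1 on simulated Bernoulli arms (illustrative sketch,
    not the SMPyBandits API). Returns pull counts per arm."""
    rng = random.Random(seed)
    n_arms = len(arm_means)
    counts = [0] * n_arms   # pulls per arm
    sums = [0.0] * n_arms   # cumulative reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialize: pull each arm once
        else:
            # pick the arm maximizing empirical mean + exploration bonus
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts

counts = ucb1([0.2, 0.5, 0.8], horizon=2000)
# Over 2000 rounds the best arm (mean 0.8) should receive most pulls.
```

Suboptimal arms are pulled only O(log T) times, which is why the best arm dominates the pull counts as the horizon grows.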
doerlbh/banditzoo

Python library of bandits and RL agents in different real-world environments

Metrics: 294 · 7 · 4
singhsidhukuldeep/contextual-bandits-algos

A library for contextual multi-armed bandit algorithms.

Metrics: 62 · 13 · 1
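Contextual bandits, the subject of the last package, condition each arm choice on observed side information. A toy epsilon-greedy agent over discrete contexts sketches the idea; the class name and methods here are hypothetical, not this package's API:

```python
import random
from collections import defaultdict

class ContextualEpsilonGreedy:
    """Toy contextual bandit: keeps a running mean reward per
    (context, arm) pair and explores with probability epsilon.
    Illustrative sketch only, not the contextual-bandits-algos API."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.n_arms = n_arms
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = defaultdict(int)    # (context, arm) -> pulls
        self.means = defaultdict(float)   # (context, arm) -> mean reward

    def select(self, context):
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(self.n_arms)  # explore uniformly
        # exploit: best empirical mean for this context
        return max(range(self.n_arms),
                   key=lambda a: self.means[(context, a)])

    def update(self, context, arm, reward):
        key = (context, arm)
        self.counts[key] += 1
        # incremental running-mean update
        self.means[key] += (reward - self.means[key]) / self.counts[key]

# Usage: two contexts whose best arm differs, so a context-free
# bandit could not do well on both.
agent = ContextualEpsilonGreedy(n_arms=2, seed=1)
sim = random.Random(2)
for _ in range(4000):
    ctx = sim.choice(["A", "B"])
    arm = agent.select(ctx)
    best = 0 if ctx == "A" else 1       # assumed reward structure
    p = 0.9 if arm == best else 0.1     # Bernoulli reward probability
    agent.update(ctx, arm, 1.0 if sim.random() < p else 0.0)
```

After enough rounds the agent's per-context means separate, so it picks arm 0 in context A and arm 1 in context B, which is exactly the behavior a non-contextual algorithm cannot express.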
Data from PyPI, GitHub, ClickHouse, and BigQuery