PyPI Stats

Search Packages

Find Python packages by name, description, or GitHub topic, or filter them by metrics.

deepspeedai / deepspeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
1.3M downloads · 42K stars · 5K forks
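
Since that blurb covers a lot, a concrete sketch helps: DeepSpeed's core loop wraps a PyTorch model with deepspeed.initialize and routes backward/step through the returned engine. The model and config values below are illustrative assumptions, and a real run is normally started through the deepspeed launcher across multiple GPUs.

```python
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)  # stand-in for a real network

ds_config = {                            # illustrative values, not recommendations
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},   # ZeRO stage 2: partition optimizer state and gradients
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# initialize() wraps the model in a DeepSpeedEngine and builds the optimizer
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

x = torch.randn(32, 1024, device=model_engine.device).half()
loss = model_engine(x).float().pow(2).mean()  # dummy loss for illustration
model_engine.backward(loss)  # engine handles loss scaling and ZeRO partitioning
model_engine.step()          # optimizer step plus gradient zeroing
```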

relf / egobox
Efficient global optimization toolbox in Rust: Bayesian optimization, mixture of Gaussian processes, and sampling methods.
64K downloads · 172 stars · 10 forks
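
The engine behind "efficient global optimization" is a Gaussian-process surrogate plus an acquisition criterion, with expected improvement as the classic choice (EGO, Jones et al.). For reference, the standard textbook form when minimizing f, where mu and sigma are the surrogate's mean and standard deviation at x and f* is the best value observed so far (general notation, not egobox-specific):

```latex
% Expected improvement for minimization; \Phi and \varphi are the standard
% normal CDF and PDF. The next evaluation point maximizes EI(x).
\mathrm{EI}(x) = \bigl(f^{*} - \mu(x)\bigr)\,\Phi(z) + \sigma(x)\,\varphi(z),
\qquad z = \frac{f^{*} - \mu(x)}{\sigma(x)}
```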

lucidrains / mixture-of-experts
A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models.
55K downloads · 859 stars · 71 forks
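
To make "sparsely-gated" concrete: a small gating network scores every expert per token, only the top-k experts run, and their outputs are mixed by the renormalized gate weights, so parameter count grows with the number of experts while per-token compute stays roughly fixed. A from-scratch toy sketch of that routing in plain PyTorch (assumed dimensions; this illustrates the technique, not this package's actual API):

```python
import torch
import torch.nn.functional as F
from torch import nn

class TinySparseMoE(nn.Module):
    def __init__(self, dim=512, num_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                     # x: (batch, seq, dim)
        logits = self.gate(x)                 # (batch, seq, num_experts)
        weights, idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)  # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):            # dense loop; real kernels dispatch sparsely
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e    # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

moe = TinySparseMoE()
y = moe(torch.randn(2, 16, 512))              # toy input, same shape out
```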

SMTorg / smt
Surrogate Modeling Toolbox
33K downloads · 872 stars · 227 forks
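
The terse description hides a fairly uniform API: every SMT surrogate follows a set-train-predict pattern. A minimal Kriging example using that documented interface, with made-up one-dimensional toy data:

```python
import numpy as np
from smt.surrogate_models import KRG

xt = np.linspace(0.0, 4.0, 8).reshape(-1, 1)  # training inputs (toy data)
yt = np.sin(xt)                               # training targets

sm = KRG(theta0=[1e-2])        # Kriging (Gaussian-process) surrogate
sm.set_training_values(xt, yt)
sm.train()

xnew = np.linspace(0.0, 4.0, 100).reshape(-1, 1)
mean = sm.predict_values(xnew)      # surrogate mean prediction
var = sm.predict_variances(xnew)    # predictive variance
```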

codelion / optillm
Optimizing inference proxy for LLMs.
12K downloads · 3K stars · 266 forks

learning-at-home / hivemind
Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
8K downloads · 2K stars · 228 forks

brontoguana / krasis
Krasis is no longer distributed via PyPI. Install from GitHub: https://github.com/brontoguana/krasis
5K downloads · 447 stars · 22 forks

PR0CK0 / dissenter
Multi-LLM debate engine for complex questions: surface disagreement, synthesize decisions.
4K downloads · 1 star · 0 forks

theoddden / terradev-cli
Cross-Cloud Compute Optimization Platform with Migration & Evaluation (v4.0.12).
3K downloads · 10 stars · 1 fork

wuwangzhang1216 / abliterix
Automated alignment adjustment for LLMs: direct steering, LoRA, and MoE expert-granular abliteration, optimized via multi-objective Optuna TPE.
2K downloads · 215 stars · 42 forks

lucidrains / peer-pytorch
PyTorch implementation of the PEER block from the paper "Mixture of A Million Experts" by Xu Owen He at DeepMind.
2K downloads · 136 stars · 7 forks

lucidrains / st-moe-pytorch
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Google Brain, in PyTorch.
2K downloads · 382 stars · 33 forks

szibis / mlx-flash
Run AI models too large for your Mac's memory: expert caching, speculative execution, and 15+ research techniques for MoE inference on Apple Silicon.
1K downloads · 2 stars · 0 forks

eriirfos-eng / ternlang-jupyter
Ternlang is a ternary programming language (.tern) and a runtime for XAI, MoE-LLMs, and autonomous agents, shipped with an agentic CLI and an in-house SDK/IDE.
976 downloads · 16 stars · 6 forks

lucidrains / soft-moe-pytorch
Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch.
955 downloads · 345 stars · 10 forks
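
Soft MoE replaces the hard routing of the sparse variants above: each expert processes a few "slots" that are softmax-weighted mixtures of all tokens (dispatch), and each token's output is a softmax-weighted mixture over all slot outputs (combine), so routing stays fully differentiable and no token is dropped. A toy sketch of that idea (assumed dimensions, not this package's API):

```python
import torch
from torch import nn

class TinySoftMoE(nn.Module):
    def __init__(self, dim=256, num_experts=4, slots_per_expert=2):
        super().__init__()
        self.num_experts, self.slots = num_experts, slots_per_expert
        # learned per-slot embeddings that score token-slot affinity
        self.phi = nn.Parameter(torch.randn(dim, num_experts * slots_per_expert) * dim**-0.5)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                    # x: (batch, tokens, dim)
        logits = x @ self.phi                # (batch, tokens, experts*slots)
        dispatch = logits.softmax(dim=1)     # normalize over tokens: slot inputs
        combine = logits.softmax(dim=-1)     # normalize over slots: token outputs
        slots = torch.einsum('btd,bts->bsd', x, dispatch)
        slots = slots.reshape(x.shape[0], self.num_experts, self.slots, -1)
        outs = torch.stack([e(slots[:, i]) for i, e in enumerate(self.experts)], dim=1)
        outs = outs.flatten(1, 2)            # (batch, experts*slots, dim)
        return torch.einsum('bts,bsd->btd', combine, outs)

y = TinySoftMoE()(torch.randn(2, 10, 256))   # toy input, same shape out
```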

jaisidhsingh / pytorch-mixtures
One-stop solutions for Mixture of Experts modules in PyTorch.
922 downloads · 28 stars · 1 fork

cgrtml / neural-trees
sklearn-compatible PyTorch implementations of Soft Decision Trees, HMoE, and classifier comparison tests (5×2cv F-test).
829 downloads · 23 stars · 0 forks
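
The 5×2cv F-test mentioned here is Alpaydin's combined test for comparing two classifiers: run five replications of 2-fold cross-validation and build a statistic from the per-fold differences in error rate. In standard notation (not tied to this package's code):

```latex
% Alpaydin's combined 5x2cv F-test. p_i^{(j)} is the difference in error
% rates of the two classifiers on fold j of replication i, and
% s_i^2 = (p_i^{(1)} - \bar{p}_i)^2 + (p_i^{(2)} - \bar{p}_i)^2,
% with \bar{p}_i the mean of the two fold differences.
f = \frac{\sum_{i=1}^{5}\sum_{j=1}^{2} \bigl(p_i^{(j)}\bigr)^{2}}
         {2\sum_{i=1}^{5} s_i^{2}}
\;\sim\; F(10,\,5) \quad \text{under the null of equal performance}
```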

kyegomez / switch-transformers
Implementation of Switch Transformers from the paper "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity".
755 downloads · 139 stars · 17 forks
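
Switch routing is the top-1 special case of the sparse gating sketched under mixture-of-experts above; the paper's distinctive ingredient is an auxiliary load-balancing loss that pushes the router toward a uniform spread of tokens over experts. A minimal rendering of that loss (assumed tensor shapes, not this package's API):

```python
import torch
import torch.nn.functional as F

def switch_aux_loss(router_logits):     # (tokens, experts)
    probs = F.softmax(router_logits, dim=-1)
    top1 = probs.argmax(dim=-1)         # Switch routes each token to ONE expert
    num_experts = probs.shape[-1]
    frac_tokens = F.one_hot(top1, num_experts).float().mean(dim=0)  # f_e: token share
    mean_probs = probs.mean(dim=0)                                  # P_e: mean router prob
    # equals 1.0 under a perfectly uniform routing, larger when skewed
    return num_experts * (frac_tokens * mean_probs).sum()

loss = switch_aux_loss(torch.randn(128, 8))  # added to the task loss with a small weight
```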

lucidrains / mixture-of-attention
Some personal experiments around routing tokens to different autoregressive attention blocks, akin to mixture-of-experts.
736 downloads · 122 stars · 4 forks

lucidrains / sinkhorn-router-pytorch
Sinkhorn router in PyTorch.
681 downloads · 40 stars · 0 forks
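
For context on the name: Sinkhorn routing replaces a plain softmax router with a Sinkhorn-Knopp iteration that alternately normalizes the rows and columns of the token-expert score matrix, yielding a roughly balanced assignment before a hard top-1 pick. A toy sketch of the iteration (illustrative only, not this package's API):

```python
import torch

def sinkhorn(logits, iters=8):
    """Alternate row/column normalization (Sinkhorn-Knopp) of exp(logits)."""
    p = torch.exp(logits)                   # (tokens, experts), positive scores
    for _ in range(iters):
        p = p / p.sum(dim=1, keepdim=True)  # normalize over experts for each token
        p = p / p.sum(dim=0, keepdim=True)  # normalize over tokens for each expert
    return p

assign = sinkhorn(torch.randn(16, 4))       # balanced soft assignment
experts = assign.argmax(dim=1)              # hard top-1 routing after balancing
```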

michaelellis003 / lmxlab
Transformer language models on Apple Silicon with MLX.
654 downloads · 1 star · 0 forks

Leeroo-AI / mergoo
Implementation of the Leeroo LLM composer.
294 downloads · 511 stars · 33 forks

andriygav / mixturelib
Implementations of mixture models for different tasks.
248 downloads · 2 stars · 0 forks

scouzi1966 / mlxlmprobe
Visual probing and interpretability tool for MLX language models.
218 downloads · 3 stars · 0 forks

    • Data from PyPI, GitHub, ClickHouse, and BigQuery