PyPI Stats

Search Packages

Find Python packages by name, description, or GitHub topic, or filter by metrics. Each result lists downloads, GitHub stars, and forks.
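Each entry below ends with a metrics line in compact notation (e.g. `521K 895 35`). A minimal stdlib sketch for expanding such counts into integers, assuming the usual K/M suffix convention (this helper is illustrative and not part of the site):

```python
def parse_count(s: str) -> int:
    """Expand a compact count like '521K' or '13K' into an integer.

    Assumes K = thousands and M = millions, as is conventional for
    this kind of listing; plain numbers pass through unchanged.
    """
    s = s.strip()
    multipliers = {"K": 1_000, "M": 1_000_000}
    suffix = s[-1].upper() if s else ""
    if suffix in multipliers:
        return int(float(s[:-1]) * multipliers[suffix])
    return int(s)

# Example: the metrics line for curated-transformers
downloads, stars, forks = (parse_count(x) for x in "521K 895 35".split())
print(downloads, stars, forks)  # 521000 895 35
```

Note the values are rounded in the listing, so the expanded numbers are approximate.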
explosion
curated-transformers

🤖 A PyTorch library of curated Transformer models and their composable components

521K 895 35
microsoft
loralib

Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

204K 13K 898
fhamborg
news-please

news-please: an integrated web crawler and information extractor for news that just works

118K 2K 452
codelion
adaptive-classifier

A flexible, adaptive classification system for dynamic text classification

37K 549 38
jessevig
bertviz

BertViz: Visualize Attention in Transformer Models

13K 8K 877
deepset-ai
farm

Framework for fine-tuning and evaluating transformer-based language models

3K 2K 247
EricFillion
happytransformer

Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.

2K 546 69
langformers
langformers

🚀 Unified NLP Pipelines for Language Models

2K 19 1
asyml
texar-pytorch

Integrating the Best of TF into PyTorch, for Machine Learning, Natural Language Processing, and Text Generation. This is part of the CASL project: http://casl-project.ai/

2K 747 113
amansrivastava17
embedding-as-service

embedding-as-service: a one-stop solution to encode sentences into vectors using various embedding methods

1K 210 32
920232796
bert-seq2seq

A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; also handles automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging, with support for the T5 model and article continuation with GPT-2.

1K 1K 208
mim-solutions
belt-nlp

BERT classification model for processing texts longer than 512 tokens. Text is first divided into smaller chunks and after feeding them to BERT, intermediate results are pooled. The implementation allows fine-tuning.

805 146 35
microsoft
deberta

The implementation of DeBERTa

640 2K 239
labteral
ernie

An Accessible Python Library for State-of-the-art Natural Language Processing. Built with HuggingFace's Transformers.

626 201 30
920232796
bert-seq2seq-ddp

A DDP version of bert_seq2seq; supports models such as BERT, RoBERTa, NEZHA, T5, and GPT-2, and tasks such as seq2seq, NER, and relation extraction. Launches multi-GPU DDP training easily with no extra code required.

626 54 5
microsoft
adamix-gpt2

This is the implementation of the paper AdaMix: Mixture-of-Adaptations for Parameter-efficient Model Tuning (https://arxiv.org/abs/2205.12410).

313 138 11
SMAPPNYU
smaberta

Wrapper for a stable version of the RoBERTa language models

310 11 9
EveripediaNetwork
fastc

Unattended Lightweight Text Classifiers with State-of-the-Art LLM Embeddings

184 186 10
amansrivastava17
embedding-as-service-client

One-stop solution to encode sentences into fixed-length vectors using various embedding techniques

175 210 32
haozhg
nlp-lmd

Language Model Decomposition

84 10 1
microsoft
sam-lora-lib

Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

54 13K 898
    • Data from PyPI, GitHub, ClickHouse, and BigQuery