🥁🥁🥁 Welcome to 'pytorch-transformers', the 👾 library for Natural Language Processing!
18 Jul 2019, Prathyush SP

PyTorch-Transformers is a library of state-of-the-art pre-trained models for Natural Language Processing (NLP).
The library currently contains PyTorch implementations, pre-trained model weights, usage scripts and conversion utilities for the following models:
- BERT (from Google) released with the paper BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- GPT (from OpenAI) released with the paper Improving Language Understanding by Generative Pre-Training
- GPT-2 (from OpenAI) released with the paper Language Models are Unsupervised Multitask Learners
- Transformer-XL (from Google/CMU) released with the paper Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
- XLNet (from Google/CMU) released with the paper XLNet: Generalized Autoregressive Pretraining for Language Understanding
- XLM (from Facebook) released together with the paper Cross-lingual Language Model Pretraining
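All of these models are loaded through the same `from_pretrained` interface. As a minimal sketch (assuming pytorch-transformers 1.0 and the `bert-base-uncased` weights, which the library downloads and caches on first use), extracting BERT's hidden states looks like this:

```python
import torch
from pytorch_transformers import BertModel, BertTokenizer

# Download (on first use) and cache the pre-trained tokenizer and weights
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()  # disable dropout for deterministic inference

# Tokenize a sentence and map the tokens to vocabulary ids
input_ids = torch.tensor([tokenizer.encode("Here is some text to encode")])

# Forward pass; the first element of the output tuple is the
# sequence of hidden states from the last layer
with torch.no_grad():
    last_hidden_states = model(input_ids)[0]

print(last_hidden_states.shape)  # (batch_size, sequence_length, hidden_size)
```

Swapping in another architecture from the list above is a matter of changing the model/tokenizer classes and the pre-trained weight name (e.g. `GPT2Model` and `GPT2Tokenizer` with `'gpt2'`).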
For more details, visit the source repository.