Ongoing research on training transformer language models at scale, including BERT and GPT-2.
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
An easy-to-use kNN-MT (k-nearest-neighbor machine translation) toolkit.
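A minimal sketch of the idea behind kNN-MT, not this toolkit's actual API: a datastore maps decoder hidden states (keys) to the target tokens that followed them (values); at decoding time the query state retrieves its k nearest keys, and their distances are softmax-normalized into a next-token distribution. The function name and datastore layout here are illustrative assumptions.

```python
import math
from collections import defaultdict

def knn_next_token_probs(query, datastore, k=2, temperature=1.0):
    """Illustrative kNN-MT retrieval step (not the toolkit's API).

    datastore: list of (key_vector, target_token) pairs.
    Returns a dict mapping target tokens to probabilities.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Retrieve the k nearest datastore entries by squared L2 distance.
    neighbors = sorted(datastore, key=lambda kv: sq_dist(query, kv[0]))[:k]

    # Softmax over negative distances, aggregated per target token.
    weights = [math.exp(-sq_dist(query, key) / temperature) for key, _ in neighbors]
    z = sum(weights)
    probs = defaultdict(float)
    for (_, token), w in zip(neighbors, weights):
        probs[token] += w / z
    return dict(probs)
```

In the full method this retrieved distribution is interpolated with the translation model's own softmax output.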
Optimized primitives for collective multi-GPU communication.
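To show what a collective like all-reduce does, here is a pure-Python simulation of the ring all-reduce pattern that such libraries implement with optimized GPU kernels and interconnect transfers; this is an illustrative sketch, not the library's API. Each "rank" holds a vector, and after a reduce-scatter phase and an all-gather phase every rank holds the elementwise sum of all vectors.

```python
def ring_allreduce(buffers):
    """Sum-all-reduce across `buffers` (one list of floats per simulated rank)."""
    n = len(buffers)                 # number of ranks in the ring
    size = len(buffers[0])
    assert size % n == 0, "for simplicity, assume size divisible by n"
    chunk = size // n

    # Reduce-scatter: in each step, rank r sends one chunk to rank (r+1) % n,
    # which accumulates it. Snapshot sends first so steps don't interfere.
    for step in range(n - 1):
        sends = []
        for r in range(n):
            c = (r - step) % n
            sends.append((c, list(buffers[r][c * chunk:(c + 1) * chunk])))
        for r in range(n):
            c, data = sends[r]
            dst = (r + 1) % n
            for i in range(chunk):
                buffers[dst][c * chunk + i] += data[i]

    # All-gather: after reduce-scatter, rank r holds the fully reduced
    # chunk (r+1) % n; circulate reduced chunks around the ring.
    for step in range(n - 1):
        sends = []
        for r in range(n):
            c = (r + 1 - step) % n
            sends.append((c, list(buffers[r][c * chunk:(c + 1) * chunk])))
        for r in range(n):
            c, data = sends[r]
            dst = (r + 1) % n
            buffers[dst][c * chunk:(c + 1) * chunk] = data
    return buffers
```

The ring schedule is bandwidth-optimal: each rank sends and receives only 2·(n−1)/n of the data, independent of the number of ranks.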
SpikingJelly is an open-source deep learning framework for Spiking Neural Networks (SNNs) based on PyTorch.
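A plain-Python sketch of the leaky integrate-and-fire (LIF) dynamics at the heart of spiking neural networks; this illustrates the neuron model, not SpikingJelly's actual API, and the parameter names are assumptions. The membrane potential leaks toward rest, integrates input, and emits a spike (then resets) when it crosses the threshold.

```python
def lif_neuron(inputs, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """Return the binary spike train an LIF neuron produces for `inputs`."""
    v = v_reset
    spikes = []
    for x in inputs:
        # Leaky integration: v decays toward rest while accumulating input.
        v = v + (x - (v - v_reset)) / tau
        if v >= v_threshold:
            spikes.append(1)
            v = v_reset          # hard reset after a spike
        else:
            spikes.append(0)
    return spikes
```

Because the spike is a non-differentiable threshold event, SNN frameworks typically train with surrogate gradients that smooth this step function during the backward pass.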
Sample code for a CUDA programming book.
Physics meets neural networks.
A PyTorch extension: tools for easy mixed-precision and distributed training in PyTorch.
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.