Explore projects
Huang xinwie / mmcv (Apache License 2.0)

zhijun wang / Megatron-DeepSpeed (Apache License 2.0)
Ongoing research training transformer language models at scale, including: BERT & GPT-2

Xuqin Zhang / mamba (Apache License 2.0)

rui gao / Llama.Cpp (MIT License)

yao xu / Lammps Oxdna Nano Units LATBOLTZ (GNU General Public License v2.0 or later)

Hao Lin / horovod (Apache License 2.0)
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.

rui gao / HierarchicalKV (Apache License 2.0)

Yuntian Zhang / gputreeshap (Apache License 2.0)

Wang Laosi / GPUMD (GNU General Public License v3.0 only)

rui gao / FlashMLA (MIT License)

rui gao / Flash Attention (BSD 3-Clause "New" or "Revised" License)

丛 霄 / fast-hadamard-transform (BSD 3-Clause "New" or "Revised" License)
Fast Hadamard transform in CUDA, with a PyTorch interface

rui gao / Fast Hadamard Transform (BSD 3-Clause "New" or "Revised" License)

zhijun wang / fairseqffn (MIT License)
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.