PaddlePaddle / PaddleFormers
PaddleFormers is a transformer model library built on the PaddlePaddle framework, designed to provide training interfaces for Large Language Models (LLMs) and Vision-Language Models (VLMs) comparable to those of Hugging Face Transformers. By integrating tensor parallelism, pipeline parallelism, and automatic mixed precision, the project achieves training performance on key models that surpasses Megatron-LM. It also fully supports the Safetensors format and is deeply adapted to a variety of domestic (Chinese) AI accelerators, helping developers run the complete model-training workflow efficiently.