Side-by-side comparison of stars, features, and trends
ncnn is a high-performance neural network inference framework deeply optimized for mobile platforms. It has no third-party dependencies, is cross-platform, and, according to its developers, runs faster than other known open-source frameworks on mobile CPUs. With ncnn, developers can easily port deep learning models to mobile devices and build a wide range of intelligent applications.
RTP-LLM is a high-performance LLM inference acceleration engine developed by Alibaba's Foundation Model Inference team. It is widely used across Alibaba business scenarios such as Taobao and Tmall, and it supports multiple hardware platforms and model formats. By integrating advanced operator optimization and scheduling techniques, it delivers efficient inference services for large language models.