Making large AI models cheaper, faster and more accessible
40.8k Stars · 4.5k Forks · 427 Issues · 188 Contributors · 392 Watchers
Topics: deep-learning, hpc, large-scale, data-parallelism, pipeline-parallelism, model-parallelism, ai, big-model, distributed-computing, inference, heterogeneous-training, foundation-models
Language: Python
License: Apache License 2.0 (Apache-2.0)
Project Description
Colossal-AI is an open-source project on GitHub, released by a Chinese team, that lets existing deep learning projects train large models on a single consumer-grade graphics card with only minor code changes, greatly reducing development cost. In short, with this open-source project, anyone can train large AI models at home. In particular, it significantly lowers the barrier to fine-tuning, inference, and other downstream tasks, as well as the deployment of large-model applications.