Relaxed System Lab

Popular repositories

  1. 🚀🚀 Efficient implementations of Native Sparse Attention

    Python · 751 stars · 15 forks

  2. The repo for the paper "Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining".

    Python · 47 stars · 3 forks

  3. [ICML 2024] Serving LLMs on heterogeneous decentralized clusters.

    Python · 35 stars · 3 forks

  4. Accommodating Large Language Model Training over Heterogeneous Environments.

    Python · 27 stars · 8 forks

  5. We introduce UltraLLaDA, a scaled variant of LLaDA-8B-Base that extends the context length to 128K tokens with lightweight post-training, enabling long-context comprehension and generation.

    Python · 11 stars