TSAIL group
Tsinghua Statistical Artificial Intelligence & Learning Group
- 1.2k followers
- FIT Building, Tsinghua University, Beijing, China
- https://ml.cs.tsinghua.edu.cn
Repositories
Showing 10 of 81 repositories
- SageAttention Public
[ICLR2025, ICML2025, NeurIPS2025 Spotlight] Quantized attention that achieves a 2-5x speedup over FlashAttention without losing end-to-end metrics across language, image, and video models (a minimal usage sketch follows this list).
- DiT-Extrapolation Public
Official implementation for "RIFLEx: A Free Lunch for Length Extrapolation in Video Diffusion Transformers" (ICML 2025) and "UltraViCo: Breaking Extrapolation Limits in Video Diffusion Transformers"
- SpargeAttn Public
[ICML2025] SpargeAttention: a training-free sparse attention that accelerates inference for any model.
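A minimal usage sketch for SageAttention, assuming the package is installed (e.g. via `pip install sageattention`) and exposes `sageattn` as a drop-in attention kernel; consult the repository README for the exact, current API:

```python
# Sketch only: assumes SageAttention is installed and exposes `sageattn`
# as a drop-in replacement for scaled dot-product attention.
import torch
from sageattention import sageattn

# Toy query/key/value tensors in (batch, heads, seq_len, head_dim) layout on a CUDA GPU.
q = torch.randn(1, 8, 1024, 64, dtype=torch.float16, device="cuda")
k = torch.randn(1, 8, 1024, 64, dtype=torch.float16, device="cuda")
v = torch.randn(1, 8, 1024, 64, dtype=torch.float16, device="cuda")

# Quantized attention used in place of torch.nn.functional.scaled_dot_product_attention
# or a FlashAttention call; the output shape matches the query tensor.
out = sageattn(q, k, v, tensor_layout="HND", is_causal=False)
print(out.shape)  # torch.Size([1, 8, 1024, 64])
```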