https://twitter.com/virattt/status/1778828787951546382
https://coconut-mode.com/posts/ring-attention/
https://twitter.com/bonniesjli/status/1778846068588814486
Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
https://arxiv.org/abs/2404.07143
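The core idea of the paper is a compressive memory bolted onto standard attention: each segment is processed with ordinary dot-product attention for local context, while an associative memory matrix accumulates the keys and values of all past segments, and a learned gate blends the two. Below is a minimal single-head NumPy sketch of that mechanism as I read it from the paper; the function name, shapes, and the scalar `beta` gate are illustrative assumptions, and causal masking plus the paper's delta-rule memory update are omitted for brevity.

```python
import numpy as np

def softmax(x):
    # Row-wise softmax with max-subtraction for numerical stability.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def elu_plus_one(x):
    # sigma(x) = ELU(x) + 1: keeps memory reads/writes non-negative.
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_segment(Q, K, V, M, z, beta, eps=1e-6):
    """One Infini-attention segment for a single head (illustrative).
    Q, K, V: (L, d) projections for the current segment.
    M: (d, d) compressive (associative) memory over past segments.
    z: (d,) normalization term accumulated alongside the memory.
    beta: gate parameter (a learned scalar in the paper)."""
    d = Q.shape[-1]

    # Local context: standard scaled dot-product attention
    # (causal masking omitted here).
    A_dot = softmax(Q @ K.T / np.sqrt(d)) @ V

    # Long-term context: retrieve from the compressive memory.
    sQ = elu_plus_one(Q)
    A_mem = (sQ @ M) / (sQ @ z + eps)[:, None]

    # Write this segment into memory (simple linear update; the paper
    # also describes a delta-rule variant that subtracts what is
    # already stored before writing).
    sK = elu_plus_one(K)
    M = M + sK.T @ V
    z = z + sK.sum(axis=0)

    # Learned gate blends long-term retrieval with local attention.
    g = 1.0 / (1.0 + np.exp(-beta))
    return g * A_mem + (1.0 - g) * A_dot, M, z

# Stream a long sequence segment by segment: the state carried across
# segments is only M and z, i.e. O(d^2) regardless of sequence length.
L, d, rng = 8, 16, np.random.default_rng(0)
M, z = np.zeros((d, d)), np.zeros(d)
for _ in range(4):  # four segments of a longer stream
    Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))
    out, M, z = infini_attention_segment(Q, K, V, M, z, beta=0.0)
```

The point of the bounded `(M, z)` state is what the title advertises: context length can grow without growing the attention state, which is how the paper claims "infinite" context at fixed memory cost.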