1. Vector databases
source: https://medium.com/aimonks/vector-databases-7d46054e933
2. Leveraging Large Language Models in your Software Applications
3. GPT in 60 Lines of NumPy
source: https://jaykmody.com/blog/gpt-from-scratch/#gpt-architecture
4. Exploring Vector DBs, which are gaining attention as ChatGPT's "frontal lobe" (long-term memory store)
source: https://devocean.sk.com/blog/techBoardDetail.do?ID=164964
5. Inside Transformers: An In-depth Look at the Game-Changing Machine Learning Architecture
source: https://medium.com/p/f619a704e72
6. A Mathematical Framework for Transformer Circuits
source: https://transformer-circuits.pub/2021/framework/index.html
7. Putting it All Together: The Implemented Transformer
8. Prompt Engineering Guide
source: https://www.promptingguide.ai/
9. The Secret Sauce behind 100K context window in LLMs: all tricks in one place
10. Meet vLLM: UC Berkeley's Open Source Framework for Super Fast and Cheap LLM Serving