Daily-Trend-Review

2023/07/06: Vector DB, Transformer, Context Window, vLLM, etc.

hellcat 2023. 7. 6. 11:29

1. Vector databases

source: https://medium.com/aimonks/vector-databases-7d46054e933

2. Leveraging Large Language Models in your Software Applications

source: https://medium.com/@simon_attard/leveraging-large-language-models-in-your-software-applications-9ea520fb2f34

3. GPT in 60 Lines of NumPy

source: https://jaykmody.com/blog/gpt-from-scratch/#gpt-architecture
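The linked post builds a full GPT in plain NumPy. As a taste of its core building block, here is a minimal single-head causal self-attention sketch in NumPy (my own toy illustration, not code from the post; weight shapes and names are arbitrary):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # single-head scaled dot-product self-attention with a causal mask
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # each token may attend only to itself and earlier tokens
    mask = np.triu(np.ones(scores.shape, dtype=bool), k=1)
    scores = np.where(mask, -1e9, scores)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # 4 tokens, model dimension 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because of the causal mask, the first token can only attend to itself, so its output is exactly its own value vector.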

4. A look at Vector DBs, gaining attention as ChatGPT's "frontal lobe" (long-term memory store)

source: https://devocean.sk.com/blog/techBoardDetail.do?ID=164964
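The essence of the vector-DB idea in the two articles above is similarity search over embeddings. A minimal brute-force sketch (my own toy example, not from either article; real vector DBs use approximate indexes like HNSW instead of a full scan):

```python
import numpy as np

def cosine_top_k(query, vectors, k=2):
    # normalize rows so that a dot product equals cosine similarity
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = v @ q
    order = np.argsort(-sims)[:k]  # indices of the k most similar vectors
    return order, sims[order]

# toy "database" of 5 embeddings with dimension 4
db = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
])
idx, sims = cosine_top_k(np.array([1.0, 0.05, 0.0, 0.0]), db, k=2)
print(idx)  # nearest two embeddings, best first
```

In an LLM application, `db` would hold embeddings of document chunks, and the retrieved rows would be stuffed into the prompt as context.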

5. Inside Transformers: An In-depth Look at the Game-Changing Machine Learning Architecture

source: https://medium.com/p/f619a704e72

6. A Mathematical Framework for Transformer Circuits

source: https://transformer-circuits.pub/2021/framework/index.html

7. Putting it All Together: The Implemented Transformer

source: https://medium.com/@hunter-j-phillips/putting-it-all-together-the-implemented-transformer-bfb11ac1ddfe

8. Prompt Engineering Guide

source: https://www.promptingguide.ai/

9. The Secret Sauce behind 100K context window in LLMs: all tricks in one place

source: https://blog.gopenai.com/how-to-speed-up-llms-and-use-100k-context-window-all-tricks-in-one-place-ffd40577b4c
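Among the long-context tricks this kind of post surveys is ALiBi: replacing positional embeddings with a linear attention-score penalty that grows with query-key distance, which extrapolates to longer contexts. A rough sketch of the bias tensor (my own illustration, assuming the common geometric head slopes and a power-of-two head count):

```python
import numpy as np

def alibi_slopes(n_heads):
    # head-specific geometric slopes: 1/2, 1/4, 1/8, ...
    return 2.0 ** -np.arange(1, n_heads + 1)

def alibi_bias(n_tokens, n_heads):
    # penalty grows linearly with distance between query and key positions;
    # this tensor is added to attention scores before the softmax
    dist = np.abs(np.arange(n_tokens)[:, None] - np.arange(n_tokens)[None, :])
    return -alibi_slopes(n_heads)[:, None, None] * dist[None, :, :]

bias = alibi_bias(6, 4)
print(bias.shape)  # (4, 6, 6): one (query, key) bias matrix per head
```

The bias is zero on the diagonal (a token attending to itself) and increasingly negative for distant positions, so nearby tokens dominate without any learned position vectors.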

10. Meet vLLM: UC Berkeley’s Open Source Framework for Super Fast and Cheap LLM Serving

source: https://pub.towardsai.net/meet-vllm-uc-berkeleys-open-source-framework-for-super-fast-and-chearp-llm-serving-23b2f540a756