Daily-Trend-Review

2023/07/21 (2): In-Context Learning, Emergent Abilities

hellcat 2023. 7. 21. 16:14

1. Reducing LLM Costs & Latency with Semantic Cache

source: https://portkey.ai/blog/reducing-llm-costs-and-latency-semantic-cache
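
The core idea behind a semantic cache: embed each incoming prompt, compare it against embeddings of previously answered prompts, and return the cached response when the similarity clears a threshold, skipping the LLM call entirely. A minimal sketch of that pattern (the embed_fn and llm_call callables are hypothetical placeholders, not Portkey's API):

```python
# Minimal semantic-cache sketch; embed_fn and llm_call are assumed placeholders.
import numpy as np

class SemanticCache:
    def __init__(self, embed_fn, threshold=0.9):
        self.embed_fn = embed_fn      # maps text -> 1-D numpy vector
        self.threshold = threshold    # cosine-similarity cutoff for a "hit"
        self.entries = []             # list of (embedding, response)

    def lookup(self, query):
        q = self.embed_fn(query)
        for emb, response in self.entries:
            sim = np.dot(q, emb) / (np.linalg.norm(q) * np.linalg.norm(emb))
            if sim >= self.threshold:
                return response       # semantically close enough: reuse the answer
        return None

    def store(self, query, response):
        self.entries.append((self.embed_fn(query), response))

def answer(query, cache, llm_call):
    cached = cache.lookup(query)
    if cached is not None:
        return cached                 # no LLM call: saves cost and latency
    response = llm_call(query)
    cache.store(query, response)
    return response
```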

2. In-Context Learning Approaches in Large Language Models

source: https://towardsdatascience.com/in-context-learning-approaches-in-large-language-models-9c0c53b116a1
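
In-context learning means the model picks up the task from demonstrations placed directly in the prompt, with no gradient updates. A rough few-shot example of what such a prompt looks like (the reviews and the completion call are illustrative placeholders, not from the article):

```python
# Few-shot (in-context) prompt: the task is specified only via demonstrations.
few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

Review: "The battery lasts all day and the screen is gorgeous."
Sentiment: Positive

Review: "It stopped working after a week and support never replied."
Sentiment: Negative

Review: "Setup was painless and it just works."
Sentiment:"""

# completion = complete(few_shot_prompt)  # any LLM completion API
# expected continuation: "Positive" -- learned from the prompt, not from fine-tuning
```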

3. Llama 2: Open Foundation and Fine-Tuned Chat Models

source: https://arxiv.org/pdf/2307.09288.pdf

4. Question answering using Retrieval Augmented Generation with foundation models in Amazon SageMaker JumpStart

source: https://aws.amazon.com/ko/blogs/machine-learning/question-answering-using-retrieval-augmented-generation-with-foundation-models-in-amazon-sagemaker-jumpstart/
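
The generic RAG loop behind setups like this: embed the documents once, retrieve the chunks most similar to the query, and stuff them into the prompt as grounding context. A bare-bones sketch, independent of SageMaker JumpStart (embed and generate are assumed placeholder callables):

```python
# Bare-bones retrieval-augmented generation loop; embed(text)->vector and
# generate(prompt)->str are hypothetical stand-ins for any embedding model and LLM.
import numpy as np

def build_index(documents, embed):
    return [(doc, embed(doc)) for doc in documents]

def retrieve(query, index, embed, k=3):
    q = embed(query)
    scored = [(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)), doc)
              for doc, v in index]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]

def rag_answer(query, index, embed, generate):
    context = "\n\n".join(retrieve(query, index, embed))
    prompt = ("Answer the question using only the context below.\n\n"
              f"Context:\n{context}\n\nQuestion: {query}\nAnswer:")
    return generate(prompt)
```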

5. Introduction to Large Language Models for Generative AI

source: https://www.assemblyai.com/blog/introduction-large-language-models-generative-ai/

6. Emergent Abilities of Large Language Models

source: https://www.assemblyai.com/blog/emergent-abilities-of-large-language-models/

7. The Full Story of Large Language Models and RLHF

source: https://www.assemblyai.com/blog/the-full-story-of-large-language-models-and-rlhf/

8. Exploring LangChain and LlamaIndex to Achieve Standardization and Interoperability in Large Language Models

source: https://medium.com/badal-io/exploring-langchain-and-llamaindex-to-achieve-standardization-and-interoperability-in-large-2b5f3fabc360

9. All You Need to Know to Build Your First LLM App

source: https://towardsdatascience.com/all-you-need-to-know-to-build-your-first-llm-app-eb982c78ffac

10. How does in-context learning work? A framework for understanding the differences from traditional supervised learning

source: http://ai.stanford.edu/blog/understanding-incontext/

11. Knowledge Retrieval Architecture for LLM's (2023)

source: https://mattboegner.com/knowledge-retrieval-architecture-for-llms/

12. Large Language Models Use Cases and Applications

source: https://vectara.com/large-language-models-use-cases/

13. Build Industry-Specific LLMs Using Retrieval Augmented Generation

source: https://towardsdatascience.com/build-industry-specific-llms-using-retrieval-augmented-generation-af9e98bb6f68

14. Ask a Book Questions with LangChain and OpenAI

source: https://bennycheung.github.io/ask-a-book-questions-with-langchain-openai
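
Asking a book questions typically means: split the text into chunks, embed them into a vector store, then wire a retrieval QA chain over it. A hedged sketch using 2023-era LangChain modules (import paths and class names vary by version; it assumes openai, faiss-cpu, and an OPENAI_API_KEY are available):

```python
# "Ask a book" pattern with 2023-era LangChain; APIs may differ in newer releases.
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

with open("book.txt") as f:                    # any plain-text copy of the book
    text = f.read()

chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100).split_text(text)

db = FAISS.from_texts(chunks, OpenAIEmbeddings())        # embed and index the chunks

qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(temperature=0),
    retriever=db.as_retriever(search_kwargs={"k": 4}))   # fetch 4 nearest chunks

print(qa.run("What is the book's central argument?"))
```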

15. Why is in-context learning lower quality than fine-tuning? And...what if it wasn't?

source: https://hazyresearch.stanford.edu/blog/2023-06-12-icl-vs-finetuning

16. Vector Similarity Search: From Basics to Production

source: https://mlops.community/vector-similarity-search-from-basics-to-production/
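
At its core, vector similarity search is: normalize the embeddings, take dot products against the query, and keep the top-k; production systems replace the brute-force scan with an approximate index (FAISS, HNSW, etc.). A self-contained toy example with random vectors standing in for real embeddings:

```python
# Brute-force cosine-similarity search -- the exact baseline that ANN indexes approximate.
import numpy as np

rng = np.random.default_rng(0)
corpus = rng.normal(size=(10_000, 384))                  # 10k "document" embeddings
corpus /= np.linalg.norm(corpus, axis=1, keepdims=True)  # unit-normalize rows

query = rng.normal(size=384)
query /= np.linalg.norm(query)

scores = corpus @ query                  # dot product of unit vectors = cosine similarity
top_k = np.argsort(-scores)[:5]          # indices of the 5 most similar documents
print(top_k, scores[top_k])
```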
