From 5f6442f6dbbd53927de0e84d8730acb3faf75fb0 Mon Sep 17 00:00:00 2001
From: Richard Kuo
Date: Mon, 11 Nov 2024 09:27:00 +0800
Subject: [PATCH] Update 2024-08-18-RAG.md

---
 _posts/2024-08-18-RAG.md | 36 ++++++++++++++++--------------------
 1 file changed, 16 insertions(+), 20 deletions(-)

diff --git a/_posts/2024-08-18-RAG.md b/_posts/2024-08-18-RAG.md
index 1d602cf9..fba323a0 100644
--- a/_posts/2024-08-18-RAG.md
+++ b/_posts/2024-08-18-RAG.md
@@ -82,24 +82,6 @@ Introduction to RAG, LlamaIndex, examples.
 Perspective API0.7280.7870.5320.699
 
----
-## GraphRAG
-**Blog:** [從 RAG 到 GraphRAG:透過圖譜節點關係增強回應精確度](https://idataagent.com/2024/05/06/from-rag-to-graphrag-enhance-response-accuracy-through-graph-node-relationships/)<br>
-![](https://miro.medium.com/v2/resize:fit:786/format:webp/0*22VVg9YOaqHLRJ0-)
-
-**Blog:** [GraphRAG: Unlocking LLM discovery on narrative private data)](https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/)<br>
-
-**Paper:** [From Local to Global: A Graph RAG Approach to Query-Focused Summarization](https://arxiv.org/pdf/2404.16130)<br>
-**Code:**
-* [GraphRAG Accelerator](https://github.com/Azure-Samples/graphrag-accelerator)
-* [GraphRAG Library](https://github.com/microsoft/graphrag)
-
----
-### HippoRAG
-**Paper:** [HippoRAG: Neurobiologically Inspired Long-Term Memory for Large Language Models](https://arxiv.org/abs/2405.14831)<br>
-**Code:** [https://github.com/OSU-NLP-Group/HippoRAG](https://github.com/OSU-NLP-Group/HippoRAG)
-![](https://github.com/OSU-NLP-Group/HippoRAG/raw/main/images/hook_figure.png)
-
 ---
 ### [Contextual Retrieval RAG](https://www.anthropic.com/news/contextual-retrieval)
 [Anthropic Prompt Caching](https://github.com/anthropics/anthropic-cookbook/blob/main/misc/prompt_caching.ipynb)<br>
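+A minimal sketch of the contextual-retrieval idea: each chunk is prepended with a short LLM-generated note that situates it within the whole document before it is embedded or BM25-indexed. The model id and prompt wording below are illustrative placeholders, not taken from the Anthropic post or notebook.
+
+```python
+# Contextual retrieval sketch: prepend an LLM-generated document context to
+# each chunk before indexing; assumes ANTHROPIC_API_KEY is set.
+import anthropic
+
+CONTEXT_PROMPT = """<document>
+{doc}
+</document>
+Here is the chunk we want to situate within the whole document:
+<chunk>
+{chunk}
+</chunk>
+Give a short context that situates this chunk within the overall document,
+to improve search retrieval of the chunk. Answer only with the context."""
+
+def contextualize(doc: str, chunk: str) -> str:
+    client = anthropic.Anthropic()                # placeholder: any chat LLM works here
+    msg = client.messages.create(
+        model="claude-3-5-haiku-latest",          # placeholder model id
+        max_tokens=100,
+        messages=[{"role": "user", "content": CONTEXT_PROMPT.format(doc=doc, chunk=chunk)}],
+    )
+    return msg.content[0].text + "\n\n" + chunk   # context prepended to the raw chunk
+
+# Contextualized chunks are then embedded and indexed exactly like plain chunks;
+# prompt caching keeps the repeated <document> prefix cheap across many chunks.
+```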
@@ -164,8 +146,6 @@ index.storage_context.persist()
 **Kaggle:** [https://www.kaggle.com/code/rkuo2000/llm-llamaindex](https://www.kaggle.com/code/rkuo2000/llm-llamaindex)
 
 ---
-## Applications
-
 ### [RAG using LlamaIndex framework to build a simple chatbot, to Q&A a bunch of documents](https://abvijaykumar.medium.com/prompt-engineering-retrieval-augmented-generation-rag-cd63cdc6b00)
 ![](https://miro.medium.com/v2/resize:fit:720/format:webp/1*PL-HZqYOdczK4PoZjEPlKQ.png)
 
@@ -197,8 +177,24 @@ Anyscale剛剛發布的一篇精彩好文,裡頭介紹了很多提升RAG成效
 ![](https://i.imgur.com/oWhiZHb.png)
 **Code:** [https://github.com/HuskyInSalt/CRAG](https://github.com/HuskyInSalt/CRAG)<br>
+---
+### GRAG
+**Paper:** [From Local to Global: A Graph RAG Approach to Query-Focused Summarization](https://arxiv.org/pdf/2404.16130)<br>
+**Paper:** [GRAG: Graph Retrieval-Augmented Generation](https://arxiv.org/abs/2405.16506)
+**Blog:** [GraphRAG: Unlocking LLM discovery on narrative private data](https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/)
+**Blog:** [Knowledge Graph + RAG | Microsoft GraphRAG 實作與視覺化教學](https://medium.com/@cch.chichieh/knowledge-graph-rag-microsoft-graphrag-%E5%AF%A6%E4%BD%9C%E8%88%87%E8%A6%96%E8%A6%BA%E5%8C%96%E6%95%99%E5%AD%B8-ac07991855e6)<br>
+**Code:** [https://github.com/microsoft/graphrag](https://github.com/microsoft/graphrag)<br>
+**Code:** [GraphRAG Accelerator](https://github.com/Azure-Samples/graphrag-accelerator)
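+A minimal sketch of the graph-retrieval idea behind these projects (not the microsoft/graphrag pipeline, which adds community detection and summarization): index entity triples as a graph, then pull the neighborhood of the query's entities as extra context. The triples below are hand-written placeholders standing in for LLM-extracted ones.
+
+```python
+# Graph-RAG sketch: retrieve the subgraph around entities mentioned in the query.
+import networkx as nx
+
+triples = [  # (head, relation, tail) -- placeholders for LLM-extracted triples
+    ("LlamaIndex", "is_a", "RAG framework"),
+    ("GraphRAG", "builds", "knowledge graph"),
+    ("knowledge graph", "stores", "entity relations"),
+]
+
+G = nx.DiGraph()
+for h, r, t in triples:
+    G.add_edge(h, t, relation=r)
+
+def graph_context(query: str, hops: int = 1) -> list[str]:
+    """Return triples within `hops` of any graph entity mentioned in the query."""
+    seeds = [n for n in G.nodes if n.lower() in query.lower()]
+    keep = set()
+    for s in seeds:
+        keep |= set(nx.ego_graph(G.to_undirected(), s, radius=hops).nodes)
+    return [f"{h} -{G[h][t]['relation']}-> {t}" for h, t in G.edges if h in keep and t in keep]
+
+print(graph_context("How does GraphRAG use a knowledge graph?"))
+# the matching triples (plus their source chunks) would be passed to the LLM as context
+```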
+
+---
+### HippoRAG
+**Paper:** [HippoRAG: Neurobiologically Inspired Long-Term Memory for Large Language Models](https://arxiv.org/abs/2405.14831)<br>
+**Code:** [https://github.com/OSU-NLP-Group/HippoRAG](https://github.com/OSU-NLP-Group/HippoRAG)<br>
+![](https://github.com/OSU-NLP-Group/HippoRAG/raw/main/images/hook_figure.png)
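+A toy sketch of HippoRAG's retrieval step as described in the paper: Personalized PageRank over an entity graph, seeded on the query's entities, with passages ranked by the scores of the entities they mention. The graph and passage-to-entity links below are hand-written placeholders; the paper builds them with OpenIE over the corpus.
+
+```python
+# HippoRAG-style retrieval sketch: Personalized PageRank over an entity graph.
+import networkx as nx
+
+G = nx.Graph()
+G.add_edges_from([("Stanford", "Alice"), ("Alice", "neuroscience"),
+                  ("neuroscience", "hippocampus"), ("Bob", "Stanford")])
+
+passage_entities = {          # which entities each indexed passage mentions
+    "p1": ["Alice", "neuroscience"],
+    "p2": ["Bob"],
+    "p3": ["hippocampus"],
+}
+
+def retrieve(query_entities, top_k=2):
+    seeds = {e: 1.0 for e in query_entities if e in G}
+    scores = nx.pagerank(G, personalization=seeds)            # Personalized PageRank
+    rank = {p: sum(scores.get(e, 0.0) for e in ents)          # aggregate entity scores
+            for p, ents in passage_entities.items()}
+    return sorted(rank, key=rank.get, reverse=True)[:top_k]
+
+print(retrieve(["Alice", "hippocampus"]))   # -> ['p1', 'p3']: passages tied to the query entities win
+```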
+
 ---
 ### TAG (Table-Augmented Generation)
+**Paper:** [Text2SQL is Not Enough: Unifying AI and Databases with TAG](https://arxiv.org/abs/2408.14717)<br>
**Blog:** [Goodbye, Text2SQL: Why Table-Augmented Generation (TAG) is the Future of AI-Driven Data Queries!](https://ai.plainenglish.io/goodbye-text2sql-why-table-augmented-generation-tag-is-the-future-of-ai-driven-data-queries-892e24e06922)
**Code:** [https://github.com/TAG-Research/TAG-Bench](https://github.com/TAG-Research/TAG-Bench)
![](https://github.com/TAG-Research/TAG-Bench/raw/main/assets/tag.png)
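+A toy sketch of the three TAG steps (query synthesis, query execution, answer generation) over a relational table. `llm()` is a stubbed placeholder so the example runs without any model API; a real system would call a chat model for steps 1 and 3.
+
+```python
+# TAG sketch: LLM writes a SQL query, the database executes it, LLM answers from the result.
+import sqlite3
+
+def llm(prompt: str) -> str:                     # placeholder for a real chat-model call
+    if "Write a SQL query" in prompt:
+        return "SELECT title FROM movies WHERE year < 2000 ORDER BY rating DESC LIMIT 1;"
+    return "The highest-rated pre-2000 movie in the table is " + prompt.split("Result: ")[-1]
+
+db = sqlite3.connect(":memory:")
+db.execute("CREATE TABLE movies (title TEXT, year INT, rating REAL)")
+db.executemany("INSERT INTO movies VALUES (?, ?, ?)",
+               [("Heat", 1995, 8.3), ("Gladiator", 2000, 8.5), ("Se7en", 1995, 8.6)])
+
+question = "Which pre-2000 movie has the highest rating?"
+sql = llm(f"Write a SQL query over movies(title, year, rating) answering: {question}")  # 1. query synthesis
+result = db.execute(sql).fetchall()                                                      # 2. query execution
+answer = llm(f"Question: {question}\nResult: {result}")                                  # 3. answer generation
+print(answer)
+```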