diff --git a/Solar-Fullstack-LLM-101/08_RAG.ipynb b/Solar-Fullstack-LLM-101/08_RAG.ipynb
index 85c360d..49558aa 100644
--- a/Solar-Fullstack-LLM-101/08_RAG.ipynb
+++ b/Solar-Fullstack-LLM-101/08_RAG.ipynb
@@ -1,337 +1,464 @@
{
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "\n",
- "\n",
- ""
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# 08. RAG\n",
- "\n",
- "## Overview \n",
- "In this exercise, we will explore Retrieval-Augmented Generation (RAG) using the Solar framework. RAG combines retrieval-based techniques with generative models to improve the relevance and accuracy of generated responses by leveraging external knowledge sources. This notebook will guide you through implementing RAG and demonstrating its benefits in enhancing model outputs.\n",
- " \n",
- "## Purpose of the Exercise\n",
- "The purpose of this exercise is to integrate Retrieval-Augmented Generation into the Solar framework. By the end of this tutorial, users will understand how to use RAG to access external information and generate more informed and contextually accurate responses, thereby improving the performance and reliability of the language model.\n",
- "\n"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## RAG: Retrieval Augmented Generation.\n",
- "- Large language models (LLMs) have a limited context size.\n",
- "- TLDR\n",
- "- Not all context is relevant to a given question\n",
- "- Query → Retrieve (Search) → Results → (LLM) → Answer\n",
- "- RAG is a method to combine LLM with Retrieval: Retrieval Augmented Generation\n",
- "\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 1,
- "metadata": {},
- "outputs": [],
- "source": [
- "! pip3 install -qU markdownify langchain-upstage rank_bm25 python-dotenv tokenizers"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 2,
- "metadata": {},
- "outputs": [],
- "source": [
- "# @title set API key\n",
- "import os\n",
- "import getpass\n",
- "from pprint import pprint\n",
- "import warnings\n",
- "\n",
- "warnings.filterwarnings(\"ignore\")\n",
- "\n",
- "from IPython import get_ipython\n",
- "\n",
- "if \"google.colab\" in str(get_ipython()):\n",
- " # Running in Google Colab. Please set the UPSTAGE_API_KEY in the Colab Secrets\n",
- " from google.colab import userdata\n",
- " os.environ[\"UPSTAGE_API_KEY\"] = userdata.get(\"UPSTAGE_API_KEY\")\n",
- "else:\n",
- " # Running locally. Please set the UPSTAGE_API_KEY in the .env file\n",
- " from dotenv import load_dotenv\n",
- "\n",
- " load_dotenv()\n",
- "\n",
- "if \"UPSTAGE_API_KEY\" not in os.environ:\n",
- " os.environ[\"UPSTAGE_API_KEY\"] = getpass.getpass(\"Enter your Upstage API key: \")\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 3,
- "metadata": {},
- "outputs": [],
- "source": [
- "from langchain_upstage import UpstageLayoutAnalysisLoader\n",
- "\n",
- "\n",
- "layzer = UpstageLayoutAnalysisLoader(\n",
- " \"pdfs/kim-tse-2008.pdf\", use_ocr=False, output_type=\"html\"\n",
- ")\n",
- "# For improved memory efficiency, consider using the lazy_load method to load documents page by page.\n",
- "docs = layzer.load() # or layzer.lazy_load()"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 4,
- "metadata": {},
- "outputs": [
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "8Lmj4RhlCDz3"
+ },
+ "source": [
+ "\n",
+ "\n",
+ ""
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "my7YFX2ACDz5"
+ },
+ "source": [
+ "# 08. RAG\n",
+ "\n",
+ "## Overview \n",
+ "In this exercise, we will explore Retrieval-Augmented Generation (RAG) using the Solar framework. RAG combines retrieval-based techniques with generative models to improve the relevance and accuracy of generated responses by leveraging external knowledge sources. This notebook will guide you through implementing RAG and demonstrating its benefits in enhancing model outputs.\n",
+ "\n",
+ "## Purpose of the Exercise\n",
+ "The purpose of this exercise is to integrate Retrieval-Augmented Generation into the Solar framework. By the end of this tutorial, users will understand how to use RAG to access external information and generate more informed and contextually accurate responses, thereby improving the performance and reliability of the language model.\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "YVyNmqERCDz6"
+ },
+ "source": [
+ "## RAG: Retrieval Augmented Generation.\n",
+ "- Large language models (LLMs) have a limited context size.\n",
+ "- TLDR\n",
+ "- Not all context is relevant to a given question\n",
+ "- Query → Retrieve (Search) → Results → (LLM) → Answer\n",
+ "- RAG is a method to combine LLM with Retrieval: Retrieval Augmented Generation\n",
+ "\n"
+ ]
+ },
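+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# A minimal, dependency-free sketch of the flow above (illustrative only):\n",
+ "# Query -> Retrieve (Search) -> Results -> (LLM) -> Answer.\n",
+ "# A toy word-overlap retriever stands in for a real search step; later cells\n",
+ "# use ChatUpstage as the LLM.\n",
+ "corpus = [\n",
+ "    \"Solar is a large language model from Upstage.\",\n",
+ "    \"RAG retrieves relevant documents before the LLM generates an answer.\",\n",
+ "    \"BM25 is a classic keyword-based retrieval method.\",\n",
+ "]\n",
+ "\n",
+ "\n",
+ "def retrieve(query, docs, k=1):\n",
+ "    # Rank documents by how many query words they share (toy scoring).\n",
+ "    q = set(query.lower().split())\n",
+ "    return sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)[:k]\n",
+ "\n",
+ "\n",
+ "results = retrieve(\"What does RAG retrieve?\", corpus)\n",
+ "print(results)"
+ ]
+ },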
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "OoHatSGHCDz6"
+ },
+ "outputs": [],
+ "source": [
+ "! pip3 install -qU markdownify langchain-upstage rank_bm25 python-dotenv tokenizers langchain-community"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {
+ "id": "ccxQw5wBCDz7"
+ },
+ "outputs": [],
+ "source": [
+ "# @title set API key\n",
+ "from pprint import pprint\n",
+ "import os\n",
+ "\n",
+ "import warnings\n",
+ "\n",
+ "warnings.filterwarnings(\"ignore\")\n",
+ "\n",
+ "if \"google.colab\" in str(get_ipython()):\n",
+ " # Running in Google Colab. Please set the UPSTAGE_API_KEY in the Colab Secrets\n",
+ " from google.colab import userdata\n",
+ "\n",
+ " os.environ[\"UPSTAGE_API_KEY\"] = userdata.get(\"UPSTAGE_API_KEY\")\n",
+ "else:\n",
+ " # Running locally. Please set the UPSTAGE_API_KEY in the .env file\n",
+ " from dotenv import load_dotenv\n",
+ "\n",
+ " load_dotenv()\n",
+ "\n",
+ "assert (\n",
+ " \"UPSTAGE_API_KEY\" in os.environ\n",
+ "), \"Please set the UPSTAGE_API_KEY environment variable\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "metadata": {
+ "id": "s7T8keMECDz8"
+ },
+ "outputs": [],
+ "source": [
+ "from langchain_upstage import UpstageDocumentParseLoader\n",
+ "\n",
+ "\n",
+ "layzer = UpstageDocumentParseLoader(\n",
+ " \"pdfs/kim-tse-2008.pdf\",output_format=\"html\"\n",
+ ")\n",
+ "# For improved memory efficiency, consider using the lazy_load method to load documents page by page.\n",
+ "docs = layzer.load() # or layzer.lazy_load()"
+ ]
+ },
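+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Sketch of the lazy alternative mentioned above: lazy_load() yields pages\n",
+ "# one at a time, keeping peak memory low for large PDFs.\n",
+ "# Uncomment to use it instead of the eager load() call:\n",
+ "# docs = [page for page in layzer.lazy_load()]"
+ ]
+ },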
{
- "data": {
- "text/html": [
- "
Classifying Software Changes:
Clean or Buggy?
Sunghun Kim, E. James Whitehead Jr., Member, IEEE, and Yi Zhang, Member, IEEE
Abstract-This paper introduces a new technique for predicting latent software bugs, called change classification. Change
classification uses a machine learning classifier to determine whether a new software change is more similar to prior buggy changes or
clean changes. In this manner, change classification predicts the existence of bugs in software changes. The classifier is trained using
features (in the machine learning sense) extracted from the revision history of a software project stored in its software configuration
management repository. The trained classifier can classify changes as buggy or clean, with a 78 percent accuracy and a 60 percent
buggy change recall on average. Chan"
+ "cell_type": "code",
+ "execution_count": 7,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 297
+ },
+ "id": "y3VMaobxCDz8",
+ "outputId": "7cd611ae-f66e-4fe5-ae60-ceef73233ed0"
+ },
+ "outputs": [
+ {
+ "output_type": "display_data",
+ "data": {
+ "text/plain": [
+ ""
+ ],
+ "text/html": [
+ "IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, VOL. 34, NO. 2, MARCH/APRIL 2008
Classifying Software Changes:
Clean or Buggy?
Sunghun Kim, E. James Whitehead Jr., Member, IEEE, and Yi Zhang, Member, IEEE
Abstract—This paper introduces a new technique for predicting latent software bugs, called change classification. Change
classification uses a machine learning classifier to determine whether a new software change is more similar to prior buggy changes or
clean changes. In this manner, change classification predicts the existence of bugs in software changes. The classifier is trained using
features (in the machine learning sense) extracted from the revision history of a software project stored in its software config"
+ ]
+ },
+ "metadata": {}
+ }
],
- "text/plain": [
- ""
+ "source": [
+ "from IPython.display import display, HTML\n",
+ "\n",
+ "display(HTML(docs[0].page_content[:1000]))"
]
- },
- "metadata": {},
- "output_type": "display_data"
- }
- ],
- "source": [
- "from IPython.display import display, HTML\n",
- "\n",
- "display(HTML(docs[0].page_content[:1000]))"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 5,
- "metadata": {},
- "outputs": [],
- "source": [
- "from langchain_core.prompts import PromptTemplate\n",
- "from langchain_core.output_parsers import StrOutputParser\n",
- "from langchain_upstage import ChatUpstage\n",
- "\n",
- "\n",
- "llm = ChatUpstage()\n",
- "\n",
- "prompt_template = PromptTemplate.from_template(\n",
- " \"\"\"\n",
- " Please provide most correct answer from the following context. \n",
- " If the answer is not present in the context, please write \"The information is not present in the context.\"\n",
- " ---\n",
- " Question: {question}\n",
- " ---\n",
- " Context: {Context}\n",
- " \"\"\"\n",
- ")\n",
- "chain = prompt_template | llm | StrOutputParser()"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 6,
- "metadata": {},
- "outputs": [
+ },
{
- "ename": "BadRequestError",
- "evalue": "Error code: 400 - {'error': {'message': \"This model's maximum context length is 32768 tokens. However, your messages resulted in 35940 tokens. Please reduce the length of the messages.\", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}",
- "output_type": "error",
- "traceback": [
- "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
- "\u001b[0;31mBadRequestError\u001b[0m Traceback (most recent call last)",
- "Cell \u001b[0;32mIn[6], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43mchain\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[43m{\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mquestion\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mWhat is bug classficiation?\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mContext\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mdocs\u001b[49m\u001b[43m}\u001b[49m\u001b[43m)\u001b[49m\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_core/runnables/base.py:3022\u001b[0m, in \u001b[0;36mRunnableSequence.invoke\u001b[0;34m(self, input, config, **kwargs)\u001b[0m\n\u001b[1;32m 3020\u001b[0m \u001b[38;5;28minput\u001b[39m \u001b[38;5;241m=\u001b[39m context\u001b[38;5;241m.\u001b[39mrun(step\u001b[38;5;241m.\u001b[39minvoke, \u001b[38;5;28minput\u001b[39m, config, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[1;32m 3021\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m-> 3022\u001b[0m \u001b[38;5;28minput\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[43mcontext\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrun\u001b[49m\u001b[43m(\u001b[49m\u001b[43mstep\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minvoke\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 3023\u001b[0m \u001b[38;5;66;03m# finish the root run\u001b[39;00m\n\u001b[1;32m 3024\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py:284\u001b[0m, in \u001b[0;36mBaseChatModel.invoke\u001b[0;34m(self, input, config, stop, **kwargs)\u001b[0m\n\u001b[1;32m 273\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21minvoke\u001b[39m(\n\u001b[1;32m 274\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 275\u001b[0m \u001b[38;5;28minput\u001b[39m: LanguageModelInput,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 279\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any,\n\u001b[1;32m 280\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m BaseMessage:\n\u001b[1;32m 281\u001b[0m config \u001b[38;5;241m=\u001b[39m ensure_config(config)\n\u001b[1;32m 282\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m cast(\n\u001b[1;32m 283\u001b[0m ChatGeneration,\n\u001b[0;32m--> 284\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgenerate_prompt\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 285\u001b[0m \u001b[43m \u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_convert_input\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 286\u001b[0m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 287\u001b[0m \u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mcallbacks\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 288\u001b[0m \u001b[43m \u001b[49m\u001b[43mtags\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtags\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 289\u001b[0m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmetadata\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 290\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_name\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrun_name\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 291\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_id\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mpop\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mrun_id\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 292\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 293\u001b[0m 
\u001b[43m \u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mgenerations[\u001b[38;5;241m0\u001b[39m][\u001b[38;5;241m0\u001b[39m],\n\u001b[1;32m 294\u001b[0m )\u001b[38;5;241m.\u001b[39mmessage\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py:784\u001b[0m, in \u001b[0;36mBaseChatModel.generate_prompt\u001b[0;34m(self, prompts, stop, callbacks, **kwargs)\u001b[0m\n\u001b[1;32m 776\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mgenerate_prompt\u001b[39m(\n\u001b[1;32m 777\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 778\u001b[0m prompts: \u001b[38;5;28mlist\u001b[39m[PromptValue],\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 781\u001b[0m \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs: Any,\n\u001b[1;32m 782\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m LLMResult:\n\u001b[1;32m 783\u001b[0m prompt_messages \u001b[38;5;241m=\u001b[39m [p\u001b[38;5;241m.\u001b[39mto_messages() \u001b[38;5;28;01mfor\u001b[39;00m p \u001b[38;5;129;01min\u001b[39;00m prompts]\n\u001b[0;32m--> 784\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgenerate\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprompt_messages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mcallbacks\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py:641\u001b[0m, in \u001b[0;36mBaseChatModel.generate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 639\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n\u001b[1;32m 640\u001b[0m run_managers[i]\u001b[38;5;241m.\u001b[39mon_llm_error(e, response\u001b[38;5;241m=\u001b[39mLLMResult(generations\u001b[38;5;241m=\u001b[39m[]))\n\u001b[0;32m--> 641\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m e\n\u001b[1;32m 642\u001b[0m flattened_outputs \u001b[38;5;241m=\u001b[39m [\n\u001b[1;32m 643\u001b[0m LLMResult(generations\u001b[38;5;241m=\u001b[39m[res\u001b[38;5;241m.\u001b[39mgenerations], llm_output\u001b[38;5;241m=\u001b[39mres\u001b[38;5;241m.\u001b[39mllm_output) \u001b[38;5;66;03m# type: ignore[list-item]\u001b[39;00m\n\u001b[1;32m 644\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m res \u001b[38;5;129;01min\u001b[39;00m results\n\u001b[1;32m 645\u001b[0m ]\n\u001b[1;32m 646\u001b[0m llm_output \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_combine_llm_outputs([res\u001b[38;5;241m.\u001b[39mllm_output \u001b[38;5;28;01mfor\u001b[39;00m res \u001b[38;5;129;01min\u001b[39;00m results])\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py:631\u001b[0m, in \u001b[0;36mBaseChatModel.generate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 628\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m i, m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28menumerate\u001b[39m(messages):\n\u001b[1;32m 629\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 630\u001b[0m results\u001b[38;5;241m.\u001b[39mappend(\n\u001b[0;32m--> 631\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_generate_with_cache\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 632\u001b[0m \u001b[43m \u001b[49m\u001b[43mm\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 633\u001b[0m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 634\u001b[0m \u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrun_managers\u001b[49m\u001b[43m[\u001b[49m\u001b[43mi\u001b[49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mif\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mrun_managers\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01melse\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 635\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 636\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 637\u001b[0m )\n\u001b[1;32m 638\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 639\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py:853\u001b[0m, in \u001b[0;36mBaseChatModel._generate_with_cache\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 851\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 852\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m inspect\u001b[38;5;241m.\u001b[39msignature(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_generate)\u001b[38;5;241m.\u001b[39mparameters\u001b[38;5;241m.\u001b[39mget(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mrun_manager\u001b[39m\u001b[38;5;124m\"\u001b[39m):\n\u001b[0;32m--> 853\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_generate\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 854\u001b[0m \u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrun_manager\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\n\u001b[1;32m 855\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 856\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 857\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_generate(messages, stop\u001b[38;5;241m=\u001b[39mstop, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/langchain_upstage/chat_models.py:236\u001b[0m, in \u001b[0;36mChatUpstage._generate\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 234\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m generate_from_stream(stream_iter)\n\u001b[1;32m 235\u001b[0m payload \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_get_request_payload(messages, stop\u001b[38;5;241m=\u001b[39mstop, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[0;32m--> 236\u001b[0m response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mclient\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mcreate\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mpayload\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 237\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_create_chat_result(response)\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/openai/_utils/_utils.py:274\u001b[0m, in \u001b[0;36mrequired_args..inner..wrapper\u001b[0;34m(*args, **kwargs)\u001b[0m\n\u001b[1;32m 272\u001b[0m msg \u001b[38;5;241m=\u001b[39m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mMissing required argument: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mquote(missing[\u001b[38;5;241m0\u001b[39m])\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 273\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mTypeError\u001b[39;00m(msg)\n\u001b[0;32m--> 274\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mfunc\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/openai/resources/chat/completions.py:704\u001b[0m, in \u001b[0;36mCompletions.create\u001b[0;34m(self, messages, model, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, n, parallel_tool_calls, presence_penalty, response_format, seed, service_tier, stop, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, extra_headers, extra_query, extra_body, timeout)\u001b[0m\n\u001b[1;32m 668\u001b[0m \u001b[38;5;129m@required_args\u001b[39m([\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mmessages\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mmodel\u001b[39m\u001b[38;5;124m\"\u001b[39m], [\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mmessages\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mmodel\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mstream\u001b[39m\u001b[38;5;124m\"\u001b[39m])\n\u001b[1;32m 669\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mcreate\u001b[39m(\n\u001b[1;32m 670\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 701\u001b[0m timeout: \u001b[38;5;28mfloat\u001b[39m \u001b[38;5;241m|\u001b[39m httpx\u001b[38;5;241m.\u001b[39mTimeout \u001b[38;5;241m|\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m \u001b[38;5;241m|\u001b[39m NotGiven \u001b[38;5;241m=\u001b[39m NOT_GIVEN,\n\u001b[1;32m 702\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m ChatCompletion \u001b[38;5;241m|\u001b[39m Stream[ChatCompletionChunk]:\n\u001b[1;32m 703\u001b[0m validate_response_format(response_format)\n\u001b[0;32m--> 704\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_post\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 705\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43m/chat/completions\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 706\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mmaybe_transform\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 707\u001b[0m \u001b[43m \u001b[49m\u001b[43m{\u001b[49m\n\u001b[1;32m 708\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmessages\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 709\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmodel\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mmodel\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 710\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mfrequency_penalty\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mfrequency_penalty\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 711\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mfunction_call\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mfunction_call\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 712\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mfunctions\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m 
\u001b[49m\u001b[43mfunctions\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 713\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mlogit_bias\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mlogit_bias\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 714\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mlogprobs\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mlogprobs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 715\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmax_completion_tokens\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mmax_completion_tokens\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 716\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mmax_tokens\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mmax_tokens\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 717\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mn\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mn\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 718\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mparallel_tool_calls\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mparallel_tool_calls\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 719\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mpresence_penalty\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mpresence_penalty\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 720\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mresponse_format\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mresponse_format\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 721\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mseed\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mseed\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 722\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mservice_tier\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mservice_tier\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 723\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mstop\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 724\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mstream\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mstream\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 725\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mstream_options\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mstream_options\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 726\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtemperature\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m 
\u001b[49m\u001b[43mtemperature\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 727\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtool_choice\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mtool_choice\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 728\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtools\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mtools\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 729\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtop_logprobs\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mtop_logprobs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 730\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtop_p\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mtop_p\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 731\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43muser\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43muser\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 732\u001b[0m \u001b[43m \u001b[49m\u001b[43m}\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 733\u001b[0m \u001b[43m \u001b[49m\u001b[43mcompletion_create_params\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mCompletionCreateParams\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 734\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 735\u001b[0m \u001b[43m \u001b[49m\u001b[43moptions\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mmake_request_options\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 736\u001b[0m \u001b[43m \u001b[49m\u001b[43mextra_headers\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mextra_headers\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mextra_query\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mextra_query\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mextra_body\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mextra_body\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout\u001b[49m\n\u001b[1;32m 737\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 738\u001b[0m \u001b[43m \u001b[49m\u001b[43mcast_to\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mChatCompletion\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 739\u001b[0m \u001b[43m \u001b[49m\u001b[43mstream\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstream\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;129;43;01mor\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 740\u001b[0m \u001b[43m \u001b[49m\u001b[43mstream_cls\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mStream\u001b[49m\u001b[43m[\u001b[49m\u001b[43mChatCompletionChunk\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 741\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/openai/_base_client.py:1270\u001b[0m, in \u001b[0;36mSyncAPIClient.post\u001b[0;34m(self, path, cast_to, body, options, files, stream, stream_cls)\u001b[0m\n\u001b[1;32m 1256\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mpost\u001b[39m(\n\u001b[1;32m 1257\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[1;32m 1258\u001b[0m path: \u001b[38;5;28mstr\u001b[39m,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 1265\u001b[0m stream_cls: \u001b[38;5;28mtype\u001b[39m[_StreamT] \u001b[38;5;241m|\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m,\n\u001b[1;32m 1266\u001b[0m ) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m ResponseT \u001b[38;5;241m|\u001b[39m _StreamT:\n\u001b[1;32m 1267\u001b[0m opts \u001b[38;5;241m=\u001b[39m FinalRequestOptions\u001b[38;5;241m.\u001b[39mconstruct(\n\u001b[1;32m 1268\u001b[0m method\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpost\u001b[39m\u001b[38;5;124m\"\u001b[39m, url\u001b[38;5;241m=\u001b[39mpath, json_data\u001b[38;5;241m=\u001b[39mbody, files\u001b[38;5;241m=\u001b[39mto_httpx_files(files), \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39moptions\n\u001b[1;32m 1269\u001b[0m )\n\u001b[0;32m-> 1270\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m cast(ResponseT, \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[43m(\u001b[49m\u001b[43mcast_to\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mopts\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstream\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstream\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstream_cls\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstream_cls\u001b[49m\u001b[43m)\u001b[49m)\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/openai/_base_client.py:947\u001b[0m, in \u001b[0;36mSyncAPIClient.request\u001b[0;34m(self, cast_to, options, remaining_retries, stream, stream_cls)\u001b[0m\n\u001b[1;32m 944\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 945\u001b[0m retries_taken \u001b[38;5;241m=\u001b[39m \u001b[38;5;241m0\u001b[39m\n\u001b[0;32m--> 947\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_request\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 948\u001b[0m \u001b[43m \u001b[49m\u001b[43mcast_to\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mcast_to\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 949\u001b[0m \u001b[43m \u001b[49m\u001b[43moptions\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43moptions\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 950\u001b[0m \u001b[43m \u001b[49m\u001b[43mstream\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstream\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 951\u001b[0m \u001b[43m \u001b[49m\u001b[43mstream_cls\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mstream_cls\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 952\u001b[0m \u001b[43m \u001b[49m\u001b[43mretries_taken\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mretries_taken\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 953\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n",
- "File \u001b[0;32m~/workspace/.venv/lib/python3.9/site-packages/openai/_base_client.py:1051\u001b[0m, in \u001b[0;36mSyncAPIClient._request\u001b[0;34m(self, cast_to, options, retries_taken, stream, stream_cls)\u001b[0m\n\u001b[1;32m 1048\u001b[0m err\u001b[38;5;241m.\u001b[39mresponse\u001b[38;5;241m.\u001b[39mread()\n\u001b[1;32m 1050\u001b[0m log\u001b[38;5;241m.\u001b[39mdebug(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mRe-raising status error\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[0;32m-> 1051\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_make_status_error_from_response(err\u001b[38;5;241m.\u001b[39mresponse) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m 1053\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_process_response(\n\u001b[1;32m 1054\u001b[0m cast_to\u001b[38;5;241m=\u001b[39mcast_to,\n\u001b[1;32m 1055\u001b[0m options\u001b[38;5;241m=\u001b[39moptions,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 1059\u001b[0m retries_taken\u001b[38;5;241m=\u001b[39mretries_taken,\n\u001b[1;32m 1060\u001b[0m )\n",
- "\u001b[0;31mBadRequestError\u001b[0m: Error code: 400 - {'error': {'message': \"This model's maximum context length is 32768 tokens. However, your messages resulted in 35940 tokens. Please reduce the length of the messages.\", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}"
- ]
- }
- ],
- "source": [
- "chain.invoke({\"question\": \"What is bug classficiation?\", \"Context\": docs})"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 7,
- "metadata": {},
- "outputs": [
+ "cell_type": "code",
+ "execution_count": 8,
+ "metadata": {
+ "id": "UsYVFIV_CDz8"
+ },
+ "outputs": [],
+ "source": [
+ "from langchain_core.prompts import PromptTemplate\n",
+ "from langchain_core.output_parsers import StrOutputParser\n",
+ "from langchain_upstage import ChatUpstage\n",
+ "\n",
+ "\n",
+ "llm = ChatUpstage(model=\"solar-pro\")\n",
+ "\n",
+ "prompt_template = PromptTemplate.from_template(\n",
+ " \"\"\"\n",
+ " Please provide most correct answer from the following context.\n",
+ " If the answer is not present in the context, please write \"The information is not present in the context.\"\n",
+ " ---\n",
+ " Question: {question}\n",
+ " ---\n",
+ " Context: {Context}\n",
+ " \"\"\"\n",
+ ")\n",
+ "chain = prompt_template | llm | StrOutputParser()"
+ ]
+ },
{
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "162\n"
- ]
- }
- ],
- "source": [
- "from langchain_community.retrievers import BM25Retriever\n",
- "from langchain_text_splitters import (\n",
- " Language,\n",
- " RecursiveCharacterTextSplitter,\n",
- ")\n",
- "\n",
- "text_splitter = RecursiveCharacterTextSplitter.from_language(\n",
- " chunk_size=1000, chunk_overlap=100, language=Language.HTML\n",
- ")\n",
- "splits = text_splitter.split_documents(docs)\n",
- "print(len(splits))\n",
- "\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 14,
- "metadata": {},
- "outputs": [
+ "cell_type": "code",
+ "source": [
+ "# Example of \"Large language models (LLMs) have a limited context size.\"\"\n",
+ "chain.invoke({\"question\": \"What is bug classficiation?\", \"Context\": docs})"
+ ],
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 339
+ },
+ "id": "0P0LsEXVFOl_",
+ "outputId": "8eb3ad3f-b107-4b14-c1c6-9a11b5350c3b"
+ },
+ "execution_count": 66,
+ "outputs": [
+ {
+ "output_type": "error",
+ "ename": "BadRequestError",
+ "evalue": "Error code: 400 - {'error': {'message': \"This model's maximum context length is 4096 tokens. However, your messages resulted in 52377 tokens. Please reduce the length of the messages.\", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}",
+ "traceback": [
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+ "\u001b[0;31mBadRequestError\u001b[0m Traceback (most recent call last)",
+ "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0mchain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minvoke\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m{\u001b[0m\u001b[0;34m\"question\"\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;34m\"What is bug classficiation?\"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m\"Context\"\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mdocs\u001b[0m\u001b[0;34m}\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_core/runnables/base.py\u001b[0m in \u001b[0;36minvoke\u001b[0;34m(self, input, config, **kwargs)\u001b[0m\n\u001b[1;32m 3020\u001b[0m \u001b[0minput\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcontext\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrun\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstep\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minvoke\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconfig\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3021\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 3022\u001b[0;31m \u001b[0minput\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcontext\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrun\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstep\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minvoke\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mconfig\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 3023\u001b[0m \u001b[0;31m# finish the root run\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3024\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mBaseException\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_core/language_models/chat_models.py\u001b[0m in \u001b[0;36minvoke\u001b[0;34m(self, input, config, stop, **kwargs)\u001b[0m\n\u001b[1;32m 282\u001b[0m return cast(\n\u001b[1;32m 283\u001b[0m \u001b[0mChatGeneration\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 284\u001b[0;31m self.generate_prompt(\n\u001b[0m\u001b[1;32m 285\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_convert_input\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 286\u001b[0m \u001b[0mstop\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstop\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_core/language_models/chat_models.py\u001b[0m in \u001b[0;36mgenerate_prompt\u001b[0;34m(self, prompts, stop, callbacks, **kwargs)\u001b[0m\n\u001b[1;32m 782\u001b[0m ) -> LLMResult:\n\u001b[1;32m 783\u001b[0m \u001b[0mprompt_messages\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mto_messages\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mp\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprompts\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 784\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgenerate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprompt_messages\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstop\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcallbacks\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcallbacks\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 785\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 786\u001b[0m async def agenerate_prompt(\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_core/language_models/chat_models.py\u001b[0m in \u001b[0;36mgenerate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 639\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrun_managers\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 640\u001b[0m \u001b[0mrun_managers\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mon_llm_error\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0me\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mresponse\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mLLMResult\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgenerations\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 641\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 642\u001b[0m flattened_outputs = [\n\u001b[1;32m 643\u001b[0m \u001b[0mLLMResult\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mgenerations\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mres\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgenerations\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mllm_output\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mres\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mllm_output\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;31m# type: ignore[list-item]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_core/language_models/chat_models.py\u001b[0m in \u001b[0;36mgenerate\u001b[0;34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[0m\n\u001b[1;32m 629\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 630\u001b[0m results.append(\n\u001b[0;32m--> 631\u001b[0;31m self._generate_with_cache(\n\u001b[0m\u001b[1;32m 632\u001b[0m \u001b[0mm\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 633\u001b[0m \u001b[0mstop\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstop\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_core/language_models/chat_models.py\u001b[0m in \u001b[0;36m_generate_with_cache\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 848\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 849\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0minspect\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msignature\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_generate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mparameters\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mget\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"run_manager\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 850\u001b[0;31m result = self._generate(\n\u001b[0m\u001b[1;32m 851\u001b[0m \u001b[0mmessages\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstop\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrun_manager\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mrun_manager\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 852\u001b[0m )\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/langchain_upstage/chat_models.py\u001b[0m in \u001b[0;36m_generate\u001b[0;34m(self, messages, stop, run_manager, **kwargs)\u001b[0m\n\u001b[1;32m 234\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mgenerate_from_stream\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstream_iter\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 235\u001b[0m \u001b[0mpayload\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_get_request_payload\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmessages\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstop\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 236\u001b[0;31m \u001b[0mresponse\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclient\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcreate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m**\u001b[0m\u001b[0mpayload\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 237\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_create_chat_result\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mresponse\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 238\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py\u001b[0m in \u001b[0;36mwrapper\u001b[0;34m(*args, **kwargs)\u001b[0m\n\u001b[1;32m 272\u001b[0m \u001b[0mmsg\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34mf\"Missing required argument: {quote(missing[0])}\"\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 273\u001b[0m \u001b[0;32mraise\u001b[0m \u001b[0mTypeError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmsg\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 274\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mfunc\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 275\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 276\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mwrapper\u001b[0m \u001b[0;31m# type: ignore\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py\u001b[0m in \u001b[0;36mcreate\u001b[0;34m(self, messages, model, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, n, parallel_tool_calls, presence_penalty, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, extra_headers, extra_query, extra_body, timeout)\u001b[0m\n\u001b[1;32m 740\u001b[0m ) -> ChatCompletion | Stream[ChatCompletionChunk]:\n\u001b[1;32m 741\u001b[0m \u001b[0mvalidate_response_format\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mresponse_format\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 742\u001b[0;31m return self._post(\n\u001b[0m\u001b[1;32m 743\u001b[0m \u001b[0;34m\"/chat/completions\"\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 744\u001b[0m body=maybe_transform(\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/openai/_base_client.py\u001b[0m in \u001b[0;36mpost\u001b[0;34m(self, path, cast_to, body, options, files, stream, stream_cls)\u001b[0m\n\u001b[1;32m 1275\u001b[0m \u001b[0mmethod\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m\"post\"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0murl\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mpath\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mjson_data\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mbody\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfiles\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mto_httpx_files\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfiles\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0moptions\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1276\u001b[0m )\n\u001b[0;32m-> 1277\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mcast\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mResponseT\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrequest\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcast_to\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mopts\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstream\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstream\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstream_cls\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mstream_cls\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1278\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1279\u001b[0m def patch(\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/openai/_base_client.py\u001b[0m in \u001b[0;36mrequest\u001b[0;34m(self, cast_to, options, remaining_retries, stream, stream_cls)\u001b[0m\n\u001b[1;32m 952\u001b[0m \u001b[0mretries_taken\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 953\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 954\u001b[0;31m return self._request(\n\u001b[0m\u001b[1;32m 955\u001b[0m \u001b[0mcast_to\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcast_to\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 956\u001b[0m \u001b[0moptions\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0moptions\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
+ "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/openai/_base_client.py\u001b[0m in \u001b[0;36m_request\u001b[0;34m(self, cast_to, options, retries_taken, stream, stream_cls)\u001b[0m\n\u001b[1;32m 1056\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1057\u001b[0m \u001b[0mlog\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdebug\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Re-raising status error\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1058\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_make_status_error_from_response\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0merr\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mresponse\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1059\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1060\u001b[0m return self._process_response(\n",
+ "\u001b[0;31mBadRequestError\u001b[0m: Error code: 400 - {'error': {'message': \"This model's maximum context length is 4096 tokens. However, your messages resulted in 52377 tokens. Please reduce the length of the messages.\", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}"
+ ]
+ }
+ ]
+ },
{
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.\n"
- ]
+ "cell_type": "code",
+ "execution_count": 67,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "bNjKIRfICDz9",
+ "outputId": "70470d4e-5696-4d4e-eb97-7924a2016b4f"
+ },
+ "outputs": [
+ {
+ "output_type": "stream",
+ "name": "stdout",
+ "text": [
+ "177\n"
+ ]
+ }
+ ],
+ "source": [
+ "from transformers import AutoTokenizer\n",
+ "from langchain.text_splitter import TokenTextSplitter\n",
+ "\n",
+ "solar_tokenizer = AutoTokenizer.from_pretrained(\"upstage/solar-pro-preview-instruct\")\n",
+ "\n",
+ "token_splitter = TokenTextSplitter.from_huggingface_tokenizer(\n",
+ " solar_tokenizer, chunk_size=250, chunk_overlap=100\n",
+ ")\n",
+ "\n",
+ "splits = token_splitter.split_documents(docs)\n",
+ "print(len(splits))"
+ ]
},
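+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The `BadRequestError` above is the motivation for chunking: the whole paper came to 52,377 tokens against a 4,096-token context window. As a quick sanity check, the sketch below re-tokenizes each chunk; counts should sit close to the 250-token budget configured above (re-encoding decoded text can differ by a few tokens, so treat this as approximate)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Sanity check (sketch): re-tokenize each chunk and report its length.\n",
+ "# Re-encoding decoded text may differ slightly from the splitter's own\n",
+ "# count, so expect values close to (not exactly) the 250-token budget.\n",
+ "lengths = [\n",
+ "    len(solar_tokenizer.encode(s.page_content, add_special_tokens=False))\n",
+ "    for s in splits\n",
+ "]\n",
+ "print(f\"chunks: {len(lengths)}, max tokens: {max(lengths)}, min tokens: {min(lengths)}\")"
+ ]
+ },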
{
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "210\n"
- ]
- }
- ],
- "source": [
- "from transformers import AutoTokenizer\n",
- "from langchain.text_splitter import TokenTextSplitter\n",
- "\n",
- "solar_tokenizer = AutoTokenizer.from_pretrained(\"upstage/solar-pro-preview-instruct\")\n",
- "\n",
- "token_splitter = TokenTextSplitter.from_huggingface_tokenizer(\n",
- " solar_tokenizer, chunk_size=250, chunk_overlap=100\n",
- ")\n",
- "\n",
- "splits = token_splitter.split_documents(docs)\n",
- "print(len(splits))"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [
+ "cell_type": "code",
+ "execution_count": 93,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "TD3--Am6CDz9",
+ "outputId": "55254027-5489-4b87-9c72-2b442594b2f9"
+ },
+ "outputs": [
+ {
+ "output_type": "execute_result",
+ "data": {
+ "text/plain": [
+ "Document(metadata={'total_pages': 16, 'coordinates': [[{'x': 0.0396, 'y': 0.0561}, {'x': 0.5623, 'y': 0.0561}, {'x': 0.5623, 'y': 0.0688}, {'x': 0.0396, 'y': 0.0688}], [{'x': 0.8614, 'y': 0.0571}, {'x': 0.8826, 'y': 0.0571}, {'x': 0.8826, 'y': 0.0675}, {'x': 0.8614, 'y': 0.0675}], [{'x': 0.1885, 'y': 0.0897}, {'x': 0.7413, 'y': 0.0897}, {'x': 0.7413, 'y': 0.1589}, {'x': 0.1885, 'y': 0.1589}], [{'x': 0.1016, 'y': 0.1696}, {'x': 0.824, 'y': 0.1696}, {'x': 0.824, 'y': 0.1877}, {'x': 0.1016, 'y': 0.1877}], [{'x': 0.0787, 'y': 0.2091}, {'x': 0.8515, 'y': 0.2091}, {'x': 0.8515, 'y': 0.356}, {'x': 0.0787, 'y': 0.356}], [{'x': 0.0785, 'y': 0.3693}, {'x': 0.8468, 'y': 0.3693}, {'x': 0.8468, 'y': 0.3972}, {'x': 0.0785, 'y': 0.3972}], [{'x': 0.4545, 'y': 0.3997}, {'x': 0.4711, 'y': 0.3997}, {'x': 0.4711, 'y': 0.417}, {'x': 0.4545, 'y': 0.417}], [{'x': 0.0391, 'y': 0.4394}, {'x': 0.1883, 'y': 0.4394}, {'x': 0.1883, 'y': 0.4553}, {'x': 0.0391, 'y': 0.4553}], [{'x': 0.04, 'y': 0.4603}, {'x': 0.4557, 'y': 0.4603}, {'x': 0.4557, 'y': 0.6351}, {'x': 0.04, 'y': 0.6351}], [{'x': 0.0391, 'y': 0.6345}, {'x': 0.4558, 'y': 0.6345}, {'x': 0.4558, 'y': 0.7753}, {'x': 0.0391, 'y': 0.7753}], [{'x': 0.0389, 'y': 0.7759}, {'x': 0.4554, 'y': 0.7759}, {'x': 0.4554, 'y': 0.8224}, {'x': 0.0389, 'y': 0.8224}], [{'x': 0.0396, 'y': 0.846}, {'x': 0.4531, 'y': 0.846}, {'x': 0.4531, 'y': 0.8922}, {'x': 0.0396, 'y': 0.8922}], [{'x': 0.0402, 'y': 0.8971}, {'x': 0.4519, 'y': 0.8971}, {'x': 0.4519, 'y': 0.966}, {'x': 0.0402, 'y': 0.966}], [{'x': 0.4688, 'y': 0.4602}, {'x': 0.8862, 'y': 0.4602}, {'x': 0.8862, 'y': 0.5788}, {'x': 0.4688, 'y': 0.5788}], [{'x': 0.4696, 'y': 0.5797}, {'x': 0.887, 'y': 0.5797}, {'x': 0.887, 'y': 0.6976}, {'x': 0.4696, 'y': 0.6976}], [{'x': 0.4694, 'y': 0.6979}, {'x': 0.887, 'y': 0.6979}, {'x': 0.887, 'y': 0.8758}, {'x': 0.4694, 'y': 0.8758}], [{'x': 0.4698, 'y': 0.8775}, {'x': 0.8873, 'y': 0.8775}, {'x': 0.8873, 'y': 0.967}, {'x': 0.4698, 'y': 0.967}], [{'x': 0.2956, 'y': 0.9728}, {'x': 0.4512, 'y': 0.9728}, {'x': 0.4512, 'y': 0.9832}, {'x': 0.2956, 'y': 0.9832}], [{'x': 0.4711, 'y': 0.9732}, {'x': 0.6547, 'y': 0.9732}, {'x': 0.6547, 'y': 0.9833}, {'x': 0.4711, 'y': 0.9833}], [{'x': 0.0401, 'y': 0.0568}, {'x': 0.063, 'y': 0.0568}, {'x': 0.063, 'y': 0.0677}, {'x': 0.0401, 'y': 0.0677}], [{'x': 0.3635, 'y': 0.0564}, {'x': 0.8858, 'y': 0.0564}, {'x': 0.8858, 'y': 0.0686}, {'x': 0.3635, 'y': 0.0686}], [{'x': 0.0396, 'y': 0.0839}, {'x': 0.4572, 'y': 0.0839}, {'x': 0.4572, 'y': 0.2735}, {'x': 0.0396, 'y': 0.2735}], [{'x': 0.0616, 'y': 0.2729}, {'x': 0.3881, 'y': 0.2729}, {'x': 0.3881, 'y': 0.287}, {'x': 0.0616, 'y': 0.287}], [{'x': 0.0393, 'y': 0.2854}, {'x': 0.4571, 'y': 0.2854}, {'x': 0.4571, 'y': 0.3436}, {'x': 0.0393, 'y': 0.3436}], [{'x': 0.0396, 'y': 0.344}, {'x': 0.4553, 'y': 0.344}, {'x': 0.4553, 'y': 0.5035}, {'x': 0.0396, 'y': 0.5035}], [{'x': 0.0392, 'y': 0.5042}, {'x': 0.4562, 'y': 0.5042}, {'x': 0.4562, 'y': 0.5768}, {'x': 0.0392, 'y': 0.5768}], [{'x': 0.0393, 'y': 0.5774}, {'x': 0.4553, 'y': 0.5774}, {'x': 0.4553, 'y': 0.6771}, {'x': 0.0393, 'y': 0.6771}], [{'x': 0.0401, 'y': 0.6779}, {'x': 0.4561, 'y': 0.6779}, {'x': 0.4561, 'y': 0.8668}, {'x': 0.0401, 'y': 0.8668}], [{'x': 0.0388, 'y': 0.8858}, {'x': 0.2034, 'y': 0.8858}, {'x': 0.2034, 'y': 0.9025}, {'x': 0.0388, 'y': 0.9025}], [{'x': 0.0399, 'y': 0.9083}, {'x': 0.4559, 'y': 0.9083}, {'x': 0.4559, 'y': 0.9675}, {'x': 0.0399, 'y': 0.9675}], [{'x': 0.4716, 'y': 0.0826}, {'x': 0.8383, 'y': 0.0826}, {'x': 0.8383, 'y': 0.0973}, {'x': 0.4716, 'y': 
0.0973}], [{'x': 0.4684, 'y': 0.1}, {'x': 0.8874, 'y': 0.1}, {'x': 0.8874, 'y': 0.3919}, {'x': 0.4684, 'y': 0.3919}], [{'x': 0.4705, 'y': 0.393}, {'x': 0.8883, 'y': 0.393}, {'x': 0.8883, 'y': 0.775}, {'x': 0.4705, 'y': 0.775}], [{'x': 0.4693, 'y': 0.7761}, {'x': 0.8867, 'y': 0.7761}, {'x': 0.8867, 'y': 0.9662}, {'x': 0.4693, 'y': 0.9662}], [{'x': 0.0389, 'y': 0.0564}, {'x': 0.4333, 'y': 0.0564}, {'x': 0.4333, 'y': 0.0682}, {'x': 0.0389, 'y': 0.0682}], [{'x': 0.8619, 'y': 0.0571}, {'x': 0.8838, 'y': 0.0571}, {'x': 0.8838, 'y': 0.0677}, {'x': 0.8619, 'y': 0.0677}], [{'x': 0.0393, 'y': 0.083}, {'x': 0.4563, 'y': 0.083}, {'x': 0.4563, 'y': 0.4168}, {'x': 0.0393, 'y': 0.4168}], [{'x': 0.0395, 'y': 0.4175}, {'x': 0.4558, 'y': 0.4175}, {'x': 0.4558, 'y': 0.5763}, {'x': 0.0395, 'y': 0.5763}], [{'x': 0.0398, 'y': 0.5777}, {'x': 0.4563, 'y': 0.5777}, {'x': 0.4563, 'y': 0.7369}, {'x': 0.0398, 'y': 0.7369}], [{'x': 0.0396, 'y': 0.7374}, {'x': 0.4562, 'y': 0.7374}, {'x': 0.4562, 'y': 0.8097}, {'x': 0.0396, 'y': 0.8097}], [{'x': 0.0395, 'y': 0.8177}, {'x': 0.263, 'y': 0.8177}, {'x': 0.263, 'y': 0.8334}, {'x': 0.0395, 'y': 0.8334}], [{'x': 0.0398, 'y': 0.8356}, {'x': 0.456, 'y': 0.8356}, {'x': 0.456, 'y': 0.8645}, {'x': 0.0398, 'y': 0.8645}], [{'x': 0.0395, 'y': 0.8517}, {'x': 0.4566, 'y': 0.8517}, {'x': 0.4566, 'y': 0.966}, {'x': 0.0395, 'y': 0.966}], [{'x': 0.4688, 'y': 0.084}, {'x': 0.8857, 'y': 0.084}, {'x': 0.8857, 'y': 0.2285}, {'x': 0.4688, 'y': 0.2285}], [{'x': 0.4711, 'y': 0.2294}, {'x': 0.8875, 'y': 0.2294}, {'x': 0.8875, 'y': 0.3021}, {'x': 0.4711, 'y': 0.3021}], [{'x': 0.4713, 'y': 0.3101}, {'x': 0.8529, 'y': 0.3101}, {'x': 0.8529, 'y': 0.3392}, {'x': 0.4713, 'y': 0.3392}], [{'x': 0.4694, 'y': 0.3415}, {'x': 0.8857, 'y': 0.3415}, {'x': 0.8857, 'y': 0.4572}, {'x': 0.4694, 'y': 0.4572}], [{'x': 0.4697, 'y': 0.4586}, {'x': 0.8871, 'y': 0.4586}, {'x': 0.8871, 'y': 0.5447}, {'x': 0.4697, 'y': 0.5447}], [{'x': 0.4689, 'y': 0.5459}, {'x': 0.8863, 'y': 0.5459}, {'x': 0.8863, 'y': 0.8348}, {'x': 0.4689, 'y': 0.8348}], [{'x': 0.469, 'y': 0.8361}, {'x': 0.8867, 'y': 0.8361}, {'x': 0.8867, 'y': 0.9668}, {'x': 0.469, 'y': 0.9668}], [{'x': 0.3635, 'y': 0.0561}, {'x': 0.8856, 'y': 0.0561}, {'x': 0.8856, 'y': 0.0688}, {'x': 0.3635, 'y': 0.0688}], [{'x': 0.0407, 'y': 0.0848}, {'x': 0.4581, 'y': 0.0848}, {'x': 0.4581, 'y': 0.1418}, {'x': 0.0407, 'y': 0.1418}], [{'x': 0.0396, 'y': 0.1509}, {'x': 0.2284, 'y': 0.1509}, {'x': 0.2284, 'y': 0.1657}, {'x': 0.0396, 'y': 0.1657}], [{'x': 0.0402, 'y': 0.1684}, {'x': 0.4568, 'y': 0.1684}, {'x': 0.4568, 'y': 0.3297}, {'x': 0.0402, 'y': 0.3297}], [{'x': 0.0396, 'y': 0.3376}, {'x': 0.1581, 'y': 0.3376}, {'x': 0.1581, 'y': 0.353}, {'x': 0.0396, 'y': 0.353}], [{'x': 0.0407, 'y': 0.3561}, {'x': 0.4565, 'y': 0.3561}, {'x': 0.4565, 'y': 0.3839}, {'x': 0.0407, 'y': 0.3839}], [{'x': 0.0395, 'y': 0.3831}, {'x': 0.4571, 'y': 0.3831}, {'x': 0.4571, 'y': 0.5142}, {'x': 0.0395, 'y': 0.5142}], [{'x': 0.0395, 'y': 0.5144}, {'x': 0.4564, 'y': 0.5144}, {'x': 0.4564, 'y': 0.6599}, {'x': 0.0395, 'y': 0.6599}], [{'x': 0.0396, 'y': 0.6606}, {'x': 0.4557, 'y': 0.6606}, {'x': 0.4557, 'y': 0.7758}, {'x': 0.0396, 'y': 0.7758}], [{'x': 0.0395, 'y': 0.7776}, {'x': 0.4562, 'y': 0.7776}, {'x': 0.4562, 'y': 0.878}, {'x': 0.0395, 'y': 0.878}], [{'x': 0.0398, 'y': 0.8979}, {'x': 0.4261, 'y': 0.8979}, {'x': 0.4261, 'y': 0.9313}, {'x': 0.0398, 'y': 0.9313}], [{'x': 0.0393, 'y': 0.9372}, {'x': 0.4562, 'y': 0.9372}, {'x': 0.4562, 'y': 0.9675}, {'x': 0.0393, 'y': 0.9675}], [{'x': 0.4714, 'y': 0.0847}, {'x': 
0.605, 'y': 0.0847}, {'x': 0.605, 'y': 0.0988}, {'x': 0.4714, 'y': 0.0988}], [{'x': 0.4898, 'y': 0.1088}, {'x': 0.8893, 'y': 0.1088}, {'x': 0.8893, 'y': 0.371}, {'x': 0.4898, 'y': 0.371}], [{'x': 0.5176, 'y': 0.3698}, {'x': 0.8876, 'y': 0.3698}, {'x': 0.8876, 'y': 0.4143}, {'x': 0.5176, 'y': 0.4143}], [{'x': 0.4704, 'y': 0.4233}, {'x': 0.5765, 'y': 0.4233}, {'x': 0.5765, 'y': 0.438}, {'x': 0.4704, 'y': 0.438}], [{'x': 0.4891, 'y': 0.447}, {'x': 0.8882, 'y': 0.447}, {'x': 0.8882, 'y': 0.5652}, {'x': 0.4891, 'y': 0.5652}], [{'x': 0.4698, 'y': 0.5777}, {'x': 0.8867, 'y': 0.5777}, {'x': 0.8867, 'y': 0.746}, {'x': 0.4698, 'y': 0.746}], [{'x': 0.4706, 'y': 0.755}, {'x': 0.7289, 'y': 0.755}, {'x': 0.7289, 'y': 0.7704}, {'x': 0.4706, 'y': 0.7704}], [{'x': 0.4891, 'y': 0.7799}, {'x': 0.8882, 'y': 0.7799}, {'x': 0.8882, 'y': 0.9548}, {'x': 0.4891, 'y': 0.9548}], [{'x': 0.0388, 'y': 0.056}, {'x': 0.4337, 'y': 0.056}, {'x': 0.4337, 'y': 0.0688}, {'x': 0.0388, 'y': 0.0688}], [{'x': 0.8618, 'y': 0.0571}, {'x': 0.8842, 'y': 0.0571}, {'x': 0.8842, 'y': 0.0677}, {'x': 0.8618, 'y': 0.0677}], [{'x': 0.2799, 'y': 0.0881}, {'x': 0.6477, 'y': 0.0881}, {'x': 0.6477, 'y': 0.117}, {'x': 0.2799, 'y': 0.117}], [{'x': 0.0431, 'y': 0.3303}, {'x': 0.16, 'y': 0.3303}, {'x': 0.16, 'y': 0.344}, {'x': 0.0431, 'y': 0.344}], [{'x': 0.4137, 'y': 0.3513}, {'x': 0.5111, 'y': 0.3513}, {'x': 0.5111, 'y': 0.3799}, {'x': 0.4137, 'y': 0.3799}], [{'x': 0.04, 'y': 0.6206}, {'x': 0.3706, 'y': 0.6206}, {'x': 0.3706, 'y': 0.6362}, {'x': 0.04, 'y': 0.6362}], [{'x': 0.0594, 'y': 0.6443}, {'x': 0.4568, 'y': 0.6443}, {'x': 0.4568, 'y': 0.79}, {'x': 0.0594, 'y': 0.79}], [{'x': 0.0405, 'y': 0.8055}, {'x': 0.2618, 'y': 0.8055}, {'x': 0.2618, 'y': 0.8224}, {'x': 0.0405, 'y': 0.8224}], [{'x': 0.0382, 'y': 0.8255}, {'x': 0.2912, 'y': 0.8255}, {'x': 0.2912, 'y': 0.8418}, {'x': 0.0382, 'y': 0.8418}], [{'x': 0.0397, 'y': 0.8425}, {'x': 0.4565, 'y': 0.8425}, {'x': 0.4565, 'y': 0.9674}, {'x': 0.0397, 'y': 0.9674}], [{'x': 0.4696, 'y': 0.6202}, {'x': 0.8881, 'y': 0.6202}, {'x': 0.8881, 'y': 0.677}, {'x': 0.4696, 'y': 0.677}], [{'x': 0.4687, 'y': 0.6784}, {'x': 0.8871, 'y': 0.6784}, {'x': 0.8871, 'y': 0.9068}, {'x': 0.4687, 'y': 0.9068}], [{'x': 0.4709, 'y': 0.9074}, {'x': 0.8883, 'y': 0.9074}, {'x': 0.8883, 'y': 0.9677}, {'x': 0.4709, 'y': 0.9677}], [{'x': 0.0402, 'y': 0.0569}, {'x': 0.0626, 'y': 0.0569}, {'x': 0.0626, 'y': 0.0679}, {'x': 0.0402, 'y': 0.0679}], [{'x': 0.3642, 'y': 0.0561}, {'x': 0.886, 'y': 0.0561}, {'x': 0.886, 'y': 0.0688}, {'x': 0.3642, 'y': 0.0688}], [{'x': 0.0575, 'y': 0.0896}, {'x': 0.4392, 'y': 0.0896}, {'x': 0.4392, 'y': 0.1167}, {'x': 0.0575, 'y': 0.1167}], [{'x': 0.0428, 'y': 0.2975}, {'x': 0.2962, 'y': 0.2975}, {'x': 0.2962, 'y': 0.3104}, {'x': 0.0428, 'y': 0.3104}], [{'x': 0.0399, 'y': 0.3274}, {'x': 0.4563, 'y': 0.3274}, {'x': 0.4563, 'y': 0.4536}, {'x': 0.0399, 'y': 0.4536}], [{'x': 0.0405, 'y': 0.4626}, {'x': 0.3744, 'y': 0.4626}, {'x': 0.3744, 'y': 0.4782}, {'x': 0.0405, 'y': 0.4782}], [{'x': 0.04, 'y': 0.4789}, {'x': 0.4554, 'y': 0.4789}, {'x': 0.4554, 'y': 0.7012}, {'x': 0.04, 'y': 0.7012}], [{'x': 0.0401, 'y': 0.7022}, {'x': 0.457, 'y': 0.7022}, {'x': 0.457, 'y': 0.8975}, {'x': 0.0401, 'y': 0.8975}], [{'x': 0.0392, 'y': 0.8962}, {'x': 0.4559, 'y': 0.8962}, {'x': 0.4559, 'y': 0.9668}, {'x': 0.0392, 'y': 0.9668}], [{'x': 0.47, 'y': 0.4427}, {'x': 0.887, 'y': 0.4427}, {'x': 0.887, 'y': 0.5277}, {'x': 0.47, 'y': 0.5277}], [{'x': 0.4698, 'y': 0.5284}, {'x': 0.8863, 'y': 0.5284}, {'x': 0.8863, 'y': 0.6269}, {'x': 0.4698, 
'y': 0.6269}], [{'x': 0.4721, 'y': 0.6281}, {'x': 0.8871, 'y': 0.6281}, {'x': 0.8871, 'y': 0.6572}, {'x': 0.4721, 'y': 0.6572}], [{'x': 0.489, 'y': 0.6678}, {'x': 0.8883, 'y': 0.6678}, {'x': 0.8883, 'y': 0.9177}, {'x': 0.489, 'y': 0.9177}], [{'x': 0.4722, 'y': 0.9233}, {'x': 0.8883, 'y': 0.9233}, {'x': 0.8883, 'y': 0.9679}, {'x': 0.4722, 'y': 0.9679}], [{'x': 0.0391, 'y': 0.0566}, {'x': 0.4329, 'y': 0.0566}, {'x': 0.4329, 'y': 0.0682}, {'x': 0.0391, 'y': 0.0682}], [{'x': 0.862, 'y': 0.0572}, {'x': 0.8842, 'y': 0.0572}, {'x': 0.8842, 'y': 0.0677}, {'x': 0.862, 'y': 0.0677}], [{'x': 0.0402, 'y': 0.0834}, {'x': 0.4553, 'y': 0.0834}, {'x': 0.4553, 'y': 0.1256}, {'x': 0.0402, 'y': 0.1256}], [{'x': 0.0399, 'y': 0.126}, {'x': 0.4562, 'y': 0.126}, {'x': 0.4562, 'y': 0.4065}, {'x': 0.0399, 'y': 0.4065}], [{'x': 0.0402, 'y': 0.4152}, {'x': 0.228, 'y': 0.4152}, {'x': 0.228, 'y': 0.43}, {'x': 0.0402, 'y': 0.43}], [{'x': 0.0396, 'y': 0.4332}, {'x': 0.4562, 'y': 0.4332}, {'x': 0.4562, 'y': 0.5027}, {'x': 0.0396, 'y': 0.5027}], [{'x': 0.0393, 'y': 0.5029}, {'x': 0.4557, 'y': 0.5029}, {'x': 0.4557, 'y': 0.6566}, {'x': 0.0393, 'y': 0.6566}], [{'x': 0.042, 'y': 0.6683}, {'x': 0.4022, 'y': 0.6683}, {'x': 0.4022, 'y': 0.683}, {'x': 0.042, 'y': 0.683}], [{'x': 0.0402, 'y': 0.684}, {'x': 0.4559, 'y': 0.684}, {'x': 0.4559, 'y': 0.7984}, {'x': 0.0402, 'y': 0.7984}], [{'x': 0.0405, 'y': 0.8089}, {'x': 0.325, 'y': 0.8089}, {'x': 0.325, 'y': 0.8239}, {'x': 0.0405, 'y': 0.8239}], [{'x': 0.04, 'y': 0.8259}, {'x': 0.4566, 'y': 0.8259}, {'x': 0.4566, 'y': 0.9668}, {'x': 0.04, 'y': 0.9668}], [{'x': 0.4704, 'y': 0.0836}, {'x': 0.8872, 'y': 0.0836}, {'x': 0.8872, 'y': 0.1417}, {'x': 0.4704, 'y': 0.1417}], [{'x': 0.4705, 'y': 0.1535}, {'x': 0.8739, 'y': 0.1535}, {'x': 0.8739, 'y': 0.1852}, {'x': 0.4705, 'y': 0.1852}], [{'x': 0.4689, 'y': 0.1865}, {'x': 0.8866, 'y': 0.1865}, {'x': 0.8866, 'y': 0.3333}, {'x': 0.4689, 'y': 0.3333}], [{'x': 0.4697, 'y': 0.3344}, {'x': 0.8861, 'y': 0.3344}, {'x': 0.8861, 'y': 0.4812}, {'x': 0.4697, 'y': 0.4812}], [{'x': 0.4691, 'y': 0.4816}, {'x': 0.8855, 'y': 0.4816}, {'x': 0.8855, 'y': 0.6567}, {'x': 0.4691, 'y': 0.6567}], [{'x': 0.469, 'y': 0.6581}, {'x': 0.8862, 'y': 0.6581}, {'x': 0.8862, 'y': 0.8164}, {'x': 0.469, 'y': 0.8164}], [{'x': 0.469, 'y': 0.8168}, {'x': 0.8868, 'y': 0.8168}, {'x': 0.8868, 'y': 0.9662}, {'x': 0.469, 'y': 0.9662}], [{'x': 0.0403, 'y': 0.0569}, {'x': 0.0627, 'y': 0.0569}, {'x': 0.0627, 'y': 0.0678}, {'x': 0.0403, 'y': 0.0678}], [{'x': 0.3648, 'y': 0.0563}, {'x': 0.8856, 'y': 0.0563}, {'x': 0.8856, 'y': 0.0684}, {'x': 0.3648, 'y': 0.0684}], [{'x': 0.0411, 'y': 0.267}, {'x': 0.4426, 'y': 0.267}, {'x': 0.4426, 'y': 0.2801}, {'x': 0.0411, 'y': 0.2801}], [{'x': 0.0413, 'y': 0.3004}, {'x': 0.3042, 'y': 0.3004}, {'x': 0.3042, 'y': 0.3154}, {'x': 0.0413, 'y': 0.3154}], [{'x': 0.0401, 'y': 0.3176}, {'x': 0.4558, 'y': 0.3176}, {'x': 0.4558, 'y': 0.4631}, {'x': 0.0401, 'y': 0.4631}], [{'x': 0.0396, 'y': 0.4817}, {'x': 0.4478, 'y': 0.4817}, {'x': 0.4478, 'y': 0.5152}, {'x': 0.0396, 'y': 0.5152}], [{'x': 0.0396, 'y': 0.5199}, {'x': 0.4562, 'y': 0.5199}, {'x': 0.4562, 'y': 0.7092}, {'x': 0.0396, 'y': 0.7092}], [{'x': 0.0414, 'y': 0.718}, {'x': 0.3824, 'y': 0.718}, {'x': 0.3824, 'y': 0.7335}, {'x': 0.0414, 'y': 0.7335}], [{'x': 0.0401, 'y': 0.736}, {'x': 0.4572, 'y': 0.736}, {'x': 0.4572, 'y': 0.8654}, {'x': 0.0401, 'y': 0.8654}], [{'x': 0.0392, 'y': 0.8756}, {'x': 0.2882, 'y': 0.8756}, {'x': 0.2882, 'y': 0.8909}, {'x': 0.0392, 'y': 0.8909}], [{'x': 0.0398, 'y': 0.893}, {'x': 
0.4558, 'y': 0.893}, {'x': 0.4558, 'y': 0.9655}, {'x': 0.0398, 'y': 0.9655}], [{'x': 0.4701, 'y': 0.2971}, {'x': 0.8544, 'y': 0.2971}, {'x': 0.8544, 'y': 0.3256}, {'x': 0.4701, 'y': 0.3256}], [{'x': 0.4689, 'y': 0.329}, {'x': 0.8866, 'y': 0.329}, {'x': 0.8866, 'y': 0.4819}, {'x': 0.4689, 'y': 0.4819}], [{'x': 0.4693, 'y': 0.4825}, {'x': 0.8871, 'y': 0.4825}, {'x': 0.8871, 'y': 0.649}, {'x': 0.4693, 'y': 0.649}], [{'x': 0.4705, 'y': 0.6499}, {'x': 0.8886, 'y': 0.6499}, {'x': 0.8886, 'y': 0.707}, {'x': 0.4705, 'y': 0.707}], [{'x': 0.5376, 'y': 0.7146}, {'x': 0.8149, 'y': 0.7146}, {'x': 0.8149, 'y': 0.7468}, {'x': 0.5376, 'y': 0.7468}], [{'x': 0.4688, 'y': 0.7518}, {'x': 0.8878, 'y': 0.7518}, {'x': 0.8878, 'y': 0.8506}, {'x': 0.4688, 'y': 0.8506}], [{'x': 0.528, 'y': 0.8568}, {'x': 0.8289, 'y': 0.8568}, {'x': 0.8289, 'y': 0.8885}, {'x': 0.528, 'y': 0.8885}], [{'x': 0.471, 'y': 0.8931}, {'x': 0.8887, 'y': 0.8931}, {'x': 0.8887, 'y': 0.9668}, {'x': 0.471, 'y': 0.9668}], [{'x': 0.0382, 'y': 0.0561}, {'x': 0.4343, 'y': 0.0561}, {'x': 0.4343, 'y': 0.0687}, {'x': 0.0382, 'y': 0.0687}], [{'x': 0.8615, 'y': 0.057}, {'x': 0.8841, 'y': 0.057}, {'x': 0.8841, 'y': 0.0678}, {'x': 0.8615, 'y': 0.0678}], [{'x': 0.084, 'y': 0.0876}, {'x': 0.8422, 'y': 0.0876}, {'x': 0.8422, 'y': 0.2931}, {'x': 0.084, 'y': 0.2931}], [{'x': 0.4323, 'y': 0.3313}, {'x': 0.4953, 'y': 0.3313}, {'x': 0.4953, 'y': 0.3431}, {'x': 0.4323, 'y': 0.3431}], [{'x': 0.0412, 'y': 0.539}, {'x': 0.3539, 'y': 0.539}, {'x': 0.3539, 'y': 0.5526}, {'x': 0.0412, 'y': 0.5526}], [{'x': 0.1069, 'y': 0.5672}, {'x': 0.3875, 'y': 0.5672}, {'x': 0.3875, 'y': 0.5975}, {'x': 0.1069, 'y': 0.5975}], [{'x': 0.0396, 'y': 0.603}, {'x': 0.4568, 'y': 0.603}, {'x': 0.4568, 'y': 0.6647}, {'x': 0.0396, 'y': 0.6647}], [{'x': 0.1034, 'y': 0.6708}, {'x': 0.3881, 'y': 0.6708}, {'x': 0.3881, 'y': 0.7064}, {'x': 0.1034, 'y': 0.7064}], [{'x': 0.0982, 'y': 0.7775}, {'x': 0.3965, 'y': 0.7775}, {'x': 0.3965, 'y': 0.8095}, {'x': 0.0982, 'y': 0.8095}], [{'x': 0.0395, 'y': 0.8143}, {'x': 0.4543, 'y': 0.8143}, {'x': 0.4543, 'y': 0.8455}, {'x': 0.0395, 'y': 0.8455}], [{'x': 0.1097, 'y': 0.8515}, {'x': 0.3835, 'y': 0.8515}, {'x': 0.3835, 'y': 0.8824}, {'x': 0.1097, 'y': 0.8824}], [{'x': 0.0402, 'y': 0.8874}, {'x': 0.4551, 'y': 0.8874}, {'x': 0.4551, 'y': 0.9186}, {'x': 0.0402, 'y': 0.9186}], [{'x': 0.1066, 'y': 0.9247}, {'x': 0.388, 'y': 0.9247}, {'x': 0.388, 'y': 0.9615}, {'x': 0.1066, 'y': 0.9615}], [{'x': 0.4731, 'y': 0.5746}, {'x': 0.8868, 'y': 0.5746}, {'x': 0.8868, 'y': 0.6062}, {'x': 0.4731, 'y': 0.6062}], [{'x': 0.4719, 'y': 0.6269}, {'x': 0.8347, 'y': 0.6269}, {'x': 0.8347, 'y': 0.6439}, {'x': 0.4719, 'y': 0.6439}], [{'x': 0.4693, 'y': 0.6475}, {'x': 0.8866, 'y': 0.6475}, {'x': 0.8866, 'y': 0.7635}, {'x': 0.4693, 'y': 0.7635}], [{'x': 0.4705, 'y': 0.7711}, {'x': 0.7643, 'y': 0.7711}, {'x': 0.7643, 'y': 0.7865}, {'x': 0.4705, 'y': 0.7865}], [{'x': 0.4707, 'y': 0.788}, {'x': 0.8877, 'y': 0.788}, {'x': 0.8877, 'y': 0.8346}, {'x': 0.4707, 'y': 0.8346}], [{'x': 0.4693, 'y': 0.8358}, {'x': 0.8875, 'y': 0.8358}, {'x': 0.8875, 'y': 0.9669}, {'x': 0.4693, 'y': 0.9669}], [{'x': 0.04, 'y': 0.0568}, {'x': 0.0631, 'y': 0.0568}, {'x': 0.0631, 'y': 0.0683}, {'x': 0.04, 'y': 0.0683}], [{'x': 0.3648, 'y': 0.0562}, {'x': 0.8866, 'y': 0.0562}, {'x': 0.8866, 'y': 0.0686}, {'x': 0.3648, 'y': 0.0686}], [{'x': 0.0838, 'y': 0.0903}, {'x': 0.8474, 'y': 0.0903}, {'x': 0.8474, 'y': 0.3804}, {'x': 0.0838, 'y': 0.3804}], [{'x': 0.0381, 'y': 0.3909}, {'x': 0.889, 'y': 0.3909}, {'x': 0.889, 'y': 0.4197}, 
{'x': 0.0381, 'y': 0.4197}], [{'x': 0.04, 'y': 0.4328}, {'x': 0.4337, 'y': 0.4328}, {'x': 0.4337, 'y': 0.4611}, {'x': 0.04, 'y': 0.4611}], [{'x': 0.0397, 'y': 0.4638}, {'x': 0.4562, 'y': 0.4638}, {'x': 0.4562, 'y': 0.6431}, {'x': 0.0397, 'y': 0.6431}], [{'x': 0.0389, 'y': 0.6428}, {'x': 0.4553, 'y': 0.6428}, {'x': 0.4553, 'y': 0.7556}, {'x': 0.0389, 'y': 0.7556}], [{'x': 0.0397, 'y': 0.7549}, {'x': 0.4558, 'y': 0.7549}, {'x': 0.4558, 'y': 0.9662}, {'x': 0.0397, 'y': 0.9662}], [{'x': 0.4767, 'y': 0.4324}, {'x': 0.8277, 'y': 0.4324}, {'x': 0.8277, 'y': 0.4754}, {'x': 0.4767, 'y': 0.4754}], [{'x': 0.4692, 'y': 0.4778}, {'x': 0.8866, 'y': 0.4778}, {'x': 0.8866, 'y': 0.6434}, {'x': 0.4692, 'y': 0.6434}], [{'x': 0.4698, 'y': 0.644}, {'x': 0.8868, 'y': 0.644}, {'x': 0.8868, 'y': 0.7841}, {'x': 0.4698, 'y': 0.7841}], [{'x': 0.4713, 'y': 0.8013}, {'x': 0.8332, 'y': 0.8013}, {'x': 0.8332, 'y': 0.8338}, {'x': 0.4713, 'y': 0.8338}], [{'x': 0.4706, 'y': 0.8379}, {'x': 0.8735, 'y': 0.8379}, {'x': 0.8735, 'y': 0.8668}, {'x': 0.4706, 'y': 0.8668}], [{'x': 0.4696, 'y': 0.8685}, {'x': 0.8879, 'y': 0.8685}, {'x': 0.8879, 'y': 0.9666}, {'x': 0.4696, 'y': 0.9666}], [{'x': 0.0382, 'y': 0.0565}, {'x': 0.4341, 'y': 0.0565}, {'x': 0.4341, 'y': 0.0684}, {'x': 0.0382, 'y': 0.0684}], [{'x': 0.8615, 'y': 0.0571}, {'x': 0.8825, 'y': 0.0571}, {'x': 0.8825, 'y': 0.0676}, {'x': 0.8615, 'y': 0.0676}], [{'x': 0.4311, 'y': 0.087}, {'x': 0.4958, 'y': 0.087}, {'x': 0.4958, 'y': 0.0997}, {'x': 0.4311, 'y': 0.0997}], [{'x': 0.0972, 'y': 0.0998}, {'x': 0.8283, 'y': 0.0998}, {'x': 0.8283, 'y': 0.1141}, {'x': 0.0972, 'y': 0.1141}], [{'x': 0.4084, 'y': 0.184}, {'x': 0.5175, 'y': 0.184}, {'x': 0.5175, 'y': 0.2116}, {'x': 0.4084, 'y': 0.2116}], [{'x': 0.0416, 'y': 0.3636}, {'x': 0.512, 'y': 0.3636}, {'x': 0.512, 'y': 0.3768}, {'x': 0.0416, 'y': 0.3768}], [{'x': 0.0746, 'y': 0.3803}, {'x': 0.8535, 'y': 0.3803}, {'x': 0.8535, 'y': 0.636}, {'x': 0.0746, 'y': 0.636}], [{'x': 0.0395, 'y': 0.6503}, {'x': 0.8874, 'y': 0.6503}, {'x': 0.8874, 'y': 0.6784}, {'x': 0.0395, 'y': 0.6784}], [{'x': 0.0407, 'y': 0.6932}, {'x': 0.4542, 'y': 0.6932}, {'x': 0.4542, 'y': 0.7356}, {'x': 0.0407, 'y': 0.7356}], [{'x': 0.0394, 'y': 0.7337}, {'x': 0.4555, 'y': 0.7337}, {'x': 0.4555, 'y': 0.8507}, {'x': 0.0394, 'y': 0.8507}], [{'x': 0.0393, 'y': 0.8519}, {'x': 0.4562, 'y': 0.8519}, {'x': 0.4562, 'y': 0.9674}, {'x': 0.0393, 'y': 0.9674}], [{'x': 0.4689, 'y': 0.6932}, {'x': 0.8861, 'y': 0.6932}, {'x': 0.8861, 'y': 0.7805}, {'x': 0.4689, 'y': 0.7805}], [{'x': 0.4689, 'y': 0.7806}, {'x': 0.8863, 'y': 0.7806}, {'x': 0.8863, 'y': 0.8808}, {'x': 0.4689, 'y': 0.8808}], [{'x': 0.4714, 'y': 0.8811}, {'x': 0.8876, 'y': 0.8811}, {'x': 0.8876, 'y': 0.9679}, {'x': 0.4714, 'y': 0.9679}], [{'x': 0.04, 'y': 0.0567}, {'x': 0.0631, 'y': 0.0567}, {'x': 0.0631, 'y': 0.0678}, {'x': 0.04, 'y': 0.0678}], [{'x': 0.3648, 'y': 0.056}, {'x': 0.8857, 'y': 0.056}, {'x': 0.8857, 'y': 0.0691}, {'x': 0.3648, 'y': 0.0691}], [{'x': 0.0777, 'y': 0.0872}, {'x': 0.8532, 'y': 0.0872}, {'x': 0.8532, 'y': 0.3559}, {'x': 0.0777, 'y': 0.3559}], [{'x': 0.0409, 'y': 0.3699}, {'x': 0.8314, 'y': 0.3699}, {'x': 0.8314, 'y': 0.4161}, {'x': 0.0409, 'y': 0.4161}], [{'x': 0.0419, 'y': 0.5147}, {'x': 0.5423, 'y': 0.5147}, {'x': 0.5423, 'y': 0.5285}, {'x': 0.0419, 'y': 0.5285}], [{'x': 0.0402, 'y': 0.5459}, {'x': 0.4553, 'y': 0.5459}, {'x': 0.4553, 'y': 0.6147}, {'x': 0.0402, 'y': 0.6147}], [{'x': 0.0405, 'y': 0.6231}, {'x': 0.3155, 'y': 0.6231}, {'x': 0.3155, 'y': 0.6381}, {'x': 0.0405, 'y': 0.6381}], [{'x': 
0.0405, 'y': 0.6411}, {'x': 0.4557, 'y': 0.6411}, {'x': 0.4557, 'y': 0.6943}, {'x': 0.0405, 'y': 0.6943}], [{'x': 0.0397, 'y': 0.6946}, {'x': 0.4555, 'y': 0.6946}, {'x': 0.4555, 'y': 0.8585}, {'x': 0.0397, 'y': 0.8585}], [{'x': 0.0397, 'y': 0.859}, {'x': 0.4565, 'y': 0.859}, {'x': 0.4565, 'y': 0.9664}, {'x': 0.0397, 'y': 0.9664}], [{'x': 0.4685, 'y': 0.5459}, {'x': 0.8858, 'y': 0.5459}, {'x': 0.8858, 'y': 0.6995}, {'x': 0.4685, 'y': 0.6995}], [{'x': 0.4691, 'y': 0.6995}, {'x': 0.8868, 'y': 0.6995}, {'x': 0.8868, 'y': 0.8672}, {'x': 0.4691, 'y': 0.8672}], [{'x': 0.4699, 'y': 0.8872}, {'x': 0.5992, 'y': 0.8872}, {'x': 0.5992, 'y': 0.9035}, {'x': 0.4699, 'y': 0.9035}], [{'x': 0.4713, 'y': 0.909}, {'x': 0.8878, 'y': 0.909}, {'x': 0.8878, 'y': 0.9665}, {'x': 0.4713, 'y': 0.9665}], [{'x': 0.0385, 'y': 0.0559}, {'x': 0.4338, 'y': 0.0559}, {'x': 0.4338, 'y': 0.0689}, {'x': 0.0385, 'y': 0.0689}], [{'x': 0.8616, 'y': 0.0572}, {'x': 0.884, 'y': 0.0572}, {'x': 0.884, 'y': 0.0676}, {'x': 0.8616, 'y': 0.0676}], [{'x': 0.2632, 'y': 0.0886}, {'x': 0.6659, 'y': 0.0886}, {'x': 0.6659, 'y': 0.1166}, {'x': 0.2632, 'y': 0.1166}], [{'x': 0.0404, 'y': 0.468}, {'x': 0.888, 'y': 0.468}, {'x': 0.888, 'y': 0.5046}, {'x': 0.0404, 'y': 0.5046}], [{'x': 0.0394, 'y': 0.5221}, {'x': 0.4563, 'y': 0.5221}, {'x': 0.4563, 'y': 0.5904}, {'x': 0.0394, 'y': 0.5904}], [{'x': 0.0609, 'y': 0.5995}, {'x': 0.4554, 'y': 0.5995}, {'x': 0.4554, 'y': 0.97}, {'x': 0.0609, 'y': 0.97}], [{'x': 0.5196, 'y': 0.5224}, {'x': 0.8878, 'y': 0.5224}, {'x': 0.8878, 'y': 0.5948}, {'x': 0.5196, 'y': 0.5948}], [{'x': 0.4703, 'y': 0.6037}, {'x': 0.7623, 'y': 0.6037}, {'x': 0.7623, 'y': 0.6188}, {'x': 0.4703, 'y': 0.6188}], [{'x': 0.4699, 'y': 0.6211}, {'x': 0.8877, 'y': 0.6211}, {'x': 0.8877, 'y': 0.7321}, {'x': 0.4699, 'y': 0.7321}], [{'x': 0.4707, 'y': 0.7317}, {'x': 0.8853, 'y': 0.7317}, {'x': 0.8853, 'y': 0.8145}, {'x': 0.4707, 'y': 0.8145}], [{'x': 0.4709, 'y': 0.8145}, {'x': 0.8874, 'y': 0.8145}, {'x': 0.8874, 'y': 0.8844}, {'x': 0.4709, 'y': 0.8844}], [{'x': 0.4729, 'y': 0.8929}, {'x': 0.8336, 'y': 0.8929}, {'x': 0.8336, 'y': 0.9218}, {'x': 0.4729, 'y': 0.9218}], [{'x': 0.4706, 'y': 0.9247}, {'x': 0.8879, 'y': 0.9247}, {'x': 0.8879, 'y': 0.9675}, {'x': 0.4706, 'y': 0.9675}], [{'x': 0.0399, 'y': 0.0567}, {'x': 0.0629, 'y': 0.0567}, {'x': 0.0629, 'y': 0.0678}, {'x': 0.0399, 'y': 0.0678}], [{'x': 0.3646, 'y': 0.0562}, {'x': 0.8862, 'y': 0.0562}, {'x': 0.8862, 'y': 0.0686}, {'x': 0.3646, 'y': 0.0686}], [{'x': 0.0758, 'y': 0.0909}, {'x': 0.8512, 'y': 0.0909}, {'x': 0.8512, 'y': 0.3792}, {'x': 0.0758, 'y': 0.3792}], [{'x': 0.0384, 'y': 0.3884}, {'x': 0.8871, 'y': 0.3884}, {'x': 0.8871, 'y': 0.4177}, {'x': 0.0384, 'y': 0.4177}], [{'x': 0.0399, 'y': 0.4324}, {'x': 0.4568, 'y': 0.4324}, {'x': 0.4568, 'y': 0.546}, {'x': 0.0399, 'y': 0.546}], [{'x': 0.0397, 'y': 0.5464}, {'x': 0.457, 'y': 0.5464}, {'x': 0.457, 'y': 0.8321}, {'x': 0.0397, 'y': 0.8321}], [{'x': 0.0399, 'y': 0.84}, {'x': 0.2277, 'y': 0.84}, {'x': 0.2277, 'y': 0.8552}, {'x': 0.0399, 'y': 0.8552}], [{'x': 0.0411, 'y': 0.8581}, {'x': 0.4323, 'y': 0.8581}, {'x': 0.4323, 'y': 0.8725}, {'x': 0.0411, 'y': 0.8725}], [{'x': 0.0609, 'y': 0.8821}, {'x': 0.4575, 'y': 0.8821}, {'x': 0.4575, 'y': 0.9675}, {'x': 0.0609, 'y': 0.9675}], [{'x': 0.5199, 'y': 0.433}, {'x': 0.8844, 'y': 0.433}, {'x': 0.8844, 'y': 0.4474}, {'x': 0.5199, 'y': 0.4474}], [{'x': 0.4909, 'y': 0.5311}, {'x': 0.5052, 'y': 0.5311}, {'x': 0.5052, 'y': 0.5449}, {'x': 0.4909, 'y': 0.5449}], [{'x': 0.4909, 'y': 0.6437}, {'x': 0.5052, 'y': 
0.6437}, {'x': 0.5052, 'y': 0.6575}, {'x': 0.4909, 'y': 0.6575}], [{'x': 0.4909, 'y': 0.7978}, {'x': 0.5052, 'y': 0.7978}, {'x': 0.5052, 'y': 0.8116}, {'x': 0.4909, 'y': 0.8116}], [{'x': 0.4909, 'y': 0.9382}, {'x': 0.5052, 'y': 0.9382}, {'x': 0.5052, 'y': 0.9519}, {'x': 0.4909, 'y': 0.9519}], [{'x': 0.4984, 'y': 0.449}, {'x': 0.8901, 'y': 0.449}, {'x': 0.8901, 'y': 0.9717}, {'x': 0.4984, 'y': 0.9717}], [{'x': 0.0388, 'y': 0.0565}, {'x': 0.4333, 'y': 0.0565}, {'x': 0.4333, 'y': 0.0682}, {'x': 0.0388, 'y': 0.0682}], [{'x': 0.8616, 'y': 0.0571}, {'x': 0.8838, 'y': 0.0571}, {'x': 0.8838, 'y': 0.0678}, {'x': 0.8616, 'y': 0.0678}], [{'x': 0.0612, 'y': 0.1396}, {'x': 0.0754, 'y': 0.1396}, {'x': 0.0754, 'y': 0.1534}, {'x': 0.0612, 'y': 0.1534}], [{'x': 0.0794, 'y': 0.0849}, {'x': 0.4571, 'y': 0.0849}, {'x': 0.4571, 'y': 0.3725}, {'x': 0.0794, 'y': 0.3725}], [{'x': 0.0404, 'y': 0.3874}, {'x': 0.3265, 'y': 0.3874}, {'x': 0.3265, 'y': 0.4038}, {'x': 0.0404, 'y': 0.4038}], [{'x': 0.0402, 'y': 0.4081}, {'x': 0.4568, 'y': 0.4081}, {'x': 0.4568, 'y': 0.6268}, {'x': 0.0402, 'y': 0.6268}], [{'x': 0.039, 'y': 0.6294}, {'x': 0.4565, 'y': 0.6294}, {'x': 0.4565, 'y': 0.6999}, {'x': 0.039, 'y': 0.6999}], [{'x': 0.0402, 'y': 0.7006}, {'x': 0.4555, 'y': 0.7006}, {'x': 0.4555, 'y': 0.7424}, {'x': 0.0402, 'y': 0.7424}], [{'x': 0.0589, 'y': 0.7535}, {'x': 0.4576, 'y': 0.7535}, {'x': 0.4576, 'y': 0.9225}, {'x': 0.0589, 'y': 0.9225}], [{'x': 0.0402, 'y': 0.9238}, {'x': 0.4567, 'y': 0.9238}, {'x': 0.4567, 'y': 0.9672}, {'x': 0.0402, 'y': 0.9672}], [{'x': 0.4698, 'y': 0.0842}, {'x': 0.8854, 'y': 0.0842}, {'x': 0.8854, 'y': 0.113}, {'x': 0.4698, 'y': 0.113}], [{'x': 0.4702, 'y': 0.1333}, {'x': 0.6374, 'y': 0.1333}, {'x': 0.6374, 'y': 0.1483}, {'x': 0.4702, 'y': 0.1483}], [{'x': 0.4687, 'y': 0.1525}, {'x': 0.8878, 'y': 0.1525}, {'x': 0.8878, 'y': 0.3136}, {'x': 0.4687, 'y': 0.3136}], [{'x': 0.471, 'y': 0.3343}, {'x': 0.5778, 'y': 0.3343}, {'x': 0.5778, 'y': 0.3493}, {'x': 0.471, 'y': 0.3493}], [{'x': 0.4666, 'y': 0.353}, {'x': 0.8917, 'y': 0.353}, {'x': 0.8917, 'y': 0.9645}, {'x': 0.4666, 'y': 0.9645}], [{'x': 0.0399, 'y': 0.0568}, {'x': 0.0627, 'y': 0.0568}, {'x': 0.0627, 'y': 0.0678}, {'x': 0.0399, 'y': 0.0678}], [{'x': 0.3637, 'y': 0.0565}, {'x': 0.8848, 'y': 0.0565}, {'x': 0.8848, 'y': 0.0685}, {'x': 0.3637, 'y': 0.0685}], [{'x': 0.0341, 'y': 0.083}, {'x': 0.4659, 'y': 0.083}, {'x': 0.4659, 'y': 0.9675}, {'x': 0.0341, 'y': 0.9675}], [{'x': 0.4686, 'y': 0.0804}, {'x': 0.8906, 'y': 0.0804}, {'x': 0.8906, 'y': 0.4112}, {'x': 0.4686, 'y': 0.4112}], [{'x': 0.474, 'y': 0.4209}, {'x': 0.5894, 'y': 0.4209}, {'x': 0.5894, 'y': 0.5338}, {'x': 0.474, 'y': 0.5338}], [{'x': 0.6054, 'y': 0.4222}, {'x': 0.8878, 'y': 0.4222}, {'x': 0.8878, 'y': 0.5257}, {'x': 0.6054, 'y': 0.5257}], [{'x': 0.4723, 'y': 0.5661}, {'x': 0.5967, 'y': 0.5661}, {'x': 0.5967, 'y': 0.6795}, {'x': 0.4723, 'y': 0.6795}], [{'x': 0.4713, 'y': 0.6795}, {'x': 0.5978, 'y': 0.6795}, {'x': 0.5978, 'y': 0.6915}, {'x': 0.4713, 'y': 0.6915}], [{'x': 0.5998, 'y': 0.5645}, {'x': 0.8872, 'y': 0.5645}, {'x': 0.8872, 'y': 0.6906}, {'x': 0.5998, 'y': 0.6906}], [{'x': 0.6042, 'y': 0.7028}, {'x': 0.8871, 'y': 0.7028}, {'x': 0.8871, 'y': 0.75}, {'x': 0.6042, 'y': 0.75}], [{'x': 0.473, 'y': 0.7031}, {'x': 0.5912, 'y': 0.7031}, {'x': 0.5912, 'y': 0.8164}, {'x': 0.473, 'y': 0.8164}], [{'x': 0.4705, 'y': 0.8425}, {'x': 0.8876, 'y': 0.8425}, {'x': 0.8876, 'y': 0.8673}, {'x': 0.4705, 'y': 0.8673}]]}, page_content=\"br> The SZZ algorithm then identifies the bug-introducing change 
associated with the bug fix in revision 3. It starts by computing the delta between revisions 3 and 2, yielding KIM ET AL.: CLASSIFYING SOFTWARE CHANGES: CLEAN OR BUGGY?
line 3. SZZ then uses the SCM annotate data to determine the initial origin of line 3 at revision 2. This is revision 1, the bug-introducing change.
One assumption of the presentation so far is that a bug is repaired in a single bug-fix change. What happens when a bug is repaired across multiple commits? There\")"
+ ]
+ },
+ "metadata": {},
+ "execution_count": 93
+ }
+ ],
+ "source": [
+ "from langchain_community.retrievers import BM25Retriever\n",
+ "\n",
+ "retriever = BM25Retriever.from_documents(splits)\n",
+ "retriever.invoke(\"What is bug classficiation?\")[0]"
+ ]
+ },
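+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "BM25 is a purely lexical ranker: chunks are scored by term frequency and inverse document frequency, so only chunks that share keywords with the query can surface. By default `BM25Retriever` returns its top 4 matches; the sketch below builds a second retriever that returns more, trading a longer prompt for better recall. (Passing `k` through `from_documents` is an assumption to verify against your `langchain_community` version.)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Sketch: a wider BM25 retriever that returns 8 chunks instead of the\n",
+ "# default 4. More chunks raise recall but lengthen the prompt.\n",
+ "wide_retriever = BM25Retriever.from_documents(splits, k=8)\n",
+ "print(len(wide_retriever.invoke(\"bug classification\")))"
+ ]
+ },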
{
- "data": {
- "text/plain": [
- "[Document(page_content=\" One assumption of the presentation SO far is that a bug is repaired in a single bug-fix change. What happens when a bug is repaired across multiple commits? There are two cases. In the first case, a bug repair is split across multiple commits, with each commit modifying a separate section of the code (code sections are disjoint). Each separate change is tracked back to its initial bug-introducing change, which is then used to train the SVM classifier. In the second case, a bug fix occurs incrementally over multiple commits, with some later fixes modifying earlier ones (the fix code partially overlaps). The first patch in an overlapping code section would be traced back to the original bug-introducing change. Later modifications would not be traced back to the original bug-introducing change. Instead, they would be traced back to an intermediate modification, which is identified as bug\", metadata={'total_pages': 16, 'type': 'html', 'split': 'none'}),\n",
- " Document(page_content=' to an intermediate modification, which is identified as bug introducing. This is appropriate since the intermediate modification did not correctly fix the bug and, hence, is simultaneously a bug fix and buggy. In this case, the classifier is being trained with the attributes of the buggy intermediate commit, a valid bug-introducing change. ', metadata={'total_pages': 16, 'type': 'html', 'split': 'none'}),\n",
- " Document(page_content=\"6. Bug tracking systems for tracking new functional- ities were used. In two of the systems examined, that is, Bugzilla and Scarab, the projects used bug tracking systems to also track new functionality additions to the project. For these projects, the meaning of a bug tracking identifier in the change log message is that either a bug was fixed or a new functionality is added. This substantially increases the number of changes flagged as bug fixes. For these systems, the interpretation of a positive classification output is a change that is either buggy or a new functionality. When using this algorithm, care needs to be taken to understand the meaning of changes identified as bugs and, wherever possible, to ensure that only truly buggy changes are flagged as being buggy. 9 CONCLUSION AND OPEN ISSUES \", metadata={'total_pages': 16, 'type': 'html', 'split': 'none'}),\n",
- " Document(page_content=\"A bug in the source code leads to an unintended state within the executing software and this corrupted state eventually results in an undesired external behavior. This is logged in a bug report message in a change tracking system, often many months after the initial injection of the bug into the software. By the time that a developer receives the bug report, she must spend time to reacquaint herself with the source code and the recent changes made to it. This is time consuming.
If there were a tool that could accurately predict whether a change is buggy or clean immediately after a change was made, it would enable developers to take steps to fix the \", metadata={'total_pages': 16, 'type': 'html', 'split': 'none'})]"
+ "cell_type": "code",
+ "execution_count": 99,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 70
+ },
+ "id": "POx3WUkOCDz9",
+ "outputId": "f137209c-f177-4fc5-f99b-da8f01c21458"
+ },
+ "outputs": [
+ {
+ "output_type": "execute_result",
+ "data": {
+ "text/plain": [
+ "'Bug classification is the process of classifying software changes as either clean or buggy, and it requires about 100 changes to train a project-specific classification model before the predictive accuracy achieves a \"usable\" level of accuracy. The information is present in the context.'"
+ ],
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ }
+ },
+ "metadata": {},
+ "execution_count": 99
+ }
+ ],
+ "source": [
+ "# Query 1 : using the whold page_content meta data\n",
+ "query = \"What is bug classficiation?\"\n",
+ "context_docs = retriever.invoke(query)\n",
+ "\n",
+ "context_pagecontent=\"\"\n",
+ "for context_doc in context_docs:\n",
+ " context_pagecontent += context_doc.page_content\n",
+ "chain.invoke({\"question\": query, \"Context\": context_pagecontent})"
]
- },
- "execution_count": 9,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
- "source": [
- "retriever = BM25Retriever.from_documents(splits)\n",
- "retriever.invoke(\"What is bug classficiation?\")"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [
+ },
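+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Concatenating chunks with no separator can glue the end of one sentence to the start of the next. A minimal variant (sketch) joins the retrieved chunks with a visible delimiter, which also makes the assembled prompt easier to inspect:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Variant (sketch): join retrieved chunks with a visible separator so\n",
+ "# chunk boundaries stay readable in the prompt sent to the model.\n",
+ "context_pagecontent = \"\\n---\\n\".join(d.page_content for d in context_docs)\n",
+ "print(context_pagecontent[:300])"
+ ]
+ },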
{
- "data": {
- "text/plain": [
- "'The information is not present in the context.'"
+ "cell_type": "code",
+ "execution_count": 94,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 35
+ },
+ "id": "1RR5Y4iYCDz-",
+ "outputId": "6b5d1c70-a7c1-4db0-d4f2-c2d8c78b6984"
+ },
+ "outputs": [
+ {
+ "output_type": "execute_result",
+ "data": {
+ "text/plain": [
+ "'The information is not present in the context.'"
+ ],
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ }
+ },
+ "metadata": {},
+ "execution_count": 94
+ }
+ ],
+ "source": [
+ "# Query 2 : using the whold page_content meta data\n",
+ "query = \"How do find the bug?\"\n",
+ "context_docs = retriever.invoke(query)\n",
+ "\n",
+ "context_page_content=\"\"\n",
+ "for context_doc in context_docs:\n",
+ " context_pagecontent += context_doc.page_content\n",
+ "\n",
+ "chain.invoke({\"question\": query, \"Context\": context_page_content})"
]
- },
- "execution_count": 10,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
- "source": [
- "query = \"What is bug classficiation?\"\n",
- "context_docs = retriever.invoke(query)\n",
- "chain.invoke({\"question\": query, \"Context\": context_docs})"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [
+ },
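+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "When the chain answers that the information is not present, it is worth checking what BM25 actually retrieved: with raw HTML chunks and little keyword overlap, the context may simply not contain the answer. A quick look (sketch):"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Debug aid (sketch): preview the chunks BM25 retrieved for the query,\n",
+ "# to judge whether the answer could plausibly be in the context.\n",
+ "for d in retriever.invoke(\"How do we find the bug?\"):\n",
+ "    print(d.page_content[:100].replace(\"\\n\", \" \"), \"...\")"
+ ]
+ },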
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "_Dp6QbTMCDz-"
+ },
+ "source": [
+ "# Excercise\n",
+ "It seems keyword search is not the best for LLM queries. What are some alternatives?"
+ ]
+ },
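+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "One common alternative is dense (semantic) retrieval: embed the query and the chunks into the same vector space and rank chunks by cosine similarity, so paraphrases can match even without shared keywords. The sketch below uses `UpstageEmbeddings` from the already-installed `langchain-upstage` package; the model name `solar-embedding-1-large` is an assumption, so substitute whichever embedding model your account exposes."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Sketch: dense retrieval with embeddings and cosine similarity.\n",
+ "# The embedding model name below is an assumption; check the docs.\n",
+ "import numpy as np\n",
+ "from langchain_upstage import UpstageEmbeddings\n",
+ "\n",
+ "embeddings = UpstageEmbeddings(model=\"solar-embedding-1-large\")\n",
+ "\n",
+ "# Embed every chunk once, then rank by cosine similarity to the query.\n",
+ "doc_vecs = np.array(embeddings.embed_documents([s.page_content for s in splits]))\n",
+ "query_vec = np.array(embeddings.embed_query(\"What is bug classification?\"))\n",
+ "\n",
+ "scores = doc_vecs @ query_vec / (\n",
+ "    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)\n",
+ ")\n",
+ "for i in np.argsort(scores)[::-1][:4]:\n",
+ "    print(f\"{scores[i]:.3f}\", splits[i].page_content[:80])"
+ ]
+ },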
+ {
+ "cell_type": "code",
+ "source": [
+ "# Query 1 : extracting the relevant text by parsing html\n",
+ "from bs4 import BeautifulSoup\n",
+ "\n",
+ "query = \"What is bug classficiation?\"\n",
+ "context_docs = retriever.invoke(query)\n",
+ "\n",
+ "context=\"\"\n",
+ "for context_doc in context_docs:\n",
+ " soup = BeautifulSoup(context_doc.page_content, 'html.parser')\n",
+ " text_content = ''.join([element.get_text() for element in soup.find_all(['p'])])\n",
+ "\n",
+ " context += text_content\n",
+ "\n",
+ "chain.invoke({\"question\": \"What is bug classficiation?\", \"Context\": context})"
+ ],
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 122
+ },
+ "id": "SJ7fLvvtJYqp",
+ "outputId": "73e345d3-1764-4ffe-ff79-979538b67f2d"
+ },
+ "execution_count": 76,
+ "outputs": [
+ {
+ "output_type": "execute_result",
+ "data": {
+ "text/plain": [
+ "'Bug classification is the process of categorizing changes in software as bug-fixing changes or not. This is typically done using a classification technique, such as an SVM classifier, which requires about 100 changes to train a project-specific model before achieving a \"usable\" level of accuracy. In the context of bug tracking systems, one potential issue is that these systems are often used to record both bug reports and new feature additions, causing new feature changes to be identified as bug-fix changes. This can increase the number of changes flagged as bug fixes, and when using this algorithm, care needs to be taken to understand the meaning of changes identified as bugs.'"
+ ],
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ }
+ },
+ "metadata": {},
+ "execution_count": 76
+ }
+ ]
+ },
{
- "data": {
- "text/plain": [
- "'Bug classification is a process that involves classifying changes and predicting whether there is a bug in any of the lines that were changed in one file in one SCM commit transaction. It is different from bug prediction work that focuses on finding prediction or regression models to identify fault-prone or buggy modules, files, and functions. Bug classification allows for immediate bug predictions since it can predict buggy changes as soon as a change is made.'"
+ "cell_type": "code",
+ "execution_count": 80,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 104
+ },
+ "id": "4SlQZPLxCDz9",
+ "outputId": "b644dc14-34df-4e3f-f924-97a1cda17b08"
+ },
+ "outputs": [
+ {
+ "output_type": "execute_result",
+ "data": {
+ "text/plain": [
+ "'To find the bug, Pan et al. use metrics computed over software slice data in conjunction with machine learning algorithms to find bug-prone software files or functions. They try to find faults in the whole code. Another approach focuses on file changes. One thread of research attempts to find buggy or clean code patterns in the history of development of a software project, like Williams and Hollingsworth, who use project histories to improve existing bug-finding tools and remove false positives from their approach.'"
+ ],
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ }
+ },
+ "metadata": {},
+ "execution_count": 80
+ }
+ ],
+ "source": [
+ "# Query 2 : extracting the relevant text by parsing html\n",
+ "from bs4 import BeautifulSoup\n",
+ "\n",
+ "query = \"How do find the bug?\"\n",
+ "context_docs = retriever.invoke(query)\n",
+ "\n",
+ "context=\"\"\n",
+ "for context_doc in context_docs:\n",
+ " soup = BeautifulSoup(context_doc.page_content, 'html.parser')\n",
+ " text_content = ''.join([element.get_text() for element in soup.find_all(['p'])])\n",
+ "\n",
+ " context += text_content\n",
+ "\n",
+ "chain.invoke({\"question\": \"How do find the bug?\", \"Context\": context})"
]
- },
- "execution_count": 11,
- "metadata": {},
- "output_type": "execute_result"
}
- ],
- "source": [
- "query = \"How do find the bug?\"\n",
- "context_docs = retriever.invoke(query)\n",
- "chain.invoke({\"question\": query, \"Context\": context_docs})"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# Excercise \n",
- "It seems keyword search is not the best for LLM queries. What are some alternatives?"
- ]
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3 (ipykernel)",
- "language": "python",
- "name": "python3"
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.9.6"
+ },
+ "colab": {
+ "provenance": []
+ }
},
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.9.6"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 4
-}
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
\ No newline at end of file