error after fresh installation #14
This is related to #13: the one-click installer initiates recovery of the downloaded snapshot into Qdrant, but it fails with an error. For what it's worth, I was eventually able to manually populate the data from the snapshot using the dockerized version of Qdrant, so maybe the issue lies somewhere in the Windows-native version of Qdrant?
I'm trying to hunt down that issue, but I haven't been able to reproduce it on any of my systems. I would be very happy to see it happen first-hand on some system to get an idea of where it is coming from. Maybe I should add a path to the one-click installer that assumes a Qdrant running in Docker and pushes the data there. But I still hope that if I can touch the issue first-hand, I might be able to find a fix.
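For reference, the manual workaround described above (restoring the snapshot into a dockerized Qdrant) can be sketched against Qdrant's REST snapshot-recovery endpoint, `PUT /collections/{name}/snapshots/recover`. The snapshot path and localhost URL below are assumptions for this particular setup, not something the installer itself does:

```python
# Hedged sketch: recover a downloaded snapshot into a (dockerized) Qdrant
# via the REST snapshot-recovery endpoint. The snapshot location and base
# URL are assumptions for this setup; the port is Qdrant's default.
import json
import urllib.request


def build_recover_request(collection: str, location: str,
                          base_url: str = "http://localhost:6333"):
    """Build URL and JSON body for PUT /collections/{collection}/snapshots/recover."""
    url = f"{base_url}/collections/{collection}/snapshots/recover"
    body = json.dumps({"location": location}).encode("utf-8")
    return url, body


def recover_snapshot(collection: str, location: str,
                     base_url: str = "http://localhost:6333") -> dict:
    """Ask Qdrant to restore the collection from a snapshot file or URL."""
    url, body = build_recover_request(collection, location, base_url)
    req = urllib.request.Request(url, data=body, method="PUT",
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # e.g. a snapshot file mounted into the Qdrant container at /qdrant/snapshots
    print(recover_snapshot("prompts_large_meta",
                           "file:///qdrant/snapshots/prompts_large_meta.snapshot"))
```

The `file://` location must be a path visible to the Qdrant server process (inside the container for a Docker setup), which is one reason the Docker route can behave differently from the Windows-native one.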
Command '"C:\Aplicativozinhos\prompt_quill\llama_index_pq\installer_files\conda\condabin\conda.bat" activate "C:\Aplicativozinhos\prompt_quill\llama_index_pq\installer_files\env" >nul && python -m pip install -r temp_requirements.txt --upgrade' failed with exit status code '1'.
Fresh installation (NVIDIA + CUDA 12.1) with one_click_install.bat in llama_index_pq.
If I try to chat, I get:
F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\transformers\models\bert\modeling_bert.py:439: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
attn_output = torch.nn.functional.scaled_dot_product_attention(
Traceback (most recent call last):
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\queueing.py", line 527, in process_events
response = await route_utils.call_process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\route_utils.py", line 270, in call_process_api
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1847, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1431, in call_function
prediction = await fn(*processed_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\utils.py", line 772, in async_wrapper
response = await f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\gradio\chat_interface.py", line 513, in _submit_fn
response = await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 2177, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 859, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\ui.py", line 78, in run_llm_response
prompt = self.interface.run_llm_response(query, history)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llm_interface_qdrant.py", line 202, in run_llm_response
response = self.adapter.retrieve_llm_completion(query)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 372, in retrieve_llm_completion
context = self.get_context_text(prompt)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 257, in get_context_text
nodes = self.retrieve_context(query)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 253, in retrieve_context
return self.direct_search(prompt,self.g.settings_data['top_k'],0,True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\pq\llm_fw\llama_index_interface.py", line 219, in direct_search
result = self.document_store.search(collection_name=self.g.settings_data['collection'],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\qdrant_client.py", line 353, in search
return self._client.search(
^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\qdrant_remote.py", line 521, in search
search_result = self.http.points_api.search_points(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api\points_api.py", line 1524, in search_points
return self._build_for_search_points(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api\points_api.py", line 704, in _build_for_search_points
return self.api_client.request(
^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api_client.py", line 79, in request
return self.send(request, type)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\AI\prompt_quill\prompt_quill\llama_index_pq\installer_files\env\Lib\site-packages\qdrant_client\http\api_client.py", line 102, in send
raise UnexpectedResponse.for_response(response)
qdrant_client.http.exceptions.UnexpectedResponse: Unexpected Response: 404 (Not Found)
Raw response content:
b'{"status":{"error":"Not found: Collection `prompts_large_meta` doesn\'t exist!"},"time":0.0000123}'
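Since the 404 says the collection is missing rather than the server being unreachable, a quick sanity check is to list what collections the local Qdrant actually holds. This is a minimal sketch against Qdrant's `GET /collections` endpoint; the base URL assumes the default Qdrant port, and the collection name is taken from the error above:

```python
# Hedged diagnostic sketch: list the collections the local Qdrant holds
# and report which expected ones are missing. The base URL assumes the
# default Qdrant port; the collection name comes from the 404 above.
import json
import urllib.request


def missing_collections(existing: list[str], expected: list[str]) -> list[str]:
    """Return the expected collection names absent from the server's list."""
    return [name for name in expected if name not in existing]


def list_collections(base_url: str = "http://localhost:6333") -> list[str]:
    """GET /collections and extract the collection names from the response."""
    with urllib.request.urlopen(f"{base_url}/collections") as resp:
        data = json.load(resp)
    return [c["name"] for c in data["result"]["collections"]]


if __name__ == "__main__":
    names = list_collections()
    print("missing:", missing_collections(names, ["prompts_large_meta"]))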