Hello,

Initially we had the following error:
File "C:\Python311\Lib\site-packages\pydantic\deprecated\class_validators.py", line 240, in root_validator
raise PydanticUserError(
pydantic.errors.PydanticUserError: If you use `@root_validator` with pre=False (the default) you MUST specify `skip_on_failure=True`. Note that `@root_validator` is deprecated and should be replaced with `@model_validator`.
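For context, this error is raised at class-definition time by pydantic v2 when a dependency still uses the v1-style @root_validator without skip_on_failure=True (the decorator lives inside a dependency, not in our own code). A minimal sketch of what triggers it:

# Minimal reproduction of the class-definition-time error under pydantic v2
# (illustrative only; the failing decorator is inside a library, not our script):
from pydantic import BaseModel, root_validator

class Example(BaseModel):
    value: str = "x"

    # pre=False is the default and skip_on_failure is not set, so pydantic v2
    # raises PydanticUserError as soon as this class body is executed.
    @root_validator
    def check(cls, values):
        return values

# Inside the library the fix would be either
#   @root_validator(skip_on_failure=True)
# or the pydantic v2 replacement
#   @model_validator(mode="after")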
After reading some other issues, we ran:
pip install pydantic==1.10.18
But now, with JSON documents from read_files_as_documents, embed_model="default" (or "local:intfloat/multilingual-e5-small") returns the error below (Windows, latest autollm).
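For reference, a minimal sketch of what our llm.py does (import path and keyword names approximate):

# llm.py (sketch)
from autollm import AutoQueryEngine, read_files_as_documents

documents = read_files_as_documents(input_dir="data/json_docs")  # our JSON files

query_engine = AutoQueryEngine.from_defaults(
    documents=documents,
    embed_model="default",  # same failure with "local:intfloat/multilingual-e5-small"
)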
We tried installing a few different litellm versions with no success (either the same error, or a different one because the version is incompatible with autollm):
Traceback (most recent call last):
File "llm.py", line 9, in <module>
query_engine = AutoQueryEngine.from_defaults(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\autollm\auto\query_engine.py", line 258, in from_defaults
return create_query_engine(
^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\autollm\auto\query_engine.py", line 105, in create_query_engine
vector_store_index = AutoVectorStoreIndex.from_defaults(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\autollm\auto\vector_store_index.py", line 107, in from_defaults
index = AutoVectorStoreIndex._create_index(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\autollm\auto\vector_store_index.py", line 215, in _create_index
index = VectorStoreIndex.from_documents(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\indices\base.py", line 107, in from_documents
return cls(
^^^^
File "C:\Python311\Lib\site-packages\llama_index\indices\vector_store\base.py", line 52, in __init__
super().__init__(
File "C:\Python311\Lib\site-packages\llama_index\indices\base.py", line 72, in __init__
index_struct = self.build_index_from_nodes(nodes)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\indices\vector_store\base.py", line 262, in build_index_from_nodes
return self._build_index_from_nodes(nodes, **insert_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\indices\vector_store\base.py", line 243, in _build_index_from_nodes
self._add_nodes_to_index(
File "C:\Python311\Lib\site-packages\llama_index\indices\vector_store\base.py", line 196, in _add_nodes_to_index
nodes_batch = self._get_node_with_embedding(nodes_batch, show_progress)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\indices\vector_store\base.py", line 104, in _get_node_with_embedding
id_to_embed_map = embed_nodes(
^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\indices\utils.py", line 137, in embed_nodes
new_embeddings = embed_model.get_text_embedding_batch(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\core\embeddings\base.py", line 256, in get_text_embedding_batch
embeddings = self._get_text_embeddings(cur_batch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\core\embeddings\base.py", line 183, in _get_text_embeddings
return [self._get_text_embedding(text) for text in texts]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\llama_index\core\embeddings\base.py", line 183, in <listcomp>
return [self._get_text_embedding(text) for text in texts]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\autollm\auto\embedding.py", line 67, in _get_text_embedding
return self._get_query_embedding(text)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\autollm\auto\embedding.py", line 41, in _get_query_embedding
response = lite_embedding(model=self.model, input=[query])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\litellm\utils.py", line 3421, in wrapper
raise e
File "C:\Python311\Lib\site-packages\litellm\utils.py", line 3314, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\litellm\main.py", line 2947, in embedding
model, custom_llm_provider, dynamic_api_key, api_base = get_llm_provider(
^^^^^^^^^^^^^^^^^
File "C:\Python311\Lib\site-packages\litellm\utils.py", line 6938, in get_llm_provider
raise e
File "C:\Python311\Lib\site-packages\litellm\utils.py", line 6915, in get_llm_provider
raise litellm.exceptions.BadRequestError( # type: ignore
litellm.exceptions.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=default
Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers
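For context, litellm resolves the provider from a prefix on the model string, so the bare model="default" that autollm passes through never matches any provider; the error message asks for a provider-prefixed model instead (a sketch, model name illustrative):

import litellm

# autollm passes embed_model straight through, so litellm sees model="default"
# and get_llm_provider() raises the BadRequestError above.
# litellm expects a provider-prefixed model name, e.g. a HuggingFace inference endpoint
# (this is what we mean below by "trying to use huggingface directly"):
response = litellm.embedding(
    model="huggingface/intfloat/multilingual-e5-small",
    input=["some text"],
)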
Trying to use huggingface directly gives the following error, which is fixed in a later litellm version (August 2024) that is itself incompatible with autollm:
Input should be a valid dictionary or instance of SentenceSimilarityInputsCheck
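Our reading of that last error (an assumption on our side): intfloat/multilingual-e5-small is tagged sentence-similarity on the Hugging Face Inference API, so the endpoint validates the request body against SentenceSimilarityInputsCheck, while the embedding call sends a feature-extraction style body. Roughly:

# Body shape the sentence-similarity pipeline validates against (roughly):
expected = {
    "inputs": {
        "source_sentence": "query text",
        "sentences": ["candidate 1", "candidate 2"],
    }
}

# Body shape the embedding call sends (roughly):
sent = {"inputs": ["query text"]}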