A temporary solution to this issue is to comment out the `llm_service` section of the `config.yaml` file, like so:
```yaml
...
# LLM Usage (check README for help)
# llm_service:
#   type: sap # use "sap" or "third_party"
#   model_name: gpt-4-turbo
#   temperature: 0.0 # optional, default is 0.0
#   ai_core_sk: <file_path> # needed for type: sap
use_llm_repository_url: False # whether to use LLM's to obtain the repository URL
...
```
I suppose this is due to `ConfigSchema`, which requires `use_llm_repository_url` to be nested inside `llm_service`. Therefore I tried this config:
```yaml
...
# LLM Usage (check README for help)
llm_service:
  # type: sap # use "sap" or "third_party"
  # model_name: gpt-4-turbo
  # temperature: 0.0 # optional, default is 0.0
  # ai_core_sk: <file_path> # needed for type: sap
  use_llm_repository_url: False # whether to use LLM's to obtain the repository URL
...
```
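To illustrate the `ConfigSchema` hypothesis, here is a minimal, hypothetical sketch (not Prospector's actual code) of a validator that expects `use_llm_repository_url` nested under `llm_service`. It shows why commenting out the whole section makes the key disappear from the parsed config entirely, so validation fails regardless of where the flag is set:

```python
# Hypothetical sketch only: Prospector's real ConfigSchema is not shown
# in this issue, so the names and rules below are illustrative assumptions.

def validate(config: dict) -> list[str]:
    """Return validation errors for an already-parsed config mapping."""
    errors = []
    llm = config.get("llm_service")
    if llm is None:
        # Commenting out the whole section removes the key from the
        # parsed YAML, so a schema that requires it complains here.
        errors.append("missing section: llm_service")
    elif "use_llm_repository_url" not in llm:
        # The flag at top level is invisible to a schema that looks
        # for it inside llm_service.
        errors.append("llm_service.use_llm_repository_url is required")
    return errors

# Top-level flag with llm_service commented out (first config above):
print(validate({"use_llm_repository_url": False}))

# Flag nested under llm_service (second config above):
print(validate({"llm_service": {"use_llm_repository_url": False}}))
```

Under this sketch, only the nested placement passes; the real schema may impose further requirements (e.g. `type` or `model_name`), which would explain why the second config still fails.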
And it still does not work.
I just want to run Prospector with the rule-based approach only, no LLM needed. Can you help me solve this config issue?
Thanks!
When trying to run the example command `./run_prospector.sh CVE-2020-1925 --repository https://github.com/apache/olingo-odata4`, a missing-LLM error appears, even when `use_llm_repository_url` is set to `False`. A temporary solution is to comment out the `llm_service` section of the `config.yaml` file, as shown above. See #387.