
Stucks on: Thinking about research questions for the task... #993

Open
harisla7 opened this issue Nov 23, 2024 · 6 comments

Comments

@harisla7

In PowerShell I get the below error.

Warning: Configuration not found at 'default'. Using default configuration.
Do you mean 'default.json'?
⚠️ Error in reading JSON, attempting to repair JSON
Error using json_repair: the JSON object must be str, bytes or bytearray, not NoneType
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 27, in choose_agent
response = await create_chat_completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\utils\llm.py", line 60, in create_chat_completion response = await provider.get_chat_response(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\llm_provider\generic\base.py", line 116, in get_chat_response
output = await self.llm.ainvoke(messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 307, in ainvoke llm_result = await self.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 796, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 756, in agenerate
raise exceptions[0]
File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 924, in _agenerate_with_cache
result = await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\langchain_openai\chat_models\base.py", line 824, in _agenerate
response = await self.async_client.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\openai\_base_client.py", line 1839, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\openai\_base_client.py", line 1533, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\openai\_base_client.py", line 1634, in _request
raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model gpt-4o-2024-08-06 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Python312\Lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 242, in run_asgi
result = await self.app(self.scope, self.asgi_receive, self.asgi_send) # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
await self.app(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\middleware\cors.py", line 77, in __call__
await self.app(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
raise exc
File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "C:\Python312\Lib\site-packages\starlette\routing.py", line 715, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\routing.py", line 735, in app
await route.handle(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\routing.py", line 362, in handle
await self.app(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\routing.py", line 95, in app
await wrap_app_handling_exceptions(app, session)(scope, receive, send)
File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
raise exc
File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "C:\Python312\Lib\site-packages\starlette\routing.py", line 93, in app
await func(session)
File "C:\Python312\Lib\site-packages\fastapi\routing.py", line 383, in app
await dependant.call(**solved_result.values)
File "C:\Users\azureuser1\gpt-researcher\backend\server\server.py", line 110, in websocket_endpoint
await handle_websocket_communication(websocket, manager)
File "C:\Users\azureuser1\gpt-researcher\backend\server\server_utils.py", line 121, in handle_websocket_communication
await handle_start_command(websocket, data, manager)
File "C:\Users\azureuser1\gpt-researcher\backend\server\server_utils.py", line 28, in handle_start_command
report = await manager.start_streaming(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\backend\server\websocket_manager.py", line 66, in start_streaming
report = await run_agent(task, report_type, report_source, source_urls, tone, websocket, headers = headers, config_path = config_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\backend\server\websocket_manager.py", line 108, in run_agent
report = await researcher.run()
^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\backend\report_type\basic_report\basic_report.py", line 41, in run
await researcher.conduct_research()
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\agent.py", line 90, in conduct_research
self.agent, self.role = await choose_agent(
^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 44, in choose_agent
return await handle_json_error(response)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 55, in handle_json_error
json_string = extract_json_with_regex(response)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 71, in extract_json_with_regex
json_match = re.search(r"{.*?}", response, re.DOTALL)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\re\__init__.py", line 177, in search
return _compile(pattern, flags).search(string)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected string or bytes-like object, got 'NoneType'
INFO: connection closed
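The 404 from OpenAI is the real failure, but the crash itself happens because the error handler in agent_creator.py passes a None response into re.search. A minimal defensive sketch (a hypothetical patch, not the project's actual code) that guards extract_json_with_regex against a None LLM response:

```python
import re

def extract_json_with_regex(response):
    # Guard: the upstream LLM call can fail and return None, and calling
    # re.search on None raises the TypeError seen in the traceback above.
    if not isinstance(response, str):
        return None
    # Same non-greedy pattern as in agent_creator.py
    json_match = re.search(r"{.*?}", response, re.DOTALL)
    return json_match.group(0) if json_match else None
```

With this guard the websocket handler would see a clean "no JSON found" result instead of an unhandled exception, though the underlying model-access problem still has to be fixed.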

Desktop (please complete the following information):

  • OS: Windows
  • Browser: Chrome and Edge

[Screenshots attached: four GPT Researcher error messages]

@ouarkainfo

Same issue; it does not work.

@MC-shark

Same issue.

@rossgalloway

Looks like a problem with your API key, or you are choosing a nonexistent model:

openai.NotFoundError: Error code: 404 - 
{'error': {'message': 'The model gpt-4o-2024-08-06 does not exist or you do not have access to it.', 
'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

I had a similar issue where it was picking up some stale API key. I had to comment out the entries in the .env file and reload a few times, and eventually it sorted itself out. You can add a log to check which keys are actually loaded, like this:

import os
from dotenv import load_dotenv

load_dotenv()

openai_api_key = os.getenv("OPENAI_API_KEY")
tavily_api_key = os.getenv("TAVILY_API_KEY")

# Masking the API keys for security
def mask_key(key):
    if key and len(key) > 8:
        return key[:4] + '*' * (len(key) - 8) + key[-4:]
    return key

print("OPENAI_API_KEY:", mask_key(openai_api_key))
print("TAVILY_API_KEY:", mask_key(tavily_api_key))

When in doubt, ask an LLM for help.

@ouarkainfo

@rossgalloway thank you.
I tested with correct API keys and it still gets stuck; it also does not work on local files.

@ouarkainfo

@harisla7 Have you found a solution?

@harisla7
Author

@ouarkainfo Not yet.
