Uncaught ImportError in utils.py line 9 #2475

Closed
dneprovets opened this issue Nov 27, 2024 · 1 comment

Comments

@dneprovets

Aider version: <module 'aider.version' from '/Users/mda/miniconda3/lib/python3.12/site-packages/aider/version.py'>
Python version: 3.12.2
Platform: macOS-15.0.1-arm64-arm-64bit
Python implementation: CPython
Virtual environment: No
OS: Darwin 24.0.0 (64bit)
Git version: git version 2.46.0

An uncaught exception occurred:

Traceback (most recent call last):
  File "base_coder.py", line 1115, in send_message
    yield from self.send(messages, functions=self.functions)
  File "base_coder.py", line 1392, in send
    hash_object, completion = send_completion(
                              ^^^^^^^^^^^^^^^^
  File "sendchat.py", line 87, in send_completion
    res = litellm.completion(**kwargs)
          ^^^^^^^^^^^^^^^^^^
  File "llm.py", line 23, in __getattr__
    self._load_litellm()
  File "llm.py", line 30, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 28, in <module>
    from litellm.types.utils import all_litellm_params
  File "utils.py", line 9, in <module>
    from openai.types.completion_usage import CompletionTokensDetails, CompletionUsage
ImportError: cannot import name 'CompletionTokensDetails' from 'openai.types.completion_usage' (/Users/mda/miniconda3/lib/python3.12/site-packages/openai/types/completion_usage.py)
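The traceback shows why the ImportError surfaces mid-session rather than at startup: aider's `llm.py` defers `import litellm` until the first attribute access. A minimal sketch of that lazy-loading pattern is below; the `LazyModule` class and the use of the stdlib `json` module as the import target are illustrative stand-ins, not aider's actual code.

```python
import importlib


class LazyModule:
    """Import a module only on first attribute access.

    Sketch of the lazy-loading pattern visible in the traceback:
    any ImportError raised inside the target module surfaces at the
    first attribute access, not at program start.
    """

    def __init__(self, module_name):
        self._module_name = module_name
        self._lazy_module = None

    def _load(self):
        if self._lazy_module is None:
            self._lazy_module = importlib.import_module(self._module_name)

    def __getattr__(self, name):
        # Only reached for attributes not set in __init__,
        # i.e. attributes of the wrapped module.
        self._load()
        return getattr(self._lazy_module, name)


lazy_json = LazyModule("json")        # nothing imported yet
print(lazy_json.dumps({"ok": True}))  # import happens here
```

This explains why the error appears twice in the report: the exception handler itself touches `litellm.exceptions`, which re-triggers the failed lazy import.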

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "main.py", line 709, in main
    coder.run()
  File "base_coder.py", line 723, in run
    self.run_one(user_message, preproc)
  File "base_coder.py", line 766, in run_one
    list(self.send_message(message))
  File "base_coder.py", line 1117, in send_message
    except retry_exceptions() as err:
           ^^^^^^^^^^^^^^^^^^
  File "sendchat.py", line 24, in retry_exceptions
    litellm.exceptions.APIConnectionError,
    ^^^^^^^^^^^^^^^^^^
  File "llm.py", line 23, in __getattr__
    self._load_litellm()
  File "llm.py", line 30, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 28, in <module>
    from litellm.types.utils import all_litellm_params
  File "utils.py", line 9, in <module>
    from openai.types.completion_usage import CompletionTokensDetails, CompletionUsage
ImportError: cannot import name 'CompletionTokensDetails' from 'openai.types.completion_usage' (/Users/mda/miniconda3/lib/python3.12/site-packages/openai/types/completion_usage.py)
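The error itself suggests the installed `openai` package predates the `CompletionTokensDetails` type that litellm expects. A quick "does this module export this name" check can be written with `importlib`; the stdlib `json`/`loads` pair below is a stand-in so the snippet runs anywhere, and `module_exports` is a hypothetical helper, not part of any library.

```python
import importlib


def module_exports(module_name: str, attr: str) -> bool:
    """Return True if `module_name` imports cleanly and exposes `attr`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr)


# Stand-in check against the stdlib; for the real case you would test
# module_exports("openai.types.completion_usage", "CompletionTokensDetails").
print(module_exports("json", "loads"))       # True
print(module_exports("json", "frobnicate"))  # False
```

If the real check returns False, upgrading the dependency (e.g. `pip install -U openai`) is the likely fix for this class of version-mismatch ImportError.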

@paul-gauthier
Collaborator

Thanks for trying aider and filing this issue.

This looks like a duplicate of #1690. Please see the comments there for more information, and feel free to continue the discussion within that issue.

I'm going to close this issue for now, but please let me know if you think this is actually a distinct issue and I will reopen it.

Note: A bot script made these updates to the issue.
