fix: Include original error message in ImportError for LLM client dependencies#7497

Open
karthik-idikuda wants to merge 1 commit into microsoft:main from karthik-idikuda:fix/improve-import-error-messages

Conversation

@karthik-idikuda

Summary

Improved import error handling across autogen-ext to include the original error message in re-raised ImportError exceptions. This makes it significantly easier for users to diagnose dependency issues.

Problem

When an import fails while initializing LLM clients or other extension components, the code raises a generic ImportError with a hardcoded message that suggests installing specific packages. However, the underlying failure may have other causes, such as missing transitive dependencies or version conflicts, which the hardcoded message hides.

Before:

ImportError: Dependencies for Llama Cpp not found. Please install llama-cpp-python: pip install autogen-ext[llama-cpp]

After:

ImportError: Dependencies for Llama Cpp not found. Original error: No module named 'llama_cpp'
Please install llama-cpp-python: pip install autogen-ext[llama-cpp]
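The shape of the change can be sketched as below. The module name and message wording follow the Llama Cpp example above; each updated file applies the same pattern, so treat this as illustrative rather than the exact diff:

```python
def import_llama_cpp():
    """Import the optional llama_cpp dependency, re-raising with context."""
    try:
        import llama_cpp  # optional dependency
    except ImportError as e:
        # Include the original error text so users can tell a missing
        # package apart from a broken transitive dependency.
        raise ImportError(
            f"Dependencies for Llama Cpp not found. Original error: {e}\n"
            "Please install llama-cpp-python: pip install autogen-ext[llama-cpp]"
        ) from e
    return llama_cpp
```

The only change from the old code is interpolating `{e}` into the message; the `raise ... from e` chaining was already present.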

Changes

Updated 10 files across autogen-ext to include the original exception message:

  • models/llama_cpp/__init__.py
  • agents/azure/__init__.py
  • agents/magentic_one/__init__.py
  • runtimes/grpc/__init__.py and 3 worker runtime files
  • memory/chromadb/_chromadb.py
  • memory/redis/_redis_memory.py
  • tools/azure/_ai_search.py

Testing

The change is minimal and only affects error message formatting. The from e chain is preserved in all cases, and the original exception object is still accessible via __cause__.
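A quick way to see what the preserved `from e` chain guarantees (the module name here is a deliberately nonexistent stand-in, not one of the real dependencies):

```python
def wrap_import_error():
    """Re-raise a missing-import error with extra context, chained via `from e`."""
    try:
        import definitely_not_installed_module  # hypothetical stand-in for a missing dep
    except ImportError as e:
        raise ImportError(f"Dependencies not found. Original error: {e}") from e

try:
    wrap_import_error()
except ImportError as wrapped:
    # The original exception object is still attached by the `from e` chain.
    print(type(wrapped.__cause__).__name__)  # ModuleNotFoundError
```

Because `__cause__` holds the original exception object, not just its text, tools that walk exception chains (tracebacks, loggers) keep working unchanged.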

Closes #4605

@karthik-idikuda force-pushed the fix/improve-import-error-messages branch from 2df8197 to af55c33 on March 31, 2026 at 10:34
fix: Include original error message in ImportError for LLM client dependencies

Improved import error handling across autogen-ext to include the original
error message, making it easier for users to diagnose dependency issues.

Previously, import errors would show a generic message like:
  'Dependencies for X not found. Please install ...'

Now they show:
  'Dependencies for X not found. Original error: No module named vertexai
   Please install ...'

This helps users distinguish between missing packages and other issues like
version conflicts or missing transitive dependencies.

Files updated:
- models/llama_cpp/__init__.py
- agents/azure/__init__.py
- agents/magentic_one/__init__.py
- runtimes/grpc/__init__.py and worker runtime files
- memory/chromadb/_chromadb.py
- memory/redis/_redis_memory.py
- tools/azure/_ai_search.py

Closes microsoft#4605
@karthik-idikuda force-pushed the fix/improve-import-error-messages branch from af55c33 to 18485b4 on March 31, 2026 at 10:56
@karthik-idikuda
Author

@microsoft-github-policy-service agree

Development

Successfully merging this pull request may close these issues.

Improve Import Error Messages for LLM Client Dependencies