Bug Description
The MCP server Docker image (zepai/knowledge-graph-mcp:1.0.2-graphiti-0.28.2) installs graphiti-core without optional extras. This means non-OpenAI LLM providers (Anthropic, Gemini, Groq) fail at runtime even though they are listed as supported in the config schema and README.
The MCP server config accepts provider: "anthropic" (and gemini, groq), and the factory code in services/factories.py has code paths for these providers guarded by HAS_ANTHROPIC, HAS_GEMINI, HAS_GROQ flags. But since the extras aren't installed, these flags are False and the providers fail immediately.
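The guard pattern described above typically looks like the following. This is a minimal sketch, not the actual services/factories.py code; the import path and class name (graphiti_core.llm_client.anthropic_client, AnthropicClient) are assumptions and may differ in the real module.

```python
# Sketch of the optional-import guard pattern described in this report.
# The import path below is an assumption; only the HAS_ANTHROPIC flag and
# the error message are taken from the report itself.
try:
    from graphiti_core.llm_client.anthropic_client import AnthropicClient
    HAS_ANTHROPIC = True
except ImportError:
    # graphiti-core[anthropic] extra not installed -> provider unavailable
    AnthropicClient = None
    HAS_ANTHROPIC = False


def create_llm_client(provider: str):
    """Return the client class for the configured provider (config wiring elided)."""
    if provider == "anthropic":
        if not HAS_ANTHROPIC:
            raise RuntimeError(
                "Anthropic client not available in current graphiti-core version"
            )
        return AnthropicClient
    raise ValueError(f"unknown provider: {provider}")
```

Because the flag is computed at import time, installing the extra (and nothing else) flips the behavior, which is why the fix is purely a dependency change.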
Steps to Reproduce
1. Pull the latest MCP server image:

       docker pull zepai/knowledge-graph-mcp:1.0.2-graphiti-0.28.2

2. Configure config.yaml with an Anthropic provider:

       llm:
         provider: "anthropic"
         model: "claude-haiku-4-5-20251001"
         temperature: 0.0
         providers:
           anthropic:
             api_key: "${ANTHROPIC_API_KEY}"

3. Start the server. It fails during initialization.
Expected Behavior
The Docker image should include optional extras for all providers that the MCP server advertises as supported, so that switching providers is a config-only change.
Actual Behavior
WARNING - Failed to create LLM client: Anthropic client not available in current graphiti-core version
The HAS_ANTHROPIC flag in factories.py is False because graphiti-core[anthropic] was not installed. Same applies to HAS_GEMINI and HAS_GROQ.
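A quick way to confirm which provider SDKs are actually present in the image is to check whether their modules are importable. The import names below (anthropic, google.genai, groq) are assumed from the PyPI packages the extras pull in:

```python
# Diagnostic: check whether the optional provider SDKs are importable
# in the current environment (e.g. inside the Docker container).
import importlib.util


def has_module(name: str) -> bool:
    """True if `name` can be imported in this environment."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # A missing parent package (e.g. `google`) also means "not installed".
        return False


for mod in ("anthropic", "google.genai", "groq"):
    print(f"{mod}: {'installed' if has_module(mod) else 'MISSING'}")
```

On the stock 1.0.2 image all three should report MISSING, matching the False guard flags.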
Environment
- Graphiti Version: graphiti-core 0.28.2, MCP server 1.0.2 (Docker image zepai/knowledge-graph-mcp:1.0.2-graphiti-0.28.2)
- Python Version: 3.11
- Operating System: Docker (Debian-based)
- Database Backend: FalkorDB
- LLM Provider & Model: Anthropic claude-haiku-4-5-20251001
Installation Method
Docker (zepai/knowledge-graph-mcp:1.0.2-graphiti-0.28.2)
Workaround
Build a custom image on top of the published one that installs the extras (the base image ships uv):
FROM zepai/knowledge-graph-mcp:1.0.2-graphiti-0.28.2
RUN cd /app/mcp && uv add "graphiti-core[anthropic,google-genai]==0.28.2"
Possible Solution
Change the Dockerfile for zepai/knowledge-graph-mcp to install graphiti-core with all optional provider extras:
graphiti-core[anthropic,google-genai,groq]==0.28.2
The additional dependencies are lightweight (the anthropic, google-genai, and groq Python packages) and would make the image work out of the box for all supported providers.
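As a sketch, the fix could be a single added line in the upstream Dockerfile. The actual structure of that Dockerfile is not known here, so this assumes it installs dependencies via uv, as the workaround above suggests:

```dockerfile
# Hypothetical fix in the zepai/knowledge-graph-mcp Dockerfile;
# the real install step may use a different tool or lockfile.
RUN uv add "graphiti-core[anthropic,google-genai,groq]==0.28.2"
```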
Related