Description
I'm encountering an issue while using the `langgraph-supervisor` library in conjunction with Google's Gemini model (via LangChain). The error I'm seeing is:

```
langchain_google_genai.chat_models.ChatGoogleGenerativeAIError: Invalid argument provided to Gemini: 400 * GenerateContentRequest.contents: contents is not specified
```
After investigating, I found:

- When the LLM is invoked directly (outside of `langgraph-supervisor`), the Gemini model works fine.
- However, when used within `langgraph-supervisor`, Gemini throws the above error, which indicates that an empty `content` field is being passed at some point.
- This suggests that the supervisor may be invoking the agent even when no new message or content is passed, leading Gemini to reject the call (a minimal sketch of this hypothesis follows below).
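
For reference, a minimal sketch of the empty-contents hypothesis (the placeholder API key and the commented-out empty invocation are mine, not from the original report): a direct call with a non-empty message list succeeds, while a call whose message list ends up empty has no contents to send, which appears to be what Gemini rejects with the 400 above.

```python
from langchain_core.messages import HumanMessage
from langchain_google_genai import ChatGoogleGenerativeAI

model = ChatGoogleGenerativeAI(
    model="gemini-1.5-flash",
    google_api_key="YOUR_API_KEY",  # placeholder, not the real key
)

# Direct invocation with a non-empty message list works as expected.
print(model.invoke([HumanMessage(content="Say hello.")]).content)

# By contrast, a call with an empty message list (which the supervisor
# seems to produce at some point) has no contents for the request, and
# Gemini is expected to reject it with the same 400 error.
# model.invoke([])  # expected to raise ChatGoogleGenerativeAIError
```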
Reproducible Code:
```python
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph_supervisor import create_supervisor
from langgraph.prebuilt import create_react_agent
from langchain_core.messages import HumanMessage
from config import GEMINI_API_KEY

model = ChatGoogleGenerativeAI(
    model="gemini-1.5-flash",
    temperature=0.7,
    google_api_key=GEMINI_API_KEY,
)

# Create specialized agents
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

def web_search(query: str) -> str:
    """Search the web for information."""
    return (
        "Here are the headcounts for each of the FAANG companies in 2024:\n"
        "1. **Facebook (Meta)**: 67,317 employees.\n"
        "2. **Apple**: 164,000 employees.\n"
        "3. **Amazon**: 1,551,000 employees.\n"
        "4. **Netflix**: 14,000 employees.\n"
        "5. **Google (Alphabet)**: 181,269 employees."
    )

math_agent = create_react_agent(
    model=model,
    tools=[add, multiply],
    name="math_expert",
    prompt="You are a math expert. Always use one tool at a time."
)

research_agent = create_react_agent(
    model=model,
    tools=[web_search],
    name="research_expert",
    prompt="You are a world class researcher with access to web search. Do not do any math."
)

# Create supervisor workflow
workflow = create_supervisor(
    [research_agent, math_agent],
    model=model,
    prompt=(
        "You are a team supervisor managing a research expert and a math expert. "
        "For current events, use research_agent. "
        "For math problems, use math_agent."
    )
)

# Compile and run
app = workflow.compile()
result = app.invoke({
    "messages": [
        {
            "role": "user",
            "content": "what's the combined headcount of the FAANG companies in 2024?"
        }
    ]
})
print(result)
```
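
Not part of the original report, but a possible way to narrow down which node sends the empty request: streaming the same input with LangGraph's `stream_mode="updates"` prints one update per node, so the last update emitted before the exception points at the supervisor or agent step that triggers the error.

```python
# Debugging sketch: stream node-by-node updates; the last update printed
# before the traceback is the step that produced the empty-contents call.
for update in app.stream(
    {
        "messages": [
            {
                "role": "user",
                "content": "what's the combined headcount of the FAANG companies in 2024?"
            }
        ]
    },
    stream_mode="updates",
):
    print(update)
```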