ControlFlow 0.10 Errors and Exceptions Thread
# marvin-ai
d
Here is one trying to use Claude still:
Also we need to handle these: openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.", 'type': 'invalid_request_error', 'param': 'messages.[1].role', 'code': None}}
ok yeah, this is somehow still not getting handled right for Claude. This makes me sad. 😞
j
Hm, those are frustrating; we have a rule that's supposed to check for them.
Oh wait, can you show me how you create the agent's model for these?
I'm wondering if it's not a ChatAnthropic subclass, which is what it's currently looking for to apply the Anthropic rules
But I'm guessing it's a GoogleVertex object or something?
We can add a rule for it
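For context, the detection being described boils down to an isinstance check on the agent's model; a rough sketch of that pattern (the class and function names below are illustrative, not ControlFlow's actual internals):

```python
# Illustrative only: class and function names are hypothetical, not
# ControlFlow's real internals.
from langchain_anthropic import ChatAnthropic
from langchain_openai import ChatOpenAI


class LLMRules:
    """Default message-compilation rules."""


class AnthropicRules(LLMRules):
    """Anthropic-specific message-history rules."""


class OpenAIRules(LLMRules):
    """OpenAI-specific message-history rules."""


def rules_for_model(model) -> LLMRules:
    # ChatAnthropicVertex is not a ChatAnthropic subclass, so a plain
    # isinstance(model, ChatAnthropic) check misses it and the Anthropic
    # rules never get applied.
    if isinstance(model, ChatAnthropic):
        return AnthropicRules()
    if isinstance(model, ChatOpenAI):
        return OpenAIRules()
    return LLMRules()
```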
d
```python
from langchain_google_vertexai.model_garden import ChatAnthropicVertex


def get_llm():
    project_id = "majestic-disk-431719-e8"  # Replace this with your actual project ID
    model_name = "claude-3-5-sonnet@20240620"  # Use a model you have confirmed access to

    llm = ChatAnthropicVertex(model_name=model_name, project=project_id, location="us-east5")
    # response = llm.invoke("hi")
    # print(response)
    print("Using VERTEX AI CLAUDE")
    return llm
```
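A minimal sketch of plugging that model into an agent, assuming the standard `model=` parameter on `cf.Agent` and a task-level `agents=` argument (the names and task here are illustrative, not from the thread):

```python
import controlflow as cf

# Hypothetical wiring of the Vertex-hosted Claude model into an agent;
# assumes cf.Agent accepts a LangChain chat model via `model=`.
claude_agent = cf.Agent(name="vertex-claude", model=get_llm())

# Quick smoke test: run a trivial task with that agent.
result = cf.run("Say hello in one short sentence.", agents=[claude_agent])
print(result)
```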
I think you are right
should be a simple fix!
j
Great, thanks for the example! I just opened a PR to allow setting custom compilation rules (and added ChatAnthropicVertex to the list of auto-detected rules) https://github.com/PrefectHQ/ControlFlow/pull/350
I will get this in main this morning and am targeting a larger release early next week, if not this weekend
d
Awesome! I really need to try some of the more advanced ControlFlow features so I can annoy you with better bugs. I do want to try Google's Gemini Pro at some point, if you haven't already.
ok, I really wish I knew what the right thing to do with this exception was:
```
File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/controlflow/orchestration/orchestrator.py", line 331, in run_agent_turn
    for event in self.agent._run_model(messages=messages, tools=tools):
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/prefect/task_engine.py", line 1407, in run_generator_task_sync
    return engine.result()
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/prefect/task_engine.py", line 457, in result
    raise self._raised
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/prefect/task_engine.py", line 763, in run_context
    yield self
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/prefect/task_engine.py", line 1392, in run_generator_task_sync
    gen_result = next(gen)
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/controlflow/agents/agent.py", line 283, in _run_model
    for delta in model.stream(messages):
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 5508, in stream
    yield from self.bound.stream(
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 418, in stream
    raise e
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 398, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 639, in _stream
    response = self.client.create(**payload)
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/openai/_utils/_utils.py", line 274, in wrapper
    return func(*args, **kwargs)
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 704, in create
    return self._post(
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/openai/_base_client.py", line 1265, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/openai/_base_client.py", line 942, in request
    return self._request(
  File "/home/dave/.cache/pypoetry/virtualenvs/rustmonster-GKhQY2Y4-py3.10/lib/python3.10/site-packages/openai/_base_client.py", line 1046, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.", 'type': 'invalid_request_error', 'param': 'messages.[1].role', 'code': None}}
```
j
That is another LLM rules error.
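For reference, the constraint the API is enforcing here: a `tool` message is only valid immediately after an assistant message that declared matching `tool_calls`. A minimal hand-rolled example of a valid history (raw OpenAI chat messages, independent of ControlFlow):

```python
# Minimal illustration of the ordering the OpenAI chat API enforces: a `tool`
# message must answer a prior assistant message carrying `tool_calls`.
valid_messages = [
    {"role": "user", "content": "What is 2 + 2?"},
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [
            {
                "id": "call_1",
                "type": "function",
                "function": {"name": "add", "arguments": '{"a": 2, "b": 2}'},
            }
        ],
    },
    # This tool message is valid only because the assistant message above
    # declared a tool call with the matching id.
    {"role": "tool", "tool_call_id": "call_1", "content": "4"},
    {"role": "assistant", "content": "2 + 2 is 4."},
]

# The 400 error above means a `tool` message reached the API without that
# preceding assistant `tool_calls` entry, e.g. after history compilation
# dropped or reordered it.
```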
d
Is this something magically addressed when I type poetry update tomorrow? 🙂
j
Can't promise tomorrow, but very soon, yes
d
@Jeremiah This is going to be a bit harder to fix, but the o1-mini (and o1-preview etc) models don't support SYSTEM messages. openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
But I really want to use them 🙂
j
I started implementing this (and did, for system messages), but it turns out they don't support tools! So unfortunately it's a nonstarter for now 😞
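For the system-message half, the usual workaround is just rewriting system messages as user messages before the request goes out; a rough sketch of that idea, not ControlFlow's actual code:

```python
from langchain_core.messages import BaseMessage, HumanMessage, SystemMessage


def downgrade_system_messages(messages: list[BaseMessage]) -> list[BaseMessage]:
    """Rewrite system messages as user messages for models (like o1-mini)
    that reject the 'system' role. Illustrative sketch only."""
    converted: list[BaseMessage] = []
    for msg in messages:
        if isinstance(msg, SystemMessage):
            converted.append(HumanMessage(content=msg.content))
        else:
            converted.append(msg)
    return converted
```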
d
Argh. I assumed they'd finally fixed that. 😞
well I'm excited for the release that lets me use Claude then 🙂
and I'll test Gemini as soon as I can