# marvin-ai
j
Silly question - but we have our own internal ENV config naming convention where we have something like `PREFIX_OPENAI_API_KEY`. I know I can instantiate a `ChatOpenAI(openai_api_key=...)` and pass that into an Agent, but I see this in our logs:
> Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter
Is there a way to "set the default model" using a custom API key? I believe this is due to our runtime settings init getting called after this attempt.
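One way to work around the init-order issue described above is to mirror the internal variable into `OPENAI_API_KEY` before ControlFlow's default-model creation runs. A minimal sketch, assuming the internal name is `PREFIX_OPENAI_API_KEY` (as in the message above); the helper name `mirror_internal_key` is hypothetical:

```python
import os

def mirror_internal_key(prefix_name: str = "PREFIX_OPENAI_API_KEY") -> bool:
    """Copy the internal key into OPENAI_API_KEY if it isn't set already.

    `prefix_name` stands in for whatever your internal convention uses.
    Returns True if OPENAI_API_KEY ends up populated.
    """
    internal = os.environ.get(prefix_name)
    if internal and "OPENAI_API_KEY" not in os.environ:
        os.environ["OPENAI_API_KEY"] = internal
    return "OPENAI_API_KEY" in os.environ

# Call this BEFORE importing controlflow, so that default-model
# creation finds the key and the warning never fires:
# mirror_internal_key()
# import controlflow  # noqa: E402
```

The key detail is ordering: the copy must happen before the library attempts to build its default model, which is why the import is shown after the call.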
n
hi @Jason whose log is that? do you have the trace?
j
@Nate ah, this might be langtrace.
👍 1
Actually, I don't know anymore. It's unclear.
It looks like it's CF.
```
17:56:34.244 | WARNING | controlflow.llm.models - The default LLM model could not be created. ControlFlow will continue to work, but you must manually provide an LLM model for each agent. For more information, please see <https://controlflow.ai/guides/llms>. The error was:
1 validation error for ChatOpenAI
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
```
Looks to be just a warning. Eventually the model is correctly configured and passed to the agent, but it would be great to know if there's a step I'm missing.