Brock
11/20/2024, 3:02 PM
Jeremiah
Jeremiah
Brock
11/20/2024, 3:21 PM
prefect profile ls shows
┏━━━━━━━━━━━━━━━━━━━━━┓
┃ Available Profiles: ┃
┡━━━━━━━━━━━━━━━━━━━━━┩
│ * local │
└─────────────────────┘
and in the same shell I execute python myflow.py
and I see the prefect flow firing with data being logged to the web console. I am still learning, but not sure what the difference is.
Bianca Hoch
11/20/2024, 6:39 PM
Bianca Hoch
11/20/2024, 6:59 PM
If you run prefect profile inspect, what is your PREFECT_API_URL set to?
Brock
11/20/2024, 7:30 PM
No name provided, defaulting to 'local'
PREFECT_API_URL='<https://api.prefect.cloud/api/accounts/{account_id}/workspaces/{workspace_id}>'
PREFECT_API_KEY='key_here'
I am certain there is a very fundamental concept I am still not grasping. It runs locally, I just don't see the output in the cloud, though I do if it's a local prefect flow.
Bianca Hoch
11/20/2024, 9:12 PM
Brock
11/20/2024, 9:21 PM
import vertexai
import controlflow as cf
from google.oauth2.service_account import Credentials
from langchain_google_vertexai import ChatVertexAI
from prefect_gcp import GcpCredentials
from vertexai.generative_models import GenerativeModel

# Pull the service account info out of the Prefect GcpCredentials block
gcp_credentials_block = GcpCredentials.load("GCP_PROJECT_ID")
service_account_info = gcp_credentials_block.service_account_info.get_secret_value()
credentials = Credentials.from_service_account_info(service_account_info)

# Use Gemini via Vertex AI as the default model for ControlFlow
model = ChatVertexAI(model="gemini-1.5-flash", credentials=credentials)
cf.defaults.model = model

emails = [
    "Hello, I need an update on the project status.",
    "Subject: Exclusive offer just for you!",
    "Urgent: Project deadline moved up by one week.",
]

reply = cf.run(
    "Write a polite reply to an email",
    context=dict(email=emails[0]),
)
print(reply)
I see the output in the shell, it's just not flowing through. In part, I suppose that's ok, but I am trying to learn as I go. 🙂
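(A minimal sketch, not from the thread: calling a @flow-decorated function creates a flow run that is reported to whatever PREFECT_API_URL points at, so one way to see a run under Flow Runs in the Cloud UI is to wrap the ControlFlow call in a Prefect flow. The reply_flow name below is illustrative.)

from prefect import flow
import controlflow as cf

# Assumes the Vertex AI / ControlFlow setup from the script above has already run,
# i.e. cf.defaults.model is set to the ChatVertexAI model.

@flow
def reply_flow(email: str) -> str:
    # cf.run executes inside this Prefect flow run, so the run shows up
    # under Flow Runs in whichever workspace PREFECT_API_URL points at.
    return cf.run(
        "Write a polite reply to an email",
        context=dict(email=email),
    )

if __name__ == "__main__":
    print(reply_flow("Hello, I need an update on the project status."))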
Nate
11/20/2024, 10:06 PM
prefect config view
which is the best way I know to ask "what prefect backend am I hooked up to right now?"
python -c "from prefect.settings import Settings; print(Settings().api.url)"
Brock
11/20/2024, 11:12 PM
prefect config view ->
🚀 you are connected to:
<https://app.prefect.cloud/account/{account_id}>
PREFECT_PROFILE='local'
PREFECT_API_KEY='********' (from profile)
PREFECT_API_URL='<https://api.prefect.cloud/api/accounts/{account_id}/workspaces/{workspace_id}>' (from profile)
The python command above also yields the same PREFECT_API_URL.
Nate
11/20/2024, 11:16 PM
Brock
11/20/2024, 11:44 PM
$ python gcp-flow2.py
23:42:32.521 | WARNING | controlflow.llm.models - The default LLM model could not be created. ControlFlow will continue to work, but you must manually provide an LLM model for each agent. For more information, please see <https://controlflow.ai/guides/configure-llms>. The error was:
The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
23:42:34.357 | WARNING | langchain_google_vertexai.functions_utils - Key 'additionalProperties' is not supported in schema, ignoring
23:42:34.358 | WARNING | langchain_google_vertexai.functions_utils - Key 'additionalProperties' is not supported in schema, ignoring
╭─ Agent: Marvin ──────────────────────────────────────────────────────────────────────────────────╮
│ │
│ ✅ Tool call: "mark_task_036cc2c9_successful" │
│ │
│ Tool args: {'task_result': 'Hello, thank you for reaching out. I will provide an update on │
│ the project status as soon as possible.'} │
│ │
│ Tool result: Task #036cc2c9 ("Write a polite reply to an email") marked successful. │
│ │
╰──────────────────────────────────────────────────────────────────────────────────── 11:42:35 PM ─╯
Hello, thank you for reaching out. I will provide an update on the project status as soon as possible.
23:42:35.367 | WARNING | EventsWorker - Still processing items: 2 items remaining...
Nate
11/21/2024, 12:11 AM
PREFECT_LOGGING_LEVEL=DEBUG python gcp-flow2.py
Jeremiah
Jeremiah
Jeremiah
Brock
11/21/2024, 12:25 AM
$ PREFECT_LOGGING_LEVEL=DEBUG python gcp-flow2.py
00:22:29.395 | DEBUG | prefect.profiles - Using profile 'local'
00:22:30.923 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:31.401 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:32.828 | WARNING | controlflow.llm.models - The default LLM model could not be created. ControlFlow will continue to work, but you must manually provide an LLM model for each agent. For more information, please see <https://controlflow.ai/guides/configure-llms>. The error was:
The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
00:22:33.046 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:33.071 | DEBUG | Task run 'run_tasks' - Created task run 'run_tasks' for task 'run_tasks'
00:22:33.075 | DEBUG | Task run 'Run task: Task #73e6ba09 ("Write a polite reply to an email")' - Renamed task run 'run_tasks' to 'Run task: Task #73e6ba09 ("Write a polite reply to an email")'
00:22:33.078 | DEBUG | Task run 'Run task: Task #73e6ba09 ("Write a polite reply to an email")' - Executing task 'run_tasks' for task run 'Run task: Task #73e6ba09 ("Write a polite reply to an email")'...
00:22:33.137 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:33.192 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:33.199 | DEBUG | Task run 'run' - Created task run 'run' for task 'run'
00:22:33.201 | DEBUG | Task run 'Orchestrator.run()' - Renamed task run 'run' to 'Orchestrator.run()'
00:22:33.204 | DEBUG | Task run 'Orchestrator.run()' - Executing task 'run' for task run 'Orchestrator.run()'...
00:22:33.249 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:33.255 | DEBUG | Task run 'run_agent_turn' - Created task run 'run_agent_turn' for task 'run_agent_turn'
00:22:33.257 | DEBUG | Task run 'Agent turn: Marvin' - Renamed task run 'run_agent_turn' to 'Agent turn: Marvin'
00:22:33.261 | DEBUG | Task run 'Agent turn: Marvin' - Executing task 'run_agent_turn' for task run 'Agent turn: Marvin'...
00:22:33.611 | DEBUG | prefect.client - Connecting to API at <https://api.prefect.cloud/api/accounts/996e7e98-1ad6-4529-a9cc-13a5d71928e7/workspaces/f6e0b7c4-93ed-4ac9-b92f-7e3d71868aff/>
00:22:33.616 | DEBUG | Task run '_run_model' - Created task run '_run_model' for task '_run_model'
00:22:33.618 | DEBUG | Task run 'Call LLM' - Renamed task run '_run_model' to 'Call LLM'
00:22:33.623 | DEBUG | Task run 'Call LLM' - Executing task '_run_model' for task run 'Call LLM'...
00:22:33.626 | WARNING | langchain_google_vertexai.functions_utils - Key 'additionalProperties' is not supported in schema, ignoring
00:22:33.628 | WARNING | langchain_google_vertexai.functions_utils - Key 'additionalProperties' is not supported in schema, ignoring
╭─ Agent: Marvin ──────────────────────────────────────────────────────────────────────────────────╮
│ │
│ ⠼ Tool call: "mark_task_73e6ba09_successful" │
│ │
│ Tool args: {'task_result': 'Dear [Name], \\n\\nThank you for your email. I am working on │
│ getting you an update as soon as possible. Please let me know if you have any other │
│ questions. \\n\\nSincerely, \\nMarvin'} │
│ │
╰──────────────────────────────────────────────────────────────────────────────────── 12:22:34 AM ─╯
00:22:34.634 | DEBUG | prefect.clien
╭─ Agent: Marvin ──────────────────────────────────────────────────────────────────────────────────╮
│ │
│ ✅ Tool call: "mark_task_73e6ba09_successful" │
│ │
│ Tool args: {'task_result': 'Dear [Name], \\n\\nThank you for your email. I am working on │
│ getting you an update as soon as possible. Please let me know if you have any other │
│ questions. \\n\\nSincerely, \\nMarvin'} │
│ │
│ Tool result: Task #73e6ba09 ("Write a polite reply to an email") marked successful. │
│ │
╰──────────────────────────────────────────────────────────────────────────────────── 12:22:34 AM ─╯
00:22:34.738 | INFO | Task run 'Orchestrator.run()' - Finished in state Completed()
00:22:34.745 | INFO | Task run 'Run task: Task #73e6ba09 ("Write a polite reply to an email")' - Finished in state Completed()
Dear [Name], \n\nThank you for your email. I am working on getting you an update as soon as possible. Please let me know if you have any other questions. \n\nSincerely, \nMarvin
Nate
11/21/2024, 12:27 AM
Brock
11/21/2024, 12:29 AM
Brock
11/21/2024, 12:30 AM
Nate
11/21/2024, 12:31 AM
Are you looking at the Flow Runs and Task Runs tabs on that Runs page?
Brock
11/21/2024, 12:32 AM
Nate
11/21/2024, 12:32 AM
Brock
11/21/2024, 12:32 AM
Nate
11/21/2024, 12:33 AM
Brock
11/21/2024, 12:33 AM
Jeremiah
Nate
11/21/2024, 12:34 AM
Brock
11/21/2024, 12:39 AM
Nate
11/21/2024, 12:53 AM
instead of .deploy(), are you able to change that to simply .serve() (like this)? if it runs fine like that then we can rule out a class of problems RE infra-level issues that prevent the flow run from starting up
that "exited with a status code 1" is sort of an information black hole when you're not running the worker yourself (and able to look at logs), which is something we're actively working on. most commonly this happens because:
• missing 3rd party deps remotely that aren't present on the prefect base image
• failure to retrieve source code
Brock
11/21/2024, 12:55 AM
from prefect import flow

if __name__ == "__main__":
    flow.from_source(
        source="<https://github.com/>",
        entrypoint="prefect/flows/etl.py:etl_flow",
    ).deploy(
        name="aws-blogs-etl",
        work_pool_name="brock-pool1",
        job_variables={
            "env": {"BROCK": "loves-to-code"},
            "pip_packages": ["pandas", "requests"],
        },
        cron="15 0 * * *",
        tags=["prod"],
        description="The pipeline to populate the stage schema with the newest posts. Version is just for illustration",
        version="1.0.0",
    )
would become
from prefect import flow

if __name__ == "__main__":
    flow.from_source(
        source="<https://github.com/git>",
        entrypoint="prefect/flows/etl.py:etl_flow",
    ).serve(
        name="aws-blogs-etl",
        work_pool_name="brock-pool1",
        job_variables={
            "env": {"BROCK": "loves-to-code"},
            "pip_packages": ["pandas", "requests"],
        },
        cron="15 0 * * *",
        tags=["prod"],
        description="The pipeline to populate the stage schema with the newest posts. Version is just for illustration",
        version="1.0.0",
    )
Nate
11/21/2024, 12:56 AM
Nate
11/21/2024, 12:56 AM
minus the work_pool_name and job_variables kwargs
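(For reference, a sketch of the served version with those two kwargs dropped; the source URL and entrypoint are the placeholders carried over from Brock's snippet above.)

from prefect import flow

if __name__ == "__main__":
    flow.from_source(
        source="https://github.com/",  # placeholder repo URL from the snippet above
        entrypoint="prefect/flows/etl.py:etl_flow",
    ).serve(
        name="aws-blogs-etl",
        cron="15 0 * * *",
        tags=["prod"],
        description="The pipeline to populate the stage schema with the newest posts.",
        version="1.0.0",
    )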
Brock
11/21/2024, 12:58 AM
$ python deploy-etl-serve.py
Your flow 'aws-blogs-etl-flow' is being served and polling for scheduled runs!
To trigger a run for this flow, use the following command:
$ prefect deployment run 'aws-blogs-etl-flow/aws-blogs-etl'
You can also run your flow via the Prefect UI: <https://app.prefect.cloud/account/{account}/workspace/{workspace}/deployments/deployment/cef3eeec-ab3f-4a1a-b0f9-ab463975e429>
Brock
11/21/2024, 12:59 AM
Nate
11/21/2024, 1:01 AM
Brock
11/21/2024, 1:04 AM
Brock
11/21/2024, 1:05 AM
Nate
11/21/2024, 1:06 AM
just when it gets triggered for the cron job,
Brock
11/21/2024, 1:07 AM
Brock
11/21/2024, 1:08 AM
Brock
11/21/2024, 1:09 AM
Nate
11/21/2024, 1:10 AM
I'd either use .serve or run my own worker with prefect worker start so I could see the logs it spits out, bc they likely contain the cause of error (again, in lieu of better observability for managed workers that we're working on)
Brock
11/21/2024, 1:12 AM
Nate
11/21/2024, 1:19 AM
with pip_packages on that pool, you're opting into installing deps at runtime, which can be more risky in that regard
for example, just today 😅
Brock
11/21/2024, 1:21 AM