Samuel Hinton
02/24/2023, 3:23 AM

/Users/sh/Projects/flows/.venv/lib/python3.9/site-packages/prefect/tasks.py:270: UserWarning: A task named 'some_task' and defined at 'flows/data/endpoints/example.py:3' conflicts with another task. Consider specifying a unique `name` parameter in the task definition:
`@task(name='my_unique_name', ...)`
warnings.warn(
/Users/sh/Projects/flows/.venv/lib/python3.9/site-packages/prefect/flows.py:214: UserWarning: A flow named 'get-some-data' and defined at 'flows/data/endpoints/example.py:9' conflicts with another flow. Consider specifying a unique `name` parameter in the flow definition:
`@flow(name='my_unique_name', ...)`
warnings.warn(
13:20:46.203 | INFO | Flow run 'important-jaguarundi' - Created task run 'some_task-0' for task 'some_task'
Apparently both my task and my flow conflict with another…

This is… not possible? I'm playing around with Prefect and have a single example.py that contains a single flow and task. Is there a way I can figure out why Prefect thinks there's a conflict, or just silence the spurious warnings? (Ideally I'd like to figure out what's going on rather than ignoring the warnings.)

Ryan Peden
02/24/2023, 3:41 AM

It sounds like example.py is being loaded more than once.

How did you create your deployment? I don't encounter this when creating and running your code in my own local deployment, so perhaps we can find a way to adjust your deployment to solve this. 🙂

Samuel Hinton
02/24/2023, 3:42 AM

Deployment.build_from_flow(...).apply() where I use an S3 storage bucket

Ryan Peden
02/24/2023, 4:08 AM

Could you share your full call to build_from_flow? Feel free to redact any sensitive info; I'm just trying to gather enough info to try to reproduce the error.

Samuel Hinton
02/24/2023, 4:10 AM

d = Deployment.build_from_flow(
    flow=get_some_data,
    storage=s3_block,
    work_pool_name="process",
    schedule="long rrule here, will try without it now",
)
d.apply()
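Ryan's earlier point about example.py being loaded more than once is one plausible mechanism for a single file conflicting with itself: if the module is imported under two names (for instance as `__main__` when run directly, and again under its module name when the deployment machinery loads it from storage), its decorators execute twice and register the same task name twice. A minimal stdlib sketch of that kind of duplicate-name registry (hypothetical names; this is an illustration, not Prefect's actual internals):

```python
import warnings

# Hypothetical registry mimicking how a framework might track task names.
_registry = set()

def register_task(name):
    """Warn if this task name was already registered, then record it."""
    if name in _registry:
        warnings.warn(
            f"A task named {name!r} conflicts with another task", UserWarning
        )
    _registry.add(name)

# First registration (first import of the module) is silent.
register_task("some_task")

# Second registration (same module loaded again under another name) warns.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    register_task("some_task")

print(len(caught))  # → 1
```

The warning text in the log above fits this pattern: the task is defined only once in the source, but registered twice at runtime.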
from prefect import flow, task
from prefect.deployments import Deployment
from prefect.filesystems import S3
from prefect.settings import PREFECT_API_URL, temporary_settings


@task
def some_task():
    print("Some task")
    return "Finished"


@flow
def get_some_data():
    return some_task.submit()


if __name__ == "__main__":
    with temporary_settings({PREFECT_API_URL: "some_address:4200/api"}):
        s3_block = S3.load("flow-storage")
        Deployment.build_from_flow(
            flow=get_some_data, name="default", storage=s3_block, work_pool_name="process"
        ).apply()

That's the whole file, if it helps. It doesn't complain with that, so I'll start doing things like adding the schedule back in and see what ends up tripping it up.

Ryan Peden
02/24/2023, 2:41 PM

Samuel Hinton
02/26/2023, 10:48 PM
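On the "just silence the spurious warnings" option from the original question: Python's standard warnings filters can suppress only warnings whose message matches a pattern, leaving other UserWarnings intact. A sketch using a stand-in function rather than Prefect itself (the message pattern is taken from the log above; where exactly the warning fires in a real deployment is an assumption):

```python
import warnings

def define_conflicting_task():
    # Stand-in for a decorator that warns about a duplicate name,
    # echoing the Prefect message shown in the log above.
    warnings.warn(
        "A task named 'some_task' conflicts with another task", UserWarning
    )
    return "Finished"

# Ignore only warnings whose message mentions a name conflict;
# unrelated UserWarnings still surface normally.
with warnings.catch_warnings():
    warnings.filterwarnings(
        "ignore", message=".*conflicts with another.*", category=UserWarning
    )
    result = define_conflicting_task()

print(result)  # → Finished
```

Targeting the message keeps the suppression narrow, which matters if the underlying cause (as discussed above) is a genuine double registration worth diagnosing rather than hiding.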