
Samuel Hinton

02/24/2023, 3:23 AM
Hi team. Another confusing thing on my end - I’m getting warnings about naming conflicts:
/Users/sh/Projects/flows/.venv/lib/python3.9/site-packages/prefect/tasks.py:270: UserWarning: A task named 'some_task' and defined at 'flows/data/endpoints/example.py:3' conflicts with another task. Consider specifying a unique `name` parameter in the task definition:

 `@task(name='my_unique_name', ...)`
  warnings.warn(
/Users/sh/Projects/flows/.venv/lib/python3.9/site-packages/prefect/flows.py:214: UserWarning: A flow named 'get-some-data' and defined at 'flows/data/endpoints/example.py:9' conflicts with another flow. Consider specifying a unique `name` parameter in the flow definition:

 `@flow(name='my_unique_name', ...)`
  warnings.warn(
13:20:46.203 | INFO    | Flow run 'important-jaguarundi' - Created task run 'some_task-0' for task 'some_task'
Apparently both my task and my flow conflict with another… This is… not possible? I'm playing around with Prefect and have a single example.py that contains a single flow and task. Is there a way I can figure out why Prefect thinks there's a conflict, or just silence the spurious warnings? (Ideally I'd like to figure out what's going on rather than ignoring the warnings.)

Ryan Peden

02/24/2023, 3:41 AM
It almost looks like it is loading example.py more than once. How did you create your deployment? I don't encounter this when creating and running your code in my own local deployment, so perhaps we can find a way to adjust your deployment to solve this. 🙂
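(For illustration of what "loading the file more than once" means at the Python level, here is a standalone sketch, not Prefect-specific: the same source file imported under two different module names yields two distinct function objects, which is how a single definition can appear to conflict with itself to a framework that tracks definitions by identity. All file and module names below are hypothetical.)

```python
import importlib.util
import pathlib
import sys
import tempfile

# Write a tiny module to disk so we can load the same source twice
# under two different module names.
src = pathlib.Path(tempfile.mkdtemp()) / "example_dup.py"
src.write_text("def some_task():\n    return 'Finished'\n")

def load_as(module_name: str):
    """Load `src` as a module registered under `module_name`."""
    spec = importlib.util.spec_from_file_location(module_name, src)
    mod = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = mod
    spec.loader.exec_module(mod)
    return mod

a = load_as("flows_pkg.example")
b = load_as("example")

# One source file, but two separate function objects:
print(a.some_task is b.some_task)  # False
```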

Samuel Hinton

02/24/2023, 3:42 AM
I use Deployment.build_from_flow(...).apply() where I use an S3 storage bucket.
So the flow is defined in the S3 bucket. Does it also get cloudpickled and uploaded into the SQLite db or something?

Ryan Peden

02/24/2023, 4:08 AM
No, it should only be going to S3. Can you show the parameters you passed to build_from_flow? Feel free to redact any sensitive info; I'm just trying to gather enough info to try to reproduce the error.

Samuel Hinton

02/24/2023, 4:10 AM
Yeah sure thing:
d = Deployment.build_from_flow(
    flow=get_some_data,
    storage=s3_block, 
    work_pool_name="process",
    schedule="long rrule here, will try without it now",
)
d.apply()
from prefect import flow, task
from prefect.deployments import Deployment
from prefect.filesystems import S3
from prefect.settings import PREFECT_API_URL, temporary_settings


@task
def some_task():
    print("Some task")
    return "Finished"


@flow
def get_some_data():
    return some_task.submit()


if __name__ == "__main__":
    with temporary_settings({PREFECT_API_URL: "some_address:4200/api"}):
        s3_block = S3.load("flow-storage")

        Deployment.build_from_flow(
            flow=get_some_data, name="default", storage=s3_block, work_pool_name="process"
        ).apply()
That's the whole file, if it helps. It doesn't complain with that, so I'll start doing things like adding the schedule back in and see what ends up tripping it up.
So I think the issue is that I make my deployments in another Python file and import the flow from example.py (I effectively have a registry list of flows to iterate through and deploy, which is just the one at the moment). It seems to be treating that list of references as the thing getting in the way, even though it references the same function.
@Ryan Peden When I stuff things in one file instead of multiple files I can confirm the warnings vanish, but having the deployment in another file generally screws things up. I'm going to just ignore the warnings to silence these issues.
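(If silencing is the route taken, a targeted filter avoids hiding unrelated warnings. A plain-stdlib sketch; the message pattern below is an assumption, a regex matched against the warning text quoted earlier in this thread.)

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("default")  # deterministic baseline for the demo
    # Suppress only the naming-conflict UserWarnings, not all warnings.
    warnings.filterwarnings(
        "ignore",
        message=".*conflicts with another.*",
        category=UserWarning,
    )
    warnings.warn("A task named 'some_task' conflicts with another task.", UserWarning)
    warnings.warn("something unrelated", UserWarning)

# Only the unrelated warning is recorded; the conflict warning is filtered.
print([str(w.message) for w in caught])
```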

Ryan Peden

02/24/2023, 2:41 PM
Doing it in a separate file should work without giving this message, so it seems like it has something to do with what's happening in your deployment file. If you're willing to share that code via DM, I can probably provide insight into what's happening and give suggestions about how to avoid the warning. Having said that, I don't think the warning will cause you problems right now; it's meant to let you know when you accidentally overwrite an existing flow with another one. The downside to ignoring it is that you won't see the warning in the event you do accidentally use a duplicate flow or task name.

Samuel Hinton

02/26/2023, 10:48 PM
I’ll see if I can rustle up a complete example for you 🙂
Complete example here for you. There's the file with flows, a file to do the deployment, and the file containing the flow registry. This is used because we will have many flows with specific deployment configurations (like specific schedules, work pool/queue, etc.), and this registry just gathers up all the flows using a very simple decorator. EDIT: @Ryan Peden please disregard the code. I believe the issue was running the agent locally to test things while using S3. S3 and my venv were (I think) causing the issue.
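(The registry pattern described above can be sketched in a few lines. This is a hypothetical illustration, not the actual code from the thread and not a Prefect API: a decorator records each flow together with its deployment options, and a separate deployment script iterates over the registry.)

```python
# All names here are illustrative placeholders.
FLOW_REGISTRY = []

def register_flow(**deploy_options):
    """Record the decorated function plus its deployment options."""
    def decorator(fn):
        FLOW_REGISTRY.append((fn, deploy_options))
        return fn  # leave the function itself untouched
    return decorator

@register_flow(work_pool_name="process", schedule=None)
def get_some_data():
    return "Finished"

# A deployment script would then loop over the registry:
for fn, options in FLOW_REGISTRY:
    print(f"deploying {fn.__name__} with {options}")
```

One consequence of routing flows through a second module like this is that the flow's defining file may be imported under a different module path than the one the worker uses, which matches the double-loading hypothesis discussed earlier in the thread.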