prefect-community
  • f

    Freddie

    08/19/2022, 10:10 AM
    Hi folks! Just a quick one - I'm trying to delete some old projects. This is on 0.x. I'm getting GraphQL timeouts when trying to delete them and wondered if there's another way to get them deleted? I've turned off all the schedules, I'd just really like to get rid of them to avoid confusion / UI clutter.
    ✅ 1
    b
    • 2
    • 5
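    One possible workaround (a rough, untested sketch, assuming the Prefect 1.x Python client and that the Server/Cloud GraphQL API exposes a delete_project mutation; the project ID is a placeholder):
    from prefect import Client

    client = Client()
    project_id = "00000000-0000-0000-0000-000000000000"  # placeholder: the real project ID
    result = client.graphql(
        f"""
        mutation {{
          delete_project(input: {{ project_id: "{project_id}" }}) {{
            success
          }}
        }}
        """
    )
    print(result)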
  • b

    Bigya Man Pradhan

    08/19/2022, 10:18 AM
    Hi all, I am using prefect 2.1.1 and trying to build a deployment through the provided Python object (from prefect.deployments import Deployment). I am not sure how to get an infrastructure block and a storage block (created through the UI) added to the Deployment. Is this feature not included in the object?
    ✅ 1
    a
    • 2
    • 13
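    A minimal sketch of one way this can look, assuming the blocks were saved from the UI under the (made-up) names below - blocks created in the UI are pulled in with .load():
    from prefect.deployments import Deployment
    from prefect.filesystems import S3
    from prefect.infrastructure import KubernetesJob

    from my_project.flow import my_flow  # your flow function

    deployment = Deployment.build_from_flow(
        flow=my_flow,
        name="my-deployment",
        # blocks created through the UI are loaded by their block names
        storage=S3.load("my-storage-block"),
        infrastructure=KubernetesJob.load("my-infra-block"),
    )
    deployment.apply()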
  • i

    Iuliia Volkova

    08/19/2022, 10:19 AM
    Prefect 2 question about remote storage. I'm trying to build and deploy a pipeline according to the documentation https://docs.prefect.io/concepts/deployments/#build-the-deployment. I run the command as explained in the docs, and I see that Prefect puts the WHOLE folder I'm in (the whole working directory) into the bucket - take a look at the screenshot. If I run the command from my $HOME path, will it upload my entire home directory to the remote bucket? Logically, it should upload only the flow, or the flow directory - the path defined in the command - not everything around it. Should I open an issue for this, or is there a reason behind this behavior?
    ✅ 1
    o
    a
    j
    • 4
    • 9
  • b

    Ben Strange

    08/19/2022, 11:08 AM
    Hello everyone, has anyone attempted accessing a public S3 bucket using the S3 block, or know of a workaround? I've had a good search around but can't seem to find how to change the S3 validation.
    ✅ 1
    a
    • 2
    • 1
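    One possible workaround (an untested sketch): use a RemoteFileSystem block instead and pass anonymous-access settings straight through to the underlying s3fs filesystem, assuming s3fs's anon option behaves as expected for public buckets:
    from prefect.filesystems import RemoteFileSystem

    public_bucket = RemoteFileSystem(
        basepath="s3://my-public-bucket/some/prefix",  # hypothetical public bucket
        # settings are passed straight to fsspec/s3fs; anon=True requests unsigned access
        settings={"anon": True},
    )
    public_bucket.save("public-s3", overwrite=True)

    # quick check that reads work without credentials
    data = public_bucket.read_path("example.csv")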
  • f

    Florian Kühnlenz

    08/19/2022, 1:36 PM
    In Prefect Cloud 1.0, how can we find the tasks that are blocking a task concurrency limit?
    ✅ 1
    a
    • 2
    • 4
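    A rough idea (an unverified sketch against the Prefect 1 GraphQL schema): query the currently running task runs together with their task tags, then pick out the ones carrying the tag the concurrency limit applies to:
    from prefect import Client

    client = Client()
    result = client.graphql(
        """
        query {
          task_run(where: { state: { _eq: "Running" } }) {
            id
            name
            task {
              name
              tags
            }
          }
        }
        """
    )
    running = result.data.task_run
    # "my-limited-tag" stands in for the tag the concurrency limit is attached to
    blockers = [tr for tr in running if "my-limited-tag" in (tr["task"]["tags"] or [])]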
  • j

    John Mizerany

    08/19/2022, 1:56 PM
    We are hoping to use Parameters to pass in a different value for one of our flows (through the UI and also through the API). Is it best to just have a default parameter value in the flow definition at first, so it will change if we pass another value through? I can't seem to find in the docs how to grab the parameter that is passed into the flow, especially if we create a flow run with the API. We are using Prefect 1.0 atm.
    ✅ 1
    a
    • 2
    • 2
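    For reference, the usual Prefect 1 pattern (a small sketch; the parameter, flow, and ID values are made up): declare a Parameter with a default inside the flow, and any value supplied when creating a flow run (via UI or API) overrides it:
    from prefect import Client, Flow, Parameter, task

    @task
    def say_hello(name):
        print(f"Hello, {name}!")

    with Flow("greeting-flow") as flow:
        # the default is used unless the flow run supplies a different value
        name = Parameter("name", default="world")
        say_hello(name)

    # triggering a run through the API with an overridden parameter value
    client = Client()
    client.create_flow_run(flow_id="<registered-flow-id>", parameters={"name": "Marvin"})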
  • b

    Bigya Man Pradhan

    08/19/2022, 2:18 PM
    Hi all, Unsure if this is a common issue or a configuration issue on my side. Getting the following issue:
    ValueError: 'api.prefect.cloud' does not appear to be an IPv4 or IPv6 address
    Complete error log in comment.
    k
    n
    • 3
    • 12
  • h

    Hawkar Mahmod

    08/19/2022, 2:28 PM
    What is the idiomatic Prefect (2.0) way of handling failed tasks? My task raises an exception that bubbles all the way up when I run my flow locally. I'd like to be able to have the flow finish with a failed state and not see the whole stack trace.
    a
    • 2
    • 1
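    One option (a sketch, assuming the return_state=True keyword from the Prefect 2.0 task call interface): ask the task call for its final state instead of its result, so the exception isn't re-raised inside the flow, and return a failed state if the run should still be marked Failed:
    from prefect import flow, task

    @task
    def fragile():
        raise ValueError("something went wrong")

    @flow
    def my_flow():
        # return_state=True hands back the task's final state instead of raising
        state = fragile(return_state=True)
        if state.is_failed():
            # returning the failed state marks the flow run as Failed
            # without the flow itself dumping the full traceback
            return state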
  • j

    Jared Robbins

    08/19/2022, 3:06 PM
    Anyone have favorite built-in Prefect tasks? Just noticed there was an SFTP one after I implemented it myself.
    a
    j
    • 3
    • 2
  • t

    Tony Yun

    08/19/2022, 3:37 PM
    How do I stop the async task runs in the flow context? I need to execute a loop in the flow context, but I don't want it to start the next iteration before the previous one is finished. I see the tasks are executed asynchronously. For example:
    with Flow(
        name="ETL",
    ) as flow:
        page = 0
        max_page = 999999
        while page <= max_page:
            page += 1
            dummy_task()
    m
    • 2
    • 2
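    If this is Prefect 1 (the Flow context suggests so), one way to make each iteration wait for the previous one is to chain explicit upstream dependencies - a sketch with a bounded loop and a placeholder dummy_task:
    from prefect import Flow, task

    @task
    def dummy_task(page):
        print(f"processing page {page}")

    with Flow(name="ETL") as flow:
        previous = None
        for page in range(1, 11):
            # upstream_tasks forces this call to wait for the previous iteration
            upstream = [previous] if previous is not None else []
            previous = dummy_task(page, upstream_tasks=upstream)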
  • i

    Ilya Galperin

    08/19/2022, 3:48 PM
    When using the python Deployment object and specifying a remote S3 flow storage block and Kubernetes infrastructure, we are seeing strange behavior on flow execution. The deployment pushes the flow to S3 storage as expected (which is confirmed by the storage block in the Prefect Cloud UI being referenced in the deployment UI) but errors out with the following:
    Flow could not be retrieved from deployment...FileNotFoundError: [Errno 2] No such file or directory: '/My/Local/Path/my_project/flow.py'
    where the path is the absolute path on the machine that applied the deployment, whereas the path in the S3 bucket is just bucketname://flow.py. Here is the code we are using, if anyone has any ideas:
    from prefect.deployments import Deployment
    from prefect.infrastructure import KubernetesJob
    from prefect.filesystems import S3
    from my_project.flow import entrypoint
    
    infrastructure = KubernetesJob(namespace="prefect2")
    
    deployment = Deployment.build_from_flow(
        flow=entrypoint,
        name="my_deployment",
        work_queue_name="default",
        storage=S3.load("default-block"),
        infrastructure=infrastructure,
    )
    deployment.apply()
    k
    d
    r
    • 4
    • 24
  • o

    Owen Cook

    08/19/2022, 4:20 PM
    Hi, has anyone else had a problem with flow retries with deployments? I ran my flow directly in a Python script and the retries were working fine. I then pushed the same flow to a deployment and ran the deployment on an agent, and the retries just didn't happen. (using prefect 2.1.1)
    🙌 1
    ✅ 1
    a
    • 2
    • 1
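    For reference, a minimal sketch of how flow-level retries are declared on the decorator (the values are arbitrary); the same decorated flow is then what the deployment points at:
    from prefect import flow, task

    @task
    def flaky():
        raise RuntimeError("transient failure")

    @flow(retries=2, retry_delay_seconds=10)
    def my_flow():
        flaky()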
  • n

    Neil Natarajan

    08/19/2022, 4:43 PM
    What replaces the following logic in Prefect 2.0?
    schedule = IntervalSchedule(interval=timedelta(seconds=30))
    
    with Flow("workflow", schedule=schedule) as workflow:
    Is there a way to run a flow on an interval schedule using the DaskTaskRunner in Prefect 2.0?
    c
    • 2
    • 3
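    A rough Prefect 2 equivalent (a sketch, assuming the prefect-dask collection is installed and that Deployment.build_from_flow accepts a schedule): the task runner moves onto the flow decorator, and the interval schedule is attached to a deployment:
    from datetime import timedelta

    from prefect import flow
    from prefect.deployments import Deployment
    from prefect.orion.schemas.schedules import IntervalSchedule
    from prefect_dask import DaskTaskRunner

    @flow(task_runner=DaskTaskRunner())
    def workflow():
        ...

    deployment = Deployment.build_from_flow(
        flow=workflow,
        name="every-30-seconds",
        schedule=IntervalSchedule(interval=timedelta(seconds=30)),
    )
    deployment.apply()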
  • i

    Ilya Galperin

    08/19/2022, 4:49 PM
    Are there any plans (or existing methods) to allow users to save flow code to its storage block in an ad-hoc manner, independently of the deployment process? This seems like it could be nice for CI/CD workflows where only the flow code used by existing deployments is updated, but no new deployment actually needs to be created. It seems like the workaround for this right now is just to build a “dummy” deployment but not apply it, but I’m afraid that might introduce unintended side effects down the line.
    👍 1
    a
    • 2
    • 8
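    One lighter-weight workaround than a dummy deployment might be writing to the block directly (a sketch; the block name and paths are made up, and it assumes the filesystem blocks' put_directory method is safe to call on its own):
    from prefect.filesystems import S3

    storage = S3.load("default-block")
    # upload the project folder to the same location the existing
    # deployments already read their flow code from
    storage.put_directory(local_path="./my_project", to_path="my_project")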
  • p

    Payam K

    08/19/2022, 6:15 PM
    Hi All. I have a dumb design question on how to use Prefect Cloud + ECS Fargate. I have a flow registered to an ECS cluster. Now I want to iterate over a list of values for a parameter (e.g. [1,2,3,4,5]) that I use inside a task of the registered flow, for example:
    import pandas as pd
    import prefect
    from prefect import task

    @task
    def make_df(i):
        logger = prefect.context.get("logger")
        logger.info("Hi from Prefect %s", prefect.__version__)
        logger.info("this is the first step")
        data = {'Name': ['Tom', 'Brad', 'Kyle', 'Jerry'],
                'Age': [20, i**2, 2*i, 18*i],
                'Height': [6.1, 5.9, 6.0, 6.1]}
        df = pd.DataFrame(data)
        return df
    How should I design this to run 5 parallel tasks in the ECS cluster?
    a
    • 2
    • 8
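    One common Prefect 1 pattern for this (a sketch reusing the make_df task above): map the task over the list of values and give the flow a parallel executor, so the five make_df runs can execute concurrently inside the ECS task's container:
    from prefect import Flow, Parameter
    from prefect.executors import LocalDaskExecutor

    with Flow("make-dfs", executor=LocalDaskExecutor()) as flow:
        values = Parameter("values", default=[1, 2, 3, 4, 5])
        # .map() creates one mapped task run per element of the list
        dfs = make_df.map(values)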
  • k

    Kal

    08/19/2022, 6:39 PM
    See screenshots in thread! Hello! I am running into an issue when trying to build a deployment with the Prefect CLI. Basically I am trying to use an S3 storage block, but when I run the command it just hangs and never finishes. The files do get uploaded to the specified S3 bucket, but no YAML file is ever created, so I can't apply the deployment. Please let me know if you have a solution. Thanks!
    a
    • 2
    • 5
  • t

    Tim Enders

    08/19/2022, 6:55 PM
    Back again with DB QueuePool errors. But this time I have the whole stack trace! Anybody know how to solve this? Stack trace in thread. I am running a Postgres backend with Prefect 2.0.
    m
    • 2
    • 9
  • k

    kwmiebach

    08/19/2022, 7:12 PM
    Hi. I wonder what the field name for the flow-run name (the unique animal names) is. If I knew its field name it would help with searching, including inside the code. Next question: how can I set it? I'd like to set my own silly unique names instead of these. Is there just a keyword parameter for it? And if there is no easy way, what is the hard way? Can I swap / hack / override some function or module? Which module / function creates the names? In my project it would be helpful to set this field to something more useful. Thank you!
    o
    • 2
    • 4
  • c

    Chandrashekar Althati

    08/19/2022, 7:54 PM
    Hi, I just signed up using my work email to try Prefect (https://app.prefect.cloud/), but I didn't receive the verification email. I have been waiting for 2 hours. 😞
    k
    j
    j
    • 4
    • 12
  • p

    Paco Ibañez

    08/19/2022, 9:28 PM
    Hello, what is the correct way to create a deployment for a flow so that the flow code gets uploaded to remote storage? I have seen examples using the packager or storage arguments, but neither of them is working for me.
    deployment = Deployment.build_from_flow(
        name="docker-example",
        flow=my_docker_flow,
        packager=FilePackager(filesystem=RemoteFileSystem.load('minio-docker')),
        # storage_block=RemoteFileSystem.load('minio-docker'),
        infrastructure=DockerContainer(
            image = 'prefect-orion:2.1.1',
            image_pull_policy = 'IF_NOT_PRESENT',
            networks = ['prefect'],
            env = {
                "USE_SSL": False,
                "AWS_ACCESS_KEY_ID": "blablabla",
                "AWS_SECRET_ACCESS_KEY": "blablabla",
                "ENDPOINT_URL": '<http://minio:9000>',
            }
        ),
    )
    deployment.apply()
    With the above code, the deployment is created but the flow is not uploaded to MinIO.
    i
    • 2
    • 2
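    A stripped-down sketch of the variant that matches the working S3 example earlier in the channel, assuming the storage argument (rather than packager) is what build_from_flow uses to upload the flow files:
    from prefect.deployments import Deployment
    from prefect.filesystems import RemoteFileSystem
    from prefect.infrastructure import DockerContainer

    deployment = Deployment.build_from_flow(
        name="docker-example",
        flow=my_docker_flow,
        # storage (not storage_block / packager) controls where the flow code is pushed
        storage=RemoteFileSystem.load("minio-docker"),
        infrastructure=DockerContainer(image="prefect-orion:2.1.1"),
    )
    deployment.apply()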
  • a

    Alexander Kloumann

    08/19/2022, 9:31 PM
    Hello everyone, I have a question about how to handle failures in flow runs. I have a basic ETL pipeline set up as a flow that is scheduled to execute daily, and it ingests data from about 100 different endpoints. It is expected that the ingest for most of those endpoints will be successful, but also that there will always be several that are unsuccessful. Since some are unsuccessful, Prefect shows the entire flow run as a failure. Is there a way to change that? I'd prefer that it only show as a failure if none of the ingests are successful, because it's set up to alert the people who use this about failures. The Flow object in Python is set up sort of like this:
    source_ids = ["source_1", "source_2", "source_3"]
    with Flow("my_flow") as flow:
        for source_id in source_ids:
            data = extract(source_id)
            data = transform(data)
            load(data)
    Thanks in advance!
    n
    j
    • 3
    • 4
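    One Prefect 1 approach that may fit (a sketch reusing the extract/transform/load tasks from the snippet above): add a final task with an any_successful trigger and make it the flow's only reference task, so the run is marked failed only when every ingest fails:
    from prefect import Flow, task
    from prefect.triggers import any_successful

    @task(trigger=any_successful)
    def finalize():
        # succeeds as long as at least one upstream load succeeded
        pass

    source_ids = ["source_1", "source_2", "source_3"]
    with Flow("my_flow") as flow:
        loads = []
        for source_id in source_ids:
            data = extract(source_id)
            data = transform(data)
            loads.append(load(data))
        final = finalize(upstream_tasks=loads)

    # the flow run's final state is taken from the reference task only
    flow.set_reference_tasks([final])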
  • o

    Oscar Björhn

    08/20/2022, 1:09 PM
    For Prefect 2: Is there a recommended way/best practice to generate logs from within one of my own modules? The modules are not contained in their own packages, they're just .py files in a folder (everything that's packaged seems to log properly). The modules are made up of some pretty big classes and functions and ideally these modules should not have any prefect dependencies. Unless there's no other way, then we'll have to live with it. 🙂 I'm probably missing something obvious, I don't have that much experience with the various ways you can set up Python loggers. I couldn't get this working using PREFECT_LOGGING_EXTRA_LOGGERS, but perhaps I've just set up my modules incorrectly.
    ✅ 1
    a
    • 2
    • 6
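    For what it's worth, the pattern PREFECT_LOGGING_EXTRA_LOGGERS is meant to pick up is a plain stdlib logger named after the module (a sketch; my_helpers stands in for one of your own .py files):
    # my_helpers.py - no Prefect imports required
    import logging

    logger = logging.getLogger(__name__)  # the logger name will be "my_helpers"

    def do_work():
        logger.info("doing work from a plain module")
    With PREFECT_LOGGING_EXTRA_LOGGERS=my_helpers set in the environment where the flow runs, Prefect should attach its handlers to that logger so the records show up with the flow run logs.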
  • m

    Mohamed Ayoub Chettouh

    08/20/2022, 3:37 PM
    (Prefect 2) I want my flow to write a file (more precisely, I want one of the tasks to do it). My file system block is GCS, but it's preferable if it outputs locally. Currently, the file doesn't seem to be created at all. What should I do?
    ✅ 1
    a
    • 2
    • 5
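    If the file is simply being written relative to whatever working directory the flow run happens to use, writing to an explicit absolute path from inside the task is the simplest thing to check (a sketch; the path is arbitrary):
    from pathlib import Path

    from prefect import flow, task

    @task
    def write_report(text: str) -> str:
        # an absolute path avoids surprises from the flow run's working directory
        out = Path("/tmp/report.txt")
        out.write_text(text)
        return str(out)

    @flow
    def my_flow():
        print(write_report("hello"))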
  • y

    Yaron Levi

    08/20/2022, 5:00 PM
    Hi 👋👋👋 Is there a command in a Prefect flow that will stop (kill) the current agent? To give some context, our agent is a cron job in Render (a Heroku alternative) that wakes up from time to time and starts the agent to process flows. But we don't need it to run all the time, so after it picks up a flow from the queue and the flow is done, we want the agent to stop (and then the cron job in Render will also stop). I think process.kill at the end of the flow might work, but that seems a bit wrong.
    👋 1
    a
    • 2
    • 1
  • a

    Amjad Salhab

    08/20/2022, 5:09 PM
    Hi there, is there a guide or steps to set up a Prefect 2 environment that runs the Orion server and agent in different containers on different servers? I was able to do that, but the agent refuses to connect to the server when using HTTPS (it only works with HTTP). Are there additional configurations needed to have the agent connect to the Orion server over HTTPS?
    a
    r
    • 3
    • 7
  • v

    vk

    08/20/2022, 8:18 PM
    Hi all, I have a question about Orion: is there a way to call tasks from other tasks? In Prefect 1.0 it's possible to call some_task.run(args) from other tasks, and it's very handy, because it's often necessary to call existing tasks (especially from the Prefect task library) deep inside other tasks. In Orion I didn't find how to do that, but I'm pretty sure there should be some way?
    j
    b
    • 3
    • 4
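    The closest Orion equivalent of some_task.run(args) appears to be the task's .fn attribute, which is the original undecorated function (a small sketch):
    from prefect import flow, task

    @task
    def some_task(x):
        return x * 2

    @task
    def bigger_task(x):
        # .fn is the plain Python function underneath the task,
        # so this runs inline rather than as a separate task run
        return some_task.fn(x) + 1

    @flow
    def my_flow():
        return bigger_task(10)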
  • m

    Michael Z

    08/21/2022, 3:47 AM
    Hi All, new to Prefect ... would anyone be willing to link some open source projects that use Prefect? Basically I just want to see some more complex scenarios than what is shown in the docs. Like a more complicated graph dealing with many different tasks and states. Thanks for your help!
    ✅ 2
    a
    • 2
    • 3
  • a

    Amjad Salhab

    08/21/2022, 11:52 AM
    Hi All, I have Orion up and running connected to a PostgreSQL database. It was working fine until I shut down Orion and tried to run it again; now Orion is trying to create the database tables again, as shown in the image below. Is there a way to avoid initializing the database tables on every Orion start-up?
    ✅ 2
    a
    • 2
    • 3
  • l

    Low Kim Hoe

    08/21/2022, 11:54 AM
    Hi All, new to Prefect. I found that we can pass the schedule parameter under prefect.deployments in the Python API. May I ask what kinds of schedules we can use? Thanks.
    ✅ 1
    a
    • 2
    • 2
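    At the time of writing the schedule types live under prefect.orion.schemas.schedules - interval, cron and rrule - for example (a sketch with arbitrary values):
    from datetime import timedelta

    from prefect.orion.schemas.schedules import (
        CronSchedule,
        IntervalSchedule,
        RRuleSchedule,  # rrule-based schedules are the third option
    )

    every_ten_minutes = IntervalSchedule(interval=timedelta(minutes=10))
    weekday_mornings = CronSchedule(cron="0 9 * * 1-5", timezone="UTC")
    # either object can be passed as the schedule argument of a Deployment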
  • f

    Fady Khallaf

    08/21/2022, 1:44 PM
    Hello All, Is there a way to secure Orion UI? I am not using prefect cloud so I am looking for a way to have authentication on Orion UI instead of being publicly accessible
    ✅ 1
    ⬆️ 1
    m
    a
    • 3
    • 2
m

Mohamed Alaa

08/21/2022, 1:49 PM
I am having a similar issue too and would love to see a tutorial or an example for it
a

Anna Geller

08/21/2022, 2:26 PM
Check this topic for more information about it
👍 2