• Ajith Kumara Beragala Acharige Lal

    1 year ago
    Hi Prefect-Experts, can someone point me to a simple example showing the steps to deploy and schedule a flow on a Kubernetes cluster via Helm? The Prefect server is also deployed to Kubernetes via Helm, and now I need to try out a flow on that Prefect server. Does anyone have an example or page?
    7 replies
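    A minimal sketch of the flow side, assuming Prefect 1.x with Server installed from the Helm chart and a Kubernetes agent already running in the cluster; the "k8s" label and "examples" project are placeholders, not fixed conventions:

    from prefect import Flow, task
    from prefect.run_configs import KubernetesRun
    from prefect.schedules import CronSchedule

    @task
    def say_hello():
        print("hello from kubernetes")

    # The agent in the cluster must share the "k8s" label (placeholder);
    # it picks up scheduled runs and launches them as Kubernetes jobs.
    with Flow(
        "hello-k8s",
        schedule=CronSchedule("0 * * * *"),  # top of every hour
        run_config=KubernetesRun(labels=["k8s"]),
    ) as flow:
        say_hello()

    # Point the client at the server backend first (`prefect backend server`),
    # then register so the scheduler can create the cron runs.
    flow.register(project_name="examples")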
  • Tobias Heintz

    1 year ago
    Hi, we are evaluating Prefect to run our ML pipelines. Currently these run as a series of ECS tasks on Fargate, triggered periodically through CloudWatch. The big contender for the job is Airflow, where our setup would look something like this: run the Airflow UI and scheduler in the cloud (probably also as ECS tasks), and have a DAG that triggers the existing pieces of the pipeline in ECS. So far I haven't been able to find similar functionality for Prefect. Here's what I understood: we would run the Prefect UI and server in the cloud, and have an ECS agent deployed that talks to the server. This agent would be able to run entire workflows within a dedicated ECS task, but I would rather trigger existing tasks. I would love it if you could point me to documentation that explains the architecture, and maybe covers our use case as well. Thanks!
    6 replies
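    One possible shape for "trigger the existing pieces" rather than running the whole workflow inside Prefect, sketched with boto3 from inside Prefect tasks; the cluster, subnet, and task definition names are hypothetical:

    import boto3
    from prefect import Flow, task

    @task
    def run_ecs_task(task_definition: str) -> str:
        # Start an existing, pre-registered ECS task on Fargate and return
        # its ARN; "ml-cluster" and the subnet ID are placeholders.
        ecs = boto3.client("ecs")
        response = ecs.run_task(
            cluster="ml-cluster",
            launchType="FARGATE",
            taskDefinition=task_definition,
            networkConfiguration={
                "awsvpcConfiguration": {"subnets": ["subnet-0123456789"]}
            },
        )
        return response["tasks"][0]["taskArn"]

    with Flow("trigger-existing-ecs") as flow:
        preprocess = run_ecs_task("preprocess-task-def")
        train = run_ecs_task("train-task-def")
        train.set_upstream(preprocess)  # keep the existing pipeline order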
  • Fina Silva-Santisteban

    1 year ago
    Hi Prefect community! I’m using Prefect Cloud and deploying flow runs with the ECS Fargate agent successfully. To trigger a flow run I currently need a `prefect ecs agent` running in a terminal, and then I use the Prefect dashboard to trigger the run. I’d like to build a standalone app with a custom GUI that lets a user input flow parameters and trigger a flow run. I imagine I’ll have to create a Prefect ECS agent using the Prefect API instead of running it in a separate terminal, but I’m not sure how to go about that. Please let me know of any deployment recipes that fit my use case! 🙏
    15 replies
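    A sketch of the API side under Prefect 1.x: the agent stays a long-lived process (e.g. an ECS service) rather than a terminal session, and the custom GUI only needs to call the Cloud API to kick off a run. The flow ID, parameters, and run name below are placeholders:

    from prefect import Client

    client = Client()  # reads the Prefect Cloud API key from local config
    flow_run_id = client.create_flow_run(
        flow_id="<your-flow-id>",          # placeholder
        parameters={"customer_id": 123},   # values collected from the GUI
        run_name="run-from-gui",
    )
    print(f"started flow run {flow_run_id}")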
  • Puja Thacker

    1 year ago
    #prefect-community --- Hello everyone. I have a quick question on Prefect Cloud: how does Prefect Cloud isolate one client's content from another's? Using Prefect Cloud, we will be accessing the open Prefect API, and I am trying to understand at what level it isolates one client from another. For example, the Informatica Cloud API maintains an "Organization" for each client and each environment, and Tableau maintains "Sites" for each client for content isolation. In a similar way, how does Prefect Cloud isolate clients from each other?
    1 reply
  • Puja Thacker

    1 year ago
    Also, if we need different environments, like Dev, Preprod, and Prod, how is that implemented at the Prefect Cloud level?
    1 reply
  • Julian

    1 year ago
    Hey, I wonder if it is possible to name future scheduled flow runs from a provided flow-run name template and the scheduled time. At the moment, we schedule most of our flows using a CronSchedule, and the Prefect scheduler then creates future flow runs with funny names such as `crazy_monkey` that don't provide any actual value. Since it is quite nice (from a UI perspective) to have meaningful names, we rename these flow runs during runtime to something like `{flow_name}_{system_time}` or `{flow_name}_{parameters}`. However, if a flow run fails before executing the rename task, it won't have this "nice" name but the original (meaningless) one. Renaming inside the flow also overwrites flow-run names that were provided when manually starting a flow run with a name, e.g. from the UI.
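    For reference, the runtime-rename pattern described above can be written with the built-in RenameFlowRun task in Prefect 1.x; the flow name and timestamp format here are illustrative:

    import prefect
    from prefect import Flow, task
    from prefect.tasks.prefect import RenameFlowRun

    rename = RenameFlowRun()

    @task
    def build_name() -> str:
        # Mirror the {flow_name}_{system_time} template using values from
        # the runtime context.
        ctx = prefect.context
        return f"{ctx.flow_name}_{ctx.scheduled_start_time:%Y-%m-%dT%H-%M}"

    with Flow("nightly-etl") as flow:
        rename(flow_run_name=build_name())

    As noted, this only takes effect once the run reaches the rename task; runs that fail earlier keep their generated names.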
  • Milly gupta

    1 year ago
    Hi all, does the Prefect agent persist all task results somewhere?
    11 replies
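    For context, in Prefect 1.x persisted results are written by the flow run itself through a configured Result class, not by the agent; a minimal sketch with LocalResult (the directory is arbitrary):

    from prefect import Flow, task
    from prefect.engine.results import LocalResult

    # checkpoint=True persists this task's return value via the Result;
    # local runs also need PREFECT__FLOWS__CHECKPOINTING=true.
    @task(result=LocalResult(dir="/tmp/prefect-results"), checkpoint=True)
    def compute():
        return 42

    with Flow("results-demo") as flow:
        compute()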
  • Siva Kumar

    1 year ago
    #prefect-community How can I return DataFrame objects or serialized objects from a task function? When I try to return the object, I get an exception saying the cloudpickle serializer cannot serialize the object.
    1 reply
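    One direction that may help, sketched for Prefect 1.x: persist the DataFrame result with PandasSerializer instead of the default cloudpickle serializer; the directory and file type are illustrative:

    import pandas as pd
    from prefect import Flow, task
    from prefect.engine.results import LocalResult
    from prefect.engine.serializers import PandasSerializer

    # Serialize the returned DataFrame as CSV rather than cloudpickle.
    @task(result=LocalResult(
        dir="/tmp/prefect-results",
        serializer=PandasSerializer(file_type="csv"),
    ))
    def make_frame() -> pd.DataFrame:
        return pd.DataFrame({"a": [1, 2], "b": [3, 4]})

    with Flow("dataframe-demo") as flow:
        make_frame()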
  • Joël Luijmes

    1 year ago
    Hey, I’m looking for something similar to what was asked here (batch mapped tasks or nested mapping). Has anyone got something like this working? The example provided by Michael does something different from what I want. What I’m after is:
    1. Retrieve some dynamic list (i.e. query from a database)
    2. Batch the result set into chunks of ~20 items to process in parallel
    3. For each item in the batch, run a branch of chained tasks
    4. Wait for the batch to complete, then repeat for the next batch until exhausted
    Basically I want to write something like:
    # Desired pseudocode; fixed_window is a hypothetical helper that
    # splits the dynamic list into consecutive batches.
    tasks = dynamic_list_of_tasks()
    windowed_tasks = fixed_window(tasks, window_size=5)

    def process_item(item):
        x = task_1(item)
        task_2(x)

    def process_window(tasks):
        apply_map(process_item, tasks)

    apply_map(process_window, windowed_tasks)
    6 replies
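    One workaround sketch under Prefect 1.x, where nested mapping isn't directly supported: compute the windows in a task and map over them, running each window's per-item work inside a single task body. This collapses the chained task_1 -> task_2 branch into plain function calls, so it only approximates the ask; all names are illustrative:

    from prefect import Flow, task

    @task
    def dynamic_list():
        return list(range(23))  # stand-in for a database query

    @task
    def fixed_window(items, window_size=5):
        # Split items into consecutive batches of window_size.
        return [items[i:i + window_size] for i in range(0, len(items), window_size)]

    @task
    def process_window(window):
        # Per-item chained work runs inline, one mapped task per batch.
        return [item * 2 + 1 for item in window]

    with Flow("windowed-map") as flow:
        items = dynamic_list()
        windows = fixed_window(items)
        process_window.map(windows)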