• Maximilian Schnieder

    1 month ago
    Hi, I was wondering if I could use Prefect for object methods. My use case: instead of creating a thread that runs inside the object every 60 seconds, I would create a scheduled flow for the object. Is this even possible? And what would the deployment file look like?
    5 replies
• John Kang

    1 month ago
    Encountered an issue with the Agent in a Windows terminal. Just an FYI (on 2.0.2).
    5 replies
• Jonas Dahlbæk

    1 month ago
    Question: We are looking for a pipeline orchestrator and I have been testing Prefect. I had a very good experience following the deployment docs, but one thing has me a bit confused. I was testing out the flow versioning feature by sending a TERM signal to the agent while it was running a flow, and it seems this leaves the flow in a Running state indefinitely (or Pending, depending on when I interrupt the agent). I am following https://docs.prefect.io/concepts/work-queues/ for the agent setup, and what I see looks similar to the issue described in https://github.com/PrefectHQ/prefect/issues/2834, so I went looking for Docker or Kubernetes agents. However, I'm not finding anything for Prefect v2. Am I doing something horribly wrong? What is the best practice for handling flows left hanging due to issues on the agent side?
    6 replies
• Darren

    1 month ago
    I'm still new to tools such as Prefect. I am trying to automate our onboarding process, which pulls (API/JSON) employee information from one source and creates accounts in three other applications via API/JSON. My thought process is to create a flow that pulls the data, checks whether the accounts exist, and creates them if they don't. My question: would it be better to pass the data between the tasks, or to place the data in a store such as a file (CSV, maybe?) or a database?
    7 replies
• Karl Bühler

    1 month ago
    What is best practice for API calls (requests sessions)? My issue: I would like to use the same session object across my scheduled flows. I tried just caching the session object, but this doesn't lead to the desired results. Are there any best practices for handling sessions here?
    1 reply
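One likely reason caching fails: each scheduled flow run typically starts in a fresh process, so nothing cached in memory survives between runs. What does work is reusing one session for all calls *within* a run's process, via a lazily created module-level session (a plain-Python sketch, no Prefect specifics; the header value is made up):

```python
from typing import Optional

import requests

_session: Optional[requests.Session] = None


def get_session() -> requests.Session:
    """Create the session on first use and reuse it for every later call
    in the same process; a new flow-run process builds a fresh one,
    which is unavoidable across separate scheduled runs."""
    global _session
    if _session is None:
        _session = requests.Session()
        _session.headers.update({"User-Agent": "my-flows/1.0"})  # hypothetical
    return _session
```

Tasks would then call `get_session().get(url)` instead of constructing sessions themselves, so connection pooling and default headers are shared within the run.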
• Андрей Насонов

    1 month ago
    Hi! We're moving from Prefect 1.0 to Prefect 2.0. We would like to use our GitLab repository as storage for flows, as we used to with GitlabStorage in Prefect 1.0. I'm trying to use RemoteFileSystem:
    from prefect.filesystems import RemoteFileSystem

    gitlab_block = RemoteFileSystem(
        basepath="git://path/to/repo",
        settings={
            "key": "GITLAB_USER",
            "secret": "GITLAB_TOKEN",
        },
    )
    gitlab_block.save("flows_repo")
    This leaves me with:
    prefect.exceptions.PrefectHTTPStatusError: Client error '422 Unprocessable Entity' for url 'http://ephemeral-orion/api/block_documents/'
    I might be misusing RemoteFileSystem something fierce; could you please guide me in the right direction? Big thanks!
    3 replies
• Chris L.

    1 month ago
    Hello Prefecters. A question about Prefect 2.0 deployment. I have one scheduler flow that takes a subflow_key. This subflow_key is passed into a curried function that dynamically creates a new Prefect flow. There are about 30 (and growing) different subflows that can be created dynamically. My problem: I would like a single-flow, many-deployments setup (each deployment associated with a different type of subflow). Constraints: because the subflows are generated dynamically by the curried function, I can't separate them into parent flows and run prefect deployment build for each flow. What I've tried: 30 different deployment YAML files for one scheduler flow, with 30 different combinations of subflow_keys and schedules. My questions: Is there a DRYer way to achieve the same setup? Each deployment file is identical except for two lines (parameters and schedule). What is Prefect engineering's current take on this single-flow, many-deployments paradigm? Will this be achievable via a single Prefect CLI command in the future (maybe with arrays of parameter/schedule flags passed into prefect deployment build)?
    4 replies
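One possibly DRYer route is to register the deployments from a Python loop instead of maintaining 30 YAML files. A sketch under stated assumptions: `Deployment.build_from_flow` exists in later 2.x releases (not necessarily the version in this thread), the module paths for schedule classes moved between releases, and `my_flows`, `scheduler`, and the table entries below are hypothetical.

```python
# The only two things that differ between deployments: parameters and schedule.
SUBFLOWS = {
    "reports": "0 6 * * *",
    "cleanup": "30 2 * * *",
    # ... one entry per subflow_key (~30 in total)
}


def register_all() -> None:
    """Build and register one deployment per subflow_key."""
    # 2.x-era imports; these paths changed across releases, so check your version.
    from prefect.deployments import Deployment
    from prefect.orion.schemas.schedules import CronSchedule

    from my_flows import scheduler  # hypothetical module holding the scheduler flow

    for key, cron in SUBFLOWS.items():
        Deployment.build_from_flow(
            flow=scheduler,
            name=f"scheduler-{key}",
            parameters={"subflow_key": key},
            schedule=CronSchedule(cron=cron),
        ).apply()  # registers the deployment with the API
```

Adding a 31st subflow then means adding one line to the table rather than one more YAML file.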
• Андрей Насонов

    1 month ago
    Hi! As I understand it, KubernetesJob exists at the infrastructure level and allows a flow to execute as a separate job. Is there a similar task runner (separate task, separate job), or should I stick to using subflows?
    1 reply
• Neil Natarajan

    1 month ago
    Hi! Does Prefect support running flows from within Prefect tasks? What is the best syntax for running a flow of flows on the free offering?
    4 replies