• James Phoenix

    1 year ago
    Is it this?
  • James Phoenix

    1 year ago
    client.create_flow_run(flow_id="<flow id>")
    4 replies (James Phoenix, emre)
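For context on the one-liner above (Prefect Core / 1.x): `Client.create_flow_run` creates a run of an already-registered flow. A minimal sketch; the wrapper function is just for illustration, and `"<flow id>"` stays a placeholder for your flow's UUID:

```python
# Client.create_flow_run (Prefect Core / 1.x) starts a run of a flow that has
# already been registered with the backend. trigger_flow_run is an illustrative
# wrapper, not part of Prefect's API.

def trigger_flow_run(client, flow_id, parameters=None):
    """Create a run of a registered flow and return the new flow-run id."""
    return client.create_flow_run(flow_id=flow_id, parameters=parameters or {})

# Against a real backend (requires prefect installed and a configured server):
#   from prefect import Client
#   run_id = trigger_flow_run(Client(), "<flow id>")
```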
  • rafaqat ali

    1 year ago
    Hi, I'm new here so I don't have much experience with Prefect yet. Can anyone let me know how I can create an approval workflow using Prefect?
    2 replies (rafaqat ali, Michael Adkins)
  • Avinash Varma Kanumuri

    1 year ago
    Hi, I'm new to Prefect. Does Prefect have a subdag equivalent, like @composite_solid in Dagster? If yes, can you please let me know the Prefect term for it? (task calling other tasks)
    1 reply (Avinash Varma Kanumuri)
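Prefect Core / 1.x has no direct `@composite_solid` analogue; a task cannot call another task at runtime. The usual substitutes are composing plain Python functions inside one task, or a flow-of-flows. A sketch of the first option, with made-up function names:

```python
# Plain helpers - deliberately NOT tasks - composed inside one unit of work.
def clean(records):
    return [r.strip() for r in records]

def dedupe(records):
    return sorted(set(records))

def normalize(records):
    """Plays the role of a 'subdag': one callable composed of smaller steps."""
    return dedupe(clean(records))

# In a flow you would wrap only the top-level composition as a task:
#   from prefect import task
#   normalize_task = task(normalize)
```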
  • Samuel Hinton

    1 year ago
    Hi team! I have a few files which contain some different flows. There are some common functions that I want them all to share, all in a common.py file in the same directory. I could turn this into a GitHub package that is pip-installable by itself, but is there a simple way of flagging to the storage that I also want common.py uploaded into my S3 bucket alongside the flow file?
    36 replies (Samuel Hinton, Josh Greenhalgh, +3)
  • Pedro Martins

    1 year ago
    Hey team! I have a flow in which the parameters' default values come from a config.toml. See below:
    with Flow("model-deployment-pipeline", **custom_confs,) as flow:
            model_uri = Parameter("model_uri", default=config["model"]["modelUri"])
            environment = Parameter("environment", default=config["model"]["environment"])

            deploy_model(model_uri=model_uri, namespace=environment)
    This flow is triggered when there are modifications to config.toml. How can I make sure that this flow will access the most up-to-date config file?
    3 replies (Pedro Martins, Josh Greenhalgh)
  • Joachim Zaspel

    1 year ago
    Hello, I am new to Prefect and have executed parts of the tutorial of basic features, applied to our environment (Exasol Data Warehouse). I managed to execute arbitrary SQL statements on the Exasol Data Warehouse via a Prefect flow composed of tasks executing SQL functions on Exasol in a specific sequence. So far I am running everything locally on my laptop. I have a few questions, mainly on deployment for production environments; perhaps you can help:
    1. Regarding deployment on a production environment, I would like an installation on a Docker host. However, I do not want to install anything on the Docker host itself (things like conda, the prefect package, etc.) - rather, I want everything encapsulated within separate Docker containers. I ran across https://github.com/flavienbwk/prefect-docker-compose - which explains the deployment via a custom docker-compose in much more detail, but it seems that on the Docker host, Prefect has to be installed in the OS as well. Are there other similar resources or alternatives?
    2. Is there an open-source solution for the Prefect UI to add user administration and user rights?
    3. Is it possible to execute a single Prefect Task within a Docker container?
    4. Is it possible to execute a single Prefect Task with Kubernetes?
    5. Does Prefect or some other company provide support for on-premise deployments?
    2 replies (Joachim Zaspel, Michael Adkins)
  • Matthew Blau

    1 year ago
    Hello everyone, I had a general question about Prefect's flow-of-flows idiom as shown here: https://docs.prefect.io/core/idioms/flow-to-flow.html and I wanted to make sure that I was understanding it and that this made sense for our use case. We are considering the idea of having a "monolithic program" that holds commonly used functions, so we can just set parameter values on other programs where things differ. For example, we have a commonly used connect_to_db method that all our software shares. With a flow run, would it be possible for us to register a flow that does the database connection and returns the information back to the dependent flow? As in: Flow1 calls the connect_to_db flow, the connection information is returned to Flow1, and Flow1 proceeds with the rest of its tasks.
    3 replies (Matthew Blau, Kyle Moon-Wright, +1)
  • jeff n

    1 year ago
    Howdy. We have a flow we are running that uses the core server to manage it. The flow hits a certain task and then just stalls out. It does not write any of the logs, it ignores the timeout set in the @task settings, and just sits waiting for an hour until the heartbeat dies. We can run it on the same worker server manually with no problems at all. There don't seem to be any logs indicating it is out of memory. This seems pretty squarely on the Prefect side, but I am lost as to how to debug it since only the server has the issue.
    Process 27956: /opt/prefect/.venv/bin/python3.8 -m prefect heartbeat flow-run -i 14303ee3-e96a-49c9-87e6-6577fcace7d3
    Python v3.8.0 (/usr/bin/python3.8)
    Thread 27956 (idle): "MainThread"
        flow_run (prefect/cli/heartbeat.py:98)
        invoke (click/core.py:610)
        invoke (click/core.py:1066)
        invoke (click/core.py:1259)
        invoke (click/core.py:1259)
        main (click/core.py:782)
        __call__ (click/core.py:829)
        <module> (prefect/__main__.py:4)
        _run_code (runpy.py:85)
        _run_module_as_main (runpy.py:192)
    8 replies (jeff n, Chris White)