• Vadym Dytyniak


    9 months ago
    Hello everyone. We want to run lightweight Prefect flows with LocalExecutor on ECS as a separate task. It successfully creates a new task and runs the flow, but the problem is that I have to install additional dependencies for the flow, and I can't find a way to run pip install before the flow starts. Can someone help? Is there any chance to implement something like this?
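    One common pattern (assuming the official prefecthq/prefect base image, whose entrypoint installs anything listed in the EXTRA_PIP_PACKAGES environment variable before the flow starts) is to install dependencies at container start-up. A minimal stdlib sketch of that behaviour; the helper names here are hypothetical:

```python
# Hedged sketch: install extra dependencies at container start-up, before the
# flow runs. This mimics the EXTRA_PIP_PACKAGES behaviour of the official
# prefecthq/prefect Docker images; helper names are hypothetical.
import os
import subprocess
import sys


def extra_pip_install_cmd(env):
    """Build a pip install command for packages listed in EXTRA_PIP_PACKAGES."""
    packages = env.get("EXTRA_PIP_PACKAGES", "").split()
    if not packages:
        return None
    return [sys.executable, "-m", "pip", "install", *packages]


def install_extra_packages():
    """Run before the flow entrypoint so imports succeed inside the ECS task."""
    cmd = extra_pip_install_cmd(os.environ)
    if cmd:
        subprocess.check_call(cmd)
```

    If you are on the official image, passing something like `ECSRun(env={"EXTRA_PIP_PACKAGES": "pandas pyarrow"})` should trigger the same behaviour at container start (hedged: verify against your Prefect version's image entrypoint). Baking a custom image with the dependencies preinstalled avoids the per-run install cost.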
    58 replies
  • Tilak Maddy


    9 months ago
    Hey y'all, could you explain what this means?
    UserWarning: No result handler was specified on your Flow. Cloud features such as input caching and resuming task runs from failure may not work properly. registered_flow = client.register(
    That's the whole warning message, and here is the code. It worked properly when I triggered the run from Prefect Cloud; there was no problem in execution. But I want to be able to resume tasks if they fail. What do I do?
    import os
    import time
    from prefect.storage import GitHub
    import prefect
    from prefect import task, Flow, Parameter
    from prefect.run_configs import LocalRun
    from prefect.executors import LocalDaskExecutor
    
    
    @task
    def say_hello(name):
        # Add a sleep to simulate some long-running task
        time.sleep(3)
        # Load the greeting to use from an environment variable
        greeting = os.environ.get("GREETING")
        logger = prefect.context.get("logger")
        logger.info(f"{greeting}, {name}!")


    with Flow("hello-flow") as flow:
        people = Parameter("people", default=["Arthur", "Ford", "Marvin"])
        say_hello.map(people)

    flow.storage = GitHub(
        repo="XXX/test-repo",
        path="learning_storage.py",
        access_token_secret="XXX"
    )

    flow.run_config = LocalRun(env={"GREETING": "Hello from User 2 "}, labels=["dev"])
    flow.executor = LocalDaskExecutor()
    flow.register(project_name="test_user_2")
    I ran this on my local machine, and yes, I have a copy of the flow in the mentioned GitHub repo too.
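    The warning means the flow has no Result configured, so Prefect Cloud cannot checkpoint task outputs for caching or restarts. A hedged sketch of attaching one in Prefect 0.x (the directory is a placeholder; adjust to your storage):

```python
from prefect.engine.results import LocalResult

# Attach a result to the flow so Cloud can checkpoint task outputs;
# this is what enables input caching and resuming task runs from failure.
flow.result = LocalResult(dir="/path/to/results")  # placeholder directory
```

    Per-task results (e.g. `@task(result=LocalResult(...))`) are also possible if only some outputs need checkpointing.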
    6 replies
  • Tilak Maddy


    9 months ago
    Is it okay to create a lot of projects with just one or two flows registered to each of them (I use Prefect Cloud)? I think I have set up quite a comfortable workflow, and we could end up with around 300 Prefect projects. What should I be worrying about?
    4 replies
  • Bruno Murino


    9 months ago
    Hi everyone. I've got a security concern I'm not sure how to solve. When running a flow as an ECS task with a custom task definition, Prefect injects a bunch of environment variables, which then appear on the ECS task execution screen in the AWS console. This includes the Prefect API keys. Does anyone know how to solve that?
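    For values defined in the task definition itself, ECS supports a `secrets` list in the container definition: the value is pulled from AWS Secrets Manager (or SSM Parameter Store) at launch and does not appear as a plain `environment` entry in the console. A hedged sketch (the ARN and variable name are placeholders; variables Prefect injects via run-task overrides may still need separate handling):

```json
{
  "containerDefinitions": [
    {
      "name": "flow",
      "secrets": [
        {
          "name": "PREFECT__CLOUD__API_KEY",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789012:secret:prefect-api-key"
        }
      ]
    }
  ]
}
```

    The task's execution role needs permission to read the referenced secret.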
    25 replies
  • Jacob Blanco


    9 months ago
    Hey folks, we're having some persistent issues with the secrets vault. In a mapped task with 20 children, for some reason 1 out of the 20 mapped runs might error out complaining that Secret A is not available, even though the other 19 found Secret A before and after the failing run.
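    If the failure is transient (a flaky lookup rather than a missing secret), one mitigation is to let the mapped task retry so a single failed child doesn't fail the whole map. A hedged Prefect 0.x sketch (task name and retry values are illustrative):

```python
from datetime import timedelta

from prefect import task


# Hedged mitigation sketch: retry the mapped task so a transient
# secret-lookup failure on one child run gets a second chance.
@task(max_retries=3, retry_delay=timedelta(seconds=10))
def uses_secret(item):
    ...  # fetch Secret A and do the work for this mapped child
```

    This works around the symptom; if the lookup failure is systematic (e.g. rate limiting on the vault), the retry delay is also where you'd add backoff.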
    5 replies
  • Casey Green


    9 months ago
    Is there a slack channel for Orion questions or should they be posted here?
    1 reply
  • An Hoang

    9 months ago
    Question: can I change the working directory of the flow's process midway through the flow? Maybe by doing a
    ShellTask("cd directory/to/switch/to")
    so that all results are now written under `directory/to/switch/to/result`? My
    directory/to/switch/to/
    is templated at runtime and is the result of a task. Right now I have to pass this path as a parameter to all subsequent tasks. I'm wondering if there's a more efficient, less error-prone way to do this.
    3 replies
  • Casey Green


    9 months ago
    I'm having trouble creating deployments in Orion. I've tried two methods: (1) the method described in the docs, and (2) programmatically using
    OrionClient.create_deployment(...)
    In both cases it supposedly succeeds, but the deployment doesn't show up in the UI. I've also tried copying the simple example in the docs verbatim, but no dice.
    $ prefect version
    2.0a5
    $ prefect deployment create ./my_flow_deployment.py
    Loading deployments from python script at 'my_flow_deployment.py'...
    Created deployment 'my-first-deployment' for flow 'Addition Machine'
    Note: I'm able to run flows and see them show up in the UI.
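    For reference, a deployment spec in the early Orion alphas looked roughly like this (hedged sketch: the alpha API changed frequently, and the flow below is a stand-in for your "Addition Machine"):

```python
# my_flow_deployment.py -- hedged sketch of an Orion 2.0a5-era deployment spec
from prefect import flow
from prefect.deployments import DeploymentSpec


@flow(name="Addition Machine")
def addition_machine(x: int = 1, y: int = 2):
    return x + y


DeploymentSpec(flow=addition_machine, name="my-first-deployment")
```

    When a deployment is created but invisible, it may be worth checking that the CLI and the UI are talking to the same Orion API and database (e.g. an ephemeral instance versus a long-running `prefect orion start` server); that is an assumption worth ruling out, not a confirmed diagnosis.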
    27 replies
  • brian

    9 months ago
    Hi all, I'm running a flow via Prefect Cloud using GitHub storage and have come across a strange error.
    9 replies
  • Erik Amundson


    9 months ago
    This might not be supported, but we're trying to run a Docker agent on Windows with a named pipe file mounted as a volume. Registration works fine with Docker storage, and we can run the container manually from an Anaconda prompt and reach the pipe file from within the container with:
    docker run -v "//.pipe/<named_pipe>://.pipe/<named_pipe>" --rm -it <image>
    It also doesn't throw any error when we start the Prefect agent with:
    prefect agent docker start --label <agent_label> --volume "//.pipe/<named_pipe>://.pipe/<named_pipe>"
    but once we actually try to run the workflow, it fails with this message:
    Internal Server Error ("invalid volume specification: '\\.\pipe\<named_pipe>:\pipe\<named_pipe>:/pipe/<named_pipe>:/pipe/<named_pipe>:rw'")
    I'm not really sure why it seems to duplicate the volume mapping and strip some of the forward slashes. Is this supported at all? Worst case, we can probably subclass the Docker agent and hardcode the
    run_flow()
    command, but we'd like to avoid having extra code on the agent side.
    7 replies