• Sean Perry

    1 year ago
    Resurrecting https://prefect-community.slack.com/archives/CL09KU1K7/p1616759310466600 When I have the following in my requirements.in file:
    boto3
    dask-cloudprovider[aws] == 2021.3.1
    prefect[aws] == 0.14.17
    Trying to run pip-compile on it fails with a complaint about botocore versioning:
    botocore<1.20.50,>=1.20.49 (from aiobotocore==1.3.0->dask-cloudprovider[aws]==2021.3.1->-r requirements/production.in (line 14))
      botocore<1.21.0,>=1.20.67 (from boto3==1.17.67->-r requirements/production.in (line 9))
      botocore<2.0a.0,>=1.12.36 (from s3transfer==0.4.2->boto3==1.17.67->-r requirements/production.in (line 9))
    This boils down to botocore<1.20.50,>=1.20.67, which cannot be satisfied. What is a safe version of boto3 that works with prefect and dask-cloudprovider? Removing the version numbers above from the requirements.in does not help; I get the same complaints, which is to be expected since pip is going to try the newest it can without guidance.
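    For what it's worth, the two hard constraints are botocore<1.20.50 (via aiobotocore 1.3.0) and botocore>=1.20.67 (via boto3 1.17.67), so the intersection is empty. If I'm reading boto3's lockstep botocore pinning right (an assumption on my part), capping boto3 at the release that still allows botocore 1.20.49 might let pip-compile converge, e.g.:
    # sketch only: assumes boto3 1.17.49 is the release paired with botocore 1.20.49
    boto3 <= 1.17.49
    dask-cloudprovider[aws] == 2021.3.1
    prefect[aws] == 0.14.17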
    Sean Perry
    ciaran
    10 replies
  • Zach Schumacher

    1 year ago
    is there a way to give docker tasks a name?
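    For example, would something like this work? (A sketch only, assuming the Docker tasks accept the standard Task kwargs; the image and names are placeholders, and task_args here sets the task's display name.)
    from prefect import Flow
    from prefect.tasks.docker import CreateContainer

    with Flow("docker-example") as flow:
        # task_args overrides Task-level settings such as the display name
        container_id = CreateContainer(image_name="my-image:latest")(
            task_args={"name": "create-my-container"}
        )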
    Zach Schumacher
    Kevin Kho
    6 replies
  • Todd Lindstrom

    1 year ago
    Agent Issue. I'm just trying Prefect out - using the backend server and running on an EC2 m5.large instance with ports 8080 and 4200 open to external traffic. I have the server running, the UI attached, the graphql connected, and I started an agent on the same machine. I installed prefect into a conda environment, so to start the agent I activated the environment and then ran the prefect agent start command. HOWEVER, when I add a flow and run it, it fails on
    unexpected error: NameError("name 'prefect' is not defined")
    This suggests to me that the flow is not running inside the conda environment, and if that's true then the pip-installed packages will not be there. Please advise how I can set the Python environment of my local agent.
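    For reference, my assumption is that the local agent runs flows with whichever Python it was started under, so this is roughly how I am starting it today (the environment name is just a placeholder):
    conda activate my-prefect-env
    prefect agent start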
    Todd Lindstrom
    Kevin Kho
    15 replies
  • ciaran

    1 year ago
    What's the canonical way to provide ENV vars to DaskExecutor schedulers/workers? I'm assuming it's going to be in either cluster_kwargs or client_kwargs on https://docs.prefect.io/api/latest/executors.html#daskexecutor ?
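    Something like this is what I'm aiming for (a sketch only, assuming dask-cloudprovider's FargateCluster, whose environment kwarg I believe passes env vars through to the scheduler/worker containers; the variable is a placeholder):
    from prefect.executors import DaskExecutor

    executor = DaskExecutor(
        cluster_class="dask_cloudprovider.aws.FargateCluster",
        cluster_kwargs={
            "n_workers": 2,
            "environment": {"MY_ENV_VAR": "value"},  # hypothetical variable
        },
    )
    # flow.executor = executor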
    ciaran
    Kevin Kho
    6 replies
  • Sean Perry

    1 year ago
    Would there be interest in a FilterTask variant that had an extra parameter along the lines of:
    log_filtered (Callable, optional): a function to use for logging any result that did not pass `filter_func`.
    Use would be something like:
    valid_configurations = FilterTask(filter_func=some_validator, log_filtered=log_invalid_configuration)
    where
    def log_invalid_configuration(configuration):
        # Log the reasons this configuration is invalid, or do some other special
        # logging for invalid configurations, like notifying a team to fix it.
        ...
    Adding this to FilterTask would be straightforward: turn the list comprehension into a for loop appending to filtered and call log_filtered whenever a result does not match the predicate. I am currently making two calls to FilterTask, once to collect valid results and once to collect invalid ones, which is not pretty.
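    Roughly what I have in mind (a sketch only; LoggingFilterTask and log_filtered are hypothetical, not part of the task library):
    from prefect.tasks.control_flow import FilterTask

    class LoggingFilterTask(FilterTask):
        def __init__(self, filter_func=None, log_filtered=None, **kwargs):
            super().__init__(filter_func=filter_func, **kwargs)
            self.log_filtered = log_filtered

        def run(self, task_results):
            # Same filtering as FilterTask, but report anything that is dropped
            filtered = []
            for result in task_results:
                if self.filter_func(result):
                    filtered.append(result)
                elif self.log_filtered is not None:
                    self.log_filtered(result)
            return filtered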
    Sean Perry
    Michael Adkins
    2 replies
  • Lukas N.

    1 year ago
    Hello :prefect:, I have a hard time understanding the depth-first execution of mapped tasks with the Dask executor. From my experiments it seems like the execution is breadth-first, but maybe I'm missing something trivial like a task trigger? More info with a code example in 🧵
    Lukas N.
    Kevin Kho
    5 replies
  • Austen Bouza

    1 year ago
    Hi all, I have some newbie Prefect deployment questions if anyone can help:
    1. What is considered best practice for automating flow registration, Docker builds, etc. when deploying new code? Is it preferable to simply lean on CI/CD tools for this, or do people use things like the Docker BuildImage task as part of a multi-flow setup? I want to use something like GitHub for storage but locally built Docker images for the run config, and I'm not sure how to close the gap on automating builds from local Dockerfiles (rough sketch of what I mean below).
    2. Is the only way to define non-Python Docker image dependencies for a flow to write a Dockerfile?
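    To make question 1 concrete, this is the combination I'm picturing (a sketch only; the repo, path, flow name, and image tag are placeholders):
    from prefect import Flow
    from prefect.storage import GitHub
    from prefect.run_configs import DockerRun

    with Flow("my-flow") as flow:
        ...

    # flow code pulled from GitHub at runtime; the image is built locally from my
    # own Dockerfile and pushed somewhere the agent can reach
    flow.storage = GitHub(repo="my-org/my-repo", path="flows/my_flow.py")
    flow.run_config = DockerRun(image="my-registry/my-image:latest")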
    Austen Bouza
    Kevin Kho
    5 replies
  • George Tam

    1 year ago
    Hi everyone, I'm interested in using prefect to continuously replicate a bigquery table into a postgres database. Is this possible to do within prefect?
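    Something along these lines is what I'm imagining (a sketch only; the table names, connection string, and 10-minute interval are placeholders, and it assumes pandas-gbq and SQLAlchemy are available):
    from datetime import timedelta

    import pandas as pd
    from prefect import Flow, task
    from prefect.schedules import IntervalSchedule
    from sqlalchemy import create_engine

    @task
    def extract_from_bigquery() -> pd.DataFrame:
        # hypothetical source table
        return pd.read_gbq("SELECT * FROM my_dataset.my_table")

    @task
    def load_into_postgres(df: pd.DataFrame) -> None:
        engine = create_engine("postgresql://user:password@host:5432/db")
        df.to_sql("my_table", engine, if_exists="replace", index=False)

    schedule = IntervalSchedule(interval=timedelta(minutes=10))

    with Flow("bq-to-postgres", schedule=schedule) as flow:
        load_into_postgres(extract_from_bigquery())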
    George Tam
    Kevin Kho
    9 replies
  • Trevor Kramer

    1 year ago
    Is it possible to retry mapped tasks? Whenever I try it I get nulls being passed as the mapped input parameter. We are using prefect cloud with prefect results.
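    For reference, this is roughly the shape of what I'm running (a sketch only; the bucket, retry settings, and task body are placeholders):
    from datetime import timedelta

    from prefect import Flow, task
    from prefect.engine.results import S3Result

    @task(
        max_retries=3,
        retry_delay=timedelta(seconds=30),
        result=S3Result(bucket="my-results-bucket"),
    )
    def process(item: int) -> int:
        return item * 2

    with Flow("mapped-retries") as flow:
        process.map([1, 2, 3])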
    Trevor Kramer
    Kevin Kho
    11 replies
  • dipsy wong

    1 year ago
    Hi everyone, I am trying to push a markdown file as an artifact which contains some JS in order to render a chart, but the chart disappears in the Prefect UI. Is it actually possible to render scripted markdown in Prefect? Here are the markdown file I am trying to push and the flow definition. Thanks
    dipsy wong
    4 replies