• alex (2 years ago)
    Hello! I have a flow as follows:
    results = []
    for op in operations:
        op = task(op)
        most_recent = op

        # ... a few more conditional tasks defined at compile time

        if do_recovery:
            recovery_task = recover(trigger=any_failed_else_raise_skip_signal)
            most_recent = recovery_task
        results.append(most_recent)

    backup_task = Backup('mydatabasename', upstream_tasks=[results], skip_on_upstream_skip=False)
    mapped = Aggregate.map(
        results,
        target=unmapped('mydatabasename'),
    )
    mapped.set_upstream([backup_task])
    As you can see from the image, instead of getting one list I have two lists, and because of this my Aggregate task is being skipped even though it is downstream of a skip_on_upstream_skip task. Is there a way to cleanly resolve this (i.e. only get one list)?
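    (One possible way to get a single list, sketched under the assumption of Prefect 1.x: route the per-iteration terminal tasks through one explicit collection task and have both consumers depend on that, so the flow graph contains a single list node instead of one per consumer. The process/recover/backup/aggregate tasks below are made-up stand-ins for the ones in the question; only collect is the new piece, and this addresses only the "one list" part of the question.)

    from prefect import Flow, task, unmapped

    @task
    def process(op):
        return op  # stand-in for the real per-op task

    @task
    def recover():
        return "recovered"  # stand-in recovery task (trigger omitted for brevity)

    @task(skip_on_upstream_skip=False)
    def backup(name):
        print(f"backing up {name}")

    @task
    def aggregate(result, target):
        print(f"aggregating {result} into {target}")

    @task
    def collect(items):
        # one explicit collection task shared by both downstream consumers
        return items

    operations = ["a", "b", "c"]  # known at compile time, as in the question
    do_recovery = True

    with Flow("recovery-flow") as flow:
        results = []
        for op in operations:
            most_recent = process(op)
            if do_recovery:
                most_recent = recover(upstream_tasks=[most_recent])
            results.append(most_recent)

        all_results = collect(results)
        b = backup("mydatabasename", upstream_tasks=[all_results])
        agg = aggregate.map(all_results, target=unmapped("mydatabasename"))
        agg.set_upstream(b)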
    (5 replies)
  • Alex Cano (2 years ago)
    Hey y'all, question for ya: what is the labeler in GitHub Actions? It looks like the action is failing on this PR: https://github.com/PrefectHQ/server/actions/runs/266743999
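    (For context: actions/labeler is GitHub's first-party action that automatically applies labels to pull requests based on which file paths changed, driven by a .github/labeler.yml config. A sketch of a typical setup from that era; the label names and globs are illustrative, not taken from PrefectHQ/server:)

    # .github/workflows/label.yml (hypothetical workflow file)
    name: "Pull Request Labeler"
    on: pull_request
    jobs:
      label:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/labeler@v2
            with:
              repo-token: "${{ secrets.GITHUB_TOKEN }}"

    # .github/labeler.yml (hypothetical label-to-path mapping)
    docs:
      - docs/**/*
    tests:
      - tests/**/*

    (One common cause of failures with this setup: on PRs opened from forks, the default GITHUB_TOKEN is read-only, so the labeler cannot apply labels and the run fails.)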
    (15 replies)
  • Tenzin Choedak (1 year ago)
    Hi there 👋 In an attempt to get more familiar with Prefect, I thought I'd look at open issues and just add what little I can. I fixed an issue, but I am wondering what the standard way to contribute is. Following the development guide, I'm just at the last step of opening a pull request from my forked repo back into Prefect. Do I make the base branch master, or is there a different branch I should use? For context, I'm fixing issue 3318 and my fixes are in my branch.
    (3 replies)
  • max (1 year ago)
    Hey everyone, my name is Max (madelgi on GitHub). I've been contributing to Prefect off and on for the past month, mostly adding tasks that have been useful for my purposes at my current company, but I've enjoyed contributing and I'm looking to work on something slightly more substantial to get familiar with other parts of the code base. I've considered a couple of options: adding a new storage option, maybe this issue about persisting results in memory, etc.
  • max (1 year ago)
    I just wanted to drop a message in here before I start on anything, to see if any of the folks at Prefect have an opinion on something they'd like done. For example, if there's something that's been on the backlog for a while that isn't necessarily super difficult but no one wants to do it, feel free to share it with me, haha. No worries if no one has any suggestions or requests; I'll just choose some open issue that looks interesting.
    (8 replies)
  • ale (1 year ago)
    Hi folks, I'm currently working on improving S3List to support last_modified_start and last_modified_end, leveraging the last_modified property of keys to make it possible to filter keys based on time. I'm struggling to find a way to mock boto3 put_object to write a meaningful test. Any suggestions?
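    (One common approach, sketched under the assumption that moto, a library that fakes the AWS API for boto3, is acceptable in the test suite; moto versions before 5.0 expose mock_s3, newer ones use mock_aws. The bucket name, keys, and filter bound here are made up:)

    import datetime

    import boto3
    from moto import mock_s3  # moto fakes S3 in-process; no real AWS calls happen


    @mock_s3
    def test_s3_list_filters_by_last_modified():
        client = boto3.client("s3", region_name="us-east-1")
        client.create_bucket(Bucket="my-test-bucket")

        # put_object needs no special mocking under moto; the fake backend
        # records a LastModified timestamp for each key automatically
        for key in ["data/a.csv", "data/b.csv"]:
            client.put_object(Bucket="my-test-bucket", Key=key, Body=b"...")

        objects = client.list_objects_v2(Bucket="my-test-bucket")["Contents"]
        start = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(hours=1)
        recent = [o["Key"] for o in objects if o["LastModified"] >= start]
        assert recent == ["data/a.csv", "data/b.csv"]

    (Moto stamps keys with the real current time, so to exercise both sides of a last_modified_start/last_modified_end boundary you would likely combine this with a time-freezing tool such as freezegun.)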
    (6 replies)
  • Laura Lorenz (1 year ago)
    set the channel topic: Core Contributor Cantina 11/6 @ 4pm EST (meet.google.com/aqv-gxru-zsi)
  • Braun Reyes (1 year ago)
    Hello y'all! Long time no contribute... we wanted to make a couple of enhancements to the Fargate agent: 1) add an extra_containers argument to support adding sidecars to the flow run container; 2) add the ability to skip the task definition registration based on a run context key of task_arn_override. This would allow folks using a map to submit flow runs (the subflow pattern) to skip the task registration issue that @Darragh and @Lukas were running into.
    (36 replies)
  • Braun Reyes (1 year ago)
    Number 1 allows you to add a Datadog agent to your flows.
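    (To illustrate the sidecar idea: an ECS task definition takes a list of container definitions, so an extra_containers argument could plausibly be appended to the flow-run container before the agent registers the definition. register_task_definition and its parameters below are the real boto3 ECS API; the agent internals, images, and values are illustrative assumptions, not the actual change:)

    import boto3

    ecs = boto3.client("ecs", region_name="us-east-1")

    # the flow-run container the agent would normally register on its own
    flow_container = {
        "name": "flow",
        "image": "prefecthq/prefect:latest",
        "essential": True,
    }

    # hypothetical extra_containers value: a Datadog agent sidecar
    extra_containers = [
        {
            "name": "datadog-agent",
            "image": "datadog/agent:latest",
            "essential": False,  # the task should not die if the sidecar stops
            "environment": [{"name": "ECS_FARGATE", "value": "true"}],
        }
    ]

    ecs.register_task_definition(
        family="prefect-flow-run",
        requiresCompatibilities=["FARGATE"],
        networkMode="awsvpc",
        cpu="256",
        memory="512",
        containerDefinitions=[flow_container] + extra_containers,
    )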