• Scott Vermillion

    1 year ago
    I have a fairly basic question… Let’s assume I have some Python app that uploads something to S3. Now I want to kick off my flow. So I add this to my aforementioned Python app:
    from prefect import Client
    
    client = Client()
    client.create_flow_run(flow_id="<some flow_id>")
    But lo and behold, someone comes along and re-registers the flow with Cloud. The previous flow_id gets archived and a new one is generated. Now I have to go and update my Python app with the new flow_id? Can this be done by name or something? Or can I pull the flow_id as a variable? Thank you.
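    One approach, sketched under the assumption of Prefect 1.x's GraphQL API: resolve the current, non-archived flow id by name at run time, so re-registration doesn't break the caller. The flow name "my-flow" is a placeholder.
    ```
    from prefect import Client

    client = Client()

    # Look up the id of the current (non-archived) version by flow name;
    # "my-flow" is a placeholder for your registered flow name.
    result = client.graphql(
        """
        query {
          flow(where: {name: {_eq: "my-flow"}, archived: {_eq: false}}) {
            id
          }
        }
        """
    )
    flow_id = result.data.flow[0].id
    client.create_flow_run(flow_id=flow_id)
    ```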
    Scott Vermillion, Kevin Kho · 5 replies
  • CA Lee

    1 year ago
    Hello, I'm trying to find a better way to use parameters for control flow using a dictionary (code in thread).
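    The thread code isn't shown here, but a minimal sketch of one way to branch on a dict parameter with Prefect 1.x's case (all names are placeholders):
    ```
    from prefect import Flow, Parameter, task
    from prefect.tasks.control_flow import case

    @task
    def lookup(config: dict, key: str) -> bool:
        # Resolve the branching flag from the dict parameter at run time
        return config[key]

    @task
    def run_branch(name: str) -> str:
        return f"ran the {name} branch"

    with Flow("dict-param-branching") as flow:
        config = Parameter("config", default={"use_backup": False})
        use_backup = lookup(config, "use_backup")
        with case(use_backup, True):
            run_branch("backup")
        with case(use_backup, False):
            run_branch("primary")
    ```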
    CA Lee, Kevin Kho · 10 replies
  • Nishtha Varshney

    1 year ago
    Hello, I am working with a simple ML model and trying to use the Prefect UI to see its flow. I have tried almost everything to debug it, following the old threads related to pickling errors, but nothing works. Can anyone please tell me what can be done? Error: Unexpected error: TypeError("cannot pickle 'weakref' object")
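    For illustration, a minimal sketch of a common cause and fix, assuming the unpicklable object (e.g. a framework model holding weakrefs) is created at module level and therefore gets serialized with the flow; constructing it inside the task avoids that:
    ```
    import weakref

    from prefect import Flow, task

    class ModelLike:
        """Stand-in for an ML model holding a weakref (weakrefs can't be pickled)."""
        def __init__(self):
            self._ref = weakref.ref(self)

    # BROKEN: a module-level instance would be captured and pickled with the flow
    # model = ModelLike()

    @task
    def train_and_score() -> float:
        # OK: the unpicklable object exists only inside the task run
        model = ModelLike()
        return 0.95  # placeholder metric

    with Flow("simple-ml") as flow:
        train_and_score()
    ```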
    Nishtha Varshney, Kevin Kho · 8 replies
  • Oludayo

    1 year ago
    Hi everyone. I started using Prefect a couple of months ago and it's been a good ride so far. By default, if the run() method succeeds for a task, Prefect sets the state to Success and records any returned data. However, I would like to disable this recording of returned data because of how much memory it uses up. The task I run returns about 130 MB of data, and I have thousands of raw data files to map with this task. I couldn't find information about this online. How do you suggest I proceed? I should also mention that I'm running the flow without Prefect Server. Thank you for your time.
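    One workaround sketch, under the assumption that the large payload itself isn't needed downstream as an in-memory object: write the output inside the task and return only a small reference, optionally disabling checkpointing for the task. The file names are placeholders.
    ```
    from prefect import Flow, task

    @task(checkpoint=False)  # also skip result persistence for this task
    def process(path: str) -> str:
        # Write the heavy output to disk and return a small reference,
        # so the ~130 MB payload never sits in the task's Success state.
        large_output = b"x" * 1024  # placeholder for the real large result
        out_path = path + ".out"
        with open(out_path, "wb") as f:
            f.write(large_output)
        return out_path

    with Flow("map-large-data") as flow:
        paths = ["raw_0.bin", "raw_1.bin"]  # placeholder inputs
        process.map(paths)
    ```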
    Oludayo, Kevin Kho · 7 replies
  • Joseph Loss

    1 year ago
    Is there a way to report specific items in the Slack hook automation, like if I wanted to report "max date downloaded", for example? I have a special notification that does this on fail signals, but part of that process is to actually raise the signal and call signals.FAIL(). I'm wondering if I could just do the opposite for success, or would it end up being two separate Slack messages: the automated flow-success message and the custom signals.SUCCESS()?
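    A sketch of one alternative, assuming a plain Slack incoming webhook rather than the built-in automation: a task state handler that posts a custom message only on success, so no signal needs to be raised. The webhook URL and return value are placeholders.
    ```
    import requests

    from prefect import Flow, task

    SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX"  # placeholder

    def notify_on_success(task, old_state, new_state):
        # Post a custom message when the task finishes successfully,
        # without raising signals.SUCCESS inside the task itself
        if new_state.is_successful():
            requests.post(
                SLACK_WEBHOOK_URL,
                json={"text": f"max date downloaded: {new_state.result}"},
            )
        return new_state

    @task(state_handlers=[notify_on_success])
    def download() -> str:
        return "2021-06-30"  # placeholder for the real max date

    with Flow("report-on-success") as flow:
        download()
    ```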
    Joseph Loss, nicholas · 2 replies
  • Blake List

    1 year ago
    Hi there! I was wondering what the best way is to create a Prefect task out of a function that is applied to each row of a dataframe, e.g. wrap something like
    df = df.apply(my_function, axis=1)
    with Prefect. Thanks!
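    A minimal sketch of the simplest wrapping, treating the whole apply as one task; my_function and the sample frame are placeholders:
    ```
    import pandas as pd

    from prefect import Flow, task

    def my_function(row: pd.Series) -> pd.Series:
        # placeholder row-wise transform
        row["total"] = row["a"] + row["b"]
        return row

    @task
    def apply_rowwise(df: pd.DataFrame) -> pd.DataFrame:
        # One task around the whole apply keeps orchestration overhead low;
        # task.map over individual rows works too but is heavy for big frames.
        return df.apply(my_function, axis=1)

    with Flow("row-apply") as flow:
        df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})
        result = apply_rowwise(df)
    ```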
    Blake List, nicholas +1 · 6 replies
  • Pedro Machado

    1 year ago
    Hi there. I have a flow that will run on Kubernetes. I'd like to see log messages in near real time, but currently the messages only show at the end of the run. I wonder if this has to do with the stream not being flushed frequently. How can I flush the stream after calling logger.info()? Thanks!
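    One thing to try, offered as an assumption rather than a confirmed fix: Python buffers stdout when it isn't attached to a TTY, which can delay log lines in a pod, and setting PYTHONUNBUFFERED=1 on the run config forces unbuffered output.
    ```
    from prefect import Flow
    from prefect.run_configs import KubernetesRun

    with Flow("k8s-flow") as flow:
        pass  # tasks go here

    # Force unbuffered stdout/stderr inside the flow-run pod
    flow.run_config = KubernetesRun(env={"PYTHONUNBUFFERED": "1"})
    ```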
    Pedro Machado, nicholas · 6 replies
  • Ben Muller

    1 year ago
    Hey community, in the Python API you can locally run a flow that has a schedule by adding
    flow.run(run_on_schedule=False)
    I can't find this option for the CLI when running
    prefect run -p my_flow.py
    What am I missing?
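    A small workaround sketch while the CLI flag is unclear, assuming the flow object is importable from my_flow.py (module and attribute names are placeholders):
    ```
    # run_once.py - bypass the schedule by calling flow.run directly
    from my_flow import flow  # placeholder import path

    if __name__ == "__main__":
        flow.run(run_on_schedule=False)
    ```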
    Ben Muller, Michael Adkins +1 · 4 replies
  • Ben Muller

    1 year ago
    Hey Prefect, I am getting an error on my flow in prod:
    Error uploading to S3: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
    This is from my S3Result object trying to persist results to the bucket. Not sure why this is happening, as my task-role has full S3 access and I have an identical setup in staging where I don't get these errors. Am I right in assuming that the S3Result just assumes the task-role? Why would boto3 not be picking this up?
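    A debugging sketch, based on the assumption that S3Result relies on boto3's default credential chain: an STS check inside a task in prod shows which IAM identity boto3 actually resolves.
    ```
    import boto3

    from prefect import Flow, task

    @task
    def whoami() -> str:
        # Prints the ARN boto3 resolves inside the container; if it isn't
        # the task-role, the credential chain is the problem.
        arn = boto3.client("sts").get_caller_identity()["Arn"]
        print(arn)
        return arn

    with Flow("debug-credentials") as flow:
        whoami()
    ```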
  • Ben Muller

    1 year ago
    The only other thing I can think of is that the bucket parameter value is fixed at register time rather than at run time of the flow?