  • Tony Yun

    11 months ago
    Hi! I’m having some trouble getting proper logs from the RunNamespacedJob task. I set the log_level='info' option, but when it fails, no logs are sent to Prefect. Instead, I only get this in the UI:
    FAIL signal raised: FAIL('Job dbt-run-from-flow failed, check Kubernetes pod logs for more information.')
    After going to the k8s pod, I see exception logs:
    dbt ls --models tm_snowflake.* --profiles-dir=. --profile default --target dev
    Encountered an error while reading profiles:
      ERROR Runtime Error
      Compilation Error
        Could not render {{ env_var('DBT_PASSWORD') }}: Env var required but not provided: 'DBT_PASSWORD'
    Encountered an error:
    Runtime Error
      Could not run dbt
    make: *** [Makefile:6: ls] Error 2
    How can I pass logs from the k8s pods through to the Prefect UI?
    22 replies
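    A hedged sketch of one way to fix the underlying failure, assuming Prefect 1.x and a Kubernetes Secret holding the dbt credentials: inject DBT_PASSWORD into the job's container via the body passed to RunNamespacedJob, and keep log_level set so the task reads the pod's logs back into Prefect. The image, command, and dbt-creds Secret below are hypothetical placeholders.
    from prefect import Flow
    from prefect.tasks.kubernetes.job import RunNamespacedJob

    # Kubernetes Job manifest; env is populated from a Secret so that
    # dbt's {{ env_var('DBT_PASSWORD') }} can render.
    job_body = {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": "dbt-run-from-flow"},
        "spec": {
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [{
                        "name": "dbt",
                        "image": "my-dbt-image:latest",  # hypothetical image
                        "command": ["make", "ls"],       # hypothetical entrypoint
                        "env": [{
                            "name": "DBT_PASSWORD",
                            "valueFrom": {"secretKeyRef": {
                                "name": "dbt-creds",     # hypothetical Secret
                                "key": "password",
                            }},
                        }],
                    }],
                }
            }
        },
    }

    run_dbt = RunNamespacedJob(
        body=job_body,
        namespace="default",
        log_level="info",  # with a log level set, the task captures pod logs
        delete_job_after_completion=True,
    )

    with Flow("dbt-k8s") as flow:
        run_dbt()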
  • kiran

    11 months ago
    Hi y’all. Has anyone gotten flow.visualize() to run on a Linux server (running the code, which is on the server, from my Mac)? I’m having issues with xdg-open and also gio, and have now gone down several Google/Stack Overflow rabbit holes with only minor successes.
    1 reply
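    One workaround, sketched under the assumption of Prefect 1.x: pass a filename so graphviz renders the graph to disk instead of shelling out to a viewer (the xdg-open/gio calls are what fail on a headless box), then copy the file over to view it.
    from prefect import task, Flow

    @task
    def say_hi():
        print("hi")

    with Flow("viz-demo") as flow:
        say_hi()

    # Writes viz-demo.png next to the script instead of opening a viewer,
    # so no display or xdg-open/gio is needed on the server.
    flow.visualize(filename="viz-demo", format="png")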
  • ek

    11 months ago
    Hello everyone, I'm trying to figure out the code below:
    import my_lib
    from prefect import task, Flow
    from prefect.storage import S3

    @task
    def my_func():
        my_lib.func()

    with Flow("myflow") as flow:
        my_func()

    flow.storage = S3(
        bucket="BUCKET_NAME",
        stored_as_script=True,
        local_script_path="main.py",
    )
    flow.register("myflow")
    my dir looks like this:
    .
    ├── main.py
    └── my_lib
        ├── __init__.py
        └── db.py
    I'm trying to push my code up to an S3 bucket, but my my_lib dir doesn't get packaged up along with main.py. What am I missing here?
    6 replies
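    For what it's worth: S3 storage with stored_as_script=True uploads only that one script file, so my_lib has to already be importable wherever the flow runs. A hedged sketch of one alternative, Docker storage, which can copy the package into the run image (the paths below are hypothetical):
    from prefect.storage import Docker

    storage = Docker(
        image_name="myflow",
        files={"/abs/path/to/my_lib": "/my_lib"},  # local dir -> path in image
        env_vars={"PYTHONPATH": "/"},              # so `import my_lib` resolves
    )
    The other common route is to bake my_lib into the image or environment the agent runs flows in, and keep S3 script storage for main.py alone.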
  • Kevin Weiler

    11 months ago
    Is there any way for a task to know that its flow has been triggered by a Schedule vs. the API? Something in the context perhaps?
    5 replies
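    As far as I know the run context doesn't expose this directly, but the backend's flow_run record carries an auto_scheduled flag; a hedged sketch of querying it from inside a task (assuming Prefect 1.x against Server/Cloud):
    import prefect
    from prefect import task
    from prefect.client import Client

    @task
    def triggered_by_schedule() -> bool:
        # Look up this run's record and check whether the scheduler created it.
        flow_run_id = prefect.context.get("flow_run_id")
        result = Client().graphql(
            """
            query($id: uuid!) {
              flow_run(where: {id: {_eq: $id}}) { auto_scheduled }
            }
            """,
            variables={"id": flow_run_id},
        )
        return result["data"]["flow_run"][0]["auto_scheduled"]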
  • Ken Nguyen

    11 months ago
    Are we able to set a more specific time when setting the schedule to run once a day on the front end? For example, I'm trying to set my flow to run once a day at 7:32 AM.
    2 replies
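    From the Python side this is straightforward with a cron schedule; a minimal sketch (assuming Prefect 1.x), where "32 7 * * *" fires daily at 07:32:
    from prefect import task, Flow
    from prefect.schedules import CronSchedule

    @task
    def daily_job():
        print("running")

    # Cron: minute=32, hour=7 -> once a day at 7:32 AM.
    with Flow("daily-0732", schedule=CronSchedule("32 7 * * *")) as flow:
        daily_job()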
  • Bob Colner

    11 months ago
    Just now skimming the ‘Orion’ blog post - am I reading this correctly that all existing Prefect Core workflows will not be supported ~1 year from now?
    10 replies
  • Adam Brusselback

    11 months ago
    Hey again... so let's say I have Prefect Server with two local agents connected to it, both of which will have the same flows/tasks deployed manually through Ansible. I must have misconfigured something, because it looks like when both agents are up and I run a job, it gets submitted to both agents.
    7 replies
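    One thing to check, sketched under the assumption of Prefect 1.x local agents: agents polling with identical (e.g. empty) label sets can both match the same runs, so giving each agent a distinct label and pinning each flow's run config to exactly one of them avoids the double submission (the labels below are hypothetical):
    # Agents started e.g. with:
    #   prefect agent local start --label agent-a
    #   prefect agent local start --label agent-b
    from prefect import Flow
    from prefect.run_configs import LocalRun

    with Flow("my-flow") as flow:
        ...

    # Only the agent carrying the "agent-a" label will pick this flow up.
    flow.run_config = LocalRun(labels=["agent-a"])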
  • Frederick Thomas

    11 months ago
    Hi All, I hope everyone is okay... Anyhow, we currently have Prefect set up in an Azure VM running Ubuntu 18.04, which is pushing the limit on disk usage. I ran:
    sudo du -h /mnt/data/prefect/ > log.txt
    and after reading the file found this:
    217G	/mnt/data/prefect/.prefect/results
    My question: is it safe to delete the results in that folder, and if not, can they be stored elsewhere safely? Thanks!
    5 replies
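    On the second half of the question, a hedged sketch (assuming Prefect 1.x): results can be pointed at object storage instead of the local ~/.prefect/results directory, e.g. Azure Blob Storage since this is an Azure VM. The container name below is hypothetical, and credentials still need to be configured per the library's docs.
    from prefect import task, Flow
    from prefect.engine.results import AzureResult

    @task
    def compute():
        return 42

    # Flow-level result: task outputs land in blob storage, not on the VM disk.
    with Flow("offload-results", result=AzureResult(container="prefect-results")) as flow:
        compute()
    As far as I know, those files are only read back when a run restarts or retries from a checkpoint, so pruning results for finished runs you'll never restart should be safe.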
  • Daniel Manson

    11 months ago
    Excited about Orion. I have a question/feature request, which I think may make a lot more sense in the context of Orion than original Prefect... If I want some tasks within a flow to operate on a shared filesystem, there’s no easy way to do that in Prefect if the flow as a whole is running in a distributed fashion. However, perhaps in Orion you could have some sub-part of the flow which runs as though that bit of the flow was "local", i.e. you could do
    [download from s3] => [unzip] => [do something interesting] => [upload to somewhere]
    as a series of tasks that run sequentially on a single filesystem, but as part of a larger flow whose other tasks run on entirely separate filesystems. The advantage of allowing these steps to be full-blown tasks rather than just functions is that you get dashboard-level visibility, retries, better use of library tasks, etc. A related thought: could you compile a dbt graph into a nested part of a Prefect flow, so you get full visibility into what dbt is doing at the level of Prefect? That would be cool.
    2 replies
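    For concreteness, a hedged sketch of the pattern being asked for, written in today's Prefect 1.x terms: these tasks only actually share a filesystem when the whole flow runs in one environment (a single local agent, say); under a distributed executor there is no such guarantee, which is exactly the gap described above. The S3 download is stubbed out so the sketch runs end to end.
    import tempfile
    import zipfile
    from prefect import task, Flow

    @task
    def download(key: str) -> str:
        # Stand-in for an S3 download: write a small zip to the local disk.
        path = tempfile.mktemp(suffix=".zip")
        with zipfile.ZipFile(path, "w") as z:
            z.writestr("data.txt", f"contents of {key}")
        return path

    @task
    def unzip(path: str) -> str:
        out = tempfile.mkdtemp()
        with zipfile.ZipFile(path) as z:
            z.extractall(out)
        return out

    @task
    def do_something_interesting(dirpath: str) -> str:
        # Placeholder transformation step.
        return dirpath

    @task
    def upload(dirpath: str) -> None:
        print(f"would upload contents of {dirpath}")

    with Flow("shared-fs-section") as flow:
        upload(do_something_interesting(unzip(download("my-bucket/data.zip"))))

    flow.run()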
  • Jacob Blanco

    11 months ago
    Just came to say HOLY s&*t Orion looks 🍌s! So excited to be a Prefect customer right now.