• Stéphan Taljaard

    7 months ago
    Hi. I'm sure the answer will be here if I search 😅 Why am I not seeing logs logged using `prefect.context.get("logger").warning()`? Surely if the default logging level is `info`, it will hide `debug`, but should still log and show `warning`, `error`, and `critical`?
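    For what it's worth, that expectation matches how Python's stdlib `logging` (which Prefect's logger builds on) filters by level; a minimal self-contained sketch, not Prefect's actual logger setup:

    ```python
    import logging

    class ListHandler(logging.Handler):
        """Collects emitted messages so we can see what passes the level filter."""
        def __init__(self):
            super().__init__()
            self.messages = []

        def emit(self, record):
            self.messages.append(record.getMessage())

    logger = logging.getLogger("demo")
    logger.setLevel(logging.INFO)  # the default level from the question
    handler = ListHandler()
    logger.addHandler(handler)

    logger.debug("hidden")    # DEBUG < INFO, filtered out
    logger.warning("shown")   # WARNING >= INFO, emitted

    print(handler.messages)   # ['shown']
    ```

    So if `warning()` calls are not visible, the logger's own level is unlikely to be the filter; the handler configuration on the run/agent side is worth checking instead.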
    5 replies
  • Stéphan Taljaard

    7 months ago
    Hey. Anyone who can point out a gotcha I missed here? I created a flow to get the logs of a flow run (by name) and email them to a user. I figured out the GQL query in the Interactive API tab on the Server page, then moved it into my flow. When running the flow, I get `ReadTimeoutError: HTTPConnectionPool(host='localhost', port=4200): Read timed out. (read timeout=15)`. It's strange to me, because I'm using the default port and other settings, and I expected it to work since the Interactive API returns a result almost immediately.
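    A sketch of moving such a query into code with a longer client-side timeout (the default 15 s is what's expiring). The `requests` call and the 60 s value are illustrative; the query shape should be pasted from the Interactive API tab, and `example-run` is a hypothetical flow-run name:

    ```python
    import json

    flow_run_name = "example-run"  # hypothetical; use your real run name
    query = """
    query {
      flow_run(where: {name: {_eq: "%s"}}) {
        logs { timestamp level message }
      }
    }
    """ % flow_run_name

    payload = json.dumps({"query": query})

    # With requests, the timeout is per-call, so it can simply be raised:
    # requests.post("http://localhost:4200/graphql", data=payload,
    #               headers={"Content-Type": "application/json"}, timeout=60)

    print("flow_run" in payload)  # True
    ```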
    11 replies
  • Alexis Lucido

    7 months ago
    Another question, but a big issue for us here. Sorry, folks. We have deployed Prefect Core on an on-premise virtual machine with 4 CPUs and 16 GB of RAM. Some flows regularly fail and are retried. Right now we have 10 flows retrying, and they are overloading the CPUs... Those retrying flows slow down the subsequent ones, which get slower and slower, until no flow can run anymore and the whole VM is virtually stopped. The tasks executed by the flows are not memory-intensive and, when no error is raised, they run smoothly in a few seconds. But here we are facing some critical problems... One solution would be to reboot the whole application regularly, but that was something we were advised to do with Airflow and its scheduler issues, and one of the reasons we switched from Airflow to Prefect. I have also tuned the number of retries and the retry_timedelta, but that is not an optimal solution. You can find attached a screenshot of my htop command. Any thoughts on how to solve the issue? By the way, we are using Prefect to run entirely automated renewable-electricity trading and, except for these bugs that I'm sure we can solve, we are very happy with the solution and would happily collaborate on blog posts showing our use case, or anything else! Best, and thanks in advance!
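    One way to keep many simultaneously retrying flows from saturating the CPUs is to spread retries out rather than letting them fire in lockstep. A minimal sketch of exponentially spaced delays (the function and values are our own illustration, not a Prefect API):

    ```python
    from datetime import timedelta

    def backoff_delays(base_minutes, retries, factor=2):
        """Exponentially spaced retry delays, to avoid synchronized retry storms."""
        return [timedelta(minutes=base_minutes * factor ** i) for i in range(retries)]

    # Three retries spaced 5, 10, and 20 minutes apart:
    print(backoff_delays(5, 3))
    ```

    A Prefect 1.x task's retry delay is a single `timedelta`, so a schedule like this would be applied per task; whether it relieves the contention depends on why the flows fail in the first place.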
    50 replies
  • Henrietta Salonen

    7 months ago
    Hello, this may be something simple that I'm just totally overlooking, but I'm trying to use `s3.S3Upload.run` in my flow in the following way:
    with Flow("test") as flow:
        s3.S3Upload.run(data, credentials="AWS_CREDENTIALS", bucket='bucket_name', compression='gzip')
    
    flow.run()
    `data` is a JSON string. I keep getting this error: `AttributeError: 'str' object has no attribute 'bucket'`
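    That traceback is consistent with calling `run` on the class rather than on an instance: the first positional argument (the data string) gets bound as `self`. A stand-in class (deliberately not the real `S3Upload`) reproduces the error and shows the instance-first pattern Prefect 1.x tasks expect:

    ```python
    class S3UploadDemo:
        """Stand-in for prefect.tasks.aws.s3.S3Upload; the real class is not imported."""
        def __init__(self, bucket=None):
            self.bucket = bucket

        def run(self, data=None, credentials=None, compression=None):
            bucket = self.bucket  # fails when `self` is actually the data string
            return f"would upload {len(data)} chars to {bucket}"

    # Calling run on the class binds the JSON string to `self`:
    try:
        S3UploadDemo.run('{"a": 1}', credentials="AWS_CREDENTIALS")
    except AttributeError as e:
        print(e)  # 'str' object has no attribute 'bucket'

    # Instantiating first, then calling, works:
    task = S3UploadDemo(bucket="bucket_name")
    print(task.run(data='{"a": 1}'))
    ```

    In a real flow the task instance is typically created once (with `bucket=...`) and then called inside the `with Flow(...)` block.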
    5 replies
• Bo

    7 months ago
    Hello, I am trying to run an ECS task while defining the task definition as a dictionary (https://github.com/anna-geller/packaging-prefect-flows/blob/master/flows/s3_ecs_run_task_definition_as_dict.py), but I keep receiving this error: `An error occurred (ClientException) when calling the RegisterTaskDefinition operation: Container.image should not be null or empty`. I have defined the task as a YAML file (e.g. https://github.com/anna-geller/packaging-prefect-flows/blob/master/flows/s3_ecs_run_custom_task_definition.py) and that works fine, so I am not sure what I am doing wrong. Thanks!
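    The error means registration found a container entry whose `image` was null or empty, so the dict form most likely drops or misplaces that key relative to the working YAML version. A minimal sketch of the dictionary shape ECS expects (every value here is a placeholder):

    ```python
    # Minimal ECS task definition as a dict; all values are placeholders.
    task_definition = {
        "family": "prefect-example",
        "networkMode": "awsvpc",
        "cpu": "256",
        "memory": "512",
        "containerDefinitions": [
            {
                "name": "flow",
                # RegisterTaskDefinition raises the ClientException from the
                # question when this key is missing or empty:
                "image": "prefecthq/prefect:latest",
                "essential": True,
            }
        ],
    }

    print(all(c.get("image") for c in task_definition["containerDefinitions"]))  # True
    ```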
    15 replies
• Ben Welsh

    7 months ago
    I have a private Python package bundled up and stored in Google Artifact Registry (as opposed to an open-source package on PyPI). I'd like to include it in my flow, which uses a Docker storage instance in production. I know that I can use the `python_dependencies` kwarg to the class to include open-source packages. But how do I get a private package from Google Artifact Registry included as well? Is there an established pattern for this?
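    One common pattern (a sketch, not an established Prefect API; the registry URL and package name are placeholders) is to point pip at the private index during the image build, e.g. via Docker storage's extra Dockerfile commands or a custom base image, with the Artifact Registry keyring helper handling auth:

    ```dockerfile
    # Sketch: REGION, PROJECT, REPO, and my-private-package are placeholders.
    # Assumes credentials are supplied at build time (e.g. a build secret or
    # the gcloud keyring helper), not baked into the image.
    RUN pip install keyring keyrings.google-artifactregistry-auth
    RUN pip install --extra-index-url https://REGION-python.pkg.dev/PROJECT/REPO/simple/ my-private-package
    ```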
    6 replies
  • David Serédi

    7 months ago
    Hey. I have started playing with Prefect Orion, and I use a different host (I need to access the UI from another machine). What I noticed after using the start command (`prefect orion start --host MY_IP`) is that everything works fine (I can deploy flows and they run on schedule), except that the UI shows no flows nor any deployments. Am I doing something wrong? Thanks a lot
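    One thing worth checking (a guess based on the Orion-era settings, so verify the names against your version): the UI is served from `MY_IP`, but it may still be asking the API at `localhost` for data. Pointing the relevant settings at the reachable address looks roughly like:

    ```shell
    # Setting names are from the Orion betas; MY_IP is a placeholder.
    prefect config set PREFECT_ORION_UI_API_URL="http://MY_IP:4200/api"
    # Clients (agents, deployments) on other machines also need:
    prefect config set PREFECT_API_URL="http://MY_IP:4200/api"
    ```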
    2 replies
  • jack

    7 months ago
    Anyone had success running an ECSTask on AWS Workspaces? There appears to be a bug where Docker's default "bridge" networking, when run on AWS Workspaces, does not allow any network traffic to/from the outside world. That means when your Dockerfile gets to a line like `RUN pip install -r requirements.txt` that requires network access, it just hangs.
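    A commonly used workaround when the bridge network breaks during image builds (whether it applies on Workspaces is an assumption to test) is to run the build on the host network:

    ```shell
    # Affects only the build; containers started from the image are unchanged.
    docker build --network=host -t my-flow-image .
    ```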
    12 replies
  • Владислав Богучаров

    7 months ago
    Hello everyone! I use AWS for my work. On my macOS laptop, ~/.aws contains the configuration files that grant access to my work cloud. I want to test Prefect with another AWS account, but I'm afraid something might go wrong. As far as I understand, Prefect uses boto3, and boto3 will take its config from the default path. How do I keep my work AWS profile separate from my home one, and, most importantly, how do I tell Prefect which one to use?
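    A sketch of the usual separation with boto3 named profiles (profile names here are placeholders): keep each account under its own section in `~/.aws/credentials`, then select one via `AWS_PROFILE`, which boto3, and therefore anything built on it, honours.

    ```shell
    # ~/.aws/credentials can hold both accounts side by side:
    #   [default]     <- work account
    #   [personal]    <- test account
    #
    # Select the test account for the current shell before running Prefect:
    export AWS_PROFILE=personal
    ```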
    38 replies
  • Alvaro Durán Tovar

    7 months ago
    When using the k8s `RunNamespacedJob`, is it possible to get the logs from the job and have them in the Prefect UI? I'm setting `log_level`, but that isn't working.
    4 replies