• Clovis

    Clovis

    6 months ago
    Hi everyone 👋! I've been using Prefect for a few months now, but I keep encountering the same issue with my Airbyte tasks: for some reason, my Airbyte connection fails at some point and throws a Failed status, yet Prefect considers the task successful (see my screenshot in the attachment). It's a blocking point from my point of view, as it prevents me from relying on Prefect and forces me to double-check in Airbyte every time. Maybe the issue comes from my code, but I don't see why:
    sync_airbyte_connection = AirbyteConnectionTask(
        max_retries=3, retry_delay=timedelta(seconds=10), timeout=timedelta(minutes=30),
    )
    
    with Flow("my flow", run_config=UniversalRun()) as flow:
        airbyte_sync = sync_airbyte_connection(
            <connection_infos>,
        )
        [...]
    
    flow.set_reference_tasks([
        airbyte_sync
    ])
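One way to make the failure surface, independent of the task's own state handling, is to turn the Airbyte job status into a hard exception so the orchestrator cannot mark the run as successful. This is a minimal sketch; the `raise_on_failed_sync` helper and the exact status strings are assumptions for illustration, not part of Clovis's code or the Prefect API:

```python
# Hypothetical helper: translate an Airbyte job status into a hard failure
# so a "Failed" sync can never be reported as a successful task run.
FAILURE_STATUSES = {"failed", "cancelled", "incomplete"}

def raise_on_failed_sync(job_status: str) -> str:
    """Return the normalised status if the sync succeeded, raise otherwise."""
    status = job_status.lower()
    if status in FAILURE_STATUSES:
        raise RuntimeError(f"Airbyte sync finished with status {job_status!r}")
    return status
```

Calling this at the end of the sync task body would force Prefect's retry/failure machinery to engage whenever Airbyte reports anything other than success.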
    Clovis
    Anna Geller
    10 replies
  • Malthe Karbo

    Malthe Karbo

    6 months ago
    Hi, if anyone is experiencing issues with EKS and/or Kubernetes on Dask using Prefect: I found that the latest release (14h ago) of kubernetes-asyncio (22.6) has been breaking all my flows that worked yesterday (it fails to pick up the service account for RBAC and sends all requests as 'system:anonymous'). Pinning 'kubernetes-asyncio<22.6' in my requirements fixed it for me. Issue: https://github.com/PrefectHQ/prefect/issues/5573
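The pin Malthe describes can be expressed directly in a requirements file; the surrounding package names here are illustrative, only the `kubernetes-asyncio` line comes from the report:

```text
# requirements.txt
prefect
dask-kubernetes
kubernetes-asyncio<22.6  # pin below the 22.6 release that breaks RBAC service accounts
```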
    Malthe Karbo
    Kevin Kho
    3 replies
  • Vadym Dytyniak

    Vadym Dytyniak

    6 months ago
    Hey. In a test, I would like to assert that create_flow_run was called with the expected parameters. Is it somehow possible to get the task result?
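A common pattern for this kind of assertion is `unittest.mock.patch` with `wraps=`, which records the call while still delegating to the real function. The sketch below uses a stand-in `create_flow_run` defined locally; in a real test you would patch the task at the import path your flow actually uses:

```python
from unittest import mock

# Stand-in for the real create_flow_run task; in an actual test, patch it
# where your flow imports it (e.g. "my_flows.create_flow_run").
def create_flow_run(flow_name, parameters):
    return "flow-run-id"

def kick_off():
    # Code under test: triggers a child flow run with fixed parameters.
    return create_flow_run(flow_name="my-flow", parameters={"x": 1})

with mock.patch(f"{__name__}.create_flow_run", wraps=create_flow_run) as spy:
    kick_off()
    # Assert the call happened with exactly the expected arguments.
    spy.assert_called_once_with(flow_name="my-flow", parameters={"x": 1})
```

Because `wraps=` delegates to the original, the spy's `return_value` also exposes what the call produced, which covers the "get the task result" half of the question.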
    Vadym Dytyniak
    Anna Geller
    2 replies
• Adam Roderick

    Adam Roderick

    6 months ago
    We've started seeing an error when accessing secrets from Prefect Cloud. Any ideas on how I can troubleshoot this?
    prefect.exceptions.ClientError: [{'path': ['secret_value'], 'message': 'An unknown error occurred.', 'extensions': {'code': 'INTERNAL_SERVER_ERROR'}}]
    Adam Roderick
    Kevin Kho
    +1
    10 replies
  • Chris Reuter

    Chris Reuter

    6 months ago
    We've laughed, we've cried, but most of all we've 🚀 launched! Come join @Jeremiah, @Chris White and myself as we recap Launch Week 2022 at 3p Eastern on YouTube. You might find yourself hearing a few more exciting announcements 😏 https://prefect-community.slack.com/archives/C036FRC4KMW/p1647621014192929
    Chris Reuter
    Jeremiah
    3 replies
  • Aram Panasenco

    Aram Panasenco

    6 months ago
    I'd like to set up a testing framework for testing Prefect flows. The idea is that when someone modifies a Prefect flow in a pull request, an automated DevOps process can launch that flow with certain parameters and somehow test that the flow did what it was supposed to do. I couldn't find anything in the documentation or in Slack. Is there an official testing framework for Prefect flows? If not, what do you all use?
    Aram Panasenco
    Kevin Kho
    4 replies
  • Wei Mei

    Wei Mei

    6 months ago
    Hi, I am making my first pass at using GitHub Actions for CI, and I have hit something I don't fully understand. On pull_request of a feature/** branch, I run actions that: 1. check out the branch, 2. create a project in Prefect Cloud, 3. register flow.py with --label dev. When I run the flow manually, I get an error:
    Unexpected error while running flow: KeyError('Task slug connect_source-1 is not found in the current Flow. This is usually caused by a mismatch between the flow version stored in the Prefect backend and the flow that was loaded from storage.\n- Did you change the flow without re-registering it?\n- Did you register the flow without updating it in your storage location (if applicable)?')
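As the error text suggests, this usually means the flow metadata in the backend and the flow code in storage have diverged, so the CI step needs to re-register after every change to the flow. A sketch of the Prefect 1.x CLI invocation (the project name and path are placeholders for this pipeline's actual values):

```shell
# Re-register the flow so backend metadata matches the code in storage
prefect register --project my-project -p flow.py --label dev
```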
    Wei Mei
    Kevin Kho
    +1
    26 replies
  • Chris Reuter

    Chris Reuter

    6 months ago
    See you all in 5 mins on a Fireside Chat:

    https://www.youtube.com/watch?v=uIv1m3-2tjA

  • Jean-Michel Provencher

    Jean-Michel Provencher

    6 months ago
    Hi, do you guys know if it's possible to pass upstream_tasks with methods that actually require parameters? The documentation is not really clear about how to chain multiple upstream_tasks to create dependencies between them, and I was wondering if some of you had more complex examples. For example, I don't think I can do this:
    with Flow(f"{environment_prefix}-test", storage=S3(bucket=storage_location_bucket_name)) as flow:
        dbt_run(organization_id_param, data_processing_start_date_param, data_processing_end_date_param, should_process_last_period, period, period_value,
                upstream_tasks=[pull_snowflake_secret(a,b), pull_repo(b,c)])
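One way to express this in Prefect 1.x is to call each parameterised upstream task once inside the `Flow` block, keep the returned task instances, and reference those in `upstream_tasks`. This is an untested fragment, not self-contained, assuming the same tasks and names as the snippet above:

```python
# Sketch: bind the upstream tasks first, then pass the bound results to
# upstream_tasks so the dependency is on those specific invocations.
with Flow(f"{environment_prefix}-test", storage=S3(bucket=storage_location_bucket_name)) as flow:
    secret = pull_snowflake_secret(a, b)
    repo = pull_repo(b, c)
    dbt_run(
        organization_id_param, data_processing_start_date_param,
        data_processing_end_date_param, should_process_last_period,
        period, period_value,
        upstream_tasks=[secret, repo],
    )
```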
    Jean-Michel Provencher
    emre
    2 replies