prefect-getting-started
  • s

    Scott Chamberlain

    02/10/2023, 5:14 PM
    All of our Prefect tasks run shell commands using
    prefect-shell
    (we’re currently moving from using
    shell_run_command
    to
    ShellOperation
    ). These shell commands all spin up Docker containers. Sometimes a container will hang without any logs for up to 24 hrs. The only thing we’ve found to work is “tickling” the process connected to the container (we’ve done, e.g.,
    tail /proc/<pid>/fd/2
    ), and then we get logs again. We haven’t seen this problem when we start Docker containers outside of a Prefect context, so we’re thinking it’s something to do with Prefect. It is possible this will go away as we transition to
    ShellOperation
    , but perhaps not. Curious if anyone has seen this behavior too? (related thread https://prefect-community.slack.com/archives/C048ZHT5U3U/p1675282219967509 )
  • j

    Jaime Raldua Veuthey

    02/10/2023, 5:45 PM
    Hi, is it somehow possible to start and stop a container in AWS or GCP from a flow from the Prefect Cloud UI? We want to host Metabase in a container but don't want to have it alive 24/7, and it would be nice if the container could be started/stopped somehow from the Prefect UI. Any alternatives are also welcome, thanks!
    ✅ 1
    👀 1
    p
    • 2
    • 4
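One option for the AWS side of the question above: run Metabase as an ECS service and have two small Prefect flows scale it between 0 and 1 tasks with boto3, then trigger (or schedule) those flows from the Prefect Cloud UI. A minimal sketch, assuming boto3 is installed, AWS credentials are available where the flow runs, and the cluster/service names are placeholders:

import boto3
from prefect import flow, task

# Placeholder names -- replace with your own ECS cluster/service.
CLUSTER = "metabase-cluster"
SERVICE = "metabase-service"

@task
def set_metabase_count(desired_count: int) -> None:
    """Scale the ECS service that runs Metabase up or down."""
    ecs = boto3.client("ecs")
    ecs.update_service(cluster=CLUSTER, service=SERVICE, desiredCount=desired_count)

@flow
def start_metabase():
    set_metabase_count(1)

@flow
def stop_metabase():
    set_metabase_count(0)

Deploying start_metabase and stop_metabase as two deployments gives you start/stop buttons in the UI; the same idea works on GCP with the Cloud Run or Compute Engine client libraries.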
  • m

    Mathuraju Sivasankar

    02/11/2023, 8:37 AM
    I am unable to install prefect in windows using "pip install prefect" command in CLI. It is throwing the following error message: Could not install packages due to an OSError: [Errno 2] No such file or directory: 'C:\\Users\\Bhnimda\\AppData\\Local\\Packages\\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\\LocalCache\\local-packages\\Python310\\site-packages\\prefect\\orion\\database\\migrations\\versions\\postgresql\\2022_04_23_132803_d38c5e6a9115_rename_block_to_blockbasis_and_.py'
    ✅ 1
    • 1
    • 1
  • m

    Mathuraju Sivasankar

    02/11/2023, 2:59 PM
    I have created sample Python code with a flow and task in the VS Code editor. I have also started the server using "prefect orion start" in the CLI on Windows. I am not seeing the running flow in the Prefect UI (http://127.0.0.1:4200/). Could anyone help me with this? Thanks in advance.
    ✅ 1
    r
    • 2
    • 3
  • j

    Juan David Lozano

    02/11/2023, 8:21 PM
    Hi, I am following this tutorial to install Prefect on Google Cloud https://medium.com/the-prefect-blog/prefect-server-101-deploying-to-google-cloud-platform-47354b16afe2, but when I get to the step
    cd prefect && python3 -m pip install .
    I get an error that Prefect needs Python 3.7. I tried to upgrade to Python 3.7, but I think the repo was made with a lower version of Python (I could be wrong, I'm just starting to understand Docker). Is there a more updated version of this article, or can somebody point me to how to install Prefect on Google Cloud?
    ✅ 1
    r
    • 2
    • 11
  • h

    Hicham Benbriqa

    02/11/2023, 9:20 PM
    Hi everyone, I am very new to Prefect, and I couldn't find the solution to the bug I am facing: This is the output of
    prefect version
    :
    Version:             2.3.1
    API version:         0.8.0
    Python version:      3.9.7
    Git commit:          1d485b1d
    Built:               Thu, Sep 1, 2022 3:53 PM
    OS/Arch:             linux/x86_64
    Profile:             default
    Server type:         <client error>
  • h

    Hicham Benbriqa

    02/11/2023, 9:22 PM
    I think the line that says Server type: <client error> is problematic, but I am not sure how I can fix that. Now when I run a simple workflow such as:
    from prefect import task, flow
    
    @task
    def a():
        return 1+1
    
    @flow
    def flow1():
        print(a())
    
    if __name__ == '__main__':
        flow1()
  • h

    Hicham Benbriqa

    02/11/2023, 9:22 PM
    I get the following error:
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "test-prefect.py", line 12, in <module>
        flow1()
    .....
    sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: block_spec
    [SQL: ALTER TABLE block_spec RENAME TO block_schema]
    (Background on this error at: <https://sqlalche.me/e/14/e3q8>)
  • h

    Hicham Benbriqa

    02/11/2023, 9:23 PM
    Thank you very much in advance!
  • r

    Rohit Motiani

    02/12/2023, 10:52 PM
    Hello, I get a 'Failed to submit notification' message when trying to create a notification on the Orion server. I am able to create notifications on Prefect Cloud without any issues, though. Any help would be highly appreciated. Thanks.
  • s

    Shivan Trivedi

    02/13/2023, 8:06 AM
    I am trying to use the Prefect REST API to create a flow run for a deployment, but it's giving a 403 and an error message saying "Not authenticated" in the response. What should I add for authentication? Could anyone help me with this? Thanks in advance.
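For the 403 above: Prefect Cloud's REST API expects the API key as a bearer token in the Authorization header. A minimal sketch with httpx, assuming PREFECT_API_URL and PREFECT_API_KEY are set in the environment and the deployment ID is a placeholder:

import os
import httpx

api_url = os.environ["PREFECT_API_URL"]   # e.g. https://api.prefect.cloud/api/accounts/<acct-id>/workspaces/<ws-id>
api_key = os.environ["PREFECT_API_KEY"]
deployment_id = "00000000-0000-0000-0000-000000000000"  # placeholder

response = httpx.post(
    f"{api_url}/deployments/{deployment_id}/create_flow_run",
    headers={"Authorization": f"Bearer {api_key}"},
    json={},  # optional overrides, e.g. {"parameters": {...}}
)
response.raise_for_status()
print(response.json()["id"])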
  • n

    Nick Williams

    02/13/2023, 1:50 PM
    Hi all, really enjoying Prefect so far. Much easier than Airflow and much better suited to my requirements at work. I have a number of pipelines which load from Postgres into BigQuery, performing standard transformations along the way. What I've done is create a flow which loads a given JSON file from a bucket and then loads that table with the settings given in the JSON file. This flow can be run on its own with a parameter specifying the JSON filename. I have another flow which loops through the bucket and launches the single-table subflow for each file in the bucket, so I can schedule all the tables I need on a daily basis. It all works fine and I'm really happy with this. However, each subflow gets a generic name like "active-pig", "tested-puma", etc. Is there any way that I can set the names of these subflows in the Python task decorator? It would give me improved visibility of which tables have been run.
    ✅ 1
    p
    • 2
    • 3
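On the generated subflow names above: recent Prefect 2 releases accept a flow_run_name argument on the @flow decorator (rather than @task), and it can be templated with the flow's parameters, which replaces the random "active-pig"-style names. A minimal sketch, assuming your version supports it; load_table stands in for the single-table subflow:

from prefect import flow

# flow_run_name may reference the flow's parameters, so each subflow run
# is named after the JSON file / table it loads.
@flow(flow_run_name="load-{json_filename}")
def load_table(json_filename: str):
    ...

@flow
def load_all_tables(json_filenames: list[str]):
    for json_filename in json_filenames:
        load_table(json_filename)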
  • a

    Austin Weisgrau

    02/13/2023, 9:23 PM
    Getting an error from AWS when trying to apply deployment with S3 storage block.
    from flows.helloworld.helloworld_flow import helloworld
    from prefect.deployments import Deployment
    from prefect_aws.ecs import ECSTask
    from prefect_aws.s3 import S3Bucket
    
    s3_storage = S3Bucket.load("prod")
    
    ecs_task = ECSTask.load("prod")
    
    helloworld_deployment = Deployment.build_from_flow(
        flow=helloworld,
        name="Hello World",
        storage=s3_storage,
        path="helloworld",
        infrastructure=ecs_task,
    )
    
    helloworld_deployment.apply()
    is giving me
    $ python deployments.py 
    Traceback (most recent call last):
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/s3fs/core.py", line 112, in _error_wrapper
        return await func(*args, **kwargs)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/aiobotocore/client.py", line 358, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (SignatureDoesNotMatch) when calling the PutObject operation: The request signature we calculated does not match the signature you provided. Check your key and signing method.
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "~/code/wfp/wfp-prefect/deployments.py", line 29, in <module>
        helloworld_deployment = WFPDeployment(
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/prefect/utilities/asyncutils.py", line 230, in coroutine_wrapper
        return run_async_in_new_loop(async_fn, *args, **kwargs)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/prefect/utilities/asyncutils.py", line 181, in run_async_in_new_loop
        return anyio.run(partial(__fn, *args, **kwargs))
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/anyio/_core/_eventloop.py", line 70, in run
        return asynclib.run(func, *args, **backend_options)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 292, in run
        return native_run(wrapper(), debug=debug)
      File "~/.pyenv/versions/3.10.9/lib/python3.10/asyncio/runners.py", line 44, in run
        return loop.run_until_complete(main)
      File "~/.pyenv/versions/3.10.9/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
        return future.result()
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 287, in wrapper
        return await func(*args)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/prefect/deployments.py", line 755, in build_from_flow
        await deployment.upload_to_storage(ignore_file=ignore_file)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/prefect/deployments.py", line 600, in upload_to_storage
        file_count = await self.storage.put_directory(
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/prefect/filesystems.py", line 492, in put_directory
        return await self.filesystem.put_directory(
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/prefect/filesystems.py", line 368, in put_directory
        self.filesystem.put_file(f, fpath, overwrite=True)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/fsspec/asyn.py", line 114, in wrapper
        return sync(self.loop, func, *args, **kwargs)
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/fsspec/asyn.py", line 99, in sync
        raise return_result
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/fsspec/asyn.py", line 54, in _runner
        result[0] = await coro
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/s3fs/core.py", line 1101, in _put_file
        await self._call_s3(
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/s3fs/core.py", line 339, in _call_s3
        return await _error_wrapper(
      File "~/.pyenv/versions/prefect/lib/python3.10/site-packages/s3fs/core.py", line 139, in _error_wrapper
        raise err
    PermissionError: The request signature we calculated does not match the signature you provided. Check your key and signing method.
    r
    • 2
    • 1
  • a

    Austin Weisgrau

    02/13/2023, 10:51 PM
    Is there a way to check what the python API client expects for deployment name and flow name for deployments that have already been deployed to the cloud? I have found this behavior confusing. See examples in comment thread
    ✅ 1
    • 1
    • 4
  • t

    Theodor Sjöstedt

    02/14/2023, 9:31 AM
    Hi! I've followed this tutorial to set up a scheduled deployment in Prefect Cloud. I edited the yaml with
    schedule:
      cron: '*/5 * * * *'
      timezone: null
      day_or: true
    is_schedule_active: true
    To have it run every 5 minutes, and then applied it. When I run
    prefect agent start -q 'test'
    I see output from my deployment every 5 minutes, so it's working, but as soon as I
    Ctrl+C
    to exit the output of the agent, it stops and nothing is scheduled. How do I get the agent to run without my terminal? 🙂
    r
    n
    • 3
    • 10
  • t

    Theodor Sjöstedt

    02/14/2023, 2:21 PM
    I've followed this tutorial trying to set up Prefect with Google Cloud Run. However, after following all the steps and running the deployment, the flow run is still Late after 5 minutes. How can I debug what has gone wrong?
    r
    a
    • 3
    • 5
  • a

    Anil

    02/14/2023, 10:55 PM
    Hi everyone. I have just started looking into Prefect and Coiled and am planning to use these two with an ECS cluster. Does anyone have recommendations for good tutorials, examples, or scripts on those? I have watched most of what's on YouTube and am looking into the documentation on the Coiled and Prefect pages now, but didn't find a clear walkthrough for these.
    j
    a
    • 3
    • 3
  • s

    Shivan Trivedi

    02/16/2023, 5:10 PM
    Hey all, I need help: if I use this combination [ cache_key_fn=task_input_hash, cache_expiration=timedelta(seconds=20), retries=1, retry_delay_seconds=20 ] for a task, what takes priority? i.e., if the task crashes or fails and retries with the same input, will the cache ignore that and run the task again, or will it return the same response from the failed state without running the task again? Also, as I have kept the cache_expiration time and retry delay time equal, would it allow the task to run again without any cached response?
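To make the combination in the question above concrete, here is the task decorator spelled out. My understanding (worth verifying against the docs for your version) is that the cache key is only written when a run finishes successfully, so a failed attempt does not populate the cache and the retry really re-executes the task; once a successful result is cached, calls with the same inputs reuse it until cache_expiration passes. A minimal sketch:

from datetime import timedelta

from prefect import flow, task
from prefect.tasks import task_input_hash

@task(
    cache_key_fn=task_input_hash,            # cache key derived from the task's inputs
    cache_expiration=timedelta(seconds=20),  # a cached result is reused for 20s
    retries=1,                               # one retry on failure
    retry_delay_seconds=20,                  # wait 20s before retrying
)
def fetch(value: int) -> int:
    return value * 2

@flow
def demo():
    fetch(1)  # runs (and retries on failure)
    fetch(1)  # same inputs within 20s: reused from cache if the first call succeeded

if __name__ == "__main__":
    demo()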
  • a

    Austin Weisgrau

    02/16/2023, 11:40 PM
    Think I have a bug but want to check that I'm using these methods correctly. The use of
    prefect.get_run_logger()
    is incompatible with testing task functions. Here is a minimal example based on the documentation for testing task functions; I only added the two lines defining and using the logger:
    import pytest
    from prefect import flow, get_run_logger, task
    from prefect.testing.utilities import prefect_test_harness
    
    @pytest.fixture(autouse=True, scope="session")
    def prefect_test_fixture():
        with prefect_test_harness():
            yield
    
    @task
    def my_favorite_task():
        logger = get_run_logger()
        logger.info("running task")
        return 42
    
    @flow
    def my_favorite_flow():
        val = my_favorite_task()
        return val
    
    def test_my_favorite_task():
        assert my_favorite_task.fn() == 42
    Raises
    MissingContextError("There is no active flow or task run context.")
    on the call to
    prefect.get_run_logger()
    • 1
    • 1
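One workaround for the MissingContextError above, assuming your Prefect version ships prefect.logging.disable_run_logger: wrap the .fn() call so get_run_logger() returns a disabled logger instead of raising outside a run context. A minimal sketch:

from prefect import task, get_run_logger
from prefect.logging import disable_run_logger

@task
def my_favorite_task():
    logger = get_run_logger()
    logger.info("running task")
    return 42

def test_my_favorite_task():
    # disable_run_logger() lets get_run_logger() succeed outside a
    # flow/task run context, so the underlying function can be unit tested.
    with disable_run_logger():
        assert my_favorite_task.fn() == 42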
  • s

    Sampath Sukesh Ravolaparthi

    02/20/2023, 6:06 PM
    Hello Everyone!! I wanted to know what the recommended configuration is to run Prefect Server. Is it single-threaded (since it runs on Python), or will it make use of multiple cores (e.g. 2, 4, 8, 16)? What is the recommended RAM for the server (4GB, 8GB, 16GB)? Are there any benchmarks from Prefect that can be viewed? I tried to find the recommended system requirements in the docs, issues, Q&A, Discourse, etc. but couldn't find anything. Hoping to find some help here. Thanks.
  • a

    Alejandro Armas

    02/20/2023, 6:19 PM
    Sorry if this is an incredibly basic question, or if Prefect is not a good solution for this. Is it possible for me to set up a Prefect server on an old gaming desktop with CUDA? I want to run jobs from my laptop on the desktop. The use case is training CNN models overnight. If this is possible, I'd love any helpful advice to get me going in the right direction.
  • e

    eli yosef

    02/21/2023, 7:09 AM
    Hi everyone, I am trying to publish a notification to a Mattermost channel. The channel is within a specific team group. How can I publish a notification to a different channel in a different team group? Example, to publish a webhook from Jira: "mattermost.com/webhook?secret="TOKEN"&team="TEAMGROUP"&channel="prefectchannel" " Posted in #prefect-community
    r
    • 2
    • 2
  • k

    Kalise Richmond

    02/21/2023, 5:57 PM
    📺 Join @Bianca Hoch in 5 minutes for a Getting Started with Prefect Demo livestream
  • g

    Guillaume Bertrand

    02/22/2023, 9:43 AM
    Hi 😄 I was wondering why we can only write or read bytes on Prefect filesystems. Why isn't there any method to write more conventional files like images or videos?
    c
    • 2
    • 5
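On the bytes-only question above: read_path/write_path operate on raw bytes, but any file (image, video, ...) can be round-tripped through them by reading and writing the bytes yourself. A minimal sketch with a LocalFileSystem block; the paths are placeholders:

from prefect.filesystems import LocalFileSystem

fs = LocalFileSystem(basepath="/tmp/prefect-files")  # placeholder basepath

# Write an image by handing the block its raw bytes.
with open("photo.jpg", "rb") as f:
    fs.write_path("uploads/photo.jpg", f.read())

# Read it back and materialize it as a normal file again.
content = fs.read_path("uploads/photo.jpg")
with open("photo_copy.jpg", "wb") as f:
    f.write(content)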
  • a

    Austin Weisgrau

    02/22/2023, 8:31 PM
    Is there any way to make task retry logic specific to particular exceptions? It seems like an anti-pattern to handle any/every exception the same way. I don't see anything in the documentation for the
    @task
    decorator that would allow this.
    r
    • 2
    • 2
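One workaround for the question above, given that @task retries fire on any exception: leave Prefect-level retries off and retry only the exception you care about inside the task body. A minimal sketch; TransientAPIError and flaky_call are hypothetical placeholders:

import time

from prefect import task

class TransientAPIError(Exception):
    """Hypothetical error type that is worth retrying."""

def flaky_call() -> str:
    """Placeholder for the real work."""
    return "ok"

@task  # no Prefect-level retries; selective retrying happens below
def call_api(max_attempts: int = 3, delay_seconds: int = 5) -> str:
    for attempt in range(1, max_attempts + 1):
        try:
            return flaky_call()
        except TransientAPIError:
            if attempt == max_attempts:
                raise  # out of attempts: let the task fail
            time.sleep(delay_seconds)
        # any other exception propagates immediately and fails the task

Newer Prefect releases add a retry_condition_fn argument to @task for exactly this, so it is worth checking whether your version already has it.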
  • d

    Dendi Handian

    02/23/2023, 6:58 AM
    Hi everyone, I want to run Prefect locally using the official Docker image (https://hub.docker.com/r/prefecthq/prefect/tags?page=2). I made a simple docker-compose.yml to get started and I've created 2 services (agent and ui), but I can't access the Prefect UI at
    localhost:4200
    Here is my code repository: https://github.com/dendihandian/transfermarkt-prefect. It requires docker-compose, and anyone can try it by executing
    docker-compose up -d
    🙌 1
    👀 1
  • o

    Ouail Bendidi

    02/23/2023, 4:48 PM
    Hello everyone, any advice on how to have a Prefect flow triggered by RabbitMQ events? My current implementation is a flow that is always running and listens for messages, then triggers the corresponding deployment when an event is received, but it seems very flaky and I would prefer it if the Prefect server were able to do this instead of running it in an agent 🤔
    j
    • 2
    • 1
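For the RabbitMQ question above, a common pattern is to keep the consumer as a small standalone process (not a flow) and have it call run_deployment when a message arrives, so the server/agent only ever deals with ordinary flow runs. A minimal sketch, assuming pika is installed and the queue and deployment names are placeholders:

import json

import pika
from prefect.deployments import run_deployment

QUEUE = "prefect-events"              # placeholder queue name
DEPLOYMENT = "my-flow/my-deployment"  # placeholder "flow-name/deployment-name"

def on_message(channel, method, properties, body):
    payload = json.loads(body)
    # Schedule a flow run for the deployment; timeout=0 returns without waiting.
    run_deployment(name=DEPLOYMENT, parameters=payload, timeout=0)

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue=QUEUE, durable=True)
channel.basic_consume(queue=QUEUE, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()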
  • n

    Nimesh Kumar

    02/24/2023, 12:24 PM
    Hi, I have this flow and I want to run some tasks based on a condition, but even if the condition is false, the rest of the tasks also execute:
    def start_inferencing(my_param, file_path):
        job_id = my_param
        algo_id = "402"
        res_1 = generate_uuid.submit(algo_id, job_id)
        call_on_failure_if_failed(res_1)
        res_2 = get_file.submit(file_path)
        call_on_failure_if_failed(res_2)
        res_4 = choose_valid_file.submit(prev_task=res_2)
        call_on_failure_if_failed(res_4)
        res_5 = prepare_request_upload.submit(prev_task=res_4)
        call_on_failure_if_failed(res_5)
        res_6 = send_data_request.submit(prev_task=res_5)
        call_on_failure_if_failed(res_6)
        res_7 = prepare_predict_request.submit(prev_task=res_6)
        call_on_failure_if_failed(res_7)
        res_8 = send_predict_request.submit(prev_task=res_7)
        call_on_failure_if_failed(res_8)
        res_9 = extract_lunit_jobid.submit(prev_task=res_8)
        call_on_failure_if_failed(res_9)
        res_10 = prepare_fetch.submit(prev_task=res_9)
        call_on_failure_if_failed(res_10)
        res_11 = fetch_request.submit(prev_task=res_10)
        call_on_failure_if_failed(res_11)
        if res_11:
            res_12 = convert_output.submit(prev_task=res_11)
            call_on_failure_if_failed(res_12)
            res_13 = zip_file.submit(prev_task=res_12)
            call_on_failure_if_failed(res_13)
            res_14 = send_to_HGW.submit(prev_task=res_13)
            call_on_failure_if_failed(res_14)
        else:
            pass
            
    
    if __name__ == "__main__":
        start_inferencing(parameters=dict_)
    I want to run convert_output, zip_file, and send_to_HGW only when the condition is true, else exit.
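One thing that may explain the behaviour above: .submit() returns a PrefectFuture, and a future object is always truthy, so if res_11: takes the true branch even when the task returned a falsy value. Resolving the future with .result() gives a value you can actually branch on. A minimal self-contained sketch of just that part:

from prefect import flow, task

@task
def fetch_request() -> bool:
    return False  # stand-in for the real fetch_request task

@task
def convert_output(prev_task):
    ...

@flow
def start_inferencing():
    res_11 = fetch_request.submit()
    # res_11 is a PrefectFuture and is always truthy; resolve it first.
    if res_11.result():
        convert_output.submit(prev_task=res_11)
    else:
        return  # condition false: skip the remaining tasks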
  • w

    Wellington Braga

    02/24/2023, 2:37 PM
    Is there any Prefect-native function in Python which is equivalent to the command:
    prefect deployment ls --flow-name <flow-name>
    ?
    ✅ 1
    r
    • 2
    • 1
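A rough Python equivalent of prefect deployment ls --flow-name <flow-name>, assuming a Prefect 2 client; it reads all deployments and keeps the ones whose parent flow matches the given name ("my-flow" is a placeholder):

import asyncio

from prefect import get_client

async def list_deployments(flow_name: str) -> list[str]:
    async with get_client() as client:
        deployments = await client.read_deployments()
        matches = []
        for deployment in deployments:
            flow = await client.read_flow(deployment.flow_id)
            if flow.name == flow_name:
                matches.append(f"{flow.name}/{deployment.name}")
        return matches

if __name__ == "__main__":
    print(asyncio.run(list_deployments("my-flow")))

I believe the client also accepts filter objects (e.g. a flow-name filter) so the server can do the filtering, which would avoid the per-deployment read_flow calls here.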
  • v

    Vera Zabeida

    02/24/2023, 5:13 PM
    hi there beautiful community! I'm getting started with Prefect and trying out CI/CD. I'm watching this video from the Prefect team about setting up CI/CD with Prefect. Here's roughly the code for the GitHub workflow, which is supposed to apply a deployment on a push to the main branch:
    name: Ad Hoc Deployment
    
    env:
      PREFECT_API_KEY: ${{ secrets.PREFECT_API_KEY }}
      PREFECT_API_URL: ${{ secrets.PREFECT_API_URL }}
    
    on:
      push:
        branches:
          - main
    
    jobs:
      ad-hoc-deployment:
        name: Build and apply deployment
    
        runs-on: ubuntu-latest
        timeout-minutes: 45
    
        steps:
          - uses: actions/checkout@v3
            with:
              persist-credentials: false
              fetch-depth: 0
    
          - name: Set up Python 3.11.1
            uses: actions/setup-python@v4
            with:
              python-version: "3.11.1"
              cache: "pip"
              cache-dependency-path: "requirements*.txt"
    
          - name: Install packages
            run: |
              python -m pip install --upgrade pip
              pip install --upgrade --upgrade-strategy eager -r requirements.txt
    
          - name: Build and apply deployment
            run: |
              prefect deployment build ./flows/pipeline.py:hello \
              --name "Ad Hoc Deployment" \
              --params '{"user_input": "github action"}' \
              --version $GITHUB_SHA \
              --tag ad-hoc-gh \
              --work-queue github \
              --apply
    In the video they set up a
    --storage-block
    (using GitHub as a storage block) and pass it to the prefect deployment build command, but I'm wondering why that would be needed. In the step above we're checking out the repo, so to me that would mean we already have all of the code in this job and no GitHub storage is needed? I've had one build that succeeded with this setup, but I didn't see any deployments in the UI. I double-checked my env vars, and since then I've been getting this error message that isn't very helpful; here's the repo where you can see it:
    prefect.exceptions.PrefectHTTPStatusError: Client error '404 Not Found' for url '***/flows/'
    Response: {'detail': 'Not Found'}
    I'm just trying to get a minimal setup going, and understand what's needed and not needed for that. TIA for any pointers!