Powered by Linen
prefect-community
  • a

    Alexey Stoletny

    09/23/2022, 8:13 PM
    Hey everyone! All of our flows on Prefect 2.0 started crashing all of a sudden with this error:
    prefect.exceptions.PrefectHTTPStatusError: Client error '403 Forbidden' for url '<https://api.prefect.cloud/api/accounts/43db7ccd-9f39-41f2-8989-000b28747858/workspaces/cedd89e9-9f12-421e-a17b-94045c976a2a/block_types/e91e7544-7ecd-4fa6-b6d7-a53068cb67fc>'
    Response: {'detail': 'protected block types cannot be updated.'}
    For more information check: <https://httpstatuses.com/403>
    :thank-you: 1
    ➕ 1
  • a

    Alexey Stoletny

    09/23/2022, 8:14 PM
    We’re not even updating the blocks, just reading them, but it still happens @Alix Cook
  • a

    Alix Cook

    09/23/2022, 8:14 PM
    ^^ yes, I'm also seeing all my flows crash
    :thank-you: 1
  • a

    Alix Cook

    09/23/2022, 8:15 PM
    starting at 3:42 pm ET today
  • a

    Alexey Stoletny

    09/23/2022, 8:15 PM
    @Christopher Boyd prefect team, this is affecting production flows, is anyone available to help from Prefect?
    👀 1
    :thank-you: 1
    ➕ 1
    ✅ 1
    c
    a
    m
    • 4
    • 4
  • a

    Alexey Stoletny

    09/23/2022, 8:15 PM
    ^^ same here, starting at the same time
    👍 1
  • i

    Ilya Galperin

    09/23/2022, 8:33 PM
    Getting the same errors here
    :thank-you: 1
  • a

    Anna Geller

    09/23/2022, 8:38 PM
    @Ilya Galperin and everyone else following: thank you for reporting the issue. We'll update the status page and keep you posted as soon as the issue is resolved.
    :thank-you: 2
    :blob-attention-gif: 3
  • a

    Anna Geller

    09/23/2022, 8:42 PM
    It's now resolved. The status page is updated ✅ Thanks again for all reports!
    🎉 3
    🚀 3
  • a

    Alexey Stoletny

    09/23/2022, 8:44 PM
    Appreciate it, confirming the flows are now running again
    :thank-you: 3
  • x

    Xavier Babu

    09/23/2022, 11:31 PM
    Dear Prefect Community, is there any fix available to use a specific Postgres schema for Prefect Orion DB? Thanks, Xavier
    m
    • 2
    • 3
  • i

    Iuliia Volkova

    09/24/2022, 6:53 AM
    @Anna Geller, hi Anna, I have a non-technical question: do you have any materials/articles about the main ideas behind the development of Prefect 2.0? Why was it created, and what problems and gaps did the team try to solve with it?
    ✅ 1
    a
    • 2
    • 2
  • t

    Tadej Svetina

    09/24/2022, 1:54 PM
    Hi! Is there an easy way to redirect a Python logger to the Prefect logger, so the logs show up in the console? I am using a third-party library that uses its own logger (from the standard logging library), and I want to see its logs in the Prefect console as well
    👀 1
    ✅ 1
    o
    a
    • 3
    • 3
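One standard-library way to approach the question above is to attach your own handler to the third-party logger; this is a sketch only, and `third_party_lib` is a hypothetical logger name (substitute whatever name the library actually registers):

```python
import logging
import sys

# Sketch: make a third-party library's logs visible by attaching our own
# handler to its logger. "third_party_lib" is a hypothetical logger name.
lib_logger = logging.getLogger("third_party_lib")
lib_logger.setLevel(logging.INFO)

handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(logging.Formatter("%(name)s | %(levelname)s | %(message)s"))
lib_logger.addHandler(handler)

lib_logger.info("hello from the library")  # now visible on stdout
```

Within Prefect 2 itself, the supported route without code changes is the `PREFECT_LOGGING_EXTRA_LOGGERS` setting (e.g. `PREFECT_LOGGING_EXTRA_LOGGERS=third_party_lib`), which tells Prefect to capture that logger's records into the flow run logs.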
  • o

    Oliver Mannion

    09/24/2022, 1:57 PM
    Hiya, does Prefect 2 have the concept of heartbeats and the Zombie Killer?
    ✅ 1
    a
    o
    • 3
    • 5
  • a

    Alex Turek

    09/24/2022, 7:24 PM
    Question: How can I manually control a task status without the orchestrator trying to rerun it? I'm returning a (non-final) task state and getting automatically retried. I'd like my code to not retry until the task is `Failed`, or to skip retries when it's marked as `Succeeded`.
    ✅ 1
    c
    a
    • 3
    • 25
  • g

    Georgi Yanev

    09/24/2022, 9:29 PM
    Hey everyone, I just updated to 2.4.2 and all my flows stopped working. Did I miss a breaking change? The error is Response: {'detail': 'protected block types cannot be updated.'}. The strange thing is that the block being accessed is of type `local-file-system`, and I do not have such a block in my system.
    ✅ 1
    a
    • 2
    • 14
  • f

    flavienbwk

    09/25/2022, 3:10 PM
    Hi, I have been using Prefect 1.x for a while now and am trying to migrate to 2.x. But I feel like the documentation is missing a lot of examples, and migrating to 2.x is pretty complicated, although I acknowledge the effort put into publishing a lot of (too?) basic getting-started articles. For example, I can't find an example anywhere of adding parameters to my flow. Here I'm trying to register a flow in S3 storage with pre-defined parameters at flow registration. The parameters include S3 credentials to create a bucket IN the flow.
    import os
    import uuid
    
    from prefect import flow
    from prefect.deployments import Deployment
    from prefect.filesystems import RemoteFileSystem
    
    @flow(name="get_paris_weather")
    def get_paris_weather(
        minio_endpoint: str,
        minio_access_key: str,
        minio_secret_key: str,
        minio_use_ssl: bool,
        bucket_name: str,
    ):
        create_bucket(
            minio_endpoint,
            minio_access_key,
            minio_secret_key,
            minio_use_ssl,
            bucket_name,
        )
        city_coordinates = get_city_coordinates("Paris")
        return get_weather(city_coordinates[0], city_coordinates[1])
    
    
    # --- Deployment definition
    
    if __name__ == "__main__":
        bucket_name = os.environ.get("MINIO_PREFECT_FLOWS_BUCKET_NAME")
        minio_endpoint = os.environ.get("MINIO_ENDPOINT")
        minio_use_ssl = os.environ.get("MINIO_USE_SSL") == "true"
        minio_scheme = "https" if minio_use_ssl else "http"
        minio_access_key = os.environ.get("MINIO_ACCESS_KEY")
        minio_secret_key = os.environ.get("MINIO_SECRET_KEY")
    
        flow_identifier = uuid.uuid4()
        block_storage = RemoteFileSystem(
            basepath=f"s3://{bucket_name}/{flow_identifier}",
            key_type="hash",
            settings=dict(
                use_ssl=minio_use_ssl,
                key=minio_access_key,
                secret=minio_secret_key,
                client_kwargs=dict(endpoint_url=f"{minio_scheme}://{minio_endpoint}"),
            ),
        )
        block_storage.save("s3-storage", overwrite=True)
    
        deployment = Deployment.build_from_flow(
            name="get_weather_s3_example",
            flow=get_paris_weather,
            storage=RemoteFileSystem.load("s3-storage"),
            work_queue_name="flows-example-queue",
            parameters={
                minio_endpoint: minio_endpoint,
                minio_access_key: minio_access_key,
                minio_secret_key: minio_secret_key,
                minio_use_ssl: minio_use_ssl,
                bucket_name: bucket_name,
            },
        )
        deployment.apply()
    But the error I get is:
    prefect.exceptions.SignatureMismatchError: Function expects parameters ['minio_endpoint', 'minio_access_key', 'minio_secret_key', 'minio_use_ssl', 'bucket_name'] but was provided with parameters ['False', 'minio', 'minio123', 'prefect-flows', '172.17.0.1:9000']
    Could you explain how I can pass parameters to my flow?
    ✅ 1
    r
    • 2
    • 2
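The `SignatureMismatchError` above is consistent with the `parameters` dict in the snippet using the variables themselves as keys, so their values ('minio', '172.17.0.1:9000', ...) become the parameter names. A minimal sketch of the likely fix is to use string keys matching the flow signature (the function below is a stand-in for the flow in the snippet, with the decorator omitted so the sketch runs without Prefect installed):

```python
import inspect

# Stand-in for the get_paris_weather flow from the snippet above.
def get_paris_weather(minio_endpoint, minio_access_key, minio_secret_key,
                      minio_use_ssl, bucket_name):
    pass

# Example values taken from the reported error message.
minio_endpoint = "172.17.0.1:9000"
minio_access_key = "minio"
minio_secret_key = "minio123"
minio_use_ssl = False
bucket_name = "prefect-flows"

# Buggy: variables used as keys -- their *values* become the keys,
# which is exactly what the SignatureMismatchError reports.
buggy_parameters = {minio_endpoint: minio_endpoint, bucket_name: bucket_name}

# Fixed: keys must be the flow's parameter *names*, as strings.
parameters = {
    "minio_endpoint": minio_endpoint,
    "minio_access_key": minio_access_key,
    "minio_secret_key": minio_secret_key,
    "minio_use_ssl": minio_use_ssl,
    "bucket_name": bucket_name,
}

# The fixed keys now match the function's parameter names exactly.
assert set(parameters) == set(inspect.signature(get_paris_weather).parameters)
```

With the quoted keys, the `parameters=` argument of `Deployment.build_from_flow` in the snippet should line up with the flow signature.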
  • f

    flavienbwk

    09/25/2022, 6:40 PM
    Hi, I'm trying to migrate from 1.x to 2.x with Docker storage and infrastructure. The following image (from a blog post) states that Prefect can store flow data in Docker images, but the documentation doesn't mention it. Is it still possible in 2.x? Is there any example I can find? Thanks.
    ✅ 1
    👍 1
    a
    • 2
    • 9
  • d

    Deepanshu Aggarwal

    09/25/2022, 8:10 PM
    Hi! I was trying to build an S3 block on GitHub Actions, but it fails at `s3.save`. Attaching a screenshot of the error. It would be helpful if anyone has faced a similar issue and has any leads. Thank you
    ✅ 1
    a
    • 2
    • 10
  • u

    张强

    09/26/2022, 3:41 AM
    Does DaskTaskRunner support designated workers? In Dask you would specify them as follows:
    future = client.submit(func, *args, workers=['Alice'],
                           allow_other_workers=True)
    ✅ 1
    m
    • 2
    • 3
  • j

    JV

    09/26/2022, 5:00 AM
    Hello. I am getting the error below while executing a Databricks pipeline using a Prefect flow on a Windows machine. Environment details: Python 3.10; Prefect 2.4.0 (also tested with 2.4.1, the same issue persists). Error:
    prefect.exceptions.ParameterTypeError: Flow run received invalid parameters:
     - run_name: none is not an allowed value
    
    in jobs_runs_submit_and_wait_for_completion
        jobs_runs_state, jobs_runs_metadata = await jobs_runs_wait_for_completion(
    ValueError: too many values to unpack (expected 2)
    ✅ 1
    j
    a
    • 3
    • 10
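A hedged note on the second error above: `ValueError: too many values to unpack (expected 2)` means `jobs_runs_wait_for_completion` returned a different number of values than the two the caller unpacks, which often points to a version mismatch between the flow code and the installed integration (an assumption here, not confirmed by the log). The error class itself is plain Python:

```python
# Minimal reproduction of the error class only; the function name is a
# stand-in, not the real prefect-databricks API.
def wait_for_completion_stub():
    return "state", "metadata", "extra"  # three values

try:
    jobs_runs_state, jobs_runs_metadata = wait_for_completion_stub()
except ValueError as exc:
    message = str(exc)

print(message)  # too many values to unpack (expected 2)
```

Pinning the flow's `prefect-databricks` version to one whose return signature matches the calling code would be the first thing to check.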
  • h

    Ha Pham

    09/26/2022, 9:05 AM
    Hi all, as I understand it, these are the steps to run a flow: • first you create a work queue, and then you start an agent polling that work queue • if you start an agent without creating a work queue first, the queue will be created automatically. In the first case, there may be work queues created without agents. How do you check which agents are running? I don't see that option in the CLI.
    j
    • 2
    • 2
  • a

    Andreas Nord

    09/26/2022, 9:44 AM
    How can I log out from Prefect Cloud locally?
    ✅ 1
    a
    • 2
    • 3
  • h

    Hedgar

    09/26/2022, 10:14 AM
    @Anna Geller I think I have a challenge with a GitHub template: I'm trying to implement the serverless-with-AWS-Lambda template on GitHub. Under the environment section of the `serverless.yml` file, I don't understand the KVPs `PREFECT_API_KEY` and `PREFECT_API_URL`: are the values supposed to be variables or secrets, and where do they come from?
    👍 1
    ✅ 1
    a
    j
    • 3
    • 8
  • a

    Andreas Nord

    09/26/2022, 10:20 AM
    Hi! I've had some problems with flows getting stuck after all tasks are finished, and since I did not receive support, I've tried to create a minimal reproducible example (see code below). What is happening for me: the tasks run fine, but after the tasks complete, memory usage explodes (goes to 100%). Depending on the size of the data, the flow will either: • get stuck after the last finished task • finish successfully • either of the above, intermittently. I can run the same flow perfectly fine in Prefect 1; it also runs a lot faster, even when increasing the data a lot. I'm attaching a log of an unsuccessful Prefect 2 run in the thread:
    code.py
    ✅ 1
    j
    • 2
    • 3
  • e

    Erick Joca

    09/26/2022, 11:28 AM
    Hello, guys! I believe I've found an unexpected behavior that forces us to use escapes in a JSON parameter. I'm running Prefect 2.4.1 under Windows 11. To reproduce the error, just try something similar to this command:
    prefect deployment build testetalend.py:talend --name talend_deploy --apply --params='{"job_name": "myjobname", "batch_file": "C:/Users/myuser/Documents/testeTalend/myscript.bat"}'
    After hitting Enter, we get this message:
    json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
    To work around the problem, type the same command using escapes in the JSON string:
    prefect deployment build testetalend.py:talend --name talend_deploy --apply --params='{\"job_name\": \"myjobname\", \"batch_file\": \"C:/Users/myuser/Documents/testeTalend/myscript.bat\"}'
    Anyway, this behavior is very similar to what was reported in the issue below: https://github.com/PrefectHQ/prefect/issues/3295
    Thanks in advance!
    ✅ 1
    j
    r
    • 3
    • 6
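A hedged illustration of why the escapes matter: the assumption is that the Windows shell strips the unescaped inner double quotes before the CLI sees the string, at which point it is no longer valid JSON, while the `\"` form survives argument parsing intact:

```python
import json

# What the CLI plausibly receives in each case (the stripped form is an
# assumption about Windows quoting, not a captured value).
stripped = '{job_name: myjobname}'       # inner quotes eaten by the shell
escaped = '{"job_name": "myjobname"}'    # quotes preserved via \" escapes

try:
    json.loads(stripped)
except json.JSONDecodeError as exc:
    # Same message the user reported:
    print(exc)  # Expecting property name enclosed in double quotes: line 1 column 2 (char 1)

print(json.loads(escaped))  # {'job_name': 'myjobname'}
```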
  • a

    Aisha Merhebi

    09/26/2022, 11:56 AM
    Hello, I am getting this error when I run a deployment
    ✅ 1
    j
    • 2
    • 9
  • s

    Sylvain Hazard

    09/26/2022, 12:14 PM
    Hey! Prefect 1 question: is there no way for me to schedule a Flow with required arguments, even though the schedule has `default_params` that fill the required parameters? I want to make sure users can't run the flow without specifying some arguments, but I also want to schedule it.
    ✅ 1
    r
    • 2
    • 2
  • b

    Blake Stefansen

    09/26/2022, 2:32 PM
    Hey y'all, I'm getting this weird error when running a prefect 2 deployment from a custom agent running in an ubuntu docker container. My flow code cannot be found when downloading from my s3 storage block during a deployment run. I normally upload the flow code when building my deployment. I can confirm that my flow file definitely exists in my s3 bucket. Any ideas?
    👀 1
    k
    j
    • 3
    • 10
  • x

    Xavier Babu

    09/26/2022, 2:36 PM
    When I use the REST API with the scheduler enabled (e.g., RRULE), I get the following error in Prefect Orion 2.4. It was working fine in 2.0. Even though I have a dedicated storage directory, running the workflow via the REST API points to the default /tmp folder, not the one I set with the parameter PREFECT_LOCAL_STORAGE_PATH. But if I run using .yml, it uses the path specified in PREFECT_LOCAL_STORAGE_PATH and works fine. Please let me know what I am missing in the payload.
    Flow could not be retrieved from deployment.
    Traceback (most recent call last):
      File "<frozen importlib._bootstrap_external>", line 879, in exec_module
      File "<frozen importlib._bootstrap_external>", line 1016, in get_code
      File "<frozen importlib._bootstrap_external>", line 1073, in get_data
    FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpz0cmn1ddprefect/AOPS_SQL_Workflow_v1.py'
    k
    • 2
    • 7
k

Khuyen Tran

09/26/2022, 2:42 PM
Hi @Xavier Babu, thanks for raising the issue. My teammates might be able to help. Can you move the error into a comment so that it is easier for others to follow other threads?
x

Xavier Babu

09/26/2022, 2:45 PM
Payload used:
2022-09-24 16:40:54.170 DEBUG 18568 --- [http-nio-5080-exec-962] com.aops.waves.service.FlowService : jsonObject.toJSONString():
{
  "parameter_openapi_schema": {
    "type": "object",
    "title": "Parameters",
    "properties": { "kwargs": "{\"type\": \"string\", \"title\":\"kwargs\"}" },
    "required": [ "kwargs" ]
  },
  "infrastructure_document_id": "736c6e6f-6c03-40fa-8ea4-71f946a05343",
  "infra_overrides": {},
  "description": "AOPS_SQL_Workflow_DS_12",
  "version": "1",
  "work_queue_name": "waves_q",
  "tags": [ "waves_q" ],
  "path": "/localpart0/aop-shared/WAVES/workflows/",
  "schedule": {
    "rrule": "DTSTART:20220924T124300\nRRULE:FREQ=HOURLY;INTERVAL=1;COUNT=1;UNTIL=20220924T124500",
    "timezone": "US/Eastern"
  },
  "flow_id": "d7202f6b-b929-4139-9e4d-73e2636a3fe0",
  "entrypoint": "AOPS_SQL_Workflow_v1.py:AOPS_SQL_Workflow",
  "name": "AOPS_SQL_Workflow_DS_12",
  "parameters": {
    "kwargs": {
      "sqltype": 1,
      "date_range": "CURRENT_DATE - INTERVAL '1 months'",
      "dbname": "gpprod",
      "selection": "count(*)",
      "flow_name": "AOPS_SQL_Workflow",
      "rpt_flag": "1",
      "tab1": "whse.dim_company",
      "schd_run_name": "AOPS_SQL_Workflow_DS_12",
      "sql": "SELECT {selection} FROM {tab1} WHERE show_in_report_flag = {rpt_flag} and (creation_date > ({date_range}));"
    }
  },
API Response:
{
  "id": "ba109826-6846-4e0e-815b-ee905c593dab",
  "created": "2022-09-23T19:49:43.563714+00:00",
  "updated": "2022-09-24T16:40:54.208488+00:00",
  "name": "AOPS_SQL_Workflow_DS_12",
  "version": "1",
  "description": "AOPS_SQL_Workflow_DS_12",
  "flow_id": "d7202f6b-b929-4139-9e4d-73e2636a3fe0",
  "schedule": {
    "rrule": "DTSTART:20220924T124300\nRRULE:FREQ=HOURLY;INTERVAL=1;COUNT=1;UNTIL=20220924T124500",
    "timezone": "US/Eastern"
  },
  "is_schedule_active": true,
  "infra_overrides": {},
  "parameters": {
    "kwargs": {
      "sql": "SELECT {selection} FROM {tab1} WHERE show_in_report_flag = {rpt_flag} and (creation_date > ({date_range}));",
      "tab1": "whse.dim_company",
      "dbname": "gpprod",
      "sqltype": 1,
      "rpt_flag": "1",
      "flow_name": "AOPS_SQL_Workflow",
      "selection": "count(*)",
      "date_range": "CURRENT_DATE - INTERVAL '1 months'",
      "schd_run_name": "AOPS_SQL_Workflow_DS_12"
    }
  },
  "tags": [ "waves_q" ],
  "work_queue_name": "waves_q",
  "parameter_openapi_schema": {
    "type": "object",
    "title": "Parameters",
    "required": [ "kwargs" ],
    "properties": { "kwargs": "{\"type\": \"string\", \"title\":\"kwargs\"}" }
  },
  "path": "/localpart0/aop-shared/WAVES/workflows/",
  "entrypoint": "AOPS_SQL_Workflow_v1.py:AOPS_SQL_Workflow",
  "manifest_path": null,
  "storage_document_id": null,
  "infrastructure_document_id": "736c6e6f-6c03-40fa-8ea4-71f946a05343"
}
"is_schedule_active": "true" }
Moved.
:thank-you: 2
If we run a workflow every minute for 5 minutes, it fails sporadically with the following error:
Flow could not be retrieved from deployment.
Traceback (most recent call last):
  File "<frozen importlib._bootstrap_external>", line 879, in exec_module
  File "<frozen importlib._bootstrap_external>", line 1016, in get_code
  File "<frozen importlib._bootstrap_external>", line 1073, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmp7v3me6ytprefect/AOPS_SQL_Workflow_v1.py'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/prefect/engine.py", line 257, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_flow_run(flow_run, client=client)
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/prefect/client/orion.py", line 82, in with_injected_client
    return await fn(*args, **kwargs)
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/prefect/deployments.py", line 70, in load_flow_from_flow_run
    flow = await run_sync_in_worker_thread(import_object, str(import_path))
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/prefect/utilities/asyncutils.py", line 57, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(call, cancellable=True)
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/prefect/utilities/importtools.py", line 193, in import_object
    module = load_script_as_module(script_path)
  File "/localpart0/aop-shared/WAVES/prefect2/lib/python3.10/site-packages/prefect/utilities/importtools.py", line 156, in load_script_as_module
    raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'AOPS_SQL_Workflow_v1.py' encountered an exception
Khuyen, any quick tip to resolve this issue? It has stopped all of our DEV work and is impacting our application release date. Please help.
k

Khuyen Tran

09/26/2022, 6:15 PM
Hmm, I'm not sure how to solve this issue. Do you mind opening an issue on Prefect GitHub so the engineers can take a look?
x

Xavier Babu

09/27/2022, 2:24 PM
Good morning Khuyen. I opened an issue/defect yesterday. Since it brought our entire system down, could you please expedite it if possible? https://github.com/PrefectHQ/prefect/issues/6979