• Heeje Cho

    7 months ago
    I am a little confused about the way Prefect treats mapping. If you map over an empty list/tuple, do no child tasks get created and the task is skipped? Or are all child tasks failed automatically?
    Heeje Cho
    1 reply
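A quick way to see the expected semantics: in Prefect 1.x, mapping over an empty iterable creates zero child task runs rather than failures, which mirrors how Python's built-in map behaves. A minimal stdlib sketch (illustrative only, not using Prefect itself):

```python
# Map-over-empty sketch: with an empty input, the mapped function is
# never invoked, so nothing can fail -- the result is simply empty.
def child_task(x):
    raise RuntimeError("never called for an empty input")

results = list(map(child_task, []))
print(results)  # -> []
```

The parent task just ends up with no children to run; the exception inside `child_task` is never reached.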
  • R Zo

    7 months ago
    Hi, I had a few issues/questions which I will ask in separate posts. post 1) I have set up two different Python environments. In the first environment the latest Prefect version is installed and jobs run to completion. From the first environment I create a second environment and install TensorFlow 2.4.0, which causes some packages such as numpy to be downgraded from 1.22.x to 1.19.5. In the second environment jobs either hang (stop running) or there is a segmentation fault, depending on the OS. Is there a reason for this behavior? Any pointers I could use, or info I could provide? I am using the LocalDaskExecutor for running jobs locally.
  • R Zo

    7 months ago
    post 2) I have also set up a distributed environment where I am using the Prefect server/agent and dask-scheduler/worker. When running the Dask worker locally (i.e. all on the same machine), jobs seem to run okay; however, when running a worker on a different machine, I get errors like the following (the Prefect version is the same across machines):

        raise ClientError(result["errors"])
    prefect.exceptions.ClientError: [{'message': 'Invalid task ID', 'locations': [{'line': 2, 'column': 5}], 'path': ['get_or_create_task_run_info'], 'extensions': {'code': 'INTERNAL_SERVER_ERROR', 'exception': {'message': 'Invalid task ID'}}}]
    ERROR - 2022-02-25 13:01:57,769 - task_runner - Failed to retrieve task state with error: ClientError([{'message': 'Invalid task ID', 'locations': [{'line': 2, 'column': 5}], 'path': ['get_or_create_task_run_info'], 'extensions': {'code': 'INTERNAL_SERVER_ERROR', 'exception': {'message': 'Invalid task ID'}}}])
  • R Zo

    7 months ago
    post 3) This is related to post 1 above: would it be possible to have a Prefect/Dask setup that runs jobs in two different Python environments, one running TensorFlow with perhaps older packages and the other running a newer environment?
    R Zo
    Kevin Kho
    5 replies
  • Ben Muller

    7 months ago
    Hey, I am running a flow locally and have another flow deployed in the cloud. I loop over something and create multiple sub-flows:
    create_flow_run.run(
        flow_name="advanced_cleaning",
        project_name="modelling",
        run_name=f"data_update={data_update}",
        parameters=dict(data_update=data_update),
    )
    When I run this it is called multiple times, but only one (sometimes two) flows are triggered in the cloud. Is this possibly a bug? Do run names need to be unique? This is being called from within a task, FYI.
    Ben Muller
    Kevin Kho
    20 replies
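One hypothesis worth testing for the thread above is deduplication of identically-named runs. A hedged sketch (the helper name is mine, not Prefect's) that makes every sub-flow run name unique by appending a short random suffix; if only one or two of the looped calls still trigger, name collisions can be ruled out:

```python
import uuid

def unique_run_name(data_update: str) -> str:
    # Hypothetical helper: append a short random suffix so that each
    # create_flow_run call in the loop gets a distinct run_name.
    return f"data_update={data_update}-{uuid.uuid4().hex[:8]}"

print(unique_run_name("2022-02-25"))
```

The returned name would then be passed as `run_name=` instead of the plain f-string in the snippet above.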
  • Tomer Cagan

    7 months ago
    Is it possible to run Dask code within a Prefect task? Is there a way to submit work to the Dask cluster that runs my flow (KubeCluster or LocalDaskCluster)? I am looking for a way to make my core functionality rely on Dask and still be able to leverage the orchestration through Prefect...
    Tomer Cagan
    Anna Geller
    +2
    20 replies
  • Aditi Tambi

    7 months ago
    Hi, I am new to this. I am using Python; after initialising a Prefect flow, we use flow.run() to start the flow. I would like to know of any command which can terminate this flow. I did not find any such thing in the Prefect docs.
    Aditi Tambi
    Anna Geller
    12 replies
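For the question above: Prefect 1.x does not ship a dedicated "stop" command for a local flow.run(). One blunt stdlib workaround (a sketch, assuming it is acceptable to host the flow in a child process) is to terminate the process running it:

```python
import multiprocessing
import time

def run_flow():
    # Stand-in for flow.run(); the real flow call would go here.
    while True:
        time.sleep(0.1)

p = multiprocessing.Process(target=run_flow)
p.start()
time.sleep(0.5)
p.terminate()  # hard-stops the child process, and the flow run with it
p.join()
print(p.is_alive())  # -> False
```

This is a forceful kill, so tasks get no chance to clean up; for orchestrated runs, cancelling from the UI/API is the gentler route discussed in the replies.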
  • datamongus

    7 months ago
    Hi everyone, I have a question regarding Prefect Core vs. Prefect Orion. Is it possible to use Prefect Core’s tasks inside Prefect Orion? Meaning, can I use the DBT task library for Core in Orion?
    datamongus
    Anna Geller
    4 replies
  • Donnchadh McAuliffe

    7 months ago
    I have a general question about updating tasks in a flow: say you deploy a flow with an interval schedule, and after a few flow runs you realise there is a bug in one of the tasks. Once you have a fix, is there a way to update the already-deployed flow with it, or do you need to delete the deployed flow and deploy it again with the new code?
    Donnchadh McAuliffe
    Anna Geller
    6 replies
  • Matthias

    7 months ago
    I am trying to replicate running this flow on my own laptop with Docker Desktop. I created a Docker image (but did not push it to e.g. DockerHub) and registered the flow with Prefect Cloud, just like in the script. I started a Docker agent and did a quick run from the UI. So far so good, everything works. But when I modify the Dockerfile so that it can run as non-root, I can no longer do a successful run. I get the following error from the agent:
    docker.errors.NotFound: 404 Client Error for http+docker://localhost/v1.41/containers/f30a3e9241fab0d272c3f36eae16867487fb187b964c8b3f22bc8fd05d2aa4d0/json: Not Found ("No such container: f30a3e9241fab0d272c3f36eae16867487fb187b964c8b3f22bc8fd05d2aa4d0")
    and the flow is stuck in the Submitted state. Does anyone know how to fix it? Just to make sure: the only difference between the successful run and the run stuck in Submitted is that I added a non-root user to the image. https://github.com/anna-geller/packaging-prefect-flows/blob/master/flows_no_build/docker_script_docker_run_local_image.py
    Matthias
    Anna Geller
    9 replies