task) but keep running into this error. Any suggestions? The SSH key is set up correctly, but there doesn't appear to be a user associated with the
commands. This wasn't an issue previously: older flow images can pull git projects just fine. It only came up after I rebuilt the image (the Prefect and Python versions are unchanged).
```
Cloning into './dbt_project'...
No user exists for uid 1000190000
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
```
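"No user exists for uid ..." is characteristic of platforms (OpenShift, and some Kubernetes setups) that run containers as an arbitrary UID with no `/etc/passwd` entry, which breaks SSH's user lookup and therefore git-over-SSH. If that is what changed in the rebuilt image, one common workaround (a sketch, not specific to Prefect; the user name and home dir are placeholders) is an entrypoint that registers the runtime UID:

```shell
#!/bin/bash
# entrypoint.sh (sketch): give the arbitrary runtime UID a passwd entry so
# ssh/git can resolve a user name. Assumes the Dockerfile made /etc/passwd
# group-writable, e.g. `RUN chmod g=u /etc/passwd`.
if ! whoami >/dev/null 2>&1; then
    echo "prefect:x:$(id -u):0:prefect user:/tmp:/bin/bash" >> /etc/passwd
fi
exec "$@"
```

This keeps the image working both locally (where the UID already exists) and on clusters that assign a random UID.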
v2.1.0 was just released an hour ago and it has broken the healthcheck when we try to register a flow. We're using the
as a base image. We now get this error when building the Docker image.
```
Step 21/21 : RUN python /opt/prefect/healthcheck.py '["/opt/prefect/flows/prefect-dbt-run-modelling.prefect"]' '(3, 7)'
 ---> Running in 1d14c333ced9
Beginning health checks...
System Version check: OK
Traceback (most recent call last):
  File "/opt/prefect/healthcheck.py", line 152, in <module>
    flows = cloudpickle_deserialization_check(flow_file_paths)
  File "/opt/prefect/healthcheck.py", line 44, in cloudpickle_deserialization_check
    flows.append(cloudpickle.loads(flow_bytes))
AttributeError: Can't get attribute '_make_function' on <module 'cloudpickle.cloudpickle' from '/usr/local/lib/python3.7/site-packages/cloudpickle/cloudpickle.py'>
```
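The `Can't get attribute '_make_function'` error points at a cloudpickle version skew: cloudpickle pickles reference private helpers inside the library, so the environment that serialized the flow and the one deserializing it (here, the image running the healthcheck) must use compatible cloudpickle versions. Assuming that is the cause, a workaround is to pin the same cloudpickle version on both sides, e.g. in the Dockerfile:

```dockerfile
# Sketch: match the registering environment's cloudpickle exactly
# (the version shown is an assumption; use whichever version both
# sides can agree on).
RUN pip install "cloudpickle==2.0.0"
```

Pinning in the registering environment's requirements file works equally well; the key is that both sides move together.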
folder like this
. When I do so, I get an error. So I tried to reproduce it with the dummy code you have in the image. I get the same error, which is
```python
from flow_utilities.config import funcA, funcB
```
. This really seems silly to me, as it is just importing some functions from a local file, but somehow I can't make it work with this file structure. What am I doing wrong?
```
ModuleNotFoundError: No module named 'flow_utilities'
```
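This error usually means the directory containing `flow_utilities/` is not on `sys.path` in the environment where the flow runs (e.g. inside the Docker image), so the fix is either to install it as a proper package or to put its parent directory on `sys.path`. A minimal sketch (the package and function names are the illustrative ones from the question; here the layout is recreated in a temp dir so the snippet is self-contained):

```python
import sys
import tempfile
from pathlib import Path

# Recreate a layout like the one in the question:
#   <root>/flow_utilities/__init__.py
#   <root>/flow_utilities/config.py
root = Path(tempfile.mkdtemp())
pkg = root / "flow_utilities"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "config.py").write_text("def funcA():\n    return 'A'\n")

# The import only resolves once the *parent* of flow_utilities/ is on sys.path.
sys.path.insert(0, str(root))
from flow_utilities.config import funcA

print(funcA())  # prints: A
```

In a real project, `pip install -e .` (with a setup/pyproject file) inside the image is the more robust equivalent of the `sys.path` manipulation.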
it apparently only searches for local secrets. The docs say I have to change
to false, but I can't understand how... I suppose it goes in the config.toml file, but since the flow has to run on an EKS cluster, how can I specify this for the cluster?
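If this is Prefect 1.x's `use_local_secrets` setting, it lives under the `[cloud]` section of `config.toml`, but Prefect 1.x config values can also be supplied as environment variables of the form `PREFECT__SECTION__KEY`, which is usually the easier route on a cluster where you don't control the config file. A sketch for the flow-run container's spec on EKS (the surrounding manifest is assumed):

```yaml
# Sketch: set the config value via an env var on the flow-run container
env:
  - name: PREFECT__CLOUD__USE_LOCAL_SECRETS
    value: "false"
```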
Alvaro Durán Tovar
is it possible to use docker storage? I'm thinking of possible issues trying to find the path of the flow inside the Docker image: there won't be any "flow" variable at the module level
```python
def build(...):
    with Flow(...) as flow:
        ...
    return flow
```
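One way around the missing module-level variable is simply to call the builder at import time, so the module exposes a `flow` attribute that storage can locate. A sketch with a stand-in `Flow` class (so it runs without Prefect installed; only the last line is the point):

```python
class Flow:
    """Toy stand-in for a Prefect 1.x Flow used as a context manager."""
    def __init__(self, name: str):
        self.name = name

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False

def build(name: str = "example") -> Flow:
    with Flow(name) as flow:
        # ... add tasks here ...
        pass
    return flow

# Docker storage locates the flow by importing the module, so assign the
# built flow at module level:
flow = build()
print(flow.name)  # prints: example
```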
on my own. I see that there is a
on the master branch; was there a conscious decision made not to include this marker file in 2.0? If so, why? Without the marker file, I get:
```python
from prefect.flows import flow
reveal_type(flow)  # Unknown
```
With the marker file, I get:
```python
from prefect.flows import flow
reveal_type(flow)
# Overload[
#   (__fn: (**P@flow) -> R@flow, /) -> Flow[P@flow, R@flow],
#   (*, name: str = None, version: str = None,
#    task_runner: BaseTaskRunner = ConcurrentTaskRunner, description: str = None,
#    timeout_seconds: int | float = None, validate_parameters: bool = True)
#     -> (((**P@flow) -> R@flow) -> Flow[P@flow, R@flow]),
#   (__fn: Unknown | None = None, *, name: str = None, version: str = None,
#    task_runner: BaseTaskRunner = ConcurrentTaskRunner, description: str = None,
#    timeout_seconds: int | float = None, validate_parameters: bool = True)
#     -> (Flow[P, R] | (((**P) -> R) -> Flow[P, R]))
# ]
```
which is much better. For example, now when I decorate a function with
and use it in an
, the return type of the task is known to have a
method. When it was just Unknown, pyright would complain that I was accessing a method that doesn't exist. For someone who enforces fully passing mypy (now pyright) checks on every pull request, this is pretty much a necessity.