Tim Leers
07/20/2024, 8:32 AM
Samuel Hinton
07/20/2024, 8:40 AM
"Result with value None persisted to Prefect." shows clearly in the sidebar. However, Prefect is rerunning all those tasks which say that their return value (None being correct) is persisted. What am I missing here?
Arthur
07/21/2024, 2:38 PM
# pseudocode: kick off one deployment run per item, then collect results
futures = []
for item in items:
    future = run_deployment(name=deployment1)
    futures.append(future)
for future in futures:
    result = task(future)
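Arthur's pseudocode is the standard submit-then-gather pattern. A runnable sketch in plain Python, using `concurrent.futures` with a stand-in trigger function rather than Prefect's `run_deployment` (in Prefect you would pass `timeout=0` to `run_deployment` so it returns immediately instead of blocking on the run):

```python
from concurrent.futures import ThreadPoolExecutor

def trigger_deployment(name: str, item: int) -> str:
    # Stand-in for prefect.deployments.run_deployment(name=..., timeout=0)
    return f"{name}-run-{item}"

items = [1, 2, 3]
with ThreadPoolExecutor(max_workers=3) as pool:
    # Kick off one run per item without waiting...
    futures = [pool.submit(trigger_deployment, "deployment1", i) for i in items]
    # ...then gather all results afterwards, preserving submission order.
    results = [f.result() for f in futures]

print(results)  # ['deployment1-run-1', 'deployment1-run-2', 'deployment1-run-3']
```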
Samuel Hinton
07/22/2024, 3:35 AM
Samuel Hinton
07/22/2024, 5:47 AM
@task(tags=["fetch_data"])
def fetch_data(date: dt) -> Something:
    return some_http_response_payload

@task
def process_stuff(payloads: list) -> None:
    process_and_save_or_something(payloads)

@flow(task_tag_concurrency=dict(fetch_data=3))
async def some_flow(start: dt, end: dt):
    futures = [fetch_data.submit(d) for d in dates_between(start, end)]
    process_stuff(futures)
Either that or something akin to
with task_run_limit(5):
    ...  # some collection of tasks, of which at most 5 will run concurrently
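Note that `task_tag_concurrency` and `task_run_limit` above are wishes, not existing Prefect API. (Prefect 2 does ship server-side tag-based limits, e.g. `prefect concurrency-limit create fetch_data 3`, though those apply globally across flows rather than per-flow-run.) The per-flow semantics being requested can be sketched in plain asyncio with a semaphore:

```python
import asyncio

async def fetch_data(i, sem, running, peaks):
    # The semaphore caps how many coroutines execute this block at once.
    async with sem:
        running[0] += 1
        peaks.append(running[0])
        await asyncio.sleep(0.01)  # stand-in for the HTTP fetch
        running[0] -= 1
        return i

async def some_flow(n: int, limit: int) -> int:
    sem = asyncio.Semaphore(limit)
    running, peaks = [0], []
    await asyncio.gather(*(fetch_data(i, sem, running, peaks) for i in range(n)))
    return max(peaks)  # highest observed concurrency

peak = asyncio.run(some_flow(10, limit=3))
print(peak)  # never exceeds the limit of 3
```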
SantanM
07/22/2024, 10:05 AM
Is there a reason why .deploy cannot create a deployment with build and push set to False? When doing so, Prefect throws an error as follows -
ValueError: Work pool 'test-pool' does not support custom Docker images. Please use a work pool with an `image` variable in its base job template.
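(For reference, a process-type work pool needs no image at all; a deployment can typically be created from source with `.deploy` alone. A minimal sketch, where the repo URL, entrypoint, and pool name are placeholders and a running Prefect API is assumed, so treat it as a configuration fragment rather than a tested recipe:)

```python
from prefect import flow

if __name__ == "__main__":
    flow.from_source(
        source="https://github.com/example/repo.git",  # placeholder: the pull step
        entrypoint="flows/my_flow.py:my_flow",         # placeholder: path:function
    ).deploy(
        name="my-deployment",
        work_pool_name="my-process-pool",  # Process work pool: no image, build, or push
    )
```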
This may be a design choice, but to me it looks like an inconsistency compared with the prefect.yaml way of creating a deployment. I would like to understand the nuance of the design choice and would be grateful if someone could explain it. Also, is there a way of creating a deployment for a ProcessWorker using the Python API by just specifying the pull action?
Ankit
07/22/2024, 12:17 PM
Adeel Shakir
07/22/2024, 2:23 PM
Lucas
07/22/2024, 5:08 PM
prefect deploy command, but I'm curious if there are other methods that are easier or more commonly used within the community. Any suggestions or best practices would be greatly appreciated!
Alan
07/22/2024, 8:50 PM
Nico Neumann
07/23/2024, 12:00 PM
The upgrade guide (https://docs-3.prefect.io/3.0rc/resources/upgrade-prefect-3#workers-replace-agents) says: "A deprecation warning was applied to Prefect interfaces related to agent-based deployments in Prefect's 2.16.4 release, including infrastructure blocks and prefect deployment commands. Agents continue to be supported in Prefect 2 until September 2024, but are not included in Prefect 3."
What exactly does "supported until September 2024" mean? What happens after that date when I use Prefect Cloud?
1. My agents are going to stop working completely
2. My agents continue to work, but I am not able to add new agents
3. My agents continue to work and I am able to add new agents, but when I update Prefect 2 to a newer version after September 2024 I cannot use the agents anymore
4. Something different
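(For anyone migrating: agents are replaced by workers, and the switch is mostly a matter of creating a work pool and starting a worker against it. A sketch with a placeholder pool name; treat it as a configuration fragment:)

```shell
# Create a work pool; type 'process' runs flows as local subprocesses
prefect work-pool create my-pool --type process

# Start a worker polling that pool, replacing `prefect agent start`
prefect worker start --pool my-pool
```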
Arthur
07/23/2024, 1:15 PM
Gabriel Lespérance
07/23/2024, 6:51 PM
Juanes Grimaldos
07/24/2024, 2:31 AM
aviv
07/24/2024, 7:53 AM
17:10:38.717 | INFO | prefect.flow_runs.worker - Worker 'DockerWorker 031b26fc-c041-4d9d-88fc-80b26aeca1d7' submitting flow run '72155316-0a16-4a38-a9bd-754cc3aff59a'
17:10:38.924 | INFO | prefect.worker.docker.dockerworker 031b26fc-c041-4d9d-88fc-80b26aeca1d7 - Creating Docker container '9c7fbabd-15fe-4195-8e70-c21a463a7136'...
17:10:38.977 | INFO | prefect.worker.docker.dockerworker 031b26fc-c041-4d9d-88fc-80b26aeca1d7 - Docker container '9c7fbabd-15fe-4195-8e70-c21a463a7136' has status 'created'
17:10:39.246 | INFO | prefect.flow_runs.worker - Completed submission of flow run '72155316-0a16-4a38-a9bd-754cc3aff59a'
17:10:39.247 | INFO | prefect.worker.docker.dockerworker 031b26fc-c041-4d9d-88fc-80b26aeca1d7 - Docker container '9c7fbabd-15fe-4195-8e70-c21a463a7136' has status 'running'
14:10:40.752 | INFO | prefect.flow_runs.runner - Opening process...
<frozen runpy>:128: RuntimeWarning: 'prefect.engine' found in sys.modules after import of package 'prefect', but prior to execution of 'prefect.engine'; this may result in unpredictable behaviour
14:10:41.501 | INFO | Flow run '9c7fbabd-15fe-4195-8e70-c21a463a7136' - Downloading flow code from storage at '/my/file/path/projects/flows/dummy_block_flow/'
14:10:41.504 | ERROR | Flow run '9c7fbabd-15fe-4195-8e70-c21a463a7136' - Flow could not be retrieved from deployment.
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/prefect/engine.py", line 409, in retrieve_flow_then_begin_flow_run
flow = await load_flow_from_flow_run(flow_run, client=client)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/prefect/client/utilities.py", line 51, in with_injected_client
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/prefect/deployments/deployments.py", line 235, in load_flow_from_flow_run
await storage_block.get_directory(from_path=from_path, local_path=".")
File "/usr/local/lib/python3.11/site-packages/prefect/filesystems.py", line 162, in get_directory
copytree(from_path, local_path, dirs_exist_ok=True, ignore=ignore_func)
File "/usr/local/lib/python3.11/shutil.py", line 559, in copytree
with os.scandir(src) as itr:
^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/my/file/path/projects/flows/dummy_block_flow/'
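(The FileNotFoundError above means the storage path does not exist inside the flow-run container: a local-filesystem storage path on the host is not automatically visible to Docker. One common fix, a sketch assuming the flow code really lives at that host path and using a placeholder image name, is to mount the path into the container at the identical location, e.g. via the work pool's volumes setting or a manual run:)

```shell
# Make the host path visible inside the container at the same path,
# so the local storage block can resolve it (placeholder image name).
docker run \
  -v /my/file/path/projects/flows:/my/file/path/projects/flows:ro \
  my-prefect-image
```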
Yaron Levi
07/24/2024, 2:05 PM
Alan
07/24/2024, 4:37 PM
Aman Pervaiz
07/25/2024, 2:09 AM
Juanes Grimaldos
07/25/2024, 3:05 AM
# Use an official Python runtime as a parent image
FROM python:3.10-slim
# Add our requirements.txt file to the image and install dependencies
COPY requirements.txt .
COPY src/ /src/
# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
WORKDIR /src
EXPOSE 4200
# Run our flow script when the container starts
CMD ["python", "workflow.py"]
and I want to run code that ends like this:
@flow
def main_flow(
    mlflow_path: str = "sqlite:///mlflow/mlflow.db",
    experiment: str = "random-forest",
) -> None:
    """The main training pipeline"""
    datasets = load_data()
    best_hyper_params = train_model(datasets)
    logging.info(f"Best hyperparameters: {best_hyper_params}")
    register_model(mlflow_path, experiment)

if __name__ == "__main__":
    main_flow.serve(
        name="src",
        cron="*/2 * * * *",
        tags=["mlops", "tracking"],
        description="keep track on the model performance",
        version="0.1.0",
    )
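(One note on running this container, assuming the Prefect server listens on the host's port 4200: with `--network=host` on Linux the container shares the host's own network interfaces, so `host.docker.internal` does not resolve; the API URL should point at localhost instead. A configuration sketch:)

```shell
# With host networking, the host's ports are reachable as localhost
docker run --network=host \
  -e PREFECT_API_URL=http://127.0.0.1:4200/api \
  workflow
```

Without host networking, `host.docker.internal` can be made to resolve on Linux by adding `--add-host=host.docker.internal:host-gateway` to the `docker run` command.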
Juanes Grimaldos
07/25/2024, 3:16 AM
# Use an official Python runtime as a parent image
FROM python:3.10-slim
# Add our requirements.txt file to the image and install dependencies
COPY requirements.txt .
COPY src/ /src/
# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
WORKDIR /src
EXPOSE 4200
# Run our flow script when the container starts
CMD ["python", "workflow.py"]
What I expect is a Docker container running a server with the flow deployed and with the schedules ready:
@flow
def main_flow(
    mlflow_path: str = "sqlite:///mlflow/mlflow.db",
    experiment: str = "random-forest",
) -> None:
    """The main training pipeline"""
    datasets = load_data()
    best_hyper_params = train_model(datasets)
    logging.info(f"Best hyperparameters: {best_hyper_params}")
    register_model(mlflow_path, experiment)

if __name__ == "__main__":
    main_flow.serve(
        name="src",
        cron="*/2 * * * *",
        tags=["mlops", "tracking"],
        description="keep track on the model performance",
        version="0.1.0",
    )
The above code depends on several other files (utils, other tasks, and so on), so I need to deploy. But when I run this I get an error:
docker run --network="host" -e PREFECT_API_URL=http://host.docker.internal:4200/api workflow
The error is:
httpx.ConnectError: All connection attempts failed
httpcore.ConnectError: All connection attempts failed
Juanes Grimaldos
07/25/2024, 6:32 AM
Abhishek Mitra
07/25/2024, 6:35 AM
Akhil Jain
07/25/2024, 1:14 PM
Patricio Navarro
07/25/2024, 1:53 PM
prefect block types inspect s3-bucket
gives no help 😕. It's not clear how to create an s3-bucket block linked to an aws-credentials block. I tried different approaches but none of them is working.
This is my code:
resource "prefect_block" "block_aws_credentials_prod" {
  name      = "test-credentials"
  type_slug = "aws-credentials"
  data = jsonencode({
    "aws_access_key_id"     = "XXXXX",
    "aws_secret_access_key" = ""
  })
}

resource "prefect_block" "block_s3_wsf_event_data" {
  name      = "test-bucket"
  type_slug = "s3-bucket"
  data = jsonencode({
    "bucket_name" = "wsf-event-data"
    "credentials" = prefect_block.block_aws_credentials_prod.data
  })
}
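(One pattern worth trying, an assumption based on how Prefect block documents reference one another and not verified against the Terraform provider: nested blocks are linked by block document ID via a `$ref`, rather than by embedding the credential data directly:)

```terraform
resource "prefect_block" "block_s3_wsf_event_data" {
  name      = "test-bucket"
  type_slug = "s3-bucket"
  data = jsonencode({
    "bucket_name" = "wsf-event-data"
    # Reference the credentials block document instead of copying its data
    "credentials" = {
      "$ref" = { "block_document_id" = prefect_block.block_aws_credentials_prod.id }
    }
  })
}
```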
Does anybody have a good idea of how to solve it?
thanks.
LvffY
07/25/2024, 2:32 PM
Thomas van Riet
07/25/2024, 3:18 PM
Derek
07/25/2024, 6:04 PM
Angelika Tarnawa
07/26/2024, 8:37 AM
Samuel Hinton
07/26/2024, 9:47 AM
Aryan Pathania
07/26/2024, 12:45 PM