Daniel
07/31/2024, 2:22 PM
```
File "/Users/dansimons/amygda/angel-class180-project/menv/lib/python3.10/site-packages/prefect/deployments/steps/core.py", line 125, in run_step
    result = await from_async.call_soon_in_new_thread(
File "/Users/dansimons/amygda/angel-class180-project/menv/lib/python3.10/site-packages/prefect/_internal/concurrency/calls.py", line 327, in aresult
    return await asyncio.wrap_future(self.future)
File "/Users/dansimons/amygda/angel-class180-project/menv/lib/python3.10/site-packages/prefect/_internal/concurrency/calls.py", line 352, in _run_sync
    result = self.fn(*self.args, **self.kwargs)
File "/Users/dansimons/amygda/angel-class180-project/menv/lib/python3.10/site-packages/prefect_docker/deployments/steps.py", line 200, in build_docker_image
    raise BuildError(event["error"])
prefect.utilities.dockerutils.BuildError: NotFound: content digest sha256:722de475f5c2221f2a474ab46be25e9693c92ad57e73082c55a1296c7a646a21: not found
```
Here is my prefect_dev.yaml file:
```yaml
# Generic metadata about this project
name: amy-atl-pipline
prefect-version: 2.19.1

# build section allows you to manage and build docker images
build:
- prefect_docker.deployments.steps.build_docker_image:
    id: build_image
    requires: prefect-docker>=0.3.1
    image_name: europe-west2-docker.pkg.dev/amygdadss/prefect-repo/atl-data-pipeline
    tag: v1
    dockerfile: ./Dockerfile
    platform: linux/amd64
    buildargs:
      GITHUB_API_KEY: '{{ $GITHUB_TOKEN }}'

# push section allows you to manage if and how this project is uploaded to remote locations
push:
- prefect_docker.deployments.steps.push_docker_image:
    requires: prefect-docker>=0.3.1
    image_name: '{{ build_image.image_name }}'
    tag: '{{ build_image.tag }}'

# pull section allows you to provide instructions for cloning this project in remote locations
pull:
- prefect.deployments.steps.set_working_directory:
    directory: /opt/prefect/amy-atl-pipline

# the deployments section allows you to provide configuration for deploying flows
deployments:
- name: atl-pipeline-deploy-dev
  version:
  tags: ["dev"]
  description:
  entrypoint: flows/atl_flow.py:process_atl_data
  parameters:
    new_file_list: ["angel_trains/class180_109_UML/June 2021/export_position_engine_data_2021-06-02.xlsx"]
  work_pool:
    name: cloud-run-work-pool
    work_queue_name: default
    job_variables:
      image: '{{ build_image.image }}'
  schedules: []
```
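As an editorial aside (not part of the original thread): the `{{ build_image.image }}`-style placeholders in this file are filled in from the outputs of earlier steps, so the push step and the deployment reuse whatever the build step produced. A minimal sketch of that idea, using values from the YAML above; the resolution logic here is an illustrative assumption, not Prefect's actual templating engine:

```python
import re

# Outputs a build step with id "build_image" might expose (values from the YAML above).
step_outputs = {
    "build_image": {
        "image_name": "europe-west2-docker.pkg.dev/amygdadss/prefect-repo/atl-data-pipeline",
        "tag": "v1",
    }
}
# The combined "image" output is name:tag.
step_outputs["build_image"]["image"] = "{image_name}:{tag}".format(**step_outputs["build_image"])

def resolve(template: str) -> str:
    """Replace '{{ step_id.field }}' placeholders with recorded step outputs."""
    def repl(match):
        step_id, field = match.group(1).split(".")
        return str(step_outputs[step_id][field])
    return re.sub(r"\{\{\s*(\w+\.\w+)\s*\}\}", repl, template)

print(resolve("{{ build_image.image }}"))
# europe-west2-docker.pkg.dev/amygdadss/prefect-repo/atl-data-pipeline:v1
```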
Daniel
07/31/2024, 2:22 PM
@Marvin why am i getting this error
Daniel
07/31/2024, 2:23 PM
Marvin
07/31/2024, 2:23 PM
2. Check Dependencies: Ensure that the required packages (e.g. prefect-docker) are installed and correctly configured in your environment.
3. Inspect Dockerfile: Verify the contents of your Dockerfile and ensure all paths and configurations are correct.
Here's a resource that might help:
- Building Docker Images with Prefect Docker
If the problem persists, I recommend checking your Dockerfile content and the specific configuration provided in the build_docker_image step.
If additional context or a minimal reproducible example is shared, it might be easier to pinpoint the issue.
Daniel
07/31/2024, 2:26 PM
Marvin
07/31/2024, 2:27 PM
1. Platform Flag: Check whether --platform=linux/amd64 is necessary for your use case. It generally isn't needed unless specifically required.
2. Run Commands: Chain commands with && correctly and avoid HTML entities such as &amp;, which could cause issues.
3. Copy Instructions: Copy as few files as necessary to keep the Docker layers small.
4. Modify requirements.txt: Use single quotes consistently.
5. Environment Variable ARG: Ensure the variable is correctly passed.
6. Entry Point Syntax: Double-check the quotation marks in your entry point.
Here's a cleaned-up version of your Dockerfile:
```dockerfile
# Use the official Python image as a base image
FROM python:3.10.12-slim

# Set the working directory in the container
WORKDIR /opt/prefect/amy-atl-pipline

# Install system dependencies, including Git and g++
RUN apt-get update && apt-get install -y \
    gcc \
    g++ \
    libpq-dev \
    git \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements file
COPY requirements.txt .

# Modify requirements.txt to exclude shap
RUN sed -i '/shap/d' requirements.txt

# Set Git buffer size to allow AmygdaAnalytics lib to be installed
RUN git config --global http.postBuffer 104857600

# Copy the rest of the application code
COPY . .

# Accept build argument for GitHub API key
ARG GITHUB_TOKEN
# Set it as an environment variable
ENV GITHUB_TOKEN=$GITHUB_TOKEN

# Specify the entry point for the container
ENTRYPOINT ["python"]
```
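Editor's note: the `sed -i '/shap/d' requirements.txt` line in this Dockerfile deletes every line containing `shap`. A quick way to preview that effect locally, using a hypothetical helper and made-up requirement lines (not from the thread):

```python
def drop_matching(lines, pattern="shap"):
    """Mimic sed '/pattern/d': keep only lines that do not contain the pattern."""
    return [line for line in lines if pattern not in line]

# Hypothetical requirements for illustration:
reqs = ["pandas==2.2.2", "shap==0.45.0", "prefect-docker>=0.3.1"]
print(drop_matching(reqs))  # ['pandas==2.2.2', 'prefect-docker>=0.3.1']
```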
Make sure your GITHUB_TOKEN is securely handled and not hard-coded.
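Editor's note: one way to follow that advice is to read the token from the environment and fail fast if it is missing; the helper below is a hypothetical sketch, not code from the thread. Note, too, that prefect_dev.yaml passes the build arg as GITHUB_API_KEY while this Dockerfile declares ARG GITHUB_TOKEN; those names must agree for the value to actually reach the build.

```python
import os

def get_build_args(var_name="GITHUB_TOKEN"):
    """Fetch the token from the environment instead of hard-coding it."""
    token = os.environ.get(var_name)
    if not token:
        raise RuntimeError(f"{var_name} is not set; export it before deploying")
    return {var_name: token}
```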
If this doesn't resolve the issue, ensure:
- The Docker daemon is running properly.
- The dependencies in requirements.txt are installable.
- There are no lingering issues with paths or file access permissions.
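Editor's note on the dependency point: a small sanity check that an installed version satisfies a pin like prefect-docker>=0.3.1 can be sketched as below (a naive numeric-only comparison, not a full PEP 440 parser; swap the hard-coded strings for importlib.metadata.version("prefect-docker") in practice):

```python
def version_tuple(version: str) -> tuple:
    """Naively split a version string like '0.3.1' into comparable integers."""
    return tuple(int(part) for part in version.split(".") if part.isdigit())

def meets_minimum(installed: str, minimum: str) -> bool:
    """True if the installed version is at least the required minimum."""
    return version_tuple(installed) >= version_tuple(minimum)

print(meets_minimum("0.5.0", "0.3.1"))  # True
print(meets_minimum("0.2.9", "0.3.1"))  # False
```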