Pekka
11/02/2022, 12:43 PM
import prefect.deployments
import prefect.filesystems
import prefect.settings
import prefect_cc.coreflows.flows.mh_fi_etl_flow

with prefect.settings.temporary_settings(updates={prefect.settings.PREFECT_API_URL: prefect_api_url}):
    block = prefect.filesystems.RemoteFileSystem(
        basepath=f"s3://{bucket_name}/",
        settings={"client_kwargs": {"endpoint_url": endpoint_url}},
    )
    block.save("s3-testbucket", overwrite=True)
    block.register_type_and_schema()
    deployment = prefect.deployments.Deployment.build_from_flow(
        flow=prefect_cc.coreflows.flows.mh_fi_etl_flow.the_flow,
        name=deployment_name,
        storage=block,
        work_queue_name=work_queue_name,
    )
    t = deployment.apply()
it's raising
E   prefect.exceptions.PrefectHTTPStatusError: Client error '403 Forbidden' for url 'http://localhost:4200/api/block_types/27b5cbc1-88e9-4378-bd82-3adb9125b432'
E   Response: {'detail': 'protected block types cannot be updated.'}
E   For more information check: https://httpstatuses.com/403
This happens regardless of whether block.register_type_and_schema() is called or not; I don't know how to debug it. The offending line is deployment.apply().
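Not from the thread, but the 403 body in the traceback above carries a machine-readable `detail` field, so a caller can at least distinguish this server-side "protected block type" rejection from an ordinary permission error. A minimal stdlib sketch (the response body is copied from the traceback; the variable names are mine):

```python
import json

# 403 body from the traceback above; the "detail" field identifies the
# server-side rule that rejected the PATCH on the block type.
body = '{"detail": "protected block types cannot be updated."}'
detail = json.loads(body)["detail"]
is_protected_block_rejection = "protected block types" in detail
print(is_protected_block_rejection)  # True
```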
even this breaks
# leaving the server running and making sure the block exists
block = prefect.filesystems.RemoteFileSystem.load("s3-testbucket")
deployment = prefect.deployments.Deployment.build_from_flow(
    flow=prefect_cc.coreflows.flows.mh_fi_etl_flow.the_flow,
    name=deployment_name,
    storage=block,
    work_queue_name=work_queue_name,
)
t = deployment.apply()
Both the Block (s3-testbucket) and the Flow show up, but there is no Deployment or Work Queue.
INFO: 172.28.0.1:38624 - "PATCH /block_types/6701d78b-8719-4db5-9303-cfd648460183 HTTP/1.1" 403 Forbidden
https://docs.prefect.io/api-ref/rest-api/#/Block%20types/update_block_type_block_types__id__patch -- this behavior is not documented
Szymon Szyszko
11/02/2022, 4:55 PM
The journal mode was WAL, which is not allowed on network file systems. I changed that to DELETE without success; the logs do not change.
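For reference, the journal-mode switch described above can be made and verified with a PRAGMA. A minimal stdlib sketch against a throwaway file database (path and filename are made up):

```python
import os
import sqlite3
import tempfile

# Throwaway file-backed database. WAL mode relies on -wal/-shm side files
# and shared-memory locking, which network file systems like EFS typically
# don't support; DELETE mode uses a plain rollback journal instead.
path = os.path.join(tempfile.mkdtemp(), "example.db")
conn = sqlite3.connect(path)

conn.execute("PRAGMA journal_mode=WAL")  # creates the -wal/-shm side files
mode = conn.execute("PRAGMA journal_mode=DELETE").fetchone()[0]
print(mode)  # delete
conn.close()
```

The PRAGMA returns the mode actually in effect, so checking its result confirms the switch took.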
I have seen similar logs to mine here
In my project, I have other SQLite databases in the EFS directory and they work fine.
Tobias
11/09/2022, 10:48 AM
The "Upload to S3 - maintenance flow" failed with a permission error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
I checked the permissions of dataflowops_ecs_task_role and it does have full access to S3 ("s3:*").
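One classic cause of AccessDenied on PutObject even with "s3:*" is the Resource list rather than the Action: bucket-level ARNs cover operations like ListBucket, while object-level operations such as PutObject need the "/*" form. A sketch (the bucket name and policy document are hypothetical, not from the thread):

```python
import json

# Hypothetical role policy. "s3:*" grants every action, but only on the
# resources listed; PutObject targets objects, so the "/*" ARN is required.
policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": "s3:*",
    "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"]
  }]
}
""")
resources = policy["Statement"][0]["Resource"]
covers_objects = any(r.endswith("/*") for r in resources)
print(covers_objects)  # True
```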
Mike Grabbe
11/09/2022, 9:41 PM
Krishnan Chandra
11/10/2022, 2:01 PM
• You must configure remote Storage. Local storage is not supported for ECS tasks. The most commonly used type of storage with ECSTask is S3. If you leverage that type of block, make sure that s3fs is installed within your agent and flow run environment. The easiest way to satisfy all the installation-related points mentioned above is to include the following commands in your Dockerfile:
Normally, I run flows that are stored within the same Docker image that contains the agent. The DockerContainer and KubernetesJob infrastructures both support this (I primarily use KubernetesJob and Process infra atm): https://prefect-community.slack.com/archives/CL09KU1K7/p1666879201904529
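The Dockerfile commands themselves were cut off in the paste above; presumably they install s3fs (plus the AWS integration) roughly along these lines. The base image tag and package list here are my assumptions, not from the thread:

```dockerfile
# Sketch only: base image and versions are assumptions.
FROM prefecthq/prefect:2-python3.10
# s3fs for the S3 storage block; prefect-aws for the ECSTask infrastructure block
RUN pip install s3fs prefect-aws
```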
It seems like it should be possible to run flows directly within the agent container on ECS: https://prefect-community.slack.com/archives/CL09KU1K7/p1665145511193949?thread_ts=1665144718.756049&cid=CL09KU1K7, but I see that the example in dataflow-ops utilizes S3 storage.
If possible, I would like to just run flows within the agent container itself on ECS and specify that the code is there too - is there a way to specify for ECS that the flow code is inside the image that’s being run? If not, is this on the roadmap anywhere?
Thanks for reading :)
Claire Herdeman
11/10/2022, 2:50 PM
Santiago Toso
11/11/2022, 4:54 PM
I'm getting botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied. It is uploading the file correctly, and even the image is building correctly, but it is failing when I run the flow from the UI. From what I understand from the error message, when running the flow from the Docker image it can't access S3 to read the code. Here are a few more details:
• Storage block --> S3. It uploads correctly
• Docker image --> Custom image retrieved from ECR
• Code --> It is the same code I use in another, non-dockerized deployment.
• The EC2 instances for the Prefect Agent, Prefect Server and S3 Bucket are all in the same AWS account
• The EC2 instance for the PrefectAgent has S3 full access (should the EC2 instance for the Prefect Server also have full S3 access?)
I attach an image below with the full error I get in the Prefect UI and in the Agent's terminal.
I saw a similar AccessDenied error in this channel when trying to upload the code to S3, but this seems different: the problem is that the flow running in Docker can't retrieve the code.
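One quick check when a containerized flow gets AccessDenied from S3 is to see which credentials the container actually has: AccessDenied means boto3 did find some credentials, just ones without the needed S3 read permission, so identifying them narrows down which role is really in use. A stdlib sketch (the variable names are the standard AWS ones; this is a generic check, not Santiago's setup):

```python
import os

# Static credentials show up as environment variables. If none are set, the
# SDK falls back to the instance metadata service, i.e. the EC2 instance
# profile; a container only reaches that if its network allows it.
cred_vars = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN")
visible = {name: name in os.environ for name in cred_vars}
print(visible)
```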
Any ideas why this could be? Thanks a lot for your help!
Claire Herdeman
11/15/2022, 9:11 PM
task_definition_arn in an ECSTask block, details below
Joshua Grant
11/16/2022, 3:25 PM
DockerRegistry block with AWS ECR?
Claire Herdeman
11/16/2022, 3:33 PM