# prefect-community
when you put AWS_CREDENTIALS in via the UI as a Secret, do you use the text or the json options, and do you put it in as a JSON snippet?
The syntax is JSON-based, so in the UI it would be:
```json
{
  "ACCESS_KEY": "abcdef",
  "SECRET_ACCESS_KEY": "<your-secret-key>"
}
```
okay. That is what I've done. Since I'm running an ECSAgent, I'm suspecting there's something more I have to do?
Are you running into a specific error with the ECS agent? or when you deploy a Flow?
when I run a flow
says no credentials found
Your Flow is in S3?
@Kevin Kho I had resolved what I thought were the errors with my earlier thread about AWSClientWait. Then, when actually running the flow, this error now occurs. I looked at the docs and added the secret, but I feel like there's one more thing to connect.
Try adding the secret to storage like this
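[Editor's note: the snippet referenced here did not survive the export. A minimal sketch of attaching the secret to S3 storage in Prefect 1.x, with a placeholder bucket name:]

```python
from prefect.storage import S3

# Passing the secret name to storage lets the flow-run container
# authenticate when it pulls the pickled flow from S3.
STORAGE = S3(bucket="my-flow-bucket", secrets=["AWS_CREDENTIALS"])
```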
yes, flow is in s3.
I'll try this . thanks, folks.
one explosion after another:
```
[31 January 2022 2:33pm]: Failed to load and execute Flow's environment: FlowStorageError("An error occurred while unpickling the flow:\n TypeError('code() takes at most 15 arguments (16 given)')\nThis may be due to one of the following version mismatches between the flow build and execution environments:\n - python: (flow built with '3.8.8', currently running with '3.7.12')")
```
seems to have fixed the credentials error tho
my ecsagent is using `"image": "prefecthq/prefect:latest-python3.8"`, so I'm not sure what I've done wrong. My flow in its entirety:
```python
from prefect import Flow, task
from prefect.tasks.aws.batch import BatchSubmit
from prefect.tasks.aws.client_waiter import AWSClientWait
from prefect.client import Secret
from prefect.backend.kv_store import get_key_value
from prefect.storage import S3

FLOW_NAME = "aws-batch-hello-world-flow"  # name of your flow in cloud UI
PROJECT_NAME = "tokenized-channel-fullfillment"  # project that flows/runs are attached to
STORAGE = S3(bucket="sgmt-prefect-dev-flows", secrets=["AWS_CREDENTIALS"])  # bucket where flows are stored. Updates to this require you to run locally.
# boto_args = {
#     "containerOverrides": {
#         "environment": [
#             {"name": "workflow_SCRIPT", "value": get_key_value(key="workflow_SCRIPT")},
#             {"name": "workflow_OPTS", "value": get_key_value(key="workflow_OPTS")},
#         ],
#         "vcpus": 4,
#         "memory": 32750,
#     },
# }

@task
def wait_for_hello(job_id):  # , delay, max_attempts):
    waiter = AWSClientWait(client="batch", waiter_name="JobComplete")
    waiter.run(
        waiter_kwargs={
            "jobs": [job_id],
            # "WaiterConfig": {
            #     "Delay": delay,
            #     "MaxAttempts": max_attempts,
            # },
        }
    )
    return job_id

@task
def say_hello():
    # job_definition and job_queue (not shown here) must also be supplied
    batchjob = BatchSubmit(job_name="prefect-tcf-hello")
    job_id = batchjob.run()
    return job_id

with Flow(FLOW_NAME, storage=STORAGE) as flow:
    job_id = say_hello()

flow.register(project_name=PROJECT_NAME, labels=["dev"])
```
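[Editor's note: the commented-out waiter configuration above mirrors boto3's waiter API. As a plain-dict sketch (the job id, delay, and attempt counts are arbitrary examples):]

```python
job_id = "example-job-id"  # placeholder, normally returned by BatchSubmit

# boto3-style waiter arguments: poll the job every 10 s, up to 100 times
waiter_kwargs = {
    "jobs": [job_id],
    "WaiterConfig": {"Delay": 10, "MaxAttempts": 100},
}
```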
Your image is using 3.8. I think you registered in 3.8, but the ECS task is pulling a 3.7 container. Python minor versions (and preferably Prefect versions) need to match for serialization and deserialization to be consistent. You want to specify your ECS image.
Right, and I specified that in the agent config when I spun it up with Terraform:
```
"image": "prefecthq/prefect:latest-python3.8",
```
I've got 3.8.8 local
The agent container is different from the flow run container. The agent spins up a new ECS task in another container and I think it’s that other ECS task that is 3.7
You can use the ECSRun configuration to add a container specification to the Flow Run
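[Editor's note: a minimal sketch of that run configuration, assuming the same image tag used by the agent:]

```python
from prefect.run_configs import ECSRun

# Pin the flow-run container to the same Python minor version the
# flow was registered with (3.8 here), so unpickling succeeds.
flow.run_config = ECSRun(image="prefecthq/prefect:latest-python3.8")
```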
yeah, I ran on a virtualenv on 3.7 to test
Sorry I’m a bit confused. Are you good now?
I think so. Sorry, instead of changing the image, I just recompiled under a different Python version. I also did what you suggested with ECSRun and have a flow in progress now. I think I have SOMETHING working so I can move forward. Thanks so much for all your help so far.
Ah ok. Yep sounds good!