# prefect-community
Hi there. I've signed up to Prefect 2.0 Cloud. I'm trying to set up a simple agent to run in a Docker container. So far this is my Dockerfile for the agent:
```dockerfile
FROM prefecthq/prefect:2.0b2-python3.9

RUN prefect cloud login --key cloud_api_key

ENTRYPOINT prefect agent start 'queue_id'
```
However, the second line requires me to specify the workspace. Is there some flag I can add (like `--workspace {workspace}`)? There is probably a much better way to set up an agent, so any other docs would be appreciated! Thank you
I think you can manually configure the endpoint like this:
```shell
$ prefect config set PREFECT_API_URL="https://beta.prefect.io/api/accounts/[ACCOUNT-ID]/workspaces/[WORKSPACE-ID]"
$ prefect config set PREFECT_API_KEY="[API-KEY]"
```
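If you'd rather not bake credentials into the image with a `RUN` step, the same two settings can be supplied as environment variables, which Prefect reads under the same names. A minimal sketch, assuming placeholder account/workspace IDs and key (substitute your own):

```shell
# Sketch only: the IDs and key below are placeholders, not real values.
ACCOUNT_ID="11111111-2222-3333-4444-555555555555"
WORKSPACE_ID="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"

# Equivalent to the two `prefect config set` calls above; passing these with
# `docker run -e PREFECT_API_URL=... -e PREFECT_API_KEY=...` keeps the key
# out of the image layers.
export PREFECT_API_URL="https://beta.prefect.io/api/accounts/${ACCOUNT_ID}/workspaces/${WORKSPACE_ID}"
export PREFECT_API_KEY="placeholder-api-key"

echo "${PREFECT_API_URL}"
```

Note the `/api/` segment in the URL; leaving it out produces a different (404) endpoint.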
as seen a bit below here
ah yes, thank you! Getting this error on container startup:
```
Agent started! Looking for work from queue
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/prefect/client.py", line 747, in get_runs_in_work_queue
    response = await self._client.post(
  File "/usr/local/lib/python3.9/site-packages/prefect/utilities/httpx.py", line 137, in post
    return await self.request(
  File "/usr/local/lib/python3.9/site-packages/prefect/utilities/httpx.py", line 80, in request
  File "/usr/local/lib/python3.9/site-packages/httpx/_models.py", line 1510, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '404 Not Found' for url 'https://beta.prefect.io/accounts/{account_id}/workspaces/{workspace_id}/work_queues/{id}/get_runs'
```
What is your version of Prefect?
the container also uses this image:
```dockerfile
FROM prefecthq/prefect:2.0b2-python3.9
```
Could you try it locally first and see if it works with the env variables? It seems like the API URL might not be right.
same error locally. However, if I use `prefect cloud login --key api_key` followed by `prefect agent start 'queue_id'`, it works fine locally
I'm specifying in the
does exist
In the Cloud UI there is a command you can copy at the bottom. Are you using this command?
ok it works now, apologies for wasting your time but I was missing the . Thanks again for assisting; we're still evaluating Prefect as a replacement for Celery.
Oh, no worries at all. Man, to be honest, I don't hear us being compared to Celery a lot; it's more like Airflow + Celery. I think this would really boil down to the use case, but I'd love to personally hear your thoughts, whatever they are. We have floated the idea of what it would take to achieve task-queue use cases.
I'll message you privately about this
the flow runs are being picked up by the agent (running in a Docker container), however it's now throwing this error:
```
Flow could not be retrieved from deployment.
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/prefect/engine.py", line 199, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_deployment(deployment, client=client)
  File "/usr/local/lib/python3.9/site-packages/prefect/client.py", line 82, in with_injected_client
    return await fn(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/prefect/deployments.py", line 329, in load_flow_from_deployment
    maybe_flow = await client.resolve_datadoc(deployment.flow_data)
  File "/usr/local/lib/python3.9/site-packages/prefect/client.py", line 1664, in resolve_datadoc
    return await resolve_inner(datadoc)
  File "/usr/local/lib/python3.9/site-packages/prefect/client.py", line 1657, in resolve_inner
    data = await self.retrieve_data(data)
  File "/usr/local/lib/python3.9/site-packages/prefect/client.py", line 1227, in retrieve_data
    return await storage_block.read(embedded_datadoc)
  File "/usr/local/lib/python3.9/site-packages/prefect/blocks/storage.py", line 216, in read
    async with await anyio.open_file(storage_path, mode="rb") as fp:
  File "/usr/local/lib/python3.9/site-packages/anyio/_core/_fileio.py", line 156, in open_file
    fp = await to_thread.run_sync(open, file, mode, buffering, encoding, errors, newline,
  File "/usr/local/lib/python3.9/site-packages/anyio/to_thread.py", line 28, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(func, *args, cancellable=cancellable,
  File "/usr/local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 818, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 754, in run
    result = context.run(func, *args)
FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/ck
```
I’d have to ask the team about that and get back to you
This looks to be storage-related: the flow cannot be retrieved from its storage. Did you specify a default storage?
@Donnchadh McAuliffe you can't use local storage with DockerFlowRunner - for now, the best option would be switching to object storage. This thread contains examples for S3 and GCS https://discourse.prefect.io/t/how-to-use-dockerflowrunner-with-s3-storage-im-getting-a[…]ed-403-when-calling-the-headobject-operation-forbidden/530
thanks @Anna Geller, I've set my storage to an S3 bucket using `prefect storage create`. However, how do I tell my agent to use this S3 bucket as the storage when running the flow?
you don't have to, as long as it's the default storage. You can set this storage as the default using:
```shell
prefect storage set-default STORAGE_BLOCK_ID
```
I still get the same error I've mentioned above:
```
FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/ck/m4st70r516n_329q20h359mr0000gn/T/prefect/e15caade-32ac-458c-b346-a32bd7a0c9d2'
```
These are the steps I've done so far:
1. Signed up to Prefect Orion Cloud
2. Created a queue
3. Deployed a flow
4. Assigned default storage to S3
5. Started a local agent (not on Docker) using `prefect agent start '{queue_id}'`
actually, it worked, I just needed to redeploy the flow
and then in my agent's Dockerfile I just needed to set the storage:
```dockerfile
RUN prefect storage set-default ...
```
nice work! 🙌
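Pulling the thread together, a working agent Dockerfile along the lines discussed might look like the sketch below. The build args, storage block ID, and queue ID are placeholders (anything not shown earlier in the thread is an assumption), and passing the API key at build time bakes it into the image, so treat this as a starting point rather than a hardened setup:

```dockerfile
FROM prefecthq/prefect:2.0b2-python3.9

# Placeholder build args -- supply real values with `docker build --build-arg ...`.
ARG PREFECT_API_URL
ARG PREFECT_API_KEY

# Point the agent at the Cloud workspace (note the /api/ segment in the URL).
RUN prefect config set PREFECT_API_URL="${PREFECT_API_URL}" \
 && prefect config set PREFECT_API_KEY="${PREFECT_API_KEY}"

# Make the S3 storage block the default so deployed flows can be retrieved
# (STORAGE_BLOCK_ID is a placeholder for the ID from `prefect storage create`).
RUN prefect storage set-default STORAGE_BLOCK_ID

ENTRYPOINT ["prefect", "agent", "start", "queue_id"]
```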