
Sang Young Noh

over 3 years ago
Hi, I'm currently trying to do some deployment testing, and I'm getting the following error:

```
Specification in 'newflow.py', line 13 failed validation! You have not configured default storage on the server or set a storage to use for this deployment but this deployment is using a Universal flow runner which requires remote storage.
```

My file currently contains:

```python
from prefect import flow

@flow
def hello_world(name="world"):
    print(f"Hello {name}!")

# Note: a deployed flow does not need a command to
# explicitly run the flow. The API handles this for you.
# hello_world()

from prefect.deployments import DeploymentSpec

DeploymentSpec(
    flow=hello_world,
    name="hello-world-daily",
)
```

and the error comes out when I run:

```
prefect deployment create newflow.py
```
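The error says the deployment falls back to a Universal flow runner, which requires remote storage. One possible workaround, sketched against the Prefect 2.0 beta API of that era (the `prefect.flow_runners` import path and the `flow_runner` argument are assumptions to verify against the installed version), is to pin the deployment to a runner that executes locally:

```python
from prefect.deployments import DeploymentSpec
from prefect.flow_runners import SubprocessFlowRunner  # assumed import path in the 2.0 beta

from newflow import hello_world  # hypothetical: the module holding the @flow above

DeploymentSpec(
    flow=hello_world,
    name="hello-world-daily",
    # A concrete (non-Universal) runner may avoid the remote-storage requirement.
    flow_runner=SubprocessFlowRunner(),
)
```

Alternatively, configuring default storage on the server would satisfy the Universal runner's requirement.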

JV

almost 3 years ago
Hi everyone, I am trying to trigger a Prefect flow deployed on Cloud whenever a file lands in an S3 bucket, using an AWS Lambda function, referring to this documentation from Chris. The Lambda function is failing with:

```
"errorMessage": "HTTP Error 404: Not Found"
```

I am passing the API URL <https://api.prefect.io>. I understand that this documentation is not the latest, and I also tried the URL <https://api.prefect.cloud/> and get the same error. I'd appreciate your input on this error in Prefect version 2.0.
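In Prefect 2, Cloud API routes are scoped to an account and workspace rather than the Prefect 1 `api.prefect.io` GraphQL endpoint, which would explain a 404 against either bare base URL. A minimal sketch of how such a URL might be assembled (the exact path segments here are an assumption for illustration, not confirmed from Prefect docs):

```python
def build_run_deployment_url(account_id: str, workspace_id: str, deployment_id: str) -> str:
    """Assemble an assumed Prefect Cloud 2 endpoint for creating a flow run
    from a deployment. The path layout is an assumption for illustration."""
    base = "https://api.prefect.cloud/api"
    return (
        f"{base}/accounts/{account_id}"
        f"/workspaces/{workspace_id}"
        f"/deployments/{deployment_id}/create_flow_run"
    )

# A Lambda handler would POST to this URL with an
# "Authorization: Bearer <API key>" header.
```

The account and workspace IDs are visible in the Cloud UI's URL when a workspace is open.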

John Kang

about 3 years ago
Question: I am running a flow that saves a file to a relative path. When I check the relative path on my local filesystem, the file is not updated when the flow executes from a Prefect agent. I tried again with a remote filesystem and it does not update the files there either. I checked on the server, and the blocks holding the credentials for both the local and remote filesystems state that the block has read and write privileges. Any idea what I can do to actually have the file update locally (when the deployment runs using the local filesystem) and remotely (when it runs using the remote filesystem)?
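One common cause (an assumption about this setup, not a confirmed diagnosis): an agent typically executes the flow in its own working directory, often a temporary one, so a relative path resolves there rather than in the directory being checked. A small stdlib sketch showing how the same relative path points at different files depending on the process CWD:

```python
import os
import tempfile
from pathlib import Path

relative = Path("output/result.txt")

with tempfile.TemporaryDirectory() as agent_dir:
    original_cwd = os.getcwd()
    try:
        os.chdir(agent_dir)  # simulate the agent's working directory
        resolved_in_agent = relative.resolve()
    finally:
        os.chdir(original_cwd)

resolved_here = relative.resolve()

# The two absolute paths differ, so a file written by the agent
# never appears where the local check looks.
print(resolved_in_agent != resolved_here)
```

Writing to an absolute path (or a path anchored to an explicitly configured base directory) sidesteps the ambiguity.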

Peter Roelants

over 4 years ago
Hi Prefect, I was wondering if it is possible to connect the Prefect UI to an internal network that might not be available from the host I'm viewing the UI with? It seems that currently `server.ui.apollo_url` is called from the browser (and not from the server where the UI is running). For example, I have a docker-compose-managed Prefect server somewhere with an internal network called `prefect-server`, which has `apollo`, `ui`, and the other Prefect server services. Using `apollo_url="<http://localhost:4200/graphql>"` only works if I also bind `apollo:4200` to the local `4200` port where I'm running the browser from. Setting `apollo_url="<http://apollo:4200/graphql>"` and letting the UI -> GraphQL connection go via the Docker network wouldn't work, afaik. In this case I see the following error in my browser's JS console:

```
VM9:1 POST <http://apollo:4200/graphql> net::ERR_NAME_NOT_RESOLVED
```
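Since the browser, not the UI container, resolves `apollo_url`, one pragmatic workaround is to publish the Apollo port on the compose host and point `apollo_url` at a hostname the browser can resolve. A sketch of the relevant compose fragment (service names, the port, and the env-var override format are assumptions mirroring the setup described above):

```yaml
services:
  apollo:
    # ...image and config as in the existing compose file...
    ports:
      - "4200:4200"   # publish so the browser can reach Apollo directly
  ui:
    environment:
      # Must be resolvable from the *browser*, not from inside the compose network.
      # Assumed env-override spelling for server.ui.apollo_url:
      PREFECT_SERVER__UI__APOLLO_URL: "http://<host-reachable-name>:4200/graphql"
```

A reverse proxy that exposes both `ui` and `apollo` under one browser-reachable hostname achieves the same thing without publishing the port directly.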

Santhosh Solomon (Fluffy)

over 1 year ago
Hey guys, has anyone considered using Celery as the executor for Prefect tasks? I.e., a part of the workflow has CPU-intensive tasks and needs a higher level of concurrent distribution, which can't be achieved with Prefect agents. So the Prefect agent would submit the tasks to a message queue with the necessary parameters, and Celery would manage those executions. Open for triage!
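A sketch of the shape this architecture could take (assumptions: a Redis broker at a placeholder URL and a hypothetical `heavy_compute` function; the Prefect side is indicated only in comments, since wiring it up depends on the agent setup):

```python
from celery import Celery

# Hypothetical broker/backend URLs; any Celery-supported broker works.
app = Celery(
    "cpu_tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@app.task
def heavy_compute(n: int) -> int:
    # Stand-in for a CPU-intensive computation.
    return sum(i * i for i in range(n))

# Inside a Prefect flow or task, the agent-side code would enqueue work
# instead of running it locally, e.g.:
#     result = heavy_compute.delay(10_000_000)  # returns an AsyncResult
#     value = result.get(timeout=600)           # block until a Celery worker finishes
```

The CPU-heavy work then runs on Celery worker processes (started with `celery -A cpu_tasks worker`), scaled independently of the Prefect agent.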

Seth Coussens

about 3 years ago
Has anyone figured out how to override parameters in Prefect 2 for a flow on a per-schedule basis, the way you could in Prefect 1? There is a lot of overhead in Prefect 2 right now if you just want to send a simple variable change for a flow depending on the time the flow runs. Currently, the only solution I've found is to have two separate deployments registered with different schedules and variables.
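Until per-schedule parameters exist, one single-deployment workaround is to have the flow derive the varying value from the clock at run time. A minimal sketch (the schedule-to-value mapping below is hypothetical):

```python
from datetime import datetime, timezone

def pick_source(run_time: datetime) -> str:
    """Map the scheduled run time to the parameter value the flow should use.
    The cutoffs below are hypothetical examples."""
    hour = run_time.astimezone(timezone.utc).hour
    if hour < 12:
        return "morning-feed"
    return "evening-feed"

# Inside the flow body, call pick_source(datetime.now(timezone.utc))
# instead of taking the value as a deployment parameter.
```

This keeps one deployment and one schedule at the cost of moving the branching into the flow itself.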

siddhant

about 1 year ago
I have created a Docker worker on a server and am trying to run my private Docker image, but this error occurs because the worker could not fetch the Docker image, since Docker is not able to log in to the registry.
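The first thing to check is whether the Docker daemon on the machine where the worker runs is authenticated to the private registry, so pulls succeed outside Prefect too. A sketch (registry hostname, username, and image names are placeholders):

```shell
# On the worker host, log the Docker daemon in to the private registry.
docker login registry.example.com -u <username>

# Verify the worker host can now pull the image the deployment references.
docker pull registry.example.com/<org>/<image>:<tag>
```

If the manual pull works but the worker still fails, the worker process may be running as a different user (whose `~/.docker/config.json` lacks the credentials), which is worth checking next.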

Ankit

over 2 years ago
Hi, we are suddenly getting these errors in our cluster:

```
ImportError: cannot import name 'SecretField' from 'pydantic' (/usr/local/lib/python3.8/dist-packages/pydantic/__init__.py)
```
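A "suddenly broken with no code change" import error like this is consistent with an unpinned `pydantic` dependency resolving to the 2.x line, which no longer exports `SecretField`; whether that is the cause here is an assumption. If so, pinning below 2.0 wherever the cluster image installs its dependencies is the usual stopgap:

```shell
pip install "pydantic<2"
```

Upgrading Prefect to a release that supports pydantic 2 is the longer-term fix.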

Anatoly Myachev

over 3 years ago
Hello everyone! Is the `EXTRA_PIP_PACKAGES` environment variable supposed to work with `KubernetesFlowRunner`?
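If `EXTRA_PIP_PACKAGES` is honored at all, it is by the entrypoint of the official Prefect image, so the variable has to reach the flow-run pod's environment. Whether `KubernetesFlowRunner` forwards an `env` mapping like this is an assumption to verify against the 2.0 beta version in use:

```python
from prefect.flow_runners import KubernetesFlowRunner  # assumed import path in the 2.0 beta

runner = KubernetesFlowRunner(
    # Assumed: env vars passed here end up on the flow-run pod, where the
    # official image's entrypoint would pip-install the listed packages at startup.
    env={"EXTRA_PIP_PACKAGES": "pandas==1.4.2 s3fs"},
)
```

Note that installing packages at pod startup adds latency to every run; baking dependencies into a custom image avoids that.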

Darshan

over 3 years ago
Hello - in Prefect 2.0, is there a way to provide the task name dynamically? For example, if I have a function defined as a task which is being called multiple times from a flow, I want to append a dynamic suffix to the task name.
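Two mechanisms that may cover this in Prefect 2 (treat the exact signatures as assumptions to check against the installed release): a `task_run_name` template on the decorator, and `with_options` to rename the task per call site:

```python
from prefect import flow, task

# Assumed: task_run_name templates can interpolate the task's parameters,
# giving each run a distinct name.
@task(task_run_name="process-{item}")
def process(item: str) -> str:
    return item.upper()

@flow
def main() -> None:
    for item in ["a", "b", "c"]:
        # Assumed: with_options returns a copy of the task with a new name,
        # so each call site can carry its own suffix.
        process.with_options(name=f"process-{item}")(item)
```

The decorator template renames the *task run*; `with_options` renames the *task* itself for that invocation.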