skaaks
06/13/2024, 10:53 PM

Marvin
06/13/2024, 10:53 PM
You can use a `deployment.yaml` file to specify your configurations. Here are the key parameters and an example code snippet to help you get started:
### Key Parameters:
1. Work Queue: Specifies the queue that will handle the deployment runs.
2. Parameters: Default parameter values for flows.
3. Schedule: Schedule your deployment runs using cron, interval, or another type of schedule.
4. Infrastructure: Define the infrastructure type and related settings.
Here is an example of scheduling parameters:
```yaml
schedules:
  - cron: "0 0 * * *"
    timezone: "America/Chicago"
    active: false
  - cron: "0 12 * * *"
    timezone: "America/New_York"
    active: true
  - cron: "0 18 * * *"
    timezone: "Europe/London"
    active: true
```
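Each `cron` entry above uses the standard five-field cron format: minute, hour, day of month, month, and day of week. As a quick illustrative sketch (the `describe_cron` helper below is hypothetical, not part of Prefect):

```python
# Label the five fields of a standard cron expression.
CRON_FIELDS = ["minute", "hour", "day_of_month", "month", "day_of_week"]

def describe_cron(expr: str) -> dict:
    """Map each field of a five-field cron string to its name."""
    parts = expr.split()
    if len(parts) != 5:
        raise ValueError("expected a five-field cron expression")
    return dict(zip(CRON_FIELDS, parts))

print(describe_cron("0 12 * * *"))
# {'minute': '0', 'hour': '12', 'day_of_month': '*', 'month': '*', 'day_of_week': '*'}
```

So `"0 12 * * *"` fires at minute 0 of hour 12 every day, in whatever `timezone` the schedule specifies.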
### Example Deployment Configuration:
Here is a minimal example of the `deployment.yaml`:
```yaml
name: Example Deployment
flow_name: my_flow
parameters:
  param1: value1
  param2: value2
schedule:
  cron: "0 12 * * *"  # runs daily at noon
  timezone: "America/New_York"
work_queue_name: example-queue
infrastructure:
  type: process
  env: {}
  labels: {}
  command:
    - python
    - -m
    - prefect.engine
  stream_output: true
```
### Scheduling Deployment Runs:
You can configure `prefect.yaml` to schedule your deployment runs with detailed settings. Ensure the `schedules` section is properly defined.
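As a rough sketch of that section (the deployment name, entrypoint path, and work pool name below are placeholders, and the exact layout may vary between Prefect versions), a `prefect.yaml` with schedules might look like:

```yaml
deployments:
  - name: example-deployment            # placeholder name
    entrypoint: flows/my_flow.py:my_flow  # placeholder path to your flow
    work_pool:
      name: example-pool                # placeholder work pool
    schedules:
      - cron: "0 12 * * *"
        timezone: "America/New_York"
        active: true
```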
For more detailed guidance, you can refer to the following links:
- Deployment Concepts
- Scheduling Deployments
### Sample Code for Task Submission:
Additionally, here's sample Python code using Prefect's flow and task decorators:
```python
import time

from prefect import flow, task


@task
def sleep(t: int):
    print(f"Going to sleep for {t} seconds...")
    time.sleep(t)
    print("...and I'm awake!")


@flow(log_prints=True)
def my_flow():
    sleep(3)  # call the task directly; blocks for 3 seconds

    future = sleep.submit(2)  # submit a task run and get back a future
    future.wait()  # block until the submitted task run completes

    sleep.map([5] * 10)  # map the task over ten inputs


if __name__ == "__main__":
    my_flow()
```
Keep all these configurations in mind, and remember to validate your deployment YAML before applying it with `prefect deployment apply`.
If you need further details, consult the Prefect Documentation.