Vikash Sharma
04/02/2024, 9:45 AM
```python
exec_path = "main.py:crawling_tracker"
source = GitRepository(
    url=repo,
    branch=BRANCH,
)
my_flow = flow.from_source(source=source, entrypoint=exec_path)
deployment = my_flow.deploy(
    name="my_flow",
    work_pool_name=work_pool,
    parameters={"test": test},
    schedule=interval_schedule,
    job_variables={
        "env": {
            "EXTRA_PIP_PACKAGES": "prefect==2.14.16 Scrapy==2.11.0 minio==7.2.3 uptrace==1.22.0 scrapeops-scrapy==0.5.4 prefect-docker==0.4.4"
        }
    },
)
```
However, I now intend to optimize our workflow deployment process. Specifically, I aim to create Docker images with our workflows and required dependencies installed, and then deploy these images onto our Kubernetes infrastructure.
Could you please provide guidance on how to achieve this effectively?
Thank you.

Jeff Hale
04/02/2024, 12:08 PM

Vikash Sharma
04/02/2024, 5:51 PM

Jeff Hale
04/02/2024, 5:55 PM
```python
from prefect import flow

@flow(log_prints=True)
def buy():
    print("Buying securities")

if __name__ == "__main__":
    buy.deploy(
        name="my-code-baked-into-an-image-deployment",
        work_pool_name="my-docker-pool",
        image="my_registry/my_image:my_image_tag"
    )
```
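Applied to the `crawling_tracker` flow from the question above, the same pattern would target a Kubernetes work pool. This is a sketch only: the pool name `my-k8s-pool`, the registry path, and the tag are placeholder assumptions, not values from the thread.

```python
from prefect import flow

@flow(log_prints=True)
def crawling_tracker():
    print("Running crawler")

if __name__ == "__main__":
    # Placeholder pool name and registry path; substitute your own.
    crawling_tracker.deploy(
        name="crawling-tracker-k8s",
        work_pool_name="my-k8s-pool",
        image="my_registry/crawling_tracker:latest",
        push=True,  # push the built image so the cluster can pull it
    )
```

Because the dependencies are baked into the image, the `EXTRA_PIP_PACKAGES` job variable from the original deployment would no longer be needed.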
Jeff Hale
04/02/2024, 6:23 PM
`@flow(timeout_seconds=10)`
If the flow runs for more than 10 seconds, it will time out.
You can also set a Job Timeout on a number of work pool types (Kubernetes, Prefect Managed, ACI, ECS, Cloud Run).
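Conceptually, `timeout_seconds` enforces a deadline on the flow run. A minimal stdlib sketch of the same idea (illustrative only; Prefect's actual cancellation mechanism differs):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError
import time

def slow_flow():
    # Stand-in for a flow body that runs past its deadline
    time.sleep(0.5)
    return "done"

# Enforce a 0.1-second deadline, analogous to timeout_seconds=10
with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(slow_flow)
    try:
        result = future.result(timeout=0.1)
    except TimeoutError:
        result = "timed out"

print(result)  # → timed out
```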