
Chris Gunderson

01/10/2023, 2:11 PM
Is there a way to put the deployment files in a different folder? Right now they are in the root folder of my project, and I'd like to separate them by environment. Am I overthinking this? Should I just delete these files?
prefect deployment build -n fidelity-allocations-deployment -q ecs-worker -a -sb s3/sra-s3 -ib ecs-task/default src/main/prefect/flows/allocations/prefect_fidelity_allocations.py:FidelityAllocationsFlow --cron "13 15 * * 1-5"
^^This is my current deployment command using the CLI
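As an aside: if the goal is just to keep the generated YAML out of the project root, prefect deployment build also accepts an --output/-o flag, so the same command can write the file into an environment-specific folder (the folder and file name here are only illustrative), e.g.:
prefect deployment build -n fidelity-allocations-deployment -q ecs-worker -a -sb s3/sra-s3 -ib ecs-task/default src/main/prefect/flows/allocations/prefect_fidelity_allocations.py:FidelityAllocationsFlow --cron "13 15 * * 1-5" -o deployments/prod/fidelity-allocations-deployment.yaml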

Miguel Moncada

01/10/2023, 2:27 PM
Not sure if this is of any help, or whether there's another recommended way, so let's see if anyone else can shed some light šŸ™‚ In my case I adapted this template to follow the structure below. TL;DR: I keep my deployment files in a separate Python module, dataflows/deployments, which is imported and called from the root of the repo through a Python script, deployments.py (a sketch of one of those deployment modules follows the tree).
.
ā”œā”€ā”€ README.md
ā”œā”€ā”€ config
│   ā”œā”€ā”€ agent
│   │   └── Dockerfile
│   └── runner
│       └── Dockerfile
ā”œā”€ā”€ dataflows
│   ā”œā”€ā”€ __init__.py
│   ā”œā”€ā”€ _version.py
│   ā”œā”€ā”€ deployments
│   │   ā”œā”€ā”€ __init__.py
│   │   ā”œā”€ā”€ _constants.py
│   │   └── hello_deployment.py
│   └── flows
│       ā”œā”€ā”€ __init__.py
│       ā”œā”€ā”€ hello_flow.py
│       └── utils
ā”œā”€ā”€ deployments.py
ā”œā”€ā”€ docker-compose.yml
ā”œā”€ā”€ docs
ā”œā”€ā”€ requirements.txt
ā”œā”€ā”€ setup.py
ā”œā”€ā”€ tests
│   ā”œā”€ā”€ README.md
│   ā”œā”€ā”€ conftest.py
│   └── unit
│       ā”œā”€ā”€ __init__.py
│       └── flows
│           ā”œā”€ā”€ __init__.py
│           └── test_hello_flow.py
└── tox.ini
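Each module under dataflows/deployments just exposes a deploy function for its flow. A minimal sketch of what dataflows/deployments/hello_deployment.py might look like, assuming Prefect 2's Deployment.build_from_flow is used (the names here are illustrative, not the exact code):

from prefect.deployments import Deployment

from dataflows.flows.hello_flow import hello_flow


def deploy_hello_flow() -> None:
    """Build and register the deployment for hello_flow."""
    deployment = Deployment.build_from_flow(
        flow=hello_flow,
        name="hello-flow-deployment",
        work_queue_name="default",
    )
    deployment.apply()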

Chris Gunderson

01/10/2023, 3:13 PM
@Miguel Moncada Thanks. I'll look into using Python scripts instead. That might be a better route to take.
šŸ™‡ 1

Danilo Drobac

01/10/2023, 4:08 PM
@Miguel Moncada what's the contents of deployments.py?

Miguel Moncada

01/10/2023, 4:09 PM
import os
import logging
import argparse
from dataflows.deployments._constants import (
    FLOW_DEPLOYMENT_DICT,
    DEPLOYMENT_FILE_FUNC_DICT,
)


def get_file_name_file_extension(file_path: str) -> tuple:
    """Function to extract the file name and extension from a file URI
    Args:
        file_uri (str): URL pointing to the file
    Returns:
        tuple: file name and extension
    """
    file_name_with_ext = os.path.basename(file_path)
    file_name, file_ext = os.path.splitext(file_name_with_ext)
    return (file_name, file_ext)


def get_deploy_function(file_name: str) -> callable:
    """Function to get the deployment function for a flow

    Args:
        file_name (str): flow file name without extension

    Returns:
        callable: deployment function
    """
    deploy_function = FLOW_DEPLOYMENT_DICT.get(file_name)

    if not deploy_function:
        deploy_function = DEPLOYMENT_FILE_FUNC_DICT.get(file_name)
    return deploy_function


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter
    )
    parser.add_argument(
        "-f",
        "--file_path",
        help="File path",
        required=True,
    )

    args = parser.parse_args()
    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    file_name, _ = get_file_name_file_extension(args.file_path)

    deploy_function = get_deploy_function(file_name)

    if deploy_function:
        logger.info(f"Deploying {file_name} using {deploy_function.__name__}")
        deploy_function()
        logger.info(f"{file_name} has been deployed!")
    else:
        logger.info(
            f"No deployment function found for {file_name}, "
            "skipping..."
        )
šŸ‘ 1

Chris Gunderson

01/10/2023, 4:10 PM
Thank you. I'm currently using the CLI for the deployment, so it generates a YAML file and applies it:
prefect deployment build -n fidelity-allocations-deployment -q ecs-worker -a -sb s3/sra-s3 -ib ecs-task/default src/main/prefect/flows/allocations/prefect_fidelity_allocations.py:FidelityAllocationsFlow --cron "13 15 * * 1-5"

Danilo Drobac

01/10/2023, 4:10 PM
Awesome. I like that method of deploying lots of Flows šŸ‘

Chris Gunderson

01/10/2023, 4:15 PM
@User If you were not using GitHub, could you just call python deployments.py from your virtual environment?

Miguel Moncada

01/10/2023, 4:15 PM
Happy to see it's helpful! In my case it's packaged up with a GitHub workflow that triggers for any changed file in dataflows/deployments or dataflows/flows.
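In other words, for each changed file the workflow ends up calling the script shown above with that file's path (so it works just as well from a local virtual environment); the path here is illustrative:

python deployments.py -f dataflows/flows/hello_flow.py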

Chris Gunderson

01/10/2023, 4:17 PM
The company I work for is currently using CodeCommit (AWS), but I can see how GitHub Actions would be very helpful. We have a test project set up in GitHub, so we might migrate over there.
šŸ‘ 1