Jose Daniel Posada Montoya
08/16/2021, 11:35 PM
I want to keep the image (used by DockerRun) in sync with the code/package I have in the repository. One of the approaches I came up with was to develop a Storage (very similar to Git) that would do the following:
1. Clone the repository into a temp folder.
2. Extract the Flow (as it is currently done).
3. Search the repo for a specified Dockerfile. (The Dockerfile would do something like an ADD of the entire repo and a pip install -e . so that the image has the latest code in it.)
4. Build the image with a certain tag (the same tag that DockerRun would use).
(This all assumes that the Storage code is executed in the Docker Agent process just before the container is started, which would leave time to create the image before the run_config creates the container and runs the flow inside it.)
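The Dockerfile described in step 3 might look something like the sketch below. The base image and package layout here are assumptions, not anything from the thread:

```dockerfile
# Assumed base image; any image with Python and pip would work
FROM prefecthq/prefect:latest

# Copy the entire repository into the image so it carries the latest code
ADD . /opt/repo
WORKDIR /opt/repo

# Install the repo's package in editable mode so the flow's imports resolve
RUN pip install -e .
```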
I was going to test this by creating the custom Storage, but I ran into things I hadn't taken into account, like serialization. Before exploring this option further, I wanted to ask:
• Is there any way to achieve what I want, that I am not seeing, that doesn't involve a container registry or a CI pipeline?
• Does the solution I propose make sense to you? Do you think it would work or am I missing something?
• Would it be useful to more people, and would it be worth contributing?
I would appreciate any guidance, advice or comments on the subject.

Kevin Kho
Docker container and the actual script sometimes. If your container has all of the dependencies you need, just use DockerRun + your Git storage. The flow will be downloaded from Git storage and run on top of the container. This won't need any CI pipeline at all.
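Kevin's suggestion can be sketched as follows using Prefect 1.x's Git storage together with DockerRun; the repo name, flow path, and image tag below are placeholders, not values from the thread:

```python
from prefect import Flow, task
from prefect.run_configs import DockerRun
from prefect.storage import Git

@task
def say_hello():
    print("hello")

with Flow("example-flow") as flow:
    say_hello()

# The flow's source is pulled from the Git repo at run time,
# so the registered flow always matches the code in the repository...
flow.storage = Git(
    repo="my-org/my-repo",          # placeholder repository
    flow_path="flows/example.py",   # placeholder path to this file in the repo
)

# ...and it runs inside a pre-built container that already has the dependencies.
flow.run_config = DockerRun(image="my-org/my-image:latest")  # placeholder image
```

Because the code is cloned fresh on each flow run, the image only needs rebuilding when dependencies change, which is what removes the need for a CI pipeline.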
You can’t create a custom Storage because the storage information is pushed to our database. Upon flow run, it uses that Storage class to search for your Flow and load it. So by default, it has to be something we know how to work with.