Greg Roche
04/15/2021, 4:07 PM
main.py file that imports code from these other files and defines the actual flow logic.
If main.py is updated, a simple re-registration of the flow seems to be enough to allow the agent to execute the updated flow, because the updated logic is stored on S3 and is then downloaded by the agent during the next execution. However, if one of the other files (which main.py imports) is changed, re-registration alone isn't enough to allow the agent to execute the updated flow, seemingly because only the content of main.py is stored on S3 at registration. Practically, this means that almost every time we make any change to any of our flows, we need to rebuild our Docker image with the updated logic, redeploy it, and replace the old agent with the new one, before then re-registering the flow.
Is there some way for us to register a flow so that all of the flow's code, not just the code in the file that defines the flow, is stored in S3, so we don't need to constantly rebuild and redeploy the agent's image for almost every change? Or is there a cleaner approach to solving this issue that has worked for anybody here? Thanks in advance.
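(For context, a minimal sketch of script-based storage, assuming the Prefect 1.x `S3` storage and `DockerRun` run-config APIs; the bucket, image, and project names are placeholders, not from the thread:)

```python
# Hedged sketch: Prefect 1.x API assumed; bucket/image/project names are placeholders.
from prefect import Flow, task
from prefect.storage import S3
from prefect.run_configs import DockerRun

@task
def say_hello():
    print("hello")

with Flow("example-flow") as flow:
    say_hello()

# stored_as_script=True uploads the flow file itself (rather than a pickle) to S3.
# Note: helper modules that main.py imports still have to exist inside the run image.
flow.storage = S3(
    bucket="my-flow-bucket",          # placeholder bucket name
    stored_as_script=True,
    local_script_path="main.py",
)

# DockerRun lets each flow run pull its own image, so the agent process itself
# does not need to be rebuilt and redeployed when flow dependencies change.
flow.run_config = DockerRun(image="my-registry/flows:latest")  # placeholder image

if __name__ == "__main__":
    flow.register(project_name="my-project")  # placeholder project name
```

This is a registration/config sketch, not a self-contained runnable example: it requires a Prefect 1.x install, AWS credentials, and a Prefect backend to register against.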
Zanie
DockerRun (this would eliminate the need to redeploy your agent)

Zanie
stored_as_script=True), some functions from other modules/files will be pickled alongside your main.py and used instead of importing the old functions available on the agent at unpickle time, but this is honestly a weird effect and I wouldn't try to rely on it.

Greg Roche
04/16/2021, 8:54 AM
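(An aside on the "old functions available on the agent at unpickle time" point above: the standard-library `pickle` module serialises module-level functions by reference, not by value, so unpickling looks the function up in the unpickling process's own environment. This small, runnable demonstration uses `json.dumps` purely as an example function; it illustrates general pickle semantics, not Prefect's own storage code, which uses cloudpickle:)

```python
import pickle
import json

# Stdlib pickle stores a module-level function *by reference*: it records
# the module and qualified name, not the function's bytecode.
blob = pickle.dumps(json.dumps)

# Unpickling re-imports the module and looks the name up again, so you get
# whatever version of json.dumps exists in the *current* process -- analogous
# to an agent falling back to its own (possibly stale) copies of the modules
# that main.py imports.
restored = pickle.loads(blob)
print(restored is json.dumps)  # True: the local definition was used
```

This is why code pulled in from imported modules can behave differently from code stored in the flow file itself: by-reference pickles resolve against whatever is installed where the unpickling happens.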