Santiago Gonzalez
01/17/2022, 4:54 PM
ec2_instances and others). In ec2_instances.py there are 3 task methods, and that way it works. However, since I made a refactor in that file to add some behavior I needed, it's failing. I added some non-task methods that are invoked from the existing task methods. The error that is being thrown is:
Failed to load and execute Flow's environment: StorageError('An error occurred while unpickling the flow:
  AttributeError("Can't get attribute 'get_boto_client_with_auth' on <module '...ec2_instances' from '/usr/local/lib/python3.7/site-packages/automation_library/ec2_instances.py'>")
This may be due to one of the following version mismatches between the flow build and execution environments:
  - cloudpickle: (flow built with '1.6.0', currently running with '2.0.0')
  - python: (flow built with '3.7.9', currently running with '3.7.12')')
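For context on the AttributeError: pickle-based serializers store module-level functions by reference (module path plus qualified name), not by value, so the function must exist in the module installed on the executing machine at unpickle time. Local (nested) functions, by contrast, cannot be pickled by the standard library at all; cloudpickle serializes them by value, embedding the code itself. A minimal sketch, assuming plain CPython pickle semantics (the helper names here are illustrative, not from the original module):

```python
import pickle

def get_client():
    # Stand-in for a module-level helper like get_boto_client_with_auth.
    return "client"

# Module-level functions pickle *by reference*: the payload records the
# module and qualified name, not the function body. Unpickling looks the
# name up in the module installed on the executing machine -- if that
# copy of the module lacks the attribute, unpickling fails with exactly
# an AttributeError("Can't get attribute ...").
payload = pickle.dumps(get_client)
assert b"get_client" in payload  # only a name reference is stored

def make_inner():
    def inner():
        return 42
    return inner

# Local (nested) functions cannot be pickled by reference at all:
try:
    pickle.dumps(make_inner())
    plain_pickle_failed = False
except (AttributeError, pickle.PicklingError):
    plain_pickle_failed = True
# cloudpickle serializes such functions *by value*, shipping the code
# itself -- which is why a helper defined inside a task does not need to
# exist in the module installed on the agent.
```

This would explain the behavior reported below: moving the helper inside a task makes cloudpickle embed it in the flow payload instead of referencing the (stale) module on the agent.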
Why is it failing only when I add non-task methods? Does it make any sense?

Anna Geller
01/17/2022, 5:39 PM
> AttributeError("Can't get attribute 'get_boto_client_with_auth' on <module
Did you install this as a package on your agent? What agent do you use? What is your Storage and run configuration?

Santiago Gonzalez
01/17/2022, 5:50 PM
When I take get_boto_client_with_auth and move it into a task method, as an inner function of another function, it works.

Anna Geller
01/17/2022, 5:52 PM

Santiago Gonzalez
01/17/2022, 5:53 PM

Anna Geller
01/17/2022, 6:04 PM

Santiago Gonzalez
01/17/2022, 6:15 PM

Anna Geller
01/17/2022, 6:23 PM
You can pip install -e . the package on your agent, and then e.g. on your CI/CD push the code changes to S3; on the EC2 instance you can build e.g. a cron job that regularly syncs the code from S3 to your instance:
aws s3 sync s3://yourbucket/your_code /path/to/your_code
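The cron job suggested above might look like this; the bucket name, paths, and the five-minute interval are placeholders, not details from the thread:

```shell
# Hypothetical crontab entry on the EC2 instance running the agent:
# every 5 minutes, mirror the package source from S3 (bucket and paths
# are placeholders) into the directory that was installed with
# `pip install -e .`, so the code the agent imports stays in sync with
# what CI/CD last pushed.
*/5 * * * * aws s3 sync s3://yourbucket/your_code /path/to/your_code
```

Because an editable install imports the code from the synced directory on every run, this keeps helpers like get_boto_client_with_auth resolvable on the agent without reinstalling the package after each change.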