Seonghwan Hong (03/16/2021, 6:09 AM)

Timo (03/16/2021, 10:45 AM):
botocore.exceptions.NoCredentialsError: Unable to locate credentials
I register the flow with the following method:
flow.storage = S3(
    bucket="prefect-flows",
    secrets=["AWS_CREDENTIALS"],
)
flow.register(project_name="abc")
The backend is set to cloud and I logged in successfully (prefect auth login ...).
How could I use the AWS_CREDENTIALS secret which is set in the cloud?
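One likely reason for the NoCredentialsError above: flow.register() builds and uploads the S3 storage from the local machine, so boto3 needs credentials locally as well; the AWS_CREDENTIALS cloud secret is only resolved at run time. A minimal sketch under that assumption (the helper name is mine; the ACCESS_KEY/SECRET_ACCESS_KEY key names follow Prefect's AWS_CREDENTIALS convention and are worth verifying):

```python
import os

# Hypothetical helper: export a Prefect-style AWS_CREDENTIALS dict into the
# environment variables boto3 reads. Locally you could fetch the dict with
# prefect.client.Secret("AWS_CREDENTIALS").get() before calling flow.register().
def export_aws_credentials(creds: dict) -> None:
    os.environ["AWS_ACCESS_KEY_ID"] = creds["ACCESS_KEY"]
    os.environ["AWS_SECRET_ACCESS_KEY"] = creds["SECRET_ACCESS_KEY"]
    if creds.get("SESSION_TOKEN"):
        os.environ["AWS_SESSION_TOKEN"] = creds["SESSION_TOKEN"]

export_aws_credentials(
    {"ACCESS_KEY": "AKIA-EXAMPLE", "SECRET_ACCESS_KEY": "example-secret"}
)
```

The agent resolves the cloud secret itself at flow-run time, so this export is only needed where registration happens.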
Mark Koob (03/16/2021, 12:47 PM)

Zach Khorozian (03/16/2021, 3:43 PM)

Brett Naul (03/16/2021, 4:54 PM)

Charles Liu (03/16/2021, 5:44 PM):
@property
def _boto3_client(self):  # type: ignore
    from prefect.utilities.aws import get_boto_client

    kwargs = self.client_options or {}
    return get_boto_client(resource="codecommit", credentials=None, **kwargs)
and then the get_boto_client func:
def get_boto_client(
    resource: str,
    credentials: Optional[dict] = None,
    region_name: Optional[str] = None,
    profile_name: Optional[str] = None,
    **kwargs: Any,
) -> "botocore.client.BaseClient":
but it seems it holds onto None and doesn't let anything else pass.
Is there anyone else here that has successfully connected to CodeCommit through Prefect?
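On the hard-coded credentials=None: as I read it, an explicit credentials argument is only the first source get_boto_client consults; when it is None, the function can still fall back to the AWS_CREDENTIALS secret in prefect.context, and finally to boto3's own default chain. A simplified paraphrase of that order (my sketch, not the actual Prefect source):

```python
# Simplified paraphrase of the credential-resolution order I believe
# prefect.utilities.aws.get_boto_client follows (verify against the source):
def resolve_credentials(credentials, context_secrets):
    if credentials:  # an explicit argument wins
        return credentials
    # otherwise fall back to the AWS_CREDENTIALS secret in context;
    # returning None here lets boto3 use its own default credential chain
    return context_secrets.get("AWS_CREDENTIALS")

resolve_credentials(None, {"AWS_CREDENTIALS": {"ACCESS_KEY": "example"}})
```

So setting the AWS_CREDENTIALS secret (in Cloud, or locally in context.secrets) should still reach the client even though the call site passes None.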
Charles Liu (03/16/2021, 8:28 PM):
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/prefect/engine/flow_runner.py", line 245, in run
    parameters=parameters,
  File "/usr/local/lib/python3.6/site-packages/prefect/engine/cloud/flow_runner.py", line 402, in initialize_run
    raise KeyError(msg) from exc
KeyError: 'Task slug CRED_secrets-1 not found in the current Flow; this is usually caused by changing the Flow without reregistering it with the Prefect API.'
matta (03/16/2021, 9:35 PM)

itay livni (03/16/2021, 10:34 PM):
PrefectSecret and coiled. Posted the question on stack in case somebody else runs into this.
https://stackoverflow.com/questions/66664086/how-does-one-use-prefect-cloud-secrets-with-coiled
CA Lee (03/17/2021, 1:11 AM):
from prefect import task, Task
# Example 1
class Example(Task):
    def run(self, **kwargs):
        ...  # do something

# Example 2
@task
def example():
    ...  # do something
In particular, how do I force a rebuild of the second task (using the decorator)? Running into some issues running flows using ECS / ECR, where the task would not get rebuilt if using the decorator format.
S K (03/17/2021, 4:36 AM)

Morgan Omind (03/17/2021, 10:18 AM)

Tim Enders (03/17/2021, 2:08 PM)

Sven Teresniak (03/17/2021, 2:43 PM)

Karolína Bzdušek (03/17/2021, 2:46 PM):
Release 0.14.11:
Added `project_name`/`project_id` to flow run execution context in orchestrated flow runs
Matthew Blau (03/17/2021, 3:20 PM)

Braun Reyes (03/17/2021, 5:22 PM)

Braun Reyes (03/17/2021, 5:44 PM)

S K (03/17/2021, 6:09 PM)

Calvin Pritchard (03/17/2021, 6:43 PM):
test.py
from prefect import task, Flow
from prefect.engine.results import LocalResult

OUTPUT_DIR = './outputs'

@task(target='x.txt', checkpoint=True, result=LocalResult(dir=OUTPUT_DIR))
def saver():
    return 10

with Flow('example') as flow:
    saved = saver()

flow.run()
When I run python test.py, I expect a file tree like

tree
.
├── outputs
│   └── x.txt
└── test.py

but when I run python test.py, I don't see an x.txt file:

.
├── outputs
└── test.py
I am using prefect==0.14.12. What do I need to do to get prefect to save (and persist) file results?
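A likely cause here: for local flow.run() in Prefect 0.x, checkpointing is disabled by default even when the task sets checkpoint=True, and has to be switched on with an environment variable before the flow runs. A minimal sketch:

```python
import os

# For local flow.run() in Prefect 0.x, checkpointing is off by default;
# set this before the flow runs (or export it in the shell) so that
# task results/targets are actually written to disk:
os.environ["PREFECT__FLOWS__CHECKPOINTING"] = "true"
```

Equivalently, `export PREFECT__FLOWS__CHECKPOINTING=true` before `python test.py`. Flows run through an agent have this enabled already, which is why the difference only shows up locally.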
jeff n (03/17/2021, 7:13 PM)

Jacob Hayes (03/17/2021, 7:52 PM)

Josh (03/17/2021, 8:13 PM):
ExecuteNotebook?
Justin Essert (03/17/2021, 8:33 PM):
from prefect import Flow, task
@task
def add_ten(x):
    return x + 10

with Flow('iterated map') as flow:
    mapped_result = add_ten.map([1, 2, 3])
    mapped_result_2 = add_ten.map(mapped_result)
This allows you to map the first add_ten call to the three elements in the list, and then to map the second add_ten call to the three outputs from the first call.
Instead of returning a single int from the add_ten function, we want it to return a list of ints and continue to fan out (to 6 elements in the example below):
@task
def add_ten(x):
    return [x + 10, x]

with Flow('iterated map') as flow:
    mapped_result = add_ten.map([1, 2, 3])
    mapped_result_2 = add_ten.map(mapped_result)
Is this something that is supported by Prefect, or can you only have 1 fan-out per map/reduce?
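If I remember right, Prefect 0.x supports exactly this via prefect.flatten, which lets a downstream map treat a list-of-lists from an upstream map as one flat stream (worth verifying against the flat-mapping docs for your version). The semantics in plain Python:

```python
from itertools import chain

def add_ten(x):
    return [x + 10, x]

# first map: 3 inputs -> 3 lists of 2 elements each
level1 = [add_ten(x) for x in [1, 2, 3]]
# flatten: 3 lists of 2 -> 6 elements; inside a flow this is what
# wrapping the upstream result in prefect.flatten(...) would do
flattened = list(chain.from_iterable(level1))
# second map now fans out over all 6 elements
level2 = [add_ten(x) for x in flattened]
```

Inside the flow this would look like add_ten.map(flatten(mapped_result)); treat the exact call as an assumption to check against your Prefect version.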
Josh Greenhalgh (03/17/2021, 9:18 PM):
ValueError: Tasks with variable positional arguments (*args) are not supported, because all Prefect arguments are stored as keywords. As a workaround, consider modifying the run() method to accept **kwargs and feeding the values to *args.
Has anyone had any luck using their own decorators? The one I want to use is this:
def jitter(func: Callable) -> Callable:
    """
    For use with short running mapped tasks
    to avoid overwhelming the api server;
    calls function, sleeps for random period
    """
    def wrapped(*args, **kwargs):
        res = func(*args, **kwargs)
        sleep(random.gauss(1.2, 0.2))
        return res
    return wrapped
Basically it just waits for a while, randomly, so lots of short-running mapped tasks don't all hit my API server (on k8s) at once. I much prefer this to having sleeps in my actual task code.
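One thing worth trying with a decorator like this is functools.wraps, so that signature inspection sees the wrapped function's real parameters instead of bare *args/**kwargs. Whether that satisfies Prefect's check depends on how it inspects the function, so treat this as a sketch (my_task is a hypothetical stand-in):

```python
import functools
import inspect
import random
from time import sleep

def jitter(func):
    """Call func, then sleep a short random period to spread out mapped tasks."""
    @functools.wraps(func)  # makes inspect.signature report func's own signature
    def wrapped(*args, **kwargs):
        res = func(*args, **kwargs)
        sleep(max(0.0, random.gauss(1.2, 0.2)))
        return res
    return wrapped

def my_task(x, y=1):  # hypothetical task function
    return x + y

# the wrapper no longer advertises (*args, **kwargs)
sig = str(inspect.signature(jitter(my_task)))  # "(x, y=1)"
```

functools.wraps sets __wrapped__, and inspect.signature follows it by default; if Prefect inspects the raw code object instead, this won't help and the **kwargs workaround from the error message is the fallback.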
Manik Singh (03/18/2021, 6:56 AM):
files arg for the Docker storage? I'm trying to use some common scripts across flows packaged into separate Docker images. I think the files arg will serve my purpose, but the Prefect API docs suggest using absolute paths, while relative paths would make more sense.
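On the absolute-path requirement: relative paths can be resolved at registration time, so the mapping stays portable in source while still handing the API absolute paths. A sketch (the files= shape, local path mapped to in-image path, is how I understand Docker storage's argument; the specific filenames are made up):

```python
import os

# Resolve a repo-relative path to the absolute path Docker storage expects;
# the dict value is the destination path inside the built image.
local_script = os.path.abspath("shared/common.py")  # hypothetical shared module
files = {local_script: "/modules/common.py"}
```

This keeps the relative layout in version control and computes the absolute path wherever registration actually runs.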
Akash Rai (03/18/2021, 7:47 AM)

Hawkar Mahmod (03/18/2021, 12:53 PM):
print(state.result[baseline_fractions]._result)
All that happens is that my dir-provided location produces the directories and then an empty folder. What am I missing here?
Marwan Sarieddine (03/18/2021, 2:53 PM):
About the not_all_skipped trigger: the naming seems a bit unintuitive to me, or I might be missing something. Please see the example in the thread (any help would be appreciated).
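For reference, my reading of the 0.x trigger docs is that not_all_skipped runs a task when all upstream tasks finished successfully and at least one of them was not skipped (skips count as successes in Prefect). A plain-Python paraphrase of that rule, worth verifying against the docs:

```python
# Paraphrase of the not_all_skipped trigger semantics as I understand them
# (upstream states modeled as dicts; not the actual Prefect implementation):
def not_all_skipped(upstream_states):
    all_successful = all(s["successful"] for s in upstream_states)
    any_not_skipped = any(not s["skipped"] for s in upstream_states)
    return all_successful and any_not_skipped
```

So the name describes the extra condition on top of all_successful: the task still runs on partial skips, and only stays off when every upstream skipped.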
Zach Hodowanec (03/18/2021, 3:33 PM):
run_config parameters on a Kubernetes Agent for subsequent flows to consume, rather than duplicating similar run_configs across various flows. We currently make use of PREFECT__CLOUD__AGENT__ENV_VARS to pass along storage configurations, but I'm not having much success updating the execution environment to use an internal custom image. I have tried passing the IMAGE and the PREFECT__CONTEXT__IMAGE environment variables to my job spec, thus far to no avail.