07/20/2022, 8:49 AM
fs = RemoteFileSystem(basepath="s3://my-bucket/folder/")
fs.write_path("foo", b"hello")
fs.save("dev-s3")
What does fs.write_path do? Does it store the actual content on the filesystem?
Also, is the secret key stored in the Prefect database if you pass it to the settings parameter of RemoteFileSystem and then call fs.save? If so, how is it ensured that only I can read it and not, e.g., an admin at Prefect?
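A minimal sketch of the pattern being asked about, assuming Prefect 2's RemoteFileSystem backed by fsspec's S3 implementation; the bucket path and credential values are placeholders:

from prefect.filesystems import RemoteFileSystem

# settings are passed through to the underlying fsspec filesystem (here s3fs);
# the credential values below are placeholders
fs = RemoteFileSystem(
    basepath="s3://my-bucket/folder/",
    settings={"key": "AWS_ACCESS_KEY_PLACEHOLDER", "secret": "AWS_SECRET_PLACEHOLDER"},
)

fs.write_path("foo", b"hello")  # writes the bytes to s3://my-bucket/folder/foo
fs.save("dev-s3")               # persists the block document (basepath + settings) via the Prefect API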
Andreas Nigg
07/20/2022, 10:54 AM
Isara Ovin
07/20/2022, 11:30 AM
add_json_index() missing 1 required positional argument: 'output'
output = filter_empty_responses(data)
# Parsing Jsons
indexed_output = add_json_index(output, upstream_tasks=[filter_empty_responses])
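A minimal Prefect 1 sketch of this pattern, with placeholder task bodies and input data; note that passing output into add_json_index already creates the dependency, so upstream_tasks is only needed when there is no data edge between the tasks:

from prefect import task, Flow

@task
def filter_empty_responses(data):
    # placeholder: drop empty responses
    return [item for item in data if item]

@task
def add_json_index(output):
    # placeholder: attach an index to each parsed JSON
    return list(enumerate(output))

with Flow("parse-jsons") as flow:
    data = [{"a": 1}, None, {"b": 2}]  # placeholder input
    output = filter_empty_responses(data)
    indexed_output = add_json_index(output)  # the data dependency already implies ordering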
David
07/20/2022, 11:35 AM
400 List of found errors: 1. Field: job_spec.worker_pool_specs[0].container_spec.env[12].value; Message: Required field is not set. 2. Field: job_spec.worker_pool_specs[0].container_spec.env[3].value; Message: Required field is not set. 3. Field: job_spec.worker_pool_specs[0].container_spec.env[4].value; Message: Required field is not set. [field_violations {
field: "job_spec.worker_pool_specs[0].container_spec.env[12].value"
description: "Required field is not set."
}
field_violations {
field: "job_spec.worker_pool_specs[0].container_spec.env[3].value"
description: "Required field is not set."
}
field_violations {
field: "job_spec.worker_pool_specs[0].container_spec.env[4].value"
description: "Required field is not set."
}
]
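A hedged illustration of what this Vertex AI validation error refers to: each env entry in the custom job's container_spec must carry both a name and a non-empty value, so an entry whose value resolves to None or empty triggers the violations above. The image URI and variable names below are placeholders:

container_spec = {
    "image_uri": "gcr.io/my-project/flow:latest",  # placeholder image
    "env": [
        {"name": "PREFECT__CLOUD__API_KEY", "value": "placeholder"},  # valid: name + value
        # {"name": "SOME_VAR"}  # invalid: missing "value" -> "Required field is not set."
    ],
}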
Chu
07/20/2022, 12:56 PM
Matthew Seligson
07/20/2022, 1:35 PM
haris khan
07/20/2022, 2:00 PM
Matt Delacour
07/20/2022, 2:37 PM
I'm using the LocalDaskExecutor and I cannot understand the problem.
I am trying to run parallel calls for REST API endpoints. The problem is that the Dask executor will run some calls twice for no apparent reason. Here is the logic I have.
Then I can see that some Dask tasks run on the same endpoint, while when I log all the endpoints, they are all unique...
I will post the logic as a snippet in the thread 🧵
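A minimal sketch of the mapping pattern described, assuming Prefect 1 with a threaded LocalDaskExecutor; the endpoint list and task body are placeholders:

from prefect import task, Flow
from prefect.executors import LocalDaskExecutor

@task
def call_endpoint(endpoint):
    # placeholder: issue the REST call for one endpoint and return the response
    return f"called {endpoint}"

with Flow("parallel-api-calls", executor=LocalDaskExecutor(scheduler="threads")) as flow:
    endpoints = ["users", "orders", "payments"]  # placeholder endpoints
    call_endpoint.map(endpoints)  # one mapped task run per endpoint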
Denys Volokh
07/20/2022, 2:39 PM
PREFECT__CONTEXT__SECRETS_GITHUB_ACCESS_TOKEN
I registered the flow with GitHub storage:
flow.storage = GitHub(
repo="company/prefect-workflows",
path="flows/benchmarks/flow_import_index_data.py",
ref="master",
access_token_secret="GITHUB_ACCESS_TOKEN",
)
but when I run the flow from cloud.prefect.com I get this error:
Failed to load and execute flow run: ValueError('Local Secret "GITHUB_ACCESS_TOKEN" was not found.')
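A hedged note on the likely pattern: Prefect 1 reads local secrets from context config, and the corresponding environment variable typically uses double underscores as delimiters (SECRETS__GITHUB_ACCESS_TOKEN) and must be set in the environment where the flow run actually executes (the agent / run container), not only where the flow is registered; alternatively the secret can be created in Prefect Cloud. The token value below is a placeholder:

import os

# Prefect 1 local secret via environment variable; note the double underscore
# between SECRETS and the secret name. Set this where the flow run executes.
os.environ["PREFECT__CONTEXT__SECRETS__GITHUB_ACCESS_TOKEN"] = "ghp_placeholder_token"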
alex
07/20/2022, 2:46 PM
Riccardo Tesselli
07/20/2022, 3:05 PM
I was planning to have two workspaces, development and production, so as to have a dedicated workspace for each environment. Then, after exploring Prefect 2.0 features, I've started to question this, because one could use a single workspace and set everything up to distinguish between development and production pipelines. So I wonder: what do you suggest for managing environments? Go with different workspaces, or have one workspace? And what should be the ideal use case for a workspace?
Christian Vogel
07/20/2022, 3:51 PM
File "/opt/conda/lib/python3.8/site-packages/distributed/worker.py", line 2742, in loads_function
    result = pickle.loads(bytes_object)
File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 73, in loads
    return pickle.loads(x)
ModuleNotFoundError: No module named 'prefect'
When entering the container, prefect seems to be available though. Do you have any idea what could be the reason?
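A hedged diagnostic sketch: the unpickling in that traceback happens on the Dask workers, so prefect must be importable in the workers' Python environment, not only in the container that submits the flow. One way to check, assuming a reachable scheduler (the address below is a placeholder):

from dask.distributed import Client

client = Client("tcp://dask-scheduler:8786")  # placeholder scheduler address

def has_prefect():
    import importlib.util
    return importlib.util.find_spec("prefect") is not None

# returns a dict mapping each worker address to True/False
print(client.run(has_prefect))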
Jason
07/20/2022, 4:24 PM
Sam Maradwi
07/20/2022, 4:29 PM
botocore.exceptions.ClientError: An error occurred (AccessDeniedException) when calling the GetSecretValue operation: User: arn:aws:sts::xxx:assumed-role/code_deployments-role/iddoc- is not authorized to perform: secretsmanager:GetSecretValue on resource: 4i-adl-config because no identity-based policy allows the secretsmanager:GetSecretValue action
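A hedged illustration of what the error is asking for: the assumed role needs an identity-based IAM policy statement allowing secretsmanager:GetSecretValue on the secret. Region, account ID, and the secret ARN suffix below are placeholders:

# Expressed as a Python dict for illustration only; attach the equivalent JSON
# policy statement to the code_deployments-role identity.
allow_get_secret = {
    "Effect": "Allow",
    "Action": "secretsmanager:GetSecretValue",
    "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:4i-adl-config-*",
}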
Matt Delacour
07/20/2022, 6:15 PM
Chris Reuter
07/20/2022, 7:00 PM
Tim Enders
07/20/2022, 8:30 PM
Is the map operator coming? That is a big stumbling block for us adopting 2.0. Thanks!
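A hedged sketch of the fan-out pattern that can stand in for 1.0's map, assuming the Prefect 2.0 API with task.submit(); treat it as an illustration rather than the official replacement:

from prefect import flow, task

@task
def process(item):
    return item * 2  # placeholder work

@flow
def fan_out(items):
    futures = [process.submit(i) for i in items]  # one task run per item
    return [f.result() for f in futures]

fan_out([1, 2, 3])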
kiran
07/20/2022, 9:46 PM
AxiosError: Request failed with status code 401
So I thought maybe I could sign in with my 1.0 credentials and got Error: Invalid username or password.
Am I unable to sign up for 2.0 if I also have 1.0?
Mansour Zayer
07/20/2022, 10:48 PM
I'm using subprocess to run my dbt project locally (Prefect 1.2.2, Windows). I create my command (dbt run --vars '{data_processing_start_date: 2022-07-20, data_processing_end_date: 2022-07-20}' --profiles-dir ./) like this:
command = (
f"dbt run --vars '{{"
f"data_processing_start_date: {data_processing_start_date}, "
f"data_processing_end_date: {data_processing_end_date}}}' --profiles-dir ./ "
)
The command is created correctly, but dbt gives me this error: dbt: error: unrecognized arguments: 2022-07-20, data_processing_end_date: 2022-07-20}'
It seems like dbt interprets 2022-07-20 as an argument instead of as the value for the data_processing_start_date variable.
Keep in mind that when I run the same command in my CLI, dbt works fine. But when it's provided to dbt through subprocess, this occurs.
This is my subprocess:
subprocess.run(
command,
check=True,
stderr=True,
stdout=True,
shell=True,
cwd="dbt",
)
Any idea what might cause this, and how to solve this? Thank you
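A hedged sketch of one workaround, assuming the issue is that cmd.exe on Windows does not treat single quotes as quoting characters, so the --vars string is split on spaces before it reaches dbt. Passing the arguments as a list with shell=False hands the whole --vars value to dbt as a single argument (dates below are placeholders):

import subprocess

data_processing_start_date = "2022-07-20"  # placeholder
data_processing_end_date = "2022-07-20"    # placeholder

command = [
    "dbt", "run",
    "--vars",
    f"{{data_processing_start_date: {data_processing_start_date}, "
    f"data_processing_end_date: {data_processing_end_date}}}",
    "--profiles-dir", "./",
]

subprocess.run(command, check=True, cwd="dbt")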
R
07/20/2022, 11:16 PM
Priyank
07/21/2022, 7:19 AM
Stefan
07/21/2022, 8:39 AM
Denys Volokh
07/21/2022, 9:35 AM
Vadym Dytyniak
07/21/2022, 10:07 AM
{
'apiVersion': 'batch/v1',
'kind': 'Job',
'spec': {
'template': {
'spec': {
'nodeSelector': {
"<http://topology.kubernetes.io/zone|topology.kubernetes.io/zone>": "us-east-1a",
"<http://dask.corp.com/subnet-type|dask.corp.com/subnet-type>": "private",
"<http://dask.corp.com/storage|dask.corp.com/storage>": 'true',
},
'containers': [
{'name': 'flow'},
]
}
}
}
}
Rajvir Jhawar
07/21/2022, 10:48 AM
Mahesh
07/21/2022, 12:24 PM
Prass
07/21/2022, 12:27 PM
1. How can I run task1, but do task2 only if task1 is completed, and task3 only if task1 and task2 are complete? (See the sketch after this list.)
2. Does Prefect 2.0 parallelize flows over async for loops?
3. I write (append) to a file in one of my tasks. Is that Prefect-safe?
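A minimal sketch of the ordering asked about in question 1, assuming Prefect 2's wait_for argument; task bodies are placeholders:

from prefect import flow, task

@task
def task1():
    return "t1"  # placeholder

@task
def task2():
    return "t2"  # placeholder

@task
def task3():
    return "t3"  # placeholder

@flow
def ordered_flow():
    f1 = task1.submit()
    f2 = task2.submit(wait_for=[f1])      # runs only after task1 completes
    f3 = task3.submit(wait_for=[f1, f2])  # runs only after task1 and task2 complete
    return f3.result()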
Justin Trautmann
07/21/2022, 12:34 PM
The admin/version API route doesn't work for me. Thanks a lot.
Chu
07/21/2022, 12:59 PM
Toby Rahloff
07/21/2022, 2:19 PM
KeyError: "No class found for dispatch key 'S3 Storage:sha256:68ed [...]' in registry for type 'Block'."
The full error trace and code can be found in the first comment to this post.
Kevin Kho
07/21/2022, 2:21 PM
Toby Rahloff
07/21/2022, 2:23 PM
Kevin Kho
07/21/2022, 2:27 PM
Toby Rahloff
07/21/2022, 2:28 PM
Kevin Kho
07/21/2022, 3:00 PM
Anna Geller
07/21/2022, 3:15 PM
Toby Rahloff
07/22/2022, 6:51 AM